WO2022113326A1 - Game method, computer-readable medium, and information terminal device
- Publication number
- WO2022113326A1 (PCT/JP2020/044443)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- game
- progress
- information
- live
- Prior art date
Classifications
- A—HUMAN NECESSITIES; A63—SPORTS; GAMES; AMUSEMENTS; A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR; A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices; A63F13/21—characterised by their sensors, purposes or types; A63F13/216—using geographical information, e.g. location of the game device or player using GPS
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor; A63F13/69—by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
- A63F13/85—Providing additional services to players; A63F13/86—Watching games played by other players
Definitions
- This disclosure relates to game methods, computer-readable media, and information terminal devices.
- Patent Document 1 describes an information processing system that distributes, to a viewing terminal, data for displaying content such as live moving images and game screens on that viewing terminal.
- In that system, the viewer can decorate the display screen with items such as flowers, letters, and logos, but can use only items that match the look of the content.
- Because the items displayed in the list, that is, the items that can be placed, are fixed, there is a risk that the experience lacks interest (amusement value).
- The present disclosure has been devised in view of such circumstances, and an object thereof is to provide a game program, a game method, and an information terminal device capable of enhancing that interest.
- According to one aspect, a method is provided for advancing a game executed on a user terminal including a processor, a memory, and a display unit. The game includes a first part in which the game progresses according to the user's operation, and a second part in which content is displayed on the display unit based on first information live-distributed via a server.
- The method includes a first step of accepting an action, in which the progress of the game is switched from the first part to the second part according to the result of a specific action, and a second step of performing processing for granting a privilege associated with the user to the distributor of the live-distribution source, based on a first operation in the second part.
- According to another aspect, a method is provided for advancing a game executed on a user terminal including a processor, a memory, and a display unit.
- The game includes a first part in which the game progresses according to the user's operation, and a second part in which content is displayed on the display unit based on first information live-distributed via the server.
- The method includes a step, performed by the processor, of associating with the user a privilege that becomes available in the second part, rather than the first part, when a predetermined achievement condition is satisfied in the first part.
- According to still another aspect, an information processing device for advancing a game is provided, which comprises a processor, a memory, and a display unit.
- The game includes a first part in which the game progresses according to the user's operation and a second part in which content is displayed on the display unit based on first information live-distributed via the server.
- A privilege that becomes available in the second part, rather than the first part, when a predetermined achievement condition is satisfied in the first part is associated with the user.
- The device requests progress of the completed second part and receives recorded operation instruction data from the server. The operation instruction data includes motion data and voice data input by the distributor of the live-distribution source.
- The progress of the second part is executed by operating the distributor's avatar object based on that operation instruction data.
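- As a non-authoritative illustration of the flow described above, the following TypeScript sketch assumes simplified names (GamePhase, UserState, grantPrivilegeToDistributor) that are not taken from the disclosure: the result of a specific action switches progress from the first part to the second part, and a first operation in the second part triggers processing that grants a privilege associated with the user to the distributor of the live-distribution source.

```typescript
// Illustrative sketch only; names and structure are assumptions, not the disclosed implementation.
type GamePhase = "firstPart" | "secondPart";

interface UserState {
  userId: string;
  privileges: string[]; // privileges associated with the user
}

let phase: GamePhase = "firstPart";

// Result of the "specific action" in the first part decides whether to switch parts.
function onSpecificActionResult(succeeded: boolean): void {
  if (succeeded) {
    phase = "secondPart"; // progress switches from the first part to the second part
  }
}

// A first operation in the second part triggers processing that grants a privilege
// associated with the user to the distributor of the live-distribution source.
function onFirstOperationInSecondPart(user: UserState, distributorId: string): void {
  if (phase !== "secondPart" || user.privileges.length === 0) return;
  const privilege = user.privileges.shift()!;
  grantPrivilegeToDistributor(distributorId, privilege);
}

// Stub: in the real system this would be a request relayed via the server 200.
function grantPrivilegeToDistributor(distributorId: string, privilege: string): void {
  console.log(`grant ${privilege} to distributor ${distributorId}`);
}

// Example usage
const user: UserState = { userId: "u1", privileges: ["badge"] };
onSpecificActionResult(true);
onFirstOperationInSecondPart(user, "player1");
```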
- the system according to the present disclosure is a system for providing a game to a plurality of users.
- The system will be described with reference to the drawings. It should be noted that the present disclosure is not limited to these examples; it is defined by the scope of the claims, and it is intended to include all modifications within the meaning and scope equivalent to the claims. In the following description, the same elements are designated by the same reference numerals throughout the drawings, and duplicate description is not repeated.
- FIG. 1 is a diagram showing an outline of the system 1 according to the present embodiment.
- The system 1 includes a plurality of user terminals 100 (computers), a server 200, a game play terminal 300 (external device, second external device), and a distribution terminal 400 (external device, first external device).
- FIG. 1 shows user terminals 100A to 100C, in other words, three user terminals 100, as an example of the plurality of user terminals 100, but the number of user terminals 100 is not limited to the illustrated example. Further, in the present embodiment, when it is not necessary to distinguish the user terminals 100A to 100C, they are referred to as "user terminal 100".
- the user terminal 100, the game play terminal 300, and the distribution terminal 400 are connected to the server 200 via the network 2.
- The network 2 is composed of the Internet and various mobile communication systems constructed by radio base stations (not shown). Examples of these mobile communication systems include so-called 3G and 4G mobile communication systems, LTE (Long Term Evolution), and wireless networks (for example, Wi-Fi (registered trademark)) that can connect to the Internet via a predetermined access point.
- As an example of the game provided by the system 1 (hereinafter referred to as "this game"), a game mainly played by the user of the game play terminal 300 will be described.
- the user of the game play terminal 300 is referred to as a "player".
- the player advances the game by operating the characters appearing in the game.
- the user of the user terminal 100 plays a role of supporting the progress of the game by the player. Details of this game will be described later.
- the game provided by the system 1 may be a game in which a plurality of users participate, and is not limited to this example.
- The game play terminal 300 advances the game in response to input operations by the player. Further, the game play terminal 300 sequentially distributes information generated by the player's game play (hereinafter, game progress information) to the server 200 in real time.
- the server 200 transmits the game progress information (second data) received in real time from the game play terminal 300 to the user terminal 100. Further, the server 200 mediates the transmission / reception of various information between the user terminal 100, the game play terminal 300, and the distribution terminal 400.
- the distribution terminal 400 generates operation instruction data (first data) in response to an input operation by the user of the distribution terminal 400, and distributes the operation instruction data to the user terminal 100 via the server 200.
- the operation instruction data is data for playing a moving image on the user terminal 100, and specifically, is data for operating a character appearing in the moving image.
- the game play terminal 300 and the distribution terminal 400 are separate devices.
- the game play terminal 300 and the distribution terminal 400 may be an integrated device.
- the user of the distribution terminal 400 is a player of this game.
- The moving image played on the user terminal 100 based on the operation instruction data is a moving image in which the character operated by the player in the game moves. "Moving" here means moving at least a part of the character's body, and includes speaking. Therefore, the operation instruction data according to the present embodiment includes, for example, voice data for causing the character to speak and motion data for moving the character's body.
- the operation instruction data is transmitted to the user terminal 100 after the end of this game.
- the details of the operation instruction data and the moving image played based on the operation instruction data will be described later.
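- As an informal illustration, the operation instruction data might be represented roughly as follows; the field names, the keyframe layout, and the playback stub are assumptions, since the disclosure only states that the data includes voice data and motion data for the character.

```typescript
// Illustrative shape only; field names and structure are assumptions.
interface MotionKeyframe {
  timeMs: number;                                   // time from the start of the clip
  bones: Record<string, [number, number, number]>;  // bone name -> rotation (e.g. Euler angles)
}

interface OperationInstructionData {
  characterId: string;          // character (avatar object) to be operated
  motionData: MotionKeyframe[]; // motion data for moving the character's body
  voiceData: ArrayBuffer;       // voice data for causing the character to speak
}

// The user terminal would render a moving image by applying each keyframe to the
// character while playing back the voice data in parallel (stubbed here).
function playMovingImage(data: OperationInstructionData): void {
  for (const frame of data.motionData) {
    console.log(`t=${frame.timeMs}ms: pose ${data.characterId}`, frame.bones);
  }
}

playMovingImage({
  characterId: "avatar-1",
  motionData: [{ timeMs: 0, bones: { head: [0, 0, 0] } }],
  voiceData: new ArrayBuffer(0),
});
```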
- the user terminal 100 receives game progress information in real time, and uses the information to generate and display a game screen. In other words, the user terminal 100 reproduces the game screen of the game being played by the player by real-time rendering. As a result, the user of the user terminal 100 can visually recognize the same game screen as the game screen that the player is visually recognizing while playing the game at substantially the same timing as the player.
- the user terminal 100 generates information for supporting the progress of the game by the player in response to the input operation by the user, and transmits the information to the game play terminal 300 via the server 200. Details of the information will be described later.
- the user terminal 100 receives the operation instruction data from the distribution terminal 400, and generates and reproduces a moving image (video) using the operation instruction data. In other words, the user terminal 100 renders and reproduces the operation instruction data.
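- A minimal sketch of the user terminal's receiving side is shown below, assuming an illustrative message format: game progress information is rendered into a game screen in real time, while operation instruction data is handed to a separate moving-image playback path. None of the type or function names are taken from the disclosure.

```typescript
// Illustrative only; message shapes and handler names are assumptions.
interface GameProgressInfo { kind: "progress"; characterPose: unknown; }
interface OperationInstruction { kind: "instruction"; motionData: unknown; voiceData: unknown; }
type ServerMessage = GameProgressInfo | OperationInstruction;

function renderGameScreen(info: GameProgressInfo): void {
  // Real-time rendering: reproduce the same game screen the player is seeing.
  console.log("render frame from game progress info", info.characterPose);
}

function playVideo(data: OperationInstruction): void {
  // Render and reproduce the moving image from the operation instruction data.
  console.log("play moving image", data.motionData);
}

function onServerMessage(msg: ServerMessage): void {
  if (msg.kind === "progress") renderGameScreen(msg);
  else playVideo(msg);
}

onServerMessage({ kind: "progress", characterPose: { x: 0, y: 0 } });
```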
- FIG. 2 is a diagram showing a hardware configuration of the user terminal 100.
- FIG. 3 is a diagram showing a hardware configuration of the server 200.
- FIG. 4 is a diagram showing a hardware configuration of the game play terminal 300.
- FIG. 5 is a diagram showing a hardware configuration of the distribution terminal 400.
- the user terminal 100 is not limited to the smartphone.
- the user terminal 100 may be realized as a feature phone, a tablet computer, a laptop computer (so-called notebook computer), a desktop computer, or the like.
- the user terminal 100 may be a game device suitable for game play.
- The user terminal 100 includes a processor 10, a memory 11, a storage 12, a communication interface (IF) 13, an input/output IF 14, a touch screen 15 (display unit), a camera 17, and a distance measuring sensor 18. These components of the user terminal 100 are electrically connected to each other by a communication bus.
- the user terminal 100 may be provided with an input / output IF 14 to which a display (display unit) configured separately from the user terminal 100 main body can be connected in place of or in addition to the touch screen 15.
- the user terminal 100 may be configured to be communicable with one or more controllers 1020.
- the controller 1020 establishes communication with the user terminal 100 according to a communication standard such as Bluetooth (registered trademark).
- the controller 1020 may have one or more buttons or the like, and transmits an output value based on a user's input operation to the buttons or the like to the user terminal 100.
- the controller 1020 may have various sensors such as an acceleration sensor and an angular velocity sensor, and transmits the output values of the various sensors to the user terminal 100.
- the controller 1020 may have the camera 17 and the distance measuring sensor 18.
- the user terminal 100 causes a user who uses the controller 1020 to input user identification information such as the user's name or login ID via the controller 1020, for example, at the start of a game.
- As a result, the user terminal 100 can associate the controller 1020 with that user, and can identify which user an output value belongs to based on the source of the received output value (the controller 1020).
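- A minimal sketch of this association, assuming illustrative identifiers and a simple in-memory registry that the disclosure does not describe:

```typescript
// Illustrative only; identifiers and the registry shape are assumptions.
class ControllerRegistry {
  private controllerToUser = new Map<string, string>();

  // Called when the user enters a name or login ID via the controller at game start.
  register(controllerId: string, userId: string): void {
    this.controllerToUser.set(controllerId, userId);
  }

  // Identify which user an output value belongs to from its source controller.
  handleOutput(controllerId: string, value: number): void {
    const userId = this.controllerToUser.get(controllerId);
    if (!userId) return;
    console.log(`input ${value} from user ${userId}`);
  }
}

const registry = new ControllerRegistry();
registry.register("ctrl-1", "alice");
registry.handleOutput("ctrl-1", 42);
```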
- When the user terminal 100 communicates with a plurality of controllers 1020, multiplayer can be realized with one user terminal 100 by having each user grip a controller 1020, without the user terminal 100 communicating with other devices such as the server 200 via the network 2.
- Further, local multiplayer with a plurality of user terminals 100 can also be realized by having the user terminals 100 communicate with each other according to a wireless standard such as a wireless LAN (Local Area Network) standard (that is, with a communication connection made without going through the server 200).
- When the above-mentioned multiplayer is realized locally with one user terminal 100, the user terminal 100 may further include at least a part of the various functions of the server 200 described later. Further, when the above-mentioned multiplayer is realized locally with a plurality of user terminals 100, those user terminals 100 may be provided with at least a part of the various functions of the server 200 described later in a distributed manner.
- the user terminal 100 may communicate with the server 200.
- For example, information indicating a play result, such as a score or a win or loss in a certain game, may be associated with user identification information and transmitted to the server 200.
- the controller 1020 may be configured to be detachable from the user terminal 100.
- a coupling portion with the controller 1020 may be provided on at least one surface of the housing of the user terminal 100.
- the user terminal 100 may accept the attachment of a storage medium 1030 such as an external memory card via the input / output IF14. As a result, the user terminal 100 can read the program and data recorded in the storage medium 1030.
- the program recorded on the storage medium 1030 is, for example, a game program.
- The user terminal 100 may store, in the memory 11, a game program acquired by communicating with an external device such as the server 200, or may store, in the memory 11, a game program acquired by reading it from the storage medium 1030.
- the user terminal 100 includes a communication IF 13, an input / output IF 14, a touch screen 15, a camera 17, and a distance measuring sensor 18 as an example of a mechanism for inputting information to the user terminal 100.
- Each of the above-mentioned parts serving as an input mechanism can be regarded as an operation unit configured to accept the user's input operation.
- When the operation unit is configured by at least one of the camera 17 and the distance measuring sensor 18, the operation unit detects an object 1010 in the vicinity of the user terminal 100 and identifies an input operation from the detection result of the object.
- As an example, the user's hand, a marker having a predetermined shape, or the like is detected as the object 1010, and an input operation is identified based on the color, shape, movement, or type of the object 1010 obtained as the detection result.
- More specifically, when the user's hand is detected from the captured image, the user terminal 100 identifies a gesture (a series of movements of the user's hand) detected based on the captured image and accepts it as the user's input operation.
- the captured image may be a still image or a moving image.
- the user terminal 100 identifies and accepts the user's operation performed on the input unit 151 of the touch screen 15 as the user's input operation.
- When the operation unit is configured by the communication IF 13, the user terminal 100 identifies and accepts a signal (for example, an output value) transmitted from the controller 1020 as the user's input operation.
- When the operation unit is configured by the input/output IF 14, a signal output from an input device (not shown) other than the controller 1020 connected to the input/output IF 14 is identified and accepted as the user's input operation.
- the server 200 may be a general-purpose computer such as a workstation or a personal computer.
- the server 200 includes a processor 20, a memory 21, a storage 22, a communication IF 23, and an input / output IF 24. These configurations of the server 200 are electrically connected to each other by a communication bus.
- The game play terminal 300 may be a general-purpose computer such as a personal computer.
- the game play terminal 300 includes a processor 30, a memory 31, a storage 32, a communication IF 33, and an input / output IF 34. These configurations of the gameplay terminal 300 are electrically connected to each other by a communication bus.
- In the present embodiment, the game play terminal 300 is included in an HMD (Head Mounted Display) set 1000 as an example. That is, it can be said that the HMD set 1000 is included in the system 1, and that the player plays the game using the HMD set 1000.
- the device for the player to play the game is not limited to the HMD set 1000.
- the device may be any device that allows the player to experience the game virtually.
- For example, the device may be realized as a smartphone, a feature phone, a tablet computer, a laptop computer (so-called notebook computer), a desktop computer, or the like.
- the device may be a game device suitable for game play.
- the HMD set 1000 includes a game play terminal 300, an HMD 500, an HMD sensor 510, a motion sensor 520, a display 530, and a controller 540.
- the HMD 500 includes a monitor 51, a gaze sensor 52, a first camera 53, a second camera 54, a microphone 55, and a speaker 56.
- the controller 540 may include a motion sensor 520.
- the HMD 500 is mounted on the player's head and may provide the player with a virtual space during operation. More specifically, the HMD 500 displays an image for the right eye and an image for the left eye on the monitor 51, respectively. When each eye of the player visually recognizes the respective image, the player can recognize the image as a three-dimensional image based on the parallax of both eyes.
- The HMD 500 may be either a so-called head-mounted display provided with a monitor, or a head-mounted device to which a terminal having a monitor, such as a smartphone, can be attached.
- the monitor 51 is realized as, for example, a non-transparent display device.
- the monitor 51 is arranged on the main body of the HMD 500 so as to be located in front of both eyes of the player. Therefore, the player can immerse himself in the virtual space when he / she visually recognizes the three-dimensional image displayed on the monitor 51.
- the virtual space includes, for example, a background, a player-operable object, and a player-selectable menu image.
- the monitor 51 can be realized as a liquid crystal monitor or an organic EL (Electro Luminescence) monitor included in a so-called smartphone or other information display terminal.
- the monitor 51 can be realized as a transmissive display device.
- the HMD 500 may be an open type such as a glasses type, not a closed type that covers the player's eyes as shown in FIG.
- the transmissive monitor 51 may be temporarily configured as a non-transparent display device by adjusting its transmittance.
- the monitor 51 may include a configuration that simultaneously displays a part of the image constituting the virtual space and the real space.
- the monitor 51 may display an image of the real space taken by the camera mounted on the HMD 500, or may make the real space visible by setting a part of the transmittance to be high.
- the monitor 51 may include a sub monitor for displaying an image for the right eye and a sub monitor for displaying an image for the left eye.
- the monitor 51 may be configured to display the image for the right eye and the image for the left eye as a unit.
- the monitor 51 includes a high-speed shutter. The high-speed shutter operates so that the image for the right eye and the image for the left eye can be alternately displayed so that the image is recognized by only one of the eyes.
- the HMD 500 includes a plurality of light sources (not shown). Each light source is realized by, for example, an LED (Light Emitting Diode) that emits infrared rays.
- the HMD sensor 510 has a position tracking function for detecting the movement of the HMD 500. More specifically, the HMD sensor 510 reads a plurality of infrared rays emitted by the HMD 500 and detects the position and inclination of the HMD 500 in the real space.
- the HMD sensor 510 may be realized by a camera.
- the HMD sensor 510 can detect the position and tilt of the HMD 500 by executing the image analysis process using the image information of the HMD 500 output from the camera.
- the HMD 500 may include a sensor (not shown) as a position detector in place of the HMD sensor 510 or in addition to the HMD sensor 510.
- the HMD 500 can use the sensor to detect the position and tilt of the HMD 500 itself.
- The sensor may be, for example, an angular velocity sensor, a geomagnetic sensor, or an acceleration sensor.
- the HMD 500 may use any of these sensors instead of the HMD sensor 510 to detect its position and tilt.
- the angular velocity sensor detects the angular velocity around the three axes of the HMD 500 in real space over time.
- the HMD 500 calculates the temporal change of the angle around the three axes of the HMD 500 based on each angular velocity, and further calculates the inclination of the HMD 500 based on the temporal change of the angle.
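- A minimal sketch of this calculation, assuming naive per-axis integration of the angular velocity over a frame interval; a real HMD would typically use quaternion integration and sensor fusion, which this description does not address:

```typescript
// Illustrative only: naive per-axis integration of angular velocity.
interface Angles { x: number; y: number; z: number; } // accumulated angles around three axes (rad)

function integrateAngularVelocity(
  current: Angles,
  angularVelocity: Angles, // rad/s around the three axes of the HMD
  dtSeconds: number
): Angles {
  return {
    x: current.x + angularVelocity.x * dtSeconds,
    y: current.y + angularVelocity.y * dtSeconds,
    z: current.z + angularVelocity.z * dtSeconds,
  };
}

let tilt: Angles = { x: 0, y: 0, z: 0 };
tilt = integrateAngularVelocity(tilt, { x: 0.1, y: 0, z: 0 }, 0.016); // one 16 ms frame
console.log(tilt);
```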
- the gaze sensor 52 detects the direction in which the line of sight of the player's right eye and left eye is directed. That is, the gaze sensor 52 detects the line of sight of the player. Detection of the direction of the line of sight is realized, for example, by a known eye tracking function.
- the gaze sensor 52 is realized by a sensor having the eye tracking function.
- the gaze sensor 52 preferably includes a sensor for the right eye and a sensor for the left eye.
- the gaze sensor 52 may be, for example, a sensor that detects the angle of rotation of each eyeball by irradiating the player's right eye and left eye with infrared light and receiving the reflected light from the cornea and the iris with respect to the irradiation light.
- the gaze sensor 52 can detect the line of sight of the player based on each detected rotation angle.
- the first camera 53 photographs the lower part of the player's face. More specifically, the first camera 53 photographs the nose, mouth, and the like of the player.
- the second camera 54 photographs the eyes and eyebrows of the player.
- Here, the side of the housing of the HMD 500 facing the player is defined as the inside of the HMD 500, and the side of the housing facing away from the player is defined as the outside of the HMD 500.
- the first camera 53 may be located outside the HMD 500 and the second camera 54 may be located inside the HMD 500.
- the images generated by the first camera 53 and the second camera 54 are input to the game play terminal 300.
- the first camera 53 and the second camera 54 may be realized as one camera, and the player's face may be photographed by this one camera.
- the microphone 55 converts the player's utterance into an audio signal (electrical signal) and outputs it to the game play terminal 300.
- the speaker 56 converts the voice signal into voice and outputs it to the player.
- the HMD 500 may include earphones instead of the speaker 56.
- the controller 540 is connected to the game play terminal 300 by wire or wirelessly.
- the controller 540 receives an input of a command from the player to the game play terminal 300.
- the controller 540 is configured to be grippable by the player.
- the controller 540 is configured to be wearable on a player's body or a portion of clothing.
- the controller 540 may be configured to output at least one of vibration, sound, and light based on the signal transmitted from the gameplay terminal 300.
- the controller 540 receives from the player an operation for controlling the position and movement of an object arranged in the virtual space.
- the controller 540 includes a plurality of light sources. Each light source is realized, for example, by an LED that emits infrared rays.
- the HMD sensor 510 has a position tracking function. In this case, the HMD sensor 510 reads a plurality of infrared rays emitted by the controller 540 and detects the position and tilt of the controller 540 in the real space.
- the HMD sensor 510 may be implemented by a camera. In this case, the HMD sensor 510 can detect the position and tilt of the controller 540 by executing the image analysis process using the image information of the controller 540 output from the camera.
- In one aspect, the motion sensor 520 is attached to the player's hand and detects the movement of the player's hand. For example, the motion sensor 520 detects the rotation speed, the number of rotations, and the like of the hand. The detected signal is sent to the game play terminal 300.
- the motion sensor 520 is provided in the controller 540, for example.
- the motion sensor 520 is provided, for example, in a controller 540 configured to be grippable by the player.
- As an example, the controller 540 is preferably of a type that is worn on the player's hand, such as a glove type, so that it does not easily fly off.
- a sensor not attached to the player may detect the movement of the player's hand.
- the signal of the camera that shoots the player may be input to the game play terminal 300 as a signal indicating the operation of the player.
- the motion sensor 520 and the game play terminal 300 are wirelessly connected to each other.
- the communication mode is not particularly limited, and for example, Bluetooth or other known communication method is used.
- the display 530 displays an image similar to the image displayed on the monitor 51. As a result, users other than the player wearing the HMD 500 can view the same image as the player.
- the image displayed on the display 530 does not have to be a three-dimensional image, and may be an image for the right eye or an image for the left eye. Examples of the display 530 include a liquid crystal display and an organic EL monitor.
- the game play terminal 300 operates a character to be operated by the player based on various information acquired from each part of the HMD 500, the controller 540, and the motion sensor 520, and advances the game.
- The "movement" here includes moving each part of the body, changing posture, changing facial expressions, moving around, speaking, touching and moving objects placed in the virtual space, and grasping and using weapons, tools, and the like held by the character. That is, in this game, when the player moves each part of the body, the character also moves the corresponding part of its body in the same manner as the player. Further, in this game, the character speaks the content spoken by the player. In other words, in this game, the character is an avatar object that behaves as the player's alter ego. As an example, at least a part of the character's actions may be performed by the player's input to the controller 540.
- the motion sensor 520 is attached to, for example, both hands of the player, both feet of the player, the waist of the player, and the head of the player.
- the motion sensors 520 attached to both hands of the player may be provided in the controller 540.
- the motion sensor 520 attached to the head of the player may be provided in the HMD 500.
- The motion sensor 520 may also be attached to the player's elbows and knees. By increasing the number of motion sensors 520 attached to the player, the player's movements can be reflected in the character more accurately.
- the player may wear a suit to which one or more motion sensors 520 are attached instead of attaching the motion sensor 520 to each part of the body. That is, the method of motion capture is not limited to the example using the motion sensor 520.
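- A minimal sketch of applying one frame of motion-sensor readings to the avatar object, assuming an illustrative set of tracked body parts, pose format, and avatar API that the disclosure does not prescribe:

```typescript
// Illustrative only; body-part names, pose format, and the avatar API are assumptions.
type BodyPart = "head" | "leftHand" | "rightHand" | "waist" | "leftFoot" | "rightFoot";

interface Pose {
  position: [number, number, number];
  rotation: [number, number, number];
}

// One frame of readings from the motion sensors 520 attached to the player.
type MotionFrame = Partial<Record<BodyPart, Pose>>;

class AvatarObject {
  setBonePose(bone: BodyPart, pose: Pose): void {
    console.log(`bone ${bone} ->`, pose);
  }
}

// The character moves each part of its body in the same way as the player.
function applyMotionFrame(avatar: AvatarObject, frame: MotionFrame): void {
  for (const [part, pose] of Object.entries(frame) as [BodyPart, Pose][]) {
    avatar.setBonePose(part, pose);
  }
}

applyMotionFrame(new AvatarObject(), {
  head: { position: [0, 1.6, 0], rotation: [0, 0, 0] },
});
```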
- the distribution terminal 400 may be a mobile terminal such as a smartphone, a PDA (Personal Digital Assistant), or a tablet computer. Further, the distribution terminal 400 may be a so-called stationary terminal such as a desktop personal computer.
- the distribution terminal 400 includes a processor 40, a memory 41, a storage 42, a communication IF 43, an input / output IF 44, and a touch screen 45.
- the distribution terminal 400 may include an input / output IF 44 to which a display (display unit) configured separately from the distribution terminal 400 main body can be connected in place of or in addition to the touch screen 45.
- Controller 1021 may have one or more physical input mechanisms such as buttons, levers, sticks, and wheels.
- the controller 1021 transmits an output value based on an input operation input to the input mechanism by the operator of the distribution terminal 400 (player in the present embodiment) to the distribution terminal 400.
- the controller 1021 may have various sensors such as an acceleration sensor and an angular velocity sensor, and may transmit the output values of the various sensors to the distribution terminal 400. The above output value is accepted by the distribution terminal 400 via the communication IF43.
- the distribution terminal 400 may include a camera and a distance measuring sensor (both not shown).
- the controller 1021 may have a camera and a distance measuring sensor.
- the distribution terminal 400 includes a communication IF 43, an input / output IF 44, and a touch screen 45 as an example of a mechanism for inputting information to the distribution terminal 400.
- Each of the above-mentioned parts as an input mechanism can be regarded as an operation part configured to accept a user's input operation.
- the distribution terminal 400 identifies and accepts the user's operation performed on the input unit 451 of the touch screen 45 as the user's input operation.
- the distribution terminal 400 identifies and accepts a signal (for example, an output value) transmitted from the controller 1021 as an input operation of the user.
- the distribution terminal 400 identifies and accepts a signal output from an input device (not shown) connected to the input / output IF44 as a user's input operation.
- the processors 10, 20, 30, and 40 control the overall operation of the user terminal 100, the server 200, the game play terminal 300, and the distribution terminal 400, respectively.
- Processors 10, 20, 30, and 40 include a CPU (Central Processing Unit), an MPU (Micro Processing Unit), and a GPU (Graphics Processing Unit).
- Processors 10, 20, 30, and 40 read programs from storages 12, 22, 32, and 42, which will be described later, respectively. Then, the processors 10, 20, 30, and 40 expand the read programs into the memories 11, 21, 31, and 41, which will be described later, respectively.
- The processors 10, 20, 30, and 40 execute the expanded programs.
- the memories 11, 21, 31, and 41 are the main storage devices.
- The memories 11, 21, 31, and 41 are composed of storage devices such as a ROM (Read Only Memory) and a RAM (Random Access Memory).
- the memory 11 provides a work area to the processor 10 by temporarily storing a program and various data read from the storage 12 described later by the processor 10.
- the memory 11 also temporarily stores various data generated while the processor 10 is operating according to the program.
- the memory 21 provides a work area to the processor 20 by temporarily storing various programs and data read from the storage 22 described later by the processor 20.
- the memory 21 also temporarily stores various data generated while the processor 20 is operating according to the program.
- the memory 31 provides a work area to the processor 30 by temporarily storing various programs and data read from the storage 32 described later by the processor 30.
- the memory 31 also temporarily stores various data generated while the processor 30 is operating according to the program.
- the memory 41 provides a work area to the processor 40 by temporarily storing the program and various data read from the storage 42 described later by the processor 40.
- the memory 41 also temporarily stores various data generated while the processor 40 is operating according to the program.
- the program executed by the processors 10 and 30 may be the game program of this game.
- the program executed by the processor 40 may be a distribution program for realizing distribution of operation instruction data.
- the processor 10 may further execute a viewing program for realizing the reproduction of the moving image.
- the program executed by the processor 20 may be at least one of the above-mentioned game program, distribution program, and viewing program.
- the processor 20 executes at least one of a game program, a distribution program, and a viewing program in response to a request from at least one of a user terminal 100, a game play terminal 300, and a distribution terminal 400.
- the distribution program and the viewing program may be executed in parallel.
- the game program may be a program that realizes the game by the cooperation of the user terminal 100, the server 200, and the game play terminal 300.
- the distribution program may be a program that realizes distribution of operation instruction data by cooperation between the server 200 and the distribution terminal 400.
- the viewing program may be a program that realizes the reproduction of the moving image by the cooperation between the user terminal 100 and the server 200.
- Storages 12, 22, 32, 42 are auxiliary storage devices.
- the storages 12, 22, 32, and 42 are composed of a storage device such as a flash memory or an HDD (Hard Disk Drive).
- various data related to the game are stored in the storages 12 and 32.
- Various data related to the distribution of the operation instruction data are stored in the storage 42.
- various data related to the reproduction of the moving image are stored in the storage 12.
- the storage 22 may store at least a part of various data related to the game, the distribution of the operation instruction data, and the reproduction of the moving image.
- the communication IFs 13, 23, 33, and 43 control the transmission and reception of various data in the user terminal 100, the server 200, the game play terminal 300, and the distribution terminal 400, respectively.
- The communication IFs 13, 23, 33, and 43 control, for example, communication via a wireless LAN (Local Area Network), Internet communication via a wired LAN, a wireless LAN, or a mobile phone network, communication using short-range wireless communication, and the like.
- the input / output IFs 14, 24, 34, and 44 are interfaces for the user terminal 100, the server 200, the game play terminal 300, and the distribution terminal 400 to receive data input and to output data, respectively.
- the input / output IFs 14, 24, 34, and 44 may input / output data via USB (Universal Serial Bus) or the like.
- Input / output IFs 14, 24, 34, 44 may include physical buttons, cameras, microphones, speakers, mice, keyboards, displays, sticks, levers and the like. Further, the input / output IFs 14, 24, 34, 44 may include a connection unit for transmitting / receiving data to / from a peripheral device.
- the touch screen 15 is an electronic component that combines an input unit 151 and a display unit 152 (display).
- the touch screen 45 is an electronic component that combines an input unit 451 and a display unit 452.
- the input units 151 and 451 are, for example, touch-sensitive devices, and are configured by, for example, a touch pad.
- the display units 152 and 452 are configured by, for example, a liquid crystal display, an organic EL (Electro-Luminescence) display, or the like.
- The input units 151 and 451 have a function of detecting the position where a user's operation (mainly a physical contact operation such as a touch operation, a slide operation, a swipe operation, or a tap operation) is input on the input surface, and transmitting information indicating that position as an input signal.
- the input units 151 and 451 may include touch sensing units (not shown).
- the touch sensing unit may adopt any method such as a capacitance method or a resistance film method.
- the user terminal 100 and the distribution terminal 400 may be provided with one or more sensors for specifying the holding posture of the user terminal 100 and the distribution terminal 400, respectively.
- This sensor may be, for example, an acceleration sensor, an angular velocity sensor, or the like.
- When the user terminal 100 and the distribution terminal 400 each include such a sensor, the processors 10 and 40 can each identify the holding posture of the user terminal 100 or the distribution terminal 400 from the output of the sensor and perform processing according to that holding posture.
- For example, when the user terminal 100 or the distribution terminal 400 is held vertically, the processors 10 and 40 may display a vertically long image on the display units 152 and 452 as a vertical screen display; when the terminal is held horizontally, a horizontally long image may be displayed on the display unit as a horizontal screen display.
- That is, the processors 10 and 40 may be able to switch between the vertical screen display and the horizontal screen display according to the holding postures of the user terminal 100 and the distribution terminal 400, respectively.
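- A minimal sketch of choosing between the vertical and horizontal screen display from an acceleration sensor reading, assuming gravity dominates the measurement while the terminal is held still; the comparison rule is an illustrative choice, not taken from the disclosure:

```typescript
// Illustrative only; the use of raw acceleration and the comparison rule are assumptions.
type ScreenOrientation = "verticalScreen" | "horizontalScreen";

// ax, ay: acceleration along the terminal's short and long axes (including gravity).
function holdingPosture(ax: number, ay: number): ScreenOrientation {
  // If gravity mostly runs along the long axis, the terminal is held vertically.
  return Math.abs(ay) >= Math.abs(ax) ? "verticalScreen" : "horizontalScreen";
}

function updateDisplay(orientation: ScreenOrientation): void {
  if (orientation === "verticalScreen") {
    console.log("display a vertically long image (vertical screen display)");
  } else {
    console.log("display a horizontally long image (horizontal screen display)");
  }
}

updateDisplay(holdingPosture(0.2, 9.7)); // held vertically
updateDisplay(holdingPosture(9.7, 0.3)); // held horizontally
```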
- FIG. 6 is a block diagram showing a functional configuration of a user terminal 100, a server 200, and an HMD set 1000 included in the system 1.
- FIG. 7 is a block diagram showing a functional configuration of the distribution terminal 400 shown in FIG.
- the user terminal 100 has a function as an input device that accepts a user's input operation and a function as an output device that outputs a game image or sound.
- the user terminal 100 functions as a control unit 110 and a storage unit 120 in cooperation with a processor 10, a memory 11, a storage 12, a communication IF 13, an input / output IF 14, a touch screen 15, and the like.
- the server 200 has a function of mediating the transmission / reception of various information between the user terminal 100, the HMD set 1000, and the distribution terminal 400.
- the server 200 functions as a control unit 210 and a storage unit 220 by the cooperation of the processor 20, the memory 21, the storage 22, the communication IF23, the input / output IF24, and the like.
- The HMD set 1000 (game play terminal 300) has a function as an input device that receives the player's input operations, a function as an output device that outputs game images and sounds, and a function of transmitting game progress information to the user terminal 100 in real time via the server 200.
- The HMD set 1000 functions as a control unit 310 and a storage unit 320 through the cooperation of the processor 30, the memory 31, the storage 32, the communication IF 33, and the input/output IF 34 of the game play terminal 300 with the HMD 500, the HMD sensor 510, the motion sensor 520, the controller 540, and the like.
- the distribution terminal 400 has a function of generating operation instruction data and transmitting the operation instruction data to the user terminal 100 via the server 200.
- the distribution terminal 400 functions as a control unit 410 and a storage unit 420 in cooperation with a processor 40, a memory 41, a storage 42, a communication IF 43, an input / output IF 44, a touch screen 45, and the like.
- the storage unit 120 stores the game program 131 (program), the game information 132, and the user information 133.
- the storage unit 220 stores the game program 231, the game information 232, the user information 233, and the user list 234.
- the storage unit 320 stores the game program 331, the game information 332, and the user information 333.
- the storage unit 420 stores the user list 421, the motion list 422, and the distribution program 423 (program, second program).
- the game programs 131, 231 and 331 are game programs executed by the user terminal 100, the server 200, and the HMD set 1000, respectively. This game is realized by the cooperative operation of each device based on the game programs 131, 231 and 331.
- the game programs 131 and 331 may be stored in the storage unit 220 and downloaded to the user terminal 100 and the HMD set 1000, respectively.
- the user terminal 100 renders the data received from the distribution terminal 400 based on the game program 131, and reproduces the moving image.
- In other words, the game program 131 is also a program for playing a moving image using the operation instruction data distributed from the distribution terminal 400.
- the program for playing the moving image may be different from the game program 131.
- the storage unit 120 stores a program for playing the moving image separately from the game program 131.
- the game information 132, 232, and 332 are data referred to when the user terminal 100, the server 200, and the HMD set 1000 execute the game program, respectively.
- the user information 133, 233, and 333 are data relating to the user account of the user terminal 100.
- The game information 232 includes the game information 132 of each user terminal 100 and the game information 332 of the HMD set 1000.
- The user information 233 includes the user information 133 of each user terminal 100 and the player's user information included in the user information 333.
- The user information 333 includes the user information 133 of each user terminal 100 and the player's user information.
- the user list 234 and the user list 421 are a list of users who have participated in the game.
- the user list 234 and the user list 421 may include a list of users who participated in the most recent gameplay by the player, as well as a list of users who participated in each gameplay before the gameplay.
- the motion list 422 is a list of a plurality of motion data created in advance.
- the motion list 422 is, for example, a list in which motion data is associated with each of the information (for example, a motion name) that identifies each motion.
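- A minimal sketch of such a motion list, assuming it can be represented as a mapping from a motion name to motion data; the MotionData shape is an assumption:

```typescript
// Illustrative only; the MotionData shape is an assumption.
interface MotionData {
  durationMs: number;
  keyframes: { timeMs: number; bones: Record<string, [number, number, number]> }[];
}

// Motion list 422: motion name -> motion data created in advance.
const motionList = new Map<string, MotionData>([
  ["wave", { durationMs: 1000, keyframes: [{ timeMs: 0, bones: { rightHand: [0, 0, 1] } }] }],
  ["bow",  { durationMs: 1500, keyframes: [{ timeMs: 0, bones: { spine: [0.5, 0, 0] } }] }],
]);

function selectMotion(name: string): MotionData | undefined {
  return motionList.get(name);
}

console.log(selectMotion("wave")?.durationMs);
```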
- the distribution program 423 is a program for realizing distribution of operation instruction data for playing a moving image on the user terminal 100 to the user terminal 100.
- the control unit 210 comprehensively controls the server 200 by executing the game program 231 stored in the storage unit 220. For example, the control unit 210 mediates the transmission / reception of various information between the user terminal 100, the HMD set 1000, and the distribution terminal 400.
- the control unit 210 functions as a communication mediation unit 211, a log generation unit 212, and a list generation unit 213 according to the description of the game program 231.
- the control unit 210 can also function as another functional block (not shown) for mediating the transmission / reception of various information related to game play and distribution of operation instruction data, and for supporting the progress of the game.
- the communication mediation unit 211 mediates the transmission and reception of various information between the user terminal 100, the HMD set 1000, and the distribution terminal 400. For example, the communication mediation unit 211 transmits the game progress information received from the HMD set 1000 to the user terminal 100.
- the game progress information includes data indicating the movement of the character operated by the player, the parameters of the character, information on items and weapons possessed by the character, enemy characters, and the like.
- the server 200 transmits the game progress information to the user terminals 100 of all the users participating in the game. In other words, the server 200 transmits common game progress information to the user terminals 100 of all users participating in the game. As a result, the game progresses in the same manner as the HMD set 1000 on each of the user terminals 100 of all the users participating in the game.
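- A minimal sketch of this broadcast, assuming an illustrative connection abstraction and message encoding that the disclosure does not specify:

```typescript
// Illustrative only; the connection abstraction and message shape are assumptions.
interface UserConnection {
  userId: string;
  send(message: string): void;
}

function broadcastGameProgress(connections: UserConnection[], gameProgressInfo: object): void {
  const message = JSON.stringify(gameProgressInfo); // common game progress information
  for (const conn of connections) {
    conn.send(message); // every participating user terminal receives the same data
  }
}

const connections: UserConnection[] = [
  { userId: "u1", send: (m) => console.log("to u1:", m) },
  { userId: "u2", send: (m) => console.log("to u2:", m) },
];
broadcastGameProgress(connections, { characterPose: { x: 1, y: 2 } });
```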
- the communication mediation unit 211 transmits the information received from any one of the user terminals 100 to support the progress of the game by the player to the other user terminals 100 and the HMD set 1000.
- As an example, the information may be item information indicating an item provided to the player, that is, an item that allows the player to advance the game advantageously.
- the item information includes information (user name, user ID, etc.) indicating the user who provided the item.
- the communication mediation unit 211 may mediate the distribution of the operation instruction data from the distribution terminal 400 to the user terminal 100.
- the log generation unit 212 generates a game progress log based on the game progress information received from the HMD set 1000.
- the list generation unit 213 generates the user list 234 after the end of the game play. Although the details will be described later, each user in the user list 234 is associated with a tag indicating the content of the support provided by the user to the player.
- the list generation unit 213 generates a tag based on the game progress log generated by the log generation unit 212, and associates it with the corresponding user.
- Alternatively, the list generation unit 213 may associate with the corresponding user, as a tag, the content of the support each user provided to the player, as input by the game operator or the like using a terminal device such as a personal computer.
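- A minimal sketch of building the tagged user list from a game progress log, assuming illustrative log-entry fields and tag wording; the disclosure only states that a tag indicates the content of the support provided by the user:

```typescript
// Illustrative only; log-entry fields and tag wording are assumptions.
interface SupportLogEntry {
  userId: string;
  itemName: string; // e.g. an item the user provided to the player
}

// user ID -> tags describing the support that user provided
function buildUserList(log: SupportLogEntry[]): Map<string, string[]> {
  const userList = new Map<string, string[]>();
  for (const entry of log) {
    const tags = userList.get(entry.userId) ?? [];
    tags.push(`provided ${entry.itemName}`);
    userList.set(entry.userId, tags);
  }
  return userList;
}

console.log(buildUserList([
  { userId: "u1", itemName: "potion" },
  { userId: "u1", itemName: "sword" },
  { userId: "u2", itemName: "shield" },
]));
```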
- the user terminal 100 transmits information indicating the user to the server 200 based on the user's operation. For example, the user terminal 100 transmits the user ID entered by the user to the server 200. That is, the server 200 holds information indicating each user for all the users participating in the game.
- the list generation unit 213 may generate the user list 234 using the information.
- the control unit 310 comprehensively controls the HMD set 1000 by executing the game program 331 stored in the storage unit 320. For example, the control unit 310 advances the game according to the game program 331 and the operation of the player. Further, the control unit 310 communicates with the server 200 to send and receive information as needed while the game is in progress. The control unit 310 may send and receive information directly to and from the user terminal 100 without going through the server 200.
- The control unit 310 functions as an operation reception unit 311, a display control unit 312, a UI control unit 313, an animation generation unit 314, a game progress unit 315, a virtual space control unit 316, and a reaction processing unit 317 in accordance with the description of the game program 331.
- the control unit 310 can also function as another functional block (not shown) for controlling characters appearing in the game, depending on the nature of the game to be executed.
- the operation reception unit 311 detects and accepts the input operation of the player.
- the operation reception unit 311 receives signals input from the HMD 500, the motion sensor 520, the controller 540, etc., determines what kind of input operation has been performed, and outputs the result to each element of the control unit 310.
- the UI control unit 313 controls a user interface (hereinafter, UI) image to be displayed on the monitor 51, the display 530, and the like.
- A UI image is a tool for the player to make inputs necessary for the progress of the game to the HMD set 1000, or a tool for obtaining, from the HMD set 1000, information that is output during the progress of the game.
- UI images are, but are not limited to, icons, buttons, lists, menu screens, and the like.
- the animation generation unit 314 generates an animation showing the motion of various objects based on the control mode of various objects.
- For example, the animation generation unit 314 may generate an animation that expresses how an object (for example, the player's avatar object) moves, moves its mouth, or changes its facial expression as if it were actually there.
- the game progress unit 315 progresses the game based on the game program 331, the input operation by the player, the operation of the avatar object in response to the input operation, and the like. For example, the game progress unit 315 performs a predetermined game process when the avatar object performs a predetermined operation. Further, for example, the game progress unit 315 may receive information representing a user's operation on the user terminal 100 and perform game processing based on the user's operation. Further, the game progress unit 315 generates game progress information according to the progress of the game and transmits it to the server 200. The game progress information is transmitted to the user terminal 100 via the server 200. As a result, the progress of the game in the HMD set 1000 is shared in the user terminal 100. In other words, the progress of the game in the HMD set 1000 and the progress of the game in the user terminal 100 are synchronized.
- the virtual space control unit 316 performs various controls related to the virtual space provided to the player according to the progress of the game. As an example, the virtual space control unit 316 creates various objects and arranges them in the virtual space. Further, the virtual space control unit 316 arranges the virtual camera in the virtual space. Further, the virtual space control unit 316 operates various objects arranged in the virtual space according to the progress of the game. Further, the virtual space control unit 316 controls the position and inclination of the virtual camera arranged in the virtual space according to the progress of the game.
- the display control unit 312 outputs a game screen reflecting the processing results executed by each of the above-mentioned elements to the monitor 51 and the display 530.
- the display control unit 312 may display an image based on the field of view from the virtual camera arranged in the virtual space on the monitor 51 and the display 530 as a game screen. Further, the display control unit 312 may include the animation generated by the animation generation unit 314 in the game screen. Further, the display control unit 312 may superimpose and draw the above-mentioned UI image controlled by the UI control unit 313 on the game screen.
- the reaction processing unit 317 receives feedback on the reaction of the user of the user terminal 100 to the game play of the player, and outputs this to the player.
- the user terminal 100 can create a comment (message) addressed to the avatar object based on the input operation of the user.
- the reaction processing unit 317 receives the comment data of the comment and outputs it.
- the reaction processing unit 317 may display the text data corresponding to the user's comment on the monitor 51 and the display 530, or may output the voice data corresponding to the user's comment from a speaker (not shown). In the former case, the reaction processing unit 317 may superimpose and draw an image corresponding to the text data (that is, an image including the content of the comment) on the game screen.
- the control unit 110 comprehensively controls the user terminal 100 by executing the game program 131 stored in the storage unit 120. For example, the control unit 110 advances the game according to the game program 131 and the user's operation. Further, the control unit 110 communicates with the server 200 to send and receive information as needed while the game is in progress. The control unit 110 may send and receive information directly to and from the HMD set 1000 without going through the server 200.
- the control unit 110 functions as an operation reception unit 111, a display control unit 112, a UI control unit 113, an animation generation unit 114, a game progress unit 115, a virtual space control unit 116, and a moving image playback unit 117 according to the description of the game program 131.
- the control unit 110 can also function as other functional blocks (not shown) for the progress of the game, depending on the nature of the game being played.
- the operation reception unit 111 detects and accepts a user's input operation to the input unit 151.
- the operation reception unit 111 determines what input operation has been performed from the action exerted by the user on the console via the touch screen 15 and the other input/output IF 14, and outputs the result to each element of the control unit 110.
- the operation receiving unit 111 receives an input operation for the input unit 151, detects the coordinates of the input position of the input operation, and specifies the type of the input operation.
- the operation receiving unit 111 specifies, for example, a touch operation, a slide operation, a swipe operation, a tap operation, and the like as types of input operations. Further, the operation receiving unit 111 detects that the contact input is canceled from the touch screen 15 when the continuously detected input is interrupted.
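- as an illustration only, the sketch below shows one plausible way an operation reception unit could classify a finished touch sequence into a tap, swipe, or slide and detect a cancelled contact; the thresholds and the classify_gesture function are assumptions, not values specified by the embodiment.

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    x: float
    y: float
    t: float  # seconds since the touch began

def classify_gesture(samples: list,
                     tap_max_move: float = 10.0,
                     swipe_min_speed: float = 800.0) -> str:
    """Classify a finished touch sequence as 'tap', 'swipe', or 'slide'.

    The thresholds are illustrative only; a real terminal would tune them.
    An empty sample list means contact was lost, i.e. the input was cancelled.
    """
    if not samples:
        return "cancelled"
    start, end = samples[0], samples[-1]
    distance = ((end.x - start.x) ** 2 + (end.y - start.y) ** 2) ** 0.5
    duration = max(end.t - start.t, 1e-6)
    if distance < tap_max_move:
        return "tap"
    if distance / duration > swipe_min_speed:
        return "swipe"
    return "slide"

print(classify_gesture([TouchSample(0, 0, 0.0), TouchSample(2, 1, 0.08)]))    # tap
print(classify_gesture([TouchSample(0, 0, 0.0), TouchSample(300, 0, 0.15)]))  # swipe
```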
- the UI control unit 113 controls a UI image to be displayed on the display unit 152 in order to construct a UI according to at least one of a user's input operation and received game progress information.
- a UI image is a tool by which the user makes inputs to the user terminal 100 that are necessary for the progress of the game, or obtains information that the user terminal 100 outputs during the progress of the game.
- UI images include, but are not limited to, icons, buttons, lists, and menu screens.
- the animation generation unit 114 generates an animation showing the motion of various objects based on the control mode of various objects.
- the game progress unit 115 advances the game based on the game program 131, the received game progress information, the input operation by the user, and the like.
- the game progress unit 115 transmits information related to the predetermined game process to the HMD set 1000 via the server 200.
- as a result, the predetermined game process is shared with the HMD set 1000.
- the predetermined game process is, for example, a process of providing an item to the avatar object, and in this example, the information related to the game process is the item information described above.
- the virtual space control unit 116 performs various controls related to the virtual space provided to the user according to the progress of the game.
- the virtual space control unit 116 creates various objects and arranges them in the virtual space.
- the virtual space control unit 116 arranges the virtual camera in the virtual space.
- the virtual space control unit 116 operates various objects arranged in the virtual space according to the progress of the game, specifically, the received game progress information.
- the virtual space control unit 116 controls the position and inclination of the virtual camera arranged in the virtual space according to the progress of the game, specifically, the received game progress information.
- the display control unit 112 outputs to the display unit 152 a game screen that reflects the processing results executed by each of the above elements.
- the display control unit 112 may display an image based on the field of view from the virtual camera arranged in the virtual space provided to the user on the display unit 152 as a game screen. Further, the display control unit 112 may include the animation generated by the animation generation unit 114 in the game screen. Further, the display control unit 112 may superimpose and draw the above-mentioned UI image controlled by the UI control unit 113 on the game screen.
- the game screen displayed on the display unit 152 is the same game screen as the game screen displayed on the other user terminal 100 and the HMD set 1000.
- the video playback unit 117 analyzes (renders) the operation instruction data received from the distribution terminal 400, and reproduces the video.
- the control unit 410 comprehensively controls the distribution terminal 400 by executing a program (not shown) stored in the storage unit 420. For example, the control unit 410 generates operation instruction data and distributes it to the user terminal 100 according to the operation of the program and the user (player in this embodiment) of the distribution terminal 400. Further, the control unit 410 communicates with the server 200 as necessary to send and receive information. The control unit 410 may send and receive information directly to and from the user terminal 100 without going through the server 200.
- the control unit 410 functions as a communication control unit 411, a display control unit 412, an operation reception unit 413, a voice reception unit 414, a motion identification unit 415, and an operation instruction data generation unit 416 according to the description of the program.
- the control unit 410 can also function as other functional blocks (not shown) for the generation and distribution of operation instruction data.
- the communication control unit 411 controls transmission / reception of information to / from the server 200 or the user terminal 100 via the server 200.
- the communication control unit 411 receives the user list 421 from the server 200 as an example. Further, the communication control unit 411 transmits the operation instruction data to the user terminal 100 as an example.
- the display control unit 412 outputs various screens reflecting the processing results executed by each element to the display unit 452. As an example, the display control unit 412 displays a screen including the received user list 234. Further, as an example, the display control unit 412 displays a screen including a motion list 422 for allowing the player to select motion data for operating the avatar object included in the motion instruction data to be distributed.
- the operation reception unit 413 detects and accepts the input operation of the player with respect to the input unit 451.
- the operation reception unit 413 determines what kind of input operation has been performed from the action exerted by the player on the console via the touch screen 45 and the other input/output IF 44, and outputs the result to each element of the control unit 410.
- the operation receiving unit 413 receives an input operation for the input unit 451, detects the coordinates of the input position of the input operation, and specifies the type of the input operation.
- the operation reception unit 413 specifies, for example, a touch operation, a slide operation, a swipe operation, a tap operation, and the like as types of input operations. Further, the operation reception unit 413 detects that the contact input is canceled from the touch screen 45 when the continuously detected input is interrupted.
- the voice reception unit 414 receives the voice generated around the distribution terminal 400 and generates the voice data of the voice.
- the voice receiving unit 414 receives the voice spoken by the player and generates voice data of the voice.
- the motion specifying unit 415 specifies the motion data selected by the player from the motion list 422 according to the input operation of the player.
- the operation instruction data generation unit 416 generates operation instruction data.
- the operation instruction data generation unit 416 generates operation instruction data including the generated voice data and the specified motion data.
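- the following sketch is an editorial illustration of how recorded voice data and selected motion data might be bundled into operation instruction data addressed to a selected user; the data classes and field names are hypothetical and not taken from the disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MotionKeyframe:
    time: float                                         # seconds from the start of the motion
    joint_rotations: List[Tuple[float, float, float]]   # one (x, y, z) Euler triple per joint

@dataclass
class OperationInstructionData:
    """Hypothetical bundle built by an operation instruction data generation unit."""
    destination_user: str             # e.g. the user selected from the user list
    voice: bytes                      # voice data recorded from the player
    motion: List[MotionKeyframe]      # motion data selected from a motion list

def build_operation_instruction_data(user: str, voice: bytes,
                                     motion: List[MotionKeyframe]) -> OperationInstructionData:
    # Combining the recorded voice with the selected motion yields the data
    # that a user terminal can later render as a moving, speaking avatar.
    return OperationInstructionData(destination_user=user, voice=voice, motion=motion)

data = build_operation_instruction_data(
    "AAAAAA", b"...pcm...", [MotionKeyframe(0.0, [(0.0, 0.0, 0.0)])])
print(data.destination_user)
```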
- the functions of the HMD set 1000, the server 200, and the user terminal 100 shown in FIG. 6 and the functions of the distribution terminal 400 shown in FIG. 7 are merely examples.
- Each device of the HMD set 1000, the server 200, the user terminal 100, and the distribution terminal 400 may have at least a part of the functions of the other devices.
- another device other than the HMD set 1000, the server 200, the user terminal 100, and the distribution terminal 400 may be a component of the system 1, and the other device may be made to execute a part of the processing in the system 1.
- the computer that executes the game program in the present embodiment may be any of the HMD set 1000, the server 200, the user terminal 100, the distribution terminal 400, or another device, or may be realized by a combination of a plurality of these devices.
- FIG. 8 is a flowchart showing an example of a flow of control processing of the virtual space provided to the player and the virtual space provided to the user of the user terminal 100.
- FIG. 9 is a diagram showing a virtual space 600A provided to the player and a field of view image visually recognized by the player according to a certain embodiment.
- FIG. 10 is a diagram showing a virtual space 600B provided to a user of a user terminal 100 and a field of view image visually recognized by the user according to a certain embodiment.
- when it is not necessary to distinguish between the virtual spaces 600A and 600B, they are simply described as the "virtual space 600".
- in step S1, the processor 30, as the virtual space control unit 316, defines the virtual space 600A shown in FIG. 9.
- the processor 30 defines a virtual space 600A by using virtual space data (not shown).
- the virtual space data may be stored in the game play terminal 300, may be generated by the processor 30 based on the game program 331, or may be acquired by the processor 30 from an external device such as the server 200.
- the virtual space 600 has an all-sky spherical structure that covers the entire 360-degree direction of the point defined as the center.
- the celestial sphere in the upper half of the virtual space 600 is illustrated so as not to complicate the explanation.
- in step S2, the processor 30, as the virtual space control unit 316, arranges the avatar object 610 (character) in the virtual space 600A.
- the avatar object 610 is an avatar object associated with the player and operates according to the input operation of the player.
- in step S3, the processor 30, as the virtual space control unit 316, arranges other objects in the virtual space 600A.
- the processor 30 arranges the objects 631 to 634.
- the other objects can include, for example, character objects that operate according to the game program 331 (so-called non-player characters, NPCs), operation objects such as virtual hands, and objects imitating animals, plants, artificial objects, and natural objects that are arranged as the game progresses.
- in step S4, the processor 30, as the virtual space control unit 316, arranges the virtual camera 620A in the virtual space 600A. As an example, the processor 30 arranges the virtual camera 620A at the position of the head of the avatar object 610.
- in step S5, the processor 30 displays the field of view image 650 on the monitor 51 and the display 530.
- the processor 30 defines a field of view 640A, which is the field of view from the virtual camera 620A in the virtual space 600A, depending on the initial position and tilt of the virtual camera 620A.
- the processor 30 defines the field of view image 650 corresponding to the field of view area 640A.
- the processor 30 outputs the field of view image 650 to the monitor 51 and the display 530 to display the field of view image 650 on the HMD 500 and the display 530.
- since a part of the object 634 is included in the field of view region 640A, the field of view image 650 includes that part of the object 634, as shown in FIG. 9(B).
- in step S6, the processor 30 transmits the initial arrangement information to the user terminal 100 via the server 200.
- the initial placement information is information indicating the initial placement position of various objects in the virtual space 600A.
- the initial placement information includes the avatar object 610 and the information of the initial placement positions of the objects 631 to 634.
- the initial placement information can also be expressed as one of the game progress information.
- in step S7, the processor 30, as the virtual space control unit 316, controls the virtual camera 620A according to the movement of the HMD 500. Specifically, the processor 30 controls the orientation and tilt of the virtual camera 620A according to the movement of the HMD 500, that is, the posture of the player's head. As will be described later, when the player moves the head (changes the posture of the head), the processor 30 moves the head of the avatar object 610 in accordance with this movement. The processor 30 controls the orientation and tilt of the virtual camera 620A so that, for example, the direction of the line of sight of the avatar object 610 matches the direction of the line of sight of the virtual camera 620A. In step S8, the processor 30 updates the field of view image 650 in response to changes in the orientation and tilt of the virtual camera 620A.
- in step S9, the processor 30, as the virtual space control unit 316, moves the avatar object 610 according to the movement of the player.
- the processor 30 moves the avatar object 610 in the virtual space 600A in response to the player moving in the real space.
- the processor 30 moves the head of the avatar object 610 in the virtual space 600A in response to the player moving the head in the real space.
- in step S10, the processor 30, as the virtual space control unit 316, moves the virtual camera 620A so as to follow the avatar object 610. That is, even if the avatar object 610 moves, the virtual camera 620A is always at the position of the head of the avatar object 610.
- the processor 30 updates the field of view image 650 according to the movement of the virtual camera 620A. That is, the processor 30 updates the view area 640A according to the posture of the player's head and the position of the virtual camera 620A in the virtual space 600A. As a result, the view image 650 is updated.
- in step S11, the processor 30 transmits the operation instruction data of the avatar object 610 to the user terminal 100 via the server 200.
- the operation instruction data here includes at least one of motion data that captures the player's motion, voice data of the voice spoken by the player, and operation data indicating the content of input operations to the controller 540 during the virtual experience (for example, during game play).
- the operation instruction data is transmitted to the user terminal 100 as, for example, game progress information.
- steps S7 to S11 are continuously and repeatedly executed while the player is playing the game.
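- as a non-authoritative illustration of the loop of steps S7 to S11, the sketch below shows how an HMD posture could drive both the avatar object's head and the virtual camera and then be packaged for distribution; the Pose class and player_frame function are invented for this sketch and are not part of the disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    yaw: float = 0.0    # radians
    pitch: float = 0.0  # radians

def player_frame(hmd_pose: Pose) -> dict:
    """One heavily simplified iteration of the steps S7 to S11 loop.

    The HMD posture drives both the avatar object's head and the virtual
    camera, and the resulting motion is packaged for distribution.
    """
    avatar_head = Pose(hmd_pose.yaw, hmd_pose.pitch)   # S9: mirror the player's head
    camera = Pose(avatar_head.yaw, avatar_head.pitch)  # S7/S10: camera follows the avatar's line of sight
    operation_instruction = {                          # S11: data sent toward the user terminals
        "motion": {"head_yaw": avatar_head.yaw, "head_pitch": avatar_head.pitch},
    }
    return {"camera": camera, "out": operation_instruction}

frame = player_frame(Pose(yaw=math.radians(30), pitch=math.radians(-5)))
print(frame["out"]["motion"])
```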
- in step S21, the processor 10 of the user terminal 100 of the user 3, as the virtual space control unit 116, defines the virtual space 600B shown in FIG. 10.
- the processor 10 defines a virtual space 600B by using virtual space data (not shown).
- the virtual space data may be stored in the user terminal 100, may be generated by the processor 10 based on the game program 131, or may be acquired by the processor 10 from an external device such as the server 200.
- in step S22, the processor 10 receives the initial placement information.
- in step S23, the processor 10, as the virtual space control unit 116, arranges various objects in the virtual space 600B according to the initial arrangement information.
- the various objects are the avatar object 610 and the objects 631 to 634.
- in step S24, the processor 10, as the virtual space control unit 116, arranges the virtual camera 620B in the virtual space 600B. As an example, the processor 10 arranges the virtual camera 620B at the position shown in FIG. 10(A).
- in step S25, the processor 10 displays the field of view image 660 on the display unit 152.
- the processor 10 defines a field of view 640B, which is the field of view from the virtual camera 620B in the virtual space 600B, according to the initial position and tilt of the virtual camera 620B. Then, the processor 10 defines the field of view image 660 corresponding to the field of view area 640B.
- the processor 10 outputs the field of view image 660 to the display unit 152 so that the field of view image 660 is displayed on the display unit 152.
- as shown in FIG. 10(B), the field of view image 660 includes the avatar object 610 and the object 631.
- in step S26, the processor 10 receives the operation instruction data.
- in step S27, the processor 10, as the virtual space control unit 116, moves the avatar object 610 in the virtual space 600B according to the operation instruction data. In other words, the processor 10 reproduces, by real-time rendering, a video in which the avatar object 610 is operating.
- in step S28, the processor 10, as the virtual space control unit 116, controls the virtual camera 620B according to the user's operation accepted by the operation reception unit 111.
- in step S29, the processor 10 updates the field of view image 660 in response to changes in the position, orientation, and tilt of the virtual camera 620B in the virtual space 600B.
- the processor 10 may automatically control the virtual camera 620B according to the movement of the avatar object 610, for example, the movement of the avatar object 610 or the change of the orientation.
- the processor 10 may automatically move the virtual camera 620B so as to always shoot the avatar object 610 from the front, or change the orientation and tilt.
- alternatively, the processor 10 may automatically move the virtual camera 620B, or change its orientation and tilt, so as to always shoot the avatar object 610 from the rear, according to the movement of the avatar object 610.
- the avatar object 610 operates according to the movement of the player.
- the operation instruction data indicating this operation is transmitted to the user terminal 100.
- the avatar object 610 operates according to the received operation instruction data.
- the avatar object 610 performs the same operation in the virtual space 600A and the virtual space 600B.
- the user 3 can visually recognize the operation of the avatar object 610 according to the operation of the player by using the user terminal 100.
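- purely for illustration, the sketch below shows how a user terminal might apply each received operation instruction to a mirrored avatar, which is what makes the playback real-time rendering rather than a pre-recorded video; the ViewerAvatar class and its fields are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ViewerAvatar:
    """Hypothetical avatar object mirrored inside the viewer's virtual space 600B."""
    head_yaw: float = 0.0
    mouth_open: bool = False
    history: List[Dict] = field(default_factory=list)

    def apply_instruction(self, instruction: Dict) -> None:
        # Every received operation instruction is applied immediately, so the
        # avatar in the viewer's space tracks the player's motion in real time.
        motion = instruction.get("motion", {})
        self.head_yaw = motion.get("head_yaw", self.head_yaw)
        self.mouth_open = bool(instruction.get("voice"))
        self.history.append(instruction)

avatar = ViewerAvatar()
avatar.apply_instruction({"motion": {"head_yaw": 0.5}, "voice": b"hello"})
print(avatar.head_yaw, avatar.mouth_open)  # 0.5 True
```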
- FIG. 32 is a flowchart showing an example of the basic game progress of this game.
- the game is divided into, for example, two gameplay parts.
- the first part is a location information game part
- the second part is a live distribution part.
- the game may include an acquisition part that allows the user to acquire a game medium that is digital data that can be used in the game in exchange for valuable data possessed by the user.
- the play order of each part is not particularly limited.
- FIG. 32 shows a case where the user terminal 100 executes a game in the order of a location information game part, an acquisition part, and a live distribution part.
- in step S1, the game progress unit 115 executes the location information game part.
- the location-based game part includes a fixed scenario S11 and an acquisition scenario S12 (described later).
- the location-based game part includes, for example, a scene in which a main character operated by a user interacts with an NPC.
- the "scenario" collected as digital data corresponds to one story of the main character, is supplied from the server 200, and is temporarily stored in the storage unit 120.
- the game progress unit 115 reads out one scenario stored in the storage unit 120, and advances one scenario according to the input operation of the user until the end is reached.
- the scenario includes options to be selected by the user and response patterns of the main character corresponding to those options, and different outcomes may be obtained in one scenario depending on which option the user selects.
- the game progress unit 115 presents a plurality of options corresponding to the action to the main character in a user-selectable manner, and advances the scenario according to the options selected by the user.
- the game progress unit 115 may make the user acquire the privilege according to the ending.
- the privilege is provided to the user, for example, as a game medium which is digital data that can be used in the game.
- the game medium may be, for example, an item such as clothing that can be worn by a character (avatar object), or an item given to a distributor of a live distribution source.
- "to make the user acquire the privilege” may, as an example, change the status of the game medium as the privilege managed in association with the user from unusable to usable.
- the game medium may be stored in at least one of the memories (memory 11, memory 21, memory 31) included in the system 1 in association with the user identification information, the user terminal ID, or the like.
- in step S3, the game progress unit 115 executes the acquisition part.
- the game medium acquired by the user may be a new scenario different from the scenario provided to the user terminal 100 at the time of the first download.
- the former scenario will be referred to as a fixed scenario, and the latter scenario will be referred to as an acquisition scenario.
- when it is not necessary to distinguish between the two, each is simply referred to as a scenario.
- the game progress unit 115 causes the user to possess an acquisition scenario different from the fixed scenario that the user already possesses, in exchange for consuming the user's valuable data.
- the scenario to be acquired by the user may be determined by the game progress unit 115 or the control unit 210 of the server 200 according to a predetermined rule. More specifically, the game progress unit 115 or the control unit 210 may execute a lottery and randomly determine a scenario to be acquired by the user from a plurality of acquisition scenarios.
- the acquisition part may be executed at any time before and after the location information game part and the live distribution part.
- in step S4, the game progress unit 115 determines whether or not the operation instruction data has been received from an external device via the network. While the operation instruction data is not received from the external device, the game progress unit 115 may return from NO in step S4 to, for example, step S1 and execute the position information game part. Alternatively, the game progress unit 115 may execute the acquisition part of step S3. On the other hand, when the operation instruction data is received from the external device, the game progress unit 115 proceeds from YES in step S4 to step S5.
- in step S5, the game progress unit 115 executes the live distribution part (second part). Specifically, the game progress unit 115 advances the live distribution part, and the moving image reproduction unit 117 operates the character according to the operation instruction data received in step S4.
- the user can freely and interactively interact with the character that operates in real time based on the operation instruction data transmitted from the external device. More specifically, the moving image reproduction unit 117 receives the operation instruction data including the voice data and the motion data generated according to the content of the input operation of the player 4 from the game play terminal 300. Then, the moving image reproduction unit 117 causes the character to speak based on the voice data included in the received operation instruction data, and moves the character based on the above-mentioned motion data. Thereby, the reaction of the character to the above-mentioned input operation of the player 4 can be presented to the user.
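- as an editorial sketch of the dispatch between the parts in FIG. 32 (steps S1, S4, and S5), the code below keeps running the location information game part until operation instruction data arrives and then switches to the live distribution part; the feed format and function name are assumptions made for this sketch only.

```python
from typing import Iterator, Optional, List

def game_loop(instruction_feed: Iterator[Optional[dict]]) -> List[str]:
    """Simplified dispatcher corresponding to steps S1/S4/S5 of FIG. 32.

    While no operation instruction data arrives, the location information game
    part keeps running; once data arrives, the loop switches to the live
    distribution part. The feed and return values are illustrative only.
    """
    log = []
    for incoming in instruction_feed:
        if incoming is None:                 # S4: NO -> stay in the first part
            log.append("location_game_part")
        else:                                # S4: YES -> S5
            log.append("live_distribution_part")
            break
    return log

print(game_loop(iter([None, None, {"motion": "wave", "voice": b"hi"}])))
# ['location_game_part', 'location_game_part', 'live_distribution_part']
```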
- the user terminal 100 is configured to execute the following steps, based on the game program 131, in order to improve the interest of the game: a step of advancing the first part by operating the main character in response to the user's input operations input via the operation unit (the input/output IF 14, the touch screen 15, the camera 17, and the distance measurement sensor 18), and a step of advancing the second part by operating the character of the player 4 based on the operation instruction data received from the game play terminal 300.
- the user terminal 100 receives the operation instruction data from the game play terminal 300, and in the second part, the character is operated based on the operation instruction data. Since the character can be operated based on the operation instruction data received from the game play terminal 300, the operation of the character is unconventional and its expression is greatly expanded. Therefore, the user can feel the reality as if the character is in the real world through the relationship with the character during the game play. As a result, it has the effect of increasing the immersive feeling of the game and improving the interest of the game.
- FIG. 33 is a flowchart showing a basic game progress of a game executed based on the game program according to the first modification of the embodiment.
- Step S11a is the same as step S1 in FIG. 32, and duplicate description will be omitted. That is, the game progress unit 115 executes the position information game part (first part).
- while the position information game part of step S11a is in progress, in step S13a, the game progress unit 115 receives a specific action by the user. In response to this, the game progress unit 115 proceeds to step S14a, and an operation for switching from the position information game part to the live distribution part is performed. It is preferable that the game progress unit 115 continuously executes the position information game part of step S11a as long as the specific action by the user is not accepted in step S13a.
- the result of a specific action by the user in the location information game part includes, for example, that the position of the user terminal 100 acquired by the above-mentioned location registration system included in the user terminal 100 matches a predetermined position. More specifically, a quest is realized by the position information game using the position registration information of the user terminal 100, and the user moves, carrying the user terminal 100, to the position determined by the game progress unit 115. As a result, when the current position information of the user terminal 100 matches the determined position, the user is made to acquire the privilege. In place of, or in addition to, the acquisition of the privilege, the progress of the game may be automatically switched to the live distribution part.
- the result of a specific action by the user in the location-based game part also includes the completion of a predetermined scenario associated with the location-based game part. More specifically, in the location-based game part, when the user clears one or more quests or selects options, the dialogue with the character progresses and the scenario advances. Then, when the scenario reaches one of its endings, the user has completed the play of the scenario. As a result, the game may be automatically switched from the location-based game part to the live distribution part.
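- by way of illustration only, the sketch below shows one plausible position-matching check that could trigger the grant of the privilege and the switch to the live distribution part; the 30 m tolerance and the equirectangular distance approximation are assumptions, not something specified by the embodiment.

```python
import math

def positions_match(current, target, tolerance_m: float = 30.0) -> bool:
    """Rough check that the terminal has reached the quest position.

    Uses an equirectangular approximation of the distance between two
    (latitude, longitude) pairs; the tolerance is an illustrative assumption.
    """
    lat1, lon1 = map(math.radians, current)
    lat2, lon2 = map(math.radians, target)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return 6_371_000 * math.hypot(x, y) <= tolerance_m

def on_location_update(current, quest_target):
    # When the registered position matches the quest position, the game may
    # grant the privilege and/or switch to the live distribution part.
    if positions_match(current, quest_target):
        return ["grant_privilege", "switch_to_live_distribution_part"]
    return []

print(on_location_update((35.6586, 139.7454), (35.6586, 139.7455)))
```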
- step S14a is the same as step S4 in FIG. 32, and duplicate description will be omitted.
- Step S15a is the same as step S5 in FIG. 32, and duplicate description will be omitted.
- in step S15a, for example, when the server 200 determines that it is the live distribution time, or when the server 200 determines that a specific action by the user has been received, the process may shift to the live distribution part. That is, when the determination condition is satisfied, the server 200 and the game play terminal 300 provide the live distribution of the live distribution part to the user terminal 100. Conversely, when the determination condition is not satisfied, the progress of the game is controlled so that the user terminal 100 does not proceed to the live distribution part.
- when the determination condition is satisfied, the user terminal 100 operates the character based on the operation instruction data and can execute the progress of the live distribution part. Specifically, when the distribution terminal 400 has already started the live distribution, the user terminal 100 may be able to receive the real-time live distribution from the middle. Alternatively, when the determination condition is satisfied, the live distribution may be started with this as a trigger, and the user terminal 100 may be able to receive the live distribution from the beginning. It should be noted that the specific action by the user, which serves as the determination condition, is determined in advance by the game master, for example, and is managed by the server 200 and the game play terminal 300.
- in the user terminal 100, the switching from the first part to the second part is performed according to the result of the user performing a specific action in the first part.
- the user terminal 100 receives the operation instruction data from the game play terminal 300, and in the second part, the character is operated based on the operation instruction data. Since the character can be operated based on the operation instruction data received from the game play terminal 300, the operation of the character is unconventional and its expression is greatly expanded. Therefore, the user can feel the reality as if the character is in the real world through the relationship with the character during the game play. As a result, it has the effect of increasing the immersive feeling of the game and improving the interest of the game. Further, in order to move to the second part, the user needs to perform a specific action in the first part, so that the game quality can be further enhanced.
- <Modification 2 of game configuration> In the description of the first embodiment, with regard to FIG. 32 described above, the case where the user terminal 100 executes the game in the order of the location information game part, the acquisition part, and the live distribution part was shown. Further, in the first modification of the first embodiment, with regard to FIG. 33 described above, the case where the game automatically switches to the live distribution part according to the result of whether or not a specific action is performed by the user while the position information game part is in progress on the user terminal 100 was shown.
- instead of the configuration that automatically switches to the live distribution part, a right to receive the live distribution for advancing the live distribution part may be granted to the user.
- the right here may be in the form of a ticket, and the user holding the ticket has the right to access the delivered live distribution.
- the live distribution time may be notified to the user terminal 100 in advance, or may be kept secret until the actual live distribution time is reached. In the former case, live distribution can be stably supplied to the user, and in the latter case, live distribution with special added value can be supplied to the user as a surprise distribution.
- FIG. 11 is a diagram showing another example of the field of view image displayed on the user terminal 100. Specifically, it is a diagram showing an example of the game screen of the game being played by the player in the system 1.
- this is a game in which an avatar object 610 that operates weapons such as guns and knives and a plurality of enemy objects 671 that are NPCs appear in the virtual space 600, and the avatar object 610 is made to battle against the enemy objects 671.
- Various game parameters such as the physical strength of the avatar object 610, the number of magazines that can be used, the number of remaining bullets of the gun, and the remaining number of the enemy object 671 are updated as the game progresses.
- the genre of this game is not limited to a specific genre. System 1 can play games of any genre.
- the play mode of the game executed in the game system 1 is not limited to a specific play mode.
- the system 1 can execute a game of any play form, for example, a single-player game played by a single player, a multiplayer game played by a plurality of players, and, among multiplayer games, a battle game in which a plurality of players play against each other or a cooperative play game in which a plurality of players cooperate.
- a plurality of stages are prepared in this game, and the player can clear the stage by satisfying the predetermined achievement conditions associated with each stage.
- the predetermined achievement conditions may include, for example, conditions such as defeating all the enemy objects 671 that appear, defeating the boss object among the appearing enemy objects 671, acquiring a predetermined item, and reaching a predetermined position.
- the achievement conditions are defined in the game program 131.
- the player clears the stage when the achievement condition is satisfied according to the content of the game; in other words, the victory of the avatar object 610 over the enemy objects 671 (the victory or defeat between the avatar object 610 and the enemy objects 671) is decided.
- when the game executed by the system 1 is a racing game or the like, the ranking of the avatar object 610 is determined when the condition of reaching the goal is satisfied.
- the game progress information is live-distributed to the plurality of user terminals 100 at predetermined time intervals.
- a field-of-view image of the field of view region defined by the virtual camera 620B corresponding to the user terminal 100 is displayed.
- parameter images showing the physical strength of the avatar object 610, the number of magazines that can be used, the number of remaining bullets of the gun, the remaining number of enemy objects 671, and the like are displayed superimposed on the field of view image.
- This view image can also be expressed as a game screen.
- the game progress information includes motion data that captures the player's motion, voice data of the voice spoken by the player, and operation data indicating the content of the input operation to the controller 540.
- these data are, in other words, information for specifying the position, posture, orientation, and the like of the avatar object 610, information for specifying the position, posture, orientation, and the like of the enemy objects 671, and information for specifying the positions of other objects (for example, the obstacle objects 672 and 673).
- the processor 10 identifies the position, posture, orientation, and the like of each object by analyzing (rendering) the game progress information.
- the game information 132 includes data of various objects such as an avatar object 610, an enemy object 671, an obstacle object 672, and 673.
- the processor 10 uses the data and the analysis result of the game progress information to update the position, posture, orientation, and the like of each object.
- the game progresses, and each object in the virtual space 600B moves in the same manner as each object in the virtual space 600A.
- each object including the avatar object 610 operates based on the game progress information regardless of whether or not the user operates the user terminal 100.
- UI images 701 and 702 are displayed superimposed on the view image.
- the UI image 701 is a UI image that accepts, from the user 3, an operation for displaying on the touch screen 15 the UI image 711, which accepts the item input operation for supporting the avatar object 610.
- the UI image 702 is a UI image that accepts, from the user 3, an operation for displaying on the touch screen 15 a UI image (described later) for inputting and transmitting a comment for the avatar object 610 (in other words, the player 4).
- the operation accepted by the UI images 701 and 702 may be, for example, an operation of tapping the UI images 701 and 702.
- the UI image 711 is displayed superimposed on the view image.
- the UI image 711 is, for example, a UI image 711A on which a magazine icon is drawn, a UI image 711B on which an emergency box icon is drawn, a UI image 711C on which a triangular cone icon is drawn, and a UI on which a barricade icon is drawn. Includes image 711D.
- the item input operation corresponds to, for example, an operation of tapping any UI image.
- one of the obstacle objects 672 and 673 may obstruct the movement of the enemy objects 671 more than the other.
- the processor 10 transmits the item input information indicating that the item input operation has been performed to the server 200.
- the item input information includes at least information for specifying the type of the item specified by the item input operation.
- the item input information may include other information about the item, such as information indicating the position where the item is placed.
- the item input information is transmitted to the other user terminal 100 and the HMD set 1000 via the server 200.
- FIG. 12 is a diagram showing another example of the field of view image displayed on the user terminal 100. Specifically, it is a diagram showing an example of the game screen of this game, for explaining communication between the player and the user terminal 100 during game play.
- the user terminal 100 causes the avatar object 610 to execute the utterance 691.
- the user terminal 100 causes the avatar object 610 to execute the utterance 691 according to the voice data included in the game progress information.
- the content of the utterance 691 is "There are no bullets!" spoken by the player 4. That is, the content of the utterance 691 informs each user that the number of magazines is 0 and the number of bullets loaded in the gun is 1, so that the means for attacking the enemy objects 671 is about to be lost.
- a balloon is used to visually indicate the utterance of the avatar object 610, but in reality, the voice is output from the speaker of the user terminal 100.
- the balloon shown in FIG. 12A (that is, the balloon including the text of the audio content) may be displayed in the view image. This also applies to the utterance 692 described later.
- upon receiving the tap operation for the UI image 702, the user terminal 100 superimposes and displays the UI images 705 and 706 (message UI) on the field of view image as shown in FIG. 12(B).
- the UI image 705 is a UI image that displays a comment for the avatar object 610 (in other words, the player).
- the UI image 706 is a UI image that accepts a comment transmission operation from the user 3 in order to transmit the input comment.
- when the user terminal 100 receives a tap operation on the UI image 705, the user terminal 100 displays a UI image imitating a keyboard (not shown; hereinafter simply referred to as the "keyboard") on the touch screen 15.
- the user terminal 100 causes the UI image 705 to display text corresponding to the user's input operation on the keyboard.
- the text "Magazine is sent" is displayed on the UI image 705.
- when, as an example, the user terminal 100 receives a tap operation on the UI image 706 after the text has been input, the user terminal 100 transmits comment information including information indicating the input content (text content) and information indicating the user to the server 200. The comment information is transmitted to the other user terminals 100 and the HMD set 1000 via the server 200.
- the UI image 703A is a UI image showing the user name of the user who sent the comment
- the UI image 704A is a UI image showing the content of the comment sent by the user.
- that is, the user whose user name is "BBBBBB" has used his/her own user terminal 100 to transmit comment information having the content "Dangerous!", which is displayed as the UI image 703A and the UI image 704A.
- the UI image 703A and the UI image 704A are displayed on the touch screen 15 of all the user terminals 100 participating in this game and the monitor 51 of the HMD 500.
- the UI images 703A and 704A may be one UI image. That is, one UI image may include a user name and the content of a comment.
- the UI image 703B includes the user name "AAAAAA", and the UI image 704B contains the comment "Magazine send!" entered in the example of FIG. 12(B).
- the user "AAAAA” further inputs a tap operation to the UI image 701, displays the UI image 711 on the touch screen 15, and inputs a tap operation to the UI image 711A. It is a later view image 611. That is, as a result of the item input information indicating the magazine being transmitted from the user terminal 100 of the user "AAAAA" to the other user terminal 100 and the HMD set 1000, the user terminal 100 and the HMD set 1000 have the effect object 674 (described later). Is arranged in the virtual space 600. As an example, the user terminal 100 and the HMD set 1000 execute the effect related to the effect object 674 and execute the process of invoking the effect of the item object after the elapsed time indicated by the item input information has elapsed.
- the number of magazines is increased from 0 to 1 by executing the process of activating the effect of the item object.
- the player speaks "Thank you!” To the user "AAAAAA”, and the voice data of the utterance is transmitted to each user terminal 100.
- each user terminal 100 outputs the voice "Thank you!” As the utterance 692 of the avatar object 610.
- communication between the user and the avatar object 610 is realized by outputting the utterance voice of the avatar object 610 based on the utterance of the player and inputting a comment by each user.
- FIG. 13 is a flowchart showing an example of the flow of the game progress process executed by the game play terminal 300.
- in step S31, the processor 30, as the game progress unit 315, advances the game based on the game program 331 and the movement of the player.
- in step S32, the processor 30 generates game progress information and distributes it to the user terminals 100. Specifically, the processor 30 transmits the generated game progress information to each user terminal 100 via the server 200.
- when the processor 30 receives the item input information in step S33 (YES in S33), the processor 30 arranges the item object in the virtual space 600A based on the item input information in step S34. As an example, the processor 30 arranges the effect object 674 in the virtual space 600A before arranging the item object (see FIG. 11(C)).
- the effect object 674 may be, for example, an object imitating a present box.
- the processor 30 may execute the effect regarding the effect object 674 after the elapsed time indicated by the item input information has elapsed.
- the effect may be, for example, an animation in which the lid of the present box is opened. After executing the animation, the processor 30 executes a process of invoking the effect of the item object. For example, in the example of FIG. 11D, the obstacle object 673 is arranged.
- the processor 30 may arrange an item object corresponding to the tapped UI image in the virtual space 600A. For example, when a tap operation is performed on the UI image 711A, the processor 30 arranges a magazine object indicating a magazine in the virtual space 600A after executing the animation. Further, when the tap operation is performed on the UI image 711B, the processor 30 arranges the first aid kit object indicating the first aid kit in the virtual space 600A after executing the animation.
- the processor 30 may execute a process of invoking the effect of the magazine object or the first aid kit object when the avatar object 610 moves to the position of the magazine object or the first aid kit object, for example.
- the processor 30 continues and repeats the processes of steps S31 to S34 until the game is completed.
- when the game ends, for example, when the player inputs a predetermined input operation for ending the game (YES in step S35), the process shown in FIG. 13 ends.
- FIG. 14 is a flowchart showing an example of the flow of the game progress process executed by the user terminal 100.
- in step S41, the processor 10 receives the game progress information.
- in step S42, the processor 10, as the game progress unit 115, advances the game based on the game progress information.
- when the processor 10 accepts the item input operation by the user 3 in step S43 (YES in step S43), in step S44, the processor 10 consumes virtual currency and arranges the effect object 674 in the virtual space 600B.
- the virtual currency may be purchased (charged for in the game) by the user 3 performing a predetermined operation on the processor 10 before or during participation in the game, or may be given to the user 3 when a predetermined condition is satisfied.
- the predetermined conditions may be those that require participation in this game, such as clearing a quest in this game, or those that do not require participation in this game, such as answering a questionnaire.
- the amount of virtual currency (owned amount of virtual currency) is stored in the user terminal 100 as game information 132 as an example.
- in step S45, the processor 10 transmits the item input information to the server 200.
- the item input information is transmitted to the game play terminal 300 via the server 200.
- the processor 10 arranges the item object in the virtual space 600B after a predetermined time has elapsed from the arrangement of the effect object 674.
- the obstacle object 673 is arranged. That is, when the user 3 inputs a tap operation to the UI image 711C, a predetermined amount of virtual currency is consumed and the obstacle object 673 is arranged.
- the processor 10 continues and repeats the processes of steps S41 to S45 until the game is completed.
- when the game is finished, for example, when the player performs a predetermined input operation for ending the game, or when the user 3 performs a predetermined input operation for leaving the game partway through (YES in step S46), the process shown in FIG. 14 ends.
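- as a hedged illustration of steps S43 to S45, the sketch below consumes virtual currency for an item input and then builds the item input information to be relayed toward the game play terminal; the prices, class names, and error type are invented for this sketch and are not specified by the embodiment.

```python
class ItemInputError(Exception):
    pass

class UserTerminalWallet:
    """Hypothetical handling of steps S43 to S45: spend virtual currency, then
    emit item input information to be relayed to the game play terminal."""

    PRICES = {"magazine": 100, "first_aid_kit": 150, "barricade": 200}  # illustrative amounts

    def __init__(self, balance: int):
        self.balance = balance

    def put_in_item(self, item_type: str, position: tuple) -> dict:
        cost = self.PRICES[item_type]
        if self.balance < cost:
            raise ItemInputError("not enough virtual currency")
        self.balance -= cost                      # S44: consume virtual currency
        return {                                  # S45: item input information sent to the server
            "item_type": item_type,
            "position": position,
        }

wallet = UserTerminalWallet(balance=300)
print(wallet.put_in_item("magazine", (1.0, 0.0, 2.0)))
print(wallet.balance)  # 200
```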
- FIG. 15 is a flowchart showing an example of the flow of the game progress process executed by the server 200.
- in step S51, the processor 20 receives the game progress information from the game play terminal 300.
- in step S52, the processor 20, as the log generation unit 212, updates the game progress log (hereinafter, the play log).
- the play log is generated by the processor 20 when the initial arrangement information is received from the game play terminal 300.
- in step S53, the processor 20 transmits the received game progress information to each user terminal 100.
- when the item input information is received from any of the user terminals 100 in step S54 (YES in step S54), the processor 20, as the log generation unit 212, updates the play log in step S55. In step S56, the processor 20 transmits the received item input information to the game play terminal 300.
- the processor 20 continues and repeats the processes of steps S51 to S56 until the game is completed.
- the processor 20, as the list generation unit 213, generates a list of the users who participated in the game (user list 234) from the play log.
- the processor 20 stores the generated user list 234 in the server 200.
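- the following sketch is an editorial illustration of the server-side flow of FIG. 15: relaying game progress and item input information while keeping a play log, and later deriving the list of participating users from that log; all names are hypothetical and the user list is heavily simplified.

```python
class GameServer:
    """Sketch of steps S51 to S56: relay messages, keep a play log, and derive
    the list of participating users when the game ends. Names are hypothetical."""

    def __init__(self):
        self.play_log = []

    def on_game_progress(self, info: dict, user_terminal_queues: list) -> None:
        self.play_log.append(("progress", info))           # S52: update the play log
        for queue in user_terminal_queues:                  # S53: relay to each user terminal
            queue.append(info)

    def on_item_input(self, user: str, item: dict, game_play_queue: list) -> None:
        self.play_log.append(("item", user, item))          # S55: update the play log
        game_play_queue.append((user, item))                 # S56: relay to the game play terminal

    def build_user_list(self) -> list:
        # Simplified stand-in for the user list: every user that appears in an
        # item entry of the play log is treated as a participant.
        users = []
        for entry in self.play_log:
            if entry[0] == "item" and entry[1] not in users:
                users.append(entry[1])
        return users

server = GameServer()
viewer_queues, play_queue = [[], []], []
server.on_game_progress({"frame": 1}, viewer_queues)
server.on_item_input("AAAAAA", {"item_type": "magazine"}, play_queue)
print(server.build_user_list())  # ['AAAAAA']
```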
- FIG. 16 is a diagram showing a specific example of the user list 234.
- the "user" column stores information (for example, a user name) indicating each user who participated in the game.
- the "tag” column stores information (tags) generated based on the support provided by each user to the player.
- among the tags stored in the "tag" column, those without the corner brackets are information automatically generated by the processor 20, and those with the corner brackets are information manually entered by the game operator.
- the user "AAAAAA” is associated with information such as a magazine, 10F, a boss, and "winning the boss by presenting the magazine”. This indicates that, for example, in the boss battle on the stage of 10F, the user "AAAAA” inserts a magazine, and the avatar object 610 wins the boss with the ammunition of the inserted magazine.
- the user "BBBBBB” is associated with information such as a first aid kit, 3rd floor, Zako, and "recovery on the verge of game over”. For example, in a battle with a Zako enemy on the 3rd floor, the user “BBBB” “BBBBBB” throws in the first aid kit, and as a result, it shows that the physical strength of the avatar object 610 has recovered just before it becomes 0 (the game is over).
- the user "CCCCC” is associated with information such as barricade, 5th floor, Zako, and "stopping two zombies with barricade”. This means that, for example, in a battle with a Zako enemy on the 5th floor, the user "CCCCC” throws in a barricade (obstacle object 672 in FIG. 11), and as a result, succeeds in stopping the two Zako enemies. Shows.
- one provided support is associated with the user name of each user 3, but the user name of a user 3 who has provided support a plurality of times can be associated with a tag for each of the plurality of supports.
- each tag is distinguished. As a result, after the game is over, the player who refers to the user list 421 using the distribution terminal 400 can accurately grasp the content of each support.
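- purely as an illustration of the structure described for FIG. 16, the sketch below represents each row of the user list as a user name with one tag per provided support, marking whether a tag was generated automatically or entered by the operator; the field names are assumptions made for this sketch.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SupportTag:
    item: str            # e.g. "magazine"
    stage: str           # e.g. "10F"
    enemy: str           # e.g. "boss"
    note: str = ""       # operator-entered free text (shown with corner brackets)
    auto_generated: bool = True

@dataclass
class UserRecord:
    user_name: str
    tags: List[SupportTag] = field(default_factory=list)

# One tag is kept per support, so a user who helped several times gets several tags.
user_list_234 = [
    UserRecord("AAAAAA", [SupportTag("magazine", "10F", "boss",
                                     note="winning the boss by presenting the magazine",
                                     auto_generated=False)]),
    UserRecord("BBBBBB", [SupportTag("first aid kit", "3F", "zako",
                                     note="recovery on the verge of game over",
                                     auto_generated=False)]),
]
print(user_list_234[0].tags[0].note)
```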
- FIG. 17 is a flowchart showing an example of the flow of the live distribution process executed by the distribution terminal 400.
- FIG. 18 is a diagram showing a specific example of a screen displayed on the distribution terminal 400.
- FIG. 19 is a diagram showing another specific example of the screen displayed on the distribution terminal.
- in step S61, the processor 40, as the operation reception unit 413, receives the first operation for displaying the list of users who have the right to receive live distribution (the user list 234).
- the conditions for receiving live distribution may be set by the game master as appropriate, but at least the conditions are that the application of this game is installed and that the game is online at the time of live distribution.
- the user who has the right to receive the live distribution is a user who participates in the game in real time. In addition, being a paying user for the live distribution may be added as a condition.
- the privilege acquired in the location information game part may be exchanged for a ticket and paid with the ticket.
- the specific user terminal 100 reserved in advance to receive live distribution at the live distribution time may be a user terminal having the right to receive live distribution.
- the download screen 721 shown in FIG. 18A is a screen for downloading the user list 234 from the server 200 and displaying it on the display unit 452.
- the download screen 721 is a screen displayed immediately after inputting the start operation of the application for executing the distribution process shown in FIG. 17 into the distribution terminal 400.
- the download screen 721 includes UI images 722 and 723 as an example.
- the UI image 722 accepts an operation for downloading the user list 234, that is, the first operation.
- the first operation may be, for example, an operation of tapping the UI image 722.
- the UI image 723 accepts an operation for terminating the application.
- the operation may be, for example, an operation of tapping the UI image 723.
- upon receiving the tap operation for the UI image 722, in step S62, the processor 40, as the communication control unit 411, acquires (receives) the user list 234 from the server 200. In step S63, the processor 40, as the display control unit 412, causes the display unit 452 to display the user list 234. Specifically, the processor 40 causes the display unit 452 to display the user list screen generated based on the user list 234.
- the user list screen may be the user list screen 731 shown in FIG. 18B.
- the user list screen 731 is composed of a record image corresponding to each record in the user list 234.
- record images 732A to 732C are shown as the record images, but the number of record images is not limited to three. In the example shown, the player can display another record image on the display unit 452 by inputting an operation of scrolling the screen (for example, a drag operation or a flick operation) to the touch screen 45.
- the record images 732A to 732C include user names 733A to 733C, tag information 734A to 734C, and icons 735A to 735C, respectively.
- hereinafter, when it is not necessary to distinguish them, the record images 732A to 732C, the user names 733A to 733C, the tag information 734A to 734C, and the icons 735A to 735C are respectively described as the "record image 732", "user name 733", "tag information 734", and "icon 735".
- the user name 733 is information indicating each user who participated in the game, which is stored in the "user" column in the user list 234.
- the tag information 734 is information indicating a tag associated with each of the information indicating each user who participated in the game in the user list 234.
- the record image 732A includes "AAAAAA" as the user name 733A. Therefore, the record image 732A includes, as the tag information 734A, "magazine, 10F, boss, 'win the boss by presenting the magazine'" associated with "AAAAAA" in the user list 234.
- the icon 735 is, for example, an image preset by the user.
- the processor 40 may store the received user list in the distribution terminal 400 (user list 421 in FIG. 7).
- the download screen 721 may include a UI image (not shown) for displaying the user list 421 on the display unit 452.
- in this case, the processor 40 does not download the user list 234, but reads the user list 421, generates a user list screen from the user list 421, and displays the user list screen on the display unit 452.
- in step S64, the processor 40, as the operation reception unit 413, receives a second operation for selecting any of the users included in the user list screen 731.
- the second operation may be an operation of tapping any one of the record images 732 on the user list screen 731.
- the player inputs a tap operation to the record image 732A. That is, the player has selected the user "AAAAAA" as the user to whom the operation instruction data is to be distributed.
- upon receiving the tap operation for the record image 732, in step S65, the processor 40, as the display control unit 412, causes the display unit 452 to display the motion list 422. Specifically, the processor 40 causes the display unit 452 to display the motion list screen generated based on the motion list 422.
- the motion list screen may be the motion list screen 741 shown in FIG.
- the motion list screen 741 is composed of a record image corresponding to each record in the motion list 422.
- In the example of FIG., record images 742A to 742C are shown, but the number of record images is not limited to three.
- By inputting an operation of scrolling the screen (for example, a drag operation or a flick operation) to the touch screen 45, the player can display other record images on the display unit 452.
- the record images 742A to 742C include motion names 743A to 743C, motion images 744A to 744C, and UI images 745A to 745C, respectively.
- Hereinafter, when it is not necessary to distinguish them, the record images 742A to 742C, the motion names 743A to 743C, the motion images 744A to 744C, and the UI images 745A to 745C are described as "record image 742", "motion name 743", "motion image 744", and "UI image 745", respectively.
- the motion name 743 is information for identifying the motion stored in the motion list 422.
- the motion image 744 is an image generated from the motion data associated with each motion name in the motion list 422.
- the processor 40 includes, in the record image 742, an image of the avatar object 610 taking the first posture of each motion data as the motion image 744.
- the motion image 744 may be a UI image that accepts a predetermined operation (for example, a tap operation on the motion image 744) by the player.
- Upon receiving the tap operation, the processor 40 may play a motion moving image in which the avatar object 610 operates based on the motion data.
- the processor 40 may automatically redisplay the motion list screen 741 when the motion moving image is finished.
- the record image 742 may include, for example, a UI image including the text "motion reproduction" instead of the motion image 744.
- In step S66, the processor 40, as the operation receiving unit 413, receives a third operation of selecting a motion.
- the third operation may be a tap operation on the UI image 745. That is, the UI image 745 accepts an operation of selecting motion data corresponding to each record image 742.
- the processor 40 specifies the motion data selected by the player as the motion specifying unit 415.
- In step S67, while a motion moving image in which the avatar object 610 operates based on the selected motion data is being played, the processor 40, as the display control unit 412 and the voice reception unit 414, receives the voice input of the player.
- FIG. 20 is a diagram showing a specific example of voice input by the player 4.
- the player 4 is inputting the utterance voice 820A while playing the motion moving image 810A.
- the utterance voice 820A is an utterance voice addressed to the user 3 (hereinafter, user 3A) whose user name is "AAAAAA". That is, in the example of FIG. 20, the player 4 selects the user 3A (first user) in step S64 and creates the operation instruction data addressed to the user 3A. It is assumed that the user terminal 100 used by the user 3A is the user terminal 100A.
- Since the utterance voice 820A is an utterance voice addressed to the user 3A, it is based on the content of the support provided by the user 3A to the avatar object 610 (in other words, to the player 4).
- Specifically, the user 3A threw in a magazine in the boss battle on the 10F stage, and the avatar object 610 defeated the boss with the ammunition of the thrown-in magazine.
- the utterance voice 820A has the content "Thank you for giving me the magazine in the boss battle! The timing was perfect! Thanks to Mr. AAAAAA, I was able to clear it!"
- the uttered voice includes the content of the support provided by the user 3 in the game and the gratitude to the user 3.
- the player 4 creates the utterance content addressed to the user 3 before starting the voice input, that is, before inputting the third operation to the distribution terminal 400.
- the utterance content addressed to the user 3 may be automatically generated by the processor 40. Further, the processor 40 may superimpose and display the tag associated with the user 3 selected by the second operation on the motion moving image 810A.
- the processor 40 converts the received voice into voice data.
- the processor 40 generates the operation instruction data including the voice data and the motion data of the selected motion as the operation instruction data generation unit 416.
- step S69 the processor 40 distributes the generated operation instruction data to the user terminal 100 (first computer) of the selected user 3 (user 3A in the example of FIG. 20) as the communication control unit 411.
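- As a non-limiting illustration, the following Python sketch shows one way the selection, generation, and distribution described above could fit together; the class name, field names, and the send_to_server helper are assumptions for illustration and are not identifiers defined in this disclosure.

```python
from dataclasses import dataclass

@dataclass
class OperationInstructionData:
    motion_data: bytes    # motion chosen by the third operation
    voice_data: bytes     # voice received while the motion moving image plays
    target_user_id: str   # user 3 selected by the second operation

def distribute(selected_motion: bytes, recorded_voice: bytes,
               selected_user_id: str, send_to_server) -> None:
    """Generate operation instruction data and hand it to the server together
    with information identifying the addressed user."""
    data = OperationInstructionData(motion_data=selected_motion,
                                    voice_data=recorded_voice,
                                    target_user_id=selected_user_id)
    # The server 200 forwards the data only to the user terminal 100 of the
    # selected user (for example, user terminal 100A for the user "AAAAAA").
    send_to_server(data)
```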
- FIG. 21 is a diagram showing still another specific example of the screen displayed on the distribution terminal 400.
- the processor 40 causes the display unit 452 to display the distribution screen as the display control unit 412.
- the distribution screen may be the distribution screen 751 shown in FIG. 21 (A).
- the distribution screen 751 includes a UI image 752 and a motion image 753A. Further, as shown in FIG. 21A, the distribution screen 751 may include information indicating a user to whom the operation instruction data is distributed.
- the UI image 752 accepts an operation for delivering the operation instruction data to the selected user 3.
- the operation may be, for example, a tap operation on the UI image 752.
- the motion image 753A is a UI image that accepts an operation for playing a moving image based on the generated operation instruction data, that is, a moving image based on the operation instruction data generated for the user 3A.
- the operation may be, for example, a tap operation on the motion image 753A.
- the UI image that accepts the operation for playing the generated moving image is not limited to the motion image 753A. For example, it may be a UI image including the text "playing a moving image".
- the processor 40 may automatically redisplay the distribution screen 751 when the moving image is finished.
- the distribution screen 751 further includes a UI image that accepts an operation for returning to the acceptance of voice input.
- the operation may be, for example, a tap operation on the UI image.
- the player 4 can perform voice input again when the voice input fails, for example, when the content to be spoken is mistaken.
- the UI image may be a UI image that accepts an operation for returning to the selection of motion data.
- Upon receiving the tap operation for the UI image 752, the processor 40 transmits the operation instruction data to the server 200 together with the information indicating the user 3A.
- the server 200 identifies the user terminal 100 to which the operation instruction data is transmitted based on the information indicating the user 3A, and transmits the operation instruction data to the specified user terminal 100 (that is, the user terminal 100A).
- the processor 40 may display the distribution completion screen 761 shown in FIG. 21B on the display unit 452 as an example.
- the delivery completion screen 761 includes UI images 762 and 763 as an example. Further, the delivery completion screen 761 may include a text indicating that the transmission of the operation instruction data is completed, as shown in FIG. 21 (B).
- the UI image 762 accepts an operation for starting the creation of operation instruction data addressed to another user 3.
- the operation may be, for example, an operation of tapping the UI image 762.
- Upon receiving the tap operation, the processor 40 causes the display unit 452 to display the user list screen again. That is, when the tap operation is accepted, the distribution process returns to step S63. At this time, the processor 40 may generate the user list screen based on the user list 421 stored in the distribution terminal 400 and display it on the display unit 452.
- the UI image 763 accepts an operation for terminating the application.
- the operation may be, for example, an operation of tapping the UI image 763.
- Upon accepting the tap operation, the distribution process ends.
- As described above, the distribution terminal 400 transmits the operation instruction data of the moving image addressed to the user 3A (the user 3 whose user name is "AAAAAA") only to the user terminal 100A.
- FIG. 22 is a diagram showing another specific example of voice input by the player 4.
- the player 4 is inputting the utterance voice 820B while playing the motion moving image 810B.
- the utterance voice 820B is an utterance voice addressed to the user 3 (hereinafter, user 3B) whose user name is "BBBBBB". That is, in the example of FIG. 22, the player 4 inputs the tap operation to the record image 732B corresponding to the user 3B in step S64, and creates the operation instruction data addressed to the user 3B. It is assumed that the user terminal 100 used by the user 3B is the user terminal 100B.
- Since the utterance voice 820B is an utterance voice addressed to the user 3B, it is based on the content of the support provided by the user 3B to the avatar object 610 (in other words, to the player 4). Specifically, in the battle with the Zako enemies on the 3F stage, the user "BBBBBB" threw in a first aid kit, and as a result the physical strength of the avatar object 610 was restored immediately before it reached 0 (immediately before the game was over). For this reason, the utterance voice 820B has the content "Thanks to the first aid kit given by Mr. BBBBBB, the game did not end on the 3rd floor. Thank you very much!"
- FIG. 23 is a diagram showing still another specific example of the screen displayed on the distribution terminal 400.
- the distribution screen 751 shown in FIG. 23A includes a UI image 752 and a motion image 753B.
- When the motion image 753B accepts a tap operation, a moving image based on the operation instruction data generated for the user 3B is reproduced.
- Upon receiving the tap operation for the UI image 752, the processor 40 transmits the operation instruction data to the server 200 together with the information indicating the user 3B.
- the server 200 identifies the user terminal 100 to which the operation instruction data is transmitted based on the information indicating the user 3B, and transmits the operation instruction data to the specified user terminal 100 (that is, the user terminal 100B).
- As described above, the distribution terminal 400 transmits the operation instruction data of the moving image addressed to the user 3B (the user 3 whose user name is "BBBBBB") only to the user terminal 100B.
- the content of the voice based on the voice data included in the operation instruction data is based on the content of the support provided by the user 3 to the player 4 during the latest participation in the game. Since the content of the support differs for each user 3, the content of the voice differs for each user 3. That is, after the end of the game, operation instruction data including voices having different contents is transmitted to at least some of the user terminals 100 of the users 3 who participated in the game.
- the motion of the avatar object 610 in the example of FIG. 22 is different from the motion of the example of FIG. 20. That is, the player 4 selects motion data different from that at the time of generating the operation instruction data addressed to the user 3A in the operation instruction data generation addressed to the user 3B. Specifically, in step S66, the player 4 inputs a tap operation to the UI image 745B for selecting the motion data corresponding to the record image 742B. In this way, the player 4 can make the motion data included in the operation instruction data different for each user 3.
- the operation instruction data unique to each user terminal 100 is transmitted to each of the user terminals 100 of the selected user 3.
- FIG. 24 is a diagram showing an outline of transmission of game progress information from the game play terminal 300 to the user terminal 100. While the operation instruction data for playing a moving image in the user terminal 100 is unique for each user terminal 100, the game progress information transmitted during the game execution to the user terminals 100 of all the users 3 participating in the game, as shown in FIG. 24, is common among the user terminals 100. That is, the operation instruction data included in the game progress information is also common among the user terminals 100. As described above, it can be said that the operation instruction data for playing the moving image and the operation instruction data for advancing the game are different data from the viewpoints of the difference between the user terminals 100 and the destination.
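- The contrast above can be illustrated with a minimal Python sketch; the helper names and the terminal objects are hypothetical and only stand in for the broadcast versus unicast behavior described here.

```python
def broadcast_game_progress(progress_info, participating_terminals):
    # Common data: every participating user terminal 100 receives the same payload.
    for terminal in participating_terminals:
        terminal.send(progress_info)

def deliver_thank_you_data(instruction_data, terminal_of_selected_user):
    # Unique data: only the addressed user's terminal 100 receives it.
    terminal_of_selected_user.send(instruction_data)
```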
- FIG. 25 is a flowchart showing an example of the flow of the moving image reproduction process executed by the user terminal 100.
- In step S71, the processor 10, as the moving image reproduction unit 117, receives the operation instruction data.
- In step S72, the processor 10, as the moving image reproduction unit 117, notifies the user 3 of the reception of the operation instruction data.
- the processor 10 notifies the user 3 of the reception of the operation instruction data by at least one of displaying a notification image on the display unit 152, reproducing a notification voice from a speaker (not shown), and lighting or blinking a lighting unit (not shown) composed of an LED (light-emitting diode).
- In step S73, the processor 10, as the operation reception unit 111, receives a first reproduction operation for reproducing the moving image.
- the first reproduction operation may be an operation of tapping the notification image.
- In step S74, the processor 10, as the moving image reproduction unit 117, renders the operation instruction data and reproduces the moving image.
- the processor 10 may start the application for playing this game and reproduce the moving image, or may start an application for reproducing moving images that is different from that application and reproduce the moving image.
- Hereinafter, the moving image will be referred to as a "thank-you video".
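- The reproduction flow of steps S71 to S74 can be summarized with the following Python sketch; the terminal object and its method names are assumptions introduced only for illustration.

```python
def play_thank_you_video(terminal) -> None:
    """Illustrative sketch of steps S71 to S74 on the user terminal 100."""
    data = terminal.receive_operation_instruction_data()        # S71: receive the data
    terminal.notify_reception()                                  # S72: image, sound, or LED
    terminal.wait_for_first_reproduction_operation()             # S73: e.g. tap the notification
    video = terminal.render(data.motion_data, data.voice_data)   # S74: render the data
    terminal.play(video)                                         # reproduce the thank-you video
```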
- FIG. 26 is a diagram showing a specific example of the reproduction of the thank-you video. Specifically, it shows an example of the reproduction of the thank-you video on the user terminal 100 of the user 3A.
- the avatar object 610 is uttering the voice 920A while executing a certain motion.
- the processor 10 outputs the audio 920A from the speaker (not shown) while playing the thank-you video 910A including the avatar object 610 that executes a certain motion.
- the motion in the thank-you video 910A is based on the motion data selected by the player 4 in the generation of the operation instruction data addressed to the user 3A, and the voice 920A is based on the voice data generated from the utterance voice 820A input by the player 4 in the generation of the operation instruction data. That is, the voice 920A is a voice including the content of the support provided by the user 3A in the game and gratitude for the support. In this way, by inputting the first playback operation, the user 3A can watch the thank-you video in which the avatar object 610 utters the content of the support provided by the user 3A in the game and gratitude for that support.
- the motion in the thank-you video 910A may be based on the motion of the player 4 acquired as motion data in the generation of the operation instruction data addressed to the user 3A.
- the movement of the player 4 is directly reflected in the movement of the avatar object 610 displayed on the display unit 152.
- the game progress unit 115 of the user terminal 100 reflects the voice and movement of the player 4 at the installation location of the game play terminal 300 in the speech and movement of the avatar object 610 in real time, almost at the same time as the player 4 speaks or moves.
- the moving image reproduction unit 117 and the game progress unit 115 can continue rendering and reproduction of the real-time moving image while continuously receiving the operation instruction data from the distribution terminal 400. By seeing the avatar object that performs such an operation, the user can feel the reality of the avatar object as if it exists in the real world and can immerse himself in the game world.
- the user terminal 100 may display at least one UI image on the touch screen 15 after the reproduction of the thank-you video 910A is completed.
- the UI image may be, for example, a UI image that accepts an operation for playing the thank-you video 910A again, a UI image that accepts an operation for transitioning to another screen, or an application. It may be a UI image that accepts an operation for terminating.
- the user terminal 100 may display at least one UI image on the touch screen 15 during the reproduction of the thank-you video 910A.
- the UI image may be, for example, a plurality of UI images each accepting an operation of temporarily stopping, ending, or changing the scene to be reproduced of the thank-you moving image 910A being reproduced.
- these UI images displayed during the playback of the thank-you video 910A and after the playback of the thank-you video 910A is completed do not include the UI image for making a response to the avatar object 610. That is, the thank-you video 910A according to the present embodiment is not provided with a means for responding to the avatar object 610.
- FIG. 27 is a diagram showing another specific example of the reproduction of the thank-you video. Specifically, it shows an example of the reproduction of the thank-you video on the user terminal 100 of the user 3B.
- the avatar object 610 is uttering the voice 920B while executing a certain motion.
- the processor 10 outputs the audio 920B from the speaker (not shown) while playing the thank-you video 910B including the avatar object 610 that executes a certain motion.
- the motion in the thank-you video 910B is based on the motion data selected or input by the player 4 in the generation of the operation instruction data addressed to the user 3B, and the voice 920B is based on the voice data generated from the utterance voice 820B input by the player 4 in the generation of the operation instruction data. Therefore, in the example of FIG. 27, the motion performed by the avatar object 610 is different from the motion in the example of FIG. 26. Further, the voice 920B is a voice including the content of the support provided by the user 3B in the game and gratitude for the support. Therefore, in the example of FIG. 27, the content of the voice 920B is different from the content of the voice 920A in the example of FIG. 26.
- In this way, the thank-you video received after the end of the game by at least some of the user terminals 100 of the users 3 who participated in the game is a video in which the utterance content of the avatar object 610 differs for each user 3.
- the processor 10 may display the UI image 930 including the content for encouraging participation in the next game by superimposing it on the moving image 910.
- the UI image 930 may be distributed together with the operation instruction data, or may be stored in the user terminal 100 as the game information 132.
- the game shown in FIG. 11 is a live distribution game in which the game screen, which is the viewing content, is displayed on the touch screen 15 based on the game progress information lively distributed via the server 200.
- the system 1 according to the present embodiment provides a location information game in addition to the live distribution game.
- a map image displaying objects of the portal and the live distribution tower is displayed on the touch screen 15.
- When the portal is tapped, the game associated with the portal is executed. Clearing the game associated with the portal grants the user a privilege that can be used in the live distribution part.
- the privilege is, for example, a throwing item that can be given to a character (or the distributor of the live distribution source).
- the object representing the portal is simply referred to as a "portal”
- the object representing the live distribution tower is simply referred to as a "live distribution tower”.
- Information on the current position of the user terminal 100 (for example, address information or latitude / longitude information) is acquired by the position registration system (not shown) provided in the user terminal 100, and map data of a specific area (for example, an area of 40 km east-west and 30 km north-south) centered on the current position is acquired from another service providing device.
- a plurality of portals and a plurality of live distribution towers are arranged in the specific area.
- the location information and portal ID of each of the plurality of portals and the location information and distribution tower ID of each of the plurality of live distribution towers are acquired from the server 200.
- the arrangement of the portal and the live distribution tower may be fixed or may be updated at any time.
- map data for making the map image shown in FIG. 29 (D) displayable is acquired.
- the thick circle drawn on the map image is, for example, a range with a radius of 10 km centered on the current position of the user terminal 100, and is defined as a “predetermined range” in the present embodiment.
- A map image schematically showing a visualization area (for example, an area of 1 km east-west and 2 km north-south) centered on the current position of the map data is displayed based on the map data (see FIG. 29(A)).
- the portal and the live distribution tower are displayed based on the location information and the portal ID of the portal acquired from the server 200 and the location information and the distribution tower ID of the live distribution tower.
- the map image also displays a current position image showing the current position of the user terminal 100.
- the map image on the touch screen 15 is scrolled by accepting a scroll operation (a flick operation or a swipe operation). For example, when a scroll operation to the right is performed, the map image shifts to the east side (for example, Shinjuku station moves to the left) and is updated as shown in FIG. 29(A) → FIG. 29(B) → FIG. 29(C). When the scroll operation to the right is further continued, the map image shown in FIG. 29(E) is displayed on the touch screen 15.
- the map data of the specific area centered on the center position is acquired from another service providing device. Further, the location information and the portal ID of the portals arranged in the specific area and the location information and the distribution tower ID of the live distribution towers are acquired from the server 200.
- map image is scrolled so that the current position image is displayed in the center.
- Map data of a specific area centered on the current position of the user terminal 100 is acquired from another service providing device, and the location information and the portal ID of the portals arranged in the specific area and the location information and the distribution tower ID of the live distribution towers are acquired from the server 200.
- the map image on the touch screen 15 does not have to be an image centered on the current position of the user terminal 100.
- the location-based game is a game in which the main character travels from a starting point to a destination designated on map data according to a scenario.
- the position of the main character does not have to be related to the current position of the user terminal 100.
- the main character moves along various routes from the starting point to the destination according to the progress of this scenario.
- the game associated with the portal can be executed, or the viewing process of the live distribution game associated with the live distribution tower can be executed.
- the table TBL1 shown in FIG. 28 is stored in the memory 11.
- a plurality of portal IDs acquired from the server 200 are registered in the table TBL1. Further, the rarity of the portal and the remaining time of the cool time are set for each of the plurality of portal IDs.
- the thrown item may be, for example, a local item related to the area where the portal is located.
- the rarity of the portal is a parameter that defines the rarity of the throwing item, and increases in the order of silver → gold → rainbow.
- the rarity is reset for the portal when the game ends, for example, by a random number lottery.
- the value of the thrown item is defined by the type of the thrown item and the amount of points associated with the thrown item.
- the cool time is a parameter that defines the time during which the tap operation is restricted, and is reset to a predetermined time (for example, 120 minutes) according to the end of the game.
- the tap operation for the portal is allowed again when the remaining time of the cool time becomes 0 minutes.
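- A minimal Python sketch of the per-portal management table TBL1 described above is shown below; the field names, the reset helper, and the example cool time value reuse the figures given in the text, while everything else is an illustrative assumption.

```python
from dataclasses import dataclass
import random

COOL_TIME_MINUTES = 120  # example predetermined time from the description

@dataclass
class PortalRecord:
    portal_id: str
    rarity: str               # "silver", "gold", or "rainbow"
    cool_time_remaining: int  # minutes; tapping is allowed again when this reaches 0

def reset_after_game(record: PortalRecord) -> None:
    """Reset the rarity by a random lottery and restart the cool time when the game ends."""
    record.rarity = random.choice(["silver", "gold", "rainbow"])
    record.cool_time_remaining = COOL_TIME_MINUTES

# Example table keyed by portal ID (contents are hypothetical).
tbl1 = {"portal-001": PortalRecord("portal-001", "silver", 0)}
```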
- When a tap operation is performed on a portal whose remaining cool time is 0 minutes and which is arranged within the predetermined range centered on the current position of the user terminal 100, the game associated with the portal is executed.
- When the tap operation for a portal is performed before the cool time has elapsed since the previous tap operation for that portal, or when a tap operation is performed on a portal outside the predetermined range (for example, a portal on the map image shown in FIG. 29(E)), the consumption amount of virtual currency (compensation) required to execute the game associated with the portal is calculated based on the remaining time of the cool time or on the distance from the current position to the position of the portal. The consumption amount increases as the remaining time increases, or as the distance increases.
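- The consumption-amount rule above could be expressed, for example, as the following sketch; the coefficients are arbitrary assumptions and only show that the cost grows with the remaining time or the distance.

```python
def consumption_for_cool_time(remaining_minutes: int, coins_per_minute: float = 0.5) -> int:
    # Longer remaining cool time -> more virtual currency required.
    return round(remaining_minutes * coins_per_minute)

def consumption_for_distance(distance_km: float, coins_per_km: float = 2.0) -> int:
    # Greater distance from the current position to the portal -> more virtual currency required.
    return round(distance_km * coins_per_km)
```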
- the virtual currency may be paid or free of charge.
- the virtual currency consumption inquiry screen is superimposed and displayed on the touch screen 15.
- On the inquiry screen, the character string "Do you want to consume XX coins of virtual currency to execute the game?", an "Execute" button, and a "Cancel" button are displayed.
- the "execute” button is tapped, the virtual currency of the consumption amount among the virtual currencies owned by the user is consumed, and then the game associated with the portal is executed.
- When the portal is a portal arranged in an area within the predetermined range from the current position of the user terminal 100, the game associated with the portal is executed without consuming the virtual currency owned by the user. When the portal is located in an area outside the predetermined range from the current position, the game is executed by consuming the virtual currency.
- the predetermined range is defined so as to include not only the area (initial area) of the map image displayed on the touch screen 15 together with the current position image in response to the game start operation, but also areas of the map image displayed on the touch screen 15 in response to the scroll operation in which the current position image is not displayed, as shown in FIGS. 29(C) and 29(E), for example.
- As a result, the game can be executed without consuming the virtual currency as long as the tapped portal is within the predetermined range, and it is possible to reduce the concern that the virtual currency will be consumed by the user's erroneous operation and the interest of the game will be impaired.
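- For illustration only, a simple "within the predetermined range" check (a 10 km radius around the current position) might look like the sketch below; the equirectangular distance approximation is an assumption, since the description does not specify how the distance is computed.

```python
import math

EARTH_RADIUS_KM = 6371.0

def within_predetermined_range(cur_lat: float, cur_lon: float,
                               portal_lat: float, portal_lon: float,
                               radius_km: float = 10.0) -> bool:
    # Approximate ground distance between the current position and the portal.
    x = math.radians(portal_lon - cur_lon) * math.cos(math.radians((cur_lat + portal_lat) / 2))
    y = math.radians(portal_lat - cur_lat)
    distance_km = EARTH_RADIUS_KM * math.hypot(x, y)
    return distance_km <= radius_km
```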
- the game executed in response to the tap operation on the portal is, for example, a shooting game, and is cleared when a predetermined achievement condition (for example, a condition that the number of shot-down enemy characters reaches a predetermined number) is satisfied.
- When the game is cleared, the user is given a free throwing item that is of a type corresponding to the rarity of the portal and is associated with a point amount according to the rarity of the portal.
- When the rarity is silver or gold, a free throwing item of a type that is not related to the area where the portal is located and that is associated with a point amount corresponding to silver or gold is given to the user. When the rarity is rainbow, the user is given a free throwing item related to the area where the portal is located, that is, a local item associated with a point amount corresponding to rainbow. For example, if the location of the portal is Sumida-ku, Tokyo, a local item imitating the Sky Tree is given. The amount of points associated with the throwing item increases in the order of silver → gold → rainbow.
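- The reward rule just described can be sketched as follows; the item names and point amounts are hypothetical placeholders chosen only to reflect the silver → gold → rainbow ordering.

```python
POINTS = {"silver": 10, "gold": 30, "rainbow": 100}  # assumed amounts, increasing with rarity

def grant_item_for_rarity(rarity: str, portal_area: str) -> dict:
    """Return the free throwing item granted when the portal's game is cleared."""
    if rarity == "rainbow":
        name = f"local item of {portal_area}"   # e.g. a Sky Tree item for Sumida-ku
    else:
        name = "generic throwing item"          # e.g. knife, machine gun, saber, grenade
    return {"name": name, "points": POINTS[rarity]}
```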
- the portal's rarity and cool time will be reset.
- Free throwing items other than local items are, for example, knives, machine guns, sabers, and hand grenades, and differ from the paid throwing items (magazines, first aid kits, triangular cones, barricades) that can be thrown in by consuming virtual currency in the live distribution game.
- the variation of the throwing items that can be thrown in the live distribution game is increased by clearing the game associated with the portal.
- the free throwing item other than the local item may be a clothing item to be attached to the avatar object 610 or points.
- a paid money item that can be thrown in by purchase may be given to the user.
- the user does not have to clear the game associated with the portal.
- In this case, the purchase process of the paid throwing item is accepted on a purchase site, and the purchased item is given to the user as a paid throwing item.
- the genre of the game executed in response to the tap operation to the portal is not limited to a specific genre.
- The system 1 can execute games of any genre, for example, sports-themed games such as tennis, table tennis, dodgeball, baseball, soccer, and hockey, puzzle games, quiz games, RPGs (role-playing games), adventure games, shooting games, simulation games, training games, action games, and the like.
- the play mode of the game executed in the system 1 is not limited to a specific play mode.
- the system 1 may execute a game of any play form, for example, a single-player game played by a single user, a multi-play game played by a plurality of users, and, among multi-play games, a battle game in which a plurality of users compete against each other or a cooperative play game in which a plurality of users cooperate.
- a tossed item is given to the user.
- the given throwing item can be used for the right to receive live distribution in the second part and for throwing money to the player during real-time live distribution.
- the user can support the player with the acquired items, thereby activating the interaction between the user and the player.
- the game includes at least a livestream part.
- the game may be composed of a single live distribution part or may be composed of a plurality of parts.
- the live viewing process for displaying the game screen of the live distribution game shown in FIG. 11 on the touch screen 15 is executed.
- the virtual space 600B shown in FIG. 10A is defined based on the virtual space data stored in the storage 12, and the virtual camera 620B is arranged at a predetermined initial position of the virtual space 600B. Will be done.
- the game play terminal 300 advances the live distribution game based on the game program. Specifically, the objects including the avatar object 610 are arranged, moved, changed, and so on in the virtual space 600A shown in FIG. 9A, and various game parameters are initially set or updated. The game progress information is generated based on the updated objects in the virtual space 600A and the various game parameter values, and is distributed to the plurality of user terminals 100.
- the user terminal 100 arranges and operates an object including the avatar object 610 in the virtual space 600B based on the game progress information distributed from the game play terminal 300, and updates various game parameter values.
- the game progress information is transmitted from the game play terminal 300 every predetermined time (1/60 second). Therefore, the objects in the virtual space 600B and various game parameter values are updated at the predetermined time intervals.
- the object arranged in the virtual space 600B also includes an object of a throwing item, which will be described later.
- a field of view image 660 corresponding to the field of view area 640B according to the position and orientation of the current virtual camera 620B is displayed.
- the parameter image representing the above various game parameter values is superimposed and displayed on the upper part of the view image 660, and the UI image 711 for accepting the item input operation from the user is superimposed and displayed on the lower part of the view image 660 (see FIG. 11(A)).
- In the UI image 711, a UI image on which the icon of a free throwing item is drawn and a UI image on which the icon of a paid throwing item is drawn are displayed. That is, when the user acquires a free throwing item by clearing the game associated with the portal, the UI image on which the icon of that free throwing item is drawn can be displayed in the UI image 711. For example, if the user acquires a grenade by clearing a game associated with the portal, a UI image with the grenade icon drawn is displayed in the UI image 711. When all the throwing items of a type acquired by the user have been thrown in, the UI image on which the icon of that throwing item is drawn is deleted from the UI image 711. For example, when all the grenades acquired by the user have been thrown in, the UI image on which the grenade icon is drawn is deleted from the UI image 711.
- the paid money item is associated with the amount of virtual currency consumed when the money item is thrown and the amount of points according to the consumption amount of the virtual currency.
- the amount of points associated with a paid throwing item is larger than the amount of points associated with a free throwing item, and increases as the amount of virtual currency required to throw in the paid throwing item increases.
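- As one illustrative reading of this point rule, the sketch below uses assumed numbers; only the ordering (paid items carry more points than free items, scaling with the required virtual currency) comes from the description.

```python
FREE_ITEM_POINTS = 5  # assumed point amount for a free throwing item

def paid_item_points(currency_cost: int, points_per_coin: int = 10) -> int:
    # More virtual currency required to throw in -> more points associated with the item,
    # and always more than the free-item amount.
    return FREE_ITEM_POINTS + currency_cost * points_per_coin
```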
- When a paid throwing item is thrown in, the item input information is transmitted to the game play terminal 300 together with the consumption of the virtual currency corresponding to the consumption amount associated with that item. When a free throwing item is thrown in, the item input information is transmitted to the game play terminal 300 without consuming the virtual currency.
- the item input information contains information for specifying the type of the throwing item that is the target of the tap operation, the amount of points associated with the throwing item, and the user ID of the user who performed the item input operation. When the amount of points associated with the throwing item is managed by the game play terminal 300, the amount of points may not be included in the item input information.
- the game play terminal 300 specifies the point amount included in the item input information and imparts the point amount to the avatar object 610.
- The given amount of points is used by the game operator or the like to specify the ranking (popularity) of the avatar object 610, used by the operator or the like to determine the reward amount to be paid to the player who operates the avatar object 610, and used by the player to acquire items such as knives and sabers by performing gachas in the game. Further, the type of the throwing item included in the item input information is specified, and the object corresponding to the type is thrown in and arranged in the virtual space 600B.
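- A minimal sketch of the item input information and its handling on the receiving side is shown below; the field names, the point bookkeeping, and the place_object helper are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class ItemInputInfo:
    item_type: str   # which throwing item was tapped
    points: int      # associated point amount (may be omitted if managed on the receiving side)
    user_id: str     # user who performed the item input operation

def handle_item_input(info: ItemInputInfo, avatar_points: dict, virtual_space) -> None:
    # Add the point amount to the avatar object 610's total.
    avatar_points["avatar_object_610"] = avatar_points.get("avatar_object_610", 0) + info.points
    # Place an object corresponding to the item type into the virtual space.
    virtual_space.place_object(info.item_type)
```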
- the game progress information including the user ID of the user who has performed the item input operation may be distributed to the plurality of user terminals 100.
- the image representing the user ID is superimposed on the field of view image 660 and displayed on the touch screen 15.
- the user can also input and send a comment for the avatar object 610 by operating the user terminal 100.
- the comment is transmitted to the game play terminal 300 during the period when the game screen is displayed.
- Upon receiving a comment from the user terminal 100, the game play terminal 300 can display the user name on the display 530.
- the player can input a comment in response to the comment.
- the comments input by the user and the player are distributed from the game play terminal 300 to the plurality of user terminals 100. In each user terminal 100, the comment is superimposed on the field of view image 660 and displayed on the touch screen 15.
- (About operation) FIGS. 30 and 31 are flowcharts showing an example of the basic game progress of the location information game and the live distribution game.
- the flowchart shown in FIG. 30 and the flowchart shown on the left side of FIG. 31 are executed by the user terminal 100.
- the flowchart shown on the right side of FIG. 31 is executed by the game play terminal 300.
- the processing of the flowchart shown in FIG. 30 is started by receiving an input operation for starting the position information game, that is, a game start operation from the user.
- In step S81, position information indicating the current position of the user terminal 100 is acquired from the position registration system, and map data of a specific area centered on the current position (for example, an area of 40 km east-west and 30 km north-south) is acquired from another service providing device.
- map data for making the map image shown in FIG. 29 (D) displayable is acquired.
- Further, in step S81, the location information and the portal ID of the portals arranged in the specific area and the location information and the distribution tower ID of the live distribution towers arranged in the specific area are acquired from the server 200.
- a map image schematically showing a visualization area (for example, an area of 1 km east-west and 2 km north-south) centered on the current position of the map data is displayed on the touch screen 15 based on the map data.
- the portal is displayed on the map image based on the location information and the portal ID of the portal acquired from the server 200, and the live distribution tower is displayed on the map image based on the location information and the distribution tower ID of the live distribution tower acquired from the server 200.
- the current position image showing the current position of the user terminal 100 is displayed on the map image.
- the touch screen 15 displays the map image, the portal, the live distribution tower, and the current position image.
- In step S82, it is determined whether or not a tap operation on the portal displayed on the touch screen 15 has been performed, based on the input operation on the touch screen 15 and the location information and the portal ID of the portal acquired from the server 200. If it is not determined that the tap operation has been performed, the process proceeds to step S83, and it is determined whether or not a scroll operation (a flick operation or a swipe operation) has been performed based on the input operation on the touch screen 15.
- When it is determined that the scroll operation has been performed, the process proceeds to step S85, and the map image displayed on the touch screen 15 is scrolled. For example, when a scroll operation to the right is performed, the map image is updated as shown in FIG. 29(A) → FIG. 29(B) → FIG. 29(C). When the scroll operation to the right is further continued, the map image shown in FIG. 29(E) is displayed on the touch screen 15.
- When the process of step S85 is completed, the process returns to step S82.
- When it is determined in step S82 that the tap operation has been performed, the process proceeds to step S88, and whether or not the remaining cool time set for the portal targeted by the tap operation is 0 minutes is determined based on the table TBL1. If it is determined that the remaining time is 0 minutes, the process proceeds to step S89, and whether or not the position of the portal is within the predetermined range from the current position of the user terminal 100 is determined based on the position information of the portal and the current position of the user terminal 100.
- the predetermined range is, for example, a range having a radius of 10 km centered on the current position, and in the map image shown in FIG. 29 (D), the range shown by the thick line corresponds to the predetermined range.
- When it is determined that the position of the portal is within the predetermined range, the process proceeds to step S95. For example, when a portal on the map image shown in FIGS. 29(A) to 29(C) is tapped, the process proceeds to step S95.
- In step S95, the game associated with the portal is specified based on the portal ID of the portal, and the game is executed. The game is a game that is cleared when a predetermined achievement condition is satisfied.
- In step S96, it is determined whether or not the game has been cleared based on the processing result of step S95. If it is determined that the game has been cleared, the process proceeds to step S97.
- In step S97, the rarity set for the portal is specified from the table TBL1 based on the portal ID of the portal, and a throwing item that is of a type corresponding to the rarity and is associated with a point amount corresponding to the rarity is given to the user.
- When the process of step S97 is completed, the process proceeds to step S98. If it is not determined in step S96 that the game has been cleared, the process proceeds to step S98 without executing the process of step S97.
- In step S98, in the table TBL1, the rarity of the portal is reset to one of silver, gold, and rainbow, and the cool time of the portal is reset to the predetermined time.
- When the process of step S98 is completed, a return is made.
- When it is not determined in step S88 that the remaining cool time set for the portal to be tapped is 0 minutes, or when it is not determined in step S89 that the position of the portal is within the predetermined range from the current position of the user terminal 100, the process proceeds to step S90. For example, if the tap operation for the portal is performed before the cool time has elapsed since the previous tap operation for that portal, the process proceeds from step S88 to step S90. Further, when a portal on the right side of the thick line among the portals on the map image shown in FIG. 29(E) is tapped, the process proceeds from step S89 to step S90.
- In step S90, the amount of virtual currency consumed to execute the game associated with the portal is calculated based on the remaining time of the cool time or the distance from the current position to the position of the portal. That is, when the process proceeds from step S88 to step S90, the consumption amount of the virtual currency is calculated based on the remaining time of the cool time, and when the process proceeds from step S89 to step S90, the consumption amount of the virtual currency is calculated based on the distance from the current position to the position of the portal. The consumption amount increases as the remaining time increases, or as the distance increases.
- In step S91, the virtual currency consumption inquiry screen is superimposed and displayed on the touch screen 15.
- On the inquiry screen, the character string "Do you want to consume XX coins of virtual currency to execute the game?", an "Execute" button, and a "Cancel" button are displayed.
- In step S92, it is determined whether or not the "Execute" button has been tapped based on the input operation on the touch screen 15. If it is not determined that the "Execute" button has been tapped, the process proceeds to step S93, and it is determined whether or not the "Cancel" button has been tapped based on the input operation on the touch screen 15. If it is not determined that the "Cancel" button has been tapped, the process returns to step S92.
- When it is determined in step S92 that the "Execute" button has been tapped, the process proceeds to step S94, and the consumption amount calculated in step S90 is consumed out of the virtual currency owned by the user.
- When the process of step S94 is completed, the process proceeds to step S95. Therefore, even if the remaining cool time is not 0 minutes, or even if the position of the portal is outside the predetermined range from the current position of the user terminal 100, the tap operation for the portal is allowed by consuming the virtual currency.
- When it is determined in step S93 that the "Cancel" button has been tapped, a return is made.
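- The branch logic of steps S88 to S95 can be condensed into the following sketch; the cost coefficients and the automatic confirmation are simplifying assumptions (in the actual flow the user confirms on the inquiry screen of steps S91 to S93).

```python
def on_portal_tapped(remaining_cool_minutes: int, within_range: bool,
                     distance_km: float, owned_coins: int) -> tuple[bool, int]:
    """Return (game_started, coins_remaining), following steps S88 to S95."""
    if remaining_cool_minutes == 0 and within_range:
        return True, owned_coins                    # S88 -> S89 -> S95: no consumption
    # S90: the required virtual currency depends on the remaining cool time or the distance.
    if remaining_cool_minutes > 0:
        cost = round(remaining_cool_minutes * 0.5)  # assumed coefficient
    else:
        cost = round(distance_km * 2.0)             # assumed coefficient
    if owned_coins >= cost:
        return True, owned_coins - cost             # S94: consume, then S95: execute the game
    return False, owned_coins                       # corresponds to cancelling
```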
- If it is not determined in step S83 that the scroll operation has been performed, the process proceeds to step S84, and it is determined whether or not a tap operation on the live distribution tower has been performed based on the input operation on the touch screen 15.
- When it is determined that the tap operation on the live distribution tower has been performed, the live viewing process described later is executed in step S86, and then the process returns to step S82.
- Otherwise, another process is executed in step S87, and then a return is made.
- the live viewing process is executed according to the flowchart (subroutine) shown on the left side of FIG. 31. As a result, the live distribution game shown in FIG. 11 is executed.
- In step S101, the virtual space 600B is defined based on the virtual space data stored in the storage 12. Further, in step S101, the virtual camera 620B is arranged at a predetermined initial position of the virtual space 600B.
- In step S102, the game progress information distributed from the game play terminal 300 is received.
- In step S103, objects including the avatar object 610 are arranged and operated in the virtual space 600B based on the game progress information, and various game parameter values are updated.
- the game progress information is transmitted from the game play terminal 300 every predetermined time (1/60 second). Therefore, in step S103, the objects in the virtual space 600B are updated at the predetermined time intervals, and the various game parameter values are updated at the predetermined time intervals, based on the received game progress information.
- the object arranged in the virtual space 600B also includes an object of a tossed item thrown in by the user.
- In step S104, a field of view area 640B corresponding to the position and orientation of the current virtual camera 620B is defined, and a field of view image 660 corresponding to the field of view area 640B is generated. Further, in step S104, a parameter image representing the above-mentioned various game parameter values is superimposed on the upper part of the view image 660, a UI image 711 for accepting an item input operation from the user is superimposed on the lower part of the view image 660, and the result is displayed on the touch screen 15 (see FIG. 11(A)).
- the icon of the free tossed item given in step S97 and the icon of the paid tossed item that can be input by consuming the virtual currency are drawn on the UI image 711.
- the paid money item is associated with the amount of virtual currency consumed when the money item is thrown and the amount of points corresponding to the consumption amount of the virtual currency.
- the amount of points associated with the paid money item is larger than the amount of points associated with the free money item.
- the amount of points associated with the paid toss item increases as the consumption of virtual currency required to input the paid toss item increases.
- In step S105, it is determined whether or not a tap operation on the icon of a paid throwing item has been performed based on the input operation on the touch screen 15.
- When it is determined that the tap operation has been performed, the process proceeds to step S106, and the virtual currency corresponding to the consumption amount associated with the throwing item that is the target of the tap operation is consumed out of the virtual currency owned by the user.
- When the process of step S106 is completed, the process proceeds to step S108.
- In step S107, it is determined whether or not a tap operation on the icon of a free throwing item has been performed based on the input operation on the touch screen 15. When it is determined that the tap operation has been performed, the process proceeds to step S108.
- In step S108, the item input information is transmitted to the game play terminal 300.
- the item input information contains information for specifying the type of the throwing item that is the target of the tap operation, the amount of points associated with the throwing item, and the user ID of the user who performed the item input operation.
- When the process of step S108 is completed, or when it is not determined in step S107 that the free item input operation has been performed, the process proceeds to step S109.
- In step S109, it is determined whether or not the viewing end operation has been performed by the user based on the input operation on the touch screen 15. If it is not determined that the viewing end operation has been performed, the process returns to step S102. As a result, every time the game progress information is received from the game play terminal 300, the process of controlling the movement and arrangement of the objects is repeatedly executed. On the other hand, when it is determined that the viewing end operation has been performed, a return is made.
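- The live viewing loop of steps S102 to S109 on the user terminal side can be sketched as below; the terminal, virtual-space, and camera objects and their method names are assumptions, and the roughly 1/60-second cadence comes from the description.

```python
def live_viewing_loop(terminal, virtual_space_600B, camera_620B) -> None:
    """Illustrative sketch of the user terminal 100 side (steps S102 to S109)."""
    while True:
        progress = terminal.receive_game_progress()        # S102: roughly every 1/60 second
        virtual_space_600B.apply(progress)                  # S103: place and move objects
        frame = camera_620B.render(virtual_space_600B)      # S104: field of view image 660
        terminal.display(frame)
        op = terminal.poll_input()
        if op == "paid_item_tap":                           # S105 -> S106 -> S108
            terminal.consume_virtual_currency()
            terminal.send_item_input_info()
        elif op == "free_item_tap":                         # S107 -> S108
            terminal.send_item_input_info()
        elif op == "end_viewing":                           # S109: viewing end operation
            break
```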
- the game play terminal 300 executes a live distribution game, that is, the game described with reference to FIGS. 11 and 12 according to the flowchart shown on the right side of FIG. 31.
- In step S111, the live distribution game is advanced based on the game program. Specifically, the arrangement, movement, change, and so on of the objects including the avatar object 610 (including the item objects thrown in in step S114 described later) are updated in the virtual space 600A, and various game parameters are initially set or updated.
- In step S112, it is determined whether or not the item input information has been received from the user terminal 100 based on the input from the communication IF 13.
- When it is determined that the item input information has been received, the process proceeds to step S113.
- In step S113, the amount of points included in the item input information is specified, and that amount of points is given to the avatar object 610.
- In step S114, the type of the throwing item included in the item input information is specified, and the item object corresponding to the type is thrown in and arranged in the virtual space 600A.
- In step S115, game progress information is generated based on the objects in the virtual space 600A updated in step S111 (including the item objects thrown in by the process of step S114) and various game parameter values, and the game progress information is distributed to the plurality of user terminals 100.
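- The distribution-side loop of steps S111 to S115 can likewise be sketched as follows; the object and method names are assumptions and only mirror the sequence described above.

```python
def distribution_loop(play_terminal, virtual_space_600A, user_terminals) -> None:
    """Illustrative sketch of the game play terminal 300 side (steps S111 to S115)."""
    while play_terminal.is_live():
        virtual_space_600A.update_from_player_motion()              # S111: advance the game
        info = play_terminal.poll_item_input_info()                  # S112: item input received?
        if info is not None:
            play_terminal.add_points_to_avatar(info.points)          # S113: grant points
            virtual_space_600A.place_object(info.item_type)          # S114: place the item object
        progress = play_terminal.build_game_progress(virtual_space_600A)  # S115
        for terminal in user_terminals:
            terminal.send(progress)                                  # common to all terminals
```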
- the current position image is displayed on the touch screen 15 together with the map image of the area including the current position of the user terminal 100.
- When the scroll operation is performed, the map image on the touch screen 15 is updated to a map image of an area not including the current position. Further, among the plurality of portals arranged on the map image, the portals arranged in that area are displayed on the touch screen 15.
- When a tap operation is accepted for a portal arranged within the predetermined range, predetermined game processing can be executed. When a tap operation is accepted for a portal that is not arranged within the predetermined range, the game processing becomes executable by consuming the virtual currency.
- the predetermined range is not only the area of the map image displayed on the touch screen 15 together with the current position image according to the game start operation, but also the area of the map image displayed on the touch screen 15 according to the scroll operation. It is defined as a range including an area that does not include the current position and does not display the current position image.
- As a result, the game can be executed without consuming virtual currency as long as the portal is within the predetermined range, and the interest of the game can be improved.
- In addition, the game can be executed without consuming the virtual currency even in such an area, and it is possible to reduce the concern that the virtual currency will be consumed by the user's erroneous operation and the interest of the game will be impaired.
- the amount of virtual currency consumed when a tap operation is accepted for a portal not located in an area within the predetermined range from the current position depends on the distance from the current position to the portal. As a result, as long as the virtual currency corresponding to the distance is consumed, the game associated with the portal can be enjoyed without moving to a position within the predetermined range.
- the tap operation to the portal is allowed again after the cool time elapses. This allows the game associated with the portal to be enjoyed repeatedly.
- the portal can be tapped again by consuming the virtual currency.
- the game associated with the portal can be enjoyed without worrying about the cool time.
- the virtual currency consumed to enable the tap operation to the portal for which the cool time has not passed differs depending on the remaining time of the cool time. This makes it possible to reduce the concern that the amount of virtual currency consumed will be unfair due to the length of the remaining time.
- By tapping the live distribution tower, the content by live distribution can be viewed.
- the throwing item that can be given to the distributor of the live distribution is given to the user as a privilege by clearing the game executed in response to the tap operation to the portal.
- the user can be motivated to throw in the thrown item in the live distribution.
- a rarity is set in the portal.
- the value of the tossed item given by clearing the game of the portal with high rarity is higher than the value of the tossed item given by clearing the game of the portal with low rarity.
- the game based on the game program includes a game part in which the game progresses according to the user's operation, and a live viewing part in which the content is displayed on the touch screen 15 based on the data distributed live via the server.
- a free money item is given to the user as a privilege.
- the free money throwing item is given to the distributor of the live distribution source based on the tap operation on the icon of the free money throwing item.
- the throwing item (the throwing item displayed in the UI image 711) that can be given to the distributor in the live viewing part may differ depending on whether or not the predetermined achievement condition is satisfied in the game part.
- As a result, the user is motivated to satisfy the predetermined achievement condition in the game part, watch the content in the live viewing part, and give the distributor the throwing item acquired in the game part, so that the interest of the game is improved.
- the paid throwing item is given to the distributor by consuming the virtual currency owned by the user based on the tap operation on the icon of the paid throwing item. Since the user can realize the attractiveness of interacting with the distributor by throwing in a free throwing item, it is possible to motivate the user to purchase a paid throwing item and actively interact with the distributor. On the other hand, the distributor can expect an increase in profits by selling paid items.
- the thrown money item is associated with the points given to the distributor.
- the item input information includes information for making it possible to specify the amount of points given to the distributor in connection with the granting of the thrown item.
- the points given to the distributor based on the tap operation on the icon of the paid throwing item are larger than the points given to the distributor based on the tap operation on the icon of the free throwing item. As a result, the user can deepen the dialogue with the distributor by throwing in the paid throwing item, and the distributor can increase the profit by selling the paid throwing item.
- the game part is provided with a plurality of portals having different rarities as an opportunity to start a process related to whether or not a predetermined achievement condition is satisfied.
- the amount of points associated with a throwing item given by satisfying the predetermined achievement condition at a portal with high rarity is larger than the amount of points associated with a throwing item given by satisfying the predetermined achievement condition at a portal with low rarity. This can motivate the user to play the game of a rarer portal.
- the amount of points given to the distributor by consuming a predetermined amount of virtual currency is larger than the amount of points given to the distributor by consuming an amount of virtual currency less than the predetermined amount.
- the user can therefore deepen the interaction with the distributor by consuming more virtual currency, and the distributor can increase profits from that consumption.
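- A minimal sketch of how a point amount could be derived from whether the item is paid or free, the rarity of the portal that granted it, and the virtual currency consumed. The concrete multipliers, the rarity tiers, and the function name are assumptions for illustration; the description above only requires the orderings (paid > free, higher rarity > lower rarity, more currency > less currency).

```python
# Hypothetical point calculation; the values only need to preserve the orderings
# described in the text, and the exact numbers here are placeholders.

RARITY_MULTIPLIER = {"silver": 1, "gold": 2, "rainbow": 5}  # assumed tiers

def points_for_item(is_paid: bool, rarity: str, currency_consumed: int = 0) -> int:
    base = 100 if is_paid else 10
    points = base * RARITY_MULTIPLIER.get(rarity, 1)
    points += currency_consumed * 2  # more consumed currency -> more points
    return points

assert points_for_item(True, "silver") > points_for_item(False, "silver")
assert points_for_item(False, "rainbow") > points_for_item(False, "silver")
assert points_for_item(True, "gold", 50) > points_for_item(True, "gold", 10)
```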
- the item input information includes information that makes it possible to identify the type of the throwing item thrown into the virtual space 600A, and an object representing the throwing item is displayed on the touch screen 15. This lets other users know that one of the multiple users has thrown an item, which further motivates those other users to throw items themselves.
- the item input information includes information that allows the distributor to identify the user who performed the operation of throwing the item.
- information about the user who gave the item to the distributor is displayed together with the content based on the game progress information distributed live. This makes it possible to encourage users other than that user to throw items as well.
- the map image may be divided into a plurality of HEXs (areas), and a predetermined number or more of portals may be arranged in each of the plurality of HEXs.
- a portal with high rarity, that is, a portal with rainbow rarity, may be set for each of the plurality of HEXs.
- the reset may be performed so that a portal of rainbow rarity always exists in the HEX in which the portal is located.
- the game of a portal whose cool time has not elapsed, or of a portal located outside the predetermined range, can be executed by consuming virtual currency.
- alternatively, the game may be executed without consuming virtual currency, and when the game is cleared, the throwing item may be given to the user in exchange for the consumption of virtual currency at that point.
- the type of throwing item to be given may be determined at the stage when the game is cleared, and the throwing item may be given to the user in exchange for the user's consumption of virtual currency.
- the game may also be executed by consuming part of the consumption amount calculated based on the remaining cool time or the distance to the portal, and the throwing item may be given by consuming the remaining virtual currency when the game is cleared.
- the rarity and cool time of the portal are reset regardless of whether the game of the tapped portal is cleared. However, if the game is not cleared, at least one of the rarity and the cool time may be left un-reset. For example, if the game is not cleared, the rarity of the portal may be maintained. Further, if the game is not cleared, the game of the portal may be made executable again without setting a cool time for the portal.
- the display mode of the portal in the map image is fixed regardless of the rarity of the portal, the remaining cool time of the portal, the type of game associated with the portal, and the type of throwing item given for clearing the game.
- however, the display mode of the portal may differ depending on at least one of the rarity, the remaining cool time, the game type, and the throwing item type.
- for example, the color and size of the portal may differ depending on the rarity or the remaining cool time, and the design of the portal may differ depending on the game type or the throwing item type.
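- As a rough illustration of such a variable display mode, the icon attributes could be looked up from the portal's state. The attribute names, colors, and thresholds below are hypothetical and not taken from the embodiment.

```python
# Hypothetical mapping from portal state to icon display attributes.
# All concrete values are placeholders.

def portal_display_mode(rarity: str, remaining_cool_sec: float,
                        game_type: str, item_type: str) -> dict:
    color = {"silver": "#c0c0c0", "gold": "#ffd700", "rainbow": "rainbow"}.get(rarity, "#808080")
    size = "large" if rarity == "rainbow" else "normal"
    alpha = 0.4 if remaining_cool_sec > 0 else 1.0   # dim the icon during cool time
    design = f"{game_type}-{item_type}"              # e.g. a distinct emblem per game/item type
    return {"color": color, "size": size, "alpha": alpha, "design": design}

print(portal_display_mode("rainbow", 0, "shooting", "necklace"))
```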
- the display mode of the portal in the map image is not distinguished by whether it is inside or outside the predetermined range.
- however, the display mode of the portal may differ depending on whether the portal is inside or outside the predetermined range.
- the free throwing item given for clearing the game differs from any of the paid throwing items that can be thrown during the live viewing process.
- however, the free throwing item may be the same as one of the paid throwing items. In this case, even if the item type is the same, a different amount of points may be associated with the item depending on whether it is paid or free.
- the throwing item is given to the user by clearing the game associated with the tapped portal.
- the game associated with the portal may be a gacha.
- the amount of points that can be given to the distributor is associated with the thrown item in advance.
- however, the point amount may instead be associated with the throwing item at the time it is thrown.
- for example, the point amount may be specified according to the number of users watching the live distribution when the throwing item is thrown. Further, the point amount may be set arbitrarily by the user when the throwing item is thrown.
- the item insertion information includes the user ID of the user who has performed the operation of inserting the thrown item, regardless of whether the thrown item is paid or free of charge.
- however, the item input information transmitted in response to the operation of throwing a paid item may include the user ID of the user who performed the operation, while the item input information transmitted in response to the operation of throwing a free item may omit that user ID. This can motivate the user to throw paid items and thereby deepen the interaction with the distributor.
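- A sketch of how such item input information might be assembled, with the user ID attached only for paid throws as described above. The field names and builder function are assumptions, not the embodiment's actual data format.

```python
# Hypothetical item-input payload builder; field names are illustrative only.

def build_item_input_info(user_id: str, item_type: str, points: int,
                          is_paid: bool) -> dict:
    info = {
        "item_type": item_type,  # lets other viewers see which item was thrown
        "points": points,        # point amount granted to the distributor
        "paid": is_paid,
    }
    if is_paid:
        info["user_id"] = user_id  # only paid throws identify the thrower in this variation
    return info

assert "user_id" in build_item_input_info("u1", "necklace", 500, is_paid=True)
assert "user_id" not in build_item_input_info("u1", "flower", 10, is_paid=False)
```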
- the amount by which the map image moves in response to a scroll operation does not differ depending on whether the displayed map image shows an area within the predetermined range, an area straddling the boundary of the predetermined range, or an area outside the predetermined range. However, it may instead be determined whether the displayed area is within the predetermined range, straddles its boundary, or lies outside it, and the amount of movement in response to the scroll operation may be specified according to the determined area.
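- The following sketch illustrates one way such an area-dependent scroll step could be chosen. The classification rule, the 10 km radius reused from the description, and the step values are illustrative assumptions.

```python
# Hypothetical area classification and scroll-step selection.
# Distances are measured from the current position; step values are placeholders.

def classify_area(display_center_km: float, display_half_width_km: float,
                  range_radius_km: float = 10.0) -> str:
    near = display_center_km - display_half_width_km
    far = display_center_km + display_half_width_km
    if far <= range_radius_km:
        return "inside"
    if near >= range_radius_km:
        return "outside"
    return "straddling"

def scroll_step_km(area: str) -> float:
    return {"inside": 0.5, "straddling": 1.0, "outside": 2.0}[area]

assert scroll_step_km(classify_area(3.0, 2.0)) < scroll_step_km(classify_area(20.0, 2.0))
```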
- the game associated with the portal may be defined by the rarity of the portal. That is, the difficulty of clearing the game associated with a portal may differ according to its rarity, for example by associating a simple game with a silver-rarity portal and a difficult game with a rainbow-rarity portal.
- a shooting game is assumed as a game executed in response to a tap operation on the portal.
- the game is not limited to the shooting game, and may be a game such as a puzzle game, a card game, or a mahjong game.
- the predetermined range is set to a range with a radius of 10 km centered on the current position of the user terminal 100.
- however, the size of the predetermined range may instead be set to a radius of about 50 km centered on the current position.
- the user can request the progress of the completed live distribution part even after the real-time live distribution has ended, and the live distribution part can be advanced again based on the received operation instruction data.
- as a result, the user can look back at the live distribution, and even a user who missed it can still watch the live distribution.
- here, a scene is assumed in which a game including a location information game part and a subsequent live distribution part has progressed and the live distribution time has ended.
- the character here is assumed to be a character (avatar object) that is not the target of direct operation by the user.
- the user terminal 100 is configured to execute the following steps in order to improve the interest of the game based on the game program 131.
- the user terminal 100 (computer) executes a step of requesting the progress of a completed live distribution part via an operation unit such as the input unit 151, a step of receiving from the server 200 or the game play terminal 300 (character control device) the recorded operation instruction data related to the completed live distribution part, and a step of advancing the completed live distribution part by operating the character based on the recorded operation instruction data.
- the recorded operation instruction data includes motion data and voice data input by the player 4 associated with the character.
- the player includes not only a model and a voice actor but also a worker who performs some operation on the game play terminal 300 (character control device), but does not include a user.
- the recorded operation instruction data is preferably stored in the storage unit 220 of the server 200, the storage unit 320 of the game play terminal 300, or the memory 41 of the distribution terminal 400, and is preferably delivered again to the user terminal 100 in response to a request from the user terminal 100.
- the progress of the completed live distribution part based on the recorded operation instruction data differs depending on whether or not the user has a track record of progressing the live distribution part in real time. Specifically, when it is determined that the user has such a track record, it is preferable to advance again the same live distribution part that the user advanced in real time (return distribution). In return distribution, it is preferable to allow selective progression of the live distribution part. On the other hand, when it is determined that the user has no track record of advancing the live distribution part in real time, it is preferable to advance the live distribution part in a progress mode different from the real-time progress (missed distribution).
- the missed distribution includes cases where the real-time live distribution part could have been advanced but was not actually advanced. For missed distribution, it is preferable to perform a limited progression of the live distribution part.
- when it is determined that the user has a track record of advancing the live distribution part in real time, the game progress unit 115 further receives the user action history information of that live distribution part.
- the user action history information is a data set of user actions recorded by an input operation during the progress of the live distribution part, in addition to the contents of the recorded action instruction data.
- the user action history information is preferably associated with the recorded operation instruction data, and is preferably stored in the storage unit 220 of the server 200, the memory 31 of the game play terminal 300, or the memory 41 of the distribution terminal 400.
- the user behavior history information may be stored in the storage unit 120 of the user terminal 100.
- FIG. 34 is a diagram showing an example of a data structure of user behavior history information.
- the user action history information includes, for example, items such as action time, action type, and action details in which the user has acted in the live distribution part, and is associated with a user ID that identifies the user.
- the item “action time” is time information indicating when the user performed an action in the live distribution part, the item “action type” is a type indicating the kind of action performed, and the item “action details” is the specific content of the user's action.
- such actions may include changing items such as the character's clothing (so-called dress-up).
- Such actions may also include time selection for later playback of a particular progress portion of the livestreaming part.
- actions may include the acquisition of rewards, points, etc. during the live distribution part.
- the user action history information is preferably associated with both the data structure of the operation instruction data and the data structure of the game information described later with reference to FIG. 35. It should be understood by those skilled in the art that these data structures are merely examples and are not limiting.
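- Purely as an illustration of the record layout described above, one user-action-history entry could be represented as follows. The class names, field names, and sample values are hypothetical.

```python
# Hypothetical representation of user action history records (cf. FIG. 34).
# Field names mirror the items described in the text; formats are assumptions.

from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class UserAction:
    action_time: float          # seconds from the start of the live distribution part
    action_type: str            # e.g. "comment", "item_throw", "dress_up", "time_select"
    action_details: Dict[str, Any] = field(default_factory=dict)

@dataclass
class UserActionHistory:
    user_id: str                # identifies the user
    actions: List[UserAction] = field(default_factory=list)

history = UserActionHistory("user-001", [
    UserAction(165.0, "comment", {"text": "Hello!"}),
    UserAction(310.0, "item_throw", {"item": "necklace", "points": 500}),
])
```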
- FIG. 35 is a diagram showing an example of the data structure of the game information 132 processed by the system 1 according to the present embodiment.
- the items provided in the game information 132 are appropriately determined according to the genre, nature, content, etc. of the game, and the exemplary items do not limit the scope of the present disclosure.
- the game information 132 is configured to include each item of "play history”, “item”, “delivery history”, and "game object". Each of these items is appropriately referred to when the game progress unit 115 advances the game.
- the user's play history is stored in the item "play history”.
- the play history is information indicating whether or not the user's play is completed for each scenario stored in the storage unit 120.
- the play history includes a list of fixed scenarios downloaded at the first time of play and a list of acquisition scenarios acquired later in the acquisition part. In each list, statuses such as "played”, “unplayed”, “playable”, and “unplayable” are associated with each scenario.
- the item "item” stores a list of items owned by the user as a game medium.
- the item is, for example, a clothing item worn by a character.
- the user can make the character wear the clothing items obtained by playing the scenario and customize the appearance of the character.
- in the item “distribution history”, a list of videos, so-called back numbers, that were live-distributed by the player 4 in the past in the live distribution part is stored.
- the video that is PUSH-distributed in real time can be viewed only at that time.
- the moving images for past distribution are recorded on the server 200, the game play terminal 300, or the distribution terminal 400, and can be PULL-distributed in response to a request from the user terminal 100.
- the back number may be made available for download by the user for a fee.
- the item “game object” stores data of various objects such as the avatar object 610, the enemy object 671, and the obstacle objects 672 and 673.
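- To make the layout of the game information 132 easier to picture, a minimal sketch of its items is given below. The concrete keys and sample values are assumptions; only the four items and the play-history statuses come from the description above.

```python
# Hypothetical shape of game information 132. Keys and sample values are illustrative.

game_info_132 = {
    "play_history": {
        "fixed_scenarios": {"scenario_01": "played", "scenario_02": "unplayed"},
        "acquired_scenarios": {"scenario_10": "playable", "scenario_11": "unplayable"},
    },
    "items": ["necklace", "hat"],            # clothing items owned by the user
    "distribution_history": [                # back numbers of past live distributions
        {"live_id": "live_2020_11_01", "watched_realtime": True},
    ],
    "game_objects": ["avatar_object_610", "enemy_object_671",
                     "obstacle_object_672", "obstacle_object_673"],
}
```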
- FIG. 36 is a flowchart showing an example of a basic game progress of a game executed based on the game program according to the present embodiment. The processing flow is applied to the scenes after the end of the live distribution time when the real-time live distribution part has already been completed.
- in step S301, the input unit 151 of the user terminal 100 newly requests the progress of the completed live distribution part.
- in step S302, in response to the request in step S301, the user terminal 100 receives the recorded operation instruction data related to the completed live distribution part from the server 200 or the game play terminal 300 (character control device).
- the recorded action instruction data includes motion data and voice data input by the operator associated with the character.
- the user terminal 100 may receive various progress record data acquired and recorded along with the movement of the character during the progress of the real-time live distribution part.
- the progress record data may include viewer behavior data in which the user who participated in the real-time live distribution part behaves in accordance with the movement of the character.
- the viewer behavior data is data including a record of the behavior during the live of all the users (that is, the viewers who participated in the live) who have advanced the real-time live distribution part in real time.
- the viewer behavior data preferably includes messaging content, such as text messages and icons, that the viewers sent to the character in real time during the live.
- the recorded operation instruction data and progress record data may be received by the user terminal 100 as separate data, and each may be analyzed (rendered).
- at the server 200 or the game play terminal 300, the previously recorded operation instruction data and the viewer behavior data may be combined, and the combined data set may be received by the user terminal 100 at one time.
- with the combined data set, it is possible to reduce the load of the subsequent data analysis (rendering) performed by the user terminal 100.
- in this case, the progress record data is combined with the recorded operation instruction data (that is, the progress record data is included in the recorded operation instruction data).
- in step S303, the game progress unit 115 determines whether or not the user has a track record of progressing the live distribution part in real time. The determination may be performed, for example, based on whether there is a record that the operation instruction data was transmitted to the user terminal 100. Alternatively, it may be performed based on whether the live distribution part has the status “played” with reference to the item “play history” shown in FIG. 35, or based on whether there is a past record of live distribution from the character with reference to the item “distribution history”. In addition, when recorded operation instruction data is already stored in the storage unit 120 of the user terminal 100, it may be determined that the live distribution part has already been advanced in real time. The determination may also be performed by combining these methods, or by any other method.
- if it is determined in step S303 that the user has a track record of advancing the live distribution part in real time (YES), the progress of the completed live distribution part is a “return distribution”. On the other hand, if it is determined in step S303 that the user has no such track record (NO), the progress of the completed live distribution part is a “missed distribution”. As mentioned above, the user experience differs between return distribution and missed distribution.
- if it is determined in step S303 that the user has a track record of advancing the live distribution part in real time, the processing flow proceeds from YES in step S303 to step S304.
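- A compact sketch of the step-S303 determination combining the checks listed above. Each helper argument stands in for a lookup the description mentions (transmission record, play history, distribution history, locally stored data); the names and data shapes are assumptions.

```python
# Hypothetical sketch of the step-S303 decision; data sources are passed in as
# plain collections to keep the example self-contained.

def has_realtime_track_record(user_id: str, live_id: str,
                              sent_log: set, play_history: dict,
                              distribution_history: set, local_storage: set) -> bool:
    return (
        (user_id, live_id) in sent_log              # operation data was sent to this terminal
        or play_history.get(live_id) == "played"    # item "play history" (FIG. 35)
        or live_id in distribution_history          # item "distribution history"
        or live_id in local_storage                 # recorded data already on the terminal
    )

def progress_mode(track_record: bool) -> str:
    # YES -> return distribution (S304/S305); NO -> missed distribution (S306)
    return "return_distribution" if track_record else "missed_distribution"

assert progress_mode(
    has_realtime_track_record("u1", "live1", {("u1", "live1")}, {}, set(), set())
) == "return_distribution"
```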
- in step S304, the game progress unit 115 acquires and analyzes the user action history information of the live distribution part shown in FIG. 34.
- the user action history information may be acquired from the server 200, the game play terminal 300, or the distribution terminal 400, or may be used directly if it is already stored in the storage unit 120 of the user terminal 100.
- in step S305, the game progress unit 115 re-progresses the completed live distribution part (that is, the above-mentioned “return distribution”). Specifically, the live distribution part is re-progressed using the recorded operation instruction data and the user action history information analyzed in step S304. Further, in the return distribution, the throwing item thrown during the real-time live distribution part may be reflected in the operation mode of the character. For example, if the user put in a clothing item (here, a “necklace”) during the live distribution part, the character is operated based on that item (that is, wearing the necklace), and the live distribution part is re-progressed accordingly. That is, the re-progression of the live distribution part reflects the user action history information and the reward information, is similar to the live distribution part that progressed in real time, and is unique to the user.
- in the return distribution, the user can designate a specific action time and have the live distribution part selectively advanced from that point. For example, if the user input a comment 2 minutes and 45 seconds after the start of the live distribution part, the user can advance the live distribution part again by designating the point 2 minutes and 45 seconds after the start.
- such designation is preferably made feasible based on the action times recorded for actions such as the consumption of valuable data through the user's input operations and changes of items such as the character's clothing.
- the live distribution part can also be selectively progressed using the action time data. For example, if the user selects the period from 2 minutes 45 seconds to 5 minutes 10 seconds after the start of the live distribution part, the user can re-progress the live distribution part over only that period.
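- A minimal sketch of such selective playback of the recorded data, assuming the recorded frames are simple (timestamp, payload) pairs sorted by time; that representation is an assumption made only for this example.

```python
# Hypothetical selective playback for the return distribution.
# Frames are assumed to be (timestamp_seconds, payload) pairs sorted by time.

from typing import Iterable, Iterator, Optional, Tuple

def select_playback(frames: Iterable[Tuple[float, str]],
                    start_sec: float,
                    end_sec: Optional[float] = None) -> Iterator[Tuple[float, str]]:
    """Yield frames from start_sec onward, optionally stopping at end_sec."""
    for t, payload in frames:
        if t < start_sec:
            continue
        if end_sec is not None and t > end_sec:
            break
        yield t, payload

frames = [(0.0, "intro"), (165.0, "comment shown"), (310.0, "song ends")]
# Re-progress only the period from 2:45 (165 s) to 5:10 (310 s).
assert [t for t, _ in select_playback(frames, 165.0, 310.0)] == [165.0, 310.0]
```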
- in the return distribution, unlike the live distribution part that progressed in real time, it is preferable to limit the viewer actions that can be accepted. Specifically, in the live distribution part that progressed in real time, the consumption of valuable data through the viewer's input operations (in one example, throwing money, charging by purchasing items, and the like) could be accepted. On the other hand, in the re-progression of the completed live distribution part, the consumption of such valuable data may be restricted so as not to be accepted. More specifically, in the live distribution part that progressed in real time, a user interface (UI) including buttons and a screen for executing the consumption of valuable data was displayed on the display unit 152, and the viewer could execute the consumption of valuable data through input operations on that UI. In the return distribution, such a UI is preferably hidden so that the viewer cannot perform those input operations.
- if it is determined in step S303 that the user has no record of advancing the live distribution part in real time, the processing flow proceeds from NO in step S303 to step S306.
- in step S306, the game progress unit 115 executes a limited progression (that is, the above-mentioned “missed distribution”) of the completed live distribution part.
- the reason the missed distribution is restricted is that, although the user had the right to receive the live distribution, the user can be regarded as having waived this right; the restriction is therefore based on the idea that it is not necessary to reproduce and present the entire live distribution to the user.
- in the missed distribution, the progress of the live distribution part is also executed using the recorded operation instruction data.
- for example, if the user acquired the privilege as a clothing item (for example, a “necklace”) through the scenario associated with the location information game part, the image was composited in the real-time live distribution part so that the character operated wearing that item; that is, the behavior mode of the character was associated with the privilege. In the missed distribution, by contrast, the privilege is not associated with the movement mode of the character. That is, the image composition process of having the character operate while wearing the item is not performed.
- in this way, the progress of the completed live distribution part is limited in that it does not reflect the privilege information and is not unique to the user.
- similarly, while the throwing item thrown in the real-time live distribution part can be reflected in the movement mode of the character, in the missed distribution the throwing item thrown in the real-time live distribution part is not reflected in the movement mode of the character.
- this differs from the return distribution, in which a real-time live distribution watched in the past is presented again.
- in the missed distribution, unlike the live distribution part that progressed in real time, it is preferable to limit the user actions that can be accepted. Specifically, in the live distribution part that progressed in real time, the consumption of valuable data through the user's input operations (in one example, throwing money, charging by purchasing items, and the like) could be accepted. On the other hand, in the progress of the completed live distribution part, the consumption of such valuable data may be restricted so as not to be accepted. More specifically, in the live distribution part that progressed in real time, a user interface (UI) including buttons and a screen for executing the consumption of valuable data was displayed on the display unit 352, and the user could execute the consumption of valuable data through input operations on that UI. In the missed distribution, such a UI should be hidden so that the user cannot explicitly perform those input operations. As a result, in both the return distribution and the missed distribution, the user 3 cannot throw items to support the character.
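- A one-line sketch of the UI rule this implies: the spending UI is shown only during the real-time live distribution part. The mode strings are labels chosen for this example.

```python
# Hypothetical visibility rule for the UI that lets the viewer consume valuable data
# (e.g. throw paid items); mode strings are illustrative labels.

def show_spending_ui(mode: str) -> bool:
    # Visible only during the real-time live distribution part; hidden in both
    # return distribution and missed distribution, so no throwing items can be sent.
    return mode == "realtime"

assert show_spending_ui("realtime") is True
assert show_spending_ui("return_distribution") is False
assert show_spending_ui("missed_distribution") is False
```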
- in the missed distribution, the user can also play a specific scenario associated with the live distribution part, just as in the live distribution part that progresses in real time.
- such specific scenarios include, for example, user-participation events, through which users are provided with an interactive experience with the avatar object.
- user-participation events include questionnaires provided by the character, quizzes given by the avatar object, battles with the avatar object (for example, rock-paper-scissors), and the like. As in the case of live distribution in real time, the participation result of such a user-participation event is fed back to the user in the missed distribution.
- for example, the result of a correctness determination is fed back to the user.
- the program may automatically make only a simple judgment (correctness judgment, etc.) and give feedback.
- alternatively, the user's answer may be compared with the answers of the users who participated during the live, and when it differs, a message such as “The answer is different from that during the live” may be displayed and output to the user terminal.
- the user may be restricted from earning a predetermined game point for the above feedback.
- predetermined game points may be associated with the user and added to the points owned by the user, or such points may not be associated with the user. In the latter case, for example, in a game in which a plurality of users who are game players are ranked based on the points they own, advancing the completed live distribution part does not affect the ranking.
- after the return distribution or the missed distribution ends, the user terminal 100 may request the progress of the completed second part (live distribution part) again. That is, it is preferable that the return distribution or the missed distribution can be executed repeatedly a plurality of times. In this case, the processing flow returns to step S301.
- on the user terminal 100, even after the live distribution part has progressed in real time, the user can advance the live distribution part again in various modes. As a result, the user becomes more attached to the character through the experience of interacting with the character with a rich sense of reality, and can then play other parts that operate the character with even greater interest. This has the effect of increasing the sense of immersion in the game and improving the interest of the game.
- <Modification 1> In step S303 of FIG. 36, whether the progress of the completed live distribution part is a return distribution or a missed distribution is determined based on whether or not the user has a track record of advancing the live distribution part in real time.
- however, the user may instead be allowed to select either the return distribution or the missed distribution. Alternatively, regardless of the presence or absence of such a track record, only the missed distribution may be provided to the user.
- <Modification 2> In the second embodiment, it was stated that after the end of the return distribution (step S305 in FIG. 36) or the missed distribution (step S306 in FIG. 36), the progress of the completed second part (live distribution part) may be requested again; that is, the return distribution or the missed distribution could be executed repeatedly a plurality of times. In Modification 2, it is preferable that the second and subsequent return distributions or missed distributions reflect the record of the previous return distribution or missed distribution.
- specifically, the first distribution history data is stored in the storage unit 220 of the server 200 or the storage unit 320 of the distribution terminal 400. Thereafter, when the recorded operation instruction data related to the completed live distribution part is requested again from the user terminal 100, the first distribution history data is delivered from the server 200 or the distribution terminal 400 together with the recorded operation instruction data. The user terminal 100 refers to the received first distribution history data, and if the first return distribution or missed distribution was only performed partway, the user terminal 100 resumes the second return distribution or missed distribution from that continuation point. As a result, the user can perform the return distribution or the missed distribution efficiently.
- if the first distribution was a return distribution, the return distribution should be executed from the second time onward, and if the first distribution was a missed distribution, the missed distribution should be executed from the second time onward. Further, when the recorded operation instruction data already exists in the user terminal 100, the user terminal 100 need not receive the recorded operation instruction data again. As a result, the amount of data received by the user terminal 100 can be reduced.
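- A small sketch of the resume behaviour in Modification 2: the previous run's mode and last playback position are kept and reused on the next request. The history record's field names are assumptions.

```python
# Hypothetical resume logic for Modification 2; the history-record fields are placeholders.

from typing import Optional, Tuple

def resume_point(previous_history: Optional[dict]) -> Tuple[Optional[str], float]:
    """Return (mode, position_sec) to resume from, or (None, 0.0) for a fresh start."""
    if not previous_history or previous_history.get("finished", False):
        return None, 0.0
    # Keep the same mode as the first run (return or missed distribution) and
    # continue from where it stopped.
    return previous_history["mode"], previous_history["last_position_sec"]

first_run = {"mode": "missed_distribution", "last_position_sec": 420.0, "finished": False}
assert resume_point(first_run) == ("missed_distribution", 420.0)
```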
- <Modification 3> In the second embodiment, whether the progress of the completed live distribution part is a return distribution or a missed distribution is determined according to whether the user actually advanced the live distribution part in real time (step S303 in FIG. 36).
- in Modification 3, when it is determined that the user progressed the live distribution part only partway in real time, it is preferable to resume the progress of the completed live distribution part from that continuation point.
- the record of how far the user has advanced the live distribution part in real time can be determined from the user behavior history information described above in FIG. 34. That is, the user behavior history information records how long the user has progressed with respect to a specific live distribution part.
- in this case, the resumed progress of the completed live distribution part is preferably a missed distribution, that is, a limited progression. As a result, the user can execute the missed distribution efficiently.
- the user terminal 100 was able to accept an input operation for supporting the player in the live distribution part of the game progressing in real time.
- the player support is enabled in the real-time live distribution part, while the player support is not possible in the return distribution and the overlooked distribution.
- for example, the real-time live distribution part accepts an input operation for throwing an item to the character operated by the player, while the return distribution and the missed distribution do not accept an input operation for throwing an item.
- in that case, the UI image 701 (FIG. 11) that accepts the input operation triggering the support action may simply not be displayed on the display unit.
- in the real-time live distribution, the user can support the player in real time and can perform operations that are more engaging than in the return distribution or the missed distribution. This gives the viewer an incentive to watch the real-time live distribution, and as a result guides the user toward the real-time live distribution.
- FIG. 37 shows an example of a screen displayed on the display unit 152 of the user terminal 100 based on the game program according to the present embodiment, and an example of a transition between these screens.
- the screens include the home screen 800A, the live selection screen 800B for live distribution, the missed selection screen 800C for missed distribution, and the game screen 800D for the location information game part.
- the home screen 800A can transition to the live selection screen 800B and the game screen 800D.
- the live selection screen 800B can transition to the home screen 800A, the missed selection screen 800C, and the game screen 800D.
- the missed selection screen 800C can transition to the live selection screen 800B.
- the game screen 800D can transition to the home screen 800A and the live selection screen 800B.
- the actual distribution screen (not shown) is reached by transition from the live selection screen 800B or the missed selection screen 800C.
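- The allowed transitions above can be summarised as a small table; the screen labels below are shorthand chosen for this example, and the missed selection screen is deliberately reachable only via the live selection screen.

```python
# Hypothetical transition table reflecting the screen flow described above.

TRANSITIONS = {
    "home_800A":          {"live_select_800B", "game_800D"},
    "live_select_800B":   {"home_800A", "missed_select_800C", "game_800D", "distribution"},
    "missed_select_800C": {"live_select_800B", "distribution"},
    "game_800D":          {"home_800A", "live_select_800B"},
}

def can_transition(src: str, dst: str) -> bool:
    return dst in TRANSITIONS.get(src, set())

assert can_transition("live_select_800B", "missed_select_800C")
assert not can_transition("home_800A", "missed_select_800C")  # no direct path by design
```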
- the home screen 800A displays various menus for advancing the location-based game part or the live distribution part on the display unit 152 of the user terminal 100.
- when the game progress unit 115 receives an input operation for starting the game, it first displays the home screen 800A.
- the home screen 800A includes a “live” icon 802 for transitioning to the live selection screen 800B and an “outing” icon 804 for transitioning to the game screen 800D of the location information game.
- upon receiving an input operation for the “live” icon 802 on the home screen 800A, the game progress unit 115 causes the display unit 152 to display the live selection screen 800B.
- the live selection screen 800B presents the user with live information that can be distributed.
- the live notification information includes at least the live distribution date and time.
- the live announcement information may include free / paid live information, an advertisement image including an image of a character appearing in the live, and the like.
- the live selection screen 800B may display notification information regarding a live distribution scheduled in the near future on a pop-up screen 806.
- the server 200 searches for one or more user terminals 100 having the right to receive the live distribution.
- the right to receive live distribution is granted on conditions such as that the consideration for receiving the live distribution has been paid (for example, the user holds a ticket), that the scenario has been cleared in the location information game part, or that, in the location information game part, the current position of the user terminal 100 or of the main character is in a specific area or position where a live distribution tower or the like is located.
- the corresponding live notification information will be displayed on the user terminal 100 having the right to receive the live distribution.
- the user terminal 100 accepts a live playback operation, for example a selection operation on the live selection screen 800B for a live at its distribution time (more specifically, a touch operation on the live image). Accordingly, the game progress unit 115 switches the display unit 152 to the actual distribution screen (not shown). As a result, the user terminal 100 can advance the live distribution part and carry out the live viewing process in real time.
- the moving image reproduction unit 117 operates the character in the live distribution part based on the received operation instruction data.
- the moving image reproduction unit 117 generates a moving image reproduction screen (for example, thank-you moving image 910A of FIG. 26) including a character that operates based on the operation instruction data in the live distribution part, and displays it on the display unit 152.
- the live selection screen 800B may display on the display unit 152 a “back (x)” icon 808 for transitioning to the screen displayed immediately before, and a “missed distribution” icon 810 for transitioning to the missed selection screen 800C.
- in response to an input operation on the “back (x)” icon 808, the game progress unit 115 switches from the screen 800B to the screen that was displayed immediately before: to the home screen 800A if that screen was the home screen 800A, and to the game screen 800D if it was the game screen 800D. That is, it is preferable that a history-back function is executed for the “back (x)” icon 808.
- the live selection screen 800B is selectively transitioned to either the home screen 800A or the game screen 800D in response to an input operation for the “back (x)” icon 808.
- in response to an input operation on the “missed distribution” icon 810, the game progress unit 115 transitions from the screen 800B to the missed selection screen 800C.
- the missed selection screen 800C displays, among the distribution information about one or more lives distributed in the past, the information for those lives whose live distribution part the user did not progress in real time.
- when the input unit 151 of the user terminal 100 receives an input operation on the live distribution information displayed on the missed selection screen 800C, for example on the image 830 including the character appearing in the live, the game progress unit 115 can advance the live distribution part that has already been completed after the end of that live distribution part.
- the distribution information about each live may further include the playback time 812 of the distributed live, the period (in days or the like) 814 remaining until the end of its availability, information 816 indicating how many days before the present it was distributed, the past distribution date and time, and the like.
- the missed selection screen 800C includes a “back” icon 818 for transitioning to the live selection screen 800B. In response to an input operation on the “back” icon 818, the game progress unit 115 transitions to the live selection screen 800B.
- although not limited to this, it is preferable that the missed selection screen 800C be reached only from the live selection screen 800B, and not directly from the home screen 800A or the game screen 800D.
- the missed distribution is performed for the user who missed the live distribution, and is only a function accompanying the live distribution function.
- one of the purposes of this game is to enhance its enjoyment by allowing the user to watch the live distribution in real time, support the character in real time, and deepen the interaction with the character. For this reason, in order to guide the user toward watching the live distribution in real time rather than toward the missed distribution, in which real-time interaction with the character (player) is not possible, it is preferable here not to allow a direct transition from the home screen 800A or the game screen 800D to the missed selection screen 800C.
- on the missed selection screen 800C, the distribution information for lives whose live distribution part the user did not progress in real time is displayed. However, the distribution information about all lives distributed in the past may instead be displayed in a list for each live.
- in that case, it is preferable that either the return distribution or the missed distribution is executed depending on whether or not the user progressed the live distribution part in real time. Specifically, when it is determined that the user has a track record of advancing the live distribution part in real time, the above-mentioned return distribution is performed; when it is determined that the user has no such track record, the missed distribution is performed. As described above with respect to FIG. 36, the return distribution and the missed distribution can provide different user experiences.
- the game screen 800D is a screen displayed on the display unit 152 in the location information game part.
- the game progress unit 115 presents a quest to the user while the scenario is in progress in the location information game part.
- the game progress unit 115 may realize the quest by a location information game using the location registration information of the user terminal 100.
- the game progress unit 115 acquires the current position information (for example, address information, latitude / longitude information, etc.) of the user terminal 100 from a position registration system (not shown) provided in the user terminal 100.
- a map 824 around the place where the user terminal 100 is located is generated and arranged on the game screen 800D.
- the map data used to generate the map 824 may be stored in the storage unit 120 of the user terminal 100 in advance, or may be acquired via the network from another service providing device that provides map data.
- the game progress unit 115 determines a position (address, latitude / longitude, etc.) at which the privilege can be obtained, and superimposes and displays the portal icon 826 on the position on the map corresponding to the determined position.
- the user can take the user terminal 100, move to the position corresponding to the portal icon 826 on the map 824, clear the game associated with the portal, obtain the privilege, and thereby clear the quest.
- the position of the portal may be randomly determined by the game progress unit 115, or may be predetermined according to the contents of the scenario, quest, and privilege.
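- As a loose illustration of placing the portal icon 826, the sketch below picks a position near the terminal and projects it onto map-image coordinates. The random offset, the linear projection, and the parameter names are assumptions made for this example only.

```python
# Hypothetical portal placement and map projection; all constants are placeholders.

import random
from typing import Dict, Tuple

def choose_portal_position(center_lat: float, center_lng: float,
                           max_offset_deg: float = 0.01) -> Tuple[float, float]:
    """Pick a random portal position near the terminal's current position."""
    return (center_lat + random.uniform(-max_offset_deg, max_offset_deg),
            center_lng + random.uniform(-max_offset_deg, max_offset_deg))

def to_screen(lat: float, lng: float, bounds: Dict[str, float],
              width: int, height: int) -> Tuple[int, int]:
    """Project a lat/lng pair onto map-image pixel coordinates (linear approximation)."""
    x = (lng - bounds["west"]) / (bounds["east"] - bounds["west"]) * width
    y = (bounds["north"] - lat) / (bounds["north"] - bounds["south"]) * height
    return int(x), int(y)

portal = choose_portal_position(35.68, 139.77)
print(to_screen(*portal, {"north": 35.70, "south": 35.66, "west": 139.75, "east": 139.79}, 640, 480))
```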
- the quest may also be realized as a location information game without using the location registration information of the user terminal 100; in that case, virtual location information is used instead of the actual location registration information of the user terminal 100.
- the game screen 800D displays a "home” icon 828 and a "live” icon 822.
- when an input operation for the “home” icon 828 is received, the game progress unit 115 causes the display unit 152 to display the home screen 800A. Further, when an input operation for the “live” icon 822 is received, the game progress unit 115 causes the display unit 152 to display the live selection screen 800B.
- the game screen 800D can transition to the home screen 800A or the live selection screen 800B. That is, the live selection screen 800B can be transitioned not only from the home screen 800A but also from the game screen 800D. As described above, the game screen 800D may be configured so as not to be directly transitioned to the overlooked selection screen 800C for the purpose of guiding the user to watch the live distribution in real time.
- Appendix 1 One aspect of the embodiments shown in the present disclosure is a method for progressing a game executed on a user terminal (user terminal 100A) including a processor, a memory, and a display unit. The game includes a first part (location information game part) that progresses according to the user's operations and a second part (live distribution game part) that displays content on the display unit based on first information distributed live via a server. The method comprises, executed by the processor: a first step of associating with the user a privilege that can be used not in the first part but in the second part when a predetermined achievement condition is satisfied in the first part; a step of accepting a specific action by the user in the first part and switching the progress of the game from the first part to the second part according to the result of the specific action; and a second step (S107) of performing a process in which the privilege associated with the user in the first step is given to the distributor of the live distribution source based on a first operation during the second part.
- Appendix 2 In Appendix 1, the result of the specific action includes that the virtual position on the map data of the user terminal in the first part becomes a predetermined position.
- Appendix 3 In Appendix 1 or 2, the result of the particular action includes the completion of the predetermined scenario associated with the first part.
- Appendix 4 In any one of Appendices 1 to 3, the method further comprises, by the processor, a step of requesting a second progress of the second part after the completion of the first progress of the second part, and a step of executing the second progress of the second part by operating the avatar object of the distributor based on the operation instruction data distributed again from the server.
- Appendix 5 In Appendix 4, in the progress of the second part after the end of the first progress of the second part, the privilege associated with the user in the first step is not given to the distributor of the live distribution source.
- Appendix 6 In Appendix 5, the second progress of the second part is executed based on the operation instruction data and the record of actions performed through the user's input operations received during the first progress of the second part.
- Appendix 7 In Appendix 6, the record of the action includes time information, and the second progress of the second part is performed according to a designation of the above time information, which corresponds to the user's input operations via the operation unit during the first progress of the second part.
- Appendix 8 In Appendix 6 or 7, the action includes selecting a specific progress portion through the user's input operation via the operation unit during the first progress of the second part, and in the second progress of the second part, only the selected specific progress portion is progressed.
- Appendix 9 One aspect of the embodiments presented in the present disclosure is a method for progressing a game executed on a user terminal including a processor, a memory, and a display unit. The game includes a first part that progresses according to the user's operations and a second part that displays content on the display unit based on first information distributed live via a server. The method comprises, by the processor: a step of associating with the user a privilege that becomes available in the second part rather than in the first part when a predetermined achievement condition is satisfied in the first part; a step of requesting the progress of the completed second part; a step of receiving from the server recorded operation instruction data, the operation instruction data including motion data and voice data input by the distributor of the live distribution source; and a step of executing the progress of the second part by operating the avatar object of the distributor based on the operation instruction data.
- Appendix 10 In Appendix 9, after the first part, a real-time second part that operates the avatar object in real time based on the operation instruction data can be advanced, and the mode of progress of the real-time second part differs from the mode of progress of the completed second part.
- Appendix 11 In Appendix 10, in the progress of the real-time second part, the privilege associated with the user can be associated with the distributor of the live distribution source, whereas in the progress of the completed second part, the privilege associated with the user is not associated with the distributor of the live distribution source.
- Appendix 12 In Appendix 10 or 11, in the progress of the completed second part, the consumption of valuable data by the input operation of the user via the operation unit is not accepted.
- Appendix 14 The display unit is configured to be able to display a first screen that displays a menu for advancing the first part or the second part, a second screen, reachable from the first screen, that displays live information that can be distributed, and a third screen that displays information about lives distributed in the past, and the third screen is configured so that it cannot be transitioned to directly from the first screen.
- Appendix 15 In Appendix 14, the display unit is further configured to be able to display a fourth screen for advancing the first part, and the third screen is configured so that it cannot be transitioned to directly from the fourth screen.
- Appendix 17 A computer-readable medium containing computer-executable instructions that, when executed, cause the processor to perform steps comprising any of the methods of Appendices 1 to 16.
- Appendix 18 One aspect of the embodiments shown in the present disclosure is an information processing apparatus for progressing a game, comprising a processor, a memory, and a display unit. The game includes a first part that progresses according to the user's operations and a second part that displays content on the display unit based on first information distributed live via a server. By executing instructions stored in the memory, the processor requests and receives from the server recorded operation instruction data, the operation instruction data including motion data and voice data input by the distributor of the live distribution source, and executes the progress of the second part by operating the avatar object of the distributor based on the operation instruction data.
- the control blocks (in particular, the control units 110, 210, 310, and 410) of the user terminal 100, the server 200, the game play terminal 300 (HMD set 1000), and the distribution terminal 400 may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software.
- in the latter case, the user terminal 100, the server 200, the game play terminal 300 (HMD set 1000), and the distribution terminal 400 each include a computer that executes the instructions of the program, that is, the software that realizes each function.
- the computer includes, for example, one or more processors and a computer-readable recording medium that stores the program. Then, in the computer, the processor reads the program from the recording medium and executes the program, thereby achieving the object of the present disclosure.
- examples of the processor include a CPU (Central Processing Unit).
- as the recording medium, a “non-transitory tangible medium” such as a ROM (Read Only Memory), a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
- a RAM (Random Access Memory) or the like for loading the program may further be provided.
- the program may be supplied to the computer via any transmission medium (a communication network, broadcast waves, or the like) capable of transmitting the program. One aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Environmental & Geological Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention relates to a game method, a computer-readable medium, and an information terminal device capable of increasing interest and enjoyment. The present invention relates to a method for progressing a game executed on a user terminal provided with a processor, a memory, and a display unit. The game includes: a first part in which the game progresses in accordance with user operations; and a second part that displays content on the display unit based on first information distributed live via a server. Executed by the processor, the method includes: a first step of associating with a user a privilege that can be used during the second part but not during the first part, insofar as the user satisfies a prescribed achievement condition during the first part; a step of accepting a specific action by the user in the first part and switching the progress of the game from the first part to the second part according to the result of the specific action; and a second step of performing a process by which the privilege associated with the user in the first step is granted to a distributor of a live distribution source based on a first operation during the second part.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/044443 WO2022113326A1 (fr) | 2020-11-30 | 2020-11-30 | Procédé de jeu, support lisible par ordinateur, et dispositif terminal d'information |
JP2022564983A JPWO2022113326A1 (fr) | 2020-11-30 | 2020-11-30 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/044443 WO2022113326A1 (fr) | 2020-11-30 | 2020-11-30 | Procédé de jeu, support lisible par ordinateur, et dispositif terminal d'information |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022113326A1 true WO2022113326A1 (fr) | 2022-06-02 |
Family
ID=81755487
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/044443 WO2022113326A1 (fr) | 2020-11-30 | 2020-11-30 | Procédé de jeu, support lisible par ordinateur, et dispositif terminal d'information |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2022113326A1 (fr) |
WO (1) | WO2022113326A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004193827A (ja) * | 2002-12-10 | 2004-07-08 | Canon Inc | 動画像再生装置 |
JP2019071958A (ja) * | 2017-10-12 | 2019-05-16 | 株式会社バンダイナムコエンターテインメント | コンテンツ配信システム及びコンピュータシステム |
JP2020049286A (ja) * | 2018-05-29 | 2020-04-02 | 株式会社コロプラ | ゲームプログラム、方法、および情報処理装置 |
JP2020156739A (ja) * | 2019-03-26 | 2020-10-01 | 株式会社コロプラ | ゲームプログラム、ゲーム方法、および情報端末装置 |
JP2020166559A (ja) * | 2019-03-29 | 2020-10-08 | 株式会社バンダイナムコエンターテインメント | サーバシステムおよび動画配信システム |
2020
- 2020-11-30 WO PCT/JP2020/044443 patent/WO2022113326A1/fr active Application Filing
- 2020-11-30 JP JP2022564983A patent/JPWO2022113326A1/ja active Pending
Also Published As
Publication number | Publication date |
---|---|
JPWO2022113326A1 (fr) | 2022-06-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7286588B2 (ja) | ゲームプログラム | |
JP6776393B2 (ja) | 視聴プログラム、視聴方法、および情報端末装置 | |
JP2021010181A (ja) | 視聴プログラム、視聴方法、および情報端末装置 | |
JP2021053365A (ja) | プログラム、方法、および視聴端末 | |
JP6722320B1 (ja) | ゲームプログラム、ゲーム方法、および情報端末装置 | |
WO2020262332A1 (fr) | Programme de jeu, procédé de jeu, et dispositif de terminal d'informations | |
JP6796158B2 (ja) | ゲームプログラム、ゲーム方法、および情報端末装置 | |
JP6776394B2 (ja) | プログラム、ゲーム方法、コンピュータ、および情報端末装置 | |
JP7437480B2 (ja) | プログラム、方法、およびコンピュータ | |
JP6813618B2 (ja) | 視聴プログラム、視聴方法、視聴端末、配信プログラム、配信方法、および情報端末装置 | |
WO2022137519A1 (fr) | Procédé de visualisation, support lisible par ordinateur, système informatique et dispositif de traitement d'informations | |
JP6770603B2 (ja) | ゲームプログラム、ゲーム方法、および情報端末装置 | |
JP6754859B1 (ja) | プログラム、方法、およびコンピュータ | |
JP6952730B2 (ja) | プログラム、方法、情報処理装置、およびシステム | |
JP6776425B1 (ja) | プログラム、方法、および配信端末 | |
JP6826645B1 (ja) | プログラム、方法、および端末装置 | |
WO2022113326A1 (fr) | Procédé de jeu, support lisible par ordinateur, et dispositif terminal d'information | |
JP2021006274A (ja) | プログラム、ゲーム方法、コンピュータ、および情報端末装置 | |
JP2021053406A (ja) | プログラム、方法、および端末装置 | |
JP2021175436A (ja) | ゲームプログラム、ゲーム方法、および端末装置 | |
WO2022137522A1 (fr) | Procédé de jeu, système informatique, support lisible par ordinateur, et dispositif terminal d'information | |
WO2022113329A1 (fr) | Procédé, support lisible par ordinateur, système informatique et dispositif de traitement d'informations | |
WO2022137377A1 (fr) | Procédé de traitement d'informations, support lisible par ordinateur, système informatique, et dispositif de traitement d'informations | |
JP7087148B2 (ja) | ゲームプログラム、ゲーム方法、および情報端末装置 | |
WO2022137523A1 (fr) | Procédé de jeu, support lisible par ordinateur, et dispositif de traitement d'information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20963590 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022564983 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20963590 Country of ref document: EP Kind code of ref document: A1 |