CN114650872A - Game device, game system, control system, game device operation method, control system operation method, and program


Info

Publication number
CN114650872A
CN114650872A
Authority
CN
China
Prior art keywords
game
data
user
control
playback
Prior art date
Legal status
Pending
Application number
CN202080030268.3A
Other languages
Chinese (zh)
Inventor
奥秋政人
东尚吾
Current Assignee
Kelomei Entertainment Co ltd
Original Assignee
Kelomei Entertainment Co ltd
Priority date
Filing date
Publication date
Priority claimed from JP2019084229A external-priority patent/JP7419635B2/en
Priority claimed from JP2019084230A external-priority patent/JP6762519B1/en
Application filed by Kelomei Entertainment Co ltd
Publication of CN114650872A

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F13/30 Interconnection arrangements between game servers and game devices; Interconnection arrangements between game devices; Interconnection arrangements between game servers
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5375 Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators for graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/67 Generating or modifying game content before or while executing the game program adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
    • A63F13/70 Game security or game management aspects
    • A63F13/79 Game security or game management aspects involving player-related data, e.g. identities, accounts, preferences or play histories
    • A63F13/80 Special adaptations for executing a specific game genre or game mode

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Business, Economics & Management (AREA)
  • Computer Security & Cryptography (AREA)
  • General Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A game device includes: a game control unit that controls a game in which a prize in an amount corresponding to the amount of game media used is given to a user according to the result of the game; a progress status data generation unit that generates progress status data indicating the progress status of the game; a transmission control unit that transmits the progress status data from a communication device to a control system via a communication network; and a reception control unit that receives, from the control system via the communication network through the communication device, progress instruction data instructing the progress of the game. The game control unit controls the game in accordance with the progress instruction data received by the reception control unit.

Description

Game device, game system, control system, game device operation method, control system operation method, and program
Technical Field
The present invention relates to a game.
Background
For example, Patent Document 1 discloses a game device including a field on which a ball serving as a game medium rolls, and a roulette portion provided with a plurality of pockets that the ball can enter. In this game device, a game is executed whose result is determined by which of the plurality of pockets a ball thrown onto the field enters.
[Prior Art Documents]
[Patent Documents]
[Patent Document 1] Japanese Patent Application Laid-Open No. 2004-89514.
Disclosure of Invention
[Problems to be Solved by the Invention]
It is difficult for a user with little experience playing a given game to accurately understand what the user should do in the user's current situation. In view of the above circumstances, an object of the present invention is to effectively support a user's play of a game.
[ means for solving problems ]
To solve the above problem, a game device according to a preferred aspect of the present invention includes: a game control unit that controls a game; a user data generating unit that generates user data indicating a situation of a user who plays the game; a transmission control unit for transmitting the user data to a control system via a communication network via a communication device; a reception control unit for receiving, from the control system via the communication network, playback control data for causing the virtual character to be played back to the playback device via the communication device; and a playback control unit that plays the virtual character on the playback device in parallel with the control of the game by the game control unit, based on the playback control data.
In order to solve the above problem, a game system according to a preferred aspect of the present invention is a game system including a game device and a control system that can communicate with each other via a communication network, the game device including: a game control unit that controls a game; a user data generating unit that generates user data indicating a situation of a user who plays the game; a first transmission control unit that transmits the user data to the control system via a first communication device through the communication network; a first reception control unit that receives, from the control system via the communication network, playback control data for causing the virtual character to be played back to a playback device via the first communication device; and a playback control unit that causes the virtual character to be played back on the playback device in accordance with the playback control data in parallel with control of the game by the game control unit, the control system including: a second reception control unit for receiving the user data from the game device via a second communication device via the communication network; a playback control data generation unit that generates the playback control data; and a second transmission control unit that transmits the playback control data generated by the playback control data generation unit to the game device via the second communication device via the communication network.
To solve the above problem, an operation method of a game device according to a preferred aspect of the present invention is a game device for controlling a game to execute: generating user data indicating a status of a user who plays the game; transmitting the user data to a control system through a communication network by a communication device; receiving, from the control system through the communication network, play control data for causing the virtual character to be played on the playing device by the communication device; and in parallel with the control of the game, the virtual character is played on the playing device according to the playing control data.
To solve the above problem, a program according to a preferred aspect of the present invention causes one or more processors included in a game device to execute: a game control process of controlling a game; a user data generation process of generating user data indicating a situation of a user who plays the game; a transmission process of transmitting the user data to a control system via a communication network via a communication device; a receiving process of receiving, from the control system via the communication network, play control data for causing the virtual character to be played on the playing device by the communication device; and a playing process of playing the virtual character to the playing device in parallel with the game control process according to the playing control data.
Drawings
Fig. 1 is a block diagram illustrating the structure of a game system according to a first embodiment.
Fig. 2 is a plan view of the game device.
Fig. 3 is a block diagram illustrating the structure of a game device.
Fig. 4 is a block diagram illustrating the structure of each station.
Fig. 5 is a block diagram illustrating the functional structure of the game device.
Fig. 6 is a flowchart illustrating a specific procedure of the game.
FIG. 7 is a schematic diagram of user data.
FIG. 8 is a display example of a virtual character.
Fig. 9 is a block diagram illustrating the structure of the control system.
Fig. 10 is a block diagram illustrating the structure of the functions of the control system.
Fig. 11 is a display example of a management screen.
Fig. 12 is a schematic diagram of playback control data.
Fig. 13 is a flowchart illustrating a procedure of an operation of the playback control section.
Fig. 14 is a flowchart illustrating a procedure of the overall operation of the game system.
Fig. 15 is a block diagram of a progress instruction data generation unit in the second embodiment.
Fig. 16 is an explanatory diagram of machine learning for the generation model.
Fig. 17 is a block diagram of a playback control data generation unit in the second embodiment.
Fig. 18 is an explanatory diagram of machine learning for the generation model.
Fig. 19 is a flowchart illustrating an operation of the game device according to the third embodiment.
Fig. 20 is a plan view of a game device in a modification.
Detailed Description
Embodiments of the present invention are described below with reference to the accompanying drawings. The embodiments described below include various technically preferable limitations. The scope of the present invention is not limited to the embodiments exemplified below.
[ first embodiment ]
Fig. 1 is a block diagram illustrating the structure of a game system 1 according to the first embodiment. As shown in fig. 1, the game system 1 of the first embodiment includes a control system 10 and a game device 20. The control system 10 and the game device 20 can communicate with each other through a communication network 2 such as the Internet. The communication between the control system 10 or the game device 20 and the communication network 2 may be either wired or wireless. In an actual game system 1, each of a plurality of game devices 20 can communicate with the control system 10, but the following description focuses on one game device 20 where appropriate.
The game device 20 is installed in, for example, a game facility. The game facility is an entertainment facility such as a game arcade, or a commercial facility such as a shopping center. The game device 20 is a computer system that provides a game to users (i.e., players) Pa in the game facility. The control system 10 is a computer system for controlling the operation of the game device 20. The control system 10 is installed outside the game facility and is operated by a manager Pb. The operation of the game device 20 is remotely controlled by the control system 10.
The user Pa plays the game on the game device 20 by using game media. A game medium is a tangible or intangible medium of value used in the game. Tangible game media are, for example, medals (tokens) or tickets. Intangible game media are, for example, electronic medals, credits, or points. The amount of intangible game media is stored as electronic data in a recording medium such as an IC card. When the user Pa plays the game, a prize corresponding to the result of the play, such as electronic medals, credits, or points, is given to the user Pa. When the game media used for play are intangible, the prize given to the user Pa may be of the same kind as or a different kind from those game media.
Fig. 2 is a plan view illustrating the structure of the game device 20. As shown in fig. 2, the game device 20 of the first embodiment includes a lottery mechanism 21 and N stations S1 to SN (N is a natural number of 1 or more). For convenience, fig. 2 illustrates a game device 20 including six stations S1 to S6 (N = 6). The lottery mechanism 21 has a substantially circular shape in plan view from the vertical direction and is used for a physical lottery using lottery bodies B. Each lottery body B is a three-dimensional object such as a sphere. The N stations S1 to SN are arranged in the circumferential direction so as to surround the lottery mechanism 21. In the first embodiment, one or more users Pa play the game using different stations Sn (n = 1 to N). The game progresses in parallel at the N stations S1 to SN.
As shown in fig. 2, the lottery mechanism 21 includes a physical lottery unit 211, a lottery area 212, and an input mechanism 213. The physical lottery unit 211 is a disc-shaped structure supported rotatably. A plurality of (for example, 25) lottery holes H are formed in the physical lottery unit 211 at equal intervals in the circumferential direction. Each of the plurality of lottery holes H is a pocket that a lottery body B can enter. A different lottery number is assigned to each of the plurality of lottery holes H. The physical lottery unit 211 of the first embodiment is driven to rotate by a driving mechanism (not shown) such as a motor. The physical lottery unit 211 may instead be non-rotating.
The lottery area 212 is an annular plate-like member that surrounds the physical lottery unit 211 in plan view from the vertical direction. The surface of the lottery area 212 is an inclined surface that descends from the outer periphery toward the inner periphery (i.e., toward the physical lottery unit 211). The input mechanism 213 is provided near the outer periphery of the lottery area 212 and sequentially drops lottery bodies B onto the surface of the lottery area 212. A lottery body B dropped by the input mechanism 213 gradually approaches the physical lottery unit 211 while rolling on the surface of the lottery area 212, and finally enters one of the plurality of lottery holes H of the physical lottery unit 211. The lottery number corresponding to the lottery hole H that the lottery body B entered is the result of the lottery. That is, the physical lottery by the lottery mechanism 21 is an operation of randomly selecting one of a plurality of lottery numbers (an example of lottery elements) corresponding to the respective lottery holes H.
Each of the N stations S1 to SN is a unit used by a user Pa to play the game. As shown in fig. 2, each station Sn is provided with a receiving mechanism 22. The receiving mechanism 22 includes an input slot into which tangible game media such as medals are inserted and a payout slot from which tangible game media are discharged as a prize. However, for example, a reading circuit that reads intangible game media such as electronic medals or credits from a recording medium such as an IC card, and a writing circuit that writes intangible game media to such a recording medium, may also be used as the receiving mechanism 22.
Fig. 3 is a block diagram illustrating the structure of the game device 20. As shown in fig. 3, the game device 20 of the first embodiment includes the aforementioned lottery mechanism 21 and N stations S1 to SN, and further includes a control device 23, a storage device 24, a communication device 25, an operation device 26, and a playback device 27. The game device 20 may be realized as a single device, or may be realized as a set (i.e., a system) of a plurality of devices configured as different units.
The control device 23 is, for example, a single or a plurality of processors that control the respective elements of the game device 20. Specifically, the control device 23 is composed of one or more types of processors such as a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), a DSP (Digital Signal Processor), an FPGA (Field Programmable Gate Array), or an ASIC (Application Specific Integrated Circuit).
The storage device 24 is a single or a plurality of memories for storing programs executed by the control device 23 and various data used by the control device 23. As the storage device 24, a known recording medium such as a semiconductor recording medium or a magnetic recording medium, or a combination of a plurality of types of recording media is used. The communication device 25 (an example of a first communication device) communicates with the control system 10 through the communication network 2 under the control by the control device 23. The operation device 26 is constituted by a plurality of operation elements, for example, and receives an operation by a staff member at a game facility, for example.
The playback device 27 plays back a moving image including video and audio. Specifically, the playback device 27 includes a display device 271 that displays the video and a sound reproduction device 272 that reproduces the audio. The display device 271 is a display panel such as a liquid crystal display panel or an organic light-emitting diode display panel. The sound reproduction device 272 is, for example, a speaker.
As shown in fig. 2, the playing device 27 of the first embodiment is disposed above the lottery mechanism 21. The playback device 27 can move to a position corresponding to any one of the users Pa (specifically, a position corresponding to any one of the stations Sn). That is, the playback device 27 is movable circumferentially about the center axis of the lottery mechanism 21 as indicated by the arrow in fig. 2, and is rotatable about the rotation axis in the vertical direction at an arbitrary point on the circumference. Therefore, the position and angle of the playback device 27 facing the user Pa using any one of the N stations S1 through SN can be controlled. Specifically, the display surface of the display device 271 and the sound emitting surface of the sound emitting device 272 face the user Pa who uses the specific station Sn. That is, the playback device 27 can play back the moving image toward a specific user Pa among the plurality of users Pa.
FIG. 4 is a block diagram illustrating the structure of each of the N stations S1 to SN. As shown in fig. 4, each station Sn includes the receiving mechanism 22, a playback device 31, a recording device 32, and an operation device 33. The playback device 31 plays back a moving image including video and audio. Specifically, the playback device 31 includes a display device 311 that displays the video and a sound reproduction device 312 that reproduces the audio. The display device 311 is a display panel such as a liquid crystal display panel or an organic light-emitting diode display panel, and the sound reproduction device 312 is, for example, a speaker. As understood from the above description, the playback device 27 is shared by a plurality of users Pa, whereas the playback device 31 is used individually by the user Pa of each station Sn.
The recording device 32 records a moving image including video and audio. Specifically, the recording device 32 includes an imaging device 321 that images the user Pa and a sound pickup device 322 that collects the voice of the user Pa. The imaging device 321 includes an optical system such as an imaging lens and an imaging element that converts light incident from the optical system into an electric signal. The sound pickup device 322 is a microphone that generates an electric signal corresponding to ambient sound.
The operation device 33 accepts an operation by the user Pa. For example, the operation device 33 is a plurality of operation elements operated by the user Pa, or a touch panel integrally formed with the display device 311 for detecting a contact by the user Pa.
Fig. 5 is a block diagram illustrating a functional configuration of the game device 20. As shown in fig. 5, the control device 23 in the game device 20 of the first embodiment realizes a plurality of functions (a game control unit 231, a progress data generation unit 232, a user data generation unit 233, a playback control unit 234, a transmission control unit 235, and a reception control unit 236) by executing programs stored in the storage device 24. The control device 23 may be configured by N control units corresponding to the N stations S1 to SN and a control unit that integrally controls the N stations S1 to SN. That is, a plurality of functions of the game device 20 illustrated in fig. 5 are also realized by a plurality of control units. For example, the game control unit 231 and the playback control unit 234 may be provided for each station Sn.
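The functional split described above can be pictured with a short, non-normative Python sketch. The class and method names below (GameDeviceController, transmit, receive, and so on), as well as the data passed around, are illustrative assumptions and are not disclosed in the specification.

```python
class GameDeviceController:
    """Sketch of the functional units realized by control device 23 running a stored program."""

    def __init__(self, stations, lottery, playback_device, network):
        self.stations = stations                # N stations S1..SN
        self.lottery = lottery                  # lottery mechanism 21
        self.playback_device = playback_device  # movable playback device 27
        self.network = network                  # communication device 25 (hypothetical wrapper)

    def control_game(self, progress_instruction=None):      # game control unit 231
        ...

    def generate_progress_data(self):                        # progress status data generation unit 232
        ...

    def generate_user_data(self):                            # user data generation unit 233
        ...

    def play_virtual_character(self, playback_control_data): # playback control unit 234
        ...

    def transmit(self, progress_data, user_data):            # transmission control unit 235
        self.network.send({"G": progress_data, "U": user_data})

    def receive(self):                                       # reception control unit 236
        return self.network.receive()  # progress instruction data Q, playback control data C
```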
The game control unit 231 controls the progress of the game. The game of the first embodiment is a lottery game in which a prize for the user Pa is determined based on the result of the physical lottery by the lottery mechanism 21. For example, a card game in which a prize is won when lottery numbers sequentially selected by the physical lottery form a line among a plurality of lottery numbers arranged on a card is a suitable example of the lottery game. However, the type of game executed by the game device 20 is not limited to the above example. The game control unit 231 advances the game using the lottery mechanism 21 in parallel for the N stations S1 to SN.
Fig. 6 is a flowchart illustrating a specific procedure of the progress of the game. The process illustrated in fig. 6 is repeated. When the process starts, the game control unit 231 accepts the use of game media by each user Pa (Sa1). Each user Pa can use a desired amount of game media (hereinafter referred to as the "usage amount") via the operation device 33. The game control unit 231 accepts an instruction regarding the usage amount of game media from each user Pa during a reception period. The reception period is a period of predetermined length for accepting instructions for the use of game media from the users Pa.
When the reception period ends, the game control unit 231 controls the input mechanism 213 to sequentially drop a plurality of lottery bodies B onto the surface of the lottery area 212 (Sa2). The game control unit 231 determines whether or not each user Pa has won, based on which of the plurality of lottery numbers corresponds to the lottery hole H that each lottery body B entered (Sa3). The game control unit 231 then gives a prize to each user Pa whose win is established (Sa4). Specifically, game media in an amount corresponding to the usage amount accepted from the user Pa are given to the user Pa as the prize. As understood from the above example, in the game of the first embodiment, a prize in an amount corresponding to the usage amount of game media by the user Pa is given to the user Pa according to the result of the game.
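A minimal sketch of one round (steps Sa1 to Sa4) is given below. The win condition, the payout odds, and the representation of bets are placeholders chosen only for illustration; the specification does not fix these details.

```python
import random

def run_round(bets, lucky_numbers, num_holes=25, num_balls=3, odds=10):
    """bets: usage amount accepted from each user in Sa1 (hypothetical representation).
    lucky_numbers: numbers each user needs; a stand-in for the real win condition."""
    # Sa2: sequentially drop lottery bodies B; each settles into one of the lottery holes H
    drawn = [random.randrange(num_holes) for _ in range(num_balls)]
    prizes = {}
    for user, bet in bets.items():
        # Sa3: a win is established here when any drawn lottery number matches the user's numbers
        if set(drawn) & set(lucky_numbers[user]):
            # Sa4: prize in an amount according to the usage amount accepted from that user
            prizes[user] = bet * odds
    return drawn, prizes

# usage example with two stations
drawn, prizes = run_round(bets={"S1": 5, "S2": 3},
                          lucky_numbers={"S1": [7, 12], "S2": [3]})
```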
The progress data generating unit 232 in fig. 5 generates progress data G indicating the progress of the game. Specifically, the progress status data generating unit 232 generates the progress status data G indicating the current stage of the game (any of steps Sa1 to Sa4 in fig. 6), the status of awards obtained by the users Pa, the usage status of game media by the users Pa, and the result of a physical lottery (winning number).
The user data generation unit 233 generates user data U indicating the status of each user Pa. The user data U is data indicating a moving image recorded by the recording device 32 of each station Sn. Specifically, as shown in fig. 7, the user data U includes video data U1 representing a video captured by the imaging device 321 of each station Sn and audio data U2 representing audio collected by the sound pickup device 322 of each station Sn. The file format of the video data U1 and the audio data U2 is arbitrary.
The video data U1 is data representing video of each of the plurality of users Pa. The user data generation unit 233 generates the video data U1 by performing predetermined image processing on the video signal supplied from the imaging device 321 of each station Sn. The video data U1 can also be said to be data indicating the motion or facial expression of each user Pa. On the other hand, the audio data U2 is data representing the voice uttered by each of the plurality of users Pa. The user data generation unit 233 generates the audio data U2 by performing predetermined audio processing on the audio signal supplied from the sound pickup device 322 of each station Sn. The audio data U2 can also be said to be data representing the content or intonation of each user Pa's speech. Further, generation of the video data U1 and the audio data U2 may be limited to those users Pa, among the plurality of users Pa, who have given permission to be recorded.
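The two kinds of data generated on the game device side can be summarized with the following sketch. The field names and types are assumptions; the specification only states what each piece of data represents.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProgressData:                 # progress status data G
    stage: str                      # current step of the game (Sa1 to Sa4)
    prizes: dict                    # prize status of each user Pa
    usage: dict                     # usage of game media by each user Pa
    lottery_result: Optional[int]   # winning lottery number, if already drawn

@dataclass
class UserData:                     # user data U
    video: bytes                    # video data U1 (footage from imaging device 321)
    audio: bytes                    # audio data U2 (speech from sound pickup device 322)
```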
The playback control unit 234 in fig. 5 causes the playback device 27 and each playback device 31 to execute playback operations. The playback control unit 234 of the first embodiment causes a game screen indicating the progress of the game to be displayed on the display device 311 of each playback device 31, and causes sound corresponding to the progress of the game to be reproduced by the sound reproduction device 312 of each playback device 31. As shown in fig. 8, the playback control unit 234 also causes the playback device 27 to play back an animation of the virtual character V in parallel with the control of the game by the game control unit 231. The virtual character V is an object representing a virtual creature in a virtual space and supports each user Pa's play of the game. Specifically, the playback control unit 234 causes the display device 271 of the playback device 27 to display the video of the virtual character V, and causes the sound reproduction device 272 of the playback device 27 to reproduce the voice of the virtual character V.
The playback control unit 234 of the first embodiment can control the position and angle of the playback device 27. Because the playback device 27 plays back the animation of the virtual character V while facing a specific user Pa among the plurality of users Pa, that user Pa can perceive the video and voice of the virtual character V. That is, an atmosphere is created as if the virtual character V were conversing with that user Pa.
The transmission control unit 235 and the reception control unit 236 in fig. 5 control communication with the control system 10 using the communication device 25. The transmission control unit 235 (an example of a first transmission control unit) transmits various data from the communication device 25 to the control system 10 via the communication network 2. Specifically, the transmission control unit 235 transmits the progress status data G generated by the progress status data generation unit 232 and the user data U generated by the user data generation unit 233 from the communication device 25 to the control system 10.
The reception control unit 236 (an example of a first reception control unit) receives various data from the control system 10 through the communication device 25 via the communication network 2. Specifically, the reception control unit 236 receives, from the control system 10 via the communication device 25, progress instruction data Q instructing the progress of the game and playback control data C for causing the virtual character V to be played back by the playback device 27. The details of the progress instruction data Q and the playback control data C will be described later.
Fig. 9 is a block diagram illustrating the structure of the control system 10. As shown in fig. 9, the control system 10 of the first embodiment includes a control device 11, a storage device 12, a communication device 13, a playback device 14, and an operation device 15. In addition, the control system 10 may be implemented as a collection of a plurality of devices configured by different individuals, instead of being implemented as a single device.
The control device 11 is, for example, a single or a plurality of processors that control the respective elements of the control system 10. Specifically, the control device 11 is configured by one or more kinds of processors such as a CPU, a GPU, a DSP, an FPGA, or an ASIC. The storage device 12 is a single or a plurality of memories that store programs executed by the control device 11 and various data used by the control device 11. For example, a conventional recording medium such as a semiconductor recording medium or a magnetic recording medium, or a combination of a plurality of types of recording media is used as the storage device 12. The communication device 13 (an example of a second communication device) communicates with the game device 20 through the communication network 2 under the control of the control device 11.
The playback device 14 plays back a moving image including video and audio. Specifically, the playback device 14 includes a display device 141 that displays the video and a sound reproduction device 142 that reproduces the audio. The display device 141 is a display panel such as a liquid crystal display panel or an organic light-emitting diode display panel. The sound reproduction device 142 is, for example, a speaker. The operation device 15 receives operations from the manager Pb who operates the control system 10. For example, the operation device 15 is a plurality of operation elements operated by the manager Pb, or a touch panel that is formed integrally with the display device 141 and detects contact by the manager Pb.
Fig. 10 is a block diagram illustrating the functional structure of the control system 10. As shown in fig. 10, the control device 11 in the control system 10 of the first embodiment realizes a plurality of functions (a progress instruction data generation unit 111A, a playback control data generation unit 112A, a playback control unit 113, a transmission control unit 114, and a reception control unit 115) by executing programs stored in the storage device 12.
The transmission control unit 114 and the reception control unit 115 control communication with the game device 20 using the communication device 13. The transmission control unit 114 (an example of a second transmission control unit) transmits various data from the communication device 13 to the game device 20 via the communication network 2. Specifically, the transmission control unit 114 transmits the progress instruction data Q and the playback control data C from the communication device 13 to the game device 20. The reception control unit 115 (an example of a second reception control unit) receives various data from the game device 20 via the communication device 13 via the communication network 2. Specifically, the reception control unit 115 receives the progress data G and the user data U from the game device 20 via the communication device 13.
The playback control unit 113 causes the playback device 14 to execute a playback operation. The playback control unit 113 of the first embodiment enables the playback device 14 to perform playback operations according to the progress status data G received by the reception control unit 115 and according to the user data U. The playback control unit 113 of the first embodiment causes the display device 141 of the playback device 14 to display the management screen R of fig. 11. The manager Pb can refer to the management screen R. As shown in fig. 11, the management screen R includes a first region R1, a second region R2, and a third region R3.
The playback control unit 113 causes the first region R1 to display a picture of each user Pa indicated by the video data U1 of the user data U. Specifically, N pictures G1 to GN corresponding to different stations Sn are displayed in the first region R1. Any one of the pictures Gn is an image captured by the camera 321 of the station Sn corresponding to the picture Gn. That is, the picture Gn is an image of the user Pa using the station Sn.
The manager Pb can select any one of the N pictures G1 to GN by operating the operation device 15. The playback control unit 113 enlarges and displays the one picture selected by the manager Pb among the N pictures G1 to GN (hereinafter referred to as the "selected picture" Gn) in the second region R2. Using the audio data U2, the playback control unit 113 causes the sound reproduction device 142 of the playback device 14 to reproduce the sound collected by the sound pickup device 322 of the station Sn corresponding to the selected picture Gn. That is, the voice of the user Pa using the station Sn corresponding to the selected picture Gn is emitted from the sound reproduction device 142. As understood from the above description, by viewing the management screen R displayed on the display device 141 and listening to the sound from the sound reproduction device 142, the manager Pb can grasp the movement and expression of the user Pa using each station Sn, and the content and intonation of that user Pa's speech.
The playback control unit 113 causes the progress status indicated by the progress status data G to be displayed in the third area R3. Specifically, the stage of the current game, the status of obtaining the bonus by each user Pa, and the result of the physical lottery (for example, winning number) are displayed in the third area R3. The manager Pb can confirm the progress of the game indicated by the progress data G on the management screen R.
The progress instruction data generation unit 111A in fig. 10 generates progress instruction data Q instructing the progress of the game. The progress instruction data Q of the first embodiment indicates either the input of a lottery body B into the lottery area 212 or the end (i.e., the closing) of the reception period for accepting the usage amount of game media from the users Pa.
The manager Pb can select an instruction regarding the progress of the game by the game device 20 from a plurality of candidates by referring to the management screen R and operating the operation device 15. The plurality of candidates includes, for example, the input of a lottery body B and the end of the reception period. For example, the manager Pb selects the end of the reception period at a stage where, by referring to the first region R1 or the second region R2 of the management screen R, the manager confirms that the usage amounts of game media have been accepted from all users Pa. The manager Pb selects the input of a lottery body B at a stage where, by referring to the first region R1 or the second region R2 of the management screen R, the manager confirms that all users Pa are paying attention to the lottery mechanism 21. The progress instruction data generation unit 111A of the first embodiment generates the progress instruction data Q in accordance with the instruction given by the manager Pb to the operation device 15. Specifically, progress instruction data Q representing the instruction selected by the manager Pb is generated.
As described above, in the first embodiment, the user data U including the video data U1 representing the video of the user Pa and the audio data U2 representing the voice of the user Pa is transmitted from the game device 20 to the control system 10. Therefore, the control system 10 can generate progress instruction data Q appropriate for the situation of the user Pa. There is also an advantage that various kinds of progress instruction data Q corresponding to the progress situation can be generated in accordance with instructions from the manager Pb, who refers to the progress status of the game indicated by the progress status data G.
The playback control data generation unit 112A in fig. 10 generates playback control data C for causing the virtual character V to be played back to the playback device 27. As shown in fig. 12, the playback control data C includes motion data C1 indicating the motion of the virtual character V, sound data C2 indicating the sound of the virtual character V, and instruction data C3 specifying any one of the N stations S1 to SN.
The manager Pb can instruct the motion and voice of the virtual character V by referring to the management screen R and operating the operation device 15. For example, the manager Pb selects each of the motion and voice of the virtual character V from a plurality of candidates. The manager Pb also selects, from the N stations S1 to SN, the station Sn of the user Pa who is to be the target of the playback of the virtual character V. The playback control data generation unit 112A generates the playback control data C in accordance with the instructions from the manager Pb. Specifically, the playback control data generation unit 112A generates playback control data C including motion data C1 representing the motion selected by the manager Pb, sound data C2 representing the voice selected by the manager Pb, and instruction data C3 designating the station Sn selected by the manager Pb.
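The structure of the playback control data C and its generation from the manager's selections can be sketched as follows. The field types (strings, bytes, integer station index) are placeholders; the specification does not define a concrete encoding.

```python
from dataclasses import dataclass

@dataclass
class PlaybackControlData:
    motion: str    # motion data C1: motion of the virtual character V
    sound: bytes   # sound data C2: utterance of the virtual character V
    station: int   # instruction data C3: index n of the target station Sn

def generate_from_manager(selected_motion, selected_sound, selected_station):
    # First embodiment: generated according to the manager Pb's selections on operation device 15
    return PlaybackControlData(selected_motion, selected_sound, selected_station)
```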
As described above, in the first embodiment, the user data U including the video data U1 representing the video of the user Pa and the audio data U2 representing the audio of the user Pa is transmitted from the game device 20 to the control system 10. Accordingly, the playback control data C capable of playing back the virtual character V that performs an appropriate action or utterance according to the situation of the user Pa can be generated in the control system 10. Further, there is an advantage that the playback control data C of plural kinds corresponding to the situation of the user Pa can be generated in accordance with an instruction from the manager Pb referring to the situation of the user Pa indicated by the user data U.
The transmission control unit 114 transmits the progress instruction data Q generated by the progress instruction data generation unit 111A and the playback control data C generated by the playback control data generation unit 112A from the communication device 13 to the game device 20. As described above, the reception control unit 236 of the game device 20 receives, via the communication device 25, the progress instruction data Q and the playback control data C transmitted from the control system 10. As described below, the operation of the game device 20 is controlled in accordance with the progress instruction data Q and the playback control data C received by the reception control unit 236.
The game control unit 231 of fig. 5 controls the game based on the progress instruction data Q received by the reception control unit 236. Specifically, the game control unit 231 executes the operation indicated by the progress instruction data Q. For example, when the progress instruction data Q indicates the input of a lottery body B, the game control unit 231 controls the input mechanism 213 to drop a lottery body B onto the lottery area 212 (Sa2). When the progress instruction data Q indicates the end of the reception period, the game control unit 231 ends the reception period for accepting the usage amount of game media (Sa1). As understood from the above description, the progress of the game on the game device 20 is remotely controlled from the control system 10 in accordance with instructions from the manager Pb, who can refer to the progress status of the game on the game device 20 and the situation of each user Pa.
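A hedged sketch of how the game control unit 231 might dispatch on the received progress instruction data Q is shown below; the instruction identifiers and the game-object interface are hypothetical.

```python
def apply_progress_instruction(game, instruction_q):
    """game: hypothetical object exposing the input mechanism 213 and the reception period."""
    if instruction_q == "drop_lottery_body":
        game.input_mechanism.drop()      # drop a lottery body B onto lottery area 212 (Sa2)
    elif instruction_q == "end_reception":
        game.close_reception_period()    # end acceptance of game-media usage amounts (Sa1)
```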
The playback control unit 234 of fig. 5 causes the virtual character V to be played back by the playback device 27 based on the playback control data C received by the reception control unit 236 from the control system 10. Fig. 13 is a flowchart illustrating a specific procedure of a process in which the playback control unit 234 controls the playback of the virtual character V.
The playback control unit 234 moves the playback device 27 to a position and angle corresponding to the station Sn indicated by the instruction data C3 of the playback control data C (Sb1). That is, the playback device 27 is adjusted to a position and angle at which it faces the user Pa of that station Sn.
The playback control unit 234 causes the virtual character V to be displayed on the display device 271 (Sb2), making the virtual character V perform the motion represented by the motion data C1 of the playback control data C. The playback control unit 234 also causes the sound reproduction device 272 to reproduce the voice represented by the sound data C2 of the playback control data C (Sb3). With the above operations, the user Pa perceives the virtual character V as if it were moving and speaking toward the user Pa's station Sn.
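Steps Sb1 to Sb3 can be condensed into the following sketch, assuming a hypothetical interface for the movable playback device 27; none of these method names come from the specification.

```python
def play_virtual_character(playback_device, control_data):
    # Sb1: move playback device 27 so it faces the user Pa at the designated station Sn
    playback_device.move_to_station(control_data.station)
    # Sb2: display virtual character V performing the motion indicated by motion data C1
    playback_device.display.show_character(motion=control_data.motion)
    # Sb3: reproduce the voice indicated by sound data C2 from sound reproduction device 272
    playback_device.speaker.play(control_data.sound)
```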
As described above, in the first embodiment, the virtual character V is played back toward a specific user Pa among the plurality of users Pa playing the game. Therefore, the play of the game by that specific user Pa can be effectively supported. In particular, in the first embodiment, the virtual character V is played back while the playback device 27 is moved to a position corresponding to the specific user Pa among the plurality of users Pa, so the effect of being able to support the specific user Pa is particularly remarkable. There is also an advantage that the other users Pa can easily recognize the specific user Pa toward whom the virtual character V is being played back, and an advantage that spectators watching the plurality of users Pa play the game can likewise easily recognize that specific user Pa.
Fig. 14 is a flowchart illustrating a procedure of the overall operation of the game system 1. As shown in fig. 14, the game control unit 231 of the game device 20 controls the progress of the game (Sc 1). A specific procedure for controlling the progress of the game by the game control unit 231 is shown in fig. 6.
The progress data generator 232 generates progress data G indicating the progress of the game (Sc 2). The user data generation unit 233 also generates user data U indicating the status of the user Pa recorded by the recording device 32 of each station Sn (Sc 3). The order of generating the status data G (Sc2) and the user data U (Sc3) may be reversed. The transmission controller 235 transmits the progress data G and the user data U from the communication device 25 to the control system 10(Sc 4).
The reception control unit 115 of the control system 10 receives, via the communication device 13, the progress status data G and the user data U transmitted from the game device 20 (Sd1). The playback control unit 113 causes the playback device 14 to execute playback operations based on the progress status data G and the user data U received by the reception control unit 115 (Sd2). Specifically, the playback control unit 113 causes the display device 141 to display the management screen R based on the progress status data G and the video data U1 of the user data U, and causes the sound reproduction device 142 to reproduce the sound indicated by the audio data U2 of the user data U. The manager Pb, referring to the management screen R displayed on the display device 141 and the sound emitted from the sound reproduction device 142, operates the operation device 15 to give the control system 10 instructions regarding the progress of the game and instructions regarding the virtual character V.
The progress instruction data generation unit 111A generates the progress instruction data Q in accordance with an instruction from the manager Pb (Sd3). The playback control data generation unit 112A generates the playback control data C in accordance with an instruction from the manager Pb (Sd4). The order of generation of the progress instruction data Q (Sd3) and generation of the playback control data C (Sd4) may be reversed. The transmission control unit 114 transmits the progress instruction data Q and the playback control data C from the communication device 13 to the game device 20 (Sd5).
The reception control unit 236 of the game device 20 receives the progress instruction data Q and the playback control data C transmitted from the control system 10 via the communication device 25 (Sc 5). The game control unit 231 controls the progress of the game based on the progress instruction data Q received by the reception control unit 236 (Sc 6). Specifically, the game control unit 231 executes the operation indicated by the progress instruction data Q. Further, the playback control unit 234 causes the virtual character V to be played back to the playback device 27 based on the playback control data C received by the reception control unit 236 (Sc 7). The operation illustrated in fig. 14 is repeated periodically.
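One iteration of the overall flow of fig. 14 is sketched below, reusing the names assumed in the earlier sketches and abbreviating network communication to direct function calls; the control-system methods shown are likewise hypothetical.

```python
def one_cycle(game_device, control_system):
    game_device.control_game()                        # Sc1: advance the game
    g = game_device.generate_progress_data()          # Sc2: progress status data G
    u = game_device.generate_user_data()              # Sc3: user data U
    control_system.receive(g, u)                      # Sc4 / Sd1: transmit and receive G, U
    control_system.update_management_screen(g, u)     # Sd2: present management screen R
    q = control_system.make_progress_instruction()    # Sd3: progress instruction data Q
    c = control_system.make_playback_control()        # Sd4: playback control data C
    game_device.control_game(q)                       # Sd5 / Sc5 / Sc6: control game per Q
    game_device.play_virtual_character(c)             # Sc7: play back virtual character V per C
```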
As described above, in the first embodiment, the progress status data G is transmitted to the control system 10 via the communication device 25, and the game is controlled in accordance with the progress instruction data Q generated by the control system 10. Therefore, even if the player who controls the progress of the game by operating the operation device 33 is not near the game device 20, the progress of the game can be controlled from the control system 10. In particular, in the first embodiment, since the user data U indicating the status of the user Pa is added to the progress status data G and transmitted to the control system 10 via the communication device 25, there is an advantage that the progress instruction data Q can be generated in the control system 10 in accordance with both the progress status of the game and the status of the user Pa.
In addition, in the first embodiment, the user data U is transmitted to the control system 10 via the communication device 25, and the virtual character V is controlled according to the playback control data C generated by the control system 10. Therefore, the play of the game by the user Pa can be effectively supported. In addition, since it is not necessary to load a function for generating the playback control data C in accordance with the situation of the user Pa to the game device 20, there is also an advantage that the configuration and processing of the game device 20 are simplified.
[ second embodiment ]
A second embodiment of the present invention will now be described. In each of the embodiments exemplified below, elements whose functions are the same as in the first embodiment are denoted by the same reference numerals as in the first embodiment, and detailed description thereof is omitted as appropriate.
In the control system 10 of the second embodiment, the progress instruction data generation unit 111A of the first embodiment is replaced with a progress instruction data generation unit 111B shown in fig. 15. As described above, the progress instruction data generation unit 111A of the first embodiment generates the progress instruction data Q in accordance with instructions from the manager Pb. The progress instruction data generation unit 111B of the second embodiment generates the progress instruction data Q using a generation model M1 constructed by machine learning (Sd3).
Control data X is supplied to the generation model M1. The control data X includes the progress status data G and the user data U received from the game device 20. The generation model M1 is a trained model that has learned the relationship between the control data X and the progress instruction data Q. A statistical estimation model such as a deep neural network is suitably used as the generation model M1. Specifically, the generation model M1 is realized by a combination of a program that causes the control device 11 to execute an operation for generating the progress instruction data Q from the control data X, and a plurality of coefficients applied to that operation.
Fig. 16 is an explanatory diagram of machine learning for the generation model M1. As shown in fig. 16, the plurality of coefficients of the generation model M1 are set by deep learning using a plurality of pieces of teacher data L1. Each piece of teacher data L1 consists of a combination of control data X and progress instruction data Q (a correct answer value).
The control device 11 of the control system 10 of the second embodiment realizes the learning processing unit 51 shown in fig. 16 by executing a program stored in the storage device 12. The learning processing unit 51 sets the plurality of coefficients of the generation model M1. Specifically, the learning processing unit 51 repeatedly updates the coefficients of the generation model M1 so as to reduce the error between the progress instruction data Q that a provisional generation model M1 outputs in response to input of the control data X of each piece of teacher data L1, and the progress instruction data Q of that teacher data L1. The coefficients set by this deep learning are stored in the storage device 12, and the stored coefficients are applied to the generation model M1. As understood from the above description, the generation model M1 after machine learning outputs statistically appropriate progress instruction data Q for unknown control data X, under the tendency latent between the control data X and the progress instruction data Q in the plurality of pieces of teacher data L1.
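The training procedure just described can be illustrated with a minimal sketch, assuming PyTorch and assuming that the control data X and the progress instruction data Q have already been encoded as fixed-length vectors. The specification only requires a deep neural network whose coefficients are updated to reduce the output error on the teacher data L1; the layer sizes and optimizer below are arbitrary placeholders.

```python
import torch
import torch.nn as nn

model_m1 = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 8))  # generation model M1
optimizer = torch.optim.Adam(model_m1.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_m1(teacher_data_l1, epochs=10):
    """teacher_data_l1: iterable of (x, q) tensor pairs, x = control data X, q = correct Q."""
    for _ in range(epochs):
        for x, q in teacher_data_l1:
            optimizer.zero_grad()
            loss = loss_fn(model_m1(x), q)   # error of the provisional model's output vs. teacher data
            loss.backward()
            optimizer.step()                 # update the coefficients of generation model M1
```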
In the control system 10 of the second embodiment, the playback control data generation unit 112A of the first embodiment is replaced with a playback control data generation unit 112B shown in fig. 17. As described above, the playback control data generation unit 112A of the first embodiment generates the playback control data C in accordance with instructions from the manager Pb. The playback control data generation unit 112B of the second embodiment generates the playback control data C using a generation model M2 constructed by machine learning (Sd4).
Control data X is supplied to the generation model M2. As described above, the control data X includes the progress status data G and the user data U received from the game device 20. The control data X supplied to the generation model M1 and the control data X supplied to the generation model M2 may also differ from each other. The generation model M2 is a trained model that has learned the relationship between the control data X and the playback control data C. A statistical estimation model such as a deep neural network is suitably used as the generation model M2. Specifically, the generation model M2 is realized by a combination of a program that causes the control device 11 to execute an operation for generating the playback control data C from the control data X, and a plurality of coefficients applied to that operation.
Fig. 18 is an explanatory diagram of machine learning for the generation model M2. As shown in fig. 18, the plurality of coefficients of the generation model M2 are set by deep learning using a plurality of pieces of teacher data L2. Each piece of teacher data L2 consists of a combination of control data X and playback control data C (a correct answer value).
The control device 11 of the control system 10 of the second embodiment realizes the learning processing unit 52 shown in fig. 18 by executing a program stored in the storage device 12. The learning processing unit 52 sets the plurality of coefficients of the generation model M2. Specifically, the learning processing unit 52 repeatedly updates the coefficients of the generation model M2 so as to reduce the error between the playback control data C that a provisional generation model M2 outputs in response to input of the control data X of each piece of teacher data L2, and the playback control data C of that teacher data L2. The coefficients set by this deep learning are stored in the storage device 12, and the stored coefficients are applied to the generation model M2. As understood from the above description, the generation model M2 after machine learning outputs statistically appropriate playback control data C for unknown control data X, under the tendency latent between the control data X and the playback control data C in the plurality of pieces of teacher data L2.
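At inference time in the second embodiment, the trained models stand in for the manager's manual instructions. The sketch below assumes hypothetical encoding and decoding helpers for turning the progress status data G and user data U into a feature vector and turning model outputs back into Q and C.

```python
def generate_automatically(progress_data_g, user_data_u, model_m1, model_m2):
    x = encode_control_data(progress_data_g, user_data_u)   # hypothetical feature encoding of control data X
    q = decode_instruction(model_m1(x))                     # statistically appropriate progress instruction data Q
    c = decode_playback_control(model_m2(x))                # statistically appropriate playback control data C
    return q, c
```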
As described above, in the second embodiment, the progress instruction data Q is generated by the generation model M1, and the playback control data C is generated by the generation model M2. Therefore, the configuration for generating the progress instruction data Q and the playback control data C in accordance with instructions from the administrator Pb can be omitted. For example, the configuration for playing back the management screen R (the playback control unit 113 and the playback device 14) may be omitted. Except for the generation of the progress instruction data Q and the playback control data C, the configuration and operation of the second embodiment are the same as those of the first embodiment. Therefore, the same effects as those of the first embodiment are also achieved in the second embodiment.
As described above, in the second embodiment, the progress instruction data Q is generated using the generation model M1 that has learned the relationship between the control data X (the progress status data G and the user data U) and the progress instruction data Q. Therefore, progress instruction data Q appropriate for the progress status data G and the user data U can be generated without requiring an administrator Pb to operate the control system 10.
In addition, in the second embodiment, the playback control data C is generated using the generation model M2 that has learned the relationship between the control data X (the progress status data G and the user data U) and the playback control data C. Therefore, playback control data C appropriate for the progress status data G and the user data U can be generated without requiring an administrator Pb to operate the control system 10.
The teacher data L1 and the teacher data L2 may be generated from actual instructions given by the administrator Pb. Specifically, as exemplified in the first embodiment, a plurality of pieces of teacher data L1 and a plurality of pieces of teacher data L2 are collected while the progress instruction data Q and the playback control data C are being generated in accordance with instructions from the administrator Pb. For example, the control device 11 of the control system 10 generates teacher data L1 that associates the control data X, which includes the progress status data G and the user data U received in step Sd1 of fig. 14, with the progress instruction data Q generated in step Sd3 in accordance with the instruction from the administrator Pb. Likewise, the control device 11 generates teacher data L2 that associates the control data X, which includes the progress status data G and the user data U received in step Sd1, with the playback control data C generated in step Sd4 in accordance with the instruction from the administrator Pb.
When the number of pieces of teacher data L1 necessary for machine learning has been collected, the learning processing unit 51 builds the generation model M1 by the machine learning of fig. 16 using the teacher data L1. Similarly, when the number of pieces of teacher data L2 necessary for machine learning has been collected, the learning processing unit 52 builds the generation model M2 by the machine learning of fig. 18 using the teacher data L2. The generation model M1 and the generation model M2 constructed in the above manner are used in the game system 1 of the second embodiment.
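The collection and training trigger described above can be sketched as follows. The class name, the threshold value, and the data shapes are hypothetical; the sketch only shows the idea of logging one (X, Q) pair and one (X, C) pair per cycle of the administrator's operation and starting machine learning once enough pairs exist.

```python
# Minimal sketch (hypothetical names and threshold): collecting teacher data
# while the administrator operates, then flagging that the generation models
# can be built once enough pairs have been accumulated.
from dataclasses import dataclass, field

@dataclass
class TeacherDataStore:
    l1_pairs: list = field(default_factory=list)   # (control data X, progress instruction data Q)
    l2_pairs: list = field(default_factory=list)   # (control data X, playback control data C)
    threshold: int = 1000                           # assumed number needed for machine learning

    def record_step(self, control_x, instruction_q, playback_c):
        """Called once per Sd1..Sd4 cycle to log the administrator's decisions."""
        self.l1_pairs.append((control_x, instruction_q))
        self.l2_pairs.append((control_x, playback_c))

    def ready_for_training(self) -> bool:
        return (len(self.l1_pairs) >= self.threshold
                and len(self.l2_pairs) >= self.threshold)

store = TeacherDataStore(threshold=3)
for step in range(3):
    store.record_step(control_x={"G": step, "U": "..."},
                      instruction_q="end_reception",
                      playback_c={"motion": "wave", "speech": "good luck"})
print(store.ready_for_training())   # True once the assumed threshold is met
```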
Further, at an arbitrary point in time after the generation model M1 is constructed, the generation model M1 may be updated by additional machine learning using newly collected teacher data L1. Similarly, at an arbitrary point in time after the generation model M2 is constructed, the generation model M2 may be updated by additional machine learning using newly collected teacher data L2.
[ third embodiment ]
The game device 20 of the third embodiment operates in either a first operation mode or a second operation mode. The first operation mode is an operation mode (remote control mode) in which the game is controlled in accordance with instructions from the control system 10. The second operation mode is an operation mode (automatic control mode) in which the game is controlled without instructions from the control system 10. A manager such as a clerk of the game facility (hereinafter referred to as a "facility manager") selects one of the first operation mode and the second operation mode by operating the operation device 26 of the game device 20.
Fig. 19 is a flowchart illustrating the operation of the control device 23 of the game device 20 according to the third embodiment. As shown in fig. 19, the control device 23 determines which of the first operation mode and the second operation mode is selected (Se1). When the first operation mode is selected (Se1: YES), the control device 23 operates in accordance with the playback control data C and the progress instruction data Q transmitted from the control system 10, in the same manner as in the first embodiment (Se2). Specifically, the control device 23 generates and transmits the progress status data G and the user data U (Sc2 to Sc4), and receives the progress instruction data Q and the playback control data C from the control system 10 (Sc5). Further, the control device 23 executes control of the game based on the progress instruction data Q (Sc6) and playback of the virtual character V based on the playback control data C (Sc7). Therefore, the same effects as in the first embodiment are achieved.
On the other hand, when the second operation mode is selected (Se1: NO), the control device 23 operates without instructions from the control system 10 (Se3). Specifically, the control device 23 does not generate and transmit the progress status data G and the user data U (Sc2 to Sc4). The control device 23 advances the game through the processing of fig. 4 without relying on the progress instruction data Q, and causes the playback device 27 to play back the virtual character V by a predetermined algorithm without relying on the playback control data C. As understood from the above description, according to the third embodiment, in addition to controlling the game in accordance with the progress instruction data Q received from the control system 10 in the first operation mode, the game can be controlled without relying on the progress instruction data Q in the second operation mode.
In the above description, the first operation mode in which the game is advanced in accordance with instructions from the control system 10 and the second operation mode in which the game is advanced automatically have been exemplified. A configuration is also suitable in which, in addition to the selectable first and second operation modes, a third operation mode (manual control mode) in which the game is advanced in accordance with instructions from the facility manager is added. The facility manager checks the actual game status and the statuses of the plurality of users Pa as needed, and instructs the progress of the game by operating the operation device 26. In the third operation mode, the control device 23 inputs the lottery body B into the lottery area 212 triggered by an instruction from the facility manager, for example, and ends the reception period in response to an instruction from the facility manager. A configuration in which either the first operation mode or the third operation mode can be selected (a configuration in which the second operation mode is omitted) is also conceivable. Although the above description is based on the first embodiment, the third embodiment may also be applied to the second embodiment.
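The mode selection of the third embodiment can be summarized by the following control-flow sketch. The enumeration and function names are hypothetical, and the returned strings merely stand in for the processing of steps Se2 and Se3 and for the manual-mode handling described above.

```python
# Minimal control-flow sketch (hypothetical names): dispatching between the
# remote control mode, the automatic control mode, and the optional manual
# control mode selected by the facility manager.
from enum import Enum, auto

class OperationMode(Enum):
    REMOTE = auto()      # first operation mode: follow the control system 10
    AUTOMATIC = auto()   # second operation mode: predetermined algorithm
    MANUAL = auto()      # third operation mode: facility manager's instructions

def run_cycle(mode: OperationMode) -> str:
    if mode is OperationMode.REMOTE:
        # Se2: send G and U, receive Q and C, control the game and the character.
        return "advance game per progress instruction data Q"
    if mode is OperationMode.MANUAL:
        # Third mode: e.g. input the lottery body B when the manager triggers it.
        return "advance game per facility manager's operation"
    # Se3: no data exchange with the control system; use the built-in algorithm.
    return "advance game by predetermined algorithm"

print(run_cycle(OperationMode.REMOTE))
```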
[ fourth embodiment ]
The progress instruction data Q according to the fourth embodiment indicates an instruction directed to a specific user Pa among the plurality of users Pa. The administrator Pb checks the status of each user Pa and the progress status of the game on the management screen R and, by operating the operation device 15, designates any one station Sn of the N stations S1 to SN and an instruction for the user Pa at that station Sn. The progress instruction data generation unit 111A generates the progress instruction data Q in accordance with the instruction from the administrator Pb. The progress instruction data Q is transmitted to the game device 20 together with the playback control data C, as in the first embodiment.
The playback control unit 234 causes the playback device 27 to play back the virtual character V based on the playback control data C, and also causes the playback device 27 to play back the instruction indicated by the progress instruction data Q. Specifically, the playback control unit 234 moves the playback device 27 to the position and angle corresponding to the station Sn designated by the progress instruction data Q, and then causes the playback device 27 to play back the instruction indicated by the progress instruction data Q. For example, the playback control unit 234 causes the display device 271 to display an image showing the instruction, and causes the sound playback device 272 to emit a sound conveying the instruction. That is, the playback control unit 234 causes the playback device 27 to play back the instruction indicated by the progress instruction data Q toward the specific user Pa among the plurality of users Pa.
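As a rough illustration of the fourth embodiment, the following sketch shows the playback device being moved to the designated station and then presenting the instruction. The class and method names are hypothetical stand-ins for the playback device 27, the display device 271, and the sound playback device 272.

```python
# Minimal sketch (hypothetical playback API): moving the playback device to
# the station Sn designated by the progress instruction data Q and playing
# back the instruction toward that specific user Pa.
from dataclasses import dataclass

@dataclass
class ProgressInstruction:
    station_index: int   # which station Sn the instruction is addressed to
    message: str         # instruction content to be shown and spoken

class PlaybackDeviceStub:
    """Stand-in for the movable playback device 27."""
    def move_to_station(self, index: int) -> None:
        print(f"moving to position/angle for station S{index}")

    def show_and_speak(self, text: str) -> None:
        print(f"display: {text}")   # image on the display device 271
        print(f"voice:   {text}")   # sound from the sound playback device 272

def notify_specific_user(device: PlaybackDeviceStub, q: ProgressInstruction) -> None:
    device.move_to_station(q.station_index)
    device.show_and_speak(q.message)

notify_specific_user(PlaybackDeviceStub(),
                     ProgressInstruction(station_index=3, message="It's your turn!"))
```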
The same effects as those of the first embodiment are also achieved in the fourth embodiment. In addition, according to the fourth embodiment, the instruction indicated by the progress instruction data Q can be notified to a specific user Pa. In particular, in the fourth embodiment, there is an advantage that a clear instruction can be given to a specific user Pa among the plurality of users Pa by moving the playback device 27. There is also an advantage that the other users Pa can easily grasp which specific user Pa has been notified of the instruction, and that viewers watching the game played by the plurality of users Pa can easily grasp the user Pa to whom the instruction is addressed. The second embodiment or the third embodiment may also be applied to the fourth embodiment.
[ modified examples ]
The aspects exemplified above can be modified in various ways. Specific modifications applicable to the above-described embodiments are exemplified below. Two or more aspects arbitrarily selected from the following examples may be combined as long as they do not contradict each other.
(1) In each of the above-described embodiments, the progress instruction data Q indicates the input of the lottery body B and the end of the reception period, but the content of the instruction indicated by the progress instruction data Q is not limited to these examples. For example, any one of a plurality of lottery elements may be indicated by the progress instruction data Q. For example, the administrator Pb checks the management screen R and selects a lottery element estimated to be desired by the user Pa from among the plurality of lottery elements. The game control unit 231 of the game device 20 treats the lottery element indicated by the progress instruction data Q as having been selected by the lottery (even though the lottery body B has not actually entered the corresponding lottery hole H) and advances the game. Alternatively, in a configuration including a conveying mechanism that conveys the lottery body B to an arbitrary lottery hole H, the game control unit 231 controls the conveying mechanism so that the lottery body B enters the lottery hole H corresponding to the lottery element indicated by the progress instruction data Q. With the above configuration, the administrator Pb can preferentially let the intended user Pa win.
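A minimal sketch of the override described in modification (1) is shown below, assuming the lottery result is handled as a simple integer index. When the progress instruction data Q designates a lottery element, that element is used regardless of the physical outcome; the function name and data shapes are hypothetical.

```python
# Minimal sketch (assumed data shapes): treating the lottery element indicated
# by the progress instruction data Q as the lottery result, regardless of
# which lottery hole H the lottery body B actually entered.
from typing import Optional

def resolve_lottery(indicated_element: Optional[int], physical_result: int) -> int:
    """Return the lottery element used to advance the game.

    indicated_element: element forced by the progress instruction data Q,
                       or None when no override was instructed.
    physical_result:   element whose lottery hole H the body B actually entered.
    """
    if indicated_element is not None:
        return indicated_element   # the indicated element is treated as selected
    return physical_result

print(resolve_lottery(indicated_element=7, physical_result=2))    # -> 7
print(resolve_lottery(indicated_element=None, physical_result=2)) # -> 2
```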
(2) In each of the above-described embodiments, the motion data C1 indicating a motion selected by the administrator Pb is generated, but the method of generating the motion data C1 of the playback control data C is not limited to the above example. For example, the motion of the administrator Pb may be analyzed from video of the administrator Pb captured by the imaging device 321, and the playback control data generation unit 112 (112A, 112B) may generate motion data C1 representing that motion. With the above configuration, the virtual character V can be made to perform the same motion as the administrator Pb.
(3) In each of the above-described embodiments, the sound data C2 indicating a sound selected by the administrator Pb is generated, but the method of generating the sound data C2 of the playback control data C is not limited to the above example. For example, the playback control data generation unit 112 (112A, 112B) may generate sound data C2 representing the voice of the administrator Pb collected by a sound pickup device. With the above configuration, the voice uttered by the administrator Pb can be emitted from the sound playback device 272 as the voice of the virtual character V. The playback control data generation unit 112 (112A, 112B) may also generate sound data C2 representing a sound obtained by converting the timbre of the voice uttered by the administrator Pb. Further, the playback control data generation unit 112A may generate sound data C2 indicating a character string designated by the administrator Pb. In that case, the playback control unit 234 generates a sound signal by voice synthesis from the character string represented by the sound data C2, and supplies the sound signal to the sound playback device 272 for playback. With the above configuration, the sound corresponding to the character string designated by the administrator Pb can be emitted from the sound playback device 272 as the voice of the virtual character V.
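The branching described in modification (3) can be sketched as follows, assuming the sound data C2 arrives either as a waveform or as a character string. The synthesize_speech placeholder stands in for an actual voice synthesis engine and is not part of the embodiments.

```python
# Minimal sketch (hypothetical helper names): the playback control unit 234
# branching on the form of the sound data C2, either a recorded (possibly
# timbre-converted) voice or a character string to be synthesized into speech.
from typing import Union

def synthesize_speech(text: str) -> bytes:
    # Placeholder "synthesis" that merely encodes the text; a real
    # implementation would call a text-to-speech engine.
    return text.encode("utf-8")

def render_character_voice(sound_c2: Union[bytes, str]) -> bytes:
    """Return a sound signal to supply to the sound playback device 272."""
    if isinstance(sound_c2, bytes):
        # C2 already carries a waveform (e.g. the administrator's voice,
        # possibly timbre-converted on the control system side).
        return sound_c2
    # C2 carries a character string: synthesize speech for it.
    return synthesize_speech(sound_c2)

print(render_character_voice("Congratulations!"))
```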
(4) The playback control data C may be animation data representing the image and sound of the virtual character V. In that case, the playback control unit 234 causes the playback device 27 to play back the animation indicated by the playback control data C (for example, by streaming playback). With the above configuration, there is an advantage that image processing for generating the video of the virtual character V from the motion data C1 is unnecessary.
(5) In the second embodiment, a single piece of progress instruction data Q is generated by the generation model M1, but the method of generating the progress instruction data Q using the generation model M1 is not limited to the above example. For example, a plurality of pieces of progress instruction data Q may be generated by the generation model M1, and the one selected by the administrator Pb via the operation device 15 may be transmitted to the game device 20. That is, the progress instruction data generation unit 111 (111A, 111B) generates the progress instruction data Q through the processing by the generation model M1 and the selection by the administrator Pb.
Similarly, in the second embodiment, a single piece of playback control data C is generated by the generation model M2, but the method of generating the playback control data C using the generation model M2 is not limited to the above example. For example, a plurality of pieces of playback control data C may be generated by the generation model M2, and the one piece of playback control data C selected by the administrator Pb via the operation device 15 may be transmitted to the game device 20. That is, the playback control data generation unit 112 generates one piece of playback control data C through the processing by the generation model M2 and the selection by the administrator Pb.
(6) In the game device 20 of each of the above-described embodiments, both the display of the image of the virtual character V by the display device 271 and the emission of its sound by the sound playback device 272 are performed, but either the display of the image or the emission of the sound may be omitted. That is, one of the display device 271 and the sound playback device 272 may be omitted from the playback device 27. Likewise, in the control system 10 of each of the above-described embodiments, both the display of the management screen R by the display device 141 and the emission of the sound of the user Pa by the sound playback device 142 are executed, but one of the display of the management screen R and the emission of the sound may be omitted. That is, one of the display device 141 and the sound playback device 142 may be omitted from the playback device 14.
(7) In each of the above-described embodiments, the playback device 27 is moved to a position and an angle corresponding to any one of the N stations S1 to SN, but the method of notifying each user Pa of information using the playback device 27 is not limited to the above example.
As shown in fig. 20, a configuration is conceivable in which one station S6 of the six stations S1 to S6 illustrated in fig. 2 is omitted to secure a space O. Elements other than the stations Sn, such as a conveying mechanism for conveying the lottery body B, are installed in the space O.
As shown in fig. 20, the playback device 27 is controlled to a position close to the space O (between station S1 and station S5) and to an angle at which the display surface F faces away from the space O. In the state of fig. 20, the users Pa using the five stations S1 to S5 can all visually recognize the image displayed on the playback device 27. Therefore, by playing back the virtual character V on the playback device 27, an atmosphere is created as if the virtual character V were conversing with the plurality of users Pa as a whole. The playback device 27 in the state of fig. 20 can also notify all of the plurality of users Pa of an instruction by playing back the instruction indicated by the progress instruction data Q (fourth embodiment). Further, by adjusting its angle toward a specific user Pa, the playback device 27 located near the space O can promptly notify that user Pa of an instruction individually, without being moved to the vicinity of that user Pa.
As in the above example, the playback device 27 may be movable between a position corresponding to the space O and positions corresponding to the individual stations Sn. As described above, in a state where the playback device 27 is located at the position corresponding to the space O, information is notified to the plurality of users Pa as a whole. In contrast, in a state where the playback device 27 is located at a position corresponding to one station Sn, there is an advantage that the atmosphere of the virtual character V conversing with the user Pa of that station Sn can be emphasized.
(8) In each of the above-described embodiments, the progress status data G and the user data U are transmitted from the game device 20 to the control system 10, but the generation and transmission of the progress status data G may be omitted. That is, the progress status data generation unit 232 may be omitted. In that case, the playback control unit 113 of the control system 10 causes the management screen R including the first region R1 and the second region R2 to be played back in accordance with the user data U. The progress instruction data generation unit 111A generates the progress instruction data Q in accordance with an instruction from the administrator Pb, as in the first embodiment. The progress instruction data generation unit 111B of the second embodiment generates the progress instruction data Q from control data X including the user data U, using the generation model M1. In this case, the progress status data G is omitted from the control data X in the second embodiment.
In addition, the generation of the user data U and its transmission to the control system 10 may be omitted. That is, the user data generation unit 233 may be omitted. In that case, the playback control unit 113 of the control system 10 causes the management screen R including the third region R3 to be played back based on the progress status data G. The progress instruction data generation unit 111A generates the progress instruction data Q in accordance with an instruction from the administrator Pb, as in the first embodiment. The playback control data generation unit 112B of the second embodiment generates the playback control data C from control data X including the progress status data G, using the generation model M2. In this case, the user data U is omitted from the control data X in the second embodiment.
(9) In each of the above-described embodiments, the control system 10 generates both the progress instruction data Q and the playback control data C, but the generation of the progress instruction data Q may be omitted. That is, the progress instruction data generation unit 111 (111A, 111B) may be omitted. In this configuration, the control of the game based on the progress instruction data Q (Sc6) is omitted. Conversely, the generation of the playback control data C may be omitted. That is, the playback control data generation unit 112 (112A, 112B) may be omitted. In this configuration, the playback of the virtual character V based on the playback control data C (Sc7) is omitted.
(10) In each of the above-described embodiments, the virtual character V is played back by the playback device 27 shared by the plurality of users Pa, but the virtual character V may instead be played back by the playback device 31 installed at each station Sn. A common virtual character V may be played back by the playback devices 31 of the N stations S1 to SN, or a separate virtual character V may be played back by the playback device 31 of each station Sn. Alternatively, the virtual character V may be played back only by the playback device 31 of a specific station Sn among the N stations S1 to SN.
In each of the above-described embodiments, the virtual character V is played back by the display device 271 and the sound playback device 272 of the playback device 27, but the image of the virtual character V may be displayed on the display device 271 of the playback device 27 while the sound of the virtual character V is emitted from the sound playback device 312 of each station Sn. For example, the sound of the virtual character V is emitted from the sound playback device 312 of the station Sn indicated by the indication data C3.
(11) In each of the above-described embodiments, the progress status data G and the user data U are transmitted from the game device 20 to the control system 10 in step Sc4 of fig. 14, but the progress status data G and the user data U may be transmitted to the control system 10 at separate points in time. Likewise, in each of the above-described embodiments, the progress instruction data Q and the playback control data C are transmitted from the control system 10 to the game device 20 in step Sd5 of fig. 14, but the progress instruction data Q and the playback control data C may be transmitted to the game device 20 at separate points in time.
(12) In the second embodiment, the control data X includes the progress status data G and the user data U, but the content of the control data X is not limited to the above example. For example, the control data X may include data generated by image processing on the video data U1 of the user data U (for example, feature values of the image represented by the video data U1). The control data X may also include data generated by sound processing on the sound data U2 of the user data U (for example, feature values of the sound represented by the sound data U2, or a character string representing the content of the utterance). As understood from the above examples, the control data X is comprehensively expressed as data based on at least one of the progress status data G and the user data U.
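A minimal sketch of assembling such control data X is shown below. The particular features (frame brightness, sound energy) and the dictionary keys are assumptions chosen only to illustrate combining the progress status data G with features derived from the user data U.

```python
# Minimal sketch (assumed feature extractors): assembling the control data X
# from the progress status data G and features derived from the user data U.
import numpy as np

def frame_brightness(video_frame: np.ndarray) -> float:
    """Toy image feature: mean pixel value of one captured frame (from U1)."""
    return float(video_frame.mean())

def sound_energy(samples: np.ndarray) -> float:
    """Toy sound feature: mean squared amplitude of the collected sound (from U2)."""
    return float(np.mean(samples ** 2))

def build_control_data(progress_g: dict, video_frame: np.ndarray,
                       sound_samples: np.ndarray) -> np.ndarray:
    """Concatenate G and U-derived features into one control-data vector."""
    return np.array([
        progress_g.get("stage", 0),
        progress_g.get("media_used", 0),
        frame_brightness(video_frame),
        sound_energy(sound_samples),
    ], dtype=float)

x = build_control_data({"stage": 2, "media_used": 15},
                       video_frame=np.zeros((8, 8)),
                       sound_samples=np.zeros(16))
print(x)
```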
(13) In each of the above-described embodiments, the game device 20 is controlled by one control system 10, but the game device 20 may be controlled by a plurality of control systems 10. For example, in a configuration in which the playback device 27 plays back a plurality of virtual characters V, each virtual character V is controlled by the playback control data C transmitted from the corresponding one of the plurality of control systems 10. That is, each virtual character V is controlled in accordance with instructions from the administrator Pb operating the corresponding control system 10. Alternatively, one virtual character V may be controlled by playback control data C transmitted from a plurality of control systems 10. The control system 10 that generates the progress instruction data Q and the control system 10 that generates the playback control data C may also be provided separately. In a configuration using a plurality of control systems 10, the game device 20 transmits the progress status data G and the user data U to the plurality of control systems 10 (Sc4).
(14) The functions of the control system 10 according to each of the above-described embodiments are realized by cooperation between the one or more processors constituting the control device 11 and the program stored in the storage device 12. Similarly, the functions of the game device 20 according to each of the above-described embodiments are realized by cooperation between the one or more processors constituting the control device 23 and the program stored in the storage device 24. The programs according to the above-described embodiments may be provided in a form stored in a computer-readable recording medium and installed in a computer. The recording medium is, for example, a non-transitory recording medium; an optical recording medium (optical disc) such as a CD-ROM is a preferred example, but any known recording medium such as a semiconductor recording medium or a magnetic recording medium is also included. The non-transitory recording medium includes any recording medium other than a transitory propagating signal, and does not exclude volatile recording media. In a configuration in which a transmission device transmits the program via the communication network 2, the storage device that stores the program in the transmission device corresponds to the aforementioned non-transitory recording medium.
[ accompanying notes ]
From the above description, preferred aspects of the present invention are understood, for example, as follows. In the following description, reference numerals used in the drawings are appended in parentheses for the convenience of understanding each aspect, but the present invention is not limited to the embodiments shown in the drawings.
[ additional notes A ]
In the technique of patent document 1, a drop mechanism mounted on the lottery device automatically drops a lottery ball at a time point corresponding to the progress of the game. On the other hand, at an event held at a game facility, for example, a host such as a clerk of the game facility can advance the game at an appropriate timing depending on the progress status of the game, and thereby liven up the atmosphere for many participants. However, realizing such an application raises the problem that a host must be secured for each game facility. Although a game including a physical lottery has been taken as an example in the above description, the same problem is assumed for any kind of game. In view of the above circumstances, the following aspects (note A) are exemplified for the purpose of appropriately advancing a game in accordance with the situation of the user without requiring an environment in which a host is stationed at the game device.
[ additional notes A1]
A game device (20) according to a preferred aspect of the present invention (note A1) includes: a game control unit (231) that controls a game in which a prize in an amount corresponding to the usage amount of game media by a user (Pa) is awarded to the user (Pa) according to the result of the game; a progress status data generation unit (232) that generates progress status data (G) indicating the progress status of the game; a transmission control unit (235) that transmits the progress status data (G) to a control system (10) via a communication device (25) over a communication network (2); and a reception control unit (236) that receives progress instruction data (Q) instructing the progress of the game from the control system (10) via the communication device (25) over the communication network (2). The game control unit (231) controls the game based on the progress instruction data (Q) received by the reception control unit (236). In the above aspect, the progress status data (G) is transmitted to the control system (10) by the communication device (25), the progress instruction data (Q) generated by the control system (10) based on the progress status data (G) is received by the communication device (25), and the game is controlled based on the progress instruction data (Q). Therefore, even if an administrator who manually advances the game is not near the game device (20), the game can be advanced under the control of the control system (10).
A typical example of the "game" is a lottery game in which a prize in an amount corresponding to the usage amount of game media is awarded to the user (Pa) as a result of winning a lottery process (a physical lottery or an electronic lottery). However, the "game" in the present invention is not limited to the lottery game exemplified above. For example, a battle game (e.g., a card game) in which a plurality of users (Pa) compete against one another is also a suitable example of the "game". In a battle game, the game progresses differently for each user.
The "usage amount of game media" is, for example, the number of game media that the user (Pa) instructs to use. For example, in a lottery game in which a user (Pa) who wins the lottery is awarded a prize equal to the number of game media indicated by the user (Pa) multiplied by a predetermined multiplier, the number of game media indicated by the user (Pa) corresponds to the "usage amount".
The "progress status data (G)" is data indicating the progress status of the game. For example, in a game composed of a plurality of stages (e.g., acceptance of game media, lottery processing, awarding of prizes), data indicating the current stage, the status of prizes obtained by the user (Pa), the status of game media used by the user (Pa), or the result of the lottery processing is exemplified as the progress status data (G).
"progress instruction data (Q)" is data indicating the progress of the game. For example, data indicating the following operation is exemplified as "instruction data (Q)": the game device is provided with a drawing body (B) for physically drawing a picture, an end of a reception period for receiving an instruction from a user (Pa) regarding the usage amount of a game medium, and a selection of any one of a plurality of drawing elements.
[ additional notes A2]
A game device (20) according to a preferred example of note A1 (note A2) includes a user data generation unit (233) that generates user data (U) indicating the situation of the user (Pa), and the transmission control unit (235) transmits the progress status data (G) and the user data (U) to the control system (10) via the communication device (25). According to the above aspect, since the user data (U) indicating the situation of the user (Pa) is transmitted to the control system (10) via the communication device (25) in addition to the progress status data (G), the control system (10) can use the progress status data (G) and the user data (U) to generate progress instruction data (Q) that reflects both the progress status of the game and the situation of the user (Pa).
The "user data (U)" is arbitrary data indicating the situation of the user (Pa). For example, video data (U1) representing an image of the user (Pa) captured by an imaging device, or sound data (U2) representing a sound uttered by the user (Pa), is suitable as the user data (U).
The temporal relationship between the transmission of the progress status data (G) and the transmission of the user data (U) is arbitrary. For example, the progress status data (G) and the user data (U) may be transmitted to the control system (10) simultaneously, or may be transmitted to the control system (10) at mutually independent points in time.
[ additional notes A3]
In a preferred example of note A2 (note A3), the user data (U) includes at least one of video data (U1) representing an image of the user (Pa) captured by the imaging device (321) and sound data (U2) representing the sound of the user (Pa) collected by the sound pickup device (322). According to the above aspect, the control system (10) can generate appropriate progress instruction data (Q) according to the movement or expression of the user (Pa) captured by the imaging device (321), or the content or intonation of the voice of the user (Pa) collected by the sound pickup device (322).
[ additional notes A4]
In a preferred example of any one of notes A1 to A3 (note A4), the game is a lottery game in which the result of the game is determined by a physical lottery in which any one of a plurality of lottery elements is selected by a lottery body (B), and the progress instruction data (Q) indicates at least one of the input of the lottery body (B), the end of a reception period during which an instruction from the user (Pa) regarding the usage amount of game media is accepted, and the selection of any one of the plurality of lottery elements.
The "selection of any one of the plurality of lottery elements" means an operation of forcibly selecting a specific lottery element among the plurality of lottery elements. For example, assuming a physical lottery in which the lottery body (B) enters any one of a plurality of lottery holes (H) corresponding to different lottery elements, the progress instruction data (Q) may instruct an operation of mechanically blocking each lottery hole (H) other than the lottery hole (H) corresponding to the target lottery element, so that the lottery body (B) is forced into the lottery hole (H) corresponding to the target lottery element. Alternatively, an operation may be instructed in which the target lottery element indicated by the progress instruction data (Q) is treated as having been selected without the physical lottery actually being performed.
[ additional notes A5]
In a preferred example of any one of notes A1 to A4 (note A5), the game control unit (231) controls the game in accordance with the progress instruction data (Q) received by the reception control unit (236) in a first operation mode, and controls the game without relying on the progress instruction data (Q) in a second operation mode. According to the above aspect, in addition to controlling the game based on the progress instruction data (Q) received from the control system (10), the game can also be controlled without using the progress instruction data (Q).
[ additional notes A6]
A game device (20) according to a preferred example of any one of notes A1 to A5 (note A6) includes a playback control unit (234) that causes the playback devices (27, 31) to play back the instruction indicated by the progress instruction data (Q) toward a specific user (Pa) among a plurality of users (Pa) including the user. According to the above aspect, the instruction indicated by the progress instruction data (Q) can be notified to a specific user (Pa) among the plurality of users (Pa).
[ additional notes A7]
In a preferred example of note A6 (note A7), the playback device (27) is movable to a position corresponding to a specific user (Pa) among the plurality of users (Pa), and the playback control unit (234) causes the playback device (27) to play back the instruction indicated by the progress instruction data (Q) in a state where the playback device (27) has been moved to the position corresponding to the specific user (Pa). According to the above aspect, by moving the playback device (27), the instruction indicated by the progress instruction data (Q) can be notified to a specific user (Pa) among the plurality of users (Pa) including the user (Pa). In addition, since the playback device (27) is moved to the position corresponding to the specific user (Pa), there is an advantage that the other users (Pa) can easily grasp which specific user (Pa) is the target of playback by the playback device (27). There is also an advantage that viewers of the game played on the game system (1) can easily grasp the specific user (Pa) who is the target of the playback.
[ additional notes A8]
A game system (1) according to a preferred aspect of the present invention (note A8) is a game system including a game device (20) and a control system (10) capable of communicating with each other via a communication network (2), wherein the game device (20) includes: a game control unit (231) that controls a game in which a prize in an amount corresponding to the usage amount of game media by a user (Pa) is awarded to the user (Pa) according to the result of the game; a progress status data generation unit (232) that generates progress status data (G) indicating the progress status of the game; a first transmission control unit (235) that transmits the progress status data (G) to the control system (10) via a first communication device (25) over the communication network (2); and a first reception control unit (236) that receives progress instruction data (Q) instructing the progress of the game from the control system (10) via the first communication device (25) over the communication network (2), and the control system (10) includes: a second reception control unit (115) that receives the progress status data (G) from the game device (20) via a second communication device (13) over the communication network (2); a progress instruction data generation unit (111) that generates the progress instruction data (Q); and a second transmission control unit (114) that transmits the progress instruction data (Q) generated by the progress instruction data generation unit (111) to the game device (20) via the second communication device (13) over the communication network (2). The game control unit (231) controls the game based on the progress instruction data (Q) received by the first reception control unit (236).
The specific content of the processing by which the progress instruction data generation unit (111) generates the progress instruction data (Q) is arbitrary. For example, the progress instruction data generation unit (111) generates the progress instruction data (Q) in accordance with an instruction from an administrator (Pb) who can refer to the progress status of the game indicated by the progress status data (G). Alternatively, the progress instruction data generation unit (111) generates the progress instruction data (Q) using a generation model (M1) that has learned the relationship between the progress status data (G) and the progress instruction data (Q).
[ additional notes A9]
In a preferred example of note A8 (note A9), the progress instruction data generation unit (111) generates the progress instruction data (Q) in accordance with an instruction from an administrator (Pb) who can refer to the progress status of the game indicated by the progress status data (G). According to the above configuration, diverse progress instruction data (Q) corresponding to the progress status can be generated in accordance with instructions from the administrator (Pb) who refers to the progress status of the game indicated by the progress status data (G).
"Generating the progress instruction data (Q) in accordance with an instruction from the administrator (Pb)" includes not only an operation of generating the progress instruction data (Q) solely in accordance with the instruction from the administrator (Pb), but also an operation in which the administrator (Pb) selects any one of a plurality of candidates generated by arithmetic processing such as a machine-learned generation model.
[ additional notes A10]
In a preferred example of note A8 (note A10), the progress instruction data generation unit (111) generates the progress instruction data (Q) corresponding to the progress status data (G) received by the second reception control unit (115), using a generation model (M1) that has learned, by machine learning, the relationship between control data (X) based on the progress status data (G) and the progress instruction data (Q). According to the above configuration, since the progress instruction data (Q) is generated by the machine-learned generation model (M1), appropriate progress instruction data (Q) for the progress status data (G) can be generated without requiring an administrator (Pb) to give instructions to the control system (10).
[ additional notes A11]
An operation method of a game device (20) according to a preferred aspect of the present invention (note A11) is as follows: a game device (20) that controls a game in which a prize in an amount corresponding to the usage amount of game media by a user (Pa) is awarded to the user (Pa) according to the result of the game generates progress status data (G) indicating the progress status of the game (Sc2); transmits the progress status data (G) to a control system (10) via a communication device (25) over a communication network (2) (Sc4); receives progress instruction data (Q) instructing the progress of the game from the control system (10) via the communication device (25) over the communication network (2) (Sc5); and controls the game based on the progress instruction data (Q) (Sc1, Sc6).
[ additional notes A12]
A program according to a preferred aspect of the present invention (note A12) causes one or more processors (23) included in a game device (20) to execute: a game control process (Sc1, Sc6) for controlling a game in which a prize in an amount corresponding to the usage amount of game media by a user (Pa) is awarded to the user (Pa) according to the result of the game; a progress status data generation process (Sc2) for generating progress status data (G) indicating the progress status of the game; a transmission process (Sc4) for transmitting the progress status data (G) to a control system (10) via a communication device (25) over a communication network (2); and a reception process (Sc5) for receiving progress instruction data (Q) instructing the progress of the game from the control system (10) via the communication device (25) over the communication network (2). In the game control process, the game is controlled based on the progress instruction data (Q) (Sc1, Sc6).
[ additional note B ]
It is difficult for a user who has little experience in playing various games such as a physical lottery to accurately understand the procedure to be followed in accordance with the user's own situation. Although a game including a physical lottery has been taken as an example in the above description, the same problem is assumed for any kind of game. In view of the above circumstances, the following aspects (note B) are exemplified for the purpose of effectively supporting the play of a game by a user.
[ additional notes B1]
A game device (20) according to a preferred aspect of the present invention (note B1) includes: a game control unit (231) that controls a game; a user data generation unit (233) that generates user data (U) indicating the situation of a user (Pa) who plays the game; a transmission control unit (235) that transmits the user data (U) to a control system (10) via a communication device (25) over a communication network (2); a reception control unit (236) that receives, from the control system (10) via the communication device (25) over the communication network (2), playback control data (C) for causing the playback devices (27, 31) to play back a virtual character (V); and a playback control unit (234) that causes the playback devices (27, 31) to play back the virtual character (V) in accordance with the playback control data (C) in parallel with the control of the game by the game control unit (231). According to the above configuration, a virtual character (V) corresponding to the situation of the user (Pa) can be played back by the playback devices (27, 31), so the play of the game by the user (Pa) can be effectively supported. In addition, since the game device (20) does not need to be equipped with a function for generating the playback control data (C) from the user data (U), there is an advantage that the configuration and processing of the game device (20) are simplified.
The kind of the "game" is arbitrary. For example, a lottery game in which a prize in an amount corresponding to the usage amount of game media is awarded to the user (Pa) as a result of winning a lottery process (a physical lottery or an electronic lottery), a battle game in which a plurality of users (Pa) compete against one another, and a medal game played individually by a user (Pa) are all examples, and any kind of game is included in the concept of the "game" of the present invention.
The "virtual character (V)" is a virtual character that can be perceived by the user (Pa) who plays the game. For example, a character that supports the progress of the game by the user (Pa) is an example of the "virtual character (V)".
"the virtual character (V) is played" by, for example, at least one of an operation of displaying the image of the virtual character (V) on the display devices (271, 311) and an operation of playing the sound of the virtual character (V) to the sound playing devices (272, 312).
The form of the "playback control data (C)" is arbitrary. Focusing on the display of the image of the virtual character (V), for example, motion data (C1) indicating the motion of the virtual character (V) or animation data (e.g., streaming data) representing the image of the virtual character (V) is included in the concept of the "playback control data (C)". When the reception control unit (236) receives the motion data (C1) as the playback control data (C), the playback control unit (234) generates animation data representing the image of the virtual character (V) moving in accordance with the motion data (C1), and causes the display devices (271, 311) to display it.
Focusing on the emission of the sound of the virtual character (V), for example, sound data (C2) indicating the content (i.e., a character string) of the speech of the virtual character (V), or sound data (C2) representing the sound itself of the virtual character (V), is included in the concept of the "playback control data (C)". When the reception control unit (236) receives sound data (C2) indicating a character string as the playback control data (C), the playback control unit (234) generates, for example by voice synthesis processing, sound data representing the voice corresponding to that character string, and causes the sound playback devices (272, 312) to play it back.
[ additional notes B2]
In a preferred example of note B1 (note B2), in the game, a prize in an amount corresponding to the usage amount of game media by the user (Pa) is awarded to the user (Pa) according to the result of the game.
A typical example of the "game" is a lottery game in which a prize in an amount corresponding to the usage amount of game media is awarded to the user (Pa) as a result of winning a lottery process (a physical lottery or an electronic lottery). However, the "game" in the present invention is not limited to the lottery game exemplified above.
[ additional note B3]
In a preferred example of note B2 (note B3), the user data (U) includes at least one of video data (U1) representing an image of the user (Pa) captured by the imaging device (321) and sound data (U2) representing the sound of the user (Pa) collected by the sound pickup device (322). According to the above aspect, playback control data (C) for playing back an appropriate virtual character (V) according to the movement or expression of the user (Pa) captured by the imaging device (321), or the content or intonation of the voice of the user (Pa) collected by the sound pickup device (322), can be generated in the control system (10).
[ additional notes B4]
In a preferred example of any one of notes B1 to B3 (note B4), the playback control unit (234) causes the playback devices (27, 31) to play back the virtual character (V) toward a specific user (Pa) among a plurality of users (Pa) including the user (Pa). According to the above aspect, since the virtual character (V) is played back toward a specific user (Pa) among the plurality of users (Pa), the play of the game by that user (Pa) can be effectively supported.
[ additional notes B5]
In a preferred example of note B4 (note B5), the playback device (27) is movable to a position corresponding to any one of the plurality of users (Pa), and the playback control unit (234) causes the virtual character (V) to be played back in a state where the playback device (27) has been moved to the position corresponding to the specific user (Pa). According to the above aspect, by moving the playback device (27), the virtual character (V) can be presented to a specific user (Pa) among the plurality of users (Pa). In addition, since the playback device (27) is moved to the position corresponding to the specific user (Pa), there is an advantage that the other users (Pa) can easily grasp which specific user (Pa) is the target of the playback of the virtual character (V). There is also an advantage that viewers of the game played on the game system (1) can easily grasp the specific user (Pa) who is the target of the playback of the virtual character (V).
[ additional notes B6]
A game system (1) according to a preferred aspect of the present invention (note B6) is a game system including a game device (20) and a control system (10) capable of communicating with each other via a communication network (2), wherein the game device (20) includes: a game control unit (231) that controls a game; a user data generation unit (233) that generates user data (U) indicating the situation of a user (Pa) who plays the game; a first transmission control unit (235) that transmits the user data (U) to the control system (10) via a first communication device (25) over the communication network (2); a first reception control unit (236) that receives, from the control system (10) via the first communication device (25) over the communication network (2), playback control data (C) for causing the playback devices (27, 31) to play back a virtual character (V); and a playback control unit (234) that causes the playback devices (27, 31) to play back the virtual character (V) in accordance with the playback control data (C) in parallel with the control of the game by the game control unit (231), and the control system (10) includes: a second reception control unit (115) that receives the user data (U) from the game device (20) via a second communication device (13) over the communication network (2); a playback control data generation unit (112) that generates the playback control data (C); and a second transmission control unit (114) that transmits the playback control data (C) generated by the playback control data generation unit (112) to the game device (20) via the second communication device (13) over the communication network (2).
[ additional notes B7]
In a preferred example of note B6 (note B7), the playback control data generation unit (112) generates the playback control data (C) in accordance with an instruction from an administrator (Pb) who can refer to the situation of the user (Pa) indicated by the user data (U). According to the above configuration, diverse playback control data (C) corresponding to the situation of the user (Pa) can be generated in accordance with instructions from the administrator (Pb) who refers to the situation of the user (Pa) indicated by the user data (U).
The administrator (Pb) may refer to the situation of the user (Pa) indicated by the user data (U) by any method. For example, when the user data (U) is video data (U1) representing an image of the user (Pa), the administrator (Pb) can visually refer to the image of the user (Pa) when the image represented by the video data (U1) is displayed on the display device (141). When the user data (U) is sound data (U2) representing the sound of the user (Pa), the administrator (Pb) can aurally refer to the sound of the user (Pa) when the sound represented by the sound data (U2) is emitted from the sound playback device (142).
The specific content of the processing for generating the playback control data (C) in accordance with the instruction from the administrator (Pb) is arbitrary. Focusing on the display of the image of the virtual character (V), for example, playback control data (C) that causes the virtual character (V) to perform a motion corresponding to the body movement of the administrator (Pb), or playback control data (C) indicating a motion pattern generated in accordance with an instruction from the administrator (Pb) (for example, a motion pattern selected by the administrator (Pb) from a plurality of candidates), is used. Focusing on the emission of the sound of the virtual character (V), for example, playback control data (C) representing the voice uttered by the administrator (Pb) or a sound obtained by processing that voice, or playback control data (C) indicating a character string designated by the administrator (Pb), is used.
[ additional notes B8]
In a preferred example of note B6 (note B8), the playback control data generation unit (112) generates the playback control data (C) corresponding to the user data (U) received by the second reception control unit (115), using a generation model (M2) that has learned, by machine learning, the relationship between control data (X) based on the user data (U) and the playback control data (C). According to the above configuration, since the playback control data (C) is generated by the machine-learned generation model (M2), appropriate playback control data (C) for the user data (U) can be generated without requiring an administrator (Pb) to give instructions to the control system (10). As understood from notes B7 and B8, the playback control data generation unit (112) is comprehensively expressed as an element that generates the playback control data (C) based on the user data (U).
[ additional notes B9]
An operation method of a game device (20) according to a preferred aspect of the present invention (note B9) is as follows: a game device (20) that controls a game generates user data (U) indicating the situation of a user (Pa) who plays the game; transmits the user data (U) to a control system (10) via a communication device (25) over a communication network (2); receives, from the control system (10) via the communication device (25) over the communication network (2), playback control data (C) for causing the playback devices (27, 31) to play back a virtual character (V); and causes the playback devices (27, 31) to play back the virtual character (V) in accordance with the playback control data (C) in parallel with the control of the game.
[ additional notes B10]
A program according to a preferred aspect of the present invention (note B10) causes one or more processors (23) included in a game device (20) to execute: a game control process (Sc1, Sc6) for controlling a game; a user data generation process (Sc3) for generating user data (U) indicating the situation of a user (Pa) who plays the game; a transmission process (Sc4) for transmitting the user data (U) to a control system (10) via a communication device (25) over a communication network (2); a reception process (Sc5) for receiving, from the control system (10) via the communication device (25) over the communication network (2), playback control data (C) for causing the playback devices (27, 31) to play back a virtual character (V); and a playback process (Sc7) for causing the playback devices (27, 31) to play back the virtual character (V) in accordance with the playback control data (C) in parallel with the game control process (Sc1, Sc6).
Description of reference numerals:
1: game system
2: communication network
10: control system
11: control device
111A, 111B: execution instruction data generating unit
112A, 112B: play control data generating unit
113: play control unit
114: transmission control unit
115: reception control unit
12: storage device
13: communication device
14: player device
141: display device
142: audio player
15: operating device
20: game device
21: lottery drawing mechanism
211: physical lottery drawing part
212: lottery area
213: throw-in mechanism
22: receiving mechanism
23: control device
231: game control unit
232: progress data generating unit
233: user data generating part
234: play control unit
235: transmission control unit
236: reception control unit
24: storage device
25: communication device
26: operating device
27: player device
271: display device
272: audio player
Sn (S1-SN): platform
31: player device
311: display device
312: playback apparatus
32: recording apparatus
321: image capturing apparatus
322: radio device
33: operating device
51: learning processing unit
52: a learning processing unit.

Claims (22)

1. A game device is characterized by comprising:
a game control unit that controls a game;
a user data generation unit that generates user data indicating a situation of a user who plays the game;
a transmission control unit that transmits the user data to a control system via a communication network by means of a communication device;
a reception control unit that receives, from the control system via the communication network by means of the communication device, playback control data for causing a playback device to play back a virtual character; and
a playback control unit that causes the playback device to play back the virtual character in accordance with the playback control data in parallel with the control of the game by the game control unit.
2. The game device according to claim 1, wherein the user data includes at least one of image data representing an image of the user captured by an image capturing device and sound data representing a sound of the user collected by a sound pickup device.
3. The game device according to claim 1 or 2, wherein the playback control unit causes the virtual character to be played back on the playback device for a specific user among a plurality of users including the user.
4. The game device according to claim 3, wherein the playback device is movable to a position corresponding to any one of the plurality of users, and
the playback control unit causes the virtual character to be played back in a state in which the playback device has been moved to the position corresponding to the specific user.
5. The game device according to claim 1, further comprising a progress status data generation unit that generates progress status data indicating a progress status of the game, wherein
the transmission control unit transmits the progress status data and the user data to the control system via the communication device,
the reception control unit receives, from the control system via the communication network by means of the communication device, progress instruction data that instructs progress of the game, and
the game control unit controls the game based on the progress instruction data received by the reception control unit.
6. The game device according to claim 5, wherein the game control unit controls the game in accordance with the progress instruction data received by the reception control unit in a first operation mode, and controls the game without using the progress instruction data in a second operation mode.
7. The game device according to claim 5 or 6, wherein the game device includes a playback control unit that causes an instruction indicated by the progress instruction data to be played back on a playback device for a specific user among a plurality of users including the user.
8. The game device according to claim 7, wherein the playback device is movable to a position corresponding to a specific user among the plurality of users, and
the playback control unit causes the instruction indicated by the progress instruction data to be played back on the playback device in a state in which the playback device has been moved to the position corresponding to the specific user.
9. A game system comprising a game device and a control system that can communicate with each other via a communication network, wherein the game device comprises:
a game control unit that controls a game;
a user data generation unit that generates user data indicating a situation of a user who plays the game;
a first transmission control unit that transmits the user data to the control system via the communication network by means of a first communication device;
a first reception control unit that receives, from the control system via the communication network by means of the first communication device, playback control data for causing a playback device to play back a virtual character; and
a playback control unit that causes the playback device to play back the virtual character in accordance with the playback control data in parallel with the control of the game by the game control unit, and
the control system is provided with:
a second reception control unit that receives the user data from the game device via the communication network by means of a second communication device;
a playback control data generation unit that generates the playback control data; and
a second transmission control unit that transmits the playback control data generated by the playback control data generation unit to the game device via the communication network by means of the second communication device.
10. The game system according to claim 9, wherein the playback control data generation unit generates the playback control data in accordance with an instruction from an administrator who can refer to the situation of the user indicated by the user data.
11. The game system according to claim 9, wherein the playback control data generation unit generates the playback control data in accordance with the user data received by the second reception control unit.
12. The game system according to claim 11, wherein the playback control data generation unit generates the playback control data corresponding to the user data received by the second reception control unit, using a generation pattern in which a relationship between control data based on the user data and the playback control data has been learned by machine learning.
13. The game system according to claim 11, wherein the game device includes a progress data generating unit that generates progress data indicating a progress status of the game, wherein
the first transmission control unit transmits the progress data and the user data to the control system via the first communication device,
the first reception control unit receives, from the control system via the communication network by means of the first communication device, progress instruction data that instructs progress of the game, and
the game control unit controls the game based on the progress instruction data received by the first reception control unit.
14. A control system characterized by comprising:
a reception control unit that receives, from the game device via a communication network by means of a communication device, user data indicating a situation of a user who plays a game with the game device;
a playback control data generation unit that generates playback control data for causing a playback device to play back a virtual character; and
a transmission control unit that transmits the playback control data generated by the playback control data generation unit to the game device via the communication network by means of the communication device.
15. The control system according to claim 14, wherein the playback control data generation unit generates the playback control data in accordance with an instruction from an administrator who can refer to the situation of the user indicated by the user data.
16. The control system according to claim 14, wherein the playback control data generation unit generates the playback control data in accordance with the user data received by the reception control unit.
17. The control system according to claim 16, wherein the playback control data generation unit generates the playback control data corresponding to the user data received by the reception control unit, using a generation pattern in which a relationship between control data based on the user data and the playback control data has been learned by machine learning.
18. The control system according to any one of claims 14 to 17, further comprising a progress instruction data generating unit that generates progress instruction data that instructs progress of the game, wherein
the reception control unit receives, from the game device via the communication network by means of the communication device, progress data indicating a progress status of the game, and
the transmission control unit transmits the progress instruction data generated by the progress instruction data generating unit to the game device via the communication network.
19. An operation method of a game device that controls a game, wherein the game device performs the following operations:
generating user data indicating a situation of a user who plays the game;
transmitting the user data to a control system via a communication network by means of a communication device;
receiving, from the control system via the communication network by means of the communication device, playback control data for causing a playback device to play back a virtual character; and
playing back the virtual character on the playback device in accordance with the playback control data in parallel with the control of the game.
20. An operation method of a control system capable of communicating with a game device that controls a game, wherein the control system performs the following operations:
receiving, from the game device via a communication network by means of a communication device, user data indicating a situation of a user who plays the game with the game device;
generating playback control data for causing a playback device to play back a virtual character; and
transmitting the playback control data to the game device via the communication network by means of the communication device.
21. A program for causing one or more processors included in a game device to execute:
a game control process of controlling a game;
a user data generation process of generating user data indicating a situation of a user who plays the game;
a transmission process of transmitting the user data to a control system via a communication network by means of a communication device;
a reception process of receiving, from the control system via the communication network by means of the communication device, playback control data for causing a playback device to play back a virtual character; and
a playback process of playing back the virtual character on the playback device in accordance with the playback control data in parallel with the game control process.
22. A program for causing one or more processors included in a control system capable of communicating with a game device that controls a game to execute:
a reception process of receiving, from the game device via a communication network by means of a communication device, user data indicating a situation of a user who plays a game with the game device;
a generation process of generating playback control data for causing a playback device to play back a virtual character; and
a transmission process of transmitting the playback control data to the game device via the communication network by means of the communication device.
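For readers tracing the claimed architecture, the minimal sketch below mirrors the data exchange recited in claims 9, 13, and 18 (user data and progress data toward the control system; playback control data and progress instruction data back to the game device). The class names, field names, and in-memory message passing are illustrative assumptions, not an implementation from the publication.

```python
# Minimal sketch of the claimed exchange, assuming messages are plain dictionaries.
class ControlSystem:
    def handle(self, message: dict) -> dict:
        """Receive user data / progress data, return playback control and progress instruction data."""
        reply = {"playback_control": {"motion": "encourage"}}      # generated from user data
        if message.get("progress") == "near_win":                  # progress data from the device
            reply["progress_instruction"] = "raise_payout_odds"    # progress instruction data
        return reply

class GameDevice:
    def __init__(self, control_system: ControlSystem) -> None:
        self.control_system = control_system

    def play_one_round(self) -> None:
        message = {"user": {"smiling": True}, "progress": "near_win"}   # user data + progress data
        reply = self.control_system.handle(message)                     # transmission / reception
        print("play back character:", reply["playback_control"])        # playback control data
        if "progress_instruction" in reply:
            print("adjust game progress:", reply["progress_instruction"])

GameDevice(ControlSystem()).play_one_round()
```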
CN202080030268.3A 2019-04-25 2020-04-16 Game device, game system, control system, game device operation method, control system operation method, and program Pending CN114650872A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2019084229A JP7419635B2 (en) 2019-04-25 2019-04-25 Game systems, control systems, control methods and programs
JP2019-084229 2019-04-25
JP2019-084230 2019-04-25
JP2019084230A JP6762519B1 (en) 2019-04-25 2019-04-25 Game system
PCT/JP2020/016685 WO2020218142A1 (en) 2019-04-25 2020-04-16 Game device, game system, control system, method for operation of game device, method for operation of control system, and program

Publications (1)

Publication Number Publication Date
CN114650872A true CN114650872A (en) 2022-06-21

Family

ID=72942480

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080030268.3A Pending CN114650872A (en) 2019-04-25 2020-04-16 Game device, game system, control system, game device operation method, control system operation method, and program

Country Status (3)

Country Link
CN (1) CN114650872A (en)
TW (1) TWI753409B (en)
WO (1) WO2020218142A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112774189B (en) * 2021-02-08 2023-03-28 腾讯科技(深圳)有限公司 Picture display method, device, terminal and storage medium

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9339728B2 (en) * 2002-12-10 2016-05-17 Sony Interactive Entertainment America Llc System and method for managing audio and video channels for video game players and spectators
JP2005326670A (en) * 2004-05-14 2005-11-24 Hiroshi Sato Mobile terminal device, information processing method, and service providing system
JP5729672B2 (en) * 2009-08-06 2015-06-03 株式会社セガゲームス Game device
KR101816014B1 (en) * 2013-05-30 2018-02-21 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 Controlling a massively multiplayer online role-playing game
US9633526B2 (en) * 2014-04-25 2017-04-25 Cadillac Jack, Inc. Electronic gaming device with near field functionality
JP6500847B2 (en) * 2016-06-20 2019-04-17 株式会社セガゲームス Image generation system and image generation program
JP6351671B2 (en) * 2016-08-26 2018-07-04 株式会社 ディー・エヌ・エー Program, system, and method for adjusting neural network structure and parameters using neuro evolution
US10617961B2 (en) * 2017-05-07 2020-04-14 Interlake Research, Llc Online learning simulator using machine learning
JP6471774B2 (en) * 2017-07-04 2019-02-20 株式会社セガゲームス Information processing system and moving image reproduction method
JP6369734B1 (en) * 2017-07-07 2018-08-08 株式会社コナミアミューズメント GAME DEVICE AND GAME DEVICE PROGRAM
JP2018171448A (en) * 2018-04-12 2018-11-08 株式会社ドワンゴ Learning device, learning method, learning program, moving image distribution device, object action device, object action program, and moving image generation device
CN109240576B (en) * 2018-09-03 2021-09-07 网易(杭州)网络有限公司 Image processing method and device in game, electronic device and storage medium

Also Published As

Publication number Publication date
WO2020218142A1 (en) 2020-10-29
TWI753409B (en) 2022-01-21
TW202039041A (en) 2020-11-01

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination