CN109789335B - Game device, game article, and recording medium - Google Patents

Game device, game article, and recording medium

Info

Publication number
CN109789335B
Authority
CN
China
Prior art keywords
game
area
card
unit
region
Prior art date
Legal status
Active
Application number
CN201780058493.6A
Other languages
Chinese (zh)
Other versions
CN109789335A (en)
Inventor
樋口亘
宫崎嵩广
Current Assignee
Bandai Co Ltd
Original Assignee
Bandai Co Ltd
Priority date
Filing date
Publication date
Application filed by Bandai Co Ltd filed Critical Bandai Co Ltd
Publication of CN109789335A publication Critical patent/CN109789335A/en
Application granted granted Critical
Publication of CN109789335B publication Critical patent/CN109789335B/en

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/213: Input arrangements characterised by their sensors, purposes or types, comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/22: Setup operations, e.g. calibration, key configuration or button assignment
    • A63F 13/25: Output arrangements for video game devices
    • A63F 13/426: Processing input control signals by mapping the input signals into game commands, involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F 13/53: Controlling the output signals based on the game progress, involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/90: Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F 13/95: Storage media specially adapted for storing game information, e.g. video game cartridges

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An object is to reduce the execution of actions related to operation inputs unintended by the user. The game device has a table top, detects a first type of object and a second type of object different from the first type that are present on the table top, and executes a game reflecting the detection result. The game device sets, on the table top, a first area in which the detection result is reflected in the game and a second area in which it is not, and changes these areas according to the progress of the game.

Description

Game device, game article, and recording medium
Technical Field
The present invention relates to a game device, a game article, and a program, and more particularly to a game device that executes a game based on information acquired from an article.
Background
There are game devices that have a discharge function for discharging an article such as a card and that, by acquiring information from a card owned by a user, provide a play experience in which a game element corresponding to that article appears in the game. Patent document 1 discloses a card game device in which a plurality of cards are placed on a table, and input operations that change the positions of the cards move the corresponding characters in the game.
Documents of the prior art
Patent document
Patent document 1: Japanese Patent Laid-Open No. 2004-041740
Disclosure of Invention
Problems to be solved by the invention
In a so-called arcade game device, which is typically installed in a store, facility, or the like to provide a game, an article held by the user may be placed on the operation table during play. In particular, when the operation table has a relatively large area, as in the card game device described in patent document 1, it is easy for a user's wallet, cards not used in the current play, and the like to end up on the table. Therefore, in a situation where objects can be placed on the table intended for the articles used in operation, a poorer game experience may result: an action related to an operation input the user did not intend may be executed, or an action related to an operation input the user did intend may not be executed appropriately.
The present invention has been made in view of the above problems, and an object of the present invention is to provide a game device, a game article, and a program that reduce the execution of actions related to operation inputs unintended by the user.
Means for solving the problems
In order to achieve the above object, a game device according to the present invention includes: a table top; a detection unit that detects a first kind of object and a second kind of object different from the first kind of object that are present on the table top; an execution unit that executes a game reflecting a detection result of the detection unit, the game being based on information acquired from the object of the first type; and a setting unit that sets, for the table top, a first area in which the detection result of the detection unit is reflected in the game and a second area in which the detection result of the detection unit is not reflected in the game, wherein the setting unit changes the first area and the second area in accordance with the progress of the game.
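The claimed area control can be illustrated with a short sketch. This is a hypothetical Python illustration, not the patented implementation; the `Rect` type, the sequence names, and the coordinate values are all invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    """Axis-aligned rectangle on the table top (units are arbitrary)."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

# Example: per-sequence first areas on a 1280x720 table top (values invented).
# Everything outside the first area is the second area, where detections
# are not reflected in the game.
FIRST_AREA_BY_SEQUENCE = {
    "deck_setup": Rect(0, 0, 1280, 720),    # whole table accepts cards
    "battle":     Rect(0, 360, 1280, 360),  # only the near half is active
}

def detection_is_reflected(sequence: str, x: float, y: float) -> bool:
    """True if a detection at (x, y) falls in the first area for the
    current game sequence, i.e. the setting unit would reflect it."""
    return FIRST_AREA_BY_SEQUENCE[sequence].contains(x, y)
```

Changing the entry looked up per sequence is the "change according to the progress of the game" of the claim: the same physical detection is accepted in one sequence and ignored in another.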
Advantageous Effects of Invention
With such a configuration, according to the present invention, it is possible to reduce the execution of an action related to an operation input unintended by a user.
Drawings
Fig. 1 is a block diagram showing a functional configuration of a game device 100 according to an embodiment of the present invention.
Fig. 2 is a diagram illustrating an external appearance of the game device 100 according to the embodiment of the present invention.
Fig. 3 is a diagram for explaining a structural example of the mounting panel 131 according to the embodiment of the present invention.
Fig. 4 shows an example of a data structure of various information used in the game device 100 according to the embodiment of the present invention.
Fig. 5 is a diagram illustrating the display of the first display unit 120 and the second display unit 130 in each sequence according to the embodiment of the present invention.
Fig. 6 is a diagram illustrating a related image displayed on a physical card when the physical card is placed on the placement panel 131 according to the embodiment of the present invention.
Fig. 7 is a diagram illustrating an example of area control in each sequence according to the embodiment of the present invention.
Fig. 8 is a flowchart illustrating an area control process (progress) executed by the game device 100 according to embodiment 1 of the present invention.
Fig. 9 is a flowchart illustrating an area control process (dynamic state) executed by the game device 100 according to embodiment 2 of the present invention.
Detailed Description
[Embodiment 1]
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. The embodiment below describes an example in which the present invention is applied to a game device that provides a game based on information acquired from an article. However, the present invention can be applied to any device that can execute a game based on information acquired from an article.
In the present embodiment, the article that can be used in the game and is discharged or otherwise circulated by the game device is assumed to be a card, but the article is not limited to a card as long as it is configured so that the article information described later can be acquired from it. The article may be a shaped object such as a figure having the appearance of a game element (character, item). In this case, the article information may be acquired from a pattern attached to the bottom surface or another predetermined surface of the shaped object by a sticker, printing, or the like, or may be acquired from a recording medium inside the shaped object. The article need not be something held in the hand, and may be any article such as a toy or a sticker.
In the present embodiment, the following configuration is described: for every card that can be used in the game, fixed article information is converted into a one-dimensional or multi-dimensional pattern (code) by a predetermined conversion operation, and the pattern is attached to the card by being formed (printed) on the card surface. In the present embodiment, the code is assumed to be formed on the card surface by printing with invisible ink.
However, the present invention is not limited to this; the code may be visibly printed on the card surface, or a predetermined identification pattern related to the code may be formed on an intermediate layer of the card. Alternatively, the article information may be recorded in a Near Field Communication (NFC) tag incorporated in the card, with acquisition and change of the information performed by a predetermined reader/writer. The method of attaching variable article information to the card is not limited to an NFC tag, and may be any method of retaining data, such as recording it in a recording medium such as an IC chip.
In the present embodiment, the game element made to appear by using a card is assumed to be a character in the game provided by the game device, and a picture (character image) of the corresponding character is attached to the card. However, the implementation of the present invention is of course not limited to this; the card specifying a game element of the executed game is not limited to one specifying a character, and may specify other game elements such as items and effects.
Game device 100
Here, the functional configuration of the game device 100 will be described with reference to the block diagram of fig. 1.
The control unit 101 is, for example, a CPU, and controls the operation of each module included in the game device 100. Specifically, the control unit 101 reads out an operation program of each module recorded in the recording medium 102, for example, and controls the operation of each module by expanding and executing the operation program in the memory 103.
The recording medium 102 is a recording device capable of permanently holding data, such as a nonvolatile memory or an HDD. The recording medium 102 stores, in addition to the operation programs of the respective modules included in the game device 100, parameter information necessary for the operation of the respective modules, various kinds of graphic data used in the game executed by the game device 100, and the like. The memory 103 is a storage device for temporarily storing data, such as a volatile memory. The memory 103 is used not only as an expansion area for the operation program of each module but also as a storage area for temporarily storing data and the like output during the operation of each module.
The payment detection unit 104 detects that payment of consideration has been made to the game device 100. For example, the payment may be determined by detecting that coins or corresponding tokens of a predetermined amount have been inserted into the coin slot, or by detecting the completion of a settlement process based on communication with a chip for a predetermined electronic money. The game device 100 of the present embodiment is described as starting, based on the payment of consideration, the provision of a service accompanied by the discharge of a card; however, payment of consideration is not an essential requirement, and provision of the service may be started based on a predetermined start instruction.
The display control unit 105 includes, for example, a drawing device such as a GPU, and in the present embodiment generates and controls the screens displayed on the first display unit 120 and the second display unit 130. Specifically, during operation of the game device 100 (during game play or in the standby state), the display control unit 105 performs appropriate arithmetic processing on the necessary drawing objects based on processing and commands from the control unit 101, and draws a screen. The generated screen is output to the first display unit 120 or the second display unit 130, which are display devices either in the same housing as the game device 100 or detachably connected to its exterior, and is presented to the user by being displayed in a predetermined display area.
As shown in fig. 2, the game device 100 of the present embodiment includes two display devices for displaying game screens (the first display unit 120 and the second display unit 130), and the display control unit 105 generates a game screen for each of them. As shown in the figure, in the present embodiment the second display unit 130 is formed as a mounting panel 131 having a top plate (mounting surface) on which a card can be placed within the display region of the second display unit 130. The user can perform some of the operation inputs related to the game by moving a card placed on the mounting surface at predetermined timings. The game screens displayed on the second display unit 130 will be described later; to provide a preferable game experience, they are of a plurality of types, including, as shown in fig. 2, a screen presenting an image of the areas that serve as references for placing cards (a two-dimensional image configured to clearly indicate the positions and areas where cards are to be placed). The first display unit 120, on the other hand, basically displays game screens generated based on the card moving operations performed on the mounting panel 131, for example a game screen drawn from the viewpoint of the character corresponding to a card, or of that character and other characters appearing in the game, and various presentation screens.
Structure of the mounting panel 131
Here, the structure of the mounting panel 131 having the second display unit 130 will be described in detail with reference to fig. 3.
The mounting panel 131 of the present embodiment is configured to receive movement operations for both a physical card, which is actually placed on the mounting surface and has a physical entity in the real world, and a virtual card, which has no physical entity in the real world but is treated in the program as equivalent to a physical card; the details will be described later. A movement operation for a physical card is input by physically moving the card on the mounting panel 131. A movement operation for a virtual card, on the other hand, is input by a touch operation on the mounting panel 131 (an operation recognized when an object other than a usable physical card, such as a finger or a pen, contacts or approaches the mounting panel 131). Touch operations are also used for operation input to GUIs and the like displayed on the second display unit 130.
In order to detect these two types of operation inputs, the mounting panel 131 of the present embodiment has a stacked structure as shown in figs. 3 (a) and 3 (b). As shown in fig. 3 (a), the mounting panel 131 includes a touch operation detection layer 301, a tempered glass layer 302, a liquid crystal panel layer 303, and a light guide layer 304, and is configured so that a physical card can be placed on the top plate, the placed physical card can be identified and its position detected, a game screen for card movement operations can be displayed, and movement operation inputs for virtual cards included in the game screen can be detected.
The touch operation detection layer 301 is a layer in which a touch operation performed with a user's finger or the like can be detected by the touch operation detection unit 141 described later. In the present embodiment, for simplicity, the touch operation detection layer 301 is described as a layer formed by the group of optical paths of light-emitting and light-receiving element rows arranged in lines on an outer frame 305 provided around the display area of the second display unit 130, as shown in fig. 3 (b). Each optical path is formed, for example, by adjusting light emitted from a light-emitting element such as an infrared LED arranged on one side of the outer frame 305 so that it is incident on the light-receiving element arranged opposite it as a pair. In the present embodiment, in order to specify the two-dimensional coordinates of the position where a touch operation is performed, the mounting panel 131 is provided with a light-emitting element row on one side and a light-receiving element row on the opposite side in both the lateral and longitudinal directions (X and Y directions). Although the infrared-blocking method is described here as the touch detection method, the present invention is not limited to this, and various methods such as the resistive-film method, the capacitive method, the ultrasonic method, and the infrared-camera method may be used.
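As a rough illustration of the infrared-blocking scheme just described, the sketch below maps blocked beam indices on the two axes to coordinates. The function name, beam pitch, and grid size are assumptions made for the example, not values from the patent.

```python
# Hypothetical sketch: one emitter/receiver pair per grid line in X and Y.
# A touch blocks one or more beams on each axis; the indices of the
# blocked beams give the touch coordinates.

def blocked_to_positions(blocked_x: list[bool], blocked_y: list[bool],
                         pitch_mm: float = 5.0):
    """Map blocked beam indices to coordinates along each axis.
    Returns (xs, ys): centre positions of the blocked beams in mm."""
    xs = [i * pitch_mm for i, b in enumerate(blocked_x) if b]
    ys = [j * pitch_mm for j, b in enumerate(blocked_y) if b]
    return xs, ys

# A finger blocking beams 3 and 4 in X and beam 7 in Y:
bx = [False] * 10
bx[3] = bx[4] = True
by = [False] * 10
by[7] = True
xs, ys = blocked_to_positions(bx, by)
# xs == [15.0, 20.0], ys == [35.0]
```

Note that a single beam grid cannot by itself disambiguate two simultaneous touches (two blocked X beams and two blocked Y beams yield four candidate intersections), which is one reason the text later mentions grouping adjacent detections into one object.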
In the touch detection according to the present invention, it is preferable that, even when an obstacle is present at some position on the mounting panel 131, a touch operation performed in the area shadowed by that obstacle can still be detected from the position of some detection element. This can be realized, for example, by a structure in which all four sides or all four corners of the outer frame 305 detect the position of a touch operation by detecting reflected light.
The tempered glass layer 302 is provided to protect the liquid crystal panel layer 303 and to ensure enough strength to allow movement operations of physical and virtual cards. In the present embodiment, since the touch operation detection layer 301 performs touch detection by the infrared-blocking method as described above, the tempered glass layer 302 substantially serves as the top plate of the second display unit 130 in the mounting panel 131. The tempered glass layer 302 therefore serves as the surface that supports a placed physical card and on which its movement operations are performed. However, the top plate of the second display unit 130 need not be the tempered glass layer 302; depending on the touch detection method, it may be the touch operation detection layer 301. In the present embodiment, for simplicity, the tempered glass layer 302 is described as being made of glass, but it is not limited to glass and may be made of any material as long as the material is transparent.
The liquid crystal panel layer 303 is a liquid crystal panel that controls the light emission amount of each pixel for each color component based on the game screen displayed on the second display unit 130. The display control of the liquid crystal panel is performed based on the corresponding game screen generated by the display control unit 105, and its details are not described here.
As shown in fig. 3 (b), the light guide layer 304 is a member that performs surface emission by means of a white LED group linearly arranged in the outer frame 306, so as to illuminate the entire surface of the liquid crystal panel layer 303.
The mounting panel 131 of the present embodiment is configured to be able to recognize the invisible pattern attached to a physical card placed on the tempered glass layer 302; the details will be described later. More specifically, the second display unit 130 is configured so that, while displaying the game screen generated by the display control unit 105, the identification, position, and rotation of a physical card placed on the mounting panel 131 can be detected from the back side of the second display unit 130 (the inside of the housing of the game device 100). For example, when the invisible code attached to a physical card is formed with infrared-reflective ink, the liquid crystal panel layer and the light guide layer of the mounting panel 131, which are necessary for display, are both configured to transmit infrared light, and no housing that would act as a shield is provided on the bottom surface of the light guide layer.
In the present embodiment, information indicating which character a physical card corresponds to is assumed to be printed as an invisible pattern using infrared-reflective ink, which is why the mounting panel 131 has the laminated structure described above; it should be easily understood, however, that the structure of the mounting panel 131 may differ depending on the method used to identify physical cards.
The operation input unit 106 is a user interface provided in the game device 100, detects an operation input by a user, and outputs a corresponding control signal to the control unit 101. The operation input unit 106 detects that an operation input is made to an operation member such as a button provided in the game device 100 as shown in fig. 2. The operation input unit 106 of the present embodiment includes: a touch operation detection unit 141 that detects a touch operation performed on the placement panel 131; and a card detection unit 142 that detects the identity of the physical card placed on the placement panel 131 and the position of the physical card.
When the touch operation detection unit 141 detects that infrared light is blocked in the touch operation detection layer 301 as described above, it outputs the information of the position where the light was blocked as information of the detected touch operation. Because the touch operation detection unit 141 of the present embodiment uses the above detection method, it detects any obstacle (object blocking an optical path) present in the touch operation detection layer 301. That is, the touch operation detection unit 141 detects as an obstacle not only a so-called touch operation performed intentionally by the user with a finger or a pen, but also a belonging of the user placed on the tempered glass layer 302, or a part of the body such as an elbow or wrist that the user unintentionally brings into contact with or close to the touch operation detection layer 301. Further, although this depends on the number of light-emitting and light-receiving elements provided on the outer frame 305, when the elements are arranged at a resolution that gives virtual cards an operation feeling equivalent to that of physical cards, the touch operation detection unit 141 can detect a single obstacle across a coordinate range having a certain area.
Accordingly, an obstacle detected by the touch operation detection unit 141 as existing on the touch operation detection layer 301 (an object of the second type according to the present invention) may block the optical paths of a plurality of light-emitting/light-receiving element pairs in both the lateral and longitudinal directions. Therefore, when touch operations (including obstacles) are detected at adjacent coordinates, the touch operation detection unit 141 organizes the detection results, for example by an area-division method, so that they are classified as a touch operation by a single object. That is, the touch operation detection unit 141 classifies the detection results so that one detection result is derived for each touch operation estimated to be caused by one object. In this case, for a detection result that has a certain area, the touch operation detection unit 141 may derive, for example, the barycentric coordinates of the detected range as the detection position.
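The area-division and barycentre steps described above might be sketched as follows. The clustering rule (Chebyshev adjacency on grid points) and all names are illustrative assumptions, since the patent does not fix a specific algorithm.

```python
# Hypothetical sketch: group adjacent blocked grid points into one touch,
# then take the centroid (barycentre) of each group as the detection position.

def cluster_points(points, max_gap=1):
    """Group integer grid points whose Chebyshev distance to some member
    of a cluster is <= max_gap; a simple stand-in for area division."""
    clusters = []
    for p in points:
        hits = [c for c in clusters
                if any(max(abs(p[0] - q[0]), abs(p[1] - q[1])) <= max_gap
                       for q in c)]
        if hits:
            merged = hits[0]
            merged.append(p)
            for c in hits[1:]:          # p bridges several clusters: merge them
                merged.extend(c)
                clusters.remove(c)
        else:
            clusters.append([p])
    return clusters

def centroid(cluster):
    """Barycentric coordinates of one cluster, used as the detection position."""
    n = len(cluster)
    return (sum(x for x, _ in cluster) / n, sum(y for _, y in cluster) / n)
```

For example, `cluster_points([(0, 0), (1, 0), (5, 5)])` yields two clusters: the adjacent pair is treated as one object, the distant point as another.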
Since, unlike physical cards, the area and number of objects detectable in the touch operation detection layer 301 are not constant, the detection results (touch detection information) may be managed, as shown for example in fig. 4 (a), by associating position information 402 and range information 403 with an object ID 401 assigned to each object detected at an arbitrary time point. The touch detection information is output from the touch operation detection unit 141 and stored, for example, in the memory 103. When detection results at consecutive detection times are estimated to be caused by the same object, the same object ID 401 may be assigned at each time. In this case, the memory 103 may be configured to hold touch detection information for a plurality of detection times over a predetermined period, with a time stamp added, for example, so that they can be distinguished.
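A minimal sketch of how the touch detection information of fig. 4 (a) could be represented. The field names mirror the reference numerals in the text, but the concrete types and the time-stamped history list are assumptions.

```python
from dataclasses import dataclass

@dataclass
class TouchDetection:
    """One record of touch detection information, per fig. 4 (a)."""
    object_id: int      # object ID 401, assigned per detected obstacle
    position: tuple     # position information 402 (centroid x, y)
    extent: tuple       # range information 403 (detected width, height)
    timestamp: float    # added so records held over time can be distinguished

# Records for several detection times are held for a predetermined period,
# e.g. in a list in memory; the same object_id recurs for the same object.
history: list[TouchDetection] = []
history.append(TouchDetection(object_id=1, position=(120.0, 80.0),
                              extent=(14.0, 12.0), timestamp=0.016))
```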
The card detection unit 142, on the other hand, detects a physical card placed on the mounting panel 131 as the first type of object according to the present invention. Specifically, the card detection unit 142 detects whether a physical card is placed on the mounting panel 131, as well as the identification, position, and rotation of each placed card. The card detection unit 142 may, for example, include an infrared camera that captures an image of the second display unit 130 from the back side as described above, extract the invisible code from the captured image, and convert it by applying predetermined image processing, thereby acquiring the article information from the physical card placed on the mounting panel 131. In the present embodiment, for simplicity, each physical card usable in the game device 100 is assumed to correspond to one character appearing in the game.
The article information of a physical card only needs to be configured so that the character that can be made to appear in the game can be specified. In the present embodiment, the article information includes a card ID that uniquely specifies the type of the physical card, and information identifying the character ID of the character corresponding to each card ID is managed in the character DB 107 described later.
Accordingly, while the placement of physical cards to be used in game play is being accepted, when the card detection unit 142 detects from the captured image that a new physical card has been placed (that the captured image includes an invisible code that has not yet been analyzed), it analyzes the invisible code attached to the card and acquires the card ID of that card. The card detection unit 142 also derives and outputs the placement position of each card based on the position in the captured image at which the invisible code was detected. In addition, predetermined information for identifying the rotation (direction) of the physical card, or its front and back, may be added to the invisible code, and the card detection unit 142 may be configured to derive and output the direction of the physical card (its placement direction on the table) when analyzing the invisible code.
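Once the invisible code has been located in the infrared camera image, deriving the card's position and direction could look roughly like this. The corner-based pose computation is an illustrative assumption; the patent does not specify the image processing used.

```python
import math

def card_pose(corners):
    """Given the four (x, y) image points of a located code, in order
    top-left, top-right, bottom-right, bottom-left, return the placement
    position (centre of the code) and the direction as the angle of the
    code's top edge, in degrees."""
    cx = sum(x for x, _ in corners) / 4.0
    cy = sum(y for _, y in corners) / 4.0
    (x0, y0), (x1, y1) = corners[0], corners[1]
    angle = math.degrees(math.atan2(y1 - y0, x1 - x0))
    return (cx, cy), angle

# An axis-aligned 10x10 code at the image origin:
center, angle = card_pose([(0, 0), (10, 0), (10, 10), (0, 10)])
# center == (5.0, 5.0), angle == 0.0
```

A camera-to-table coordinate transform (not shown) would then map the image position to a position on the mounting surface.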
In the present embodiment, the card detection unit 142 acquires the article information of a physical card and detects its position when the card is placed on the placement panel 131, but the present invention is not limited to this. For example, the pattern applied invisibly to the physical card may be used only to identify a physical card whose article information has already been acquired and to detect the movement of the card, without itself including identification information of the character corresponding to the card. Also, while in the present embodiment the card detection unit 142 detects both the position and the direction of the physical card placed on the placement panel 131 in order to provide a preferable game experience, direction detection may be omitted depending on the display mode applied to the physical card on the second display unit 130, the shape of the placement panel 131 or the physical card, the configuration of the invisible code, and the like.
The detection target of the card detection unit 142 is a physical card whose area is fixed and to which a card ID capable of uniquely identifying the card type is added; therefore, unlike the detection target of the touch operation detection unit 141, it is not necessary to manage the detection result by assigning an object ID at each detection. Accordingly, as shown in fig. 4 (b), for example, the detection result (card detection information) detected by the card detection unit 142 for each physical card may be managed as position/rotation information 412 associated with the card ID 411 acquired from the physical card. The card detection information is output from the card detection unit 142 and stored in the memory 103, as with the touch operation detection unit 141. In this case, although both detection results concern the same placement panel 131, the touch detection information of the touch operation detection unit 141 and the card detection information of the card detection unit 142 may be managed separately. The card detection information of the card detection unit 142 may be configured to be held, with a time stamp, for a predetermined period.
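The "held for a predetermined period with a time stamp" behaviour can be sketched as a small store keyed by card ID. The class name, retention period, and tuple layout are illustrative assumptions, not taken from the patent.

```python
import time

class CardDetectionStore:
    """Card detection info keyed by card ID (fig. 4 (b): card ID 411 ->
    position/rotation 412), time-stamped and discarded after a retention
    period, sketching the described timed retention."""

    def __init__(self, retention_sec=5.0):
        self.retention_sec = retention_sec
        self._entries = {}  # card_id -> (position_rotation, timestamp)

    def update(self, card_id, position_rotation, now=None):
        now = time.monotonic() if now is None else now
        self._entries[card_id] = (position_rotation, now)

    def snapshot(self, now=None):
        """Drop stale entries, then return card_id -> position/rotation."""
        now = time.monotonic() if now is None else now
        self._entries = {cid: e for cid, e in self._entries.items()
                         if now - e[1] <= self.retention_sec}
        return {cid: e[0] for cid, e in self._entries.items()}
```

Because the card ID itself is the key, no per-detection object ID is needed, unlike the touch detection information.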
In the present embodiment, the description assumes that the touch detection information and the card detection information include the position or rotation angle of an object present on the placement panel 131 (a predetermined obstacle blocking the optical path in the case of the touch detection information, or a physical card in the case of the card detection information), but the implementation of the present invention is not limited to this. The detection information may be configured to include information indicating a movement state, such as a movement vector, or to include information indicating both a position and a movement state.
The character DB 107 is a database that manages character information for each character predetermined to appear in the game. As shown in fig. 4 (c), the character information may be managed, for example, by associating a character ID 421 that uniquely specifies a character with a character name 422; various in-game parameters 423 of the character (numerical values specific to the character that determine superiority in game progress, such as physical strength, offensive power, and defensive power, as well as various abilities and their activation conditions); drawing information 424 indicating image or model data used for displaying the character on the game screen; and a corresponding card ID 425 indicating the physical card corresponding to the character.
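A record of the character DB 107 as described above might look like the following sketch. All field names and values are invented placeholders mirroring the fields of fig. 4 (c); the lookup direction (card ID to character) is the one used when a card is detected.

```python
CHARACTER_DB = {
    # character_id: record mirroring the fields of fig. 4 (c)
    "CH001": {
        "name": "Example Hero",                        # character name 422
        "params": {"hp": 100, "atk": 30, "def": 20},   # in-game parameters 423
        "drawing": "hero_model.bin",                   # drawing information 424
        "card_id": "CARD-0001",                        # corresponding card ID 425
    },
}

def character_for_card(card_id):
    """Look up the character that corresponds to a detected physical card."""
    for char_id, rec in CHARACTER_DB.items():
        if rec["card_id"] == card_id:
            return char_id, rec
    return None
```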
The area control unit 108 sets, in accordance with the progress of the game, the areas that determine whether or not detection results on the placement panel 131 are reflected in the game. For each time sequence of the provided game play, the area control unit 108 sequentially changes the area of the placement panel 131 in which the detection result is reflected in the game (the reflection area, which is the first area according to the present invention) and the area in which the detection result is not reflected in the game (the non-reflection area, which is the second area according to the present invention); details thereof will be described later.
The discharge control unit 109 controls the discharge of a physical card for one play, based on payment of the consideration. The physical card may be discharged from the game device 100 before, for example, the game sequence related to provision of the predetermined game (the battle game) is started. The discharge unit 150 is, for example, a card dispenser, and may be configured to include a storage box, not shown, in which physical cards are stacked vertically, with a mechanism that discharges the one physical card held at the lowest position of the storage box in response to a discharge command from the discharge control unit 109. When the discharge unit 150 is built into the same housing as the game device 100, the discharged physical card can be provided to the user by being guided to a discharge port 201 (fig. 2) accessible from the outside of the game device 100. Since the physical cards discharged from the game device 100 of the present embodiment carry, on their surface, an invisible code related to unique article information as described above, they are ready-made cards printed in advance, and all physical cards are formed in the same shape and size. In the game device 100 of the present embodiment, objects present on the placement panel 131 are detected to control the progress of the game, but among the objects used for that progress control, only physical cards are discharged by the discharge unit 150. That is, an obstacle such as a part of the human body or an article held by it, which is likewise recognized as an object used for progress control, cannot be a discharge target of the discharge unit 150.
The communication unit 110 is a communication interface of the game device 100 with external devices. The communication unit 110 can be connected to an external device, whether by wire or wirelessly, via a communication network such as the internet, not shown, or via a communication medium such as a cable, and can transmit and receive data. The communication unit 110 converts, for example, information input as a transmission target into data of a predetermined format and transmits the data to an external device such as a server via the network. When receiving information from an external device via the network, the communication unit 110 decodes the information and stores it in the memory 103, for example. The game device 100 of the present embodiment is configured to be able to receive, from an external device via the communication unit 110, program data in which a program for game-related processing is packaged. When the communication unit 110 receives such program data together with an update request, the control unit 101 can update the program of the game-related processing currently stored in the recording medium 102 using the received program data in accordance with the request. The update processing of the program for game-related processing may also be executed automatically when a recording medium on which program data is recorded is inserted into an optical drive or the like, not shown, of the game device, or may be executed by a start command from the administrator after insertion.
Summary of the game
Here, an outline of a game play experience provided in the game device 100 of the present embodiment will be described.
The game play experience provided by the game device 100 of the present embodiment includes, as its main game element, a round-based battle game executed between a player team composed of the characters used by the user, that is, the characters (game play characters) corresponding to the respective physical cards placed on the placement panel 131 by the user, and an opponent team composed of the characters (opponent characters) of an opponent selected by a predetermined method. In the battle game, the sum of the physical strengths determined for the characters constituting a team is treated as that team's physical strength, for each of the player team and the opponent team, and the team that reduces the opposing team's physical strength to 0 before the predetermined upper limit number of rounds is reached is the winner of the game.
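The victory condition above can be expressed as a short sketch. The function names are invented, and the handling of the case where neither team reaches 0 by the final round is not specified in the text, so it is left as an explicitly unresolved value here.

```python
def team_strength(characters):
    """Team physical strength = sum of the members' individual physical strength."""
    return sum(c["hp"] for c in characters)

def winner(player_hp, opponent_hp, round_no, max_rounds):
    """A team wins when it reduces the opposing team's strength to 0 within
    the upper limit number of rounds; None means the game continues."""
    if opponent_hp <= 0:
        return "player"
    if player_hp <= 0:
        return "opponent"
    if round_no >= max_rounds:
        return "undecided"  # resolution at the final round is not specified here
    return None
```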
One round is composed of a strategy phase, in which the action of each character of the player team in that round is determined in consideration of the states of the player team and the opponent team, and an action phase, in which the actions determined in the strategy phase are executed in accordance with the progress of the game and processing related to the increase or decrease of team physical strength is performed. The battle game does not progress during the strategy phase; when the end condition of the strategy phase is satisfied and the transition to the action phase is made, the battle game progresses in consideration of the actions determined for the player team, the actions determined for the opponent team, the statuses of each character and team, and the like.
Hereinafter, the contents of one play provided when the consideration is paid are described, illustrating the transitions of the game screens displayed on the first display unit 120 and the second display unit 130 at various timings, including during the battle game.
When the consideration is paid, first, as shown in fig. 5 (a), a mode selection sequence for selecting a game mode, such as a scenario, relating to the present battle game begins. As shown in the drawing, in the mode selection sequence, a game screen indicating that a mode selection operation on the placement panel 131 is necessary is displayed on the first display unit 120. Further, a game screen for selecting a game mode, whose content changes in accordance with touch operations, is displayed on the second display unit 130. Here, the game screen displayed on the second display unit 130 is configured such that a group of icons 501 associated with a plurality of game modes is arranged side by side in the horizontal direction, and the icon 501 displayed mainly (in front) changes in accordance with left-right switching operations. Each icon 501 is selectable by a touch operation; to play a desired game mode, the user performs switching operations until the icon 501 for that mode is displayed mainly, and then selects it from the mainly displayed icons by a touch operation.
When the selection of the game mode is completed, the flow shifts to a character registration sequence for registering, using physical cards, the characters (game play characters) to be used in the battle game. In the character registration sequence, as shown in fig. 5 (b), a game screen is displayed on the first display unit 120 indicating that a game play character can be registered by placing its physical card on the placement panel 131, and indicating that a placed physical card has been recognized. The indication that a physical card has been recognized may be made by displaying an image of the character 511 corresponding to the physical card placed on the placement panel 131 at a position in the game screen of the first display unit 120 corresponding to the placement position on the placement panel 131. Further, the second display unit 130 displays a game screen indicating the area 512 of the placement panel 131 on which a physical card to be registered should be placed. For example, at most 7 characters can be registered for use in one battle game, and a display indicating how many characters can still be registered may be included in the game screens displayed on the first display unit 120 and the second display unit 130.
When the registration of the game play characters is completed, the flow shifts to the sequence associated with the battle game. As described above, since the battle game is composed of the strategy phase and the action phase, each is processed as a different time sequence. Hereinafter, the screen transitions of the first display unit 120 and the second display unit 130 will be described in accordance with the flow of processing performed in each phase.
Strategy phase
In the strategy phase, a game screen as shown in fig. 5 (c), for determining a guideline for the action of each game play character in the subsequent action phase (of the same round), is displayed on the first display unit 120 and the second display unit 130.
As shown in the figure, the attack area 521 and the standby area 522 in the game screen displayed on the second display unit 130 correspond to areas in which game play characters can be arranged in the in-game space (3D space) of the battle game. The user can arrange a character in the in-game space by placing the corresponding physical card on the placement panel 131 of the second display unit 130 so that at least a part of the card overlaps the attack area 521 or the standby area 522 displayed on the second display unit 130. For example, as shown in the drawing, when the physical cards 523, 524, and 525 are placed on the placement panel 131, the characters 526, 527, and 528 corresponding to the respective cards are arranged in the game space of the battle game so that, on the game screen displayed on the first display unit 120, the relative relationship among them is maintained.
The attack area 521 and the standby area 522 are areas in which actions in the game, such as actions performed in the action phase and effects given at the end of the round, are different for the game character corresponding to the physical card placed (overlapping with the display of the areas).
The standby area 522 is an area in which, during the action phase, the character does not perform an action (attack) that reduces the physical strength of the opponent team, but action points are replenished for the game play character corresponding to the placed card. Action points are points consumed in the action phase to enable the character consuming them to perform an attack action. Since a game play character cannot perform an attack action when its action points are 0, the user needs to move the card corresponding to that character to the standby area 522 to replenish them. For example, the initial value of the action points, the amount gained by being placed in the standby area 522, and the maximum value up to which they can be replenished may be determined for each character by the various parameters 423 in the character information.
On the other hand, the attack area 521 is an area in which the game character corresponding to the mounted card performs an attack action by consuming the action point. As shown in fig. 5 (c), the attack area 521 is divided into three areas, and is controlled so that the shorter the distance from the opponent team, the more action points are consumed, and the more the physical strength of the opponent team is reduced in the attack action. Here, the attack area 521 and the standby area 522 correspond to the space in the game determined when the opponent team is confronted in the battle game, and the distances from the opponent team are set to increase in the order of the attack area 521 and the standby area 522. That is, in fig. 5 (c), the upper end of the attack area 521 indicates the foremost part of the space facing the opponent team, and the character corresponding to the card can be set in a state in which the player can more easily reduce the physical strength of the opponent team by placing the card at a position on the deeper side of the game device 100 (the upper direction of the attack area 521).
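The behaviour of the two areas can be sketched as follows. The tier names, point costs, and damage values are invented for illustration; the text specifies only that the sub-area closer to the opponent team consumes more action points and reduces more of the opponent team's physical strength.

```python
# Illustrative tier table for the three sub-areas of the attack area 521:
# the nearer the tier is to the opponent team, the higher the cost and damage.
ATTACK_TIERS = {
    "near": {"cost": 3, "damage": 30},
    "mid":  {"cost": 2, "damage": 20},
    "far":  {"cost": 1, "damage": 10},
}

def attack(character, tier):
    """Attack action in the attack area: consume action points and return
    the damage dealt (0 if the balance is insufficient, mirroring the rule
    that a character with 0 action points cannot attack)."""
    cost = ATTACK_TIERS[tier]["cost"]
    if character["action_points"] < cost:
        return 0
    character["action_points"] -= cost
    return ATTACK_TIERS[tier]["damage"]

def standby(character, gain, max_points):
    """Standby area: no attack, but replenish action points up to the
    per-character maximum (from parameters 423)."""
    character["action_points"] = min(character["action_points"] + gain, max_points)
```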
In addition, on the game screen displayed on the second display unit 130, a related image as shown in (a) of fig. 6 is displayed around the mounted physical card to inform the user of various information such as the recognition of the character corresponding to the physical card, the recognition of the physical card at the mounting position, and the current action point of the corresponding character. The related image may be used to display various information related to the corresponding character, and in the present embodiment, the related image is arranged and displayed at a position determined based on the placement position of the card detected by the card detection unit 142.
In the present embodiment, as shown in fig. 6 (a), the related image displayed for one card 600 includes: a frame image 601 that surrounds the placed card and indicates that the card has been recognized; a point image 602 indicating the current action points of the game play character corresponding to the card and the replenishment maximum of those points; and a character name 603. However, the present invention is not limited to this; it suffices that the user is notified that the character corresponding to the physical card has been recognized and of its recognition position, so the related image may be configured such that only a character name by which the character can be identified is displayed in the vicinity of the physical card, following the movement of the physical card on the game screen displayed on the second display unit 130. Note that the information that can identify a character need not be limited to the character name, and a character image 611 as shown in fig. 6 (b) may be used.
In the game provided by the game device 100 of the present embodiment, a virtual card can be generated on the game screen displayed on the second display unit 130 by arranging a plurality of physical cards placed on the placement panel 131 in a predetermined formation, during or before the battle game, and the virtual card can then be used during the play. That is, in order to use a virtual card in the game, the user sets a predetermined arrangement by performing a predetermined moving operation on a plurality of physical cards in the attack area 521 or the standby area 522.
Regarding the generation of a virtual card, for example, the following situation may be taken as the condition: as shown in fig. 5 (d), 2 physical cards are brought into contact with each other on the placement panel 131 by a moving operation. The generation condition may also be defined in terms of the positional relationship of the physical cards before they come into contact (or the transition of the positional change of each physical card), or a predetermined operation input, such as a positional relationship in which 2 physical cards are separated by a predetermined distance, or a state in which generation is triggered by a predetermined operation input on 2 such cards. For the combinations of physical cards from which a virtual card can be generated, and for the virtual card (character) newly generated when the generation condition is satisfied, information indicating that a character different from the characters of the underlying physical cards can be generated may be determined in advance in the character DB 107. In the example of fig. 5 (d), when the physical card 523 and the physical card 524 are arranged in the attack area 521 such that one side of each card abuts the other, the virtual card 531 is arranged in the attack area 521 so as to be usable (movable by touch operation) after the generation performance.
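A simplified stand-in for this generation condition is sketched below. The patent would account for card rotation; here both cards are assumed upright and axis-aligned, and "touching" means centre spacing of about one card width or height on one axis with overlap on the other. The card dimensions, IDs, and the `FUSIONS` combination table are all invented placeholders for the combination information kept in the character DB 107.

```python
def cards_touching(a, b, card_w, card_h, tol=2.0):
    """True when two upright cards (centre positions a, b) abut on a side."""
    dx, dy = abs(a[0] - b[0]), abs(a[1] - b[1])
    side_contact = abs(dx - card_w) <= tol and dy < card_h
    top_contact = abs(dy - card_h) <= tol and dx < card_w
    return side_contact or top_contact

# Hypothetical combination table: pairs of card IDs -> generated virtual character.
FUSIONS = {frozenset({"CARD-0001", "CARD-0002"}): "VIRTUAL-CH100"}

def try_generate_virtual(id_a, pos_a, id_b, pos_b, card_w=59, card_h=86):
    """Return the generated virtual character if the two cards are in
    contact and form a registered combination, else None."""
    if cards_touching(pos_a, pos_b, card_w, card_h):
        return FUSIONS.get(frozenset({id_a, id_b}))
    return None
```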
In the present embodiment, the virtual cards are displayed on the second display unit 130 in the same appearance as the physical cards so that the virtual cards can be recognized by the user as game articles equivalent to the physical cards. The same appearance as the physical card means that the size, shape, description item configuration, and display form of the related image are all the same as those of the physical card in the placement panel 131. In particular, an image of a physical card issued to a character generated as a virtual card or a printed image when the physical card is issued may be displayed on the game screen as a virtual card having the same appearance as the physical card. As described above, even when the game object of the physical object is a predetermined toy body other than a card, the game object serving as the virtual object has the same appearance as the toy body.
In this manner, in the strategic phase related to one round, the user can determine the action, state change, character arrangement in the game, and the like of the character corresponding to each card in the action phase of the round by performing a move operation on the physical card placed on the placement panel 131 and the displayed virtual card.
Action phase
When the strategy phase ends (by a decision operation or by timeout), the action phase of that round is started. The characters that act in the action phase are basically the characters arranged in the attack area 521, for both the player team and the opponent team. The action phase is a phase in which the attack actions based on the results of the card moving operations performed in the strategy phase are presented, and it progresses without requiring card moving operations or the like.
As shown in fig. 5 (e), in the action phase, a game screen relating to the attack performances accompanying the actions of the characters appearing in the battle game is displayed on the first display unit 120. In other words, the game screen displayed on the first display unit 120 is a series of screen transitions displayed for events generated by changes in the game, preferably events generated by operations performed by the user, that progresses while continuously changing its content in at least some sections, and this progress is controlled in accordance with the operations of the strategy phase.
On the other hand, as shown in the drawing, the game screen displayed on the second display unit 130 in the action phase displays neither the related images relating to the cards nor the attack area 521 and the standby area 522, so as to avoid leading the user to misunderstand that a card operation is necessary, and to make it easier for the user to concentrate on the game screen of the first display unit 120 relating to the attack performance. In the present embodiment, the game screen on which no related image relating to the cards is displayed shows a bird's-eye view of the in-game space of the battle game on the second display unit 130 during the action phase.
The round processing consisting of such a strategy phase and action phase is repeatedly executed until the end condition of the battle game is satisfied. The end condition of the battle game may be determined based on whether the team physical strength of either team has been reduced to 0, or whether the current round is the final round (the round processing has been performed a predetermined number of times). When the end condition is satisfied, the sequence related to the battle game ends, and after the result display or the like is performed, the provision of the one play is ended.
Area control according to game progress
Next, with reference to fig. 7, the area control performed by the area control unit 108 in accordance with the progress of the game in the series of flows related to the provision of game play, that is, the setting of the area in which detection results are reflected in the game (the reflection area) and the area in which detection results are not reflected (the non-reflection area), will be described. Reflecting a detection result in the game means, for example, that position information, various parameters, action points, and the like managed in the game for each character of the player team (whether corresponding to a physical card or a virtual card) are changed in accordance with the movement of the detected object, or that the drawing of the game screen is changed in accordance with that movement.
First, in the mode selection sequence, as shown in the game screen displayed on the second display unit 130 in fig. 5 (a), touch operations performed on the placement panel 131 must be detected for the switching and selection operations of the group of icons 501 related to the game modes. In this case, it is presumed that the user will avoid placing obstacles such as held articles in the area where the icon group is arranged, or will remove them even if placed, in order to select a game mode and to maintain the visibility of the group of icons 501. On the other hand, as shown in the figure, since the area in which the icon group is arranged is limited relative to the whole game screen, it is unlikely that a touch operation will be performed at a position distant from that area. Furthermore, in contrast to the icon area, obstacles such as a wallet or a card case, or physical cards intended for use in the subsequent character registration sequence, may well be placed in the other regions.
Therefore, in the mode selection sequence, the area control unit 108 sets the peripheral area 701 of the group of icons 501, shown by hatching in fig. 7 (a), as the reflection area, and sets the remaining area as the non-reflection area. That is, in the mode selection sequence, the touch operation detection unit 141 and the card detection unit 142 still detect physical cards and touch operations, but detection results for the non-reflection area are not reflected in the information managed for the processing executed by the control unit 101; only detection results for the reflection area are reflected. In the present embodiment, since the game mode is selected by touch operation in the mode selection sequence, the card detection unit 142 may also be controlled so as not to perform detection in any area.
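This reflection filtering, in which detections still occur everywhere but only those inside a reflection area are applied to the game state, can be sketched minimally as follows (rectangle representation and function names are invented for illustration):

```python
def in_rect(point, rect):
    """Axis-aligned containment test: rect = (x0, y0, x1, y1)."""
    x, y = point
    x0, y0, x1, y1 = rect
    return x0 <= x <= x1 and y0 <= y <= y1

def filter_detections(detections, reflection_rects):
    """Keep only detections inside a reflection area. Detections in the
    non-reflection area are still produced by the detection units but are
    simply not applied to the managed game state."""
    return [d for d in detections
            if any(in_rect(d["pos"], r) for r in reflection_rects)]
```

With an empty list of reflection rectangles, every detection is discarded, which is exactly the action-phase setting described later.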
In the character registration sequence, as shown in fig. 5 (b) as the game screen displayed on the second display unit 130, a physical card needs to be placed on the area 512 when a game play character is registered. In this case, it is presumed that the user will avoid placing unnecessary physical cards, held articles, and the like on the area 512, in order to place the physical cards of the characters to be used and to prevent physical cards of characters not to be used from being recognized. On the other hand, in the region outside the area 512, physical cards still under consideration for use may be placed, or a hand may touch the panel when a physical card is placed on the area 512 provided at the upper end of the second display unit 130.
Therefore, in the character registration sequence, the area control unit 108 sets the area 702 corresponding to the area 512, hatched in fig. 7 (b), as the reflection area, and sets the remaining area as the non-reflection area. In the present embodiment, since no virtual card is displayed in the character registration sequence, the touch operation detection unit 141 may also be controlled so as not to perform detection in any area.
When the flow shifts to the sequence related to the battle game, as shown in fig. 5 (c) and (d) as the game screens displayed on the second display unit 130, moving operations relating to the actions of the characters in the subsequent action phase are performed in the attack area 521 and the standby area 522 during the strategy phase. At this time, it is presumed that the user will avoid placing obstacles such as unused physical cards or held articles in the areas of the placement panel 131 corresponding to the attack area 521 and the standby area 522, or will remove them even if placed, so as not to interfere with moving operations of physical cards or touch operations for moving virtual cards. On the other hand, in the regions other than the attack area 521, the standby area 522, and their periphery (a region near the boundary where a physical card is allowed to protrude slightly), unused physical cards, physical cards that were used but have become unusable as the game progressed, held articles, and the like may be placed.
Therefore, in the strategy phase, the area control unit 108 sets the area 703 corresponding to the attack area 521 and the standby area 522, shown by hatching in fig. 7 (c), as the reflection area, and sets the remaining area as the non-reflection area.
On the other hand, in the action phase, as described with respect to fig. 5 (e), the game screen related to the attack performance is displayed on the first display unit 120, so the user naturally watches the first display unit 120. In this case, since the user does not need to perform moving operations or the like on the placement panel 131, a part of the body or the like may rest on the placement panel 131, for example when the user rests a hand.
Therefore, in the action phase, the area control unit 108 sets no reflection area and treats the entire area as the non-reflection area, as in fig. 7 (d) (which shows no hatched area), so that no detection result during the action phase is reflected in the game.
As described above, in the game device 100 of the present embodiment, the settings of the reflection area and the non-reflection area are changed on the assumption of the actions that the user is likely to take at each stage of game progress. It should be noted that the control related to the setting of the reflection area and the non-reflection area shown in the present embodiment is merely an example, and embodiments capable of implementing the present invention are of course not limited to this.
Area control processing (progress)
Specific processing of the area control processing (progress) relating to area control according to game progress, executed by the game device 100 of the present embodiment, will be described below with reference to the flowchart of fig. 8. The processing corresponding to this flowchart can be realized by the control unit 101 reading out the corresponding processing program stored in the recording medium 102, for example, and expanding and executing it in the memory 103. The present area control processing is started, for example, when the payout detection unit 104 detects that the consideration has been paid into a medal insertion slot, not shown, of the game device 100.
When the control unit 101 starts the mode selection sequence in S801, the area control unit 108 sets an area related to the mode selection sequence under the control of the control unit 101 in S802 (the area 701 in fig. 7 a is set as a reflection area).
In S803, the control unit 101 determines whether or not to end the mode selection sequence. The mode selection sequence may be ended in response to a selection completion operation of the game mode being input by the user, or in response to a predetermined time limit having elapsed. When determining that the mode selection sequence has ended, the control unit 101 shifts the process to S804, starts the character registration sequence, and repeats the process of this step when determining that the mode selection sequence has not ended.
When the character registration sequence is started, the area control unit 108 sets the area related to the character registration sequence in S805 (the area 702 in fig. 7(b) is set as a reflection area).
In S806, the control unit 101 determines whether or not to end the character registration sequence. As with the mode selection sequence, the character registration sequence may be ended in response to the user inputting a registration completion operation, or to the lapse of a predetermined time limit. When determining that the character registration sequence has ended, the control unit 101 proceeds to S807 and starts the sequence related to the match-up game. When determining that the character registration sequence has not ended, the control unit 101 repeats the process of this step.
When the control unit 101 starts the process of the strategic stage for the round in S808, the area control unit 108 sets the area related to the strategic stage in S809 (the area 703 in fig. 7(c) is set as a reflection area).
In S810, the control unit 101 determines whether or not to end the process of the strategic stage. When determining that the process of the strategic stage has ended, the control unit 101 shifts the process to S811 to start the action stage; when determining that it has not ended, the control unit 101 repeats the process of this step.
In S812, the area control unit 108 performs the area setting related to the action stage (the entire area is set as a non-reflection area, as shown in fig. 7(d)).
In S813, the control unit 101 determines whether or not to end the process of the action stage. The control unit 101 shifts the process to S814 when determining that the process of the action stage has ended, and repeats the process of this step when determining that it has not ended.
In S814, the control unit 101 determines whether or not the end condition of the match-up game is satisfied. The control unit 101 completes the present area control process when determining that the end condition of the match-up game is satisfied, and returns the process to S808 when determining that the end condition is not satisfied, starting the process of the strategic stage for the next round.
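The phase-by-phase switching of S801–S814 can be summarized as a simple state machine that swaps the active reflection area at each sequence boundary. The sketch below is illustrative only: the class and method names (`AreaControl`, `run_match`, `mode_selected`, and so on) are assumptions, not the actual implementation of the game device 100.

```python
# Illustrative state machine for the area control process (progress),
# following S801-S814. All names are hypothetical, not the patent's code.

MODE_SELECT_AREA = "area_701"  # fig. 7(a)
REGISTER_AREA = "area_702"     # fig. 7(b)
STRATEGY_AREA = "area_703"     # fig. 7(c)

class AreaControl:
    """Stand-in for the area control unit 108."""
    def __init__(self):
        self.reflection_areas = set()

    def set_reflection(self, *areas):
        # Everything outside the listed areas is a non-reflection area;
        # calling with no arguments makes the entire panel non-reflecting.
        self.reflection_areas = set(areas)

def run_match(area_control, device):
    area_control.set_reflection(MODE_SELECT_AREA)   # S801-S802
    while not device.mode_selected():               # S803
        pass
    area_control.set_reflection(REGISTER_AREA)      # S804-S805
    while not device.registration_done():           # S806
        pass
    while True:                                     # S807: match-up game
        area_control.set_reflection(STRATEGY_AREA)  # S808-S809
        while not device.strategy_done():           # S810
            pass
        area_control.set_reflection()               # S811-S812: action stage
        while not device.action_done():             # S813
            pass
        if device.match_over():                     # S814
            break
```

Because the action stage of S812 sets the entire area as non-reflecting, a run that ends at S814 leaves the reflection set empty.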
As described above, according to the game device of the present embodiment, the reflection area and the non-reflection area can be set based on the display areas in which the user is expected to act or watch as the game progresses, so the execution of processing related to operation inputs not intended by the user can be reduced.
In the present embodiment, the touch operation detection unit 141 and the card detection unit 142 have been described as performing detection for the non-reflection area while simply not reflecting the detection result in the game, but the present invention is not limited to this. Since a region set as a non-reflection region can be understood by the user as a region not needed for playing the game, an obstacle, a resting hand, or the like may be placed there; therefore, control may be performed so that at least the detection by the touch operation detection unit 141 is not performed for that region.
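The variation just described, suppressing detection itself for non-reflection regions instead of detecting and discarding, could be sketched as follows. The function and parameter names are assumptions for illustration:

```python
# Hypothetical sketch: skip touch detection entirely for points outside
# every reflection area, instead of detecting and discarding the result.

def handle_touch(point, reflection_areas, detect):
    """Run `detect` only when `point` falls inside a reflection area.

    `reflection_areas` is a list of (x0, y0, x1, y1) rectangles and
    `detect` is the underlying detection routine; both are assumed
    interfaces, not the actual touch operation detection unit 141.
    """
    x, y = point
    for x0, y0, x1, y1 in reflection_areas:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return detect(point)
    return None  # non-reflection region: detection is not even performed
```

Returning `None` for points outside every reflection area means no detection work is done there at all, in contrast to the detect-then-ignore behavior of the embodiment.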
For example, when an image instructing movement of a real card or a virtual card, or an image instructing a touch operation such as drawing a predetermined trajectory, is arranged on the game screen in order to start a predetermined performance in the game, a predetermined region on the mounting panel 131 corresponding to that image may be set as a reflection region, and the regions other than it may be set as non-reflection regions. That is, when an operation input is expected only for a limited reflection area, area control may be performed for the purpose of eliminating processing related to other, unnecessary operation inputs. In this case, since a movement operation directed at the reflection area may pass through the non-reflection area, it is sufficient to have the touch operation detection unit 141 and the card detection unit 142 perform detection and to reflect the detection result in the game once the content of the movement operation has been determined. Further, in order to make it easy for the user to recognize that the effect can be produced only in the limited region, it is preferable that the limited reflection region have an area narrower than the non-reflection region, such as a circular region of a predetermined radius centered on an image of the same size as a card or on the command image.
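As a minimal sketch of the circular limited reflection region described above (the function name, coordinate convention, and use of plain Euclidean distance are all assumptions):

```python
import math

def in_limited_reflection_region(point, center, radius):
    """True when `point` lies within the circular reflection region of
    the predetermined radius centered on the command image. Names and
    geometry here are illustrative assumptions, not the patent's code."""
    return math.hypot(point[0] - center[0], point[1] - center[1]) <= radius
```

Points outside the circle would then be treated under whatever non-reflection handling the device applies elsewhere.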
[ embodiment 2]
In embodiment 1 described above, area control is performed based on the display areas in which the user is expected to act or watch as the game progresses, but the implementation of the present invention is not limited to such area control according to the progress of the game. In the present embodiment, a mode will be described in which the presence of an object whose detection result is estimated not to be reflected in the game is dynamically determined, and the region corresponding to that object is set as a non-reflection region. In the present embodiment, the configuration of the game device 100 is the same as that of embodiment 1, and a description thereof is omitted.
Area control process (dynamic)
A specific area control process (dynamic), which realizes area control that dynamically excludes detection results that should not be reflected and is executed by the game device 100 according to the present embodiment, will be described below with reference to the flowchart of fig. 9. The processing corresponding to this flowchart can be realized by the control unit 101 reading out a corresponding processing program stored in the recording medium 102, for example, and expanding and executing the program in the memory 103. For example, the present area control process is started when the payout detection unit 104 detects that consideration has been paid into a medal insertion slot, not shown, of the game device 100. In addition, the present area control process (dynamic) may be executed in parallel with the area control process (progress) of embodiment 1.
In S901, the control unit 101 determines, based on the detection results of the touch operation detection unit 141 and the card detection unit 142, whether or not an object that can be determined not to have moved for a predetermined period is present on the mounting panel 131. The control unit 101 shifts the process to S902 when determining that such an object is present on the mounting panel 131, and shifts the process to S903 when determining that no such object is present.
In S902, the area control unit 108 sets the area related to the object that can be determined not to have moved as a non-reflection area so that its detection result is not reflected in the game. In this case, the set area is managed as a temporary non-reflection area so as to be distinguishable from the non-reflection areas set according to the progress of the game. In the present embodiment, the area control process (progress) and the area control process (dynamic) are executed in parallel, so that the reflection and non-reflection areas based on the former and the temporary non-reflection areas based on the latter are set. Here, the setting made by the area control process (dynamic) is given priority over the setting made by the area control process (progress): while a temporary non-reflection area is set, that area is always treated as an area whose detection result is not reflected in the game, regardless of the processing result of the area control process (progress). In addition, in the present embodiment, the detection results of the touch operation detection unit 141 and the card detection unit 142 are held in the memory 103 for at least the predetermined period in association with the time of detection, so that the passage of time can be determined.
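The determination of S901–S902, finding objects that have not moved for a predetermined period from timestamped detection results held in memory, might look like the following sketch. The history format, both threshold values, and all names are assumptions, not the device's actual data structures:

```python
# Hypothetical sketch of S901-S902: find objects that have not moved for
# a predetermined period, using timestamped detection samples held in
# memory (cf. memory 103). The history format and thresholds are assumed.

STATIONARY_PERIOD = 5.0  # seconds an object must stay put (assumed)
MOVE_EPSILON = 2.0       # movement tolerance in panel units (assumed)

def stationary_objects(history, now):
    """history maps object id -> list of (timestamp, x, y) samples."""
    result = []
    for obj_id, samples in history.items():
        # Skip objects that have not yet been observed for the full period.
        if not any(now - t >= STATIONARY_PERIOD for t, _, _ in samples):
            continue
        xs = [x for _, x, _ in samples]
        ys = [y for _, _, y in samples]
        if max(xs) - min(xs) <= MOVE_EPSILON and max(ys) - min(ys) <= MOVE_EPSILON:
            result.append(obj_id)
    return result
```

Each id returned would then have its surrounding region registered as a temporary non-reflection area in S902.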
In S903, the control unit 101 determines whether or not any region set as a temporary non-reflection region no longer contains an object. The control unit 101 shifts the process to S904 when determining that an object is no longer present in a region set as a temporary non-reflection region, and shifts the process to S905 when determining that the object is still present.
In S904, the area control unit 108 cancels the temporary non-reflection setting for each temporary non-reflection area in which an object is no longer present. That is, if the area is classified as a reflection area by the area control process (progress) according to the progress of the game, the temporary non-reflection area is changed to a reflection area; if it is classified as a non-reflection area, the area remains a non-reflection area whose detection result is not reflected in the game.
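The priority rule of S902 and the cancellation of S903–S904 can be condensed into two small helpers; the names and data structures below are illustrative assumptions:

```python
# Hypothetical helpers for the priority rule (S902) and the cancellation
# on vacating (S903-S904). Data structures are assumptions for sketching.

def should_reflect(area, progress_reflection, temp_non_reflection):
    """A temporary non-reflection setting always wins over the
    progress-based classification of embodiment 1."""
    if area in temp_non_reflection:
        return False
    return area in progress_reflection

def clear_if_vacated(area, objects_present, temp_non_reflection):
    """Cancel the temporary setting once no object remains in `area`;
    the area then falls back to its progress-based classification."""
    if area in temp_non_reflection and not objects_present(area):
        temp_non_reflection.discard(area)
```

After `clear_if_vacated` removes a vacated area, `should_reflect` naturally returns whatever the progress-based classification dictates, which matches the fallback behavior described for S904.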
In S905, the control unit 101 determines whether or not the end condition of the match-up game is satisfied. The control unit 101 completes the present area control process when determining that the end condition is satisfied, and returns the process to S901 when determining that it is not satisfied. In S901 after the return, the determination of a non-moving object is not performed for regions that have already been set as temporary non-reflection regions.
In this way, when a region arises whose detection result is determined to be one that should not be reflected in the game regardless of the progress of the game, that region can be dynamically set as a temporary non-reflection area and controlled so that its detection result is not reflected in the game. Moreover, even for a region that has once become a temporary non-reflection area, the setting can be dynamically canceled when the obstacle is removed, thereby ensuring user convenience.
In the present embodiment, the description has assumed that a region is set as a temporary non-reflection region when an object placed on the mounting panel 131 does not move for a predetermined period, but the implementation of the present invention is not limited to this. For example, since the card ID of a physical card used for registration of a game character can be determined, such a physical card may be excluded from the determination of not having moved for the predetermined period. It should be understood that the criterion for estimating that an object's detection result should not be reflected in the game is not limited to time, and the estimation may be made based on any criterion, such as the placement area or shape of the object.
[ other embodiments ]
The present invention is not limited to the above-described embodiments, and various changes and modifications can be made without departing from the spirit and scope of the present invention. The game device according to the present invention can also be realized by a program that causes one or more computers to function as the game device. The program can be provided and distributed by being recorded in a computer-readable recording medium or via an electrical communication line.
Description of the reference numerals
100: a game device; 101: a control unit; 102: a recording medium; 103: a memory; 104: a payout detection unit; 105: a display control unit; 106: an operation input unit; 107: a character DB; 108: an area control unit; 109: a discharge control unit; 110: a communication unit; 120: a first display unit; 130: a second display unit; 131: a mounting panel; 141: a touch operation detection unit; 142: a card detection unit; 150: a discharge unit.

Claims (16)

1. A game device is provided with:
a table top;
a detection unit that detects a first kind of object and a second kind of object different from the first kind of object that are present on the top surface, the first kind of object and the second kind of object being independent objects from each other, the first kind of object having information for specifying a game element that is present in the game, the second kind of object having no information for specifying the game element;
an execution unit that executes a game reflecting a detection result of the detection unit, the game being based on information acquired from the first type of object; and
a setting unit that sets, for the table top, a first region in which the detection result of the detection unit is reflected in the game and a second region in which the detection result of the detection unit is not reflected in the game,
wherein the setting means changes the first area and the second area in accordance with the progress of the game.
2. A gaming apparatus as defined in claim 1,
the detection result indicates at least one of a position and a moving state of each of the first type of object and the second type of object on the tabletop.
3. Gaming apparatus according to claim 1 or 2,
the detection unit includes:
a first detection unit that detects the first kind of object existing on the top surface; and
a second detection unit that detects the second kind of object present on the top.
4. Gaming apparatus according to claim 3,
the first detection unit includes an imaging unit that images a placement surface of the table top on which the first type of object is placed from a back surface side of the table top.
5. Gaming apparatus according to claim 3,
the second detection unit includes a light emitting unit disposed on at least 2 sides surrounding the table top, and a light receiving unit receiving light emitted from the light emitting unit.
6. Gaming apparatus according to claim 1 or 2,
the setting unit further sets, as the second region, a region in which an object exists when at least one object, of the first kind of object and the second kind of object, is determined not to have moved for a predetermined period of time.
7. Gaming apparatus according to claim 1 or 2,
the detection unit does not detect at least the second kind of object for the second area.
8. Gaming apparatus according to claim 1 or 2,
further having an ejecting unit that ejects the object in association with the execution of the game,
the discharge target of the discharge unit is the first kind of object,
the second kind of object is not included in the discharge object of the discharge unit.
9. Gaming apparatus according to claim 1 or 2,
the game device is also provided with a display unit, the display unit presents game pictures related to the game through the table board, and the display unit is arranged on the lower layer of the table board.
10. A gaming apparatus as defined in claim 9,
the first region is a region on the table corresponding to a predetermined image arranged on the game screen,
the second region is a region on the table top other than a region corresponding to the predetermined image.
11. Gaming apparatus according to claim 10,
the predetermined image is an image for instructing at least one of the first type of object and the second type of object to be placed or moved in the first area.
12. Gaming apparatus according to claim 10 or 11,
the detection means detects the first type of object and the second type of object with respect to the second region while the predetermined image is displayed.
13. Game apparatus according to claim 10 or 11,
the first region has a narrower area than the second region.
14. A game article detected as the first kind of object by the game device according to any one of claims 9 to 13, wherein
the game article has information for determining a game element appearing in the game,
when the detection means detects the game article, the display means displays an image of a game element specified from information of the game article.
15. The article for game use according to claim 14,
the image of the game element is displayed in the vicinity of the game article on the playing surface.
16. A recording medium storing a program for causing a computer to function as each unit of the game device according to any one of claims 1 to 13.
CN201780058493.6A 2016-11-16 2017-09-28 Game device, game article, and recording medium Active CN109789335B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2016223624A JP6194091B1 (en) 2016-11-16 2016-11-16 GAME DEVICE, GAME ARTICLE, AND PROGRAM
JP2016-223624 2016-11-16
PCT/JP2017/035125 WO2018092427A1 (en) 2016-11-16 2017-09-28 Game device, game article, and program

Publications (2)

Publication Number Publication Date
CN109789335A CN109789335A (en) 2019-05-21
CN109789335B true CN109789335B (en) 2022-11-04

Family

ID=59798935

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780058493.6A Active CN109789335B (en) 2016-11-16 2017-09-28 Game device, game article, and recording medium

Country Status (3)

Country Link
JP (1) JP6194091B1 (en)
CN (1) CN109789335B (en)
WO (1) WO2018092427A1 (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6372944B1 (en) * 2017-09-19 2018-08-15 株式会社コナミアミューズメント GAME SYSTEM AND COMPUTER PROGRAM USED FOR THE SAME
JP6669711B2 (en) * 2017-11-17 2020-03-18 株式会社バンダイ Game device, game system and program
JP6484793B1 (en) * 2018-02-14 2019-03-20 株式会社コナミアミューズメント GAME SYSTEM AND COMPUTER PROGRAM USED FOR THE SAME
JP2019136492A (en) 2019-01-18 2019-08-22 株式会社コナミアミューズメント Game system and computer program used for the same
JP6929901B2 (en) * 2019-06-12 2021-09-01 株式会社バンダイ Game cards, game devices and game programs
JP7319147B2 (en) * 2019-09-05 2023-08-01 株式会社ポケモン CARD USAGE SYSTEM, CARD USAGE METHOD, AND CARD USAGE PROGRAM
JP6913733B2 (en) * 2019-12-05 2021-08-04 株式会社バンダイ Game equipment, programs and game systems

Citations (5)

Publication number Priority date Publication date Assignee Title
CN1960786A (en) * 2004-05-31 2007-05-09 世嘉股份有限公司 Data recording medium and game apparatus
CN101180107A (en) * 2005-05-25 2008-05-14 科乐美数码娱乐株式会社 Game machine, game system, and game progress control method
WO2010041860A2 (en) * 2008-10-07 2010-04-15 (주)에프투시스템 Electronic card game system and electronic card game method using same
JP2015116469A (en) * 2014-06-20 2015-06-25 株式会社バンダイ Game device and program
JP2016107018A (en) * 2014-12-10 2016-06-20 株式会社セガゲームス Game apparatus and storage medium

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
JP2008110206A (en) * 2006-10-05 2008-05-15 Kenji Yoshida Information processing device
JP5517026B2 (en) * 2009-02-18 2014-06-11 株式会社セガ GAME DEVICE, GAME DEVICE CONTROL METHOD, AND GAME DEVICE CONTROL PROGRAM
JP5044731B2 (en) * 2010-04-19 2012-10-10 株式会社Dapリアライズ Portable information processing apparatus having touch panel means and program for portable information processing apparatus
JP5592555B1 (en) * 2013-12-20 2014-09-17 株式会社バンダイ GAME DEVICE AND PROGRAM
WO2015113395A1 (en) * 2014-01-30 2015-08-06 Zheng Shi System and method for directing a moving object on an interactive surface


Also Published As

Publication number Publication date
CN109789335A (en) 2019-05-21
JP6194091B1 (en) 2017-09-06
WO2018092427A1 (en) 2018-05-24
JP2018079095A (en) 2018-05-24

Similar Documents

Publication Publication Date Title
CN109789335B (en) Game device, game article, and recording medium
CN108025214B (en) Game device and program
CN109414617B (en) Game device, game article, and recording medium
US11452935B2 (en) Virtual card game system
EP3525897B1 (en) Game system
US8262476B2 (en) Game apparatus, character and virtual camera control method, program and recording medium
KR20200109351A (en) Interaction system and method
JP2014176724A (en) Game device
JP5997325B1 (en) GAME DEVICE AND PROGRAM
JP2017012766A (en) Game device and program
US10709965B2 (en) Game device, gaming item, and program product
CN116196624A (en) Program, game device, and game system
JP2018134453A (en) Game device, article for game, and program
CN110573222B (en) Game device, computer-readable recording medium, and game system
JP6708540B2 (en) Game device and program
JP6746352B2 (en) Game device and program
JP6951147B2 (en) Game equipment, game systems and programs
JP6681364B2 (en) Game device, game system and program
JP3935102B2 (en) Game device
WO2018179879A1 (en) Game device and article for game
EP2684584B1 (en) Game apparatus
CN116196623A (en) Program, game device, and game system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant