WO2009141913A1 - Dispositif de jeu - Google Patents

Dispositif de jeu Download PDF

Info

Publication number
WO2009141913A1
WO2009141913A1 (PCT/JP2008/059531)
Authority
WO
WIPO (PCT)
Prior art keywords
detection sensor
story
predetermined
operation position
collision
Prior art date
Application number
PCT/JP2008/059531
Other languages
English (en)
Japanese (ja)
Inventor
志郎 野木
Original Assignee
ログイン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ログイン株式会社 filed Critical ログイン株式会社
Priority to PCT/JP2008/059531 priority Critical patent/WO2009141913A1/fr
Priority to JP2009527378A priority patent/JPWO2009141913A1/ja
Publication of WO2009141913A1 publication Critical patent/WO2009141913A1/fr

Links

Images

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0021Tracking a path or terminating locations
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B63/00Targets or goals for ball games
    • A63B63/004Goals of the type used for football, handball, hockey or the like
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00Training appliances or apparatus for special sports
    • A63B69/002Training appliances or apparatus for special sports for football
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0605Decision makers and devices using detection means facilitating arbitration
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/44Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment involving timing of operations, e.g. performing an action within a time slot
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • A63F13/57Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/573Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55Controlling game characters or game objects based on the game progress
    • A63F13/57Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/577Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/812Ball games, e.g. soccer or baseball
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0021Tracking a path or terminating locations
    • A63B2024/0028Tracking the path of an object, e.g. a ball inside a soccer pitch
    • A63B2024/0031Tracking the path of an object, e.g. a ball inside a soccer pitch at the starting point
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0021Tracking a path or terminating locations
    • A63B2024/0037Tracking a path or terminating locations on a target surface or at impact on the ground
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0021Tracking a path or terminating locations
    • A63B2024/0037Tracking a path or terminating locations on a target surface or at impact on the ground
    • A63B2024/0043Systems for locating the point of impact on a specific surface
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B63/00Targets or goals for ball games
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/54Controlling the output signals based on the game progress involving acoustic signals, e.g. for simulating revolutions per minute [RPM] dependent engine sounds in a driving game or reverberation against a virtual wall
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6063Methods for processing data by generating or executing the game program for sound processing
    • A63F2300/6081Methods for processing data by generating or executing the game program for sound processing generating an output signal, e.g. under timing constraints, for spatialization
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/63Methods for processing data by generating or executing the game program for controlling the execution of the game in time
    • A63F2300/638Methods for processing data by generating or executing the game program for controlling the execution of the game in time according to the timing of operation or a time limit
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/64Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/64Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/643Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car by determining the impact between objects, e.g. collision detection
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/64Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A63F2300/646Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car for calculating the trajectory of an object
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8011Ball

Definitions

  • the present invention relates to a technology for realizing a realistic game in a sports game using a moving object such as a ball.
  • a sports game using a computer is a game in which various sports are virtually reproduced and enjoyed, and there are, for example, games using balls such as soccer and baseball.
  • Such a sports game is realized by a home game device or a game device installed in a game center or the like.
  • the performance of home-use game devices has been significantly improved and games closer to reality can now be enjoyed; however, because the machines are for home use, their size is limited, and so the sense of reality has also been limited.
  • a game device used in such an amusement facility is realized by, for example, the game devices disclosed in Patent Documents 1 and 2.
  • however, the game devices of Patent Documents 1 and 2 are large and cannot be realized as home-use game devices.
  • moreover, because the movement range of the ball is large and a large detection mechanism is required to detect its movement, such a device cannot easily be moved or adjusted.
  • the present invention has been made in view of the above circumstances, and it is an object of the present invention to provide a game apparatus that allows a realistic game to be enjoyed while saving space.
  • the present invention provides a game apparatus comprising: a collision position detection sensor that detects the collision position, on a predetermined display screen, of a moving body moved through space by an operation of the operator, and outputs coordinate data according to the detected collision position;
  • an operation position detection sensor that detects the operation position at which the operator operated the moving body, and outputs position data according to the detected operation position; and
  • display control means for displaying, on the display screen, a moving image according to a story that progresses according to a predetermined algorithm, the predetermined algorithm changing the mode of progress of the story according to the coordinate data output from the collision position detection sensor and the position data output from the operation position detection sensor.
  • the display control means displays a predetermined object so that it moves, from the position on the display screen corresponding to the coordinate data output from the collision position detection sensor, along a trajectory that depends on the position data output from the operation position detection sensor and on that coordinate data, and the predetermined algorithm changes the mode of progress of the story according to the trajectory of the predetermined object.
  • the movement path calculation means further calculates the moving speed of the moving body based on the time from the timing at which the detection result of the contact state detection sensor changes from a state in which the moving body is in contact with the predetermined portion to a state in which it is not in contact, until the timing at which the coordinate data is output from the collision position detection sensor, and on the positional relationship between the operation position detected by the operation position detection sensor and the collision position detected by the collision position detection sensor.
  • the display control means then displays the predetermined object so as to move at a speed according to the moving speed calculated by the movement path calculation means.
  • the moving direction calculating unit further calculates the moving direction of the moving body based on the operation position detected by the operation position detection sensor and the collision position detected by the collision position detection sensor.
  • the display control means may display the predetermined object so as to move along a trajectory according to the movement direction calculated by the movement direction calculation means.
  • the game apparatus may further include a contact state detection sensor that detects whether or not the moving body is in contact with a predetermined portion; in this case, the predetermined algorithm stops the progress of the story at a predetermined timing, and resumes the progress of the story when the detection result of the contact state detection sensor changes from indicating that the moving body is in contact with the predetermined portion to indicating that it is not in contact.
  • the predetermined algorithm may change the mode of progress of the story according to the coordinate data output from the collision position detection sensor and the position data output from the operation position detection sensor at the timing when that coordinate data is output, or at a timing a preset time before that timing.
  • DESCRIPTION OF SYMBOLS: 1 ... game apparatus body, 2 ... operation position detection sensor, 3 ... contact state detection sensor, 4 ... installation stand, 10 ... bus, 11 ... control unit, 12 ... storage unit, 13 ... sound emitting unit, 14 ... display unit, 15 ... operation unit, 16 ... collision position detection unit, 17 ... interface, 21 ... operation position detection unit, 22 ... detection target unit, 31 ... contact state reception unit, 32 ... contact state detection unit, 100 ... virtual ball, 200 ... goalkeeper, 300 ... soccer goal, 1000 ... real ball, 2000 ... hand.
  • the game device is a device capable of playing a soccer game.
  • the game is advanced by causing a ball to collide with a display screen having a touch panel and displaying the ball in the game screen from the point of collision.
  • this soccer game is a game for achieving a shot to a goal in a PK match, free kick, or the like.
  • the goalkeeper 200 and the soccer goal 300 are displayed on the display screen of the display unit 14 of the game apparatus body 1.
  • the user of the game apparatus wears the band-shaped detection target unit 22 in the vicinity of the hand 2000, in this example on the wrist. The user then throws or flicks the small real ball 1000, which imitates a soccer ball, toward the display screen with the hand 2000.
  • the virtual ball 100 is displayed in the display screen from the collision position P, and moves along a trajectory calculated as described later
  • the goalkeeper 200 moves to block the virtual ball 100 from entering the soccer goal 300. If the goalkeeper 200 cannot stop it and the virtual ball 100 enters the soccer goal 300, a point is scored; if the goalkeeper 200 catches the virtual ball 100 or it bounces outside the soccer goal 300, no point is scored.
  • the game apparatus has a game apparatus body 1 and an operation position detection sensor 2.
  • the game apparatus body 1 includes a control unit 11, a storage unit 12, a sound emitting unit 13, a display unit 14, an operation unit 15, a collision position detection unit 16, and an interface 17, and is, for example, a notebook computer.
  • the operation position detection sensor 2 includes an operation position detection unit 21 and a detection target unit 22.
  • the control unit 11 includes a central processing unit (CPU), a read only memory (ROM), and a random access memory (RAM).
  • the CPU reads out the control program from the ROM or the storage unit 12 to the RAM and executes it to control each part of the game apparatus body 1 and the operation position detection sensor 2 via the bus 10. Also, by executing this control program, the above-described soccer game is progressed with a story in accordance with a predetermined algorithm.
  • This predetermined algorithm changes the aspect of the progress of the story by satisfying a predetermined condition as described later.
  • a change in the mode of progress of the story according to the predetermined algorithm means, for example, that when the predetermined condition is satisfied in the story currently in progress (story A) at a certain point in time, one of a plurality of stories branching from story A (stories B1, B2, ...) replaces story A as the story to be advanced. The mode of progress of the story then changes depending on which of these stories story A is changed to; for example, the story progresses differently when story A is changed to story B1 than when it is changed to story B2.
  • a change of the story means that parameters indicating the content of the story progressing according to the predetermined algorithm are changed, for example, the parameters controlling the display mode on the display screen, the sound emission mode, and so on. Each of the stories A, B1, B2, ... controls the display mode on the display screen according to parameters set in advance, and these parameters are changed by changing the story. The descriptions of the story content given later follow this example.
  • each of the stories A, B1, B2, ... may be described as a subroutine program. In that case, a change of the story means, for example, that while the subroutine program of story A is being executed, the subroutine program corresponding to the story to change to (one of the stories B1, B2, ...) is called and executed, so that execution jumps to a different subroutine program. A minimal sketch of this idea is given below.
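  • as a minimal, hypothetical sketch of this subroutine-style branching (illustrative only, not the patent's actual program), each story can be pictured as a routine that hands control to the selected branch:

```python
# Hypothetical sketch of the story branching described above. Each story is a
# "subroutine" (a plain function here), and the predetermined algorithm jumps to the
# branch selected when the predetermined condition is evaluated. All names are
# illustrative assumptions, not identifiers from the patent.

def story_b1() -> str:
    # Branch taken when the virtual ball enters the goal: show the dejected
    # goalkeeper and add to the score.
    return "goal: play 'keeper discouraged' animation, add score"

def story_b2() -> str:
    # Branch taken when the shot is saved or misses: the score is unchanged.
    return "no goal: play 'keeper saves' animation, score unchanged"

def story_a(goal_scored: bool) -> str:
    """Story A runs until the shot is resolved, then jumps to story B1 or B2."""
    return story_b1() if goal_scored else story_b2()

print(story_a(goal_scored=True))   # branch to story B1
print(story_a(goal_scored=False))  # branch to story B2
```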
  • the storage unit 12 is, for example, a large-capacity storage unit such as a hard disk, and can store a control program and various data.
  • the sound emitting unit 13 has sound emitting means such as a speaker unit, and emits sound based on audio data input under the control of the control unit 11.
  • this audio data is data of the sound to be emitted according to the story progressing in accordance with the predetermined algorithm, such as musical tones and sound effects.
  • the display unit 14 is a display device such as a liquid crystal display that displays a video on a display screen, and performs display based on video data input by control of the control unit 11. During the game, this video data is moving image data to be displayed on the display screen in accordance with the story progressing in accordance with a predetermined algorithm.
  • the operation unit 15 is, for example, operation means such as a keyboard and a mouse. When operated by the user, it outputs data representing the content of the operation to the control unit 11, and a pointer or the like that moves according to the operation is displayed on the display screen of the display unit 14.
  • the collision position detection unit 16 has a resistive film type touch panel provided on the display screen of the display unit 14; when the real ball 1000 collides with the touch panel, coordinate data indicating the touched coordinates is output to the control unit 11.
  • because the coordinates on the display screen of the display unit 14 and the coordinates on the touch panel are associated with each other in advance, the touched coordinates can be detected as the collision position P on the display screen.
  • each coordinate may be associated one to one, or may be associated by a predetermined conversion formula.
  • the touch panel does not have to be a resistive film type, and may be of another detection type such as capacitive, optical, or ultrasonic. In addition, instead of an analog touch panel that detects a single point of contact, a digital (matrix) type that detects a plurality of points may be used.
  • the operation position detection unit 21 detects the position of the detection target unit 22 and outputs position data corresponding to the detected position to the control unit 11 via the interface 17. Since the detection target unit 22 is a band-shaped member worn on the wrist, in the vicinity of the user's hand 2000, the detected position corresponds to the operation position S (see FIG. 7(a)).
  • the detection target unit 22 may instead be a ring-shaped member such as a ring, as long as it can be worn in the vicinity of the hand 2000.
  • the operation position detection unit 21, in this example, detects the position of the detection target unit 22 by measuring its direction and distance from the operation position detection unit 21 using a known method based on electromagnetic waves, sound waves, or the like (for example, JP-A-2002-357656).
  • the position of the detection target unit 22 is therefore detected with the position of the operation position detection unit 21 as a reference.
  • the method of detecting the position of the detection target 22 is not limited to this method, and various known methods such as a three-point survey method may be used.
  • because the operation position S detected by the operation position detection unit 21 is expressed with the operation position detection unit 21 as its reference, the positional relationship between the collision position P and the operation position S cannot be recognized as-is. The operation position detection unit 21 therefore converts the detected operation position S so as to align the reference positions, and outputs the result as position data. Specifically, the positional relationship between the position of the operation position detection unit 21 and a predetermined position on the display screen serving as the reference of the collision position P is set in advance, and the reference of the operation position S is converted to that predetermined position on the display screen according to the set positional relationship (a sketch of this conversion is given below).
  • in this way, the control unit 11 can recognize the relative positional relationship between the collision position P and the operation position S once it acquires the coordinate data and the position data.
  • alternatively, the operation position detection unit 21 may output the detected operation position S as-is as position data, and the control unit 11 may perform the conversion for aligning the reference positions and recognize the positional relationship between the collision position P and the operation position S.
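  • a minimal sketch of this reference-frame alignment, under the assumption of a simple translation offset between the sensor and the screen reference point (the numbers are placeholders, not values from the patent):

```python
# Illustrative sketch: the operation position S is measured relative to the operation
# position detection unit 21, so it is translated into the display-screen frame using a
# preset offset between the sensor and a predetermined reference point on the screen.

import numpy as np

# Preset position of the operation position detection unit 21, expressed in the
# display-screen coordinate frame (assumed values, in millimetres).
SENSOR_ORIGIN_IN_SCREEN_FRAME = np.array([600.0, -200.0, 0.0])

def to_screen_frame(operation_pos_in_sensor_frame: np.ndarray) -> np.ndarray:
    """Convert the operation position S from the sensor frame to the screen frame."""
    return SENSOR_ORIGIN_IN_SCREEN_FRAME + operation_pos_in_sensor_frame

s_sensor = np.array([100.0, 300.0, 450.0])   # S as measured by the sensor
print(to_screen_frame(s_sensor))             # S in the display-screen frame
```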
  • the above is the description of the configuration of the game device according to the first embodiment.
  • the control unit 11 executes the control program and, as shown in FIG., the goalkeeper 200, the soccer goal 300, and the like, together with a background (not shown) such as spectators, are displayed as a moving image according to a story following a predetermined algorithm.
  • the story related to the moving image displayed in this manner (hereinafter referred to as story A) proceeds in accordance with the predetermined algorithm (step S110).
  • when determining that there has been a collision with the display screen, the control unit 11 acquires the position data output from the operation position detection sensor 2 and identifies the position of the detection target unit 22, that is, the operation position S (step S130).
  • here, the control unit 11 requests the operation position detection sensor 2 to output position data, and acquires that position data.
  • the position data may be the data output from the operation position detection sensor 2 at the timing when the collision with the display screen occurred, that is, the timing when the coordinate data is output from the collision position detection unit 16; alternatively, on the assumption that the real ball 1000 was operated slightly before the collision, the position data output from the operation position detection sensor 2 at a timing a preset time before the collision may be acquired. In the latter case, position data for that preset time may be buffered in the RAM of the control unit 11 or the like (see the sketch below).
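  • a small, hypothetical sketch of such buffering (the sampling rate and look-back time are assumptions for illustration):

```python
# Illustrative sketch: recent position samples from the operation position detection
# sensor are kept in a ring buffer so that the sample from a preset time before the
# collision can be looked up afterwards.

from collections import deque

SAMPLE_HZ = 100                    # assumed sensor sampling rate
LOOKBACK_S = 0.10                  # assumed preset time before the collision
BUFFER = deque(maxlen=SAMPLE_HZ)   # roughly one second of samples

def on_position_sample(pos) -> None:
    """Called whenever the operation position detection sensor outputs position data."""
    BUFFER.append(pos)

def operation_position_before_collision():
    """Return the sample from LOOKBACK_S seconds before the collision, if available."""
    back = int(LOOKBACK_S * SAMPLE_HZ)
    if len(BUFFER) > back:
        return BUFFER[-1 - back]
    return BUFFER[0] if BUFFER else None
```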
  • the control unit 11 also specifies the collision position P of the real ball 1000 on the display screen from the output coordinate data (step S140).
  • the order of the identification of the collision position P (step S140) and the identification of the operation position S described above (step S130) may be reversed.
  • the control unit 11 calculates the moving direction of the real ball 1000 at the time of the display screen collision based on the positional relationship between the specified operation position S and the collision position P (step S150).
  • in this example, the moving direction is calculated as the direction of a straight line connecting the operation position S to the collision position P, but any calculation method may be used as long as it uses the operation position S and the collision position P.
  • for example, a function representing a predetermined curve with the operation position S and the collision position P as variables may be set in advance, the curve calculated using that function may be taken as the movement path of the real ball 1000, and the moving direction of the real ball 1000 at the collision may be calculated along it (a simple sketch of the straight-line case follows).
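  • as a simple sketch of step S150 under the straight-line assumption (coordinates and units are illustrative):

```python
# Illustrative sketch: the moving direction of the real ball at the collision is taken
# as the straight line from the operation position S to the collision position P, with
# both positions already expressed in the same (screen) reference frame.

import numpy as np

def moving_direction(s: np.ndarray, p: np.ndarray) -> np.ndarray:
    """Unit vector pointing from the operation position S to the collision position P."""
    d = p - s
    return d / np.linalg.norm(d)

s = np.array([700.0, 100.0, 450.0])   # example operation position S
p = np.array([640.0, 360.0, 0.0])     # example collision position P on the screen plane
print(moving_direction(s, p))
```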
  • the control unit 11 causes the virtual ball 100 to be displayed at a position in the display screen corresponding to the collision position P (step S160), calculates a trajectory corresponding to the calculated movement direction, and starts movement on the trajectory.
  • the moving speed of the virtual ball 100 may be changed according to the distance between the operation position S and the collision position P.
  • the direction in which the goal keeper 200 in the display screen is moved (left, right, upper, upper left, upper right, no movement, etc.) is determined and movement is started.
  • the movement of the goal keeper 200 may be determined at random, for example, at randomly determined moving speeds, or may be determined by a preset function or the like.
  • the control unit 11 determines whether the calculation result of the trajectory of the virtual ball 100 satisfies a predetermined condition (step S170).
  • here, the predetermined condition is that the moving virtual ball 100 reaches the soccer goal 300 and that the moving goalkeeper 200 does not touch the virtual ball 100, that is, the condition under which the shot is determined to have succeeded.
  • if the control unit 11 determines that the predetermined condition is satisfied (step S170: Yes), the virtual ball 100 on the display screen moves into the soccer goal 300, a point is scored, and the story A progressing according to the predetermined algorithm is changed to the story B1 corresponding to a point being scored (step S181). For example, a moving image showing the discouraged goalkeeper 200 is displayed, and the score is added.
  • if the control unit 11 determines that the predetermined condition is not satisfied (step S170: No), the virtual ball 100 on the display screen moves outside the soccer goal 300, no point is scored, and the story A progressing according to the predetermined algorithm is changed to the story B2 corresponding to no point being scored (step S182). A corresponding display is shown, and the score is not added.
  • the mode of progress of the story changes.
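  • a hypothetical sketch of the check at step S170 and the branch at steps S181/S182, treating the goal as a rectangle and the goalkeeper's reach as a circle (both simplifications are assumptions for illustration, not geometry from the patent):

```python
# Illustrative sketch of the "shot succeeds" condition: the virtual ball must end up
# inside the soccer goal and the goalkeeper must not touch it along the way.

from dataclasses import dataclass

@dataclass
class Rect:
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def shot_succeeds(ball_xy, goal: Rect, keeper_xy, keeper_radius: float) -> bool:
    """True if the ball ends inside the goal and the keeper does not reach it."""
    bx, by = ball_xy
    kx, ky = keeper_xy
    touched = (bx - kx) ** 2 + (by - ky) ** 2 <= keeper_radius ** 2
    return goal.contains(bx, by) and not touched

goal = Rect(400, 200, 880, 500)
print(shot_succeeds((640, 350), goal, (660, 340), 40))  # keeper reaches it -> story B2
print(shot_succeeds((450, 250), goal, (660, 340), 40))  # far corner -> story B1
```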
  • as described above, the virtual ball 100 can be displayed on the display screen by having the user throw the small real ball 1000 so that it collides with the display screen, and the user can enjoy the game through an intuitive operation.
  • in addition, because the band-shaped detection target unit 22 worn on the wrist allows the operation position S of the throwing operation to be recognized, the operation position can be taken in various directions and at various distances, and the moving direction of the real ball 1000 can be calculated from it. The trajectory along which the virtual ball 100 moves on the display screen can therefore be made to match the user's sense, giving the game a realistic feel.
  • a game apparatus according to the second embodiment is the game apparatus of the first embodiment further provided with a contact state detection sensor that detects whether or not the real ball 1000 is in contact with a predetermined portion.
  • the outline of the game in the configuration of the second embodiment will be described with reference to FIG.
  • as shown in FIG. 4A, a display similar to that of the first embodiment appears on the display screen of the display unit 14.
  • the user places the real ball 1000 on a predetermined portion (in the figure, the upper surface) of the contact state detection unit 32 of the installation stand 4, which is made up of the detection target unit 22 and the contact state detection unit 32. The user then flicks the real ball 1000 toward the display screen with a finger.
  • the virtual ball 100 is displayed in the display screen from the collision position P, and moves along a trajectory calculated as described later
  • the goalkeeper 200 moves to block the virtual ball 100 from entering the soccer goal 300. If the goalkeeper 200 cannot stop it and the virtual ball 100 enters the soccer goal 300, a point is scored; if the goalkeeper 200 catches the virtual ball 100 or it bounces outside the soccer goal 300, no point is scored.
  • the game apparatus according to the second embodiment has a contact state detection sensor 3 in addition to the configuration of the first embodiment.
  • the contact state detection sensor 3 has a contact state reception unit 31 and a contact state detection unit 32.
  • the configuration of the contact state detection sensor 3 will be described.
  • the contact state detection unit 32 detects whether the real ball 1000 is in contact with the predetermined portion (hereinafter, the contact state) or not (hereinafter, the non-contact state), and transmits contact data indicating the detection result to the contact state reception unit 31. The contact state reception unit 31 receives the contact data transmitted from the contact state detection unit 32 and outputs it to the control unit 11 through the connected interface 17.
  • the contact state detection unit 32 may detect the contact state directly, by measuring whether or not the real ball 1000 is in contact with the predetermined portion, or indirectly, for example by a non-contact measurement using light or the like that determines whether or not the real ball 1000 is positioned at the predetermined portion.
  • the contact state detection unit 32 constitutes the installation stand 4 together with the detection target unit 22.
  • in this configuration, the detection target unit 22 differs from the band-shaped one of the first embodiment.
  • the other configuration is the same as the configuration in the first embodiment, and thus the description thereof will be omitted.
  • the control unit 11 executes a control program to display a moving image according to the story A on the display screen of the display unit 14. Then, the story A progresses according to a predetermined algorithm (step S210).
  • the control unit 11 determines whether the real ball 1000 is in contact with the installation stand 4 based on whether the contact data from the contact state detection sensor 3 indicates the contact state or the non-contact state (step S215). While the contact data indicates the contact state (step S215: Yes), the story A continues to progress on the assumption that the real ball 1000 remains on the installation stand 4 (step S210).
  • when the contact data changes from the contact state to the non-contact state (step S215: No), the control unit 11 assumes that the real ball 1000 has been flicked by the user's operation and has left the installation stand 4, and specifies the timing of that change as the operation timing (step S216).
  • the position data output from the operation position detection sensor 2 is then acquired by the same method as in the first embodiment, and the operation position S is specified (step S217).
  • the specification of the operation position S may be performed in advance.
  • the control unit 11 determines, by the same method as in the first embodiment, whether or not the real ball 1000 has collided with the display screen of the display unit 14 (step S220), and repeats this determination while it determines that there has been no collision (step S220: No). When it determines that there has been a collision with the display screen (step S220: Yes), the control unit 11 specifies, from the coordinate data output by the collision position detection unit 16, the collision position P of the real ball 1000 on the display screen, and specifies the timing at which the coordinate data was output as the collision timing (step S240).
  • having specified the operation position S, the operation timing, the collision position P, and the collision timing, the control unit 11 calculates the movement path of the real ball 1000 from the operation position S to the collision position P based on the time from the operation timing to the collision timing (hereinafter referred to as the moving time) and on the positional relationship between the operation position S and the collision position P (step S250).
  • FIG. 7 shows the real space on the user's side and the virtual space within the display screen, with the display screen as the boundary, together with the movement path of the real ball 1000 calculated by the control unit 11 and the moving direction of the virtual ball 100 in the virtual space.
  • FIG. 7(a) schematically shows the calculation method of the first embodiment, in which the movement path of the real ball 1000 from the operation position S to the collision position P is assumed to be substantially a straight line and the moving direction of the virtual ball 100 is calculated accordingly.
  • in the second embodiment, by including the length of the moving time and the gravitational acceleration in the calculation, a parabolic movement path can be calculated. For example, a short moving time yields a low movement path as shown in FIG. 7(b), while a long moving time yields a high-arcing parabolic movement path as shown in FIG. 7(c). The moving direction of the virtual ball 100 may then be calculated as the direction along the extension of this movement path.
  • furthermore, the moving speed of the real ball 1000 may be calculated, the speed of the virtual ball 100 may be determined according to that moving speed, and the virtual ball 100 may be moved at the determined speed (an illustrative sketch follows).
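  • an illustrative sketch of these calculations (not the patent's actual formulas), assuming a simple ballistic model with the measured moving time:

```python
# Illustrative sketch: given the operation position S, the collision position P and the
# measured moving time, estimate a parabolic launch velocity under gravity and an
# average moving speed used to scale the virtual ball's speed. Units: metres, seconds.

import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]

def launch_velocity(s: np.ndarray, p: np.ndarray, t: float) -> np.ndarray:
    """Initial velocity that carries the ball from S to P in time t under gravity
    (z axis points upward): p = s + v0*t - 0.5*G*t^2 * z_hat."""
    v0 = (p - s) / t
    v0[2] += 0.5 * G * t          # compensate for the fall due to gravity
    return v0

def average_speed(s: np.ndarray, p: np.ndarray, t: float) -> float:
    """Rough moving speed of the real ball over the measured moving time."""
    return float(np.linalg.norm(p - s) / t)

s = np.array([0.0, 0.6, 0.2])     # example operation position S
p = np.array([0.0, 0.0, 0.5])     # example collision position P on the screen
print(launch_velocity(s, p, 0.4), average_speed(s, p, 0.4))
```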
  • the positions of the goal keeper 200 and the soccer goal 300 in the virtual space may be arbitrarily changed by setting.
  • for example, the setting may be changed in the depth direction (the right direction in FIG. 7).
  • the control unit 11 causes the virtual ball 100 to be displayed at a position in the display screen corresponding to the collision position P (step S260), calculates a trajectory corresponding to the calculated movement route, and starts movement on the trajectory.
  • the processes of subsequent steps S270, S281, and S282 are the same as the processes of steps S170, S181, and S182 in the first embodiment, and thus the description thereof will be omitted.
  • the above is the description of the operation of the game device according to the second embodiment.
  • the game apparatus according to the second embodiment can calculate the movement path and moving speed of the real ball 1000 more accurately by providing the contact state detection sensor 3 in addition to the configuration of the first embodiment. The trajectory along which the virtual ball 100 moves on the display screen can therefore be made even more consistent with the user's sense, making the game more realistic.
  • in the second embodiment, the control unit 11 specifies the operation timing and the operation position after determining from the contact data that the real ball 1000 has left the installation stand 4, but this may instead be performed after the collision with the display screen. In this case, time data indicating the last timing at which the contact data changed from the contact state to the non-contact state, and the position data output at that timing, are stored in the storage unit 12 or the like; when the real ball 1000 collides with the display screen, the time data and position data are read out from the storage unit 12 to specify the operation timing and the operation position.
  • steps S310 and S320 are the same as steps S110 and S120 in the first embodiment, and thus the description thereof is omitted.
  • the time data is read from the storage unit 12 to specify the operation timing (step S336)
  • the position data is read from the storage unit 12 to specify the operation position (step S337). Note that this order may be reversed.
  • step S340 and subsequent steps are respectively the same as step S240 and subsequent steps in the second embodiment, the description thereof will be omitted.
  • because the time data and position data are stored in advance and the processing is performed after the real ball 1000 collides with the display screen, even if the real ball 1000 is accidentally dropped from the installation stand 4, the user can place the real ball 1000 again and operate it without the processing having progressed.
  • in the second embodiment, the installation stand 4 is made up of the detection target unit 22 and the contact state detection unit 32.
  • instead of the installation stand 4, however, the contact state detection sensor 3 may be a glove-like member worn on the hand 2000 that detects whether or not the real ball 1000 is in contact with it, and the operation position detection sensor 2 may detect the position of the hand 2000.
  • in the second embodiment, the operation timing is specified from the contact data output by the contact state detection sensor 3, but the contact data may also be used in other ways.
  • for example, the story A may be temporarily stopped so that the moving image stops, and when the contact data changes from the contact state to the non-contact state, the progress of the story A may be resumed and the process may proceed to step S216 shown in FIG.
  • interruption processing as shown in FIG. 9 may be inserted before the flowchart shown in FIG. 3.
  • in this case, the installation stand 4 may be configured by the contact state detection unit 32 of the contact state detection sensor 3 alone, with the detection target unit 22 worn on the wrist of the user.
  • the control unit 11 advances the story A (step S401) and determines whether or not the stop timing preset by the predetermined algorithm has been reached (step S402); if the stop timing has not been reached (step S402: No), the determination continues while the story A continues to progress (step S402). When the stop timing is reached (step S402: Yes), the control unit 11 temporarily stops the story A (step S403).
  • the process of temporarily stopping the story A may be one that freezes the moving image displayed on the display screen as a still image, or one that keeps displaying a moving image by repeating predetermined display content while the story itself does not progress.
  • the control unit 11 then determines the contact state of the real ball 1000 with the contact state detection unit 32 (step S404), and continues this determination while the non-contact state is detected (step S404: No). When the contact state is detected (step S404: Yes), the control unit 11 determines the contact state again in the next step (step S405); while the contact state continues (step S405: Yes), the determination in step S405 is repeated, and when the state changes to the non-contact state (step S405: No), the story A is resumed (step S406).
  • in this way, the user resumes the interrupted story A by placing the real ball 1000 on the installation stand 4 and then performing an operation that releases it (a minimal sketch of this pause/resume flow is given below).
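  • a minimal, hypothetical sketch of this interruption processing (steps S401 to S406), where `contact_stream` stands in for successive samples of the contact data:

```python
# Illustrative sketch: the story pauses at a preset stop timing and resumes once the
# contact data goes from non-contact to contact (ball placed on the stand) and then back
# to non-contact (ball released toward the screen).

from typing import Iterable

def wait_for_resume(contact_stream: Iterable[bool]) -> str:
    ball_was_placed = False
    for in_contact in contact_stream:     # one sample of contact data per iteration
        if in_contact:
            ball_was_placed = True        # S404/S405: ball is on the installation stand
        elif ball_was_placed:
            return "story A resumed"      # S406: ball released, resume the story
    return "still paused"                 # stop timing reached, nothing placed yet

print(wait_for_resume([False, False, True, True, False]))  # -> story A resumed
```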
  • the timing for stopping the story A may be set in advance as a predetermined algorithm as described above, or may be stopped when the contact data changes from the non-contact state to the contact state.
  • the position data output by the operation position detection sensor 2 specifies the operation position by the distance and the direction, but it is also possible to specify only the distance.
  • in this case, the trajectory of the virtual ball 100 may be set in advance, and the story may be changed according to the distance indicated by the position data, for example by changing the score. In this way it is possible, for example, to play a game that reproduces two-point and three-point shots in basketball. In a soccer game like that of the embodiments, if the distance is too close, the story may be changed to one in which no point is scored because the shot is treated as a foul.
  • in the embodiments, the real ball 1000 is operated by the user, but depending on the type of game the moving body need not be a ball. For darts, for example, an imitation arrow may be used; any moving body that moves through space by the user's operation may be used. Likewise, the object displayed on the display screen, like the virtual ball 100, may be changed depending on the type of game so as to correspond to the moving body.
  • the detection of the collision position P on the display screen is performed by directly detecting the collision position P using a touch panel, but may be indirect detection.
  • for example, the real ball 1000 may be photographed by a CCD camera or the like, and the movement path of the real ball 1000 may be calculated by image analysis of the captured data to detect the collision position P on the display screen.
  • a plurality of passage sensors for detecting passage coordinates of the real ball 1000 may be provided, and a movement path of the real ball 1000 may be calculated from the passage coordinates to detect the collision position P on the display screen. Then, coordinate data corresponding to the collision position P may be output. That is, as long as the collision position P on the display screen is detected, it may be detected by any method.
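  • as an illustrative sketch of this indirect detection (sensor spacing and coordinates are assumptions), the collision position P can be extrapolated from two passage coordinates measured in front of the screen:

```python
# Illustrative sketch: extrapolate the collision position P on the screen plane from two
# passage coordinates q1 and q2 measured by passage sensors at distances d1 and d2 from
# the screen (d1 > d2), assuming straight-line motion between the sensor planes.

import numpy as np

def collision_from_passages(q1, q2, d1: float, d2: float) -> np.ndarray:
    """q1, q2: (x, y) passage coordinates; returns the extrapolated P at distance 0."""
    q1 = np.asarray(q1, dtype=float)
    q2 = np.asarray(q2, dtype=float)
    t = d2 / (d1 - d2)                 # how far beyond q2 the screen plane lies
    return q2 + t * (q2 - q1)

print(collision_from_passages((0.30, 0.80), (0.32, 0.70), d1=0.50, d2=0.25))  # -> [0.34 0.60]
```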
  • the user may operate the operation unit 15 to issue an instruction to correct the trajectory of the virtual ball 100.
  • the virtual ball 100 may be given a change such as a curve or a shot.
  • the control unit 11 may calculate the trajectory of moving the virtual ball 100 including this instruction.
  • a plurality of contact state detection units 32 whose predetermined portions for detecting the contact state have different structures may be prepared and used depending on the game.
  • for example, the predetermined portion may be configured to reproduce a fairway, a rough, a bunker, or the like, and the appropriate one may be used according to the situation of the game (the progress of the story).
  • in this case, a structure ID corresponding to each structure may be stored in the contact state detection unit 32, and the contact state reception unit 31 may treat as invalid any contact data other than that from the contact state detection unit 32 whose structure ID is usable in the current game situation.
  • in the embodiments, the operation position detection sensor 2 is used to detect the operation position, but the movement speed may also be detected from the change over time of the position of the detection target unit 22, and the moving speed of the virtual ball 100 may be corrected according to the detected movement speed.
  • in the embodiments, the collision position detection unit 16 is included in the configuration of the game apparatus body 1, but an external collision position detection sensor may be connected instead, in the same way as the operation position detection sensor 2 and the contact state detection sensor 3.
  • in this case, it is sufficient for the collision position detection sensor to have a touch panel provided on the front of the display screen of the display unit 14 and to output coordinate data corresponding to the detected collision position P to the control unit 11 via the interface 17.
  • because the coordinates of such an externally attached touch panel are not necessarily associated with the coordinates of the display screen in advance, the collision position detection sensor may also be provided with association means that performs processing for associating the two coordinate systems.
  • An example of such association processing is the following. That is, a figure such as a point is displayed at a predetermined coordinate as a display for association in the display screen. Then, the user touches the figure through the touch panel installed on the front of the display screen to acquire coordinate data as the collision position P. By performing this for a plurality of coordinates, the coordinates of the touch panel can be made to correspond to the coordinates in the display screen. Thereafter, the collision position detection sensor may output coordinate data indicating coordinates obtained by converting the coordinates of the collision position P into coordinates on the display screen according to the correspondence.
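  • a hypothetical sketch of such association processing, fitting an affine correspondence from touch-panel coordinates to display-screen coordinates using the touched calibration points (the affine model and all numbers are assumptions; the patent only requires that a correspondence be established):

```python
# Illustrative sketch: points are displayed at known screen coordinates, the user touches
# each one through the touch panel, and a least-squares affine map from touch coordinates
# to screen coordinates is fitted and then applied to later collision positions.

import numpy as np

# Raw touch coordinates recorded when the user touched each displayed calibration point,
# and the screen coordinates at which those points were displayed (example values).
touch_pts  = np.array([[512, 512], [3584, 512], [512, 3584], [3584, 3584]], dtype=float)
screen_pts = np.array([[160,  90], [1120,  90], [160,  630], [1120,  630]], dtype=float)

# Solve screen = [touch, 1] @ M for the 3x2 affine matrix M by least squares.
A = np.hstack([touch_pts, np.ones((len(touch_pts), 1))])
M, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)

def collision_to_screen(tx: float, ty: float) -> np.ndarray:
    """Convert a raw touch coordinate of the collision position P to screen coordinates."""
    return np.array([tx, ty, 1.0]) @ M

print(collision_to_screen(2048, 2048))   # roughly the centre of a 1280x720 screen
```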
  • in this way, the game device of the present invention can be configured by connecting the collision position detection sensor, the operation position detection sensor 2, and the contact state detection sensor 3 to a general-purpose computer and storing the control program in the storage unit 12.
  • <Modification 12> In the first and second embodiments described above, a configuration in which the game is enjoyed on a single game apparatus was described, but the game may also be enjoyed using a plurality of game apparatuses communicating over a network. In this case, each of the plurality of game apparatuses may be connected to the network via the interface 17 or the like so that they can communicate with one another.
  • hereinafter, the game apparatus on the side performing the shooting operation is referred to as game apparatus A, and the other game apparatus as game apparatus B.
  • the game apparatuses A and B do not have to have exactly the same display content.
  • for example, the display content of game apparatus B may reproduce the view along the goalkeeper 200's line of sight (the direction in which the operation position S is seen from behind the soccer goal 300).
  • because the movement path is not calculated until the real ball 1000, once operated, collides with the display screen, if game apparatus B performs its display after the movement path has been determined, the movement path of the real ball 1000 can also be reproduced by the virtual ball 100, and the display can start from the moment of the shot.
  • furthermore, the goalkeeper 200 may be operated on game apparatus B.
  • in this case, the display on game apparatus A may stop the movement of the virtual ball 100 after the real ball 1000 collides with the display screen, with the movement of the virtual ball 100 displayed only on game apparatus B.
  • alternatively, game apparatus A may display the goalkeeper 200 operated on game apparatus B moving together with the virtual ball 100.
  • taking the movement of the goalkeeper 200 on game apparatus B into account, the story A is then changed to either story B1 or story B2 depending on whether or not the virtual ball 100 enters the soccer goal 300, and the mode of progress of the story may thus be changed.
  • in this way, the game can also be enjoyed by a plurality of users simultaneously via the network.
  • the control program executed by the control unit 11 in the embodiments described above may be provided stored on a computer-readable recording medium such as a magnetic recording medium, an optical recording medium, a magneto-optical recording medium, or a semiconductor memory, or it may be downloaded via a network such as the Internet.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a game device in which a virtual ball (100) can be displayed on a display screen when the user throws a real ball (1000) toward the display screen, so that the user can enjoy the game through intuitive operation. In addition, because the user wears a band-shaped detection unit (22) on the wrist, the operation position (S) from which the virtual ball (100) is launched can be recognized, so that the operation position (S) can be located in various directions and at various distances and the moving direction of the real ball (1000) can be calculated. Consequently, the trajectory along which the virtual ball (100) moves on the display screen can be made to match the user's sense, thereby producing a realistic sensation.
PCT/JP2008/059531 2008-05-23 2008-05-23 Dispositif de jeu WO2009141913A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2008/059531 WO2009141913A1 (fr) 2008-05-23 2008-05-23 Dispositif de jeu
JP2009527378A JPWO2009141913A1 (ja) 2008-05-23 2008-05-23 ゲーム装置

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2008/059531 WO2009141913A1 (fr) 2008-05-23 2008-05-23 Dispositif de jeu

Publications (1)

Publication Number Publication Date
WO2009141913A1 true WO2009141913A1 (fr) 2009-11-26

Family

ID=41339862

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2008/059531 WO2009141913A1 (fr) 2008-05-23 2008-05-23 Dispositif de jeu

Country Status (2)

Country Link
JP (1) JPWO2009141913A1 (fr)
WO (1) WO2009141913A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110316767A1 (en) * 2010-06-28 2011-12-29 Daniel Avrahami System for portable tangible interaction
JP2012050769A (ja) * 2010-09-03 2012-03-15 Konami Digital Entertainment Co Ltd ゲーム装置、ゲーム制御方法、ならびに、プログラム
JP2020014100A (ja) * 2018-07-17 2020-01-23 株式会社セガゲームス 映像システム及び映像プログラム
JP2021016406A (ja) * 2019-07-17 2021-02-15 株式会社バンダイナムコアミューズメント ゲーム装置及びプログラム

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11299947A (ja) * 1998-02-19 1999-11-02 Purareizu:Kk バ―チャルゲ―ム装置
JP2002052246A (ja) * 2000-08-11 2002-02-19 Konami Co Ltd ゲーム機
JP2004097662A (ja) * 2002-09-12 2004-04-02 Toshiba Eng Co Ltd サッカーシミュレーションゲーム装置および方法
JP2004097702A (ja) * 2002-09-12 2004-04-02 Konami Co Ltd ゲーム装置
JP3758007B2 (ja) * 1997-10-17 2006-03-22 株式会社ナムコ 合成画像表示装置、ゲーム装置及びボーリングゲーム装置

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3412693B2 (ja) * 2001-06-28 2003-06-03 株式会社コナミコンピュータエンタテインメント大阪 ネットワークゲーム進行制御システム、ネットワークゲーム進行制御方法及びネットワークゲーム進行制御プログラム
JP4179162B2 (ja) * 2003-12-26 2008-11-12 株式会社セガ 情報処理装置、ゲーム装置、画像生成方法、ゲーム画像生成方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3758007B2 (ja) * 1997-10-17 2006-03-22 株式会社ナムコ 合成画像表示装置、ゲーム装置及びボーリングゲーム装置
JPH11299947A (ja) * 1998-02-19 1999-11-02 Purareizu:Kk バ―チャルゲ―ム装置
JP2002052246A (ja) * 2000-08-11 2002-02-19 Konami Co Ltd ゲーム機
JP2004097662A (ja) * 2002-09-12 2004-04-02 Toshiba Eng Co Ltd サッカーシミュレーションゲーム装置および方法
JP2004097702A (ja) * 2002-09-12 2004-04-02 Konami Co Ltd ゲーム装置

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110316767A1 (en) * 2010-06-28 2011-12-29 Daniel Avrahami System for portable tangible interaction
JP2013532337A (ja) * 2010-06-28 2013-08-15 インテル・コーポレーション 携帯デバイスと有体物とがインタラクションするシステム
US9262015B2 (en) 2010-06-28 2016-02-16 Intel Corporation System for portable tangible interaction
JP2012050769A (ja) * 2010-09-03 2012-03-15 Konami Digital Entertainment Co Ltd ゲーム装置、ゲーム制御方法、ならびに、プログラム
JP2020014100A (ja) * 2018-07-17 2020-01-23 株式会社セガゲームス 映像システム及び映像プログラム
WO2020017474A1 (fr) * 2018-07-17 2020-01-23 株式会社セガゲームス Système vidéo et programme vidéo
JP2021016406A (ja) * 2019-07-17 2021-02-15 株式会社バンダイナムコアミューズメント ゲーム装置及びプログラム
JP7492199B2 (ja) 2019-07-17 2024-05-29 株式会社バンダイナムコアミューズメント ゲーム装置及びプログラム

Also Published As

Publication number Publication date
JPWO2009141913A1 (ja) 2011-09-29

Similar Documents

Publication Publication Date Title
JP5286267B2 (ja) ゲーム装置、ゲームプログラム及びオブジェクトの操作方法
KR100832198B1 (ko) 게임의 실행을 제어하는 프로그램이 기록된 컴퓨터판독가능한 기록 매체, 게임의 실행을 제어하는 방법, 및정보 처리 장치
KR100970866B1 (ko) 게임장치, 진행제어방법 및 정보기록매체
JP5751594B2 (ja) 仮想ゴルフシミュレーション装置及び方法
JP2016182338A (ja) スクリーン野球システム競技方法
JPWO2009028690A1 (ja) ゲーム装置、ゲームプログラム及びゲーム装置の制御方法
WO2006126484A1 (fr) Machine de jeu, systeme de jeu, et procede de commande de progres de jeu
US8267779B2 (en) Program, storage medium on which program is recorded, and game device
JP6889944B2 (ja) ゲーム制御装置、ゲームシステム及びプログラム
US8465353B2 (en) Game device, control method for game device, and information storage medium
WO2009141913A1 (fr) Dispositif de jeu
JP2012101026A (ja) プログラム、情報記憶媒体、ゲーム装置及びサーバシステム
JP3619223B2 (ja) ゲーム装置
TWI664007B (zh) 使用投球模式玩屏幕虛擬棒球的方法
JP3934661B1 (ja) 不正行為防止機能付きゲーム装置、ゲーム中の不正行為防止方法及びそのプログラム
JP2011067677A (ja) ゲーム装置およびゲームプログラム
JP2010115244A (ja) ゲーム画像表示制御プログラム、ゲーム装置及びゲーム画像表示制御方法
JP4962977B2 (ja) ゲームプログラム、対戦ゲーム装置、及び対戦ゲーム制御方法
JP2001149642A (ja) ゲーム装置及びゲーム装置の残像表示方法
JP2023111501A (ja) ゲームプログラム、ゲーム処理システム、ゲーム装置、およびゲーム処理方法
JP2002320776A (ja) ゲームを実行制御するプログラム及び、このプログラムを実行するゲーム装置
JP6636491B2 (ja) ゲームプログラム、方法、および情報処理装置
JP5325196B2 (ja) ゲーム装置及びゲーム制御プログラム
JP2006230582A (ja) プログラム、情報記憶媒体及びゲーム装置
JP2024027966A (ja) ゲーム制御装置、ゲームシステム及びプログラム

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2009527378

Country of ref document: JP

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08764577

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 08764577

Country of ref document: EP

Kind code of ref document: A1