WO2011046014A1 - Game system and game device - Google Patents

Game system and game device

Info

Publication number
WO2011046014A1
Authority
WO
WIPO (PCT)
Prior art keywords
game
unit
controller
information
game program
Prior art date
Application number
PCT/JP2010/066715
Other languages
English (en)
Japanese (ja)
Inventor
基康 田中
Original Assignee
株式会社メガチップス
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2009238956A (see also JP5964008B2)
Priority claimed from JP2009245458A (see also JP2011087848A)
Priority claimed from JP2009267785A (see also JP5843342B2)
Application filed by 株式会社メガチップス
Publication of WO2011046014A1


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42: Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/428: Processing input control signals involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/21: Input arrangements characterised by their sensors, purposes or types
    • A63F 13/212: Input arrangements using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/812: Ball games, e.g. soccer or baseball
    • A63F 13/837: Shooting of targets
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10: Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1025: Details of the interface with the game device, e.g. USB version detection
    • A63F 2300/1031: Details of the interface using a wireless connection, e.g. Bluetooth, infrared connections
    • A63F 2300/105: Input arrangements using inertial sensors, e.g. accelerometers, gyroscopes
    • A63F 2300/1068: Input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F 2300/60: Methods for processing data by generating or executing the game program
    • A63F 2300/6045: Methods for mapping control signals received from the input arrangement into game commands
    • A63F 2300/80: Features specially adapted for executing a specific type of game
    • A63F 2300/8094: Unusual game types, e.g. virtual cooking

Definitions

  • The present invention relates to a game system and a game apparatus, and more particularly to a game system and a game apparatus that use information indicating the movement of the controller itself as game operation information.
  • Some stationary game devices use a controller with a built-in acceleration sensor or similar motion sensor.
  • The acceleration sensor detects the moving direction of the controller.
  • The game device uses the information indicating the moving direction output from the acceleration sensor as game operation information. The user can enjoy the game by moving his or her body while holding the controller.
  • Some portable game devices use a touch panel as a user interface.
  • The user inputs position information to the touch panel using a stylus pen or a finger.
  • The user can play the game simply by touching the touch panel.
  • A tablet may also be used as the user interface of a personal computer (PC).
  • When the user touches the operation surface of the tablet with a stylus pen, the tablet outputs two-dimensional position information corresponding to the touched position.
  • The PC controls the position of the pointer displayed on the monitor based on this two-dimensional position information.
  • Patent Document 1 discloses a posture detection device that detects the height of a position indicating coil built into a three-dimensional structure such as a die.
  • The posture detection device has the same configuration as an electromagnetic induction tablet and includes a sensor coil group consisting of a plurality of loop coils.
  • The sensor coil group receives a resonance signal generated by the position indicating coil.
  • The posture detection device forms a detection waveform from the received resonance signal.
  • Based on the half width of the main signal formed at the center of the detection waveform, the posture detection device detects the distance between the installation surface of the sensor coil group and the position indicating coil as the height of the position indicating coil.
  • The position indicating coil is arranged so that its height differs depending on which face of the die is up when the die is placed on the installation surface.
  • The posture detection device therefore identifies the die roll by detecting the height of the position indicating coil, so the user can enjoy a computer game while actually rolling the die.
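  • The mapping from detected coil height to die face can be sketched as follows. This is a hypothetical illustration; the per-face heights are invented values, not figures from Patent Document 1.

```python
# Hypothetical sketch: identify a die face from the detected height of the
# position indicating coil. Each face is assumed to put the coil at a distinct,
# pre-measured height above the installation surface (illustrative values).

FACE_HEIGHTS_MM = {1: 2.0, 2: 5.0, 3: 8.0, 4: 11.0, 5: 14.0, 6: 17.0}

def identify_face(detected_height_mm: float) -> int:
    """Return the face whose stored coil height is closest to the detection."""
    return min(FACE_HEIGHTS_MM,
               key=lambda face: abs(FACE_HEIGHTS_MM[face] - detected_height_mm))
```

A detected height of, say, 7.5 mm would be classified as face 3 under these assumed calibration values.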
  • A game system according to the present invention includes: a position pointing device held by a user; a tablet that detects the position of the position pointing device in the three-dimensional space above its operation surface and generates three-dimensional position information indicating that position; and a control device that executes a game program using the three-dimensional position information. The control device includes a game space generation unit that generates a virtual game space by executing the game program, an object placement unit that places an operation target object, generated as the user's operation target, in the game space based on the three-dimensional position information, and an object changing unit that changes at least one of the position and the shape of the operation target object based on changes in the three-dimensional position information.
  • Various games can thus be provided based on a new operation method in which the position pointing device is moved over the tablet.
  • A game device according to the present invention includes: a tablet that, when an input device held by the user and including a plurality of position instruction units is above its operation surface, detects the position of each position instruction unit in the three-dimensional space above the operation surface; a posture information generation unit that generates posture information indicating the posture of the input device based on the position of each position instruction unit; and a control unit that executes a game program using the posture information.
  • The control unit includes a game space generation unit that generates a virtual game space by executing the game program, an object placement unit that places an operation target object, generated as the user's operation target, in the game space, and an object control unit that changes the posture of the operation target object based on the posture information.
  • In another game device of the present invention with the same tablet and posture information generation unit, the object control unit instead specifies an action instruction for the operation target object based on the posture information and controls the action of the operation target object based on the specified action instruction.
  • A game system according to the present invention includes: a position instruction controller including a position instruction unit; a position detection controller that detects the relative position of the position instruction unit with respect to itself and generates relative position information indicating that relative position; and a control device that executes a game program using the relative position information. The control device includes a game space generation unit that generates a virtual game space by executing the game program, and an object generation unit that generates a player object, including a first part object, as the user's operation target. The first part object is associated with one of the position instruction controller and the position detection controller.
  • Since the operation target object can be controlled in accordance with the user's movement, a game with higher realism can be provided.
  • In another game system of the present invention with the same controllers, the control device includes a game space generation unit, an object generation unit that generates an operation target object, which is the user's operation target, and a reference object that serves as a reference for the position of the operation target object, an object arrangement unit that places both objects in the game space, associating the operation target object with one of the position instruction controller and the position detection controller and the reference object with the other, and an object control unit that changes the position of the operation target object in the game space based on the position of the reference object in the game space and the relative position information.
  • Since the operation target object can be controlled in accordance with the user's movement, a game with higher realism can be provided.
  • In view of the above problems, an object of the present invention is to provide a game system capable of executing various games using three-dimensional position information.
  • FIG. 1 is an overall view of a game system according to a first embodiment of the present invention. FIG. 2 is a block diagram showing the functional configuration of a game device. FIG. 3 is a block diagram showing the functional configuration of a controller. FIG. 4 is a perspective view of the controller. FIG. 5 is a diagram showing the positional relationship between a tablet and the controller. FIG. 6 is a diagram explaining the operation method of a domino game. FIG. 7 is a diagram explaining the operation method when moving a domino object. FIG. 8 is a diagram explaining the operation method of a billiard game. FIG. 9 is a diagram explaining the operation method of a dentist simulation game. FIGS. 10 and 11 are enlarged views of the region around a drill object. FIG. 12 is an enlarged view of the region around a wrench object.
  • FIG. 1 is an overall view of the game system 1 according to the present embodiment.
  • The game system 1 includes a game device 12, a controller 13, and a tablet 14.
  • The game device 12 is a processing device that performs overall control of the game system 1 and executes a game program recorded on an optical disc.
  • The game device 12 generates image data for displaying the game video on the screen 15a of the television receiver 15 and outputs the image data to the television receiver 15.
  • The controller 13 transmits information input by the user to the game device 12 as game operation information.
  • The tablet 14 detects the position of the controller 13 in the three-dimensional space above its operation surface 14a and generates three-dimensional position information indicating the position of the controller 13.
  • The operation surface 14a is the area in which the tablet 14 can detect the position of the controller 13.
  • The user holds the controller 13 above the operation surface 14a without touching it. The three-dimensional position information of the controller 13 is thereby input to the game device 12 as operation information, so the game system 1 can provide a game using a new operation method in which the user moves the controller 13 in the three-dimensional space above the operation surface 14a.
  • FIG. 2 is a block diagram showing a functional configuration of the game apparatus 12.
  • The game device 12 includes a control unit 121, an optical disc drive 122, an output unit 123, and a wireless communication unit 124.
  • The control unit 121 includes a CPU and a RAM, and executes various game programs.
  • The control unit 121 accesses the optical disc 16 set in the optical disc drive 122 and executes the game program recorded on the optical disc 16.
  • The user can enjoy various games by setting an optical disc 16 on which a game program is recorded in the optical disc drive 122.
  • The output unit 123 outputs the game video and game audio to the television receiver 15 connected to the game device 12.
  • The wireless communication unit 124 performs wireless communication with the controller 13 and the tablet 14, for example using Bluetooth (registered trademark).
  • FIG. 3 is a block diagram showing a functional configuration of the controller 13.
  • The controller 13 includes a control unit 131, an operation unit 132, a wireless communication unit 133, and a position instruction unit 134.
  • The control unit 131 performs overall control of the controller 13.
  • The operation unit 132 consists of buttons provided on the surface of the controller 13.
  • FIG. 4 is a perspective view of the controller 13. As shown in FIG. 4, an A button 132a, a B button 132b, a cross key 132c, and the like are provided on the surface of the controller 13 as the operation unit 132.
  • The user inputs game operation information by operating the operation unit 132.
  • The wireless communication unit 133 transmits the operation information to the game device 12 by wireless communication.
  • The position instruction unit 134 is a resonance circuit including a coil 134a and a capacitor 134b (see FIG. 5), and transmits a resonance signal to the tablet 14. Although the position instruction unit 134 is not shown in FIG. 4, it is provided near the side surface 135 at the longitudinal end of the controller 13. Hereinafter, the side surface 135 is referred to as the position indicating surface 135. The B button 132b is located near the position indicating surface 135.
  • FIG. 5 is a diagram illustrating a positional relationship between the tablet 14 and the controller 13.
  • The tablet 14 detects the position of the position instruction unit 134 and transmits it to the game device 12 as the three-dimensional position information of the controller 13.
  • The X axis, the Y axis, and the Z axis are defined with the lower left corner of the operation surface 14a as the origin.
  • The operation surface 14a corresponds to the XY plane.
  • The Z axis is perpendicular to the operation surface 14a.
  • In FIG. 5, the X sensor coil 141x and the Y sensor coil 141y are indicated by solid lines.
  • The X sensor coil 141x is installed along the Y-axis direction and is used to detect the X coordinate of the position instruction unit 134.
  • The Y sensor coil 141y is installed along the X-axis direction and is used to detect the Y coordinate of the position instruction unit 134.
  • The control unit 131 supplies a current signal (hereinafter referred to as the "position indication signal") to the coil 134a. The electric power necessary for supplying the position indication signal is provided by a battery built into the controller 13.
  • The X sensor coil 141x and the Y sensor coil 141y receive the resonance signal.
  • The tablet 14 detects the X coordinate and the Y coordinate of the position instruction unit 134 based on the resonance signal received by the X sensor coil 141x and the Y sensor coil 141y.
  • The tablet 14 specifies the Z coordinate of the position instruction unit 134 based on the received waveform of the resonance signal. In this way, the tablet 14 acquires the three-dimensional position information of the position instruction unit 134.
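  • The position acquisition described above can be sketched roughly as follows. This is an illustrative model, not the patent's actual signal processing: the X and Y coordinates are taken from the sensor coils receiving the strongest resonance signal, and the Z coordinate is estimated from the half width of the received waveform, which broadens as the position instruction unit 134 moves away from the operation surface. The coil pitch and the linear height model are assumptions.

```python
# Illustrative sketch of 3D position acquisition on an electromagnetic tablet.
# x_amplitudes / y_amplitudes: resonance-signal strength per sensor coil.
# half_width_mm: half width of the main signal in the received waveform.

def detect_position(x_amplitudes, y_amplitudes, half_width_mm, coil_pitch_mm=10.0):
    """Return an assumed (x, y, z) in millimetres from per-coil amplitudes."""
    # X and Y: take the coil with the strongest received signal.
    x = max(range(len(x_amplitudes)), key=lambda i: x_amplitudes[i]) * coil_pitch_mm
    y = max(range(len(y_amplitudes)), key=lambda i: y_amplitudes[i]) * coil_pitch_mm
    # Z: assumed monotone model in which height grows with the half width.
    z = max(0.0, (half_width_mm - coil_pitch_mm) * 0.5)
    return (x, y, z)
```

Real tablets interpolate between neighbouring coils for sub-pitch accuracy; the argmax here is the simplest stand-in for that step.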
  • Alternatively, the power necessary for supplying the position indication signal may be supplied from the tablet 14. Specifically, by supplying a signal corresponding to the resonance frequency of the position instruction unit 134 to the X sensor coil 141x and the Y sensor coil 141y, an induced current is generated in the coil 134a. Charge corresponding to the induced current is accumulated in the capacitor 134b, and the control unit 131 supplies the position indication signal to the coil 134a using this accumulated charge.
  • The controller 13 shown in FIGS. 3 and 5 includes one position instruction unit 134. However, the controller 13 may include a plurality of position instruction units 134, 134, ... so that the user can enjoy various games.
  • The domino game allows the user to experience domino toppling in a simulated manner.
  • The user sets the optical disc 16 storing the domino game program in the optical disc drive 122, whereby the domino game is started.
  • The processing in the control unit 121 described below is performed by executing the domino game program.
  • FIG. 6 is a diagram for explaining the operation method of the domino game.
  • In response to the start of the domino game, the control unit 121 generates a game space 51a for the domino game.
  • The game space 51a is a virtual three-dimensional space.
  • The control unit 121 generates a pointer object 51b as the user's operation target in the domino game.
  • The control unit 121 converts the three-dimensional position information input from the tablet 14 into position information of the game space 51a (virtual position information).
  • The control unit 121 places the pointer object 51b at the position corresponding to the virtual position information in the game space 51a.
  • The initial screen of the domino game is displayed on the screen 15a in FIG. 6.
  • The plane 51c is a plane formed in the game space 51a and corresponds to the operation surface 14a.
  • The user arranges virtual dominoes on the plane 51c by moving and operating the controller 13 above the operation surface 14a.
  • The user moves the controller 13 above the operation surface 14a. As shown in FIG. 6, it is desirable for the user to hold the controller 13 with the position indicating surface 135 facing the operation surface 14a; since this shortens the distance between the position instruction unit 134 and the operation surface 14a, the tablet 14 can detect the position of the controller 13 accurately.
  • The tablet 14 outputs the three-dimensional position information of the controller 13 to the game device 12 in real time.
  • The control unit 121 converts the three-dimensional position information into virtual position information each time it is input, and updates the position of the pointer object 51b in the game space 51a.
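  • The per-frame conversion from three-dimensional position information to virtual position information can be sketched as follows. The scale factors, origin, and data layout are assumptions for illustration; the patent does not specify the mapping.

```python
# Minimal sketch of converting tablet coordinates into game-space coordinates
# ("virtual position information"). Axis-aligned scale and offset are assumed.

def to_virtual(tablet_pos, scale=(2.0, 2.0, 2.0), origin=(0.0, 0.0, 0.0)):
    """Map an (x, y, z) point above the operation surface into the game space."""
    return tuple(origin[i] + tablet_pos[i] * scale[i] for i in range(3))

def update_pointer(pointer, tablet_pos):
    """Move the pointer object to the converted position on each new sample."""
    pointer["position"] = to_virtual(tablet_pos)
    return pointer
```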
  • The user moves the pointer object 51b to the domino storage 51d and presses the A button 132a, whereupon the control unit 121 newly generates a domino object 51e.
  • The domino object 51e (see FIG. 7) is displayed on the screen 15a instead of the pointer object 51b.
  • The user's operation target thus changes from the pointer object 51b to the domino object 51e. Since the virtual position information is associated with the domino object 51e, the user can move the domino object 51e by moving the controller 13 above the operation surface 14a.
  • FIG. 7 is a diagram illustrating the operation method when the domino object 51e is moved. As shown in FIG. 7, a plurality of domino objects 51f, 51f, ... are arranged on the plane 51c.
  • The control unit 121 monitors the distance between the domino object 51e and each domino object 51f in real time. When the distance between the domino object 51e and a domino object 51f becomes shorter than a predetermined distance, the control unit 121 determines that the domino object 51e has contacted that domino object 51f and topples it. When a toppled domino object 51f comes into contact with another domino object 51f, the control unit 121 topples that one as well. In this way, when the domino object 51e contacts a domino object 51f, the screen 15a shows the domino objects 51f falling in a chain.
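  • The chain-toppling logic described above can be sketched as follows; the data layout and contact threshold are assumptions for illustration, not the patent's implementation.

```python
import math

def topple_chain(dominoes, start_index, threshold=1.0):
    """Topple the domino at start_index and propagate to any domino closer
    than `threshold`, repeating until no new domino falls.

    dominoes: list of dicts with 'pos' (an (x, y) tuple) and a 'fallen' flag.
    """
    dominoes[start_index]["fallen"] = True
    queue = [start_index]
    while queue:
        i = queue.pop()
        for j, d in enumerate(dominoes):
            if d["fallen"]:
                continue  # already toppled; skip to avoid infinite loops
            if math.dist(dominoes[i]["pos"], d["pos"]) < threshold:
                d["fallen"] = True
                queue.append(j)
    return dominoes
```

A row of dominoes spaced within the threshold falls in a chain; a distant one is unaffected.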
  • The user therefore needs to move the domino object 51e while watching the screen 15a so that it does not contact any domino object 51f.
  • If the controller 13 moves parallel to the operation surface 14a, as in the path indicated by the arrow 51g, there is a high possibility that the domino object 51e will contact a domino object 51f.
  • If the controller 13 moves along a path like the one indicated by the arrow 51h, that possibility is low.
  • The user moves the domino object 51e to a desired position on the plane 51c and then presses the A button 132a.
  • The domino object 51e is then stood on the plane 51c as a placed domino object 51f, which completes the operation on the domino object 51e.
  • The pointer object 51b is newly displayed at the position corresponding to the controller 13, and the user repeats the operation of placing domino objects 51e on the plane 51c.
  • The game system 1 uses the three-dimensional position information of the controller 13 to specify the position of the domino object 51e. Since the movement of the controller 13 is accurately reflected in the operated domino object 51e, the user can enjoy a highly realistic domino game.
  • The billiard game allows the user to experience billiards in a simulated manner.
  • The controller 13 plays the role of a cue.
  • The operation surface 14a is likened to a billiard table.
  • The user sets the optical disc 16 on which the billiard game program is stored in the optical disc drive 122, whereby the billiard game is started.
  • The processing in the control unit 121 described below is performed by executing the billiard game program.
  • FIG. 8 is a diagram for explaining an operation method of the billiard game.
  • In response to the start of the billiard game program, the control unit 121 generates a game space 52a for the billiard game and places a billiard table 52b in the game space 52a.
  • The game space 52a including the billiard table 52b is displayed on the screen 15a.
  • The control unit 121 generates a hand ball object 52c, a plurality of target ball objects 52d, 52d, ..., and a cue object 52e.
  • The control unit 121 places the hand ball object 52c and the target ball objects 52d on the billiard table 52b based on the virtual position information set for them.
  • The cue object 52e is the user's operation target and is placed in the game space 52a based on the three-dimensional position information of the controller 13. Specifically, the tip 521e of the cue object 52e corresponds to the position instruction unit 134, so the position of the tip 521e is specified based on the virtual position information converted from the three-dimensional position information.
  • The body of the cue object 52e extends from the tip 521e in the direction opposite to the hand ball object 52c as viewed from the tip 521e.
  • The user decides the direction in which to move the cue object 52e while looking at the hand ball object 52c and the cue object 52e displayed on the screen 15a. As shown in FIG. 8, the user then moves the controller 13 in a poking motion toward the decided direction.
  • The control unit 121 converts the three-dimensional position information input in real time from the tablet 14 into virtual position information and moves the cue object 52e based on it. In parallel with the movement of the cue object 52e, the control unit 121 calculates the distance between the tip 521e and the hand ball object 52c based on their virtual position information. When this distance becomes shorter than a predetermined distance, it is determined that the tip 521e has hit the hand ball object 52c.
  • The control unit 121 identifies the moving direction of the tip 521e based on its virtual position information immediately before the hit and its virtual position information at the moment of the hit, and moves the hand ball object 52c in that direction.
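  • The hit detection and direction calculation described above can be sketched as follows. The hit threshold and the two-dimensional simplification are assumptions for illustration.

```python
import math

def check_strike(prev_tip, tip, ball, hit_distance=1.0):
    """Return the unit direction in which to move the ball, or None if the
    cue tip is not yet within the hit distance of the ball.

    prev_tip, tip: (x, y) positions of the tip on consecutive samples.
    ball: (x, y) position of the hand ball.
    """
    if math.dist(tip, ball) >= hit_distance:
        return None  # no hit this frame
    # Direction of the strike: the tip's displacement over the last frame.
    dx, dy = tip[0] - prev_tip[0], tip[1] - prev_tip[1]
    norm = math.hypot(dx, dy) or 1.0  # avoid division by zero
    return (dx / norm, dy / norm)
```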
  • Since the cue object 52e moves based on the three-dimensional position information of the controller 13, the movement of the controller 13 is faithfully reproduced in the game space 52a. The user can therefore enjoy a billiard game close to the real experience by moving the controller 13 in a poking motion like a cue.
  • the control unit 121 changes the moving direction of the hand ball object 52c based on the incident direction of the hand ball object 52c with respect to the outer frame. Whether or not the hand ball object 52c collides with the outer frame can be determined in the same manner as the determination of the collision between the hand ball object 52c and the tip 521e.
  • the control unit 121 can also determine whether or not the hand ball object 52c collides with the target ball object 52d based on the distance between the hand ball object 52c and the target ball object 52d. When the hand ball object 52c collides with the target ball object 52d, the control unit 121 moves the target ball object 52d.
  • the control unit 121 calculates the incident angle of the cue object 52e with respect to the billiard table 52b. When the incident angle of the cue object 52e is larger than the predetermined angle, the control unit 121 causes the hand ball object 52c to jump from the billiard table 52b.
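The incident-angle determination for the jump shot described above can be sketched as follows; the angle threshold and the assumption that the billiard table 52b lies in the horizontal x-y plane are illustrative choices, not part of the disclosed embodiment:

```python
import math

def cue_incident_angle(direction):
    # Angle (degrees) between the cue's moving direction and the table
    # surface, taken here as the horizontal x-y plane.
    dx, dy, dz = direction
    return math.degrees(math.atan2(abs(dz), math.hypot(dx, dy)))

def should_jump(direction, jump_angle=30.0):
    # The hand ball object 52c jumps when the incident angle of the cue
    # object 52e exceeds the predetermined angle (value assumed here).
    return cue_incident_angle(direction) > jump_angle
```
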
  • the dentist simulation game is a game in which a user becomes a dentist and treats a virtual decayed tooth.
  • the controller 13 is regarded as a treatment instrument such as a drill or a wrench. The user treats the virtual caries by operating the controller 13 on the operation surface 14a.
  • the user sets the optical disc 16 on which the dentist simulation game program is stored in the optical disc drive 122. Thereby, a dentist simulation game is started.
  • the processing in the control unit 121 is processing by executing a dentist simulation game program.
  • FIG. 9 is a diagram for explaining an operation method of the dentist simulation game.
  • the control unit 121 generates a game space 53a for the dentist simulation game in response to the start of the dentist simulation game.
  • the control unit 121 generates tooth objects 53b, 53b... And carious tooth objects 53c, 53c.
  • the tooth object 53b and the decayed tooth object 53c are arranged in the game space 53a.
  • the tooth object 53b is a normal tooth that is not caries.
  • the decayed tooth object 53c is a decayed tooth.
  • Virtual position information and information indicating the shape are set in the tooth object 53b and the caries object 53c.
  • the control unit 121 displays an image as shown in FIG. 9 on the screen 15a.
  • in FIG. 9, a plurality of tooth objects 53b, 53b... corresponding to the lower teeth and two caries objects 53c, 53c are displayed.
  • the tooth object 53b is shown in white.
  • the caries object 53c is indicated by hatching.
  • the user selects “Drill” by using the therapeutic instrument selection menu 53e displayed on the screen 15a.
  • the control unit 121 newly generates a drill object 53f.
  • the control unit 121 arranges the drill object 53f in the game space 53a based on the virtual position information converted from the three-dimensional position information of the controller 13.
  • the drill object 53f is displayed on the screen 15a.
  • the position of the drill object 53f that is the operation target changes.
  • the rotation sound data in which the rotation sound of the drill is recorded is reproduced.
  • the user moves the controller 13 while looking at the screen 15a to move the drill object 53f in the vicinity of the caries object 53c.
  • the tip of the drill object 53f corresponds to the position instruction unit 134 of the controller 13.
  • the user moves the controller 13 on the operation surface 14a so that the drill object 53f does not hit the tooth object 53b or the like.
  • the video displayed on the screen 15a changes to the video shown in FIG. 10A.
  • FIGS. 10A and 10B are enlarged views of the vicinity of the drill object 53f.
  • an image of the lower teeth viewed obliquely from above is displayed on the screen 15a.
  • the user can easily confirm the vertical positional relationship between the caries object 53c and the drill object 53f.
  • the user moves the drill object 53f to a position in contact with the caries object 53c while viewing the screen 15a shown in FIG. 10A.
  • the control unit 121 determines whether or not the drill object 53f has contacted the carious object 53c.
  • the control unit 121 starts a process of cutting the caries object 53c.
  • the control unit 121 reproduces the friction sound data, in which the friction sound between the drill and the teeth is recorded, instead of the rotation sound data.
  • the control unit 121 may increase the reproduction volume of the rotation sound data or change the tone of the reproduced sound. As shown in FIG. 10B, the control unit 121 displays on the screen 15a how the scraped tooth fragments are scattered. At this time, as the drill object 53f enters the occupied area of the carious tooth object 53c, the reproduction volume of the friction sound data or the rotation sound data may be increased. By adjusting the volume of the reproduced audio data in accordance with the position of the controller 13, the user can experience the treatment of caries in a pseudo manner.
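The position-dependent volume adjustment described above can be sketched as a simple clamped linear mapping from penetration depth to reproduction volume; the linear form and all numeric values are assumptions for illustration:

```python
def drill_volume(penetration_depth, max_depth=5.0, base=0.3, peak=1.0):
    # Reproduction volume grows as the drill object 53f enters deeper
    # into the occupied area of the carious tooth object 53c.  The
    # linear mapping and the numeric values are assumptions.
    frac = max(0.0, min(1.0, penetration_depth / max_depth))
    return base + (peak - base) * frac
```
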
  • the control unit 121 executes a process of cutting the tooth object 53b even when the drill object 53f contacts the tooth object 53b. Therefore, in the dentist simulation game, a fine movement of the controller 13 is required to cut only the caries object 53c. Therefore, the user can enjoy a game with a sense of tension.
  • the user can also perform extraction as a treatment for caries.
  • the user selects “wrench” using the therapeutic instrument selection menu 53e displayed on the screen 15a.
  • the control unit 121 newly generates a wrench object 53g according to the selection of “wrench”.
  • 11A and 11B are enlarged views of the periphery of the wrench object 53g.
  • the user moves the wrench object 53g to the vicinity of the caries object 53c by moving the controller 13 on the operation surface 14a.
  • the control unit 121 reproduces sound data in which a sound effect indicating that the wrench object 53g has hit the caries object 53c is recorded.
  • the user can close the wrench object 53g by pressing the B button 132b.
  • by closing the wrench object 53g, the control unit 121 displays an image in which the wrench object 53g grips the caries object 53c.
  • the control unit 121 displays an image in which the caries object 53c is extracted from the gums as shown in FIG. 11B. At the moment when the caries object 53c is extracted from the gums, the control unit 121 reproduces sound data in which a tooth-extraction sound effect is recorded. By reproducing the extraction sound effect in accordance with the position of the controller 13, the user can perform the tooth extraction treatment with a sense close to reality.
  • a dedicated controller such as a drill-shaped controller or a controller that can be operated like an actual wrench may be used. This makes it possible to enjoy a dentist simulation game with higher reality.
  • the decoration game is a game that allows the user to experience the cake decoration in a pseudo manner.
  • the controller 13 is regarded as a cream squeezer.
  • the user can use the controller 13 to place items such as candles and fruits such as strawberries on a virtual cake.
  • the user sets the optical disc 16 on which the decoration game program is stored in the optical disc drive 122. Thereby, the decoration game is started.
  • the processing of the control unit 121 is processing by executing a decoration game program.
  • FIG. 12 is a diagram for explaining the operation method of the decoration game.
  • the control unit 121 generates a game space 54a for the decoration game in response to the start of the decoration game.
  • the control unit 121 generates a cylindrical cake object 54b that is not decorated and places the cake object 54b in a default position in the game space 54a.
  • in the cake object 54b, virtual position information and information indicating the shape of the cake are set.
  • the control unit 121 displays the game space 54a on the left side of the screen 15a.
  • the cream objects 54d and 54d are not displayed on the screen 15a.
  • An item selection menu 54c is displayed on the right side of the screen 15a.
  • the user uses the item selection menu 54c to select an item used for decorating the cake object 54b.
  • the control unit 121 newly generates a squeeze bag object 54e.
  • the control unit 121 converts the three-dimensional position information of the controller 13 to generate virtual position information.
  • the squeeze bag object 54e is arranged in the game space 54a.
  • the position instruction unit 134 of the controller 13 corresponds to the nozzle 54f of the squeeze bag object 54e.
  • the user moves the squeeze bag object 54e to the position where the cream is to be placed by moving the controller 13 on the operation surface 14a while viewing the screen 15a.
  • the cream can be squeezed out.
  • the user instructs to squeeze the cream by pressing the A button 132a.
  • the control unit 121 displays an image in which the cream is squeezed from the nozzle 54f on the screen 15a.
  • a cream object 54d is newly generated.
  • the size of the cream object 54d increases as the time that the A button 132a is pressed is longer.
  • the cream object 54d is arranged at a position corresponding to the virtual position information of the nozzle 54f.
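The press-duration behavior described above (the longer the A button 132a is held, the larger the cream object 54d) can be sketched as follows; the growth rate and the size cap are assumptions for illustration:

```python
def cream_size(press_seconds, rate=2.0, max_size=10.0):
    # The size of the cream object 54d increases with the time the
    # A button 132a is pressed; rate and cap are assumed values.
    return min(max_size, press_seconds * rate)
```
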
  • FIG. 13 is a diagram showing the cake object 54b decorated with items other than the squeezed bag.
  • the user can place each item such as a strawberry, chocolate plate, and candle on the cake object 54b using the item selection menu 54c.
  • the control unit 121 newly generates a strawberry object 54g.
  • the user operates the controller 13 to move the strawberry object 54g to a desired position on the cake object 54b and presses the A button 132a. Thereby, the strawberry object 54g can be placed on the cake object 54b.
  • chocolate plates and candles can be placed on the cake object 54b.
  • the user can easily realize the cake decoration of the original design that the user has considered.
  • the wood carving game is a game that allows the user to experience wood carving in a pseudo manner.
  • the controller 13 is regarded as a tool such as a carving knife.
  • the user can model the virtual wood into a desired shape by operating the controller 13 on the operation surface 14a.
  • the user sets the optical disc 16 on which the wood carving game program is stored in the optical disc drive 122. Thereby, the wood carving game is started.
  • the processing in the control unit 121 is processing by executing a wood carving game program.
  • FIG. 14 is a diagram for explaining the operation method of the wood carving game.
  • the control unit 121 generates a game space 55a for the wood carving game in response to the start of the wood carving game.
  • the control unit 121 generates a cuboid wood object 55b and places it at a default position in the game space 55a.
  • Information indicating the shape of the wood object 55b itself is set in the wood object 55b.
  • a game space 55a is displayed on the left side of the screen 15a.
  • a tool selection menu 55c for selecting a tool is displayed on the right side of the screen 15a.
  • the control unit 121 newly generates a flat blade object 55d according to the user's selection.
  • the flat blade object 55d is arranged in the game space 55a based on the virtual position information converted from the three-dimensional position information of the controller 13.
  • the blade edge 55e of the flat blade object 55d corresponds to the position of the position instruction unit 134.
  • the user moves the controller 13 on the operation surface 14a while looking at the wood object 55b and the flat blade object 55d displayed on the screen 15a.
  • the flat blade object 55d moves in the game space 55a in accordance with the change in the three-dimensional position information of the controller 13.
  • the control unit 121 monitors the distance between the wood object 55b and the blade edge 55e. This distance is calculated based on the virtual position information of the blade edge 55e and the virtual position information and information indicating the shape of the wood object 55b. When the distance between the wood object 55b and the blade edge 55e is shorter than a predetermined distance, the control unit 121 determines that the blade edge 55e has contacted the wood object 55b.
  • FIG. 15 is a diagram showing the wood object 55b whose shape has been changed. It is assumed that the user moves the controller 13 obliquely in the upper left direction with respect to the operation surface 14a.
  • the control unit 121 changes the shape of the wood object 55b based on the trajectory of the blade edge 55e after the blade edge 55e contacts the wood object 55b. That is, the wood object 55b is scraped off obliquely in the upper left direction from the contact point 55f with the blade edge 55e. Since the wood object 55b is cut by the flat blade object 55d, the cut surface becomes a flat surface. When a blade other than a flat blade is selected, the shape of the cut surface changes according to the selected blade.
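The shape change along the blade-edge trajectory described above can be sketched with a simplified voxel model; the voxel representation and the reach value are illustrative assumptions, since the patent keeps the shape information of the wood object 55b abstract:

```python
import math

def carve(voxels, trajectory, reach=1.0):
    # Remove every voxel of the wood object 55b that lies within
    # `reach` of any sampled blade-edge position on the trajectory.
    remaining = set(voxels)
    for p in trajectory:
        remaining = {v for v in remaining if math.dist(v, p) >= reach}
    return remaining
```
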
  • the user can experience a sculpture in a pseudo manner by moving the controller 13 on the operation surface 14a. Further, since it is not necessary to clean up after engraving, the user can enjoy engraving easily.
  • a controller dedicated to the engraving game may be used instead of the controller 13.
  • instead of the controller 13, it is conceivable to use a plurality of controllers having shapes such as a flat blade or a round blade.
  • by properly using each controller, the user can enjoy a sculpture game with a feeling close to that of real sculpture.
  • each controller has a position indicating unit 134 with a different resonance frequency.
  • the tablet 14 can identify the controller by identifying the frequency of the received resonance signal.
  • the Chinese cuisine game is a game for virtually creating Chinese cuisine.
  • the controller 13 is regarded as a cooking utensil such as a wok or a ladle. The user virtually cooks by operating the controller 13 on the operation surface 14a.
  • the user sets the optical disc 16 on which the Chinese cuisine game program is stored in the optical disc drive 122. Thereby, the Chinese cooking game is started.
  • the processing in the control unit 121 is processing by executing a Chinese cooking game program.
  • FIG. 16 is a diagram for explaining the operation method of the Chinese cuisine game.
  • the control unit 121 generates a game space 56a for the Chinese cuisine game in response to the start of the Chinese cuisine game.
  • the control unit 121 generates a wok object 56b, a ladle object 56c, and container objects 56d, 56e, and 56f.
  • the container objects 56d, 56e, and 56f are virtual containers that contain water, soy sauce, and chili oil.
  • an item selection menu 56g is displayed on the right side of the screen 15a.
  • the user selects a cooking utensil to be used by using the item selection menu 56g.
  • on the screen 15a, the game space 56a in which the wok object 56b, the ladle object 56c, and the container objects 56d to 56f are arranged is displayed.
  • the control unit 121 arranges the ladle object 56c in the game space 56a based on the virtual position information converted from the three-dimensional position information of the controller 13.
  • the wok object 56b and the container objects 56d to 56f are arranged at default positions in the game space 56a.
  • the control unit 121 monitors in real time whether or not the ladle object 56c contacts the wok object 56b. When it is determined that the wok object 56b and the ladle object 56c are in contact with each other, the control unit 121 reproduces sound data in which a metallic sound is recorded as a sound effect. Thereby, the user can perform the stir-frying operation with a realistic sensation.
  • FIG. 17 is a diagram for explaining the operation of the container object 56d. Since “water” is selected in the item selection menu 56g, the container object 56d moves in the game space 56a in accordance with the change in the three-dimensional position information of the controller 13. The ladle object 56c is arranged at a default position in the game space 56a.
  • when the user presses the B button 132b (see FIG. 4), the container object 56d is displayed in a tilted state as shown in FIG. 17. If the container object 56d is positioned over the wok object 56b, the control unit 121 determines that water has entered the wok object 56b. In order to indicate that water has entered the wok object 56b, the control unit 121 reproduces a cooking sound in which the sound generated when water enters a heated wok is recorded.
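The pouring determination described above (the container is tilted while positioned over the wok) can be sketched as follows; the horizontal-distance overlap test and the radius value are assumptions for illustration:

```python
import math

def pours_into_wok(container_pos, wok_pos, b_pressed, radius=2.0):
    # Water is judged to enter the wok object 56b when the container
    # object 56d is tilted (B button 132b pressed) while located over
    # the wok; positions are (x, y, z) tuples, overlap is tested in
    # the horizontal plane.
    over_wok = math.dist(container_pos[:2], wok_pos[:2]) < radius
    return b_pressed and over_wok
```
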
  • the user can operate the container objects 56e and 56f in the same manner as the container object 56d by selecting “soy sauce” or “chili oil” from the item selection menu 56g.
  • the user can experience the cooking of the Chinese food in a pseudo manner.
  • the user may operate the ladle object 56c to add water to the wok object 56b. Specifically, the user moves the ladle object 56c to the container object 56d. By dipping the hemispherical part of the ladle object 56c into the container object 56d, water can be scooped into the hemispherical part. The user then tilts the ladle object 56c over the wok object 56b, whereby the water is poured into the wok object 56b. When tilting the ladle object 56c, the user may press the B button 132b.
  • a cooking utensil selection menu may be displayed on the right side of the screen 15a so that the user can select any one of the plurality of cooking utensils.
  • FIG. 18 is a diagram for explaining the operation of the Chinese wok object 56b. Since “wok” is selected in the item selection menu 56g, the wok object 56b moves in the game space 56a in accordance with the change in the three-dimensional position information of the controller 13.
  • the position instruction unit 134 corresponds to the handle portion of the wok object 56b.
  • when the controller 13 is moved up and down, the wok object 56b moves up and down in the game space 56a.
  • when the control unit 121 determines that the controller 13 has moved up and down at a certain speed or higher, the control unit 121 displays an image in which the water or other contents of the wok object 56b jump out of the wok object 56b. Thereby, the user can experience the wok-tossing operation in a pseudo manner.
  • the user can artificially create Chinese food with a more realistic feeling by moving the controller 13 in the three-dimensional space on the operation surface 14a.
  • the stethoscope game is a game in which a patient's examination using a stethoscope can be simulated.
  • the controller 13 is likened to a stethoscope.
  • the user sets the optical disc 16 in which the stethoscope game program is stored in the optical disc drive 122. Thereby, a stethoscope game is started.
  • the processing in the control unit 121 is processing by executing a stethoscope game program.
  • FIG. 19 is a diagram for explaining the operation method of the stethoscope game.
  • the control unit 121 generates a game space 57a for the stethoscope game in response to the start of the stethoscope game.
  • the control unit 121 generates a stethoscope object 57b and a patient object 57c.
  • the patient object 57c corresponds to the upper body of the human body and is arranged at a default position in the game space 57a.
  • the control unit 121 arranges the stethoscope object 57b in the game space 57a based on the virtual position information converted from the three-dimensional position information of the controller 13.
  • the sound collection unit 57d of the stethoscope object 57b corresponds to the position instruction unit 134.
  • the game space 57a including the patient object 57c is displayed on the left side of the screen 15a.
  • the abdomen side of the patient object 57c is displayed on the screen 15a shown in FIG. 19.
  • the user can display the back of the patient object 57c on the screen 15a using the display switching menu 57e.
  • the user moves the controller 13 while looking at the screen 15a to move the stethoscope object 57b to a position to be examined. At this time, assuming that the upper body of the patient is lying on the operation surface 14a, the user moves the controller 13 while floating from the operation surface 14a. Since the position indicating unit 134 corresponds to the sound collecting unit 57d, it is desirable that the position indicating surface 135 (see FIG. 4) of the controller 13 is directed toward the operation surface 14a when the controller 13 is moved.
  • the control unit 121 monitors the distance between the sound collection unit 57d and the patient object 57c. When the distance between the sound collection unit 57d and the patient object 57c is shorter than a predetermined distance, the control unit 121 determines that the sound collection unit 57d has contacted the patient object 57c.
  • the optical disc 16 stores a plurality of audio data in which internal sounds that can be heard with an actual stethoscope are recorded.
  • the control unit 121 reproduces audio data corresponding to the contact position of the patient object 57c from among a plurality of audio data. For example, when the sound collection unit 57d is applied to the left chest of the patient object 57c, the control unit 121 reproduces sound data in which the heartbeat is recorded.
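The selection of audio data according to the contact position described above can be sketched as a region lookup; representing body regions as axis-aligned boxes, and the region coordinates and clip names below, are illustrative assumptions:

```python
def body_sound(contact_point, regions):
    # Select the audio clip for the body region that contains the
    # contact position of the sound collection unit 57d.
    x, y = contact_point
    for (x0, y0, x1, y1), clip in regions:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return clip
    return None  # no region contacted: nothing is reproduced

# Hypothetical region layout and clip names for illustration.
REGIONS = [((0, 0, 4, 4), "heartbeat.wav"),
           ((4, 5, 8, 9), "breath.wav")]
```
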
  • the user can experience the patient's diagnosis in a pseudo manner by operating the stethoscope object 57 b using the controller 13.
  • the control unit 121 may give a quiz regarding the position where the stethoscope object 57b is in contact with the patient object 57c.
  • FIG. 20 is a diagram showing a screen 15a when a quiz is given in the stethoscope game.
  • the optical disk 16 stores sound data obtained by recording a plurality of heart sound patterns corresponding to heart diseases.
  • the control unit 121 randomly selects and reproduces any one of a plurality of heart sound patterns.
  • An answer menu 57f is displayed on the right side of the screen 15a. In the answer menu 57f, a message “Please select the correct symptom for the audible heart sound.” is displayed.
  • the user selects an answer from the options displayed below the message.
  • the control unit 121 displays the correct answer and a detailed explanation about the correct answer in the answer menu 57f. In this manner, the user can not only experience a patient examination using a stethoscope, but also deepen knowledge about medical care.
  • a doll corresponding to the shape of the patient object 57c may be placed on the operation surface 14a.
  • the user can easily grasp the positional relationship between the stethoscope object 57b and the patient object 57c.
  • in order to enable the control unit 121 to specify the position and posture of the doll, it is desirable that a plurality of position instruction units 134 having different resonance frequencies be embedded in the doll.
  • the tablet 14 identifies each position indicating unit 134 based on the frequency of the received resonance signal.
  • the control unit 121 specifies the position and orientation of the doll based on the three-dimensional position information of each position instruction unit 134. Thereby, the direction and position of the doll placed on the operation surface 14a can be made to correspond to the direction and position of the patient object 57c displayed on the screen 15a.
  • a controller dedicated to the stethoscope game in the shape of a stethoscope may be used instead of the controller 13.
  • the position instruction unit 134 is installed at a position corresponding to the sound collection unit. Thereby, the user can enjoy a stethoscope game with a more realistic feeling.
  • FIG. 21 is a diagram for explaining the operation method of the doll pop-out game. On the screen 15a, an image of a doll pop-out game is displayed.
  • the doll pop-out game is a game in which key objects 58c, 58c,... are inserted one after another into a plurality of insertion holes formed in the cylindrical object 58b, as shown in FIG. 21.
  • when the key object 58c is inserted into the hit hole 58e among the plurality of insertion holes, the doll object 58d set on the cylindrical object 58b pops out.
  • the user plays the doll pop-out game with the controller 13 as a key object 58c.
  • the user sets the optical disc 16 on which the doll pop-out game program is stored in the optical disc drive 122. Thereby, a doll pop-out game is started.
  • the processing in the control unit 121 is processing by executing a doll pop-out game program.
  • the control unit 121 generates a game space 58a in response to the start of the doll pop-out game.
  • the control unit 121 generates a cylindrical object 58b and a doll object 58d. Virtual position information and information indicating the shape are set in the cylinder object 58b.
  • the control unit 121 places the cylindrical object 58b at the default position in the game space 58a.
  • the doll object 58d is set in a hole formed on the upper surface of the cylindrical object 58b.
  • a plurality of insertion holes are formed on the side surface of the cylindrical object 58b.
  • the control unit 121 randomly sets any one of the plurality of insertion holes as the hit hole 58e.
  • a cylindrical object 58b in which a doll object 58d is set is displayed on the screen 15a.
  • when the user presses the B button 132b, the control unit 121 newly generates a key object 58c. Based on the virtual position information converted from the three-dimensional position information of the controller 13, the control unit 121 arranges the generated key object 58c in the game space 58a.
  • the key object 58c moves in the game space 58a according to the movement of the controller 13 on the operation surface 14a.
  • the user inserts the key object 58c into any one of the plurality of insertion holes by moving the controller 13 on the operation surface 14a while viewing the screen 15a.
  • the control unit 121 determines that the key object 58c has been inserted into the insertion hole located closest to the contact position. If the insertion hole into which the key object 58c is inserted is the hit hole 58e, the control unit 121 performs a process of popping out the doll object 58d from the cylindrical object 58b.
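The nearest-hole determination and the hit-hole check described above can be sketched as follows; identifying the hit hole 58e by an index, and the function names, are assumptions for illustration:

```python
import math

def inserted_hole(contact_pos, holes):
    # Index of the insertion hole located closest to the contact
    # position of the key object 58c.
    return min(range(len(holes)),
               key=lambda i: math.dist(holes[i], contact_pos))

def doll_pops(contact_pos, holes, hit_index):
    # The doll object 58d pops out when the nearest insertion hole is
    # the hit hole 58e.
    return inserted_hole(contact_pos, holes) == hit_index
```
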
  • a plurality of users can simultaneously enjoy a doll pop-out game.
  • since the position of the hit hole 58e is changed, the user can enjoy a feeling of tension, not knowing when the doll object 58d will pop out.
  • the baseball game is a game in which a user can experience baseball batting in a pseudo manner by the user swinging the controller 13 that looks like a bat on the operation surface 14a.
  • the user sets the optical disc 16 in which the baseball game program is stored in the optical disc drive 122. Thereby, a baseball game is started.
  • the processing in the control unit 121 is processing by executing a baseball game program.
  • FIG. 22 is a diagram for explaining a baseball game operation method.
  • the control unit 121 generates a game space 59a corresponding to the baseball field in response to the start of the baseball game.
  • On the screen 15a, an image of the baseball field (game space 59a) viewing the mound from behind the home base is displayed.
  • the operation surface 14a corresponds to an area around the home base.
  • the control unit 121 generates a bat object 59b and a ball object 59c.
  • the ball object 59c is arranged on the mound of the game space 59a (baseball field).
  • the bat object 59b is arranged in one of the left and right batter boxes.
  • the position of the position instruction unit 134 corresponds to the tip of the bat object 59b. Therefore, the position of the tip of the bat object 59b is determined based on the virtual position information converted from the three-dimensional position information.
  • the user selects either the right batter box or the left batter box.
  • the control unit 121 places the bat object 59b in the batter box selected by the user, based on the position of the tip of the bat object 59b.
  • the control unit 121 displays an image in which the pitcher performs a pitching operation on the screen 15a, and moves the ball object 59c from the mound toward the home base.
  • the screen 15a displays how the ball object 59c flies from the mound toward the home base.
  • the user swings the controller 13 on the operation surface 14a while confirming the position of the ball object 59c displayed on the screen 15a.
  • the direction in which the controller 13 swings is the direction from the front side to the back side of the operation surface 14a (the direction indicated by the arrow 59d).
  • when the controller 13 is swung on the operation surface 14a, the position of the bat object 59b changes.
  • the control unit 121 monitors the distance between the bat object 59b and the ball object 59c after the pitcher starts a pitching motion. If the distance between the bat object 59b and the ball object 59c is shorter than a predetermined distance, the control unit 121 determines that the ball object 59c has hit the bat object 59b. When the ball object 59c hits the bat object 59b, the control unit 121 moves the ball object 59c toward fair territory.
  • in order to hit the ball object 59c with the bat object 59b, the user needs to swing the controller 13 at an appropriate height and timing.
  • since whether or not the batting is successful can be determined based on the three-dimensional position information of the controller 13, the user can enjoy a more realistic baseball game.
  • FIG. 23 is a diagram showing the swing direction of the controller 13.
  • the control unit 121 determines the trajectory of the ball object 59c (hit ball) after hitting the bat object 59b based on the swing direction of the bat object 59b.
  • the swing direction is determined based on virtual position information of the bat object 59b from when the pitcher starts a pitching motion until the bat object 59b hits the ball object 59c.
  • when the controller 13 is swung downward, the bat object 59b is also swung downward. In this case, the hit ball is a grounder. If the hit ball is a grounder, an image of the ball object 59c rolling is displayed on the screen 15a.
  • when the controller 13 is swung upward, the bat object 59b is also swung upward. In this case, the hit ball is a fly. If the hit ball is a fly, an image of the ball object 59c flying in a parabola is displayed on the screen 15a.
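The classification of the hit ball by swing direction described above can be sketched as follows; representing the swing as a list of (x, y, z) positions with z as height, and the flat-swing "liner" case, are assumptions for illustration:

```python
def hit_type(swing_positions):
    # Classify the hit from the vertical component of the bat object
    # 59b's swing trajectory: a downward swing yields a grounder, an
    # upward swing a fly.
    dz = swing_positions[-1][2] - swing_positions[0][2]
    if dz < 0:
        return "grounder"
    if dz > 0:
        return "fly"
    return "liner"
```
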
  • the user can perform batting according to the progress of the baseball game by performing various swings.
  • the finger puppet game is a game in which a character in the game space is moved in accordance with the movement of the user's finger.
  • a controller that can be worn on the user's finger is used instead of the controller 13.
  • the user sets the optical disc 16 on which the finger puppet game program is stored in the optical disc drive 122. Thereby, a finger puppet game is started.
  • the processing in the control unit 121 is processing by executing a finger puppet game program.
  • FIG. 24 is a diagram for explaining the operation method of the finger puppet game.
  • the control unit 121 generates a game space 60a for the finger puppet game in response to the start of the finger puppet game.
  • the control unit 121 generates a finger puppet object 60b and places it in the game space 60a.
  • a finger puppet object 60b is displayed on the screen 15a.
  • the controllers 17a, 17b, and 17c are used to operate the finger puppet object 60b.
  • the user operates the finger puppet object 60b by moving the finger on which the controllers 17a to 17c are mounted on the operation surface 14a.
  • FIG. 25 is an external view of the controller 17.
  • the controllers 17a to 17c are collectively referred to as the controller 17 when common explanations are given.
  • the controller 17 has the shape of a finger cot. The user inserts a finger from below the controller 17 and wears it on the fingertip.
  • a sensor chip 171 is attached to the surface of the controller 17.
  • FIG. 26 is a block diagram showing a functional configuration of the sensor chip 171.
  • the sensor chip 171 includes a position instruction unit 172 and a position instruction signal supply unit 173.
  • the position instruction unit 172 is a resonance circuit including a coil, a capacitor, and the like, like the position instruction unit 134.
  • the position indicating units 172 of the controllers 17a to 17c have mutually different resonance frequencies.
  • the position instruction signal supply unit 173 supplies a current signal (position instruction signal) corresponding to the resonance frequency of the position instruction unit 172 of each controller 17 to the position instruction unit 172. Note that the position instruction unit 172 and the position instruction signal supply unit 173 may be the same resonance circuit.
  • the tablet 14 supplies a sine wave current signal corresponding to one of the controllers 17a to 17c to the X sensor coil 141x and the Y sensor coil 141y.
  • for example, to detect the controller 17a, the tablet 14 supplies a sine wave current signal of f1 (Hz), the resonance frequency of the position indicating unit 172 of the controller 17a, to the X sensor coil 141x and the Y sensor coil 141y.
  • an induced current is generated in the position indicating unit 172 of the controller 17a.
  • No induced current is generated in the position indicating section 172 of the controllers 17b and 17c.
  • the position instruction signal supply unit 173 supplies a position instruction signal of f1 (Hz) to the position instruction unit 172.
  • the tablet 14 receives a resonance signal of f1 (Hz).
  • the tablet 14 determines that the resonance signal of f1 (Hz) received after the supply of the sine wave current signal of f1 (Hz) is transmitted from the controller 17a.
  • the tablet 14 outputs the three-dimensional position information associated with the identification information of the controller 17a to the game device 12. Accordingly, the control unit 121 can determine that the input three-dimensional position information corresponds to the controller 17a.
  • the tablet 14 supplies a sine wave current signal having a resonance frequency corresponding to the controllers 17b and 17c to the X sensor coil 141x and the Y sensor coil 141y.
  • the control part 121 can acquire the three-dimensional position information corresponding to the controllers 17b and 17c.
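The time-division detection described above can be sketched as follows. The frequency values and the `read_position_at` sensor callback are hypothetical stand-ins for the tablet hardware, which drives the sensor coils at one resonance frequency at a time.

```python
# Assumed resonance frequencies (Hz) for the three controllers.
CONTROLLER_FREQS = {"17a": 100_000, "17b": 120_000, "17c": 140_000}

def poll_controllers(read_position_at):
    """Drive the sensor coils at each controller's resonance frequency in
    turn; only the matching position indicating unit resonates, so each
    reading can be tagged with that controller's identification info."""
    positions = {}
    for cid, freq in CONTROLLER_FREQS.items():
        positions[cid] = read_position_at(freq)  # -> (x, y, z) or None
    return positions


# Usage with a stub sensor that echoes the drive frequency:
readings = poll_controllers(lambda f: (f, 0.0, 1.0))
print(sorted(readings))  # → ['17a', '17b', '17c']
```

Each entry of the returned dictionary corresponds to the three-dimensional position information that the tablet 14 outputs together with the controller's identification information.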
  • the controller 17a is attached to the thumb of the right hand of the user.
  • the controller 17b is worn on the index finger of the user's right hand.
  • the controller 17c is worn on the middle finger of the user's right hand.
  • the controllers 17a, 17b, and 17c correspond to the left front leg 60c, the head 60d, and the right front leg 60e of the finger puppet object 60b.
  • the control unit 121 determines that the controller 17a has moved when the three-dimensional position information associated with the controller 17a changes.
  • the control unit 121 moves the left front leg 60c of the finger puppet object 60b according to the change in the three-dimensional position information of the controller 17a.
  • the user can operate the finger puppet object 60b by moving the thumb, index finger, and middle finger of the right hand to which the controller 17 is attached on the operation surface 14a.
  • the user can move the entire finger puppet object 60b.
  • the user moves the controller 17 simultaneously in the same direction by moving the entire right hand on the operation surface 14a.
  • the control unit 121 determines that the controllers 17a to 17c have simultaneously moved in the same direction, and moves the entire finger puppet object 60b. Further, the user can make the finger puppet object 60b appear to walk by moving the thumb, index finger, and middle finger while moving the entire right hand.
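The decision between moving the whole puppet and moving a single body part can be sketched by comparing the displacement of each controller. The (dx, dy, dz) tuples and the tolerance are assumptions for illustration:

```python
def common_motion(deltas, tol=0.05):
    """Return the shared displacement when all controllers moved together
    in (approximately) the same direction and amount, else None.

    `deltas` maps controller id -> (dx, dy, dz); `tol` is an assumed
    per-axis tolerance."""
    vals = list(deltas.values())
    ref = vals[0]
    for d in vals[1:]:
        if any(abs(a - b) > tol for a, b in zip(ref, d)):
            return None  # fingers moved differently: animate individual parts
    return ref           # move the entire finger puppet object


same = {"17a": (1.0, 0.0, 0.0), "17b": (1.0, 0.0, 0.0), "17c": (1.0, 0.02, 0.0)}
print(common_motion(same))  # → (1.0, 0.0, 0.0)
```

When `common_motion` returns a displacement, the whole finger puppet object would be translated by it; otherwise each leg or the head would be moved from its own controller's change.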
  • FIG. 24 shows a deer as the finger puppet object 60b, but the present invention is not limited to this.
  • the user may select an operation target from a plurality of characters.
  • the pottery game is a game in which the pottery can be enjoyed in a pseudo manner by arranging the clay in the game space into a shape such as a teacup.
  • a finger cot type controller 17 is used in the pottery game.
  • the user sets the optical disc 16 storing the pottery game program in the optical disc drive 122. Thereby, the pottery game is started.
  • the process in the control part 121 is a process by execution of the pottery game program.
  • FIG. 27 is a diagram for explaining an operation method of the pottery game.
  • ten finger cot type controllers 17a, 17b, 17c, 17d, 17e, 17f, 17g, 17h, 17i, and 17j are used.
  • the controllers 17a to 17j will be collectively referred to as the controller 17 when they are described in common.
  • the controller 17 is worn on all the fingers of the user.
  • the controller 17 to be worn on the user's finger is designated in advance.
  • Controllers 17a, 17b, 17c, 17d, and 17e are attached to the thumb, index finger, middle finger, ring finger, and little finger of the right hand.
  • Controllers 17f, 17g, 17h, 17i, and 17j are attached to the thumb, index finger, middle finger, ring finger, and little finger of the left hand.
  • the control unit 121 generates a game space 61a for the pottery game in response to the start of the pottery game.
  • a game space 61a is displayed on the left side of the screen 15a.
  • the control unit 121 generates a teacup object 61b, a potter's wheel object 61c, a right hand object 61d, and a left hand object 61e.
  • the potter's wheel object 61c is arranged at a default position in the game space 61a.
  • the teacup object 61b is placed on the potter's wheel object 61c.
  • the control unit 121 converts the three-dimensional position information of each controller 17 into virtual position information.
  • the three-dimensional position information of the controllers 17a to 17e is associated with each finger of the right hand object 61d.
  • the three-dimensional position information of the controllers 17f to 17j is associated with each finger of the left hand object 61e.
  • the control unit 121 determines the positions and shapes of the right hand object 61d and the left hand object 61e based on the virtual position information converted from the three-dimensional position information of each controller 17, and places both objects in the game space 61a.
  • the position of the corresponding finger of the right hand object 61d or the left hand object 61e changes according to the change in the three-dimensional position information of each controller 17.
  • when the controllers 17a to 17c simultaneously move in the same direction, the control unit 121 moves the entire right hand object 61d.
  • the operation menu 61f is displayed on the right side of the screen 15a.
  • the operation menu 61f is used for instructing the rotation start or rotation stop of the potter's wheel object 61c.
  • the user operates the operation menu 61f to rotate the potter's wheel object 61c.
  • the user moves the right hand and the left hand while viewing the teacup object 61b, the right hand object 61d, and the left hand object 61e displayed on the screen 15a.
  • when the control unit 121 determines that the teacup object 61b and the right hand object 61d overlap, the control unit 121 changes the shape of the teacup object 61b.
  • the shape of the teacup object 61b changes according to the overlapping area where the teacup object 61b and the right hand object 61d overlap. The same applies when the teacup object 61b and the left hand object 61e overlap.
  • the user can adjust the shape of the teacup object 61b by moving the right hand and the left hand while rotating the potter's wheel object 61c.
  • the user may arrange the shape of the teacup object 61b in a state where the rotation of the potter's wheel object 61c is stopped. Thereby, the shape of the teacup object 61b can be finely adjusted.
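The overlap-driven deformation can be sketched with a wheel-symmetric model: the teacup is a list of radii per height band, and a hand overlap pushes the touched bands inward. The data shapes and the depth scaling are assumptions, not taken from the text:

```python
def press_profile(radii, touched, depth):
    """Deform a rotationally symmetric teacup profile.

    radii[i] : cup radius at height band i
    touched  : indices where the hand object overlaps the cup
    depth    : inward push, a stand-in for the overlap area"""
    new = list(radii)
    for i in touched:
        new[i] = max(0.0, new[i] - depth)  # clay cannot have negative radius
    return new


cup = [3.0, 3.0, 3.0, 3.0]
print(press_profile(cup, touched=[1, 2], depth=0.5))  # → [3.0, 2.5, 2.5, 3.0]
```

While the potter's wheel object rotates, such a symmetric profile keeps the deformation uniform around the axis, which matches the described behavior of shaping the spinning clay.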
  • the teacup object 61b may be formed by regarding the controller 13 as another device such as a spatula.
  • a selection menu of the tools to be used is displayed under the operation menu 61f. The user operates the controller 13 on the operation surface 14a while regarding the controller 13 as the tool selected from the selection menu.
  • the stationary game system 1 has been described as an example, but the present invention is not limited to this.
  • by incorporating the tablet 14 into a portable game device, the above game programs can be executed on the portable game device as well.
  • FIG. 28 is a diagram showing a portable game device 18 according to a modification of the present embodiment.
  • the game device 18 shown in FIG. 28 includes a liquid crystal panel 181, an A button 182a, a B button 182b, a cross key 182c, a tablet 183, and a stylus pen 184.
  • the liquid crystal panel 181 corresponds to the screen 15a of the television receiver 15 and displays a game image.
  • the A button 182a, the B button 182b, and the cross key 182c correspond to the A button 132a, the B button 132b, and the cross key 132c of the controller 13, respectively.
  • the tablet 183 corresponds to the tablet 14, detects the position of the position instruction unit 134 built in the stylus pen 184, and generates three-dimensional position information.
  • a liquid crystal panel may be disposed on the tablet 14.
  • in that case, the game space of each game can be displayed on the liquid crystal panel arranged on the tablet 14.
  • the billiard table 52b may be displayed on a liquid crystal panel arranged on the tablet 14. Thereby, the user can easily confirm the correspondence between the position of the controller 13 and the position of the cue object 52e.
  • images of the home plate and batter's box may be displayed on the liquid crystal panel arranged on the tablet 14. Thereby, the user can easily confirm the position where the stylus pen 184 is swung.
  • FIG. 29 is an external view of the game apparatus 2 according to the present embodiment.
  • the game apparatus 2 includes liquid crystal panels 21 and 22. On the display surface 21a of the liquid crystal panel 21 and the display surface 22a of the liquid crystal panel 22, an image of the game being played by the user is displayed.
  • an A button 23a, a B button 23b, and a speaker 24 are provided on the right side of the liquid crystal panel 22.
  • a cross key 23c and a speaker 24 are provided on the left side of the liquid crystal panel 22.
  • the A button 23a, the B button 23b, and the cross key 23c are collectively referred to as the operation unit 23.
  • the operation unit 23 is used to input game operation information.
  • the speakers 24 and 24 output the sound of the game that the user is playing.
  • the game device 2 includes a stylus pen 25 as an input device for inputting operation information.
  • a tablet 26 (see FIG. 30) is provided under the liquid crystal panel 22.
  • the user inputs information indicating the position and orientation of the stylus pen 25 as operation information by moving the stylus pen 25 while holding it above the display surface 22a. Therefore, the game apparatus 2 can provide a game using a new operation method in which the user changes the posture of the stylus pen 25 in the three-dimensional space above the display surface 22a.
  • FIG. 30 is a block diagram showing a functional configuration of the game apparatus 2.
  • the game apparatus 2 includes a control unit 20, a tablet 26, and a connector 28 in addition to the liquid crystal panels 21 and 22, the operation unit 23, and the speaker 24.
  • the connector 28 is an interface for connecting the game apparatus 2 and a memory card 29 in which a game program is recorded.
  • the connector 28 is provided at the back of the card slot 27 (see FIG. 29) of the game apparatus 2. The user inserts the memory card 29 into the card slot 27 to connect the memory card 29 and the connector 28. Thereby, the user can enjoy various games.
  • the control unit 20 includes a CPU, a RAM, and the like, and performs overall control of the game apparatus 2.
  • the control unit 20 accesses the memory card 29 connected to the connector 28 and executes the game program recorded on the memory card 29.
  • the tablet 26 is an electromagnetic induction type tablet, and detects the position and orientation of the stylus pen 25.
  • FIG. 31 is a diagram showing the positional relationship between the liquid crystal panel 22 and the tablet 26 when the game apparatus 2 is viewed from the card slot 27 side. In FIG. 31, the components of the game apparatus 2 other than the housing 2a, the liquid crystal panel 22, and the tablet 26 are not shown.
  • the liquid crystal panel 22 and the tablet 26 are built into the housing 2a of the game apparatus 2.
  • the tablet 26 is disposed below the liquid crystal panel 22.
  • the detection surface 26a (see FIG. 33) on which the tablet 26 can detect the position of the stylus pen 25 and the display surface 22a of the liquid crystal panel 22 overlap each other.
  • FIG. 32 is a block diagram showing a functional configuration of the stylus pen 25.
  • the stylus pen 25 includes a position instruction unit 251, a position instruction unit 252, and a position instruction signal supply unit 253.
  • the position indicating units 251 and 252 are resonance circuits including coils and capacitors.
  • the position instruction signal supply unit 253 supplies a position instruction signal to the position instruction units 251 and 252.
  • the position instruction signal is a current signal used by the position instruction units 251 and 252 to transmit a resonance signal to the tablet 26.
  • FIG. 33 is a diagram for explaining the detection of the positions of the position instruction units 251 and 252.
  • the tablet 26 can detect the respective positions of the position instruction units 251 and 252. In other words, the tablet 26 can detect the position of the stylus pen 25 as long as the user moves the stylus pen 25 above the display surface 22a.
  • the position indicating unit 251 includes a coil 251a and a capacitor 251b, and is arranged on the pen tip side.
  • the position indicating unit 252 includes a coil 252a and a capacitor 252b, and is disposed at the center of the stylus pen 25.
  • the position indicating units 251 and 252 are arranged so that the central axis p of the stylus pen 25 passes.
  • the resonance frequencies of the position indicating units 251 and 252 are different from each other.
  • the resonance frequencies of the position indicating units 251 and 252 are set to f3 (Hz) and f4 (Hz), respectively.
  • the X axis, the Y axis, and the Z axis are defined with the lower left corner of the detection surface 26a as the origin.
  • the detection surface 26a corresponds to the XY plane.
  • the Z axis is a direction perpendicular to the detection surface 26a.
  • the X sensor coil 26x and the Y sensor coil 26y are indicated by solid lines.
  • the X sensor coil 26x is installed along the Y-axis direction and is used to detect the X coordinates of the position indicating units 251 and 252.
  • the Y sensor coil 26y is installed along the X-axis direction and is used to detect the Y coordinates of the position indicating units 251 and 252.
  • the tablet 26 detects the position of the position instruction unit 251
  • the X sensor coil 26x and the Y sensor coil 26y have a circuit configuration in which f3 (Hz) and f4 (Hz) are resonance frequencies.
  • the tablet 26 supplies a sine wave signal of f3 (Hz) to the X sensor coil 26x and the Y sensor coil 26y.
  • the magnetic field on the detection surface 26a changes and an induced current flows through the position indicating unit 251. Since the resonance frequency of the position indicating unit 252 is f4 (Hz), no induced current is generated in the position indicating unit 252.
  • the position instruction signal supply unit 253 supplies a position instruction signal to the position instruction unit 251 when an induced current flows through the position instruction unit 251.
  • the magnetic field around the position instruction unit 251 changes, and a resonance signal of f3 (Hz) is generated around the coil 251a.
  • the X sensor coil 26x and the Y sensor coil 26y receive a resonance signal. Since the frequency of the resonance signal is f3 (Hz), the tablet 26 determines that the resonance signal has been received from the position indicating unit 251.
  • the tablet 26 specifies the X coordinate and the Y coordinate of the position instruction unit 251 based on the received resonance signal.
  • the Z coordinate of the position indicating unit 251 is specified based on the waveform of the received resonance signal.
  • the tablet 26 generates three-dimensional position information (first position information) of the position instruction unit 251 based on the specified X coordinate, Y coordinate, and Z coordinate.
  • the tablet 26 generates three-dimensional position information (second position information) of the position instruction unit 252 by supplying a sine wave signal of f4 (Hz) to the X sensor coil 26x and the Y sensor coil 26y.
  • the tablet 26 outputs the first position information and the second position information as three-dimensional position information indicating the position of the stylus pen 25.
  • the tablet 26 detects the inclination of the central axis p with respect to the detection surface 26a based on the first position information and the second position information.
  • the tablet 26 outputs information indicating the detected tilt as tilt information.
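The tilt of the central axis p follows from elementary geometry on the two detected points; the text does not specify the tablet's exact computation, so the following is a sketch using the angle between the axis vector and the XY plane:

```python
import math

def axis_tilt_deg(p1, p2):
    """Angle between the pen's central axis and the detection surface,
    from the 3-D positions of the position indicating units 251 and 252
    (the first and second position information)."""
    dx, dy, dz = (b - a for a, b in zip(p1, p2))
    horizontal = math.hypot(dx, dy)          # projection onto the XY plane
    return math.degrees(math.atan2(abs(dz), horizontal))


print(axis_tilt_deg((0, 0, 0), (0, 0, 5)))  # pen standing upright: ~90 degrees
print(axis_tilt_deg((0, 0, 0), (3, 0, 3)))  # pen leaning: ~45 degrees
```

The pair (first position information, tilt) is exactly the information the tablet 26 is described as outputting downstream to the game logic.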
  • the stylus pen 25 may include three position instruction units including position instruction units 251 and 252.
  • the three position indicating units are arranged to form a plane.
  • the tablet 26 specifies a plane including the three position instruction units based on the three-dimensional position information of the three position instruction units.
  • the tablet 26 detects the rotation angle of the stylus pen 25 based on the specified plane, and outputs information indicating the rotation angle as rotation information.
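With three position indicating units, rotation about the central axis can be tracked through the normal of the plane they span. The text does not detail the method, so this is a sketch using a cross product of two edge vectors:

```python
def plane_normal(p0, p1, p2):
    """Normal vector of the plane through the three position indicating
    units; following how this normal turns over time yields the rotation
    information described in the text."""
    u = [b - a for a, b in zip(p0, p1)]
    v = [b - a for a, b in zip(p0, p2)]
    # Cross product u x v.
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])


print(plane_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # → (0, 0, 1)
```

Comparing the normal at successive samples against a reference orientation gives the rotation angle the tablet outputs as rotation information.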
  • the position instruction units 251 and 252 and the position instruction signal supply unit 253 are shown as separate components.
  • the position instruction unit 251 and the position instruction signal supply unit 253 may be the same resonance circuit.
  • the capacitor 251b accumulates electric charge according to the induced current generated in the position indicating unit 251.
  • the position instruction signal is supplied to the coil 251a using the electric charge accumulated in the capacitor 251b.
  • the position instruction unit 252 and the position instruction signal supply unit 253 may be the same resonance circuit.
  • the shooting game is a game in which a bullet is fired from a handgun to defeat an enemy character.
  • the user connects the memory card 29 storing the shooting game program to the connector 28. Thereby, a shooting game is started.
  • the processing in the control unit 20 is processing by executing a shooting game program.
  • FIG. 34 is a diagram for explaining the operation method of the shooting game.
  • FIG. 34 shows only the liquid crystal panels 21 and 22 and the tablet 26 among the components of the game apparatus 2.
  • in response to the start of the shooting game, the control unit 20 generates a game space 62a for the shooting game.
  • the game space 62a is a virtual three-dimensional space.
  • the control unit 20 generates a handgun object 62b and enemy objects 62c, 62c, ...
  • the handgun object 62b is an operation target of the user.
  • the handgun object 62b advances at a constant speed in a predetermined direction in the game space 62a together with a character object (not shown) that holds the handgun object 62b.
  • the enemy objects 62c, 62c,... Are randomly arranged in the game space 62a.
  • the handgun object 62b is arranged in the game space 62a based on the first position information and the tilt information. Specifically, the muzzle 62d of the handgun object 62b corresponds to the position instruction unit 251.
  • the control unit 20 converts the first position information into position information (first virtual position information) of the game space 62a.
  • the position of the muzzle 62d is determined based on the first virtual position information.
  • the barrel 62e of the handgun object 62b is arranged in the direction opposite to the traveling direction of the handgun object 62b with the muzzle 62d as a reference.
  • the control unit 20 displays the game space 62a on the display surfaces 21a and 22a.
  • a moving image of a character object (not shown) is displayed.
  • an image in the direction of the feet of the player character (not shown) holding the handgun object 62b is displayed.
  • the handgun object 62b is displayed on one of the display surfaces 21a and 22a based on the first position information and the tilt information.
  • the aiming 62f of the handgun object 62b is displayed together with the handgun object 62b on either of the display surfaces 21a and 22a.
  • the aim 62f is displayed based on the tilt information of the stylus pen 25.
  • the control unit 20 converts the straight line corresponding to the central axis p into a virtual straight line corresponding to the game space 62a using the first position information and the inclination information.
  • the aim 62f is displayed in the direction in which the virtual straight line extends from the muzzle 62d.
  • the control unit 20 converts the input information into information corresponding to the game space 62a (the first virtual position information and information indicating the virtual straight line).
  • the control unit 20 changes the positions of the handgun object 62b and the aim 62f based on the converted information.
  • the user can move the handgun object 62b and the aim 62f by moving the stylus pen 25 on the display surface 22a or changing the direction.
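Placing the aim 62f on the virtual straight line can be sketched as a ray cast from the muzzle along the converted central-axis direction. The fixed aiming distance is an assumption for illustration:

```python
def aim_point(muzzle, axis_dir, distance=10.0):
    """Point on the virtual straight line at `distance` game-space units
    from the muzzle 62d, in the direction of the pen's central axis
    (both already converted into game-space coordinates)."""
    norm = sum(c * c for c in axis_dir) ** 0.5
    return tuple(m + distance * c / norm for m, c in zip(muzzle, axis_dir))


print(aim_point((0.0, 0.0, 0.0), (0.0, 0.0, 2.0), distance=5.0))  # → (0.0, 0.0, 5.0)
```

Moving or tilting the stylus pen changes `muzzle` and `axis_dir`, which is why both the handgun object and the aim follow the pen in real time.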
  • when the stylus pen 25 is directed in the moving direction of the player character (the direction in which the liquid crystal panel 21 is disposed), the handgun object 62b and the aim 62f are displayed on the display surface 21a.
  • when the stylus pen 25 is facing the display surface 22a, the handgun object 62b and the aim 62f are displayed on the display surface 22a.
  • the user can fire a bullet from the handgun object 62b by pressing the A button 23a.
  • when the A button 23a is pressed at the timing when the aim 62f and the enemy object 62c overlap, the fired bullet hits the enemy object 62c. Thereby, the enemy object 62c can be defeated.
  • the game apparatus 2 can provide a shooting game using a new operation method in which the aim 62f is moved by changing the direction of the stylus pen 25.
  • the action game is a game in which a character object that is a user's operation target is moved to avoid an obstacle in the game space or to defeat an enemy in the game space.
  • the user plays an action game using a doll resembling a player object instead of the stylus pen 25.
  • the user connects the memory card 29 storing the action game program to the connector 28. Thereby, an action game is started.
  • the process in the control part 20 is a process by execution of an action game program.
  • FIG. 35 is a diagram showing an example of a doll 25A used in an action game. Similar to the stylus pen 25, the duck doll 25A shown in FIG. 35 includes position instruction units 251 and 252 and a position instruction signal supply unit 253. The position indicating unit 251 is embedded near the base of the leg of the doll 25A. The position instruction unit 252 is embedded in the head of the doll 25A. The position instruction signal supply unit 253 is embedded near the neck of the doll 25A. In the doll 25A, a straight line passing through the position instruction units 251 and 252 is defined as a straight line k. The straight line k corresponds to the central axis p of the stylus pen 25. The user plays the action game by placing the doll 25A on the display surface 22a.
  • FIG. 36 is a diagram for explaining an action game operation method.
  • FIG. 36 shows only the liquid crystal panels 21 and 22 and the tablet 26 among the components of the game apparatus 2. The same applies to FIGS. 37 and 38 to be described later.
  • in response to the start of the action game, the control unit 20 generates a game space 63a for the action game.
  • the control unit 20 generates a character object 63b as a user operation target in the action game.
  • the character object 63b is arranged at a default position in the game space 63a.
  • the control unit 20 displays the game space 63a on the display surface 21a. As shown in FIG. 36, since the doll 25A is placed on the display surface 22a, the game space 63a is not displayed on the display surface 22a.
  • the user operates the character object 63b by changing the posture of the doll 25A on the display surface 22a. For example, when the doll 25A is simply placed on the display surface 22a, the straight line k is perpendicular to the display surface 22a. At this time, the control unit 20 stops the character object 63b in the game space 63a.
  • FIG. 37 is a diagram showing the posture of the doll 25A when the character object 63b is advanced.
  • the user tilts the doll 25A forward so that the head of the doll 25A is tilted in a direction in which the A button 23a is present (rightward in FIG. 36).
  • the character object 63b moves forward in the game space 63a.
  • when the angle (forward tilt angle) θ between the straight line k and the display surface 22a reaches an angle that instructs the character object 63b to move forward (for example, 85 degrees to 30 degrees), the character object 63b starts moving forward.
  • the speed at which the character object 63b moves forward may be changed according to the forward tilt angle θ.
  • the control unit 20 increases the speed of the character object 63b as the forward tilt angle θ decreases.
  • the video displayed on the display surface 21a changes according to the advance of the character object 63b.
  • for example, a tree 63c located in front of the character object 63b (on the right side of the display surface 21a) moves behind the character object 63b (to the left side of the display surface 21a) as the character object 63b advances.
  • the control unit 20 performs a process of crouching the character object 63b when the forward tilt angle θ is smaller than 30 degrees. Specifically, the control unit 20 stops the character object 63b and causes the display surface 21a to display an image in which the character object 63b is crouching while covering its head with a wing. For example, when an enemy object (not shown) is above the character object 63b, the user performs the operation of crouching the character object 63b.
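The mapping from the forward tilt angle θ to the character's behaviour can be sketched with the example thresholds given in the text (85 and 30 degrees); the speed scaling inside the walking range is an assumption:

```python
def doll_action(theta_deg):
    """Map the forward tilt angle between the straight line k and the
    display surface 22a to the character object's behaviour."""
    if theta_deg > 85:                   # nearly perpendicular: stand still
        return ("stop", 0.0)
    if theta_deg >= 30:                  # example range 85..30 degrees: walk
        speed = (85 - theta_deg) / 55.0  # smaller angle -> faster (assumed)
        return ("forward", round(speed, 2))
    return ("crouch", 0.0)               # below 30 degrees: duck and cover


print(doll_action(90))    # → ('stop', 0.0)
print(doll_action(57.5))  # mid-lean: about half speed
print(doll_action(20))    # → ('crouch', 0.0)
```

The thresholds would in practice come from the tilt information the tablet 26 computes from the two position indicating units embedded in the doll 25A.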
  • FIG. 38 is a diagram showing the posture of the doll 25A when the character object 63b is flying in the game space 63a.
  • the character object 63b can fly in the game space 63a. This operation is performed when the obstacle 63d provided in the game space 63a is avoided.
  • the control unit 20 monitors the distance between the doll 25A and the display surface 22a in real time based on the first position information (corresponding to the position instruction unit 251). When the distance between the doll 25A and the display surface 22a is greater than a predetermined distance, the control unit 20 performs a process of flying the character object 63b in the game space 63a. The height at which the character object 63b flies is determined based on the first virtual position information converted from the first position information. On the display surface 21a, an image of the character object 63b flying while flapping its wings is displayed.
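The lift-to-fly behaviour can be sketched as a threshold on the Z coordinate of the first position information; the threshold and the game-space scaling are assumed values, not taken from the text:

```python
def flight_height(z, lift_threshold=2.0, scale=3.0):
    """Game-space flight height when the doll 25A is lifted higher than
    `lift_threshold` above the display surface 22a; 0.0 means the
    character object stays grounded."""
    return scale * z if z > lift_threshold else 0.0


print(flight_height(1.0))  # → 0.0  (doll resting near the surface)
print(flight_height(3.0))  # → 9.0  (doll lifted: the character flies)
```

Scaling from the raw Z coordinate to a game-space height corresponds to the conversion from first position information to first virtual position information described above.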
  • the character object 63b can be operated by changing the posture of the doll 25A. Therefore, the user can enjoy the action game with a new operation method that has never been seen before.
  • the stylus pen 25 may be used in place of the doll 25A to operate the character object 63b.
  • in that case, the user stands the stylus pen 25 vertically with respect to the display surface 22a to stop the character object 63b, and tilts it in the direction in which the A button 23a is present to move the character object 63b forward. Since the user does not have to prepare the doll 25A in which the position instruction units 251 and 252 are built, the user can easily enjoy the action game.
  • the screwdriver game is a game in which a tire is fixed to an automobile by turning screws with a virtual screwdriver.
  • the stylus pen 25 is regarded as a screwdriver. The user can turn the virtual screwdriver by rotating the stylus pen 25 about the central axis p.
  • the user connects the memory card 29 storing the screwdriver game program to the connector 28. Thereby, a screwdriver game is started.
  • the processing in the control unit 20 is processing by executing a screwdriver game program.
  • FIG. 39 is a diagram showing an initial screen of the screwdriver game.
  • FIG. 39 shows only the liquid crystal panels 21 and 22 and the tablet 26 among the components of the game apparatus 2. The same applies to FIG. 40 described later.
  • in response to the start of the screwdriver game, the control unit 20 generates a game space 64a for the screwdriver game.
  • the control unit 20 generates a tire object 64b and a car object 64c.
  • the car object 64c is arranged at a default position in the game space 64a.
  • the tire object 64b is attached in a state where it is not fixed to the automobile object 64c.
  • a game space 64a including a tire object 64b is displayed on the display surface 21a.
  • a list of screw objects 64d, 64e, 64f, and 64g is displayed on the display surface 22a.
  • the screw objects 64d to 64g have different shaft lengths, shaft diameters, and thread intervals.
  • Each screw object corresponds to any one of screw holes 641b, 642b, 643b, and 644b formed in the tire object 64b.
  • the user fixes the tire object 64b to the automobile object 64c by tightening the screw objects 64d to 64g in the screw holes 641b to 644b.
  • FIG. 40 is a diagram for explaining the operation method of the screwdriver game.
  • the screw objects 64d to 64g displayed on the display surface 22a are omitted.
  • the user uses the stylus pen 25 to touch the screw object 64d displayed on the display surface 22a to select the screw object 64d.
  • the control unit 20 newly generates a driver object 64h, and displays an image with the screw object 64d attached to the tip of the driver object 64h on the display surface 21a.
  • the position indicating unit 251 of the stylus pen 25 corresponds to the tip of the driver object 64h.
  • the position instruction unit 252 corresponds to the handle of the driver object 64h.
  • the control unit 20 arranges the driver object 64h in the game space 64a based on the two virtual position information respectively converted from the first position information and the second position information.
  • the user moves the stylus pen 25 on the display surface 22a to move the screw object 64d to the position of the screw hole 641b.
  • the user rotates the stylus pen 25 about the central axis p in a state where the positions of the screw object 64d and the screw hole 641b are matched.
  • the control unit 20 detects a change in the rotation angle of the stylus pen 25 based on the rotation information input from the tablet 26 in real time.
  • the driver object 64h rotates according to the change in the rotation angle.
  • if the selected screw object does not correspond to the screw hole, the control unit 20 reproduces friction sound data in which a metallic friction sound is recorded. By reproducing the friction sound data, the user can recognize that the screw object 64d does not correspond to the screw hole 641b. The user repeats the operation of selecting a new screw object from the screw objects 64e to 64g and tightening the screw. By tightening all the screws into the screw holes 641b to 644b, the tire object 64b can be fixed to the automobile object 64c.
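The screw-matching behaviour can be sketched as follows. The (length, diameter, pitch) tuples and the pitch-per-turn advance rule are assumed data shapes for illustration, not taken from the text:

```python
def turn_screw(screw, hole, rotation_deg):
    """Advance the screw only when it matches the hole.

    Returns ('friction_sound', 0.0) on a mismatch (the metallic friction
    sound is played), else ('tightening', advance) where one full turn
    of the stylus pen advances the screw by one thread pitch."""
    if screw != hole:
        return ("friction_sound", 0.0)
    length, _diameter, pitch = screw
    advance = pitch * rotation_deg / 360.0
    return ("tightening", min(advance, length))  # stop when fully seated


m4 = (10.0, 4.0, 1.5)                         # hypothetical screw spec
print(turn_screw(m4, m4, 720))                # two turns → ('tightening', 3.0)
print(turn_screw(m4, (8.0, 3.0, 1.0), 720))   # wrong hole → ('friction_sound', 0.0)
```

Here `rotation_deg` would be accumulated from the rotation information the tablet 26 reports as the user twists the stylus pen about the central axis p.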
  • the game apparatus 2 can thus provide a game that uses a novel operation method not available before.
  • the Chinese cuisine game is a game for virtually creating Chinese cuisine.
  • the stylus pen 25 is regarded as a cooking utensil such as a wok or a ladle. The user virtually cooks by moving the stylus pen 25 on the display surface 22a.
  • the user connects the memory card 29 storing the Chinese food game program to the connector 28. Thereby, the Chinese cooking game is started.
  • the process in the control part 20 is a process by execution of a Chinese food game program.
  • FIG. 41 is a diagram for explaining the operation method of the Chinese cuisine game.
  • FIG. 41 shows only the liquid crystal panels 21 and 22 and the tablet 26 among the components of the game apparatus 2. The same applies to FIG. 42 described later.
  • the control unit 20 generates a game space 65a for the Chinese cuisine game in response to the start of the Chinese cuisine game.
  • the control unit 20 generates a wok object 65b, a ladle object 65c, and container objects 65d, 65e, and 65f.
  • the container objects 65d, 65e, and 65f are virtual containers in which water, soy sauce, and chili oil are placed.
  • a game space 65a in which a wok object 65b, a ladle object 65c, and container objects 65d to 65f are arranged is displayed.
  • An item selection menu 65g is displayed on the right side of the display surface 21a. The user selects a cooking utensil to be used using the item selection menu 65g.
  • the video of the game is not displayed on the display surface 22a.
  • the item selection menu 65g may be displayed on the display surface 22a.
  • the control unit 20 arranges the ladle object 65c in the game space 65a based on the first virtual position information and the second virtual position information.
  • the position instruction unit 251 of the stylus pen 25 corresponds to the hemispherical (bowl) portion of the ladle object 65c.
  • the position instruction unit 252 corresponds to the handle portion of the ladle object 65c.
  • the wok object 65b and the container objects 65d to 65f are arranged at default positions in the game space 65a.
  • the user can put water, soy sauce, and chili oil into the wok object 65b by operating the ladle object 65c.
  • the user moves the stylus pen 25 on the display surface 22a.
  • the control unit 20 moves the ladle object 65c based on changes in the first position information and the second position information input in real time.
  • the user operates the stylus pen 25 to dip the hemispherical portion of the ladle object 65c into the container object 65d. Thereby, water can be scooped into the hemispherical portion.
  • the user moves the ladle object 65c containing water over the wok object 65b. At this time, the user should keep the stylus pen 25 upright with respect to the display surface 22a so that water does not spill from the ladle object 65c.
  • FIG. 42 is a diagram illustrating a state in which water enters the wok object 65b.
  • the user moves the ladle object 65c over the wok object 65b and tilts the stylus pen 25.
  • the control unit 20 changes the tilt of the ladle object 65c in accordance with the change in the tilt information input in real time. Thereby, the user can move the water contained in the ladle object 65c into the wok object 65b.
  • when the ladle object 65c is tilted over the wok object 65b, the control unit 20 judges that water has entered the wok object 65b.
  • In order to indicate that water has entered the wok object 65b, the control unit 20 reproduces cooking sound data in which the sound generated when water enters a heated wok (a boiling sound) is recorded as a sound effect.
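The pouring judgment above (ladle over the wok, stylus tilted) can be illustrated with a toy routine. The contact radius and tilt threshold are assumed values for illustration, not figures from the patent:

```python
def pour_check(ladle_xy, wok_xy, tilt_deg, radius=1.0, tilt_threshold=45.0):
    """Water moves from the ladle object to the wok object only when the
    ladle is above the wok and tilted past the threshold; the control unit
    would then also play the boiling-sound effect."""
    dx = ladle_xy[0] - wok_xy[0]
    dy = ladle_xy[1] - wok_xy[1]
    over_wok = (dx * dx + dy * dy) ** 0.5 <= radius
    return "pour_water" if over_wok and tilt_deg >= tilt_threshold else "hold_water"

# Tilting the ladle away from the wok, or holding it level, pours nothing.
assert pour_check((0.2, 0.0), (0.0, 0.0), 60.0) == "pour_water"
assert pour_check((5.0, 0.0), (0.0, 0.0), 60.0) == "hold_water"
```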
  • the user can put soy sauce or chili oil into the wok object 65b by selecting "soy sauce" or "chili oil" in the item selection menu 65g.
  • by adding Chinese food ingredients such as tofu and minced meat, the user can experience the cooking of mapo tofu in a pseudo manner.
  • the user can also simulate the operation of striking the wok with the ladle by operating the ladle object 65c.
  • when the user moves the stylus pen 25, the ladle object 65c moves in the game space 65a.
  • the control unit 20 monitors in real time whether or not the ladle object 65c contacts the wok object 65b.
  • when the ladle object 65c contacts the wok object 65b, the control unit 20 reproduces sound data in which the sound generated when metals collide (a metallic sound) is recorded as a sound effect. Thereby, the user can experience the operation of striking the wok with the ladle.
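The real-time contact monitoring above reduces to a distance test in the game space; a minimal sketch, with an assumed contact radius:

```python
import math

def ladle_hits_wok(ladle_xyz, wok_xyz, contact_dist=0.2):
    """True when the ladle object touches the wok object; on contact the
    control unit plays the metallic collision sound."""
    return math.dist(ladle_xyz, wok_xyz) <= contact_dist
```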
  • the user can operate the wok object by selecting “Chinese wok” from the item selection menu 65g.
  • the stylus pen 25 corresponds to the handle of the wok object 65b.
  • the position instruction unit 251 corresponds to the handle attachment portion of the wok object 65b.
  • the wok object 65b moves in the game space 65a in accordance with changes in the first position information and the second position information of the stylus pen 25.
  • the stylus pen 25 moves up and down on the display surface 22a, and the wok object 65b moves up and down in the game space 65a.
  • when the control unit 20 determines that the stylus pen 25 has moved up and down at a certain speed or higher, it displays an image in which the food contained in the wok object 65b jumps out of the wok object 65b. Thereby, the user can experience the operation of tossing the ingredients in the wok.
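The toss described above, triggered when the stylus pen moves up and down at a certain speed or higher, can be sketched as a threshold test; the threshold value and units are assumptions:

```python
def toss_event(delta_height, delta_time, speed_threshold=1.0):
    """Detect the wok toss: the food jumps out of the wok object when the
    vertical speed of the stylus pen exceeds the threshold."""
    speed = abs(delta_height) / delta_time
    return "food_jumps" if speed >= speed_threshold else "no_event"
```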
  • the user can artificially make Chinese food with a more realistic feeling by moving the stylus pen 25 in the three-dimensional space on the display surface 22a.
  • FIG. 43 is an overall view of a game system 200 including a stationary game apparatus 210 according to a modification of the present embodiment.
  • the game device 210 uses a controller 212 and a tablet 213 as input devices for operation information.
  • the controller 212 and the tablet 213 can communicate with the game apparatus 210 wirelessly.
  • the game device 210 accesses an optical disc (not shown) set in the optical disc drive 210a, executes a game program, and outputs video and audio of the game to the television receiver 270.
  • the controller 212 transmits information input by the user to the game apparatus 210 as game operation information.
  • the controller 212 includes an A button 212a, a B button 212b, and a cross key 212c.
  • the user inputs operation information using each button of the controller 212.
  • the controller 212 includes position instruction units 251 and 252 (see FIGS. 32 and 33) in addition to the buttons.
  • the position instruction unit 251 is provided at the end of the controller 212 on the B button 212b side.
  • the position instruction unit 252 is provided at the end of the controller 212 on the cross key 212c side.
  • the tablet 213 is a device corresponding to the tablet 26 of the game device 2. Similar to the stylus pen 25, the user moves the controller 212 above the operation surface 213a of the tablet 213. The tablet 213 detects the positions of the position instruction units 251 and 252 in the three-dimensional space above the operation surface 213a. The tablet 213 transmits information indicating the position and orientation of the controller 212 to the game device 210 as operation information.
  • the user can enjoy each of the above games on the stationary game device 210 by moving the controller 212 on the tablet 213.
  • a more realistic game can be enjoyed by using a dedicated controller corresponding to each game.
  • a dedicated controller in the shape of a handgun.
  • a dedicated controller having a shape such as a Chinese wok or a ladle can be used.
  • FIG. 44 is an overall view of the game system 3 according to the present embodiment.
  • the game system 3 includes a game device 32, a main controller 40, and a sub controller 49.
  • the game device 32 is a processing device that performs overall control of the game system 3, and executes a game program recorded on an optical disc.
  • the game apparatus 32 generates image data for displaying the video of the game on the screen 35 a of the television receiver 35, and outputs the image data to the television receiver 35.
  • the main controller 40 and the sub controller 49 are devices for inputting information (operation information) for a user to operate a player object or the like of a game.
  • the main controller 40 receives the operation information input to the sub-controller 49 by wired communication.
  • the operation information input to the main controller 40 and the sub controller 49 is transmitted from the main controller 40 to the game apparatus 32 by wireless communication.
  • the main controller 40 detects the relative position of the sub-controller 49 with reference to its own device.
  • Information indicating the relative position of the sub-controller 49 includes distance information indicating the distance between the main controller 40 and the sub-controller 49, and direction information indicating the direction of the sub-controller 49 viewed from the main controller 40.
  • the relative position information is used as operation information as will be described later.
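The relative position information (distance D plus a direction) can be pictured as a small data structure; the sketch below is a simplified 2-D reading, with names and layout assumed since the patent prescribes no data format:

```python
import math
from dataclasses import dataclass

@dataclass
class RelativePosition:
    distance: float     # distance D between the two controllers
    azimuth_deg: float  # direction of the sub-controller seen from the main controller

    def to_offset(self):
        """Planar offset of the sub-controller from the main controller."""
        a = math.radians(self.azimuth_deg)
        return (self.distance * math.cos(a), self.distance * math.sin(a))

# A sub-controller 2 units away, straight along the reference direction:
assert RelativePosition(2.0, 0.0).to_offset() == (2.0, 0.0)
```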
  • the user, for example, plays a game while holding the main controller 40 in the right hand and the sub-controller 49 in the left hand.
  • the game apparatus 32 can control the movement and posture of the player object based on the positional relationship between both hands of the user by using the relative position information as the operation information. Therefore, since the game system 3 can advance a game according to a user's movement, it can provide a game with a higher sense of presence.
  • FIG. 45 is a block diagram of the game apparatus 32.
  • the game apparatus 32 includes a control unit 321, an optical disc drive 322, an output unit 323, and a wireless communication unit 324.
  • the control unit 321 includes a CPU and a RAM, and executes various game programs.
  • the control unit 321 accesses the optical disc 36 set in the optical disc drive 322 and executes a game program recorded on the optical disc 36.
  • the user can enjoy various games by setting the optical disc 36 on which the game program is recorded in the optical disc drive 322.
  • the output unit 323 outputs game video and game audio to the television receiver 35 connected to the game apparatus 32.
  • the wireless communication unit 324 performs wireless communication between the game apparatus 32 and the main controller 40.
  • Bluetooth (registered trademark) or the like can be used for the wireless communication.
  • FIG. 46 is an external view of the main controller 40 and the sub-controller 49.
  • the main controller 40 and the sub controller 49 are connected via a communication cable 490 provided in the sub controller 49.
  • the operation information input to the sub controller 49 is transmitted to the main controller 40 via the communication cable 490.
  • the main controller 40 transmits the operation information input to the own device and the sub controller 49 to the game device 32 by wireless communication.
  • the main controller 40 has a rectangular parallelepiped shape, and includes an A button 42a, a B button 42b, a cross button 42c, a start button 42d, and a select button 42e.
  • the user inputs operation information by pressing the A button 42a, B button 42b, cross button 42c, start button 42d, or select button 42e (hereinafter collectively referred to as the operation unit 42; see FIG. 47A).
  • a surface on which the operation unit 42 is disposed is a front surface 40a.
  • a surface opposite to the front surface 40a is defined as a back surface 40b.
  • the side surface on the cross button 42c side is the upper surface 40c
  • the side surface on the B button 42b side is the lower surface 40d.
  • the lower surface 40d is provided with a connector 30 that is connected to the communication cable 490.
  • the side surface on the start button 42d side is defined as the left surface 40e, and the side surface on the select button 42e side as the right surface 40f.
  • the X axis is a central axis in the longitudinal direction of the main controller 40.
  • the direction from the left surface 40e to the right surface 40f is defined as the Y-axis direction.
  • the direction from the back surface 40b to the front surface 40a is taken as the Z-axis direction.
  • the sub-controller 49 has an egg shape and includes a C button 492c and a communication cable 490. The user inputs operation information by pressing the C button 492c.
  • FIG. 47A is a block diagram showing a functional configuration of the main controller 40.
  • the main controller 40 includes a control unit 41, an operation unit 42, a wireless communication unit 43, a wired communication unit 44, a position detection unit 45, an acceleration sensor 46, and a vibration unit 47.
  • the control unit 41 performs overall control of the main controller 40.
  • the operation unit 42 corresponds to the various buttons shown in FIG. 46.
  • the wireless communication unit 43 transmits operation information of the main controller 40 and the sub controller 49 to the game device 32 by performing wireless communication with the game device 32.
  • the operation information of the main controller 40 includes relative position information, information input to the operation unit 42 (see FIG. 47B), and information detected by the acceleration sensor 46.
  • the wired communication unit 44 acquires operation information input to the sub-controller 49 by performing wired communication with the sub-controller 49 via the communication cable 490.
  • the position detection unit 45 is a sensor coil or the like provided on each surface of the main controller 40, and detects the position of the position instruction unit 494 (see FIG. 47B) provided on the sub-controller 49.
  • the acceleration sensor 46 is a triaxial acceleration sensor, and detects the acceleration of the main controller 40 and the inclination of the main controller 40 with respect to the ground surface.
  • the vibration unit 47 vibrates the entire main controller 40 in accordance with an instruction from the control unit 41.
  • FIG. 47B is a block diagram showing a functional configuration of the sub-controller 49.
  • the sub-controller 49 includes a control unit 491, an operation unit 492, a wired communication unit 493, a position instruction unit 494, and an acceleration sensor 495.
  • the control unit 491 performs overall control of the sub-controller 49.
  • the operation unit 492 corresponds to the C button 492c illustrated in FIG. 46.
  • the sub-controller 49 may include buttons other than the C button 492c (such as a cross button) as the operation unit 492.
  • the wired communication unit 493 transmits operation information of the sub controller 49 to the main controller 40 via the communication cable 490.
  • the operation information of the sub-controller 49 includes information indicating that the C button 492c is pressed and information detected by the acceleration sensor 495.
  • the position instruction unit 494 transmits a resonance signal (position notification signal) for notifying the position of the sub-controller 49 to the main controller 40.
  • the acceleration sensor 495 is a three-axis acceleration sensor similar to the acceleration sensor 46, and detects the acceleration of the sub controller 49 and the inclination of the sub controller 49 with respect to the ground surface.
  • the main controller 40 and the sub controller 49 may further include a gyro sensor. Thereby, the main controller 40 and the sub controller 49 can detect the inclination with respect to the ground surface more accurately.
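Estimating a controller's inclination with respect to the ground from a three-axis acceleration sensor is a standard technique; the patent gives no formulas, so the sketch below simply reads the tilt off the gravity vector measured while the controller is held still:

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Angle (degrees) between the controller's Z axis and the vertical,
    from a static 3-axis accelerometer reading dominated by gravity."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        raise ValueError("no gravity reading")
    # Clamp to guard against rounding pushing the ratio outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, az / g))))
```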
  • FIG. 48 is a diagram for explaining the principle of detecting the relative position of the sub-controller 49. It is defined that the origin O serving as a reference for the relative position is on the upper surface 40c. In FIG. 48, a sensor coil described later is indicated by a straight line.
  • the principle by which the main controller 40 detects the relative position of the sub-controller 49 will be described with reference to FIG. 48.
  • the sub-controller 49 is located on the upper left side (position facing the front surface 40a, the upper surface 40c, and the left surface 40e) of the main controller 40 on the front side in the drawing.
  • the position indicating unit 494 includes a resonance circuit including a coil 494a and a capacitor 494b.
  • the resonance frequency of the position indicating unit 494 is f5 (Hz).
  • a position detection unit 45 is provided inside each surface of the main controller 40.
  • a position detection unit 45a having sensor coils 451a, 451a,... Provided along the Y-axis direction is provided on the front surface 40a.
  • the sensor coil 451a is actually provided at a position that does not overlap the operation unit 42.
  • a position detector 45c having sensor coils 451c, 451c,... Is provided on the upper surface 40c.
  • a position detector 45e having sensor coils 451e, 451e,... Is provided on the left surface 40e.
  • the direction in which the sensor coils 451c and 451e are arranged is the Z-axis direction.
  • a position detection unit 45 having a plurality of sensor coils is also provided inside the back surface 40b, the bottom surface 40d, and the right surface 40f.
  • the main controller 40 supplies a sine wave signal to the position detection unit 45 of each surface in order to notify the timing at which the position instruction unit 494 transmits the position instruction signal.
  • the frequency of the sine wave signal is f5 (Hz).
  • the sine wave signal is not supplied to the position detection unit 45 of each surface all at once, but is supplied in the order of, for example, the front surface 40a, the back surface 40b, the upper surface 40c, the lower surface 40d, the left surface 40e, and the right surface 40f.
  • the position detection unit 45 on each surface has a plurality of sensor coils. In the position detector 45 on each surface, a sine wave signal of a certain level is sequentially supplied to each sensor coil.
  • sine wave signals are sequentially supplied from the sensor coil 451a located on the upper side of FIG.
  • after supplying the sine wave signal to all the sensor coils 451a of the position detection unit 45a, the main controller 40 starts supplying the sine wave signal to the position detection unit 45 on the back surface 40b. In this way, a sine wave signal is supplied to all sensor coils included in the position detection unit 45 on each surface.
  • the sensor coil supplied with the sine wave signal changes the surrounding magnetic field to generate a resonance signal (timing notification signal) of f5 (Hz).
  • when the position instruction unit 494 receives the timing notification signal, an induced current flows through the coil 494a.
  • the capacitor 494b accumulates charges corresponding to the magnitude of the induced current.
  • the magnitude of the induced current changes according to the distance between the position instruction unit 494 and the sensor coil to which the sine wave signal is supplied.
  • the capacitor 494b supplies the accumulated charge to the coil 494a.
  • the coil 494a supplied with the current generates a position indication signal of f5 (Hz). For this reason, the magnitude of the current supplied to the coil 494a, and hence the strength of the position indication signal, changes according to the amount of charge accumulated in the capacitor 494b.
  • the position detection unit 45 of each surface receives a position instruction signal.
  • the position detectors 45a, 45c, 45e can receive the position instruction signal.
  • the intensity of the position instruction signal received by each sensor coil is different. This is because the distance between each sensor coil and the position indicating unit 494 is different. Accordingly, the main controller 40 can detect the azimuth angle A of the sub-controller 49 with reference to the main controller 40 based on the received intensity of the position instruction signal in the position detector 45 on each surface.
  • the distance D between the main controller 40 and the sub-controller 49 can be obtained based on the strength of the position indication signal received by the position detection units 45. Specifically, the distance D can be obtained by calculating the ratio between the strength of the timing notification signal transmitted by the main controller 40 and the sum of the strengths of the position indication signals received by the sensor coils.
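The two derivations above can be illustrated with a toy model. The strength-weighted average for the azimuth and the square-root law with constant k for the distance are assumptions for illustration only; the patent fixes neither formula:

```python
import math

def estimate_azimuth(coil_xy, strengths):
    """Azimuth angle A of the sub-controller, taken here as a
    strength-weighted average of the sensor-coil positions."""
    total = sum(strengths)
    x = sum(p[0] * s for p, s in zip(coil_xy, strengths)) / total
    y = sum(p[1] * s for p, s in zip(coil_xy, strengths)) / total
    return math.degrees(math.atan2(y, x))

def estimate_distance(tx_strength, rx_strengths, k=1.0):
    """Distance D from the ratio between the transmitted timing-signal
    strength and the summed received position-indication-signal strengths."""
    return k * (tx_strength / sum(rx_strengths)) ** 0.5

# Two coils receiving equal strength put the source halfway between them.
assert abs(estimate_azimuth([(1.0, 0.0), (0.0, 1.0)], [1.0, 1.0]) - 45.0) < 1e-9
```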
  • the main controller 40 transmits relative position information R1 indicating the relative position P1 of the sub-controller 49 with the main controller 40 as a reference to the game apparatus 32.
  • the relative position information R1 includes distance information indicating the distance D and azimuth information indicating the azimuth angle A.
  • the relative position information R1 may be information based on three-dimensional orthogonal coordinates with the origin O (see FIG. 48) as a reference.
  • the main controller 40 may transmit not the relative position information R1 but relative position information R2 indicating the relative position of the main controller 40 based on the sub-controller 49 to the game apparatus 32.
  • the game device 32 may generate the relative position information R2 from the relative position information R1 according to the game program to be executed.
  • the sub-controller 49 may supply current to the coil 494a from the built-in battery when generating the position instruction signal.
  • in this case, when the position instruction unit 494 receives the timing notification signal, the sub-controller 49 supplies a constant level of current to the coil 494a from the built-in battery.
  • since the battery can supply a larger current to the coil 494a than the capacitor 494b, the main controller 40 can increase the limit distance at which the position indication signal can be detected.
  • the juggling game is a game in which juggling can be simulated. In the juggling game, both relative position information R1 indicating the relative position of the sub-controller 49 and relative position information R2 indicating the relative position of the main controller 40 are used.
  • the user sets the optical disc 36 in which the juggling game program is stored in the optical disc drive 322. Thereby, a juggling game is started.
  • the processing in the control unit 321 is processing by executing a juggling game program.
  • FIG. 49 is a diagram showing an initial screen of the juggling game.
  • in response to the start of the juggling game, the control unit 321 generates a game space 66a for the juggling game.
  • the game space 66a is a virtual three-dimensional space.
  • the control unit 321 generates a player object 66b as a user operation target.
  • the player object 66b includes a right hand object 66c and a left hand object 66d, and is arranged at a default position in the game space 66a.
  • the left hand object 66d holds a ball object 66e.
  • when playing the juggling game, the user holds the main controller 40 in the right hand 81 and the sub-controller 49 in the left hand 82.
  • the user and the player object 66b have a mirror image relationship. Therefore, the main controller 40 corresponds to the left hand object 66d.
  • the sub controller 49 corresponds to the right hand object 66c.
  • the player object 66b stands with the right hand object 66c and the left hand object 66d spread downward.
  • the user takes the same posture as the player object 66b while holding the main controller 40 and the sub-controller 49.
  • the operation method of the juggling game will be described by taking as an example an operation of throwing the ball object 66e from the left hand object 66d to the right hand object 66c.
  • the user moves the main controller 40 (right hand 81) in the upper left direction so as to throw the ball to the left hand 82.
  • the acceleration sensor 46 detects an acceleration according to the movement of the main controller 40.
  • the acceleration of the main controller 40 is transmitted to the game device 32 in real time.
  • the control unit 321 changes the position (virtual position) of the left hand object 66d in the game space 66a based on the direction and magnitude of the acceleration of the main controller 40.
  • FIG. 50 is a diagram showing the player object 66b and the ball object 66e when the main controller 40 is moved in the upper left direction.
  • the control unit 321 moves the ball object 66e so as to draw a parabola in the left direction.
  • the trajectory of the ball object 66e is determined based on the magnitude and direction of the acceleration of the main controller 40.
  • the control unit 321 converts the relative position information R1 into virtual relative position information corresponding to the game space 66a. Based on the virtual position of the left hand object 66d and the virtual relative position information, the virtual position of the right hand object 66c is determined. The control unit 321 changes the virtual position of the right hand object 66c based on the change in the virtual position of the left hand object 66d and the relative position information R1.
  • when the ball object 66e reaches the virtual position of the right hand object 66c, the control unit 321 determines that the right hand object 66c has caught the ball object 66e.
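The position update and catch judgment above can be sketched in two small functions; the catch radius and the names are assumptions for illustration:

```python
import math

def update_right_hand(left_vpos, virtual_offset):
    """Virtual position of the right hand object: the left hand object's
    virtual position plus the virtual relative offset converted from R1."""
    return tuple(l + o for l, o in zip(left_vpos, virtual_offset))

def caught(hand_vpos, ball_vpos, catch_radius=0.3):
    """The ball object counts as caught once it comes within an assumed
    catch radius of the hand object."""
    return math.dist(hand_vpos, ball_vpos) <= catch_radius
```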
  • the user moves the sub-controller 49 (left hand 82) in the upper right direction in order to throw the ball object 66e from the right hand object 66c to the left hand object 66d.
  • the control unit 321 changes the virtual position of the right hand object 66c based on the magnitude and direction of the acceleration of the sub-controller 49. Based on the magnitude and direction of the acceleration of the sub-controller 49, the ball object 66e moves while drawing a parabola. The user moves the main controller 40 to cause the left hand object 66d to catch the ball object 66e. The virtual position of the left hand object 66d changes based on the virtual position of the right hand object 66c and the relative position information R2.
  • the user repeats the operation of throwing and catching the ball object 66e by moving the main controller 40 and the sub-controller 49. In this way, the user can play the juggling game.
  • both the right hand object 66c and the left hand object 66d may hold the ball object 66e at the same time.
  • the control unit 321 may determine the virtual position of the hand object that previously caught the ball object 66e based on the magnitude and direction of the acceleration of the controller.
  • the virtual position of the other hand object can be determined based on the relative position information and the virtual position of the hand object that has previously caught the ball object 66e.
  • the positions of the right hand object 66c and the left hand object 66d may be determined based on the acceleration of the controller corresponding to each hand object.
  • the right hand object 66c and the left hand object 66d can be moved in accordance with the movement of the right hand 81 and the left hand 82 of the user by using the relative position information R1 and R2. Therefore, the user can enjoy the juggling game with a more realistic feeling.
  • the archery game is a game in which archery can be simulated.
  • relative position information R1 indicating the relative position of the sub-controller 49 is used.
  • the user sets the optical disc 36 in which the archery game program is stored in the optical disc drive 322. Thereby, the archery game is started.
  • the processing in the control unit 321 is processing by executing an archery game program.
  • FIG. 51 is a diagram for explaining an archery game operation method.
  • in response to the start of the archery game, the control unit 321 generates a game space 67a for the archery game.
  • the control unit 321 generates a player object 67b and a target object 67c.
  • the player object 67b is a user operation target in the archery game, and includes a right hand object 67d and a left hand object 67e.
  • the player object 67b holds a bow object 67g in which an arrow object 67f is set.
  • the player object 67b and the target object 67c are arranged at default positions in the game space 67a.
  • the user plays the archery game with the main controller 40 in the left hand 82 and the sub-controller 49 in the right hand 81.
  • the main controller 40 corresponds to the left hand object 67e.
  • the sub controller 49 corresponds to the right hand object 67d.
  • in the initial state, the player object 67b is in an upright and stationary posture. Similarly to the player object 67b, the user also starts the archery game from an upright and stationary posture.
  • the user thrusts the main controller 40 (left hand 82) out toward the screen 35a.
  • at this time, the direction in which the main controller 40 (left hand 82) is thrust out may differ from the direction in which the left hand object 67e extends.
  • the upper surface 40c (see FIG. 46) of the main controller 40 faces the screen 35a.
  • the acceleration of the main controller 40 changes.
  • the control unit 321 changes the position of the left hand object 67e based on the direction and magnitude of the acceleration of the main controller 40.
  • the player object 67b takes a posture in which the left hand object 67e is extended in the direction of the target object 67c.
  • the user moves the sub-controller 49 (right hand 81) as if pulling a bowstring, while keeping the main controller 40 fixed.
  • the control unit 321 changes the position of the right hand object 67d based on the virtual position of the left hand object 67e and the change in the relative position information R1.
  • the player object 67b has a posture in which the bow object 67g is pulled.
  • the user can fire the arrow object 67f from the bow object 67g by pressing the C button 492c of the sub-controller 49 while keeping the posture of pulling the bow.
  • the flying distance and firing direction of the arrow object 67f are determined based on the relative position information R1. Specifically, the flight distance of the arrow object 67f is determined based on the distance information (distance D) included in the relative position information R1.
  • the distance D corresponds to the distance between the right hand 81 and the left hand 82 when the user takes a posture of pulling a bow.
  • the virtual distance Dv converted from the distance D corresponds to the strength with which the player object 67b pulls the string of the bow object 67g. Therefore, the control unit 321 can determine the flight distance based on the virtual distance Dv.
  • the user can control the flight distance of the arrow object 67f by adjusting the distance between the main controller 40 and the sub-controller 49.
  • in order to hit the target object 67c with the arrow object 67f, the user needs to press the C button 492c while keeping the distance D larger than a certain distance.
  • the firing direction of the arrow object 67f is determined by the azimuth information (azimuth angle A) included in the relative position information R1.
  • the control unit 321 converts the orientation information into virtual orientation information corresponding to the game space 67a.
  • the firing direction of the arrow object 67f is determined based on the virtual orientation information. That is, the arrow object 67f is fired in the direction from the right hand object 67d to the left hand object 67e.
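The firing rules above (flight distance from the virtual draw length Dv, direction from the right hand object toward the left hand object) can be sketched as follows; the linear range model, the constant k, and the minimum draw are assumptions, not values from the patent:

```python
import math

def fire_arrow(right_vpos, left_vpos, k=10.0, min_draw=0.5):
    """Range and direction of the arrow object when the C button is pressed.
    Returns None when the draw is too short for the arrow to reach anything."""
    dv = math.dist(right_vpos, left_vpos)   # virtual draw length Dv
    if dv <= min_draw:
        return None
    # Unit vector from the right hand object toward the left hand object.
    direction = tuple((l - r) / dv for r, l in zip(right_vpos, left_vpos))
    return {"range": k * dv, "direction": direction}
```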
  • the tilt information of the main controller 40 may be used when determining the firing direction of the arrow object 67f.
  • the control unit 321 may cause the arrow object 67f to fire in the horizontal direction.
  • the movement of the arrow object 67f can be controlled based on the positional relationship between the right hand 81 and the left hand 82. Therefore, the game system 3 can provide an archery game with an extremely high sense of realism.
  • rock climbing game: the rock climbing game allows the user to experience rock climbing virtually. In the rock climbing game, both the relative position information R1 indicating the relative position of the sub-controller 49 and the relative position information R2 indicating the relative position of the main controller 40 are used.
  • the user sets the optical disc 36 in which the rock climbing game program is stored in the optical disc drive 322. Thereby, the rock climbing game is started.
  • the processing in the control unit 321 described below is performed by executing the rock climbing game program.
  • FIG. 52 is a diagram for explaining the operation method of the rock climbing game.
  • in response to the start of the rock climbing game, the control unit 321 generates a game space 68a for the rock climbing game.
  • the control unit 321 generates a player object 68b and a rock wall object 68c.
  • the player object 68b is a user operation target, and includes a right hand object 68d and a left hand object 68e.
  • the player object 68b and the rock wall object 68c are arranged at default positions in the game space 68a.
  • On the screen 35a, an image in which the player object 68b is holding onto the rock wall object 68c is displayed.
  • the rock wall object 68c is provided with protrusions 68f and 68g as hold points that the player object 68b can grasp.
  • a depression or the like may be provided in the rock wall object 68c as a hold point.
  • the user plays the rock climbing game with the main controller 40 in the right hand 81 and the sub-controller 49 in the left hand 82.
  • the main controller 40 corresponds to the right hand object 68d.
  • the sub-controller 49 corresponds to the left hand object 68e.
  • the user moves each of the controllers 40 and 49 to designate which of the protrusions 68f and 68g the player object 68b grips.
  • the player object 68b climbs the rock wall object 68c by changing the protrusions 68f and 68g gripped by the right hand object 68d and the left hand object 68e according to the operation of the user.
  • the user presses a button on the main controller 40 or the sub controller 49 to designate a hand object to be operated.
  • the user may press the A button 42a of the main controller 40.
  • the position of the right hand object 68d is determined based on the virtual position of the left hand object 68e and the relative position information R2.
  • the user moves the main controller 40 while the sub-controller 49 is fixed.
  • the control unit 321 moves the right hand object 68d according to the virtual position of the left hand object 68e and the change in the virtual relative position information converted from the relative position information R2. For example, when the user moves the main controller 40 in the direction of arrow H (upward), the right hand object 68d moves in the direction of arrow 68h.
  • the control unit 321 vibrates the vibration unit 47 when the distance between the right hand object 68d and the protrusion 68g is equal to or less than a predetermined distance. Thereby, the user can recognize that the right hand object 68d can grasp the protrusion 68g. The user can grip the protrusion 68g with the right hand object 68d by pressing the A button 42a.
  • the operation target changes to the left hand object 68e.
  • the virtual position of the left hand object 68e is determined based on the virtual position of the right hand object 68d and the relative position information R1.
  • the operation method for the left hand object 68e to grasp the protrusion 68f is the same as the above-described procedure.
  • the user moves the main controller 40 and the sub-controller 49 while alternately operating the right hand object 68d and the left hand object 68e.
  • the user can cause the player object 68b to climb the rock wall object 68c as if actually performing rock climbing.
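The alternating-hand procedure above can be summarized in a short sketch. The function names, coordinate model, and the grab threshold below are hypothetical; the text only states that the moving hand's position is the fixed hand's virtual position plus the relative position information, and that the vibration unit 47 fires when a hand comes within a predetermined distance of a hold point.

```python
# Hedged sketch of the alternating-hand update loop for the rock climbing game.
GRAB_THRESHOLD = 5.0  # assumed distance at which the vibration unit triggers

def distance(p, q):
    """Euclidean distance between two 2D points."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def update_hand(anchor_pos, relative_pos):
    """New position of the moving hand = fixed hand + virtual relative offset."""
    return tuple(a + r for a, r in zip(anchor_pos, relative_pos))

def near_hold(hand_pos, holds, threshold=GRAB_THRESHOLD):
    """True if any hold point is within reach -> trigger the vibration unit."""
    return any(distance(hand_pos, h) <= threshold for h in holds)

left_hand = (0.0, 0.0)                              # fixed hand (sub-controller 49)
right_hand = update_hand(left_hand, (1.0, 8.0))     # offset from R2, moved upward
holds = [(1.0, 10.0), (-3.0, 4.0)]                  # protrusions on the rock wall
print(right_hand)                 # (1.0, 8.0)
print(near_hold(right_hand, holds))   # True (nearest hold is 2.0 away)
```

Swapping which hand is the anchor after each successful grab reproduces the alternating climb described in the text.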
  • the curling game is a game in which curling can be simulated.
  • relative position information R2 indicating the relative position of the main controller 40 is used.
  • the user sets the optical disc 36 in which the curling game program is stored in the optical disc drive 322. Thereby, the curling game is started.
  • the processing in the control unit 321 described below is performed by executing the curling game program.
  • FIG. 53 is a diagram for explaining a method of operating the curling game.
  • in response to the start of the curling game, the control unit 321 generates a game space 69a provided with a curling rink 69b.
  • the control unit 321 generates a player object 69c, a brush object 69d, and stone objects 69e, 69f, and 69g.
  • the player object 69c is used to indicate the position of the user on the rink 69b.
  • the user's operation target is the brush object 69d.
  • the main controller 40 has the role of a brush.
  • the user plays a curling game as a sweeper.
  • the sub-controller 49 is fixed to the user's abdomen using a belt or the like.
  • the sub controller 49 corresponds to the player object 69c.
  • the origin O (see FIG. 48) of the main controller 40 corresponds to the head of the brush object 69d (the part that sweeps the rink 69b).
  • stone objects 69e and 69f indicated by a sand pattern are stones of the user team to which the user belongs.
  • a stone object 69g indicated by hatching is a stone of the opponent team.
  • the stone objects 69e and 69g are randomly arranged in the vicinity of a target (house) drawn as concentric circles on the rink 69b.
  • the player object of the user team slides the stone object 69f on the rink 69b. As a result, the stone object 69f moves from left to right at a constant speed.
  • the player object 69c is located on the right side (lower side in the screen 35a) of the stone object 69f in the traveling direction, and moves on the rink 69b together with the stone object 69f.
  • the player object 69c holds a brush object 69d.
  • the position of the head of the brush object 69d is determined based on the relative position information R2 and the virtual position of the player object 69c corresponding to the sub-controller 49.
  • the user moves the main controller 40 having the role of a brush back and forth.
  • the relative position information R2 changes.
  • the head position of the brush object 69d changes in accordance with the change in the relative position information R2 and the virtual position of the player object 69c. Thereby, the user can sweep the rink 69b using the brush object 69d.
  • the user can determine the position where the brush object 69d sweeps by changing the position where the main controller 40 is moved back and forth. That is, the user can control the movement of the stone object 69f by changing the position of sweeping the rink 69b.
  • the ice on the right side (the lower side in the screen 35a) of the stone traveling direction may be swept with the brush object 69d.
  • the user moves the main controller 40 back and forth while keeping the main controller 40 close to the abdomen. Further, by sweeping the ice in front of the stone object 69f, the travel distance of the stone can be extended.
  • the user can change the position for sweeping the rink 69b not only by moving the main controller 40 itself but also by changing the position where the main controller 40 is moved back and forth. Therefore, since the movement of the stone object 69f can be controlled as in an actual curling competition, the game system 3 can provide a curling game with higher realism.
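A minimal model of the sweeping mechanics described above might look like this. The friction and steering coefficients are invented for illustration; the text only states that sweeping in front of the stone extends its travel and that the position of sweeping controls the stone's movement.

```python
# Hedged sketch: one simulation tick of the stone under assumed coefficients.
def stone_step(speed, curl, swept_ahead, swept_right,
               friction=0.02, sweep_gain=0.015, steer_gain=0.01):
    """Advance the stone one tick; sweeping reduces friction or adds steer."""
    # sweeping directly ahead of the stone lowers the effective friction
    effective_friction = friction - (sweep_gain if swept_ahead else 0.0)
    speed = max(0.0, speed - effective_friction)
    # sweeping on one side of the travel direction steers the stone that way
    if swept_right:
        curl += steer_gain
    return speed, curl

speed, curl = stone_step(1.0, 0.0, swept_ahead=True, swept_right=False)
print(speed, curl)   # 0.995 0.0
```

In this model, continuously sweeping ahead keeps the stone moving longer, which is the "extended travel distance" effect the text attributes to sweeping in front of the stone object 69f.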
  • the judo game is a game in which various judo throwing techniques can be virtually experienced.
  • relative position information R2 indicating the relative position of the main controller 40 is used.
  • the user sets the optical disc 36 in which the judo game program is stored in the optical disc drive 322. Thereby, a judo game is started.
  • the processing in the control unit 321 described below is performed by executing the judo game program.
  • FIG. 54 is a diagram for explaining the operation method of the judo game.
  • in response to the start of the judo game, the control unit 321 generates a game space 70a for the judo game.
  • the control unit 321 generates a player object 70b and an opponent object 70c.
  • the opponent object 70c is arranged at a default position in the game space 70a. Only the left hand of the player object 70b is displayed on the screen 35a.
  • the user plays the judo game while holding the main controller 40 in the right hand 81 and holding the sub-controller 49 in the left hand 82.
  • the main controller 40 corresponds to the right hand of the player object 70b.
  • the sub-controller 49 corresponds to the left hand of the player object 70b.
  • the user grapples with the opponent object 70c by moving the main controller 40 and the sub-controller 49.
  • the left hand of the player object 70b corresponding to the user holds the right sleeve 70d of the opponent object 70c.
  • the user moves the main controller 40 while fixing the position of the sub-controller 49, and designates a part of the opponent object 70c that is gripped by the right hand of the player object 70b.
  • the throwing technique when the player object 70b throws the opponent object 70c is determined according to the part that the right hand of the player object 70b grips.
  • the user moves the main controller 40 and designates the part of the opponent object 70c that the right hand of the player object 70b grips. Based on the virtual position of the left hand of the player object 70b and the change in the relative position information R2, the position of the right hand of the player object 70b changes.
  • the control unit 321 determines the throwing technique to be a shoulder throw (seoi nage).
  • the control unit 321 displays on the screen 35a the process in which the player object 70b throws the opponent object 70c with a shoulder throw.
  • the throwing technique is determined based on the part grasped by the right hand of the player object 70b. For example, when the right hand of the player object 70b grabs the left shoulder 70f of the opponent object 70c, the throwing technique is determined to be a major outer reap (osoto gari). When the right hand of the player object 70b grasps the right sleeve 70d, the throwing technique is determined to be a one-arm shoulder throw (ippon seoi nage).
  • the difficulty level may be set.
  • the control unit 321 may move the right hand of the opponent object 70c. In this case, the difficulty level of grasping the opponent object 70c increases. Further, when the player object 70b cannot grasp the opponent object 70c within a predetermined time, the opponent object 70c may throw the player object 70b.
  • the user can virtually perform a judo combination.
  • the user can easily enjoy judo even without knowing how to perform judo breakfalls (ukemi).
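The part-to-technique selection described above amounts to a lookup from the grabbed body part to a throwing technique. The sketch below is an assumed rendering: the dictionary keys mirror the parts named in the text (left shoulder 70f, right sleeve 70d), and the technique strings are loose English readings, not the specification's wording.

```python
# Hypothetical lookup from the grabbed body part to the throwing technique.
TECHNIQUE_BY_PART = {
    "left_shoulder_70f": "major outer reap",
    "right_sleeve_70d": "one-arm shoulder throw",
}

def choose_throw(grabbed_part):
    """Return the throwing technique for the part the right hand grips."""
    return TECHNIQUE_BY_PART.get(grabbed_part, "no throw")

print(choose_throw("right_sleeve_70d"))   # one-arm shoulder throw
```

A table-driven dispatch like this makes it straightforward to add further part/technique pairs without changing the selection logic.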
  • an example has been described in which the main controller 40 includes the position detection unit 45 and the sub-controller 49 includes the position instruction unit 494.
  • conversely, the sub-controller 49 may include a position detection unit, and the main controller 40 may include a position instruction unit.
  • an example has also been described in which operation information input to the sub-controller 49 by the user is transmitted to the main controller 40 via the communication cable 490.
  • alternatively, the sub-controller 49 may include a wireless communication unit.
  • in that case, the operation information input to the sub-controller 49 is transmitted directly to the game apparatus 32 by wireless communication. Since there is no need to connect the main controller 40 and the sub-controller 49 with the communication cable 490, the degree of freedom when moving the main controller 40 and the sub-controller 49 can be increased.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Cardiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

In a game system (1), a game device (12) executes a game program recorded on an optical disc and causes a television receiver (5) to display the video of a game. A tablet (14) detects the position of a controller (13) in the three-dimensional space above an operation surface (14a) and generates three-dimensional position information indicating the position of the controller (13). A user plays the game by moving the controller (13) while holding it above the operation surface (14a). The game device (12) uses not only the information transmitted from the controller (13) but also the three-dimensional position information as operation information for the game.
PCT/JP2010/066715 2009-10-16 2010-09-27 Game system and game device WO2011046014A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2009-238956 2009-10-16
JP2009238956A JP5964008B2 (ja) 2009-10-16 2009-10-16 Game system
JP2009245458A JP2011087848A (ja) 2009-10-26 2009-10-26 Game device
JP2009-245458 2009-10-26
JP2009-267785 2009-11-25
JP2009267785A JP5843342B2 (ja) 2009-11-25 2009-11-25 Game system

Publications (1)

Publication Number Publication Date
WO2011046014A1 (fr) 2011-04-21

Family

ID=43876067

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2010/066715 WO2011046014A1 (fr) 2009-10-16 2010-09-27 Game system and game device

Country Status (1)

Country Link
WO (1) WO2011046014A1 (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10503395A (ja) * 1994-07-28 1998-03-31 Super Dimension Incorporated Computerized game board

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017050182A1 (fr) * 2015-09-23 2017-03-30 Tencent Technology (Shenzhen) Company Limited Intelligent hardware interaction method and device
US10874937B2 (en) 2015-09-23 2020-12-29 Tencent Technology (Shenzhen) Company Limited Intelligent hardware interaction method and system

Similar Documents

Publication Publication Date Title
KR100713058B1 (ko) Game device, input means used therefor, and storage medium
US8992322B2 (en) Interactive gaming systems with haptic feedback
KR100858400B1 (ko) Sensory ball game device
JP4009433B2 (ja) Game device, game program, and game system
US20170136356A1 (en) Systems and methods for control device including a movement detector
US20110021273A1 (en) Interactive music and game device and method
US8894490B2 (en) Interactive sports gaming device
US20050209066A1 (en) Martial Arts Exercise Device and Method
EP2090346A1 (fr) Method and apparatus for simulating games involving a ball
JP2014530043A (ja) Interactive system and method in boxing and martial arts
JP2002082751A (ja) Device for interaction with virtual space and virtual space system using the same
JP2003208263A (ja) Control device and image processing device having mounting body therefor
JP6910809B2 (ja) Simulation system, program, and controller
JP2003180896A (ja) Virtual sports system
JP3119263B1 (ja) Game device, control method of game device, and game screen display method
TWI407992B (zh) Sports simulation system
JP2007152057A (ja) Striking-type practice machine
US9604141B2 (en) Storage medium having game program stored thereon, game apparatus, game system, and game processing method
CN102671372A (zh) Game device and method of using the same
WO2021067536A1 (fr) Accessory for virtual reality simulation
WO2011046014A1 (fr) Game system and game device
JP5964008B2 (ja) Game system
WO2017110775A1 (fr) Game machine and game system
JP4692691B2 (ja) Game device, input means used therefor, and storage medium
TW201503938A (zh) 360-degree surround virtual reality fishing system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10823282

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10823282

Country of ref document: EP

Kind code of ref document: A1