US20120115609A1 - Game device and computer program - Google Patents

Game device and computer program

Info

Publication number
US20120115609A1
US20120115609A1 (application US13/350,649)
Authority
US
United States
Prior art keywords
value
load
corrected
load data
supporter
Prior art date
Legal status
Abandoned
Application number
US13/350,649
Inventor
Chiaki Sugiyama
Jun Tokuhara
Current Assignee
Sega Corp
Original Assignee
Sega Corp
Priority date
Filing date
Publication date
Application filed by Sega Corp filed Critical Sega Corp
Assigned to KABUSHIKI KAISHA SEGA. Assignors: SUGIYAMA, CHIAKI; TOKUHARA, JUN
Publication of US20120115609A1


Classifications

    • A63F 13/214 — Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/218 — Input arrangements for video game devices characterised by their sensors, purposes or types using pressure sensors, e.g. generating a signal proportional to the pressure applied by the player
    • A63F 13/42 — Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • G06F 3/0334 — Foot operated pointing devices
    • A63F 13/533 — Controlling the output signals based on the game progress involving additional visual information provided to the game scene for prompting the player, e.g. by displaying a game menu
    • A63F 13/803 — Special adaptations for executing a specific game genre or game mode: driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • A63F 2300/1056 — Input arrangements for converting player-generated signals into game device control signals involving pressure sensitive buttons
    • A63F 2300/6045 — Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • A63F 2300/8017 — Driving on land or water; Flying

Definitions

  • the present invention relates to a game device and a computer program, more specifically, a game device using a controller, and a computer program.
  • Patent Reference 1 discloses a technique in which a player gets on a controller provided with a plurality of load sensors and changes the loading modes on the controller to thereby make input operations of the game.
  • Patent Reference 1 Japanese Patent Application Unexamined Publication No. 2008-264195
  • Patent Reference 2 Japanese Patent Application Unexamined Publication No. 2008-119211
  • An object of the present invention is to provide a game device which can realize realistic operations, and a computer program.
  • the present invention provides a computer program for operating a computer as a game device using a controller including a supporter for supporting a player; a first load sensor provided at a right front part of the supporter, for detecting a load from the supporter; a second load sensor provided at a right rear part of the supporter, for detecting a load from the supporter; a third load sensor provided at a left front part of the supporter, for detecting a load from the supporter; and a fourth load sensor provided at a left rear part of the supporter, for detecting a load from the supporter, said computer being operated as an action determination means which calculates a first value based on a difference between a first load data based on an output of the first load sensor and a second load data based on an output of the second load sensor, calculates a second value based on a difference between a third load data based on an output of the third load sensor and a fourth load data based on an output of the fourth load sensor, and determines an action of an object to be displayed on a display screen, based on the first value and the second value.
  • the present invention provides a game device comprising a controller including a supporter for supporting a player, and a first load sensor provided at a right front part of the supporter, for detecting a load from the supporter, a second load sensor provided at a right rear part of the supporter, for detecting a load from the supporter, a third load sensor provided at a left front part of the supporter, for detecting a load from the supporter, and a fourth load sensor provided at a left rear part of the supporter, for detecting a load from the supporter, the game device comprising an action determination means which calculates a first value based on a difference between a first load data based on an output of the first load sensor and a second load data based on an output of the second load sensor, calculates a second value based on a difference between a third load data based on an output of the third load sensor and a fourth load data based on an output of the fourth load sensor, and determines an action of an object to be displayed on a display screen, based on the first value and the second value.
  • a first value is calculated, based on a difference between a first load data based on an output of a first load sensor provided at a right front part of the supporter for a player to be supported on, and a second load data based on an output of a second load sensor provided at a right rear part of the supporter.
  • a second value is calculated, based on a difference between a third load data based on an output of a third load sensor provided at a left front part of the supporter and a fourth load data based on an output of a fourth load sensor provided at a left rear part.
  • an action of an object displayed on a display screen is determined.
  • the object can be caused to rotate, etc., which makes it possible to enjoy the game realistically.
  • FIG. 1 is an appearance view of the game device according to one embodiment of the present invention.
  • FIG. 2 is a top view and a bottom view of a controller used in the embodiment of the present invention.
  • FIG. 3 is a block diagram of the game device according to the embodiment of the present invention.
  • FIG. 4 is a perspective view illustrating a player getting on the first controller.
  • FIG. 5 is a view illustrating an example of the display screen of the game device according to the embodiment of the present invention.
  • FIG. 6 is a view of the respective memories provided in the system memory.
  • FIG. 7 is a view illustrating a relationship between the local coordinate system and the object.
  • FIG. 8 is the flow chart of the processing of determination of an action of the object.
  • FIG. 9 shows views illustrating the operations of the first controller with the legs of a player.
  • FIG. 10 shows views illustrating the case where a player applies the weight to the tiptoe of the right leg and the tiptoe of the left leg.
  • FIG. 11 shows views illustrating the case where the player applies the weight to the heel of the right leg and the heel of the left leg.
  • FIG. 12 shows views illustrating the case where the player applies the weight to the heel of the right leg and the tiptoe of the left leg.
  • FIG. 13 shows views illustrating the case where the player applies the weight to the heel of the left leg and the tiptoe of the right leg.
  • FIG. 14 shows views illustrating a specific example (Part 1) of the actions of the object.
  • FIG. 15 shows views illustrating a specific example (Part 2) of the actions of the object.
  • FIG. 16 shows views illustrating a specific example (Part 3) of the actions of the object.
  • FIG. 17 shows views illustrating a specific example (Part 4) of the actions of the object.
  • FIG. 18 shows views illustrating a specific example (Part 5) of the actions of the object.
  • the game device and the computer program according to an embodiment of the present invention will be described with reference to FIGS. 1 to 18 .
  • FIG. 1 is an appearance view of the game device according to the present embodiment.
  • the game device 10 is used, connected to a TV monitor (a display, display means) 4 placed on a TV rack 2 .
  • the game device 10 includes a first controller 20 to be operated by a player, and a second controller 22 to be operated by the player.
  • the game device body 12 and the TV monitor 4 are connected to each other by a cable.
  • the game device body 12 and the first controller 20 can communicate with each other by wireless.
  • the game device body 12 and the second controller 22 can communicate with each other by wireless.
  • FIG. 2 is an upper surface view and an underside view of the controller used in the present embodiment.
  • FIG. 2A is the top view
  • FIG. 2B is the bottom view.
  • the first controller 20 includes a supporter 78 for the legs of a player to be put on, and four load sensors 82 a - 82 d provided on the four corners of the supporter 78 for sensing loads applied to the supporter 78 .
  • the controller 20 transmits sensed load values sensed by the load sensors 82 a - 82 d to the game device body 12 .
  • the player adjusts the loading of the body weight to thereby manipulate actions of objects displayed on the TV monitor 4 and play the game.
  • the second controller 22 is operated by, e.g., a hand 23 of the player.
  • on the second controller 22 , various operation buttons 28 , such as a power source button 28 a, a cross button 28 b, etc. (see FIG. 3 ), are provided.
  • the player uses the second controller 22 to input commands, such as a start of the game, etc., in the game device body 12 .
  • FIG. 3 is a block diagram of the game device according to the present embodiment.
  • a CPU 40 which executes the game program, the general control of the entire system, the coordinates computation for image displays, etc., and a system memory (RAM) 42 to be used as the buffer memory which stores programs and data necessary for the CPU 40 to process are connected to a bus arbiter 44 by a common bus line.
  • the bus arbiter 44 controls flows of programs and data to the respective blocks of the game device 10 and devices connected outside.
  • a program data memory device or a memory storage medium 46 , including an optical disc, an optical disc drive, etc. for driving CD-ROMs, etc., which are game record media, and a BOOT ROM 48 storing programs and data for actuating the game device 10 are connected to the bus arbiter 44 via bus lines.
  • polygon data having three-dimensional local coordinates data forming objects to be displayed (vertex data), NURBS (Non Uniform Rational B-Spline) data (curved-face or control point data) are stored in the system memory 42 , and these are arranged in a world coordinate system of a three-dimensional virtual space by the CPU 40 and a geometry processor (not illustrated) to convert the local coordinates to the world coordinate system.
  • view point coordinates generated by an operation of a player and in accordance with a progress of the game are set, and objects present in the view range seen at this view point in a prescribed view direction and at a view angle are converted into a view point coordinate system having the view coordinates at the origin, and the converted coordinates of the objects are transmitted to the rendering processor 50 .
  • the rendering processor 50 first makes interpolation processing, such as light source processing, on the transmitted coordinates of the objects, and details the surfaces of the objects by applying texture data stored in the graphic memory 52 to the objects. Furthermore, the rendering processor 50 projects the objects (polygons) from the three-dimensional stereoscopic space onto the two-dimensional plane (screen) for the display on the TV monitor 4 to convert the objects into two-dimensional coordinate data (the screen coordinate system), displays first the polygons whose depths in the Z coordinate are smaller, i.e., which are nearer to the view point coordinates, to thereby generate two-dimensional images, and outputs the two-dimensional images to the TV monitor 4 , such as a CRT, a liquid crystal display or others.
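As a toy illustration of the pipeline the preceding passages describe (local coordinates arranged in the world coordinate system, converted to a view point coordinate system, then projected onto the two-dimensional screen), the following sketch uses a camera fixed at the origin looking down the negative Z axis. All function names and the simplified camera model are assumptions for illustration, not part of the patent.

```python
# Hedged sketch of the local -> world -> view -> screen coordinate pipeline.
import math

def local_to_world(v, position, yaw):
    """Rotate a local-space vertex about the Y axis by `yaw`, then translate."""
    x, y, z = v
    c, s = math.cos(yaw), math.sin(yaw)
    xw, yw, zw = c * x + s * z, y, -s * x + c * z
    px, py, pz = position
    return (xw + px, yw + py, zw + pz)

def world_to_view(v, eye):
    """View transform: express the vertex relative to the camera position."""
    return tuple(a - b for a, b in zip(v, eye))

def view_to_screen(v, focal=1.0):
    """Perspective projection; nearer vertices (smaller |z|) project larger."""
    x, y, z = v
    return (focal * x / -z, focal * y / -z)

v = local_to_world((1.0, 0.0, 0.0), position=(0.0, 0.0, -5.0), yaw=0.0)
v = world_to_view(v, eye=(0.0, 0.0, 0.0))
print(view_to_screen(v))  # 2-D screen coordinates of the vertex
```

A real renderer would also sort or depth-test polygons (as the passage notes, nearer polygons are drawn first), which this sketch omits.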
  • the rendering processor 50 which reproduces image (MOVIE) data read from the program data memory device or the memory storage medium 46 and generates images for image displays by an operation of a player or in accordance with a progress of the game, and a graphic memory 52 which stores graphic data, etc. necessary for the rendering processor 50 to generate images are connected to each other.
  • the image signals outputted from the rendering processor 50 are converted from digital signals to analog signals by the video DAC 54 to be displayed by the TV monitor 4 .
  • a sound processor 56 which reproduces music data read from the program data memory device or the memory storage medium 46 and generates effect sounds and voices by an operation of a player and in accordance with a progress of the game, and a sound memory 58 storing sound data, etc. necessary for the sound processor 56 to generate effect sounds and voices are connected to each other.
  • the voice signals outputted from the sound processor 56 are converted from digital signals to analog signals by an audio DAC 60 to be outputted from the speaker of the TV monitor 4 .
  • a communication interface 62 is connected to the bus arbiter 44 .
  • the communication interface 62 is connected to outside networks, such as telephone circuits, etc., via a LAN adapter 64 .
  • the game device body 12 is connected to the Internet by the LAN adapter 64 and can communicate with other game devices, network servers, etc.
  • the communication interface 62 and the LAN adapter 64 use telephone circuits here, but they may instead use terminal adapters (TA) or routers using telephone circuits, cable modems using cable TV circuits, wireless communication means using portable telephones or PHS, or other communication means, such as optical fiber communication means using optical fibers.
  • a wireless receiver unit 68 which wirelessly communicates with the first controller 20 and the second controller 22 is connected to the bus arbiter 44 .
  • the wireless receiver unit 68 receives information transmitted from the first controller 20 and information transmitted from the second controller 22 .
  • a peripheral I/F (Interface) 70 is connected to the bus arbiter 44 .
  • various peripheral devices can be connected.
  • the game device 10 is not limited to a domestic game device and can be a personal computer, a portable electronic game machine, an electronic device such as a portable telephone, a PDA or others, or an information processing device of a game device or others installed in amusement facilities, shops such as game cafes, etc.
  • the first controller 20 of the present embodiment functions as the operation device (operating means) of the game.
  • the controller 20 includes the supporter (support plate) 78 for a player to get on, the load sensors 82 a - 82 d which detect loads applied to the supporter 78 .
  • the load sensors 82 a - 82 d are arranged on the four corners of the supporter 78 .
  • the first load sensor 82 a is arranged at the right front part of the supporter 78
  • the second load sensor 82 b is arranged at the right rear part of the supporter 78
  • the third load sensor 82 c is arranged at the left front part of the supporter 78
  • the fourth load sensor 82 d is arranged at the left rear part of the supporter 78 .
  • the legs 80 a - 80 d are provided on the four corners of the first controller 20 .
  • the respective legs 80 a - 80 d are formed in, e.g., a cylindrical shape.
  • the first load sensor 82 a is supported by the leg 80 a
  • the second load sensor 82 b is supported by the leg 80 b
  • the third load sensor 82 c is supported by the leg 80 c
  • the fourth load sensor 82 d is supported by the leg 80 d. That is, the supporter 78 is supported by the four legs 80 a - 80 d via the load sensors 82 a - 82 d provided on the four corners.
  • FIG. 4 is a perspective view of the state that a player is on the first controller.
  • the load sensors 82 a - 82 d can be, e.g., strain gauges-type load cells or others.
  • the load sensors 82 a - 82 d output electric signals of intensities corresponding to applied loads.
  • the first controller 20 can suitably be any controller which includes a plurality of load sensors for detecting loads applied to the supporter.
  • the first controller 20 includes a CPU 88 which controls the general operation of the first controller 20 .
  • the CPU 88 controls the operation of the first controller 20 in accordance with a computer program stored in a ROM.
  • the respective load sensors 82 a - 82 d are connected to an AD converter 86 via the respective amplifiers 84 a - 84 d.
  • a wireless transmission unit 90 transmits data from the first controller 20 to the game device body 12 .
  • the wireless transmission unit 90 is provided in the inside of, e.g., the first controller 20 .
  • to the amplifiers 84 a - 84 d, the AD converter 86 , the CPU 88 and the wireless transmission unit 90 , prescribed voltages are supplied from a battery (not illustrated), such as electric cells or others.
  • the respective load sensors 82 a - 82 d output signals indicating inputted loads.
  • the electric signals outputted from the respective load sensors 82 a - 82 d are amplified respectively by the amplifiers 84 a - 84 d, converted from analog signals to digital data by the AD converter 86 , and inputted to the CPU 88 .
  • to the detected load values, identification information of the respective load sensors 82 a - 82 d is added, so that which of the load sensors 82 a - 82 d each detected load value belongs to can be identified.
  • the CPU 88 obtains data f 1 -f 4 of detected load values of the respective load sensors 82 a - 82 d.
  • f 1 is data of the detected load value given by the first load sensor 82 a
  • f 2 is data of the detected load value given by the second load sensor 82 b
  • f 3 is data of the detected load value given by the third load sensor 82 c
  • f 4 is data of the detected load value given by the fourth load sensor 82 d.
  • the data f 1 -f 4 of the detected load values given by the load sensors 82 a - 82 d are transmitted as operational input data for the first controller 20 from the CPU 88 to the game device body 12 via the wireless transmission unit 90 .
  • the CPU 88 transmits for, e.g., each frame, the data f 1 -f 4 of the detected load values given by the load sensors 82 a - 82 d.
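The controller-side flow described above (read the four load sensors, attach each sensor's identification information, and transmit the data f 1 - f 4 once per frame) can be sketched as follows. The sensor identifiers, packet layout, and dummy load values are hypothetical, not taken from the patent.

```python
# Hypothetical sketch of the first controller's per-frame transmission.
from dataclasses import dataclass

# Assumed identifiers; the patent only says identification information is added.
SENSOR_IDS = ("82a_right_front", "82b_right_rear", "82c_left_front", "82d_left_rear")

@dataclass
class LoadPacket:
    """One frame's operational input data sent to the game device body."""
    frame: int
    loads: dict  # sensor id -> detected load value (f1-f4)

def read_sensors():
    """Stand-in for the amplifier + AD converter chain; returns f1-f4."""
    return {sid: 20.0 for sid in SENSOR_IDS}  # dummy: weight spread evenly

def build_packet(frame):
    """Tag the detected load values with their sensor ids for wireless transmission."""
    return LoadPacket(frame=frame, loads=read_sensors())

pkt = build_packet(frame=0)
print(pkt.loads["82a_right_front"])
```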
  • FIG. 5 is a view illustrating an example of the display images of the game device according to the present embodiment.
  • a player operates, with the first controller 20 , the directions (proceeding directions) and velocities (moving velocities, proceeding velocities) of a hover machine (an object) 100 displayed on the TV monitor (a display, display means) 4 so as to bring the hover machine 100 to the flags (targets to gain points) 102 displayed on the TV monitor 4 and obtain the flags 102 , whereby the player goes on gaining points.
  • the game is set so that the player can enjoy it by competing to obtain all the flags 102 within a shorter period of time.
  • the best record (BEST RECORD) 104 of the games so far played is indicated.
  • a number 106 of remaining flags is indicated.
  • an elapsed time 108 from the start of the game is indicated.
  • flags 102 which are targets to gain points are displayed.
  • the object (the hover machine) 100 to be operated by a player in the three-dimensional virtual space is displayed.
  • the hover machine 100 is displayed with an operator (a character) 110 riding the hover machine 100 .
  • jet engines 112 a - 112 d are respectively provided at the right front, right rear, left front and left rear parts of the hover machine 100 . The jet engines 112 a - 112 d apply driving forces to the hover machine 100 .
  • the hover machine 100 runs on virtual ice with the driving forces of the jet engines 112 a - 112 d.
  • the jet engines 112 a, 112 b on the right side, and the jet engines 112 c, 112 d on the left side are controlled independently.
  • the jet engine 112 a at the right front part and the jet engine 112 b at the right rear part do not simultaneously jet.
  • the jet engine 112 c at the left front part and the jet engine 112 d at the left rear part do not simultaneously jet.
  • the hover machine 100 rotates right there without changing the central position.
  • the hover machine 100 moves backward, rotating right.
  • the hover machine 100 moves forward, rotating right.
  • the hover machine 100 rotates left there without changing the central position.
  • the hover machine 100 moves forward, rotating left.
  • the hover machine 100 moves backward, rotating left.
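The motion cases above can be summarized, in a simplified form, as a function of a right-side value and a left-side value (each the rear load minus the front load on that side of the supporter). The sign conventions and the reduction to four quadrant cases below are illustrative assumptions, not taken from the patent.

```python
# Simplified, hypothetical mapping from weight shift to hover machine motion.
def hover_action(w_r, w_l):
    """w_r: right rear minus right front load; w_l: left rear minus left front load."""
    if w_r > 0 and w_l > 0:
        return "move backward"          # weight on both heels
    if w_r < 0 and w_l < 0:
        return "move forward"           # weight on both tiptoes
    if w_r > 0 and w_l < 0:
        return "rotate right in place"  # right heel + left tiptoe
    if w_r < 0 and w_l > 0:
        return "rotate left in place"   # left heel + right tiptoe
    return "hold position"
```

The intermediate cases in the text (moving forward or backward while rotating) would correspond to unequal magnitudes of w_r and w_l, which this four-way sketch collapses.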
  • a display which asks a player to get on the first controller 20 is displayed on the TV monitor 4 .
  • the measuring of the weight of the player is performed. Specifically, the total value of the data f 1 - f 4 of the detected load values given by the four load sensors 82 a - 82 d is taken as the data F of the weight of the player.
  • the weight F of the player is expressed by the following formula: F = f 1 + f 2 + f 3 + f 4 .
  • FIG. 6 is a view of the respective memories provided in the system memory.
  • the CPU 40 stores the data F of the weight of the player in the player weight memory provided in the system memory 42 (see FIG. 6 ).
  • the CPU 40 calculates a coefficient a for normalizing the data f 1 -f 4 .
  • the coefficient (normalization coefficient) a for normalizing the data f 1 - f 4 is expressed by the following formula: a = 1/F.
  • the CPU 40 stores the normalization coefficient a in the normalization coefficient memory provided in the system memory 42 (see FIG. 6 ).
  • the data f 1 -f 4 of the detected load values of the respective load sensors 82 a - 82 d are obtained for each frame.
  • the CPU 40 stores the data f 1 -f 4 of the obtained detected load values in the f 1 -f 4 memory provided in the system memory 42 (see FIG. 6 ).
  • the data g 1(n) - g 4(n) of the detected load values given by normalizing the data f 1(n) - f 4(n) of the n-th frame are expressed by the following formula: g i(n) = a × f i(n) (i = 1 to 4).
  • the CPU 40 stores the normalized data g 1 -g 4 in the g 1 -g 4 memory provided in the system memory 42 (see FIG. 6 ).
  • correction coefficients (correction values) b 1 - b 4 for making the data g 1 - g 4 of the normalized detected load values equal to 0.25 are determined respectively in advance.
  • the correction coefficients b 1 -b 4 are not determined for the respective players but are applied uniformly to all the players.
  • correction coefficients b 1 -b 4 are applied uniformly to all the players here but the correction coefficients may be determined for the respective players.
  • the load data W 1(n) - W 4(n) are calculated by multiplying the normalized data g 1(n) - g 4(n) by the correction coefficients b 1 - b 4 respectively (W i(n) = b i × g i(n) ). The CPU 40 stores the calculated load data W 1(n) - W 4(n) in the W 1 - W 4 memory provided in the system memory 42 .
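The per-frame load computation the surrounding passages describe is small enough to sketch directly: take the weight F as the total of f 1 - f 4 , normalize each value (the sketch assumes a = 1/F, so that g 1 - g 4 sum to 1), and scale by the correction coefficients b 1 - b 4 . The concrete b values below are hypothetical; the patent does not give them.

```python
# Hedged sketch of the weight / normalization / correction pipeline.
def compute_load_data(f, b=(1.0, 1.0, 1.0, 1.0)):
    """f: detected load values (f1, f2, f3, f4) -> corrected load data (W1..W4)."""
    F = sum(f)                         # player's weight: total of f1-f4
    a = 1.0 / F                        # assumed normalization coefficient
    g = [a * fi for fi in f]           # normalized detected load values g1-g4
    return [bi * gi for bi, gi in zip(b, g)]

W = compute_load_data((15.0, 15.0, 15.0, 15.0))
print(W)  # evenly loaded player -> each value is approximately 0.25
```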
  • the position coordinates P (n) of the object (the hover machine) 100 at the n-th frame in the world coordinate system are expressed as follows.
  • P (n) = ( P X(n) , P Y(n) , P Z(n) )
  • the position coordinates P (n) are stored in the position coordinates memory provided in the system memory 42 (see FIG. 6 ).
  • the X Z plane in the world coordinate system is a plane in parallel with the virtual ground, and the Y axis of the world coordinate system is normal to the virtual ground.
  • FIG. 7 is a view showing the relationships between the local coordinate system and an object.
  • the left-to-right direction as viewed in FIG. 7 corresponds to the X axial direction of the local coordinate system.
  • the rightward direction as viewed in FIG. 7 corresponds to the positive direction of the X axis, and the leftward direction as viewed in FIG. 7 corresponds to the negative direction of the X axis.
  • the up-to-down direction as viewed in FIG. 7 corresponds to the Z axial direction of the local coordinate system.
  • the downward direction as viewed in FIG. 7 is the positive direction of the Z axis, and the upward direction of the Z axis as viewed in FIG. 7 is the negative direction of the Z axis.
  • the normal direction as viewed in FIG. 7 corresponds to the Y axial direction of the local coordinate system.
  • the direction toward the viewer as viewed in FIG. 7 is the positive direction of the Y axis, and the direction away from the viewer is the negative direction of the Y axis.
  • the central line of the object 100 in the left-to-right direction agrees with the X axial direction. More specifically, on the initial stage of the game, the rightward direction of the object 100 agrees with the positive direction of the X axis, and the leftward direction of the object 100 agrees with the negative direction of the X axis.
  • the central line of the object 100 in the front-to-rear direction agrees with the Z axial direction. More specifically, on the initial stage of the game, the rear direction of the object 100 agrees with the positive direction of the Z axis, and the forward direction of the object 100 agrees with the negative direction of the Z axis.
  • the central line of the object 100 in the up-to-down direction agrees with the Y axis. More specifically, in the state that the object 100 is positioned on the horizontal plane, the downward direction of the object 100 agrees with the positive direction of the Y axis, and the upward direction of the object 100 agrees with the negative direction of the Y axis.
  • FIG. 8 is the flow chart of the processing of determination of action of the object.
  • the CPU makes the normalization with the normalization coefficient a and the correction with the correction coefficients b 1 -b 4 as described above, based on the data f 1(n) -f 4(n) of the detected load values of the load sensors 82 a - 82 d to thereby calculate the respective load data W 1(n) -W 4(n) (Step S 1 ).
  • the CPU 40 makes the computation of the respective load data W 1(n) -W 4(n) for each frame.
  • the CPU 40 stores the calculated load data W 1(n) -W 4(n) in the W 1 -W 4 memory provided in the system memory 42 (see FIG. 6 ).
  • the CPU 40 subtracts first load data W 1(n) from the second load data W 2(n) to thereby calculate the first value W R(n) (Step S 2 ).
  • the first value W R(n) is calculated for each frame.
  • the first value W R(n) is expressed by the following formula: W R(n) = W 2(n) − W 1(n)
  • the CPU 40 stores the calculated first value W R(n) in the W R(n) memory provided in the system memory 42 (see FIG. 6 ).
  • the CPU 40 subtracts the third load data W 3(n) from the fourth load data W 4(n) to thereby calculate the second value W L(n) (Step S 3 ).
  • the second value W L(n) is calculated for each frame.
  • the second value W L(n) is expressed by the following formula: W L(n) = W 4(n) − W 3(n)
  • the CPU 40 stores the calculated second value W L(n) in the W L(n) memory provided in the system memory 42 (see FIG. 6 ).
  • the CPU 40 corrects the first value W R(n) (Step S 5 ). Specifically, when the absolute value of the first value W R(n) is smaller than the prescribed threshold value c, the first value W R(n) is multiplied by a prescribed correction coefficient d.
  • the prescribed threshold value c is set at, e.g., 0.05.
  • the prescribed correction coefficient d is set at, e.g., 0.02.
  • the CPU 40 corrects the second value W L(n) (Step S 7 ). Specifically, when the absolute value of the second value W L(n) is smaller than the prescribed threshold value c, the second value W L(n) is multiplied by the prescribed correction coefficient d.
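Steps S 5 and S 7 apply the same dead-zone correction to each value. A minimal sketch, using the example constants c = 0.05 and d = 0.02 given in the text:

```python
def dead_zone_correct(w, c=0.05, d=0.02):
    # Steps S5/S7: when |w| falls below the threshold c, attenuate w by
    # the correction coefficient d so that tiny load differences (sensor
    # noise, involuntary sway) barely affect the object's action.
    return w * d if abs(w) < c else w
```

A value of 0.03 is attenuated to 0.0006, while a deliberate lean of 0.3 passes through unchanged.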
  • the CPU 40 gives a product of the multiplication of the first value W R(n) by the second value W L(n) , i.e., the value of ( W R(n) × W L(n) ) (Step S 8 ).
  • when the value of ( W R(n) × W L(n) ) is negative and is smaller than the prescribed threshold value e (Step S 9 ),
  • the processing of rotating the object to be described later is made (Steps S 11 -S 15 ).
  • the action of the object 100 is determined as follows, based on the first value W R(n) and the second value W L(n) (Step S 10 ).
  • the prescribed threshold value e can be, e.g., −0.01. The prescribed threshold value e is used so as to prevent unintended rotation, etc. of the object 100 due to small load changes the player does not intend.
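The Step S 9 test can be sketched as follows, with e = −0.01 as in the text. Since e is negative, the "smaller than e" comparison already implies a negative product, but both conditions are kept here for fidelity to the flow chart:

```python
def should_rotate(w_r, w_l, e=-0.01):
    # Step S9: rotate only when the right and left values have opposite
    # signs (negative product) AND the product is below the threshold e.
    # The threshold filters out small, unintended opposite loads.
    p = w_r * w_l
    return p < 0 and p < e
```

Leaning the two legs firmly in opposite directions (0.3 and −0.3, product −0.09) triggers rotation; a faint opposite lean (0.05 and −0.05, product −0.0025) does not.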
  • the CPU 40 determines an action of the object 100 as follows, based on the first value W R(n) and the second value W L(n) .
  • the rotation angle A (n) of the object 100 in the n-th frame of the local coordinate system is expressed as follows.
  • A (n) = ( A X(n) , A Y(n) , A Z(n) )
  • A X(n) is the data of the rotation angle of the object 100 on the rotation axis of the X axis.
  • A Y(n) is the data of the rotation angle of the object 100 on the rotation axis of the Y axis.
  • A Z(n) is the data of the rotation angle of the object 100 on the rotation axis of the Z axis. The right rotation on the rotation axis is positive, and the left rotation on the rotation axis is negative.
  • the object 100 may move on an inclined plane.
  • the object 100 moves on an inclined plane, not only the A Y(n) but also the A X(n) and the A Z(n) are considered.
  • the rotation angle A Y(n) in the local coordinate system is given by the following formula.
  • A Y(n) = A Y(n−1) + ( W L(n) − W R(n) )
  • A Y(n−1) is the data of the rotation angle of the n−1-th frame
  • A Y(n) is the data of the rotation angle of the n-th frame.
  • the data A Y(n−1) of the rotation angle of the n−1-th frame is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6 ).
  • the minimum value of the ( W L(n) − W R(n) ) is −1.0, and the maximum value thereof is 1.0.
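The per-frame yaw update can be sketched as below. The conversion of the rotation-angle data to an actual angle, A Y(n) ′ = k × π × A Y(n) , is an assumption inferred from the worked examples later in the text (which speak of angle changes of ±0.2 kπ and ±0.6 kπ); the patent itself only says "a prescribed conversion formula".

```python
import math

def update_yaw(a_prev, w_r, w_l, k=1.0):
    # A_Y(n) = A_Y(n-1) + (W_L(n) - W_R(n)); the increment lies in [-1.0, 1.0].
    a_y = a_prev + (w_l - w_r)
    # Assumed conversion to an angle in radians: A_Y(n)' = k * pi * A_Y(n).
    return a_y, k * math.pi * a_y
```

With W R(n) = 0.3 and W L(n) = −0.3 (opposite leans), the rotation-angle data changes by −0.6 per frame, i.e. −0.6 kπ after conversion.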
  • the data of the rotation angle A (n) of the n-th frame such given is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6 ).
  • the data A (n) of a rotation angle is converted to a value A (n) ′ of the angle by a prescribed conversion formula or others, e.g., A (n) ′ = k × π × A (n) .
  • k is a prescribed coefficient.
  • the velocity V (n) of the object 100 in the n-th frame of the world coordinate system is expressed as follows.
  • V (n) = ( V X(n) , V Y(n) , V Z(n) )
  • the velocity V X(n) of the object 100 in the X axial direction is expressed as follows.
  • V X(n) = ( W R(n) + W L(n) ) × sin( A Y(n) ′)
  • the velocity V Z(n) of the object 100 in the Z axial direction is expressed as follows.
  • V Z(n) = ( W R(n) + W L(n) ) × cos( A Y(n) ′)
  • the conversion or others of the V X(n) , the V Y(n) and the V Z(n) is made by a prescribed conversion formula or others.
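The two horizontal velocity components follow directly from the sum of the two values and the converted yaw angle. A sketch, taking A Y(n) ′ in radians:

```python
import math

def velocity_xz(w_r, w_l, a_y_rad):
    # V_X(n) = (W_R(n) + W_L(n)) * sin(A_Y(n)')
    # V_Z(n) = (W_R(n) + W_L(n)) * cos(A_Y(n)')
    s = w_r + w_l
    return s * math.sin(a_y_rad), s * math.cos(a_y_rad)

# With both values -0.3 and yaw 0, the object moves straight ahead
# (negative Z is the forward direction) at a speed of magnitude 0.6.
vx, vz = velocity_xz(-0.3, -0.3, 0.0)
```

This reproduces the FIG. 14 example: no turn, and forward motion in the negative Z direction.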
  • FIG. 9 is views of operations of the first controller with the legs of a player.
  • the first value W R(n) is 0.
  • the jet engine 112 a at the right front part and the jet engine 112 b at the right rear part do not jet.
  • the second value W L(n) is 0.
  • the jet engine 112 c at the left front part and the jet engine 112 d at the left rear part do not jet.
  • the first value W R(n) is W R(n) < 0.
  • the intensity of the jetting is set, based on the magnitude of the absolute value of the first value W R(n) .
  • the second value W L(n) is W L(n) < 0.
  • the intensity of the jetting is set, based on the magnitude of the absolute value of the second value W L(n) .
  • the first value W R(n) is W R(n) >0.
  • the intensity of the jetting is set, based on the magnitude of the absolute value of the first value W R(n) .
  • the second value W L(n) is W L(n) >0.
  • the intensity of the jetting is set, based on the magnitude of the absolute value of the second value W L(n) .
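The mapping from the sign of each value to the displayed jet engines (FIGS. 9-11) can be sketched for the right side as follows; the mirror function for the left side would use W L(n) with engines 112 c / 112 d . The engine reference numerals are those used in the text:

```python
def right_side_jets(w_r):
    # Per FIGS. 9-11: W_R = 0 -> neither right-side engine jets;
    # W_R < 0 (weight on the right tiptoe) -> the right REAR engine 112b
    # jets, pushing the object forward; W_R > 0 (weight on the right heel)
    # -> the right FRONT engine 112a jets. Jetting intensity tracks |W_R|.
    if w_r == 0:
        return None, 0.0
    return ("112a" if w_r > 0 else "112b"), abs(w_r)
```

So leaning on both tiptoes fires 112 b and 112 d (FIG. 10), while leaning on both heels fires 112 a and 112 c (FIG. 11).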
  • FIG. 10 is views illustrating the case that a player applies the weight to the side of the tiptoe of the right leg and the side of the tiptoe of the left leg.
  • the hatchings in FIG. 10 indicate the parts the player applies the weight to.
  • FIG. 10A is a plan view illustrating loading the first controller
  • FIG. 10B is an example of a display image illustrating a motion of the object.
  • the CPU 40 displays on the TV monitor 4 the jetting of the jet engine 112 b at the right rear part and the jetting of the jet engine 112 d at the left rear part.
  • the action of the object 100 is determined as described above.
  • FIG. 11 is a view illustrating the case that a player applies the weight to the side of the heel of the right leg and also to the side of the heel of the left leg.
  • the hatchings in FIG. 11 indicate the portions to apply the weight to.
  • FIG. 11A is a plan view of loading the first controller
  • FIG. 11B is a view illustrating an example of display image of the action of the object.
  • the CPU 40 displays on the TV monitor 4 the jetting of the jet engine 112 a at the right front part and also the jetting of the jet engine 112 b at the left front part.
  • the action of the object 100 is determined as described above.
  • in Step S 9 , when the value of ( W R(n) × W L(n) ) is negative, and the value of ( W R(n) × W L(n) ) is smaller than the prescribed threshold value e, the processing of the rotation of the object 100 is made as follows.
  • FIG. 12 is a view illustrating the case that the player applies the weight to the side of the heel of the right leg and to the side of the tiptoe of the left leg.
  • the hatchings in FIG. 12 indicate the portions the player applies the weight to.
  • FIG. 12A is a plan view of loading the first controller
  • FIG. 12B is a view illustrating an example of the display image of the action of the object.
  • the first value W R(n) and the second value W L(n) are as follows.
  • when the first value W R(n) is positive, and the second value W L(n) is negative (Step S 11 ), the object 100 rotates right.
  • the first value W R(n) is corrected (Step S 12 ). Specifically, the first value W R(n) is multiplied by the prescribed correction coefficient h.
  • the prescribed correction coefficient h can be, e.g., 2.
  • the corrected first value W R(n) ′ is expressed as follows: W R(n) ′ = h × W R(n) .
  • the CPU 40 stores the corrected first value W R(n) ′ in the W R(n) ′ memory provided in the system memory 42 (see FIG. 6 ).
  • when the object 100 is rotated right, an action of the object 100 is determined as follows, by using the corrected first value W R(n) ′ and the second value W L(n) , which has not been corrected (Step S 13 ).
  • A Y(n) = A Y(n−1) + ( W L(n) − W R(n) ′)
  • A Y(n−1) is the data of the rotation angle in the n−1-th frame
  • A Y(n) is the data of the rotation angle in the n-th frame.
  • the data of the rotation angle of the n−1-th frame is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6 ).
  • the data of the rotation angle A Y(n) of the n-th frame thus given is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6 ).
  • the velocity V X(n) of the object 100 in the X axial direction is expressed as follows.
  • V X(n) = ( W R(n) ′ + W L(n) ) × sin( A Y(n) ′)
  • the velocity V Z(n) of the object 100 in the Z axial direction is expressed as follows.
  • V Z(n) = ( W R(n) ′ + W L(n) ) × cos( A Y(n) ′)
  • the CPU 40 displays on the TV monitor 4 the jetting of the jet engine 112 a at the right front part and also the jetting of the jet engine 112 d at the left rear part.
  • the intensities of the respective jettings are set, respectively based on the magnitude of the absolute value of the first value W R(n) and the magnitude of the absolute value of the second value W L(n) .
  • FIG. 13 is views illustrating the case that a player applies the weight to the side of the heel of the left leg and to the side of the tiptoe of the right leg.
  • the hatchings in FIG. 13 indicate the portions the player applies the weight to.
  • FIG. 13A is a plan view illustrating loading the first controller
  • FIG. 13B is a view illustrating an example of a display image of the action of the object.
  • the first value W R(n) and the second value W L(n) are as follows.
  • the second value W L(n) is corrected (Step S 14 ). Specifically, the second value W L(n) is multiplied by a prescribed correction coefficient h.
  • the prescribed correction coefficient h can be, e.g., 2.
  • the corrected second value W L(n) ′ is expressed as follows: W L(n) ′ = h × W L(n) .
  • the CPU 40 stores the corrected second value W L(n) ′ in the W L(n) ′ memory provided in the system memory 42 (see FIG. 6 ).
  • an action of the object 100 is determined as follows by using the first value W R(n) not corrected and the corrected second value W L(n) ′ (Step S 16 ).
  • A Y(n) = A Y(n−1) + ( W L(n) ′ − W R(n) )
  • A Y(n−1) is the data of the rotation angle of the n−1-th frame
  • A Y(n) is the data of the rotation angle of the n-th frame.
  • the data A Y(n−1) of the rotation angle of the n−1-th frame is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6 ).
  • the data of the rotation angle A Y(n) of the n-th frame thus given is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6 ).
  • the velocity V X(n) of the object 100 in the X axial direction is expressed as follows.
  • V X(n) = ( W R(n) + W L(n) ′) × sin( A Y(n) ′)
  • the velocity V Z(n) of the object 100 in the Z axial direction is expressed as follows.
  • V Z(n) = ( W R(n) + W L(n) ′) × cos( A Y(n) ′)
  • the CPU 40 displays on the TV monitor 4 the image that the jet engine 112 c at the left front part jets, and also the jet engine 112 b at the right rear part jets.
  • the intensities of the respective jettings are set, respectively based on the magnitude of the absolute value of the first value W R(n) and the magnitude of the absolute value of the second value W L(n) .
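The right- and left-rotation branches above (Steps S 11 -S 16 ) can be sketched together. The amplification of whichever value is positive by the correction coefficient h = 2 follows Steps S 12 and S 14 :

```python
def rotation_correction(w_r, w_l, h=2.0):
    # Step S11/S12: rotating right (W_R > 0, W_L < 0) -> W_R' = h * W_R.
    # Step S14:     rotating left  (W_R < 0, W_L > 0) -> W_L' = h * W_L.
    if w_r > 0 and w_l < 0:
        w_r = h * w_r
    elif w_r < 0 and w_l > 0:
        w_l = h * w_l
    # Steps S13/S16: yaw increment and translation term computed from the
    # (possibly corrected) pair of values.
    return w_l - w_r, w_r + w_l
```

Feeding in W R(n) = 0.15 and W L(n) = −0.3 yields the corrected W R(n) ′ = 0.3 of the FIG. 16 example: a yaw change of −0.6 with no translation, i.e. rotation in place.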
  • the action of the object 100 is determined for each frame.
  • FIG. 14 is views illustrating a specific example (Part 1) of the actions of the object.
  • FIG. 14A is a plan view illustrating the loading applied to the first controller
  • FIG. 14B is a plan view illustrating the action of the object.
  • FIG. 14 illustrates the case that the data A Y(n−1) of the rotation angle of the n−1-th frame is 0, the first value W R(n) is −0.3, and the second value W L(n) is also −0.3.
  • the value given by subtracting the first value W R(n) from the second value W L(n) is 0, and accordingly, the data A Y(n) of the rotation angle of the n-th frame is the same value as the data A Y(n−1) of the rotation angle of the n−1-th frame. Accordingly, the object 100 does not change the direction.
  • the value given by adding the first value W R(n) and the second value W L(n) is −0.6, and accordingly, the object 100 proceeds in the negative direction of the Z axis at a velocity corresponding to the magnitude of 0.6.
  • FIG. 15 is views illustrating a specific example (Part 2) of the actions of the object.
  • FIG. 15A is a plan view illustrating the loading applied to the first controller
  • FIG. 15B is a plan view illustrating the action of the object.
  • FIG. 15 illustrates the case that the data A Y(n−1) of the rotation angle of the n−1-th frame is not 0, the first value W R(n) is −0.3, and the second value W L(n) is also −0.3.
  • the value given by subtracting the first value W R(n) from the second value W L(n) is 0, and accordingly the data A Y(n) of the rotation angle of the n-th frame is the same value as the data A Y(n−1) of the rotation angle of the n−1-th frame. Accordingly, the object 100 is retained in the same direction as in the n−1-th frame.
  • the value given by adding the first value W R(n) and the second value W L(n) is −0.6, and accordingly, in the case of FIG. 15 , the object 100 proceeds at a velocity corresponding to the magnitude of 0.6 in a direction corresponding to the data A Y(n) of the rotation angle.
  • FIG. 16 is views illustrating a specific example (Part 3) of the actions of the object.
  • FIG. 16A is a plan view illustrating the loading applied to the first controller
  • FIG. 16B is a plan view illustrating the action of the object.
  • FIG. 16 illustrates the case that the first value W R(n) ′ is 0.3, and the second value W L(n) is −0.3.
  • the value given by subtracting the first value W R(n) ′ from the second value W L(n) is −0.6, and accordingly, the data A Y(n) of the rotation angle of the n-th frame is varied by −0.6 from the data A Y(n−1) of the rotation angle of the n−1-th frame. More specifically, by the above-described conversion formula, the value of the rotation angle varies by −0.6 kπ. In FIG. 16 , in which the direction toward this side as viewed in the drawing is the positive direction of the Y axis, and the away direction as viewed in the drawing is the negative direction of the Y axis, the left rotation on the rotation axis is negative. Accordingly, when FIG. 16 is viewed from the front, the rotation direction of the object 100 is right.
  • the value given by adding the first value W R(n) ′ and the second value W L(n) is 0, and accordingly the positional coordinates of the center of the object 100 do not vary.
  • the object 100 does not proceed but rotates right in place by an angle corresponding to the magnitude of 0.6.
  • FIG. 17 is views illustrating a specific example (Part 4) of the actions of the object.
  • FIG. 17A is a plan view illustrating the loading applied to the first controller
  • FIG. 17B is a plan view illustrating the action of the object.
  • FIG. 17 illustrates the case that the first value W R(n) is −0.3, and the second value W L(n) is −0.1.
  • the value given by subtracting the first value W R(n) from the second value W L(n) is 0.2, and accordingly, the data A Y(n) of the rotation angle of the n-th frame is varied by +0.2 from the data A Y(n−1) of the rotation angle of the n−1-th frame. More specifically, when the above-described conversion formula is used, the value of the rotation angle is varied by +0.2 kπ.
  • the direction toward this side as viewed in the drawing is the positive direction of the Y axis
  • the away direction as viewed in the drawing is the negative direction of the Y axis
  • the right rotation on the rotation axis is positive. Accordingly, when FIG. 17 is viewed from the front, the rotation direction of the object 100 is left.
  • the value given by adding the first value W R(n) and the second value W L(n) is −0.4.
  • the object 100 proceeds at the velocity corresponding to the magnitude of 0.4 while rotating left by the angle corresponding to 0.2.
  • FIG. 18 is views illustrating a specific example (Part 5) of the actions of the object.
  • FIG. 18A is a plan view illustrating the loading applied to the first controller
  • FIG. 18B is a plan view illustrating the action of the object.
  • FIG. 18 illustrates the case that the first value W R(n) is −0.3, and the second value W L(n) ′ is 0.1.
  • the value given by subtracting the first value W R(n) from the second value W L(n) ′ is 0.4, and accordingly, the data A Y(n) of the rotation angle of the n-th frame is varied by +0.4 from the data A Y(n−1) of the rotation angle of the n−1-th frame. More specifically, when the above-described conversion formula is used, the value of the rotation angle is varied by +0.4 kπ.
  • the direction toward this side as viewed in the drawing is the positive direction of the Y axis
  • the away direction as viewed in the drawing is the negative direction of the Y axis
  • the right rotation on the rotation axis is positive. Accordingly, when FIG. 18 is viewed from the front, the rotation direction of the object 100 is left.
  • the value given by adding the first value W R(n) and the second value W L(n) ′ is −0.2.
  • the object 100 proceeds at the velocity corresponding to the magnitude of 0.2 while rotating left by the angle corresponding to 0.4.
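The worked examples of FIGS. 14-18 can be reproduced with a small per-frame routine. As before, the conversion A Y(n) ′ = kπ × A Y(n) is an assumed reading of "the above-described conversion formula":

```python
import math

def frame_action(a_prev, w_r, w_l, k=1.0):
    # One frame of action determination: yaw update, angle conversion,
    # then the two horizontal velocity components.
    a_y = a_prev + (w_l - w_r)
    a_rad = k * math.pi * a_y
    s = w_r + w_l
    return a_y, s * math.sin(a_rad), s * math.cos(a_rad)

# FIG. 14: W_R = W_L = -0.3 -> no turn; forward (negative Z) at speed 0.6.
# FIG. 17: W_R = -0.3, W_L = -0.1 -> yaw changes by +0.2 while advancing at 0.4.
```

Running the FIG. 14 inputs gives a yaw change of 0 and V Z = −0.6, matching the example exactly.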
  • the actions of the object 100 are thus determined, and the object 100 on the screen of the TV monitor 4 acts in correspondence with the determined actions.
  • the first value is calculated, based on a difference between the first load data based on an output of the first load sensor 82 a provided at the right front part of the supporter 78 for a player to be supported on, and the second load data based on an output of the second load sensor 82 b provided at the right rear part of the supporter 78
  • the second value is calculated, based on a difference between the third load data based on an output of the third load sensor 82 c provided at the left front part of the supporter 78 and the fourth load data based on an output of the fourth load sensor 82 d provided at the left rear part of the supporter 78 .
  • an action of the object to be displayed on the display screen is determined.
  • the object 100 can be rotated, etc., which makes the game realistically enjoyable.
  • the above-described embodiment is described by means of the example of determining the actions of the hover machine by the controller, but the object to have the actions determined is not limited to the hover machine.
  • the present invention is applicable to determine actions of, e.g., four-wheel vehicles, two-wheel vehicles, one-wheel vehicles, etc.
  • objects such as four-wheel vehicles, two-wheel vehicles, one-wheel vehicles, etc.
  • the objects such as four-wheel vehicles, two-wheel vehicles, one-wheel vehicles, etc., may be caused to retreat when the value given by adding the first value and the second value is positive.
  • the objects may be moved in the same directions as the object exemplified in the above-described embodiment or may be moved in directions opposite to the directions the exemplified object in the above-described embodiment is moved in.
  • the objects may be turned in the same directions as the object exemplified in the above-described embodiment or may be turned in the directions opposite to the directions the object exemplified in the above-described embodiment is turned in.
  • the game device and the computer program according to the present invention are useful to provide a game which can be realistically operated.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A first value is calculated, based on a difference between first load data based on an output of a first load sensor 82 a provided at a right front part of a supporter 78 for supporting a player and second load data based on an output of a second load sensor 82 b provided at a right rear part of the supporter; a second value is calculated, based on a difference between third load data based on an output of a third load sensor 82 c provided at a left front part of the supporter and fourth load data based on an output of a fourth load sensor 82 d provided at a left rear part of the supporter; and an action of an object to be displayed on a display screen is determined based on the calculated first value and the second value.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application is a Continuation of PCT application No. PCT/JP2010/053998, which was filed on Mar. 10, 2010, and which designated the United States of America, the entire contents of which are incorporated herein by reference.
  • TECHNICAL FIELD
  • The present invention relates to a game device and a computer program, more specifically, a game device using a controller, and a computer program.
  • BACKGROUND ART
  • Recently, a technique has been used in which a player gets on a controller provided with a plurality of load sensors and changes the loading modes on the controller to thereby make input operations of the game (Patent Reference 1).
  • The background art of the invention of the present application is as follows.
  • PRIOR ART REFERENCES Patent References
  • Patent Reference 1: Japanese Patent Application Unexamined Publication No. 2008-264195
  • Patent Reference 2: Japanese Patent Application Unexamined Publication No. 2008-119211
  • SUMMARY OF THE INVENTION Problems to be Solved by the Invention
  • However, by using such a controller, realistic operations cannot always be realized.
  • An object of the present invention is to provide a game device which can realize realistic operations, and a computer program.
  • Means for Solving the Problems
  • According to one aspect of the present invention, the present invention provides a computer program for operating a computer as a game device using a controller including a supporter for supporting a player; a first load sensor provided at a right front part of the supporter, for detecting a load from the supporter; a second load sensor provided at a right rear part of the supporter, for detecting a load from the supporter; a third load sensor provided at a left front part of the supporter, for detecting a load from the supporter; and a fourth load sensor provided at a left rear part of the supporter, for detecting a load from the supporter, said computer being operated as an action determination means which calculates a first value, based on a difference between first load data based on an output of the first load sensor and second load data based on an output of the second load sensor, calculates a second value, based on a difference between third load data based on an output of the third load sensor and fourth load data based on an output of the fourth load sensor, and determines an action of an object to be displayed on a display screen, based on the first value and the second value.
  • According to another aspect of the present invention, the present invention provides a game device comprising a controller including a supporter for supporting a player, a first load sensor provided at a right front part of the supporter, for detecting a load from the supporter, a second load sensor provided at a right rear part of the supporter, for detecting a load from the supporter, a third load sensor provided at a left front part of the supporter, for detecting a load from the supporter, and a fourth load sensor provided at a left rear part of the supporter, for detecting a load from the supporter, the game device comprising an action determination means which calculates a first value, based on a difference between first load data based on an output of the first load sensor and second load data based on an output of the second load sensor, calculates a second value, based on a difference between third load data based on an output of the third load sensor and fourth load data based on an output of the fourth load sensor, and determines an action of an object to be displayed on a display screen, based on the first value and the second value.
  • EFFECT OF THE INVENTION
  • According to the present invention, a first value is calculated, based on a difference between first load data based on an output of a first load sensor provided at a right front part of the supporter for a player to be supported on, and second load data based on an output of a second load sensor provided at a right rear part of the supporter. A second value is calculated, based on a difference between third load data based on an output of a third load sensor provided at a left front part of the supporter and fourth load data based on an output of a fourth load sensor provided at a left rear part of the supporter. Based on the first value and the second value, an action of an object displayed on a display screen is determined. Thus, according to the present invention, the object can be caused to rotate, etc., which makes it possible to realistically enjoy the game.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [FIG. 1] FIG. 1 is an appearance view of the game device according to one embodiment of the present invention.
  • [FIG. 2] FIG. 2 is a top view and a bottom view of a controller used in the embodiment of the present invention.
  • [FIG. 3] FIG. 3 is a block diagram of the game device according to the embodiment of the present invention.
  • [FIG. 4] FIG. 4 is a perspective view illustrating a player getting on the first controller.
  • [FIG. 5] FIG. 5 is a view illustrating an example of the display screen of the game device according to the embodiment of the present invention.
  • [FIG. 6] FIG. 6 is a view of the respective memories provided in the system memory.
  • [FIG. 7] FIG. 7 is a view illustrating a relationship between the local coordinate system and the object.
  • [FIG. 8] FIG. 8 is the flow chart of the processing of determination of an action of the object.
  • [FIG. 9] FIG. 9 is views illustrating the operations of the first controller with the legs of a player.
  • [FIG. 10] FIG. 10 is views illustrating the case that a player applies the weight to the tiptoe of the right leg and the tiptoe of the left leg.
  • [FIG. 11] FIG. 11 is views illustrating the case that the player applies the weight to the heel of the right leg and the heel of the left leg.
  • [FIG. 12] FIG. 12 is views illustrating the case that the player applies the weight to the heel of the right leg and the tiptoe of the left leg.
  • [FIG. 13] FIG. 13 is views illustrating the case that the player applies the weight to the heel of the left leg and the tiptoe of the right leg.
  • [FIG. 14] FIG. 14 is views illustrating a specific example (Part 1) of the actions of the object.
  • [FIG. 15] FIG. 15 is views illustrating a specific example (Part 2) of the actions of the object.
  • [FIG. 16] FIG. 16 is views illustrating a specific example (Part 3) of the actions of the object.
  • [FIG. 17] FIG. 17 is views illustrating a specific example (Part 4) of the actions of the object.
  • [FIG. 18] FIG. 18 is views illustrating a specific example (Part 5) of the actions of the object.
  • MODE FOR CARRYING OUT THE INVENTION An Embodiment
  • The game device and a computer program according to an embodiment of the present invention will be described with reference to FIGS. 1 to 18.
  • Appearance of the Game Device
  • First, the appearance of the game device 10 according to the present embodiment will be described. FIG. 1 is an appearance view of the game device according to the present embodiment.
  • The game device 10 according to the present embodiment is used, connected to a TV monitor (a display, display means) 4 placed on a TV rack 2.
  • The game device 10 includes a game device body 12, a first controller 20 to be operated by a player, and a second controller 22 to be operated by the player.
  • The game device body 12 and the TV monitor 4 are connected to each other by a cable. The game device body 12 and the first controller 20 can communicate with each other by wireless. The game device body 12 and the second controller 22 can communicate with each other by wireless.
  • FIG. 2 is an upper surface view and an underside view of the controller used in the present embodiment. FIG. 2A is the top view, and FIG. 2B is the bottom view.
  • As illustrated in FIG. 2, the first controller 20 includes a supporter 78 for a player to put the legs on, and four load sensors 82 a-82 d provided at the four corners of the supporter 78 for sensing loads applied to the supporter 78. The controller 20 transmits the load values sensed by the load sensors 82 a-82 d to the game device body 12. On the first controller 20, the player adjusts the loading of the body weight to thereby manipulate actions of objects displayed on the TV monitor 4 to play the game.
  • The second controller 22 is operated by, e.g., a hand 23 of the player. On the operation side of the second controller 22, various operation buttons 28, such as a power source button 28 a, a cross button 28 b, etc. (see FIG. 3) are provided. The player uses the second controller 22 to input commands, such as a start of the game, etc., in the game device body 12.
  • Structure of the Game Device
  • Next, the structure of the game device according to the present embodiment will be described. FIG. 3 is a block diagram of the game device according to the present embodiment.
  • As illustrated in FIG. 3, in the game device 10, a CPU 40 which executes the game program, the general control of the entire system, the coordinates computation for image displays, etc., and a system memory (RAM) 42 used as the buffer memory which stores programs and data necessary for the CPU 40 to process are connected to a bus arbiter 44 by a common bus line. The bus arbiter 44 controls flows of programs and data to the respective blocks of the game device 10 and devices connected outside.
  • A program data memory device or a memory storage medium (including an optical disc, an optical disc drive, etc. for driving CD-ROMs, etc. which are game record media) 46 storing the game program and data (including image data and music data), and a BOOT ROM 48 storing programs and data for actuating the game device 10 are connected to the bus arbiter 44 via bus lines.
  • To generate display images, polygon data having three-dimensional local coordinates data forming objects to be displayed (vertex data), NURBS (Non Uniform Rational B-Spline) data (curved-face or control point data) are stored in the system memory 42, and these are arranged in a world coordinate system of a three-dimensional virtual space by the CPU 40 and a geometry processor (not illustrated) to convert the local coordinates to the world coordinate system.
  • Furthermore, in the world coordinate system, view point coordinates generated by an operation of a player and in accordance with a progress of the game are set, and objects present in the view range seen at this view point in a prescribed view direction and at a view angle are converted into a view point coordinate system having the view coordinates at the origin, and the converted coordinates of the objects are transmitted to the rendering processor 50.
  • The rendering processor 50 first makes interpolation processing, such as light source processing, on the transmitted coordinates of the objects and details the surfaces of the objects by applying texture data stored in the graphic memory 52 to the objects. Furthermore, the rendering processor 50 projects the objects (polygons) from the three-dimensional space onto the two-dimensional plane (screen) for the display on the TV monitor 4 to convert the objects to two-dimensional coordinate data (screen coordinate system), displays first the polygons whose depth in the Z coordinate is smaller, i.e., nearer to the view point coordinates, to thereby generate two-dimensional images, and outputs the two-dimensional images to the TV monitor 4, such as a CRT, a liquid crystal display or others.
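The pipeline in the preceding paragraphs (local coordinates → world coordinates → view coordinates → screen coordinates) can be illustrated with a deliberately simplified transform. The math below is generic 3D-graphics boilerplate under assumed conventions (yaw about the Y axis, camera on the Z axis), not the device's actual geometry processor:

```python
import math

def local_to_screen(vertex, yaw, position, cam_z, f=1.0):
    # Rotate a local-coordinate vertex about the Y axis into the world
    # coordinate system and translate by the object's world position.
    x, y, z = vertex
    wx = x * math.cos(yaw) + z * math.sin(yaw) + position[0]
    wy = y + position[1]
    wz = -x * math.sin(yaw) + z * math.cos(yaw) + position[2]
    # View transform (camera at cam_z on the Z axis) and a simple
    # perspective projection onto the two-dimensional screen plane.
    depth = wz - cam_z
    return f * wx / depth, f * wy / depth, depth
```

The returned depth corresponds to the Z-ordering the text describes: polygons with smaller depth are drawn nearer to the view point.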
  • Via the bus arbiter 44, the rendering processor 50, which reproduces image (MOVIE) data read from the program data memory device or the memory storage medium 46 and generates images for image displays by an operation of a player or in accordance with a progress of the game, and a graphic memory 52 which stores graphic data, etc. necessary for the rendering processor 50 to generate images are connected to each other. The image signals outputted from the rendering processor 50 are converted from digital signals to analog signals by the video DAC 54 to be displayed by the TV monitor 4.
  • Via the bus arbiter 44, a sound processor 56 which reproduces music data read from the program data memory device or the memory storage medium 46 and generates effect sounds and voices by an operation of a player and in accordance with a progress of the game, and a sound memory 58 storing sound data, etc. necessary for the sound processor 56 to generate effect sounds and voices are connected to each other. The voice signals outputted from the sound processor 56 are converted from digital signals to analog signals by an audio DAC 60 to be outputted from the speaker of the TV monitor 4.
  • To the bus arbiter 44, a communication interface 62 is connected. The communication interface 62 is connected to outside networks, such as telephone circuits, etc., via a LAN adapter 64. The game device body 12 is connected to the Internet via the LAN adapter 64 and can communicate with other game devices, network servers, etc.
  • The communication interface 62 and the LAN adapter 64 use telephone circuits but may instead use terminal adapters (TA) or routers using telephone circuits, cable modems using cable TV circuits, wireless communication means using portable telephones or PHS, and other communication means, such as optical fiber communication means using optical fibers.
  • To the bus arbiter 44, a wireless receiver unit 68 which wirelessly communicates with the first controller 20 and the second controller 22 is connected. The wireless receiver unit 68 receives information transmitted from the first controller 20 and information transmitted from the second controller 22.
  • To the bus arbiter 44, a peripheral I/F (Interface) 70 is connected. Via the peripheral I/F 70, various peripheral devices can be connected.
  • The game device 10 is not limited to a domestic game device and can be a personal computer, a portable electronic game machine, an electronic device such as a portable telephone, a PDA or others, or an information processing device such as a game device installed in amusement facilities or shops such as game cafes, etc.
  • Controller
  • Next, the first controller used in the present embodiment will be described with reference to FIGS. 1 to 4.
  • The first controller 20 of the present embodiment functions as the operation device (operating means) of the game. The controller 20 includes the supporter (support plate) 78 for a player to get on, and the load sensors 82 a-82 d which detect loads applied to the supporter 78. The load sensors 82 a-82 d are arranged on the four corners of the supporter 78.
  • More specifically, the first load sensor 82 a is arranged at the right front part of the supporter 78, the second load sensor 82 b is arranged at the right rear part of the supporter 78, the third load sensor 82 c is arranged at the left front part of the supporter 78, and the fourth load sensor 82 d is arranged at the left rear part of the supporter 78.
  • On the four corners of the first controller 20, the legs 80 a-80 d are provided. The respective legs 80 a-80 d are formed, e.g., cylindrical. The first load sensor 82 a is supported by the leg 80 a, the second load sensor 82 b is supported by the leg 80 b, the third load sensor 82 c is supported by the leg 80 c, and the fourth load sensor 82 d is supported by the leg 80 d. That is, the supporter 78 is supported by the four legs 80 a-80 d via the load sensors 82 a-82 d provided on the four corners.
  • FIG. 4 is a perspective view of the state that a player is on the first controller.
  • When a player 6 gets on the supporter 78 of the first controller 20 as illustrated in FIG. 4, a load of the player 6 is transmitted to the load sensors 82 a-82 d via the supporter 78 of the first controller 20.
  • The load sensors 82 a-82 d can be, e.g., strain gauge-type load cells or others. The load sensors 82 a-82 d output electric signals of intensities corresponding to applied loads.
  • The first controller 20 can be suitably a controller which includes a plurality of load sensors for detecting loads applied to the supporter.
  • As illustrated in FIG. 3, the first controller 20 includes a CPU 88 which controls the general operation of the first controller 20. To the CPU 88, a ROM, RAM or others not illustrated is connected. The CPU 88 controls the operation of the first controller 20 by a computer program stored in the ROM.
  • The respective load sensors 82 a-82 d are connected to an AD converter 86 via the respective amplifiers 84 a-84 d.
  • A wireless transmission unit 90 transmits data from the first controller 20 to the game device body 12. The wireless transmission unit 90 is provided in the inside of, e.g., the first controller 20. To the load sensors 82 a-82 d, the amplifiers 84 a-84 d, the AD converter 86, the CPU 88 and the wireless transmission unit 90, prescribed voltages are supplied from a battery (not illustrated), such as electric cells or others.
  • The respective load sensors 82 a-82 d output signals indicating inputted loads. The electric signals outputted from the respective load sensors 82 a-82 d are amplified respectively by the amplifiers 84 a-84 d, converted from analog signals to digital data by the AD converter 86, and inputted to the CPU 88. To the data of the detected values of the respective load sensors 82 a-82 d, identification information of the respective load sensors 82 a-82 d is added, so that which of the load sensors 82 a-82 d each detected load value belongs to can be identified.
  • The CPU 88 obtains data f1-f4 of detected load values of the respective load sensors 82 a-82 d. f1 is data of the detected load value given by the first load sensor 82 a, f2 is data of the detected load value given by the second load sensor 82 b, f3 is data of the detected load value given by the third load sensor 82 c, and f4 is data of the detected load value given by the fourth load sensor 82 d.
  • The data f1-f4 of the detected load values given by the load sensors 82 a-82 d are transmitted as operational input data for the first controller 20 from the CPU 88 to the game device body 12 via the wireless transmission unit 90. The CPU 88 transmits the data f1-f4 of the detected load values given by the load sensors 82 a-82 d, e.g., for each frame.
  • Game Execution Processing
  • The game execution processing of the game device according to the present embodiment will be described.
  • FIG. 5 is a view illustrating an example of the display images of the game device according to the present embodiment.
  • In the game of the present embodiment, a player uses the first controller 20 to operate the direction (proceeding direction) and the velocity (moving velocity, proceeding velocity) of a hover machine (an object) 100 displayed on the TV monitor (a display, display means) 4, moves the hover machine 100 to a flag (a target to gain points) 102 displayed on the TV monitor 4 and obtains the flag 102, whereby the player goes on gaining points.
  • For example, when all of the flags 102 displayed are obtained, the game is completed. The player can enjoy the game by competing to obtain all the flags 102 within a shorter period of time.
  • On the upper left part of the screen of the TV monitor 4, the best record (BEST RECORD) 104 of the games so far played is indicated. At the upper central part of the TV monitor 4, a number 106 of remaining flags is indicated. At the upper right part of the screen of the TV monitor 4, an elapsed time 108 from the start of the game is indicated. On the screen of the TV monitor 4, flags 102 which are targets to gain points are displayed.
  • On the screen of the TV monitor 4, the object (the hover machine) 100 to be operated by a player in the three-dimensional virtual space is displayed. On the screen of the TV monitor 4, the hover machine 100 is displayed with an operator (a character) 110 riding the hover machine 100.
  • At the right front part, the right rear part, the left front part and the left rear part of the hover machine 100, jet engines 112 a-112 d are respectively provided. The jet engines 112 a-112 d are for applying driving forces to the hover machine 100. The hover machine 100 runs on virtual ice with the driving forces of the jet engines 112 a-112 d. The jet engines 112 a, 112 b on the right side and the jet engines 112 c, 112 d on the left side are controlled independently. The jet engine 112 a at the right front part and the jet engine 112 b at the right rear part do not simultaneously jet. The jet engine 112 c at the left front part and the jet engine 112 d at the left rear part do not simultaneously jet.
  • For example, by both of the jet engine 112 b at the right rear part and the jet engine 112 d at the left rear part jetting, forward driving forces are applied to the hover machine 100 by the jet engine 112 b at the right rear part and the jet engine 112 d at the left rear part, and the hover machine 100 moves forward.
  • By both of the jet engine 112 a at the right front part and the jet engine 112 c at the left front part jetting, backward driving forces are applied to the hover machine 100, and the hover machine 100 moves backward.
  • By the jet engine 112 a at the right front part and the jet engine 112 d at the left rear part jetting, driving forces are applied to the hover machine 100 by the jet engine 112 a at the right front part and the jet engine 112 d at the left rear part. When the driving force of the jet engine 112 a at the right front part and the driving force of the jet engine 112 d at the left rear part are equal to each other, the hover machine 100 rotates right in place without changing its central position. When the driving force of the jet engine 112 a at the right front part is larger than the driving force of the jet engine 112 d at the left rear part, the hover machine 100 moves backward while rotating right. When the driving force of the jet engine 112 d at the left rear part is larger than the driving force of the jet engine 112 a at the right front part, the hover machine 100 moves forward while rotating right.
  • By the jet engine 112 b at the right rear part and the jet engine 112 c at the left front part jetting, driving forces are applied to the hover machine 100 by the jet engine 112 b at the right rear part and the jet engine 112 c at the left front part. When the driving force of the jet engine 112 b at the right rear part and the driving force of the jet engine 112 c at the left front part are equal to each other, the hover machine 100 rotates left in place without changing its central position. When the driving force of the jet engine 112 b at the right rear part is larger than the driving force of the jet engine 112 c at the left front part, the hover machine 100 moves forward while rotating left. When the driving force of the jet engine 112 c at the left front part is larger than the driving force of the jet engine 112 b at the right rear part, the hover machine 100 moves backward while rotating left.
  • On the screen of the TV monitor 4, obstacles 114 that hinder the advance of the hover machine 100 are also displayed.
  • Obtain Weight Data
  • Before the game starts, a display which asks a player to get on the first controller 20 is displayed on the TV monitor 4.
  • When the player gets on the first controller 20, the measuring of the weight of the player is performed. Specifically, the total value of the data f1-f4 of the detected load values given by the four load sensors 82 a-82 d is taken as the data F of the weight of the player. The weight F of the player is expressed by the following formula.

  • F = f1 + f2 + f3 + f4
  • FIG. 6 is a view of the respective memories provided in the system memory.
  • The CPU 40 stores the data F of the weight of the player in the player weight memory provided in the system memory 42 (see FIG. 6).
  • Obtain Normalization Coefficient
  • Next, the CPU 40 calculates a coefficient a for normalizing the data f1-f4. The coefficient (normalization coefficient) a for normalizing the f1-f4 is expressed by the following formula.

  • a = 1/F
  • The CPU 40 stores the normalization coefficient a in the normalization coefficient memory provided in the system memory 42 (see FIG. 6).
  • The data f1-f4 of the detected load values of the respective load sensors 82 a-82 d are obtained for each frame. The CPU 40 stores the data f1-f4 of the obtained detected load values in the f1-f4 memory provided in the system memory 42 (see FIG. 6).
  • The data g1(n)-g4(n) of the detected load values given by normalizing the data f1(n)-f4(n) of the n-th frame are expressed by the following formulas.

  • g1(n) = f1(n) × a

  • g2(n) = f2(n) × a

  • g3(n) = f3(n) × a

  • g4(n) = f4(n) × a
  • Since the data g1(n)-g4(n) are normalized, the total of the data g1(n)-g4(n) is 1.0.
  • The CPU 40 stores the normalized data g1-g4 in the g1-g4 memory provided in the system memory 42 (see FIG. 6).
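  • The weight measurement and normalization described above can be sketched in Python as follows (the sensor readings f1-f4 are hypothetical example values; the embodiment itself prescribes no programming language):

```python
# Sketch of the weight measurement and normalization steps.
# f holds hypothetical detected load values f1-f4 from the four load sensors.
f = [20.0, 25.0, 22.0, 33.0]

F = sum(f)       # player weight: F = f1 + f2 + f3 + f4
a = 1.0 / F      # normalization coefficient: a = 1/F

# Normalized detected load values g1-g4; their total is 1.0 by construction.
g = [fi * a for fi in f]
```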
  • Calculate Load Data
  • In the present embodiment, correction coefficients (correction values) b1-b4 for making each of the normalized detected load values g1-g4 equal to 0.25 when a player is standing upright on the first controller 20 without motion are determined respectively in advance. The correction coefficients b1-b4 are not determined for the respective players but are applied uniformly to all the players.
  • The correction coefficients b1-b4 are applied uniformly to all the players here but the correction coefficients may be determined for the respective players.
  • Data (load data) W1(n)-W4(n) of the detected load values normalized and corrected are expressed by the following formulas.

  • W1(n) = g1(n) × b1 = f1(n) × a × b1

  • W2(n) = g2(n) × b2 = f2(n) × a × b2

  • W3(n) = g3(n) × b3 = f3(n) × a × b3

  • W4(n) = g4(n) × b4 = f4(n) × a × b4
  • The CPU 40 stores the calculated load data W1(n)-W4(n) in the W1-W4 memory provided in the system memory 42.
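  • A minimal sketch of this correction step, assuming (as one possible way of predetermining the coefficients) that each b is chosen so that a still, upright stance yields 0.25 per sensor; all values here are hypothetical:

```python
# Hypothetical normalized values g1-g4 measured while a player stands still.
g_still = [0.20, 0.25, 0.22, 0.33]

# One assumed calibration rule for the correction coefficients b1-b4:
# choose each b_i so that g_i(still) * b_i = 0.25.
b = [0.25 / gi for gi in g_still]

# Per-frame corrected load data: W_i(n) = g_i(n) * b_i.
g_n = [0.18, 0.27, 0.22, 0.33]   # hypothetical normalized values at frame n
W = [gi * bi for gi, bi in zip(g_n, b)]
```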
  • Lay Out Objects
  • The position coordinates P(n) of the object (the hover machine) 100 at the n-th frame in the world coordinate system are expressed as follows.

  • P(n) = (PX(n), PY(n), PZ(n))
  • The position coordinates P(n) are stored in the position coordinates memory provided in the system memory 42 (see FIG. 6).
  • Here, the X-Z plane in the world coordinate system is a plane parallel with the virtual ground, and the Y axis of the world coordinate system is normal to the virtual ground.
  • FIG. 7 is a view showing the relationships between the local coordinate system and an object. The left-to-right direction as viewed in FIG. 7 corresponds to the X axial direction of the local coordinate system. The rightward direction as viewed in FIG. 7 corresponds to the positive direction of the X axis, and the leftward direction corresponds to the negative direction of the X axis. The up-to-down direction as viewed in FIG. 7 corresponds to the Z axial direction of the local coordinate system. The downward direction as viewed in FIG. 7 is the positive direction of the Z axis, and the upward direction is the negative direction of the Z axis. The direction normal to the drawing sheet of FIG. 7 corresponds to the Y axial direction of the local coordinate system. The direction toward the viewer is the positive direction of the Y axis, and the direction away from the viewer is the negative direction of the Y axis.
  • On the initial stage of the game, the central line of the object 100 in the left-to-right direction agrees with the X axial direction. More specifically, on the initial stage of the game, the rightward direction of the object 100 agrees with the positive direction of the X axis, and the leftward direction of the object 100 agrees with the negative direction of the X axis.
  • On the initial stage of the game, the central line of the object 100 in the front-to-rear direction agrees with the Z axial direction. More specifically, on the initial stage of the game, the rear direction of the object 100 agrees with the positive direction of the Z axis, and the forward direction of the object 100 agrees with the negative direction of the Z axis.
  • In the state that the object 100 is positioned on the horizontal plane, the central line of the object 100 in the up-to-down direction agrees with the Y axis. More specifically, in the state that the object 100 is positioned on the horizontal plane, the downward direction of the object 100 agrees with the positive direction of the Y axis, and the upward direction of the object 100 agrees with the negative direction of the Y axis.
  • Determine Action of the Object
  • FIG. 8 is the flow chart of the processing of determination of action of the object.
  • When the execution of the game is started, the CPU makes the normalization with the normalization coefficient a and the correction with the correction coefficients b1-b4 as described above, based on the data f1(n)-f4(n) of the detected load values of the load sensors 82 a-82 d to thereby calculate the respective load data W1(n)-W4(n) (Step S1). The CPU 40 makes the computation of the respective load data W1(n)-W4(n) for each frame.
  • The CPU 40 stores the calculated load data W1(n)-W4(n) in the W1-W4 memory provided in the system memory 42 (see FIG. 6).
  • Then, the CPU 40 subtracts first load data W1(n) from the second load data W2(n) to thereby calculate the first value WR(n) (Step S2). The first value WR(n) is calculated for each frame.
  • The first value WR(n) is expressed by the following formula.

  • WR(n) = W2(n) − W1(n)
  • The CPU 40 stores the calculated first value WR(n) in the WR(n) memory provided in the system memory 42 (see FIG. 6).
  • The CPU 40 subtracts the third load data W3(n) from the fourth load data W4(n) to thereby calculate the second value WL(n) (Step S3). The second value WL(n) is calculated for each frame.
  • The second value WL(n) is expressed by the following formula.

  • WL(n) = W4(n) − W3(n)
  • The CPU 40 stores the calculated second value WL(n) in the WL(n) memory provided in the system memory 42 (see FIG. 6).
  • Next, when the absolute value of the first value WR(n) is smaller than a prescribed threshold value c (Step S4), the CPU 40 corrects the first value WR(n) (Step S5). Specifically, when the absolute value of the first value WR(n) is smaller than the prescribed threshold value c, the first value WR(n) is multiplied by a prescribed correction coefficient d. The prescribed threshold value c is set at, e.g., 0.05. The prescribed correction coefficient d is set at, e.g., 0.02. Such processing is for preventing the object 100 from acting due to a small change of a load even when the player does not intend it. This permits the action of the object 100 to be greatly suppressed when a player is standing upright without motion.
  • When the absolute value of the second value WL(n) is smaller than the prescribed threshold value c (Step S6), the CPU 40 corrects the second value WL(n) (Step S7). Specifically, when the absolute value of the second value WL(n) is smaller than the prescribed threshold value c, the second value WL(n) is multiplied by the prescribed correction coefficient d.
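  • Steps S2 to S7 can be sketched as follows (a Python sketch; the function name is ours, and the values of c and d follow the examples given in the text):

```python
C = 0.05  # prescribed threshold value c
D = 0.02  # prescribed correction coefficient d

def first_second_values(W1, W2, W3, W4):
    """Compute WR(n) and WL(n) and damp small, unintended load changes."""
    WR = W2 - W1              # first value: WR(n) = W2(n) - W1(n)
    WL = W4 - W3              # second value: WL(n) = W4(n) - W3(n)
    if abs(WR) < C:
        WR *= D               # suppress a tiny right-side imbalance
    if abs(WL) < C:
        WL *= D               # suppress a tiny left-side imbalance
    return WR, WL
```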
  • Next, the CPU 40 computes the product of the first value WR(n) and the second value WL(n), i.e., the value of (WR(n)×WL(n)) (Step S8). When the value of (WR(n)×WL(n)) is negative and is smaller than a prescribed threshold value e (Step S9), the processing of rotating the object described later is performed (Steps S11-S15). When the value of (WR(n)×WL(n)) is positive, or is negative but larger than the prescribed threshold value e, the action of the object 100 is determined as follows, based on the first value WR(n) and the second value WL(n) (Step S10). The prescribed threshold value e can be, e.g., −0.01. Such use of the prescribed threshold value e is for preventing the rotation, etc. of the object 100 due to a small load change even when the player does not intend it.
  • The CPU 40 determines an action of the object 100 as follows, based on the first value WR(n) and the second value WL(n).
  • The rotation angle A(n) of the object 100 in the n-th frame of the local coordinate system is expressed as follows.

  • A(n) = (AX(n), AY(n), AZ(n))
  • AX(n) is the data of the rotation angle of the object 100 on the rotation axis of the X axis. AY(n) is the data of the rotation angle of the object 100 on the rotation axis of the Y axis. AZ(n) is the data of the rotation angle of the object 100 on the rotation axis of the Z axis. The right rotation on the rotation axis is positive, and the left rotation on the rotation axis is negative.
  • When the plane on which the object 100 moves is horizontal, no rotation is made on the X axis and no rotation is made on the Z axis. The rotation on the Y axis alone is made. Here, to make the description simple, the description will be made by means of the example that the plane on which the object 100 moves is horizontal. Accordingly, the values of AX(n) and AZ(n) are both 0 here. AY(n) is 0 when the central line of the object 100 in the front-to-rear direction is parallel with the Z axis and the forward direction of the object 100 agrees with the negative direction of the Z axis.
  • The object 100 may move on an inclined plane. When the object 100 moves on an inclined plane, not only the AY(n) but also the AX(n) and the AZ(n) are considered.
  • The rotation angle AY(n) in the local coordinate system is given by the following formula.

  • AY(n) = AY(n−1) + (WL(n) − WR(n))
  • AY(n−1) is the data of the rotation angle of the n−1-th frame, and AY(n) is the data of the rotation angle of the n-th frame. The data AY(n−1) of the rotation angle of the n−1-th frame is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6).
  • The minimum value of (WL(n)−WR(n)) is −1.0, and the maximum value thereof is 1.0.
  • The data of the rotation angle A(n) of the n-th frame thus given is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6).
  • When the object 100 is displayed in the world coordinate system, the data A(n) of a rotation angle is converted to a value A(n)′ of the angle by a prescribed conversion formula or others.
  • As such prescribed conversion formula, the following conversion formula, for example, can be used.

  • A(n)′ = k × A(n) × π
  • k is a prescribed coefficient.
  • The velocity V(n) of the object 100 in the n-th frame of the world coordinate system is expressed as follows.

  • V(n) = (VX(n), VY(n), VZ(n))
  • The velocity VX(n) of the object 100 in the X axial direction is expressed as follows.

  • VX(n) = (WR(n) + WL(n)) × sin(AY(n)′)
  • When the plane on which the object 100 moves is horizontal, the velocity VY(n) of the object 100 in the Y axial direction is 0.
  • The velocity VZ(n) of the object 100 in the Z axial direction is expressed as follows.

  • VZ(n) = (WR(n) + WL(n)) × cos(AY(n)′)
  • When the object 100 is displayed in the world coordinate system, the conversion or others of the VX(n), the VY(n) and the VZ(n) is made by a prescribed conversion formula or others.
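  • The rotation angle update and the velocity formulas above can be sketched as follows (a Python sketch; the value of the coefficient k is assumed for illustration only):

```python
import math

K = 0.5  # prescribed coefficient k (value assumed for illustration)

def update_motion(AY_prev, WR, WL):
    """Update AY(n) and derive VX(n), VZ(n) on a horizontal plane."""
    AY = AY_prev + (WL - WR)        # AY(n) = AY(n-1) + (WL(n) - WR(n))
    AY_world = K * AY * math.pi     # A(n)' = k * A(n) * pi
    VX = (WR + WL) * math.sin(AY_world)
    VZ = (WR + WL) * math.cos(AY_world)
    return AY, VX, VZ
```

Note that with equal values on both sides the angle is unchanged and the object simply translates along its current heading (negative Z being forward).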
  • FIG. 9 shows operations of the first controller with the legs of a player.
  • When the center of gravity of the right leg 8 a of the player is neither forward nor backward (see FIG. 9A), the first value WR(n) is WR(n)=0. In this case, the jet engine 112 a at the right front part and the jet engine 112 b at the right rear part do not jet.
  • When the center of gravity of the left leg 8 b of the player is neither forward nor backward (see FIG. 9A), the second value WL(n) is WL(n)=0. In this case, the jet engine 112 c at the left front part and the jet engine 112 d at the left rear part do not jet.
  • When the player applies the weight to the side of the tiptoe of the right leg 8 a (see FIG. 9B), the first value WR(n) is WR(n)<0. In this case, the jet engine 112 b at the right rear part jets. The intensity of the jetting is set, based on the magnitude of the absolute value of the first value WR(n).
  • When the player applies the weight to the side of the tiptoe of the left leg 8 b (see FIG. 9B), the second value WL(n) is WL(n)<0. In this case, the jet engine 112 d at the left rear part jets. The intensity of the jetting is set, based on the magnitude of the absolute value of the second value WL(n).
  • When the player applies the weight to the side of the heel of the right leg 8 a (see FIG. 9C), the first value WR(n) is WR(n)>0. In this case, the jet engine 112 a at the right front part jets. The intensity of the jetting is set, based on the magnitude of the absolute value of the first value WR(n).
  • When the player applies the weight to the side of the heel of the left leg 8 b (see FIG. 9C), the second value WL(n) is WL(n)>0. In this case, the jet engine 112 c at the left front part jets. The intensity of the jetting is set, based on the magnitude of the absolute value of the second value WL(n).
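  • The relationship described for FIG. 9 between the signs of the two values and the engines displayed as jetting can be sketched as follows (a Python sketch; the function name and string labels are ours):

```python
def active_engines(WR, WL):
    """Return the jet engines that fire for given first/second values."""
    engines = []
    if WR < 0:
        engines.append("112b (right rear)")    # weight on the right tiptoe
    elif WR > 0:
        engines.append("112a (right front)")   # weight on the right heel
    if WL < 0:
        engines.append("112d (left rear)")     # weight on the left tiptoe
    elif WL > 0:
        engines.append("112c (left front)")    # weight on the left heel
    return engines
```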
  • FIG. 10 shows the case that a player applies the weight to the side of the tiptoe of the right leg and the side of the tiptoe of the left leg. The hatchings in FIG. 10 indicate the parts the player applies the weight to. FIG. 10A is a plan view illustrating the load applied to the first controller, and FIG. 10B is an example of a display image illustrating a motion of the object.
  • When the player applies the weight to the side of the tiptoe of the right leg 8 a and also to the side of the tiptoe of the left leg 8 b, the CPU 40 displays on the TV monitor 4 the jetting of the jet engine 112 b at the right rear part and the jetting of the jet engine 112 d at the left rear part. The action of the object 100 is determined as described above.
  • FIG. 11 shows the case that a player applies the weight to the side of the heel of the right leg and also to the side of the heel of the left leg. The hatchings in FIG. 11 indicate the portions the player applies the weight to. FIG. 11A is a plan view illustrating the load applied to the first controller, and FIG. 11B is a view illustrating an example of a display image of the action of the object.
  • When a player applies the weight to the side of the heel of the right leg 8 a and also to the side of the heel of the left leg 8 b, the CPU 40 displays on the TV monitor 4 the jetting of the jet engine 112 a at the right front part and also the jetting of the jet engine 112 c at the left front part. The action of the object 100 is determined as described above.
  • In Step S9 (see FIG. 8) described above, when the value of the (WR(n)×WL(n)) is negative, and the value of the (WR(n)×WL(n)) is smaller than the prescribed threshold value e, the processing of the rotation of the object 100 is made as follows.
  • FIG. 12 shows the case that the player applies the weight to the side of the heel of the right leg and to the side of the tiptoe of the left leg. The hatchings in FIG. 12 indicate the portions the player applies the weight to. FIG. 12A is a plan view illustrating the load applied to the first controller, and FIG. 12B is a view illustrating an example of the display image of the action of the object.
  • As illustrated in FIG. 12, when the player applies the weight to the side of the tiptoe of the left leg 8 b while applying the weight to the side of the heel of the right leg 8 a, the first value WR(n) and the second value WL(n) are as follows.

  • WR(n) > 0

  • WL(n) < 0
  • As described above, when the first value WR(n) is positive, and the second value WL(n) is negative (Step S11), the object 100 rotates right.
  • In this case, however, it is relatively easy for the left leg 8 b to apply the weight to the side of the tiptoe, but it is not always easy for the right leg 8 a to apply the weight sufficiently to the side of the heel. Accordingly, in this case, the first value WR(n) is corrected (Step S12). Specifically, the first value WR(n) is multiplied by the prescribed correction coefficient h. The prescribed correction coefficient h can be, e.g., 2.
  • The corrected first value WR(n)′ is expressed as follows.

  • WR(n)′ = WR(n) × h
  • The CPU 40 stores the corrected first value WR(n)′ in the WR(n)′ memory provided in the system memory 42 (see FIG. 6).
  • When the object 100 is rotated right, by using the corrected first value WR(n)′ and the second value WL(n), which has not been corrected, an action of the object 100 is determined as follows (Step S13).
  • The data AY(n) of the rotation angle in the local coordinate system is given by the following formula.

  • AY(n) = AY(n−1) + (WL(n) − WR(n)′)
  • AY(n−1) is the data of the rotation angle in the n−1-th frame, and AY(n) is the data of the rotation angle in the n-th frame. The data of the rotation angle of the n−1-th frame is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6).
  • The data of the rotation angle AY(n) of the n-th frame thus given is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6).
  • The velocity VX(n) of the object 100 in the X axial direction is expressed as follows.

  • VX(n) = (WR(n)′ + WL(n)) × sin(AY(n)′)
  • When the plane the object 100 moves on is horizontal, the velocity VY(n) of the object 100 in the Y axial direction is 0.
  • The velocity VZ(n) of the object 100 in the Z axial direction is expressed as follows.

  • VZ(n) = (WR(n)′ + WL(n)) × cos(AY(n)′)
  • The CPU 40 displays on the TV monitor 4 the jetting of the jet engine 112 a at the right front part and also the jetting of the jet engine 112 d at the left rear part. The intensities of the respective jettings are set, respectively based on the magnitude of the absolute value of the first value WR(n) and the magnitude of the absolute value of the second value WL(n).
  • FIG. 13 is views illustrating the case that a player applies the weight to the side of the heel of the left leg and to the side of the tiptoe of the right leg. The hatchings in FIG. 13 indicate the portions the player applies the weight to. FIG. 13A is a plan view illustrating loading the first controller, and FIG. 13B is a view illustrating an example of a display image of the action of the object.
  • As illustrated in FIG. 13, when the player applies the weight to the side of the tiptoe of the right leg 8 a while applying the weight to the side of the heel of the left leg 8 b, the first value WR(n) and the second value WL(n) are as follows.

  • WR(n) < 0

  • WL(n) > 0
  • When the first value WR(n) is negative, and the second value WL(n) is positive as above, the object 100 is rotated left.
  • In this case, however, it is relatively easy to apply the weight to the side of the tiptoe of the right leg 8 a, but it is not always easy to sufficiently apply the weight to the side of the heel of the left leg 8 b. Accordingly, in this case, the second value WL(n) is corrected (Step S14). Specifically, the second value WL(n) is multiplied by the prescribed correction coefficient h. The prescribed correction coefficient h can be, e.g., 2.
  • The corrected second value WL(n)′ is expressed as follows.

  • WL(n)′ = WL(n) × h
  • The CPU 40 stores the corrected second value WL(n)′ in the WL(n)′ memory provided in the system memory 42 (see FIG. 6).
  • When the object 100 is rotated left, an action of the object 100 is determined as follows by using the first value WR(n), which has not been corrected, and the corrected second value WL(n)′ (Step S15).
  • The data of a rotation angle AY(n) in the local coordinate system is given by the following formula.

  • AY(n) = AY(n−1) + (WL(n)′ − WR(n))
  • AY(n−1) is the data of the rotation angle of the n−1-th frame, and AY(n) is the data of the rotation angle of the n-th frame. The data AY(n−1) of the rotation angle of the n−1-th frame is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6).
  • The data of the rotation angle AY(n) of the n-th frame thus given is stored in the rotation angle memory provided in the system memory 42 (see FIG. 6).
  • The velocity VX(n) of the object 100 in the X axial direction is expressed as follows.

  • VX(n) = (WR(n) + WL(n)′) × sin(AY(n)′)
  • When the plane the object 100 moves on is horizontal, the velocity VY(n) of the object 100 in the Y axial direction is 0.
  • The velocity VZ(n) of the object 100 in the Z axial direction is expressed as follows.

  • VZ(n) = (WR(n) + WL(n)′) × cos(AY(n)′)
  • The CPU 40 displays on the TV monitor 4 the jetting of the jet engine 112 c at the left front part and also the jetting of the jet engine 112 b at the right rear part. The intensities of the respective jettings are set based on the magnitude of the absolute value of the first value WR(n) and the magnitude of the absolute value of the second value WL(n), respectively.
  • Thus, the action of the object 100 is determined for each frame.
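The per-frame computation described above (correction, rotation-angle update, and velocity) can be gathered into one sketch. All names are illustrative; h = 2 is only the example coefficient from the text, and the scale of the "conversion formula" that turns the rotation-angle data into the converted angle AY(n)′ is assumed here to be kπ with k = 1, since the formula itself is defined elsewhere in the description.

```python
import math

H = 2.0   # prescribed correction coefficient (example value from the text)
K = 1.0   # assumed scale: converted angle AY(n)' = AY(n) * K * pi

def determine_action(a_prev, wr, wl, h=H, k=K):
    """One frame of the action determination (Steps S14 and S16).

    a_prev -- rotation-angle data AY(n-1) of the n-1-th frame
    wr, wl -- first value WR(n) and second value WL(n)
    Returns (AY(n), VX(n), VZ(n)).
    """
    # Step S14: correct the value that is hard to produce via the heel
    if wr < 0 < wl:
        wl *= h                  # corrected second value WL'
    elif wl < 0 < wr:
        wr *= h                  # corrected first value WR'
    # Rotation-angle data: AY(n) = AY(n-1) + (WL' - WR')
    a = a_prev + (wl - wr)
    theta = a * k * math.pi      # converted rotation angle AY(n)'
    # Velocities along the local X and Z axes
    vx = (wr + wl) * math.sin(theta)
    vz = (wr + wl) * math.cos(theta)
    return a, vx, vz
```

For example, with AY(n−1) = 0 and WR(n) = WL(n) = −0.3 (the FIG. 14 case), the direction is unchanged and the object moves in the negative Z direction at a speed corresponding to 0.6.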
  • The processing of determining an action of the object illustrated in FIG. 8 is repeated until the game finishes.
  • Next, actions of the object 100 will be more specifically described with reference to FIGS. 14 to 17.
  • FIGS. 14A and 14B are views illustrating a specific example (Part 1) of the actions of the object. FIG. 14A is a plan view illustrating the loading applied to the first controller, and FIG. 14B is a plan view illustrating the action of the object.
  • In FIG. 14, the data AY(n−1) of the rotation angle of the n−1-th frame is 0, the first value WR(n) is −0.3, and the second value WL(n) is also −0.3.
  • The value given by subtracting the first value WR(n) from the second value WL(n) is 0, and accordingly, the data AY(n) of the rotation angle of the n-th frame is the same value as the data AY(n−1) of the rotation angle of the n−1-th frame. Accordingly, the object 100 does not change the direction.
  • The value given by adding the first value WR(n) and the second value WL(n) is −0.6, and accordingly, the object 100 proceeds in the negative direction of the Z axis at a velocity corresponding to the magnitude of 0.6.
  • FIGS. 15A and 15B are views illustrating a specific example (Part 2) of the actions of the object. FIG. 15A is a plan view illustrating the loading applied to the first controller, and FIG. 15B is a plan view illustrating the action of the object.
  • FIG. 15 illustrates the case where the data AY(n−1) of the rotation angle of the n−1-th frame is not 0, the first value WR(n) is −0.3, and the second value WL(n) is also −0.3.
  • The value given by subtracting the first value WR(n) from the second value WL(n) is 0, and accordingly the data AY(n) of the rotation angle of the n-th frame is the same value as the data AY(n−1) of the rotation angle of the n−1-th frame. Accordingly, the object 100 is retained in the same direction as in the n−1-th frame.
  • The value given by adding the first value WR(n) and the second value WL(n) is −0.6, and accordingly, in the case of FIG. 15, the object 100 proceeds at a velocity corresponding to the magnitude of 0.6 in a direction corresponding to the data AY(n) of the rotation angle.
  • FIGS. 16A and 16B are views illustrating a specific example (Part 3) of the actions of the object. FIG. 16A is a plan view illustrating the loading applied to the first controller, and FIG. 16B is a plan view illustrating the action of the object.
  • FIG. 16 illustrates the case where the corrected first value WR(n)′ is 0.3, and the second value WL(n) is −0.3.
  • The value given by subtracting the corrected first value WR(n)′ from the second value WL(n) is −0.6, and accordingly, the data AY(n) of the rotation angle of the n-th frame is varied by −0.6 from the data AY(n−1) of the rotation angle of the n−1-th frame. More specifically, by the above-described conversion formula, the value of the rotation angle varies by −0.6 kπ. In FIG. 16, the direction toward the viewer is the positive direction of the Y axis, the direction away from the viewer is the negative direction of the Y axis, and left rotation about the rotation axis is negative. Accordingly, when FIG. 16 is viewed from the front, the rotation direction of the object 100 is right.
  • The value given by adding the corrected first value WR(n)′ and the second value WL(n) is 0, and accordingly the positional coordinates of the center of the object 100 do not vary.
  • Thus, in the case of FIG. 16, the object 100 does not proceed but rotates right there by an angle corresponding to the magnitude of 0.6.
  • FIGS. 17A and 17B are views illustrating a specific example (Part 4) of the actions of the object. FIG. 17A is a plan view illustrating the loading applied to the first controller, and FIG. 17B is a plan view illustrating the action of the object.
  • FIG. 17 illustrates the case that the first value WR(n) is −0.3, and the second value WL(n) is −0.1.
  • The value given by subtracting the first value WR(n) from the second value WL(n) is 0.2, and accordingly, the data AY(n) of the rotation angle of the n-th frame is varied by +0.2 from the data AY(n−1) of the rotation angle of the n−1-th frame. More specifically, when the above-described conversion formula is used, the value of the rotation angle is varied by +0.2 kπ. In FIG. 17, the direction toward the viewer is the positive direction of the Y axis, the direction away from the viewer is the negative direction of the Y axis, and right rotation about the rotation axis is positive. Accordingly, when FIG. 17 is viewed from the front, the rotation direction of the object 100 is left.
  • The value given by adding the first value WR(n) and the second value WL(n) is −0.4.
  • Accordingly, in the case of FIG. 17, the object 100 proceeds at the velocity corresponding to the magnitude of 0.4 while rotating left by the angle corresponding to 0.2.
  • FIGS. 18A and 18B are views illustrating a specific example (Part 5) of the actions of the object. FIG. 18A is a plan view illustrating the loading applied to the first controller, and FIG. 18B is a plan view illustrating the action of the object.
  • FIG. 18 illustrates the case where the first value WR(n) is −0.3, and the corrected second value WL(n)′ is 0.1.
  • The value given by subtracting the first value WR(n) from the corrected second value WL(n)′ is 0.4, and accordingly, the data AY(n) of the rotation angle of the n-th frame is varied by +0.4 from the data AY(n−1) of the rotation angle of the n−1-th frame. More specifically, when the above-described conversion formula is used, the value of the rotation angle is varied by +0.4 kπ. In FIG. 18, the direction toward the viewer is the positive direction of the Y axis, the direction away from the viewer is the negative direction of the Y axis, and right rotation about the rotation axis is positive. Accordingly, when FIG. 18 is viewed from the front, the rotation direction of the object 100 is left.
  • The value given by adding the first value WR(n) and the corrected second value WL(n)′ is −0.2.
  • Accordingly, in the case of FIG. 18, the object 100 proceeds at the velocity corresponding to the magnitude of 0.2 while rotating left by the angle corresponding to 0.4.
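The five worked examples of FIGS. 14 to 18 can be recomputed with a short sketch. The function name is illustrative, and the primed (already corrected) values from the text are passed in directly; the change of direction is WL′ − WR′ and the speed is the magnitude of WR′ + WL′.

```python
def frame(a_prev, wr, wl):
    """Return (AY(n), speed) for one frame, given already-corrected values."""
    delta = wl - wr          # change of the rotation-angle data
    speed = abs(wr + wl)     # magnitude of the velocity
    return a_prev + delta, speed

fig14 = frame(0.0, -0.3, -0.3)   # direction unchanged, speed 0.6
fig16 = frame(0.0,  0.3, -0.3)   # turn by -0.6 (right rotation), speed 0
fig17 = frame(0.0, -0.3, -0.1)   # turn by +0.2 (left rotation), speed 0.4
fig18 = frame(0.0, -0.3,  0.1)   # turn by +0.4 (left rotation), speed 0.2
```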
  • Thus, the actions of the object 100 are determined, and the object 100 on the screen of the TV monitor 4 acts in correspondence with the determined actions.
  • In the game of the present embodiment, when all the flags 102 have been acquired, the game finishes.
  • As described above, according to the present embodiment, the first value is calculated based on a difference between the first load data, based on an output of the first load sensor 82 a provided at the right front part of the supporter 78 on which a player is supported, and the second load data, based on an output of the second load sensor 82 b provided at the right rear part of the supporter 78; and the second value is calculated based on a difference between the third load data, based on an output of the third load sensor 82 c provided at the left front part of the supporter 78, and the fourth load data, based on an output of the fourth load sensor 82 d provided at the left rear part of the supporter 78. Then, based on the first value and the second value, an action of the object to be displayed on the display screen is determined. Thus, according to the present embodiment, the rotation, etc. of the object 100 can be performed, which makes the game realistically enjoyable.
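The derivation of the first and second values from the four load data can be sketched as follows, under the sign convention of claims 5 to 12 (the first value is the second load data minus the first, and the second value is the fourth load data minus the third). The function and parameter names are illustrative, not from the patent.

```python
def first_and_second_values(load_rf, load_rr, load_lf, load_lr):
    """Derive (WR, WL) from the four load data.

    load_rf -- first load data (right front sensor 82a)
    load_rr -- second load data (right rear sensor 82b)
    load_lf -- third load data (left front sensor 82c)
    load_lr -- fourth load data (left rear sensor 82d)
    """
    wr = load_rr - load_rf   # first value WR: negative when weight is on the right tiptoe
    wl = load_lr - load_lf   # second value WL: positive when weight is on the left heel
    return wr, wl
```

With this convention, pressing the right tiptoe and the left heel yields a negative WR and a positive WL, which is the left-rotation case described above.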
  • Modified Embodiments
  • The present invention is not limited to the above-described embodiment and can cover other various modifications.
  • For example, the above-described embodiment is described by means of the example of determining the actions of the hover machine by the controller, but the object to have the actions determined is not limited to the hover machine. The present invention is applicable to determining actions of, e.g., four-wheel vehicles, two-wheel vehicles, one-wheel vehicles, etc. For example, when the value given by adding the first value and the second value is positive, objects such as four-wheel vehicles, two-wheel vehicles, one-wheel vehicles, etc. may be caused to proceed, and such objects may be caused to retreat when the value given by adding the first value and the second value is negative. That is, the objects may be moved in the same directions as the object exemplified in the above-described embodiment or may be moved in directions opposite to the directions in which the exemplified object is moved. The objects may be turned in the same directions as the object exemplified in the above-described embodiment or may be turned in directions opposite to the directions in which the exemplified object is turned.
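The modified embodiment's proceed/retreat choice can be sketched as a mapping from the sign of the sum of the first and second values. This is a sketch under one of the two assignments the text allows (positive sum means proceed); the opposite assignment is equally valid, and the name is illustrative.

```python
def travel_state(wr, wl):
    """Map the sign of WR + WL to a travel state for a vehicle-type object."""
    s = wr + wl
    if s > 0:
        return "proceed"    # e.g. both values positive: move forward
    if s < 0:
        return "retreat"    # negative sum: move backward
    return "stationary"     # the sum is 0: rotate in place or stay still
```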
  • INDUSTRIAL APPLICABILITY
  • The game device and the computer program according to the present invention are useful for providing a game which can be realistically operated.
  • REFERENCE NUMBERS
    • 2 . . . TV table
    • 4 . . . TV monitor
    • 6 . . . player
    • 8 a, 8 b . . . legs
    • 10 . . . game device
    • 12 . . . game device body
    • 20 . . . the first controller
    • 22 . . . the second controller
    • 28 . . . various operational buttons
    • 28 a . . . source button
    • 28 b . . . cross button
    • 40 . . . CPU
    • 42 . . . system memory (RAM)
    • 44 . . . bus arbiter
    • 46 . . . program data memory device or memory storage medium
    • 48 . . . BOOT ROM
    • 50 . . . rendering processor
    • 52 . . . graphic memory
    • 54 . . . video DAC
    • 56 . . . sound processor
    • 58 . . . sound memory
    • 60 . . . audio DAC
    • 62 . . . communication interface
    • 64 . . . LAN adaptor
    • 68 . . . wireless receiver unit
    • 70 . . . peripheral interface
    • 78 . . . supporter
    • 80 a-80 d . . . leg
    • 82 a-82 d . . . load sensor
    • 86 . . . AD converter
    • 88 . . . CPU
    • 90 . . . wireless transmission unit
    • 100 . . . object
    • 102 . . . flag
    • 104 . . . best record
    • 106 . . . number of remaining flags
    • 108 . . . elapsed time
    • 110 . . . character
    • 112 a-112 d . . . jet engines
    • 114 . . . obstacle

Claims (16)

1. A non-transient memory storage medium, readable by a computer, which stores a computer program for operating a computer as a game device using a controller including a supporter for supporting a player; a first load sensor provided at a right front part of the supporter, for detecting a load from the supporter; a second load sensor provided at a right rear part of the supporter, for detecting a load from the supporter, a third load sensor provided at a left front part of the supporter, for detecting a load from the supporter, and a fourth load sensor provided at a left rear part of the supporter, for detecting a load from the supporter,
said computer being operated as an action determination means which calculates a first value, based on a difference of a first load data based on an output of the first load sensor and a second load data based on an output of the second load sensor, calculates a second value based on a difference between a third load data based on an output of the third load sensor and a fourth load data based on an output of the fourth load sensor, and determines an action of an object to be displayed on a display screen, based on the first value and the second value.
2. A non-transient memory storage medium according to claim 1, wherein
a direction of the object is varied, based on a difference between the first value and the second value.
3. A non-transient memory storage medium according to claim 1, wherein
a velocity of the object is determined, based on a sum of the first value and the second value.
4. A non-transient memory storage medium according to claim 2, wherein
a velocity of the object is determined, based on a sum of the first value and the second value.
5. A non-transient memory storage medium according to claim 1, wherein
the first value is a value given by subtracting the first load data from the second load data,
the second value is a value given by subtracting the third load data from the fourth load data,
the action determination means, when the first value is positive, and the second value is negative, corrects the first value by a prescribed correction coefficient and varies a direction of the object, based on a difference between the corrected first value and the second value not corrected,
the action determination means, when the first value is negative, and the second value is positive, corrects the second value by the prescribed correction coefficient and varies a direction of the object, based on a difference between the first value not corrected and the corrected second value.
6. A non-transient memory storage medium according to claim 2, wherein
the first value is a value given by subtracting the first load data from the second load data,
the second value is a value given by subtracting the third load data from the fourth load data,
the action determination means, when the first value is positive, and the second value is negative, corrects the first value by a prescribed correction coefficient and varies a direction of the object, based on a difference between the corrected first value and the second value not corrected,
the action determination means, when the first value is negative, and the second value is positive, corrects the second value by the prescribed correction coefficient and varies a direction of the object, based on a difference between the first value not corrected and the corrected second value.
7. A non-transient memory storage medium according to claim 3, wherein
the first value is a value given by subtracting the first load data from the second load data,
the second value is a value given by subtracting the third load data from the fourth load data,
the action determination means, when the first value is positive, and the second value is negative, corrects the first value by a prescribed correction coefficient and varies a direction of the object, based on a difference between the corrected first value and the second value not corrected,
the action determination means, when the first value is negative, and the second value is positive, corrects the second value by the prescribed correction coefficient and varies a direction of the object, based on a difference between the first value not corrected and the corrected second value.
8. A non-transient memory storage medium according to claim 4, wherein
the first value is a value given by subtracting the first load data from the second load data,
the second value is a value given by subtracting the third load data from the fourth load data,
the action determination means, when the first value is positive, and the second value is negative, corrects the first value by a prescribed correction coefficient and varies a direction of the object, based on a difference between the corrected first value and the second value not corrected,
the action determination means, when the first value is negative, and the second value is positive, corrects the second value by the prescribed correction coefficient and varies a direction of the object, based on a difference between the first value not corrected and the corrected second value.
9. A non-transient memory storage medium according to claim 1, wherein
the first value is a value given by subtracting the first load data from the second load data,
the second value is a value given by subtracting the third load data from the fourth load data,
the action determination means, when the first value is positive, and the second value is negative, corrects the first value by a prescribed correction coefficient and determines a velocity of the object, based on a sum of the corrected first value and the second value not corrected, and
the action determination means, when the first value is negative, and the second value is positive, determines a velocity of the object, based on a sum of the first value not corrected and the corrected second value.
10. A non-transient memory storage medium according to claim 2, wherein
the first value is a value given by subtracting the first load data from the second load data,
the second value is a value given by subtracting the third load data from the fourth load data,
the action determination means, when the first value is positive, and the second value is negative, corrects the first value by a prescribed correction coefficient and determines a velocity of the object, based on a sum of the corrected first value and the second value not corrected, and
the action determination means, when the first value is negative, and the second value is positive, determines a velocity of the object, based on a sum of the first value not corrected and the corrected second value.
11. A non-transient memory storage medium according to claim 3, wherein
the first value is a value given by subtracting the first load data from the second load data,
the second value is a value given by subtracting the third load data from the fourth load data,
the action determination means, when the first value is positive, and the second value is negative, corrects the first value by a prescribed correction coefficient and determines a velocity of the object, based on a sum of the corrected first value and the second value not corrected, and
the action determination means, when the first value is negative, and the second value is positive, determines a velocity of the object, based on a sum of the first value not corrected and the corrected second value.
12. A non-transient memory storage medium according to claim 4, wherein
the first value is a value given by subtracting the first load data from the second load data,
the second value is a value given by subtracting the third load data from the fourth load data,
the action determination means, when the first value is positive, and the second value is negative, corrects the first value by a prescribed correction coefficient and determines a velocity of the object, based on a sum of the corrected first value and the second value not corrected, and
the action determination means, when the first value is negative, and the second value is positive, determines a velocity of the object, based on a sum of the first value not corrected and the corrected second value.
13. A non-transient memory storage medium according to claim 1, wherein
the object is caused to proceed or to retreat, based on a sign of the sum of the first value and the second value.
14. A non-transient memory storage medium according to claim 2, wherein
the object is caused to proceed or to retreat, based on a sign of the sum of the first value and the second value.
15. A non-transient memory storage medium according to claim 3, wherein
the object is caused to proceed or to retreat, based on a sign of the sum of the first value and the second value.
16. A game device comprising a controller including a supporter for supporting a player, and a first load sensor provided at a right front part of the supporter, for detecting a load from the supporter, a second load sensor provided at a right rear part of the supporter, for detecting a load from the supporter, a third load sensor provided at a front left part of the supporter, for detecting a load from the supporter and a fourth load sensor provided at a left rear part of the supporter, for detecting a load from the supporter,
the game device comprising an action determination means which calculates a first value based on a difference between a first load data based on an output of the first load sensor and a second load data based on an output of the second load sensor, calculates a second value, based on a difference between a third load data based on an output of the third load sensor and a fourth load data based on an output of the fourth load sensor, and determines an action of an object to be displayed on a display screen, based on the first value and the second value.
US13/350,649 2009-07-17 2012-01-13 Game device and computer program Abandoned US20120115609A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2009-169062 2009-07-17
JP2009169062A JP2011019817A (en) 2009-07-17 2009-07-17 Game device and computer program
PCT/JP2010/053998 WO2011007600A1 (en) 2009-07-17 2010-03-10 Game apparatus and computer program
JPPCT/JP2010/053998 2010-03-20

Publications (1)

Publication Number Publication Date
US20120115609A1 true US20120115609A1 (en) 2012-05-10

Family

ID=43449210

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/350,649 Abandoned US20120115609A1 (en) 2009-07-17 2012-01-13 Game device and computer program

Country Status (4)

Country Link
US (1) US20120115609A1 (en)
EP (1) EP2455147A4 (en)
JP (1) JP2011019817A (en)
WO (1) WO2011007600A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8834269B2 (en) 2010-05-25 2014-09-16 Kabushiki Kaisha Sega Program, game device and method of controlling the same
US9205327B2 (en) 2011-03-08 2015-12-08 Nintento Co., Ltd. Storage medium having information processing program stored thereon, information processing apparatus, information processing system, and information processing method
US9375640B2 (en) 2011-03-08 2016-06-28 Nintendo Co., Ltd. Information processing system, computer-readable storage medium, and information processing method
US9539511B2 (en) 2011-03-08 2017-01-10 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for operating objects in a virtual world based on orientation data related to an orientation of a device
US9561443B2 (en) 2011-03-08 2017-02-07 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method
US9643085B2 (en) 2011-03-08 2017-05-09 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for controlling a virtual object using attitude data
US9925464B2 (en) 2011-03-08 2018-03-27 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for displaying an image on a display device using attitude data of a display device
US11504626B2 (en) * 2018-11-29 2022-11-22 Ts Tech Co., Ltd. Seat system and seat experience device

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5767575B2 (en) * 2011-11-28 2015-08-19 日本放送協会 Position measuring apparatus and position measuring system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5864333A (en) * 1996-02-26 1999-01-26 O'heir; Brian S. Foot force actuated computer input apparatus and method
US20080261696A1 (en) * 2007-04-20 2008-10-23 Nintendo Co., Ltd. Game controller, storage medium storing game program, and game apparatus
US20090093315A1 (en) * 2007-10-04 2009-04-09 Nintendo Co., Ltd. Storage medium storing load detection program, load detection apparatus, and load detection method
US20100137063A1 (en) * 2008-11-28 2010-06-03 Mari Shirakawa Information processing apparatus and computer readable storage medium
US20100245236A1 (en) * 2009-03-30 2010-09-30 Nintendo Co., Ltd. Computer-readable storage medium and information processing apparatus
US20110077899A1 (en) * 2009-09-28 2011-03-31 Nintendo Co., Ltd. Computer-readable storage medium having information processing program stored therein and information processing apparatus

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3542994B2 (en) * 2002-02-20 2004-07-14 株式会社ニューオプト Barymeter
JP5204381B2 (en) * 2006-05-01 2013-06-05 任天堂株式会社 GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
JP5050494B2 (en) 2006-11-10 2012-10-17 株式会社セガ A computer program that displays information on the predicted trajectory of a moving object.
JP5153122B2 (en) * 2006-11-15 2013-02-27 任天堂株式会社 GAME PROGRAM AND GAME DEVICE
JPWO2008099582A1 (en) * 2007-02-08 2010-05-27 新世代株式会社 Input system, entertainment device, and local brain training device

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8834269B2 (en) 2010-05-25 2014-09-16 Kabushiki Kaisha Sega Program, game device and method of controlling the same
US9205327B2 (en) 2011-03-08 2015-12-08 Nintento Co., Ltd. Storage medium having information processing program stored thereon, information processing apparatus, information processing system, and information processing method
US9345962B2 (en) 2011-03-08 2016-05-24 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9370712B2 (en) 2011-03-08 2016-06-21 Nintendo Co., Ltd. Information processing system, information processing apparatus, storage medium having information processing program stored therein, and image display method for controlling virtual objects based on at least body state data and/or touch position data
US9375640B2 (en) 2011-03-08 2016-06-28 Nintendo Co., Ltd. Information processing system, computer-readable storage medium, and information processing method
US9492742B2 (en) 2011-03-08 2016-11-15 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9492743B2 (en) 2011-03-08 2016-11-15 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9522323B2 (en) 2011-03-08 2016-12-20 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9526981B2 (en) 2011-03-08 2016-12-27 Nintendo Co., Ltd. Storage medium having stored thereon information processing program, information processing apparatus, information processing system, and information processing method
US9539511B2 (en) 2011-03-08 2017-01-10 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for operating objects in a virtual world based on orientation data related to an orientation of a device
US9561443B2 (en) 2011-03-08 2017-02-07 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method
US9643085B2 (en) 2011-03-08 2017-05-09 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for controlling a virtual object using attitude data
US9925464B2 (en) 2011-03-08 2018-03-27 Nintendo Co., Ltd. Computer-readable storage medium, information processing system, and information processing method for displaying an image on a display device using attitude data of a display device
US11504626B2 (en) * 2018-11-29 2022-11-22 Ts Tech Co., Ltd. Seat system and seat experience device

Also Published As

Publication number Publication date
JP2011019817A (en) 2011-02-03
EP2455147A4 (en) 2013-11-20
EP2455147A1 (en) 2012-05-23
WO2011007600A1 (en) 2011-01-20

Similar Documents

Publication Publication Date Title
US20120115609A1 (en) Game device and computer program
EP2497551B1 (en) Information processing program, information processing apparatus, information processing system, and information processing method
KR100463906B1 (en) Image processing apparatus, game machine and image processing method and medium using the processing apparatus
EP2497545B1 (en) Information processing program, information processing system, and information processing method
US8393964B2 (en) Base station for position location
US7833098B2 (en) Input data processing program and input data processing apparatus
JP2003334379A (en) Game system and game program
US20040224761A1 (en) Game apparatus, storing medium that stores control program of virtual camera, and control method of virtual camera
US8834269B2 (en) Program, game device and method of controlling the same
US9643085B2 (en) Computer-readable storage medium, information processing system, and information processing method for controlling a virtual object using attitude data
US6897865B2 (en) Three-dimensional image processing method and apparatus, readable storage medium storing three-dimensional image processing program and video game system
US9561443B2 (en) Computer-readable storage medium, information processing system, and information processing method
EP3482806B1 (en) Operation input system, operation input device, and game system
JP2012239778A (en) Game program, game device, game system, and game processing method
EP2497546A2 (en) Information processing program, information processing system, and information processing method
JP3602835B2 (en) VIDEO GAME DEVICE, ITS CONTROL METHOD, AND GAME PROGRAM
JP2012252468A (en) Information processing program, information processor, information processing system, and information processing method
JP2012239776A (en) Game program, game device, game system, and game processing method
JP3839355B2 (en) GAME DEVICE AND GAME PROGRAM
JP5912289B2 (en) Information processing program, information processing apparatus, information processing system, and information processing method
JP2005050070A (en) Image processing device, method, and program
JP2012249760A (en) Information processing program, information processor, information processing system, and information processing method
JP2007304727A (en) Program, information storage medium, and image generation device
JP2009160076A (en) Program, information storage medium, and game device

Legal Events

Date Code Title Description
AS Assignment

Owner name: KABUSHIKI KAISHA SEGA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SUGIYAMA, CHIAKI;TOKUHARA, JUN;REEL/FRAME:027534/0912

Effective date: 20111228

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION