GB2395645A - Motion correction of on-screen targets in shooting game - Google Patents

Motion correction of on-screen targets in shooting game

Info

Publication number
GB2395645A
Authority
GB
United Kingdom
Prior art keywords
motion
motion correction
model object
game
changed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0403651A
Other versions
GB0403651D0 (en)
GB2395645B (en)
Inventor
Atsushi Hayashi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Namco Ltd
Original Assignee
Namco Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP23865199A external-priority patent/JP4301471B2/en
Application filed by Namco Ltd filed Critical Namco Ltd
Publication of GB0403651D0 publication Critical patent/GB0403651D0/en
Publication of GB2395645A publication Critical patent/GB2395645A/en
Application granted granted Critical
Publication of GB2395645B publication Critical patent/GB2395645B/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • A63F13/10
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45Controlling the progress of the video game
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6607Methods for processing data by generating or executing the game program for rendering three dimensional images for animating game characters, e.g. skeleton kinematics
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8076Shooting

Abstract

An image generation system having means for performing motion correction on an on-screen skeleton shape object, comprising a plurality of parts, when the object is hit. The motion correction may be based on hit information and a restoring force for returning the skeleton shape to its original shape. The change of the skeleton shape may be gradually reduced as time passes. The skeleton shape represents an enemy in a computer game, the player using a model gun to shoot the enemies. Flags for individually enabling and disabling the motion corrections may be set for the respective enemy characters.

Description

IMAGE GENERATION SYSTEM AND PROGRAM
BACKGROUND OF THE INVENTION
Field of Industrial Application
The present invention relates to an image generation system and program.
Description of the Related Art
There is known an image generation system which can generate an image as viewed within a virtual three-dimensional or object space from a given viewpoint. Such a system is very popular since one can experience a so-called virtual reality through it. Now considering an image generation system for playing a gun game, a player (or operator) can enjoy a three-dimensional shooting game by manipulating a gun-shaped controller (or shooting device) to shoot targets such as enemy characters (or model objects) and the like which are displayed on a screen.
In such an image generation system, it is important that images can be generated with reality to improve the degree of virtual reality for players. It is thus desired that the motion of enemy characters is also represented with reality. In the conventional image generation systems, motion data for specifying a skeleton shape (or skeleton configuration) for each enemy character in each frame is previously prepared. Based on such motion data, a motion play (or motion animation) process is performed to represent the motion of that enemy character.
However, such a motion play process only based on the motion data has the following problems: (1) Even though the position or direction of a player (or a player's character controlled by the player or a virtual camera) varies or a shot (or bullet or the like) from the player hits the enemy character, that enemy character moves with the same motion at all times. Thus, the representation of motion becomes monotonous.
(2) If it is attempted to give variety to the motion of the enemy character to prevent monotonous representation, the motion data must be increased. However, since the capacity of a memory for storing the motion data is limited, variety of the motion is also limited.
On the other hand, the motion play process based on the motion data is advantageous in that it can move and arrange enemy characters in the same manner at all times.
Such a type of gun game proceeds according to a particular story, and a virtual camera always moves along a given pattern according to the story. If an enemy character moves irregularly, the virtual camera will not be able to capture the enemy character.
Thus, the game story cannot smoothly be developed.
However, the motion play process based on the motion data can move and arrange the enemy characters in the same manner at all times, so that the enemy characters can surely be captured by the virtual camera at all times. It is thus desirable that such a motion play process is more effectively utilized.
The present invention is devised in the light of the above problems and has as an objective thereof the provision of an image generation system and program which can implement motion representation with more reality and variety using a smaller amount of data by utilizing a motion play process based on the motion data.
According to the present invention there is provided an image generation system for generating an image, comprising: means for performing a first motion correction in which a skeleton shape of a model object specified by motion data is changed, the model object including a plurality of parts, then performing a second motion correction in which the skeleton shape of the model object which has been changed by the first motion correction is further changed, and so on, and then finally performing an N-th (N>2) motion correction in which the skeleton shape of the model object which has been changed by the (N - 1)-th motion correction is still further changed; and means for generating images including an image of the model object, wherein information for respectively enabling or disabling the first to N-th motion correction is set to the model object.
The model object is formed by a plurality of parts, and the position or rotation angle of each part is specified by the position or rotation angle of the corresponding one of the bones defining the skeleton. The skeleton and bones are virtually represented; in practice they take the form of data representing the position or rotation angle of each part.
The present invention further provides a computer-usable program embodied on an information storage medium or in a carrier wave, the program comprising a processing routine for implementing (or executing) the above-described means.
This makes it possible to perform a multi-step motion correction process including the first through N-th motion corrections. Even if only a single set of motion data is used for a motion, that motion can be viewed differently if the first to N-th motion corrections are different, leading to motion correction with variety.
It is possible, for example, to enable the first and second motion corrections for a first model object, enable the first motion correction but disable the second motion correction for a second model object, and disable the first and second motion corrections for a third model object. Therefore, even if the same motion data or program is used for a plurality of model objects, these model objects can perform different motions by changing the setting of the information. As a result, model objects which perform various motions can be easily provided.
The image generation system and program according to the present invention may perform the first to N-th motion corrections while substantially fixing a position or rotation angle of the body coordinate system of the model object relative to the world coordinate system to a position or rotation angle specified by the motion data. The skeleton shape of the model object (movement of each part) is changed by the first through N-th motion corrections, but the entire movement of the model object can be controlled to faithfully follow the motion data.
According to the various aspects of the present invention, the model object can be moved to faithfully follow the motion data, while turning a given part (such as a head, hand or fire arm) of the model object toward a given target position (a player's character or a virtual camera, for example). If the model object is hit, its skeleton shape is slightly changed based on the hit information. As a result, a model object which performs an optimum movement as a target (or enemy character) of a shooting game such as a gun game can be implemented.
Fig. 1 shows an example of an arcade game system to which the embodiment of the present invention is applied.
Fig. 2 is a block diagram of an image generation system constructed according to the embodiment of the present invention.
Fig. 3 shows an example of a skeleton of an enemy character (or a model object).
Fig. 4 illustrates a rotation angle of a bone.
Fig. 5 illustrates the motion correction for slightly changing the skeleton shape based on the hit information.
Figs. 6A, 6B and 6C illustrate a motion correction technique using a restoring force.
Figs. 7A, 7B, 7C and 7D illustrate a technique of gradually reducing the amount of slight change of the skeleton shape as time passes.
Fig. 8 illustrates a technique of performing the motion correction while fixing a hand (or gun) or foot.
Fig. 9 illustrates a technique of performing the motion correction while fixing the position and rotation angle of a base (or body coordinate system) to a position and rotation angle specified by the motion data.
Fig. 10 illustrates a technique of performing a multi-step correction process including a first motion correction for turning a head or gun of the model object toward a player and a second motion correction for slightly changing the skeleton shape.
Figs. 11A and 11B illustrate flags used to enable or disable each motion correction.
Figs. 12A and 12B also illustrate flags used to enable or disable each motion correction.
Fig. 13 is a flowchart showing the entire process in this embodiment.
Fig. 14 is a flowchart showing the first motion correction process.
Fig. 15 is a flowchart showing the first motion correction process.
Fig. 16 illustrates a technique of turning a head of an enemy character toward a target position.
Figs. 17A, 17B and 17C illustrate a technique of turning a gun of an enemy character toward a target position.
Fig. 18 is a flowchart showing the second motion correction process.
Figs. 19A to 19E illustrate a technique of performing the second motion correction.
Fig. 20 shows an example of a hardware configuration for the embodiment of the present invention.
Figs. 21A and 21B show various systems to which the embodiment of the present invention is applied.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
One preferred embodiment of the present invention will now be described with reference to the drawings. Although the embodiment of the present invention will be described as to a gun game (or shooting game) using a gun-like controller, the present invention is not limited to such a form but may be applied to any of various other forms.
1. Structure
Fig. 1 shows an example of an arcade game system to which the present invention is applied.
A player 500 holds a gun-shaped controller (or a shooting device in a broad sense) 502 which is formed similar to a real machine gun. The player 500 can enjoy the gun game by using the gun-shaped controller 502 to shoot targets such as enemy characters (or model objects in a broad sense) which are displayed on a screen 504.
When the gun-shaped controller 502 is triggered, virtual shots such as bullets will be continuously fired at high speed.
Thus, the player can feel the virtual reality as if he or she is shooting the real machine gun.
Shot (or bullet) hitting positions may be sensed by using a photosensor on the gun-shaped controller 502 to sense a scanning ray on the screen, or by emitting a light (or laser) beam from the gun-shaped controller 502 and sensing the irradiated portion by any suitable means such as a CCD camera.
Fig. 2 shows a block diagram of this embodiment. In this figure, this embodiment may comprise at least a processing unit 100, or a processing unit 100 with a storage unit 140, or a processing unit 100 with a storage unit 140 and an information storage medium 150. Each of the other blocks (e.g., operating unit 130, image generation unit 160, display unit 162, sound generation unit 170, sound output unit 172, communication unit 174, I/F unit 176, memory card 180 and so on) may take any suitable form.
The processing unit 100 is designed to perform various processings for control of the entire system, commands to the respective blocks in the system, game computation and so on. The function thereof may be realized through any suitable hardware means such as a CPU (CISC type, RISC type), DSP or ASIC (or gate array or the like) or a given program (or game program).
The operating unit 130 is used to input operational data from the player, and the function thereof may be realized through any suitable hardware means such as the gun-shaped controller 502 of Fig. 1, a lever, a button, a housing or the like.
The storage unit 140 provides a working area for the processing unit 100, image generation unit 160, sound generation unit 170, communication unit 174, I/F unit 176 and others. The function thereof may be realized by any suitable hardware means such as RAM or the like.
The information storage medium (or computer-usable storage medium) 150 is designed to store information including programs, data and others. The function thereof may be realized through any suitable hardware means such as an optical memory disk (CD or DVD), magneto-optical disk (MO), magnetic disk, hard disk, magnetic tape, semiconductor memory (ROM) or the like. The processing unit 100 performs various processings in the present invention (or this embodiment) based on the information that has been stored in this information storage medium 150. In other words, the information storage medium 150 stores various pieces of information (or programs and data) for implementing (or executing) the means of the present invention (or this
embodiment), which is particularly represented by the blocks included in the processing unit 100.
Part or the whole of the information stored in the information storage medium 150 will be transferred to the storage unit 140 when the system is initially powered on. The information stored in the information storage medium 150 may contain at least one of a program code set for processing the present invention, image information, sound information, shape information of objects to be displayed, table data, list data, player information, command information for the processings in the present invention, information for performing the processings according to the commands and so on.
The image generation unit 160 is designed to generate and output various images toward the display unit 162 according to instructions from the processing unit 100. The function thereof may be realized through any suitable hardware means such as an image generation ASIC, CPU or DSP, or according to a given program (or image generation program), or based on image information.
The sound generation unit 170 is designed to generate and output various sounds toward the sound output unit 172 according to instructions from the processing unit 100. The function thereof may be realized through any suitable hardware means such as a sound generation ASIC, CPU or DSP, or according to a given program (or sound generation program), or based on sound information (waveform data and the like).
The communication unit 174 is designed to perform various controls for communication between the game system and any
external device (e.g., host machine or other image generation system). The function thereof may be realized through any suitable hardware means such as a communication ASIC or CPU, or according to a given program (or communication program).
Information for implementing (realizing) the processings in the present invention (or this embodiment) may be delivered from an information storage medium included in a host machine (or server) to the information storage medium 150 through a network and the communication unit 174. The use of such an information storage medium in the host device (or server) falls within the scope of the invention.
Part or the whole of the function in the processing unit 100 may be realized through the function of the image generation unit 160, sound generation unit 170 or communication unit 174.
Alternatively, part or the whole of the function in the image generation unit 160, sound generation unit 170 or communication unit 174 may be realized through the function of the processing unit 100.
The I/F unit 176 serves as an interface for information interchange between the game system and a memory card (or a portable information storage device including a portable game machine in a broad sense) 180 according to instructions from the processing unit 100. The function thereof may be realized through a slot into which the memory card is inserted, a data write/read controller IC or the like. If the information interchange between the game system and the memory card 180 is to be realized in a wireless manner (e.g., through infra-red
communication), the function of the I/F unit 176 may be realized through any suitable hardware means such as a semiconductor laser, infra-red sensor or the like.
The processing unit 100 further comprises a game computing section 110.
The game computing section 110 is designed to perform various processes such as coin (or charge) reception, setting of various modes, game proceeding, setting of scene selection, determination of the position and rotation angle (about the X-, Y- or Z-axis) of an object, determination of the view point and visual line (direction), play (or generation) of the motion, arrangement of the object within the object space, hit checking, computation of the game results (or scores), processing for causing a plurality of players to play in a common game space, various game computations including game-over and other processes, based on operational data from the operating unit 130 and according to the data and game program from the memory card 180.
The game computing section 110 comprises a hit check portion 112, a motion play portion 114, a first motion correction portion 116 and a second motion correction portion 118. The hit check portion 112 is designed to perform a hit check process for checking whether or not a virtual shot emitted by the player through the gun-shaped controller hits a model object. In order to reduce the processing load, it is desirable to perform the hit check process using a simplified object
(bounding volume) which is obtained by simplifying the shape of the model object.
The motion play (motion animation) portion 114 is designed to play (replay) the motion of a model object (or enemy character or the like) based on the motion data which have been stored in a motion data storage portion 142.
More particularly, the motion data storage portion 142 has stored the motion data including the position or rotation angle and the like of each bone (or part) in the skeleton of that model object. The motion play portion 114 reads this motion data and moves the bones (or parts) of the model object based on the motion data (or changes the skeleton shape of the model object). Thus, the motion of the model object is animated.
It is particularly desirable that the motion data stored in the motion data storage portion 142 are prepared utilizing motion capture. However, the motion data may be produced in real time through a physical simulation (or a simulation utilizing a physical calculation which may be a pseudo calculation).
The first motion correction (compensation) portion 116 is designed to change the skeleton shape specified based on the play of motion through the motion play portion 114.
More particularly, the motion correction is performed to turn the head or fire arm (hand) of the model object toward the player (or player's character or virtual camera). Thus, the motion of the model object is varied depending on the positional relationship between the player and the model object (or enemy
character). This can realize realistic representation of various motions with a reduced amount of data.
The second motion correction (compensation) portion 118 is designed to further change the skeleton shape previously changed by the first motion correction portion 116. Thus, the motion of the model object is corrected through multiple steps.
More particularly, the skeleton shape of the model object is slightly changed based on a restoring force for restoring the skeleton shape to its original configuration and hit information (hitting force, hitting direction and so on). Thus, the model object can be represented with fine reactions against the hitting impacts of shots.
The image generation system according to this embodiment may be either a single-player dedicated system in which only a single player can play the game or a multi-player system in which a plurality of players can play the same game at the same time. If a plurality of players play the game, game images and sounds to be provided to these players may be generated by a single terminal or by a plurality of terminals interconnected through a network (or transmission line or communication line).
2. Features of this embodiment
Fig. 3 shows an enemy character (or model object) 20 having a skeleton shape (configuration). The skeleton shape of Fig. 3 is formed by a parent-child structure of bones (or parts) B1 to B15. For example, the parent
of B15 is B14; the parent of B14 is B13; and the parent of B13 is B1. The parent of B12 is B11; the parent of B11 is B10; and the parent of B10 is B1. The parent of B9 is B8; the parent of B8 is B7; and the parent of B7 is B2. The parent of B6 is B5; the parent of B5 is B4; and the parent of B4 is B2. The parent of B3 is B2, and the parent of B2 is B1. Finally, the parent of B1 is a base BS.
The base BS is the body coordinate system for the enemy character (or model object) 20. When the position or rotation angle of this base relative to a world coordinate system is varied, the entire enemy character 20 can be rotated or moved.
The position and rotation angle of each of the bones are represented relative to its parent. For example, the position (or joint) J2 of the bone B2 may be represented relative to the position J1 of the parent bone B1. As shown in Fig. 4, the rotation angle of the child bone BC may be represented relative to axes XL, YL and ZL, for example, if the parent bone BP is arranged on the axis XL.
In this connection, the bones B1 to B15 of Fig. 3 are not actually displayed, but are virtual bones which are actually data (coordinate transformation matrix) representing the position or rotation angle of each part of the enemy character (or model object) 20.
Data of the positions J1 to J15 and rotation angles of the bones B1 to B15 (or only the rotation angles) is stored in the motion data storage portion 142 of Fig. 2, as the motion data. It is now assumed herein that a walk motion includes a
series of reference motions (or attitudes) MP1, MP2, MP3, ..., MPN.
In this case, data of the position and rotation angle of each bone in each of these reference motions MP1, MP2, MP3, ..., MPN is stored in the motion data storage portion 142. The motion play can be implemented by sequentially reading the reference motion data as time passes, such as by reading the positions and rotation angles of each bone relative to the reference motion MP1 in the first frame, reading the positions and rotation angles of each bone relative to the reference motion MP2 in the second frame, and so on.
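To make the structure concrete, the parent-child bones and the frame-by-frame read-out of reference motion data could be organized roughly as in the following C++ sketch. The type and function names (Bone, ReferenceMotion, playMotion) are illustrative assumptions and do not appear in the patent.

    #include <array>
    #include <cstddef>

    // Illustrative sketch of the bone hierarchy and motion play described above.
    struct Bone {
        int   parent;     // index of the parent bone, -1 for the base BS
        float pos[3];     // joint position, relative to the parent bone
        float rot[3];     // rotation angles about the local XL, YL, ZL axes
    };

    constexpr std::size_t kBoneCount = 15;          // B1..B15 in Fig. 3

    // One reference motion MPk: a pose (position/rotation of every bone).
    struct ReferenceMotion {
        std::array<Bone, kBoneCount> pose;
    };

    // Motion play: in frame n, the skeleton shape is simply the pose stored
    // for reference motion MP(n); the base BS is likewise taken from the data.
    void playMotion(std::array<Bone, kBoneCount>& skeleton,
                    const ReferenceMotion* motions, std::size_t motionCount,
                    std::size_t frame)
    {
        skeleton = motions[frame % motionCount].pose;
    }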
In this embodiment, the position and rotation angle of the base BS (or body coordinate system) relative to the world coordinate system may be specified by the motion data.
The enemy character 20 of Fig. 3 includes various parts such as hip 1, breast 2, head 3, upper right arm 4, right forearm 5, right hand (gun) 6, upper left arm 7, left forearm 8, left hand 9, right thigh 10, right shin 11, right foot 12, left thigh 13, left shin 14 and left foot 15, all of which may be moved to follow the bones B1 to B15, respectively. As the skeleton shape formed by the bones B1 to B15 is changed, the shape of the enemy character 20 will correspondingly be changed. A portion of each part near its joint may follow the adjacent bones at a predetermined rate. For example, the portion of the upper arm 4 near the joint J5 may follow both the bones B4 and B5 at a predetermined rate. Thus, the shape near any joint may more smoothly be changed.
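The "predetermined rate" at which a portion near a joint follows two bones is, in effect, a blend weight. Below is a small sketch of how such a blended vertex position could be computed; the 3x4 matrix layout and the helper names are assumptions made for illustration only, not taken from the patent.

    // Blend a vertex between the transforms of two adjacent bones (e.g. B4, B5).
    struct Vec3 { float x, y, z; };

    // Apply a 3x4 bone transform: rotation in m[0..8] (row-major), translation in m[9..11].
    static Vec3 transformByBone(const Vec3& v, const float m[12])
    {
        return { m[0] * v.x + m[1] * v.y + m[2] * v.z + m[9],
                 m[3] * v.x + m[4] * v.y + m[5] * v.z + m[10],
                 m[6] * v.x + m[7] * v.y + m[8] * v.z + m[11] };
    }

    // world = (1 - w) * B4(v) + w * B5(v), with w the predetermined rate.
    static Vec3 blendVertex(const Vec3& v, const float boneB4[12],
                            const float boneB5[12], float w)
    {
        Vec3 a = transformByBone(v, boneB4);
        Vec3 b = transformByBone(v, boneB5);
        return { (1 - w) * a.x + w * b.x,
                 (1 - w) * a.y + w * b.y,
                 (1 - w) * a.z + w * b.z };
    }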
In this embodiment, when the enemy character is hit by
a shot or the like, the motion correction is performed to slightly change the skeleton shape of the model object specified by the motion data (which may be a skeleton shape already subjected to other corrections) according to the hitting force and direction (or hit information in a broad sense). For example, in Fig. 5, a shot hits an enemy character at its breast. As a result, the skeleton shape is slightly changed depending on the hitting force and direction.
Thus, the skeleton shape will be changed differently depending on the hitting force and direction. For example, if the hitting force is larger, the skeleton shape will also be changed by an increased amount. If the hitting direction is a first direction, the skeleton shape will be changed substantially along the first direction. If the hitting direction is a second direction, the skeleton shape will be changed substantially along the second direction. In such a manner, the skeleton shape can be variously changed while using only a single set of motion data comprising data of a series of reference motions. As a result, motion representation with reality and variety can be implemented using a smaller amount of data. In this embodiment, furthermore, the motion correction is made using a restoring force for returning the skeleton shape to its original configuration (which is specified based on the motion data) in addition to the hit information.
If the enemy character is hit in the right direction as shown in Fig. 6A, for example, a bone BA is rotated
counterclockwise as shown in Fig. 6B. When the bone BA is rotated counterclockwise by θ1 as shown by E2 from its original direction shown by E1, a restoring force, -K1 × θ1, acts on the bone BA to rotate it clockwise. On the other hand, when the bone BA is rotated clockwise by θ2 as shown by E3 of Fig. 6C from its original direction shown by E1, a restoring force, -K1 × θ2, acts on the bone BA to rotate it counterclockwise. When such a restoring force acts on each of the bones, each bone performs elastic (spring) movement. Each time a shot hits a bone, the skeleton shape will be slightly vibrated. As a result, the enemy character can be seen as if it trembles with hit shots. This can implement the motion play with more reality, which would not be obtained by a simple motion play.
In this embodiment, the motion correction is performed such that the amount of slight change of the skeleton shape based on the hit information is gradually reduced as time passes.
More particularly, if a shot hits the enemy character, the timer value TM may be set to 90 as shown in Fig. 7A. The timer value TM is decremented each frame.
If the timer value TM is 90 (initial value), the amplitude of the rotation angle of the bone BA becomes maximum as shown by E4 in Fig. 7A. As the timer value TM is decremented after a hit through the passage of time, as shown by E5 in Fig. 7B, E6 in Fig. 7C and E7 in Fig. 7D, the amplitude of the rotation angle of the bone BA is gradually reduced.
Thus, the skeleton shape may be shown so that it is largely
vibrated by a shot hit, with the amplitude thereof being then gradually reduced. Thus, the enemy character trembling with hit shots can be shown more realistically. As a result, the game system can realize a more realistic representation of motion which would not be obtained by a simple motion play.
In this embodiment, furthermore, the motion correction is performed while substantially fixing the position and rotation angle of a given part (or bone) of the enemy character.
As shown in Fig. 8, for example, the skeleton shape is slightly changed while fixing the positions J6, J9, J12 and J15 and rotation angles of the right hand (or bone B6), left hand (or bone B9), right foot (or bone B12) and left foot (or bone B15), respectively.
Thus, the position and direction of the gun (or fire arm) integrally formed with the hand can be prevented from being vibrated due to hit shots. Therefore, the enemy character can be represented such that it trembles with hit shots while firmly holding the gun toward the player.
By performing the motion correction with a foot of the enemy character fixed, it can be prevented that the enemy character is seen as if it walks with gliding steps.
In other words, if the position and rotation angle of the enemy character's foot are changed by the motion correction for slightly changing the skeleton shape, a problem will be raised in that the foot that should stay in place may glide on the ground. In this embodiment, however, the position and rotation angle of the enemy character's foot are fixed as shown in Fig. 8, when the
motion correction for slightly changing the skeleton shape is made. Even if such a motion correction is made, the position and rotation angle of the foot will not be shifted from those specified by the motion data. Since the motion data has previously been prepared not to slide a foot, the enemy character's foot will not slide due to the motion correction. As a result, the motion of the enemy character will be more natural. In this embodiment, furthermore, the motion correction is performed such
that the position and rotation angle of the base BS (or body coordinate system) of the enemy character of Fig. 3 relative to the world coordinate system (Xw, Yw, Zw) are substantially fixed at those specified based on the motion data.
For example, Fig. 9 shows that the position and rotation angle of the base BS are varied based on the motion data. In addition, a virtual camera 30 is changed in position and rotation angle such that it can always capture the enemy character moving with the movement of the base BS.
In such a case, if the position and rotation angle of the base BS (or the position and rotation angle thereof relative to the world coordinate system) are varied due to the motion correction for slightly changing the skeleton shape, such a situation that the virtual camera cannot capture the enemy character may occur. Thus, the camera work to be done along the development of the story will be made impossible.
In this embodiment, the motion correction is performed while fixing the position and rotation angle of the base BS
at those specified based on the motion data. Therefore, the position and rotation angle of the base BS can be varied to faithfully follow the motion data even if the parts of the enemy character are changed by the motion correction.
Consequently, such a situation that the virtual camera 30 is not able to capture the enemy character can be avoided, and the camera work to be done along the development of the story can be realized.
In this embodiment, the multi-step motion correction is made. For example, the first motion correction may first be executed to change the skeleton shape specified based on the motion data. The second motion correction is then performed to further change the skeleton shape changed by the first motion correction. Furthermore, the third, fourth, ..., and N-th motion corrections may be performed sequentially.
More particularly, the motion play is performed merely based on the motion data at F1 of Fig. 10. In this case, the enemy character 20 performs, for example, its walk motion based on the motion data but is not turned toward the player (or player's character or virtual camera).
On the other hand, at F2 of Fig. 10, the first motion correction is made to turn the head and gun (or right hand) of the enemy character 20 toward the player. Thus, the motion of the enemy character 20 in which the head and gun thereof are turned toward the player can be implemented while using the motion data for such a walk motion as shown at F1. As a result, more various motions can be represented by using only a single set of motion data, in comparison with the simple motion
play as shown at F1 of Fig. 10.
Furthermore, the second motion correction is performed to slightly change the skeleton shape of the enemy character 20 depending on hit shots from the player. Thus, the skeleton shape of the enemy character can be slightly changed when it is hit by a shot while turning the head and gun (or in a broad sense, a given part) of the enemy character toward the player (or in a broad sense, a target position). As a result, more various motions can be represented by using only a single set of motion data, in comparison with only the simple motion play as shown at F1 of Fig. 10 or only the first motion correction as shown at F2 of Fig. 10.
It is further desirable that flags (or in a broad sense, information) used for enabling or disabling each motion correction can be set to each of the enemy characters (or model objects). In Fig. 11A, for example, flag HF (which is used to enable or disable the motion correction for turning the head toward the player), flag GF (which is used to enable or disable the motion correction for turning the gun toward the player) and flag RF (which is used to enable or disable the motion correction for reacting against a hit impact) are all set at 1 (enabled) for one enemy character 20. As shown in Fig. 11B, thus, this enemy character 20 turns the head and gun toward the player and exhibits a reaction (slight vibration) when it is hit by a shot.
On the other hand, for another enemy character 21, the flags HF and GF are set at 0 (disabled) while the flag RF is set
at 1 (enabled). As shown in Fig. 11B, thus, this enemy character 21 exhibits a reaction when it is hit by a shot, but the head and gun thereof are not turned toward the player.
For still another enemy character 22, all the flags HF, GF and RF are set at 0 (disabled). Therefore, the enemy character 22 does not turn the head and gun toward the player and does not exhibit any reaction when it is hit by a shot, as shown in Fig. 11B.
In Fig. 12A, all the flags HF, GF and RF are set at 1 (enabled) for the enemy character 20 since it is during the game play. During the game play, therefore, the enemy character 20 turns the head and gun toward the player and is slightly vibrated when it is hit by a shot.
On the other hand, in Fig. 12B, all the flags HF, GF and RF are set at 0 (disabled) for the enemy character 20 since it is during the demonstration (or attraction). During the demonstration, therefore, the enemy character 20 does not exhibit any reaction even when it is hit by a shot, and the head and gun thereof are not turned toward the player.
When various flags used to control enabling and disabling of the motion correction can be set to each enemy character, the enemy characters may be moved in different manners depending on the situation and mode of the game. Even though the enemy characters are to be moved according to the same program or through the same motion data, they can be moved in various different manners by varying the flags. As a result, the enemy characters can be realized with more various representations
of motion.
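As a rough illustration of how such per-character flags could gate the multi-step correction, consider the following C++ sketch. The structure, field names and stage placeholders are assumptions for illustration only; the patent does not prescribe a data layout.

    // Per-character flags enabling/disabling each motion correction,
    // as in Figs. 11A to 12B (HF: head, GF: gun, RF: hit reaction).
    struct CorrectionFlags {
        bool headToPlayer;   // HF
        bool gunToPlayer;    // GF
        bool hitReaction;    // RF
    };

    struct Skeleton { /* bone positions and rotation angles (see earlier sketch) */ };

    struct EnemyCharacter {
        Skeleton        skeleton;
        CorrectionFlags flags;
    };

    // Multi-step correction: each enabled stage further changes the skeleton
    // shape left by the previous stage (first correction, then second, ...).
    void correctMotion(EnemyCharacter& enemy, bool wasHitThisFrame)
    {
        if (enemy.flags.headToPlayer) { /* first correction: turn head toward target */ }
        if (enemy.flags.gunToPlayer)  { /* first correction: turn gun toward target  */ }
        if (wasHitThisFrame && enemy.flags.hitReaction) {
            /* second correction: slight spring-like change of the skeleton shape */
        }
    }

With such an arrangement, the demonstration mode of Fig. 12B is obtained simply by clearing all three flags for every enemy, without touching the motion data or the program.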
3. Processing in this embodiment
Details of the processing in this embodiment will now be described with reference to flowcharts.
Fig. 13 is a flowchart illustrating the entire process in this embodiment.
First of all, the motion data is read out from the motion data storage portion 142 of Fig. 2. Based on the motion data read out, the motion play is performed (step S1). In other words, the position and rotation angle of each of the parts (or bones) in the corresponding frame are determined based on the motion data. The first motion correction is then performed relating to the directions of the head and gun in such a manner as described in connection with F2 of Fig. 10 (step S2).
It is then judged whether or not an enemy character is hit by a shot. If so, the procedure proceeds to step S4. If not so, the procedure proceeds to step S7 (step S3).
If the procedure proceeds to step S4, it is then judged, based on the flag RF as described in Figs. 11A to 12B, whether or not the reaction of the enemy character (or the second motion correction) should be started. If so, the procedure proceeds to step S5. If not so, the procedure proceeds to step S7.
If the procedure proceeds to step S5, the hitting force and direction are determined. Furthermore, the timer value TM of the reaction described in Figs. 7A, 7B, 7C and 7D is set at
its initial value (e.g., 90) (step S6).
The timer value TM is then decremented (step S7) and it is judged whether or not the timer value TM becomes equal to zero (step S8). If the value TM becomes equal to zero, the procedure returns to step S1 as it is. If the timer value TM is not equal to zero, the procedure proceeds to step S9, wherein the second motion correction relating to the reaction is performed. Thereafter, the procedure returns to step S1.
In such a manner, the procedure for one frame is completed.
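The per-frame flow of Fig. 13 can be paraphrased as a short routine, reusing the EnemyCharacter type from the flag sketch above. The function names are hypothetical stand-ins for the flowchart steps, not actual implementation calls.

    // Hypothetical stand-ins for the steps of Fig. 13.
    void playMotionFromData(EnemyCharacter&)          {}   // S1
    void firstMotionCorrection(EnemyCharacter&)       {}   // S2 (Figs. 14/15)
    void computeHitForceAndDirection(EnemyCharacter&) {}   // S5
    void secondMotionCorrection(EnemyCharacter&, int) {}   // S9 (Fig. 18)

    void updateEnemyFrame(EnemyCharacter& enemy, int& timerTM, bool hitThisFrame)
    {
        playMotionFromData(enemy);                       // S1: motion play
        firstMotionCorrection(enemy);                    // S2: turn head and gun

        if (hitThisFrame && enemy.flags.hitReaction) {   // S3, S4
            computeHitForceAndDirection(enemy);          // S5
            timerTM = 90;                                // S6: reset reaction timer
        }

        if (timerTM > 0) {
            --timerTM;                                   // S7: decrement timer
            if (timerTM != 0)                            // S8: zero test
                secondMotionCorrection(enemy, timerTM);  // S9: hit reaction
        }
    }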
Fig. 14 is a flowchart illustrating the first motion correction described in connection with the step S2 of Fig. 13. First of all, a target position TP shown in Fig. 16 (e.g., the player's position) is determined (step S10). It is then judged, based on the flag HF described in Figs. 11A to 12B, whether or not the head of the enemy character should be turned toward the target position TP (step S11). If it is judged that the head should be turned toward the target position, the procedure proceeds to step S12. If not so, the procedure proceeds to step S17.
If the procedure proceeds to step S17, the head of the enemy character is turned toward the original line-of-sight vector MHV which is specified by the motion data.
On the other hand, if the procedure proceeds to step S12, a vector THV from the head to the target position TP is determined as shown in Fig. 16. An angle θH1 included between the original line-of-sight vector MHV specified by the motion data and the
vector THV is determined (step S13).
It is then judged whether or not the angle θH1 is smaller than 90 degrees (step S14). If the angle θH1 is equal to or larger than 90 degrees, the procedure proceeds to step S17, wherein the head of the enemy character is turned toward the original line-of-sight vector MHV. In other words, if the angle θH1 is equal to or larger than 90 degrees, the head of the enemy character will not be turned toward the player. This is because the rotation of the head through an angle equal to or larger than 90 degrees is not natural in view of real-world behaviour.
If the angle θH1 is smaller than 90 degrees, the line-of-sight vector HV of the enemy character is corrected so that an angle θH2 included between the vector HV and the vector MHV falls within 45 degrees (step S15). The head is then turned toward the corrected vector HV (step S16).
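In other words, steps S11 to S17 rotate the line of sight toward the target only when the required turn is below 90 degrees, and clamp the corrected turn to 45 degrees. A sketch of that decision follows; the vector helpers and the linear blend used to approximate the clamped turn are assumptions, since the patent does not give the exact computation.

    #include <cmath>

    struct Vec3f { float x, y, z; };

    static float dot3(const Vec3f& a, const Vec3f& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
    static float len3(const Vec3f& v)                 { return std::sqrt(dot3(v, v)); }

    // Choose the head line-of-sight vector HV, following Fig. 14:
    // MHV comes from the motion data, THV points from the head to the target TP.
    static Vec3f correctedLineOfSight(const Vec3f& MHV, const Vec3f& THV,
                                      float maxDeg = 45.0f)
    {
        float cosH1 = dot3(MHV, THV) / (len3(MHV) * len3(THV));
        float h1Deg = std::acos(cosH1) * 180.0f / 3.14159265f;

        if (h1Deg >= 90.0f)                 // S14: keep the motion-data direction
            return MHV;

        // S15: keep the corrected angle within maxDeg of MHV (linear blend as a
        // stand-in for the exact interpolation, which the patent leaves open).
        float t = (h1Deg <= maxDeg) ? 1.0f : maxDeg / h1Deg;
        return { MHV.x + (THV.x - MHV.x) * t,
                 MHV.y + (THV.y - MHV.y) * t,
                 MHV.z + (THV.z - MHV.z) * t };
    }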
It is then judged, based on the flag GF described in Figs. 11A to 12B, whether or not the gun should be turned toward the target position TP (or the player) (step S18). If not so, the procedure proceeds to step S24. If so, the procedure proceeds to step S19.
If the procedure proceeds to step S24, the breast and both arms are turned to the original direction specified by the motion data.
On the other hand, if the procedure proceeds to step S19, a vector TGV from the breast to the target position TP is determined as shown in Fig. 17A. An angle θG included between the original gun direction vector MGV specified by the motion
data and the vector TGV is also determined (step S20).
It is then judged whether or not the angle θG is smaller than 60 degrees (step S21). If the angle θG is equal to or larger than 60 degrees, the procedure proceeds to step S24, wherein the breast and both arms of the enemy character are turned to the original direction specified by the motion data.
On the other hand, if the angle θG is smaller than 60 degrees, the breast is rotated about the Y-axis in the base coordinate system for alignment, as shown in Fig. 17B (step S22).
Both the arms are then rotated about the X-axis in the breast coordinate system for alignment, as shown in Fig. 17C (step S23).
In such a manner, the first motion correction, in which the head and gun of the enemy character are turned toward the target position (or the player), is completed.
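Viewed as code, steps S19 to S24 are a thresholded two-stage alignment: if the gun needs to swing less than 60 degrees, yaw the breast about the base Y-axis and pitch both arms about the breast X-axis; otherwise fall back to the motion-data direction. The sketch below assumes the alignment can be expressed as separate yaw and pitch angles, which is an illustrative simplification rather than the patent's exact computation.

    #include <cmath>

    // Result of the gun-alignment decision (angles in degrees, illustrative only).
    struct GunAlignment { float breastYawDeg; float armsPitchDeg; bool aligned; };

    static GunAlignment alignGunTowardTarget(float targetYawDeg, float targetPitchDeg,
                                             float motionYawDeg, float motionPitchDeg)
    {
        float dYaw   = targetYawDeg   - motionYawDeg;
        float dPitch = targetPitchDeg - motionPitchDeg;
        float turn   = std::sqrt(dYaw * dYaw + dPitch * dPitch);  // approx. total angle

        if (turn >= 60.0f)          // S21: keep the original motion-data direction (S24)
            return { motionYawDeg, motionPitchDeg, false };

        // S22: rotate the breast about the base Y-axis (yaw) toward the target;
        // S23: rotate both arms about the breast X-axis (pitch) toward the target.
        return { targetYawDeg, targetPitchDeg, true };
    }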
Fig. 18 is a flowchart of the second motion correction shown at the step S9 of Fig. 13.
First of all, the hitting force corresponding to the hit part is applied to that part (or bone) (step S30). For example, a shot hits the head in Fig. 19A. In such a case, the hitting force against the head is maximized. As the distance between the head and another part increases, the hitting force applied to that part is reduced. It is then judged whether or not the part to be processed is the hip of the enemy character (step S31). If not so, the rotational speed for slightly shifting (moving) the respective parts (or bones) is updated based on the hitting force, restoring force (see Figs. 6A-6C) and damping resistance. More
particularly, such a calculation as shown by the following expression (1) is done.
ωn = ωn-1 - K1 × θn-1 - K2 × ωn-1 + FH (1)
In the above expression (1), ωn and ωn-1 are the rotational speeds in the n-th and (n-1)-th frames; K1 is the spring coefficient (or coefficient of the restoring force); θn-1 is the amount of slight rotation shift in a part at the (n-1)-th frame; K2 is the coefficient of damping resistance; and FH is the hitting force.
Thereafter, the amount of slight rotation shift in each part (or bone) is updated based on the updated rotational speed.
More particularly, such a calculation as shown by the following expression (2) is done.
θn = (TM/TMS) × (θn-1 + ωn) (2)
In the above expression (2), TM is such a timer value as described in Figs. 7A-7D; TMS is the initial timer value (e.g., 90); θn and θn-1 are the amounts of slight rotation shift at the n-th and (n-1)-th frames; and ωn is the rotational speed determined by the aforementioned expression (1).
The calculations of the expressions (1) and (2) are actually performed for rotation about each axis.
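Expressions (1) and (2) describe a damped spring acting on each bone's rotation, whose amplitude is additionally scaled down as the reaction timer TM runs out. A direct transcription as a per-frame update for one rotation axis might look as follows; the concrete values of K1, K2 and TMS are placeholders, not values taken from the patent.

    // Per-frame spring-damper update for one rotation axis of one bone,
    // transcribing expressions (1) and (2).
    struct BoneReaction {
        float omega = 0.0f;   // rotational speed (omega_n)
        float theta = 0.0f;   // amount of slight rotation shift (theta_n)
    };

    static void updateReaction(BoneReaction& b, float hitForceFH, int timerTM,
                               float K1 = 0.2f,   // spring (restoring force) coefficient
                               float K2 = 0.1f,   // damping resistance coefficient
                               int   TMS = 90)    // initial timer value
    {
        // (1)  omega_n = omega_(n-1) - K1 * theta_(n-1) - K2 * omega_(n-1) + FH
        b.omega = b.omega - K1 * b.theta - K2 * b.omega + hitForceFH;

        // (2)  theta_n = (TM / TMS) * (theta_(n-1) + omega_n)
        b.theta = (static_cast<float>(timerTM) / TMS) * (b.theta + b.omega);
    }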
If the hip of the enemy character is to be processed, the moving speed is also updated in addition to the rotational speed (step S34). Furthermore, the amount of slight movement shift
is also updated in addition to the amount of slight rotation shift (step S35).
More particularly, a part other than the hip (e.g., the head 3) is shifted only through the rotational angle, as shown in Fig. 19B. However, the hip 1 is shifted relating to not only the rotational angle but also the position, as shown in Fig. 19C. In this connection, one arm (including the upper arm and forearm) is slightly rotated about an axis connecting the shoulder and the hand, as shown in Fig. 19D, since the hand (or gun) is fixed to have a reduced degree of freedom as described in Fig. 8. One leg (including the thigh and shin) is slightly rotated about an axis connecting the hip joint and the foot, as shown in Fig. 19E, since the foot is fixed to have a reduced degree of freedom as described in Fig. 8.
Next, the positions and rotation angles of the hip, breast and head are determined (step S36).
More particularly, the position and rotation angle of the base BS shown in Fig. 3 is first determined based on the motion data. Based on the position and rotation angle of this base BS, the motion data and the amounts of slight rotation and movement shifts determined at the step S35, the position and rotation angle of the hip is determined (that is, the matrix of the coordinate transformation of the hip bone to the world coordinate system being determined).
Based on the determined position and rotation angle of the hip, the motion data varied by the first motion correction
of Figs. 14 and 15 and the amount of slight rotation shift obtained at the step S33, the position and rotation angle of the breast is determined (that is, the matrix of the coordinate transformation of the breast bone to the world coordinate system being determined).
Based on the determined position and rotation angle of the breast, the motion data varied by the first motion correction of Figs. 14 and 15 and the amount of slight rotation shift obtained at the step S33, the position and rotation angle of the head is determined (that is, the matrix of the coordinate transformation of the head bone to the world coordinate system being determined).
The positions and rotation angles of the arms are then determined based on the positions of the shoulders specified by the position and rotation angle of the breast and the positions of both the fixed hands (or gun) (step S37). Namely, considering the amount of slight rotation shift in Fig. 19D, the positions and rotation angles of the upper arms and forearms are determined through inverse kinematics (that is, the matrix of the coordinate transformation of the upper arm and forearm bones to the world coordinate system being determined).
The positions and rotation angles of the legs are then determined based on the positions of the hip joints specified by the position and rotation angle of the hip and the positions of the fixed feet (step S37). Namely, considering the amount of slight rotation shift in Fig. 19E, the positions and rotation angles of the thighs and shins are determined through inverse
kinematics (that is, the matrix of the coordinate transformation of the thigh and shin bones to the world coordinate system being determined).
In such a manner, the positions and rotation angles of all the parts (or bones) (that is, the matrices of the coordinate transformation to the world coordinate system) are determined.
Thereafter, the positions of the vertexes forming the parts are determined (that is, each vertex being coordinate-transformed into the world coordinate system). Thus, the image of the enemy character can be generated.
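Step S37 recovers the upper-arm/forearm (or thigh/shin) angles from a fixed hand (or foot) position by inverse kinematics. The patent does not spell out the solver, so the following is only the classic two-bone analytic solution in a plane, given to illustrate the kind of computation involved.

    #include <cmath>

    // Two-bone analytic IK in a plane: the shoulder (or hip joint) is at the
    // origin, L1 and L2 are the segment lengths, and (tx, ty) is the fixed
    // hand (or foot) position. Returns the two joint angles in radians.
    struct TwoBoneAngles { float rootRad; float midRad; bool reachable; };

    static TwoBoneAngles solveTwoBoneIK(float L1, float L2, float tx, float ty)
    {
        float d2 = tx * tx + ty * ty;
        float d  = std::sqrt(d2);
        if (d > L1 + L2 || d < std::fabs(L1 - L2))
            return { 0.0f, 0.0f, false };                  // target out of reach

        // Law of cosines gives the elbow (or knee) angle...
        float cosMid = (d2 - L1 * L1 - L2 * L2) / (2.0f * L1 * L2);
        float mid    = std::acos(cosMid);

        // ...and the shoulder (or hip) angle is the direction to the target
        // minus the interior angle of the triangle at the root joint.
        float cosRoot = (d2 + L1 * L1 - L2 * L2) / (2.0f * L1 * d);
        float root    = std::atan2(ty, tx) - std::acos(cosRoot);

        return { root, mid, true };
    }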
4. Hardware configuration
An example of a hardware configuration capable of implementing this embodiment will now be described with reference to Fig. 20. The system shown in Fig. 20 comprises a CPU 1000, ROM 1002, RAM 1004, an information storage medium 1006, a sound generation IC 1008, an image generation IC 1010 and I/O ports 1012, 1014, all of which are interconnected through a system bus 1016 for data reception and transmission. The image generation IC 1010 is connected to a display 1018; the sound generation IC 1008 to a speaker 1020; the I/O port 1012 to a control device 1022; and the I/O port 1014 to a communication device 1024.
The information storage medium 1006 has mainly stored a program, image data for representing objects, sound data and others. For example, a home game apparatus may use a DVD, game cassette, CD-ROM or the like as an information storage medium
for storing the game program and other data. An arcade game apparatus may use a memory such as ROM or the like. In the latter case, the information storage medium 1006 is in the form of the ROM 1002.
The control device 1022 corresponds to a game controller, control panel or the like. The control device 1022 is used by the player for inputting his or her judgment into the game apparatus according to the progress of the game.
The CPU 1000 is to perform the control of the entire game apparatus and the processing of various data according to the program stored in the information storage medium 1006, the system program (such as information for initializing the entire system) stored in the ROM 1002, input signals from the control device 1022 and so on. The RAM 1004 is a memory means used as a working area for the CPU 1000 and stores given contents of the information storage medium 1006 and ROM 1002 or the results of computation in the CPU 1000. Data structures having a logical structure for realizing this embodiment may be built on this RAM or information storage medium.
The sound and image generation ICs 1008, 1010 in this game apparatus are to output game sounds and images in a preferred manner. The sound generation IC 1008 is in the form of an integrated circuit for generating game sounds such as sound effects, background music and others, based on the
information stored in the information storage medium 1006 and ROM 1002, the generated sounds being then outputted through the speaker 1020. The image generation IC 1010 is in the form of
an integrated circuit which can generate pixel information to be outputted toward the display 1018 based on the image information from the RAM 1004, ROM 1002, information storage medium 1006 and so on. The display 1018 may be in the form of a so-called head mount display (HMD).
The communication device 1024 is to receive and transmit various pieces of information which are utilized in the game apparatus from and to the outside. The communication device 1024 is connected to another game apparatus (or apparatuses) to transmit and receive given information corresponding to the game program from and to the other game apparatuses, or is utilized to transmit and receive information including the game program and other data through the communication line.
Various processing steps previously described in connection with Figs. 1 to 19E are realized by the information storage medium 1006 storing information such as a program, data and so on, and by the CPU 1000, image generation IC 1010 and sound generation IC 1008 which operate based on the information from the information storage medium 1006. The processings in the image generation IC 1010 and sound generation IC 1008 may be performed in a software manner through the CPU 1000 or a general-purpose DSP.
When this embodiment is applied to such an arcade game apparatus as shown in Fig. 1, a system board (or circuit board) 1106 included in the game apparatus comprises a CPU, image generation IC, sound generation IC and others, all of which are mounted therein. The system board 1106 includes an information
storage medium or semiconductor memory 1108 which has stored information for executing (or realizing) the processings of this embodiment (or means of the present invention). These pieces of information will be referred to as "the stored information pieces".
Fig. 21A shows a home game apparatus to which this embodiment is applied. A player enjoys a game by manipulating game controllers 1202 and 1204 while viewing a game picture displayed on a display 1200. In such a case, the aforementioned stored information pieces have been stored in a DVD 1206 and memory cards 1208, 1209 which are detachable information storage media in the game apparatus body.
Fig. 21B shows an example wherein this embodiment is applied to a game system which includes a host machine 1300 and terminals 1304-1 to 1304-n connected to the host machine 1300 through a communication line 1302 (which is a small-scale network such as a LAN or a global network such as the Internet). In such a case, the above stored information pieces have been stored in an information storage medium 1306 such as a magnetic disk device, magnetic tape device, semiconductor memory or the like which can be controlled by the host machine 1300, for example. If the terminals 1304-1 to 1304-n are designed each to have a CPU, image generation IC and sound processing IC and to generate game images and game sounds in a standalone manner, the host machine 1300 delivers the game program and other data for generating game images and game sounds to the terminals 1304-1 to 1304-n. On the other hand, if the game images and sounds cannot
be generated by the terminals in the standalone manner, the host machine 1300 will generate the game images and sounds, which are in turn transmitted to the terminals 1304-1 to 1304-n.
In the arrangement of Fig. 21B, the processings of the present invention may be decentralized into the host machine (or server) and terminals. The above information pieces for realizing the present invention may be distributed and stored into the information storage media of the host machine (or server) and terminals.
Each of the terminals connected to the communication line may be either of home or arcade type. When the arcade game systems are connected to the communication line, it is desirable that each of the arcade game systems includes a portable information storage device (memory card or portable game machine) which can not only transmit the information between the arcade game systems but also transmit the information between the arcade game systems and the home game systems.
The technique of slightly changing the skeleton shape of the model object is not limited to that described in connection
with this embodiment, but may be carried out in any of various other forms.
The techniques of motion correction are not limited to the forms of this embodiment.
Although this embodiment has been described as to the two-step motion correction, the present invention may similarly be applied to any other technique of three or more step type motion correction.
Although this embodiment has been described as to the model objects adopted as enemy characters, the model objects are not limited to the enemy characters, but may be used as various other characters each having at least a skeleton structure and being moved by the motion data, such as player's characters, moving bodies, stationary bodies and others.
Although this embodiment has been described as to the model object hit by a shot, the present invention is not limited to such a shot, but may include various other hits by sword, punch and kick.
In addition to the gun game, the present invention may be applied to any of various other games such as other shooting games, fighting games, robot fighting games, sports games, competition games, role-playing games, music playing games, dancing games and so on.
Furthermore, the present invention can be applied to various image generation systems such as arcade game systems, home game systems, large-scale multi-player attraction systems, simulators, multimedia terminals, image generation
systems, game image generation system boards and so on.

Claims (4)

1. An image generation system for generating an image, comprising: means for performing a first motion correction in which a skeleton shape of a model object specified by motion data is changed, the model object including a plurality of parts, then performing a second motion correction in which the skeleton shape of the model object which has been changed by the first motion correction is further changed, and so on, and then finally performing an N-th (N>2) motion correction in which the skeleton shape of the model object which has been changed by the (N - 1)-th motion correction is still further changed; and means for generating images including an image of the model object, wherein information for respectively enabling or disabling the first to N-th motion correction is set to the model object.
2. An image generation system according to claim 1, wherein the first to N-th motion corrections are performed while substantially fixing a position or rotation angle of the body coordinate system of the model object relative to the world coordinate system to a position or rotation angle specified by the motion data.
3. A computer usable program embodied on an information storage medium or in a carrier wave, comprising a processing routine for implementing: means for performing a first motion correction in which a skeleton shape of a model object specified by motion data is changed, the model object including a plurality of parts, then performing a second motion correction in which the skeleton shape of the model object which has been changed by the first motion correction is further changed, and so on, and then finally performing an N-th (N>2) motion correction in which the skeleton shape of the model object which has been changed by the (N - 1)-th motion correction is still further changed; and means for generating images including an image of the model object,
wherein information for respectively enabling or disabling the first to N-th motion correction is set to the model object.
4. A computer usable program according to claim 3, wherein the first to N-th motion corrections are performed while substantially fixing a position or rotation angle of the body coordinate system of the model object relative to the world coordinate system to a position or rotation angle specified by the motion data.
GB0403651A 1999-08-25 2000-08-21 Image generation system and program Expired - Lifetime GB2395645B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP23865199A JP4301471B2 (en) 1999-08-25 1999-08-25 Image generation system and information storage medium
GB0020623A GB2356785B (en) 1999-08-25 2000-08-21 Image generation system and program

Publications (3)

Publication Number Publication Date
GB0403651D0 GB0403651D0 (en) 2004-03-24
GB2395645A true GB2395645A (en) 2004-05-26
GB2395645B GB2395645B (en) 2004-07-21

Family

ID=32178848

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0403651A Expired - Lifetime GB2395645B (en) 1999-08-25 2000-08-21 Image generation system and program

Country Status (1)

Country Link
GB (1) GB2395645B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1721645A3 (en) * 2005-05-11 2011-11-23 Nintendo Co., Ltd. Image processing program and image processing apparatus
US8151007B2 (en) 2007-08-24 2012-04-03 Nintendo Co., Ltd. Information processing program and information processing apparatus

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2324690A (en) * 1997-04-25 1998-10-28 Nintendo Co Ltd Video games system
GB2332863A (en) * 1997-12-12 1999-07-07 Namco Ltd Gun based video game.

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2324690A (en) * 1997-04-25 1998-10-28 Nintendo Co Ltd Video games system
GB2332863A (en) * 1997-12-12 1999-07-07 Namco Ltd Gun based video game.

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1721645A3 (en) * 2005-05-11 2011-11-23 Nintendo Co., Ltd. Image processing program and image processing apparatus
US8297622B2 (en) 2005-05-11 2012-10-30 Nintendo Co., Ltd. Image processing program and image processing apparatus
US8151007B2 (en) 2007-08-24 2012-04-03 Nintendo Co., Ltd. Information processing program and information processing apparatus
US9616337B2 (en) 2007-08-24 2017-04-11 Nintendo Co., Ltd. Information processing program and information processing apparatus
US10071309B2 (en) 2007-08-24 2018-09-11 Nintendo Co., Ltd. Information processing program and information processing apparatus

Also Published As

Publication number Publication date
GB0403651D0 (en) 2004-03-24
GB2395645B (en) 2004-07-21

Similar Documents

Publication Publication Date Title
US6532015B1 (en) Image generation system and program
JP3183632B2 (en) Information storage medium and image generation device
JP3145059B2 (en) Information storage medium and image generation device
US7084855B2 (en) Image generation method, program, and information storage medium
US7922584B2 (en) Image generation method and information storage medium with program for video game in which operation of the controller beyond a predetermined angle causes a character to attack
US7088366B2 (en) Image generation method, program, and information storage medium
JP4775989B2 (en) Image generation system, program, and information storage medium
US7281981B2 (en) Image generation method, program, and information storage medium
JP4278072B2 (en) Image generation system and information storage medium
JP4059408B2 (en) GAME DEVICE AND INFORMATION STORAGE MEDIUM
US7148894B1 (en) Image generation system and program
US7008323B1 (en) Image generation method and program
JP4097236B2 (en) Image generating apparatus and information storage medium
JP4114825B2 (en) Image generating apparatus and information storage medium
GB2395645A (en) Motion correction of on-screen targets in shooting game
JP4669054B2 (en) Image generation system and information storage medium
JP4420729B2 (en) Program, information storage medium, and image generation system
JP4642118B2 (en) Image generation system and information storage medium
JP3990050B2 (en) GAME DEVICE AND INFORMATION STORAGE MEDIUM
JP4641602B2 (en) GAME SYSTEM AND INFORMATION STORAGE MEDIUM
JP2001052199A (en) Image generation system and information storage medium
JP4642104B2 (en) Image generation system and information storage medium
JPH11244532A (en) Image producing device and information storage medium
JP2001104634A (en) Game machine
JP2001052201A (en) Image generation system and information storage medium

Legal Events

Date Code Title Description
PE20 Patent expired after termination of 20 years

Expiry date: 20200820