WO2004052481A1 - Game device, game control method, and program - Google Patents
Game device, game control method, and program
- Publication number
- WO2004052481A1 (PCT/JP2003/012634)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- game
- image
- game field
- image data
- display
- Prior art date
Classifications
-
- A63F13/10—
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
- A63F13/5255—Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/45—Controlling the progress of the video game
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/20—Perspective computation
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/812—Ball games, e.g. soccer or baseball
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
Definitions
- the present invention relates to a game apparatus that controls a game played in a game field.
- game fields such as stadiums are displayed for game scenes such as soccer games and baseball games.
- the reality of the scene is regarded as important in order to increase the sense of presence given to the player, and a realistic depiction of the game field is also required.
- the following measures are taken for the lawn in the stadium, which is a game field.
- grass is mown with a mowing machine, so the blades lean in the mowing direction, and stripe or grid patterns appear on the lawn. Some games therefore also generate a game field image that mimics such a pattern.
- the lawn pattern does not change even if the viewpoint is changed three-dimensionally.
- when the viewpoint changes in three dimensions, the line-of-sight direction relative to the mowing direction changes; if the pattern does not change accordingly, realism is impaired.
- in a real stadium, the appearance of the pattern changes with the viewpoint; realism is lost when this change is not reproduced.
- the present invention has been made in view of the above circumstances; one of its purposes is to provide a game device, a game control method, and a program that can change the game field image based on various factors, such as a change in viewpoint, by a simple method, thereby improving the realism of the game field.
- Disclosure of the invention
- the present invention for solving the problems of the above-described conventional example is a game device for controlling a game performed in a game field, wherein the game content performed in the game field is determined based on a player's operation.
- the game field is arranged in a three-dimensional space
- the display means displays the game field based on a set line-of-sight direction
- the composition rate changing means may change the image composition rate based on at least one of the line-of-sight direction and the set light source position.
- the display means may display the game field by superimposing models in which each of the plurality of image data is set as a texture, placing the models in the three-dimensional space, and rendering them.
- the present invention for solving the problems of the conventional example described above is also a game control method for controlling, using a computer, a game performed in a game field.
- the present invention for solving the problems of the conventional example described above is further an information storage medium storing a program for controlling a game performed in a game field, the program causing the computer to execute a procedure for displaying the game field.
- FIG. 1 is a block diagram showing the configuration of the game device according to the embodiment of the present invention.
- FIG. 2 is a configuration block diagram illustrating an example of a drawing control unit.
- FIG. 3 is a functional block diagram showing an example of game field display processing.
- FIG. 4 is an explanatory diagram showing the angle formed between the line-of-sight direction and the reference line of the game field.
- FIG. 5 is an explanatory diagram showing an example of the display state of the game field.
- FIG. 6 is an explanatory diagram showing an arrangement example when a model is used.
- FIG. 7 is an explanatory diagram illustrating an example of image data to be combined.
- the game device includes a control unit 11, a storage unit 12, a drawing control unit 13, a display storage unit 14, a display unit 15, and an operation unit 16.
- the control unit 11 operates according to a program stored in the storage unit 12, executes predetermined game processing according to player operations input from the operation unit 16, causes the drawing control unit 13 to draw the result, and has it displayed on the display unit 15.
- the control unit 11 also determines the drawing contents of the game screen, such as the game field and the characters and obstacles on it, and outputs the determined contents to the drawing control unit 13 to have the game screen drawn.
- a special feature of this embodiment is that, through the game field display processing performed by the control unit 11 and the drawing control unit 13, the game field image is changed as the line-of-sight direction and the light source position change. The contents of this game field display processing are described in detail later.
- the storage unit 12 stores the program executed by the control unit 11.
- the storage unit 12 may include a device that reads a program from a storage medium such as a CD-ROM or a DVD-ROM in addition to the semiconductor memory.
- the storage unit 12 also operates as a work memory that holds various data generated in the process of the control unit 11.
- the drawing control unit 13 includes a texture buffer 21, a model buffer 22, and a rendering unit 23.
- the texture buffer 21 stores at least one image data as a texture in accordance with an instruction input from the control unit 11.
- the model buffer 22 receives model data from the control unit 11 (vertex coordinates, shape data indicating what figure is drawn from those vertex coordinates, appearance data indicating which texture is set, and so on) and stores at least one set of model data.
- the rendering unit 23 receives settings for the viewpoint coordinates, line-of-sight direction, light source position, light source type, and other conditions from the control unit 11, and, based on those settings, renders each model stored in the model buffer 22 as seen from the set viewpoint in the set line-of-sight direction, drawing the models sequentially from the farthest as viewed from the viewpoint and storing each rendering result in the display storage unit 14.
- in doing so, the rendering unit 23 applies the appearance data set for each model and renders while taking into account conditions such as the set light source position and light source type. Since a general real-time rendering technique such as the Z-buffer method can be used here, a detailed description is omitted.
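The far-to-near drawing order described above can be sketched as a painter's-algorithm loop. This is a minimal illustration, not the patent's implementation: the function names, the dict layout, and the `render_one` callback are assumptions.

```python
def distance(a, b):
    """Euclidean distance between two 3-D points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def render_back_to_front(models, viewpoint, render_one):
    """Sort the models by distance from the viewpoint, farthest first,
    then render each in turn so nearer models are composited over
    farther ones, as the rendering unit 23 is described to do."""
    ordered = sorted(models,
                     key=lambda m: distance(viewpoint, m["position"]),
                     reverse=True)
    return [render_one(m) for m in ordered]
```

With the base model farthest from the viewpoint and the second model nearest, the loop visits them in the order B, P, Q, matching the layering described for FIG. 6.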
- the display storage unit 14 is a so-called VRAM (Video Random Access Memory), and holds the drawing result of the screen for at least one frame input from the drawing control unit 13.
- the display unit 15 is a display controller or the like, and sends the drawing result stored in the display storage unit 14 to the display.
- the display may be a television receiver if the game device of the present embodiment is a home game device; if it is a commercial (arcade) game device, the display may be a projector or the like.
- the operation unit 16 is a game controller, a keyboard, a mouse, or the like, and accepts the player's operation and outputs the contents to the control unit 11.
- the functions for realizing the game field display processing include a game control unit 31, a presentation control unit 32, a composition rate changing unit 33, and a display control unit 34, and are implemented in hardware, in software, or in a combination of both.
- the game control unit 31 processes the content of the game performed in the game field based on the player's operations, and according to the processing result outputs to the display control unit 34 instructions for displaying the character controlled by the player on the game field. The game control unit 31 also determines and displays the positions of characters other than the player's character, and performs processing necessary for the game, such as determining, in the case of a soccer game, whether a goal has been scored.
- the presentation control unit 32 determines presentation conditions, such as the position from which the game field is viewed (the viewpoint), the viewing direction (the line-of-sight direction), and the light source position, based on conditions determined in advance according to the progress of the game controlled by the game control unit 31, and outputs them to the composition rate changing unit 33 and the display control unit 34.
- for example, before the game starts (before kickoff in the case of a soccer game), the presentation control unit 32 moves the viewpoint along the outer periphery of the game field, sets the line-of-sight direction toward a part of interest (such as the center circle), and produces an effect as if a camera were making a round of the game field.
- the line of sight is directed toward the area of attention on the game field (in the case of a soccer game, the character controlling the ball), and the scene is projected from a certain viewpoint (where the camera is located).
- the presentation control unit 32 also switches to other viewpoints in the same manner, so that the game field can be seen from various directions.
- the composition rate changing unit 33 refers to the presentation-condition information, such as the viewpoint, line-of-sight direction, and light source position, input from the presentation control unit 32, and based on that information changes the composition rate of each of the plurality of image data used for displaying the game field (the image composition rate).
- for example, as shown in FIG. 4, the composition rate changing unit 33 may change the image composition rate based on a mathematical formula set in advance according to the angle θ between the line-of-sight direction and a reference line L virtually set on the game field (if the game field is fixed to the X and Y coordinate axes, the X axis may serve as the reference line; if the game field is flat as in FIG. 4, θ may be the angle between the vector obtained by projecting the line-of-sight vector onto that plane and the vector in the direction of the reference line L).
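The angle-based rate change can be sketched as follows; this is only an illustration, assuming the field lies in the x-y plane and a simple linear crossfade between the example rates 24 and 68 (the patent only requires "a mathematical formula set in advance", so the concrete formula and names here are assumptions).

```python
import math

def gaze_angle_deg(gaze, reference=(1.0, 0.0)):
    """Angle theta between the line-of-sight vector projected onto the
    flat game-field plane (here: drop the vertical z component) and the
    reference line L, in degrees."""
    gx, gy = gaze[0], gaze[1]  # projection onto the field plane
    rx, ry = reference
    dot = gx * rx + gy * ry
    norm = math.hypot(gx, gy) * math.hypot(rx, ry)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def composition_rates(theta_deg, low=24.0, high=68.0):
    """Map theta to the two image composition rates alpha1/alpha2 with a
    linear crossfade, so the rates swap as the line of sight rotates
    from 0 to 90 degrees."""
    t = min(abs(theta_deg), 90.0) / 90.0
    return low + (high - low) * t, high - (high - low) * t
```

At θ = 0 this yields (24, 68) and at θ = 90 the swapped (68, 24), so the two stripe patterns gradually trade visibility as the camera rotates.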
- the changed image composition rate is output to the display control unit 34.
- for example, the numerical values of the image composition rates α1 and α2 for the two image data are calculated as follows.
- at one value of the angle θ, α1 is "24" and α2 is "68"; at another, α1 becomes "68" and α2 becomes "24".
- here α1 + α2 is set to be a constant value "100", but it need not be a constant value.
- if the sum is smaller, the composite image becomes slightly translucent (the background shows through).
- here the game field of a soccer game is used as an example, so the game field may be fixed in three-dimensional coordinates, but depending on the type of game the field serving as the stage may itself rotate. Even in such a case, the image composition rate changes gradually according to the angle between the line-of-sight direction and the virtual reference line of the game field.
- the use of the line-of-sight direction described above is only an example.
- the image composition rate can be changed depending on the light source position.
- in that case, the image composition rate is changed based on how far, and in what direction, the light source is positioned relative to a virtual reference point on the game field (in the case of a soccer game, the center of the center circle, etc.).
- the image composition rate may also be changed based on the viewpoint position, or on other environmental conditions (game parameters generated or used in the game control unit 31, such as the generation of fog, the ambient temperature, or the score).
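A light-source-based variant can be sketched the same way; the linear falloff, the 0..100 scale, and all names below are illustrative assumptions, since the patent leaves the concrete mapping open.

```python
import math

def light_based_rate(light_pos, reference_point=(0.0, 0.0, 0.0),
                     max_dist=100.0):
    """Derive an image composition rate (0..100) from how far the light
    source sits from a virtual reference point on the field (e.g. the
    centre of the centre circle): full rate at the reference point,
    falling linearly to zero at max_dist."""
    d = math.dist(light_pos, reference_point)
    t = min(d, max_dist) / max_dist
    return 100.0 * (1.0 - t)
```

A direction-sensitive mapping could replace the distance with a dot product against a preferred lighting direction; the same per-stadium tuning mentioned below would then just swap the formula.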
- the player can select the type of stadium to be a game field, and the calculation content of the image composition rate can be changed for each stadium.
- the display control unit 34 uses the drawing control unit 13, the display storage unit 14, and the display unit 15 to generate a composite image by combining the plurality of image data set in advance as textures, at the rates given by the image composition rate input from the composition rate changing unit 33, and displays this on the display as the image of the game field.
- the display control unit 34 also draws objects to be displayed on the game field, such as characters and obstacles, over the composite game field image in accordance with display instructions input from the game control unit 31.
- this game field display processing in the present embodiment will be described by taking as an example a case where the game field of a soccer game is basically a striped pattern as shown in FIG.
- the darkness of the regions U and V is reversed when viewed from the main stand (Fig. 5 (a)) and when viewed from the back stand (Fig. 5 (b)).
- the reality can be improved by displaying the game field so that the darkness of each area is almost the same as shown in FIG. 5 (c).
- the difference in density is indicated by the difference in hatching.
- for this purpose, models are set in the model buffer 22 of the drawing control unit 13 as follows: as shown in FIG. 6, a first plane model (first model) P is placed against the viewpoint-side face of a rectangular plane model (base model) B serving as the base, so that its underside touches it, and a second plane model (second model) Q is placed against the viewpoint-side face of the first model P so that its underside touches it.
- the base model B, the first plane model P, and the second plane model Q all have the same shape.
- in FIG. 6 the models are drawn spaced apart for easy identification, but in reality they are arranged in close contact with each other.
- image data (basic image data) representing the green color of the turf is stored in the texture buffer 21.
- as the texture to be set on the first model P, image data (first pattern image data) in which dark-green regions G and transparent regions X are drawn alternately, forming a stripe pattern parallel to, for example, the vertical side of the rectangle of the first model P, is stored in the texture buffer 21, as shown in FIG. 7(a).
- as the texture to be set on the second model Q, image data (second pattern image data) that is the exact inverse of the first pattern image data, with dark-green regions G and transparent regions X drawn parallel to the vertical side, is stored in the texture buffer 21.
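The complementary stripe textures can be sketched as simple 2-D grids; the 'G'/'X' cell encoding and the function name are illustrative stand-ins for the actual image data.

```python
def make_stripe_textures(width, height, stripe_w):
    """Build the two pattern textures as grids of 'G' (dark-green) and
    'X' (transparent) cells: vertical stripes of width stripe_w, with
    the second pattern the exact inverse of the first."""
    first = [["G" if (x // stripe_w) % 2 == 0 else "X"
              for x in range(width)]
             for _ in range(height)]
    second = [["X" if cell == "G" else "G" for cell in row]
              for row in first]
    return first, second
```

Because every cell that is green in one pattern is transparent in the other, crossfading between them reverses the apparent light and dark stripes without moving them.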
- the composition rate changing unit 33 determines the image composition rates of the first pattern image data and the second pattern image data and outputs them to the display control unit 34.
- the display control unit 34 has drawing performed into the display storage unit 14 in order from the model farthest from the viewpoint. That is, the drawing control unit 13 first sets the basic image data as the texture of the base model B, renders it, and stores the rendering result in the display storage unit 14.
- the drawing control unit 13 then sets, for the first pattern image data, the image composition rate determined for it by the composition rate changing unit 33, sets that data as the texture of the first model P, renders it, and composites the rendering result with the image data stored in the display storage unit 14 at that point.
- in this compositing, for each pixel for which an image composition rate is set, the value P0 already stored in the display storage unit 14 is added to the rendered pixel value P1 multiplied by the image composition rate α1; that is, P0 + α1 × P1 is set as the new pixel value.
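The per-pixel compositing step above reduces to one line; treating the composition rate as a 0..100 percentage is an assumption about the scale, made to match the "24"/"68"/"100" example values.

```python
def blend_pixel(p0, p1, alpha):
    """Composite one rendered pixel into the display buffer:
    new value = P0 + alpha * P1, where P0 is the value already stored,
    P1 the freshly rendered value, and alpha the image composition
    rate expressed as a 0..100 percentage."""
    return p0 + (alpha / 100.0) * p1
```

Repeating this once with the first pattern's rate and once with the second pattern's rate yields the layered result described for models P and Q.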
- the same processing is performed for the second pattern image data: the image composition rate determined for it by the composition rate changing unit 33 is set, the data is set as the texture of the second model Q and rendered, and the result is composited with the contents of the display storage unit 14 at that point.
- in this way the display control unit 34 can generate and display composite image data in which the first pattern image data and the second pattern image data are combined according to the arrangement of the first and second models and the set image composition rates.
- here the image composition based on the image composition rate is performed when the image is stored in the display storage unit 14, but this is not the only option.
- alternatively, each pixel of a texture may be multiplied by the image composition rate before the texture is set on its model, and the pre-multiplied texture then set on the model.
- or, at rendering time, the pixel values may be calculated while referring to the image composition rate set for each model's texture, and the calculated pixel values stored in the display storage unit 14.
- the first model and the second model are used here, but instead, after multiplying the basic image data, the first pattern image data, and the second pattern image data by their respective image composition rates, the corresponding pixel values may be added to generate a single composite texture, which is set as the texture of the base model B and rendered.
- in that case the first model and the second model are not necessarily required.
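The single-texture alternative above can be sketched as a pixel-wise weighted sum; the numeric-grid representation and the 0..100 rate scale are illustrative assumptions.

```python
def composite_texture(base, first, second, a1, a2):
    """Pre-combine the three textures into one: each pattern texture is
    multiplied by its image composition rate (0..100 percentage) and
    added, pixel by pixel, onto the base, giving a single texture that
    can be set on the base model B alone."""
    return [[b + (a1 / 100.0) * f + (a2 / 100.0) * s
             for b, f, s in zip(brow, frow, srow)]
            for brow, frow, srow in zip(base, first, second)]
```

This trades the two extra plane models for one texture-combination pass per composition rate change, which may be preferable when the renderer cannot layer translucent models cheaply.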
- the embodiment of the present invention is not limited to the case where a three-dimensional model is used.
- the image composition rate for a plurality of two-dimensional image data may be determined based on presentation-related information such as the virtual light source position, the line-of-sight direction, and the viewpoint.
- a plurality of two-dimensional image data may be combined and displayed at the determined image combining rate.
- the control unit 11 stores the base model B, the first model P, and the second model Q as shown in FIG. 6 in the model buffer 22 of the drawing control unit 13 before the game starts.
- the control unit 11 also stores in the texture buffer 21 the basic image data as the texture for the base model B, the first pattern image data (FIG. 7(a)) as the texture for the first model P, and the second pattern image data (FIG. 7(b)) as the texture for the second model Q.
- control unit 11 executes the game process in accordance with the player's operation input from the operation unit 16 and causes the drawing control unit 13 to draw the result. For example, for a character that is controlled by the player, the movement of the character is controlled in accordance with the operation of the player, and the image of the character as a result of the control is drawn by the drawing control unit 13.
- control unit 11 changes the viewpoint and the line-of-sight direction with respect to the game field based on conditions predetermined according to the progress of the game.
- the control unit 11 determines the image composition rates of the first pattern image data and the second pattern image data according to the angle between the line-of-sight direction and the reference line of the game field, and outputs them to the drawing control unit 13.
- the drawing control unit 13 sets the basic image data, the first pattern image data, and the second pattern image data stored in the texture buffer 21 as the textures of the base model B, the first model P, and the second model Q, respectively, and renders them with the corresponding image composition rates set for the first and second pattern image data.
- as the image composition rates of the first and second pattern image data change, the drawing control unit 13 displays the game field as shown in FIG. 5(a) when the line of sight is directed from the main stand side toward the back stand side, the game field as shown in FIG. 5(c) when the line of sight is directed from one side stand toward the other, and the game field as shown in FIG. 5(b) when the line of sight is directed from the back stand side.
- as the view moves from the main stand side to the back stand side, a state in which the light and dark of the turf gradually reverse can thus be produced.
- the drawing control unit 13 further composites objects such as characters and obstacles with the image data of the game field stored in the display storage unit 14.
- the display unit 15 displays the drawing result stored in the display storage unit 14 on the display.
- the reality of the game field can be improved by a simple process of changing the image composition rate.
- the content of the game is a soccer game.
- the same processing can be performed for game fields such as the inside of a dungeon in a role-playing game and a fighting stage in a fighting game. Even in a game that does not use a three-dimensional model, it is possible to easily represent game field image changes in response to changes in the light source position, thereby improving the reality of the game field.
- a synthesized image can be generated with gradation by using, for example, transparency mapping as a mode of setting the synthesis rate.
- since the game apparatus according to the present invention changes the image of the game field based on various factors, such as a change in viewpoint, by a simple method, it can be applied to improve the realism of the game field in many games.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP03748669A EP1568401B1 (en) | 2002-12-06 | 2003-10-02 | Game machine, game control method, and program |
AU2003268732A AU2003268732A1 (en) | 2002-12-06 | 2003-10-02 | Game machine, game control method, and program |
CN2003801046873A CN1720086B (zh) | 2002-12-06 | 2003-10-02 | 游戏装置以及游戏控制方法 |
DE60310761T DE60310761T2 (de) | 2002-12-06 | 2003-10-02 | Spielvorrichtung, spielsteuerungsverfahren und programm |
HK05107696A HK1073810A1 (en) | 2002-12-06 | 2005-09-01 | Game machine, game control method, and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2002355882A JP4079358B2 (ja) | 2002-12-06 | 2002-12-06 | ゲーム装置、ゲーム制御方法、及びプログラム |
JP2002-355882 | 2002-12-06 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004052481A1 true WO2004052481A1 (ja) | 2004-06-24 |
Family
ID=32463392
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2003/012634 WO2004052481A1 (ja) | 2002-12-06 | 2003-10-02 | ゲーム装置、ゲーム制御方法、及びプログラム |
Country Status (10)
Country | Link |
---|---|
US (1) | US7955173B2 (ja) |
EP (1) | EP1568401B1 (ja) |
JP (1) | JP4079358B2 (ja) |
KR (1) | KR100721751B1 (ja) |
CN (1) | CN1720086B (ja) |
AU (1) | AU2003268732A1 (ja) |
DE (1) | DE60310761T2 (ja) |
HK (1) | HK1073810A1 (ja) |
TW (1) | TWI244400B (ja) |
WO (1) | WO2004052481A1 (ja) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4164101B2 (ja) * | 2006-05-24 | 2008-10-08 | 株式会社コナミデジタルエンタテインメント | ゲームプログラム、ゲーム装置及びゲーム制御方法 |
JP4838221B2 (ja) * | 2007-10-30 | 2011-12-14 | 株式会社コナミデジタルエンタテインメント | 画像処理装置、画像処理装置の制御方法及びプログラム |
BRPI1004032A2 (pt) * | 2010-10-08 | 2013-02-13 | Juan Miguel Mayordomo Vicente | sistema de representaÇço grÁfica aplicÁvel a superfÍcies de grama |
JP2011141898A (ja) * | 2011-04-08 | 2011-07-21 | Namco Bandai Games Inc | プログラム、情報記憶媒体及び画像生成システム |
JP2016046642A (ja) * | 2014-08-21 | 2016-04-04 | キヤノン株式会社 | 情報処理システム、情報処理方法及びプログラム |
US10357717B2 (en) * | 2015-11-27 | 2019-07-23 | Earthbeat, Inc. | Game system and game program |
US11823318B2 (en) * | 2021-06-04 | 2023-11-21 | Nvidia Corporation | Techniques for interleaving textures |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002298157A (ja) * | 2001-03-28 | 2002-10-11 | Namco Ltd | ゲーム情報、情報記憶媒体およびゲーム装置 |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3227158B2 (ja) * | 1993-06-16 | 2001-11-12 | 株式会社ナムコ | 3次元ゲーム装置 |
TW346611B (en) * | 1996-03-28 | 1998-12-01 | Sega Enterprises Kk | An image processor, a game machine using the image processor, a method of image processing and a medium |
JPH09319891A (ja) * | 1996-06-03 | 1997-12-12 | Sega Enterp Ltd | 画像処理装置及びその処理方法 |
US6280323B1 (en) * | 1996-11-21 | 2001-08-28 | Konami Co., Ltd. | Device, method and storage medium for displaying penalty kick match cursors in a video soccer game |
EP0877991B1 (en) * | 1996-11-21 | 2003-01-08 | Koninklijke Philips Electronics N.V. | Method and apparatus for generating a computer graphics image |
JP3709509B2 (ja) * | 1996-12-04 | 2005-10-26 | 株式会社セガ | ゲーム装置 |
KR100300832B1 (ko) * | 1997-02-18 | 2002-10-19 | 가부시키가이샤 세가 | 화상처리장치및화상처리방법 |
JP3310257B2 (ja) * | 2000-03-24 | 2002-08-05 | 株式会社コナミコンピュータエンタテインメントジャパン | ゲームシステム及びゲーム用プログラムを記録したコンピュータ読み取り可能な記録媒体 |
JP3453119B2 (ja) | 2000-12-11 | 2003-10-06 | 株式会社ナムコ | 情報記憶媒体及びゲーム装置 |
-
2002
- 2002-12-06 JP JP2002355882A patent/JP4079358B2/ja not_active Expired - Lifetime
-
2003
- 2003-10-02 AU AU2003268732A patent/AU2003268732A1/en not_active Abandoned
- 2003-10-02 DE DE60310761T patent/DE60310761T2/de not_active Expired - Lifetime
- 2003-10-02 WO PCT/JP2003/012634 patent/WO2004052481A1/ja active IP Right Grant
- 2003-10-02 KR KR1020057009990A patent/KR100721751B1/ko not_active IP Right Cessation
- 2003-10-02 CN CN2003801046873A patent/CN1720086B/zh not_active Expired - Fee Related
- 2003-10-02 EP EP03748669A patent/EP1568401B1/en not_active Expired - Fee Related
- 2003-10-08 TW TW092127937A patent/TWI244400B/zh not_active IP Right Cessation
- 2003-12-04 US US10/726,612 patent/US7955173B2/en not_active Expired - Fee Related
-
2005
- 2005-09-01 HK HK05107696A patent/HK1073810A1/xx not_active IP Right Cessation
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002298157A (ja) * | 2001-03-28 | 2002-10-11 | Namco Ltd | ゲーム情報、情報記憶媒体およびゲーム装置 |
Also Published As
Publication number | Publication date |
---|---|
KR20050085297A (ko) | 2005-08-29 |
TWI244400B (en) | 2005-12-01 |
DE60310761D1 (de) | 2007-02-08 |
KR100721751B1 (ko) | 2007-05-25 |
CN1720086B (zh) | 2010-04-28 |
EP1568401A1 (en) | 2005-08-31 |
JP2004187731A (ja) | 2004-07-08 |
AU2003268732A8 (en) | 2004-06-30 |
CN1720086A (zh) | 2006-01-11 |
EP1568401A4 (en) | 2006-02-01 |
US20040110559A1 (en) | 2004-06-10 |
EP1568401B1 (en) | 2006-12-27 |
DE60310761T2 (de) | 2007-10-11 |
US7955173B2 (en) | 2011-06-07 |
JP4079358B2 (ja) | 2008-04-23 |
AU2003268732A1 (en) | 2004-06-30 |
HK1073810A1 (en) | 2005-10-21 |
TW200413065A (en) | 2004-08-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9789401B2 (en) | Game device, game system, and information storage medium | |
US20090244064A1 (en) | Program, information storage medium, and image generation system | |
US20120212491A1 (en) | Indirect lighting process for virtual environments | |
KR20000064948A (ko) | 화상 처리 장치 및 화상 처리 방법 | |
JP4079358B2 (ja) | ゲーム装置、ゲーム制御方法、及びプログラム | |
JP2005032140A (ja) | 画像生成システム、プログラム及び情報記憶媒体 | |
JP4749198B2 (ja) | プログラム、情報記憶媒体及び画像生成システム | |
JP4749064B2 (ja) | プログラム、情報記憶媒体及び画像生成システム | |
JP2009129167A (ja) | プログラム、情報記憶媒体、及び画像生成システム | |
US6890261B2 (en) | Game system, program and image generation method | |
JP2005275797A (ja) | プログラム、情報記憶媒体、及び画像生成システム | |
JP4528008B2 (ja) | プログラム、情報記憶媒体、及び画像生成システム | |
JP4707078B2 (ja) | 画像生成システム、プログラム及び情報記憶媒体 | |
JP4913399B2 (ja) | プログラム、情報記憶媒体及び画像生成システム | |
US20100144448A1 (en) | Information storage medium, game device, and game system | |
JP4056035B2 (ja) | 画像生成システム、プログラム及び情報記憶媒体 | |
JP6931723B2 (ja) | ゲーム機、ゲームシステム及びプログラム | |
JP2010033288A (ja) | 画像生成システム、プログラム及び情報記憶媒体 | |
JP4688405B2 (ja) | プログラム、情報記憶媒体及びゲーム装置 | |
JP2004334801A (ja) | 画像生成システム、プログラム及び情報記憶媒体 | |
JP2009112875A (ja) | ゲーム装置及び情報記憶媒体 | |
JP3763057B2 (ja) | 画像生成装置 | |
JP3990543B2 (ja) | プログラム、情報記憶媒体、および、ゲーム装置 | |
JP2010277407A (ja) | プログラム、情報記憶媒体及び画像生成システム | |
JP4632796B2 (ja) | プログラム、情報記憶媒体及び画像生成システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 20038A46873 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020057009990 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2003748669 Country of ref document: EP |
|
WWP | Wipo information: published in national office |
Ref document number: 1020057009990 Country of ref document: KR |
|
WWP | Wipo information: published in national office |
Ref document number: 2003748669 Country of ref document: EP |
|
WWG | Wipo information: grant in national office |
Ref document number: 2003748669 Country of ref document: EP |