EP0786742A1 - Vorrichtung und verfahren zur erzeugung von virtuellen bildern - Google Patents

Vorrichtung und Verfahren zur Erzeugung von virtuellen Bildern

Info

Publication number
EP0786742A1
Authority
EP
European Patent Office
Prior art keywords
subject
prescribed
physical object
show
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP96926620A
Other languages
English (en)
French (fr)
Other versions
EP0786742A4 (de)
EP0786742B1 (de)
Inventor
Kenji Yamamoto
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sega Corp
Original Assignee
Sega Enterprises Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sega Enterprises Ltd filed Critical Sega Enterprises Ltd
Publication of EP0786742A1
Publication of EP0786742A4
Application granted
Publication of EP0786742B1
Anticipated expiration
Current legal status: Expired - Lifetime


Classifications

    • A63F13/10
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525 Changing parameters of virtual cameras
    • A63F13/5252 Changing parameters of virtual cameras using two or more virtual cameras concurrently or sequentially, e.g. automatically switching between fixed virtual cameras when a character changes room or displaying a rear-mirror view in a car-driving game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A63F13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00 Stereoscopic photography
    • G03B35/18 Stereoscopic photography by simultaneous viewing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/40 Hidden part removal
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/64 Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6653 Methods for processing data by generating or executing the game program for rendering three dimensional images for altering the visibility of an object, e.g. preventing the occlusion of an object, partially hiding an object
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60 Methods for processing data by generating or executing the game program
    • A63F2300/66 Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661 Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera

Definitions

  • the present invention relates to a virtual image generation technique for use in game units, simulators, and the like, and particularly to a technique for generating images (hereinafter termed "virtual images") obtained when an object present in virtually generated three-dimensional space (hereinafter termed "virtual space") is projected (by perspective projection) onto a two-dimensional plane corresponding to a prescribed visual point.
  • Such virtual image generation apparatuses are usually equipped with a virtual image generation apparatus main unit that houses a computer unit for executing stored programs, an input device for sending control signals to the computer unit to instruct it to move objects displayed on the screen within the virtual image, a display for displaying the virtual images generated by the computer unit according to the program sequence, and a sound device for generating sounds according to the program sequence.
  • Examples of game devices with the architecture described above include those with a combat theme, in which a player-controlled object (robot, human, or the like) engages in combat with enemy objects over a terrain created in virtual space (hereinafter termed "virtual terrain").
  • player-controlled object: a robot, human, or the like
  • virtual terrain: a terrain created in virtual space
  • the objects controlled by the player in such game units attack enemies by shooting at them while hiding behind the obstacles and the like which are provided as part of the virtual terrain.
  • an image like that perceived when the virtual space is observed from a prescribed visual point is used. This is accomplished using coordinate conversion for perspective projection, whereby a coordinate system for the virtual space is represented in perspective from a prescribed visual point and projected onto a two-dimensional plane lying in front of the visual point.
  • the line of sight which extends from the visual point of the virtual image is oriented towards the player's object so that the object controlled by the player is visible to the player.
  • the object controlled by the player is displayed as a subject located nearly at the center of the display.
  • the present invention is intended to provide a virtual image generation apparatus that does not employ the aforementioned methods (1) and (2), and that affords a game whose look is not impaired.
  • a second object of the present invention is to provide a virtual image generation apparatus that correctly determines whether a subject can be displayed overlapping a physical object in virtual space, and which performs appropriate transparent processing to make both the subject and the physical object visible, as well as a method therefor.
  • the invention of Claim 1 is a virtual image generation apparatus which generates, within a virtually defined virtual space, virtual images of the below-mentioned subjects, physical objects, and other figures present in said virtual space as they would be observed from a prescribed visual point, while rendering said images show-through or non-show-through, comprising virtual image generation means for rendering said non-show-through images as show-through images when prescribed conditions have been fulfilled, and for rendering the show-through images as non-show-through images when said prescribed conditions are no longer fulfilled.
  • the invention of Claim 2 is a virtual image generation apparatus comprising shape data memory means for storing shape data pertaining to physical objects present in said virtual space, position data specification means for specifying position data for said subjects, overlap determination means for determining, on the basis of said shape data stored in said shape data memory means and position data for said subjects specified by said position data specification means, whether or not said physical object located between said visual point and said subject should overlap and be visible from said visual point, and image generation means for generating virtual images wherein said physical object is processed by prescribed show-through processing in the event that said overlap determination means has determined that said subject and said physical object are disposed in a prescribed overlapping state, and for generating virtual images wherein said physical object is processed by non-show-through processing and is not rendered show-through in the event that said subject and said physical object are disposed in a state other than a prescribed overlapping state.
  • the invention of Claim 6 is a virtual image generation apparatus as defined in Claim 2, wherein, for show-through display, the image generation means generates a virtual image by displaying pixels for displaying a subject in accordance with a prescribed pattern (a pattern in which a pixel is replaced every few dots, a striped pattern, or the like), rather than pixels for displaying a physical object.
  • prescribed pattern: a pattern in which a pixel is replaced every few dots, a striped pattern, or the like
  • the invention of Claim 7 is a virtual image generation apparatus as defined in Claim 5, wherein the pattern comprises an alternating sequence of pixels for displaying a physical object and pixels for displaying a subject.
  • obstacles and other physical objects which are displayed without show-through processing are, when certain prescribed conditions are fulfilled (for example, when a physical object comes between a visual point and a subject, as observed from the visual point), processed by show-through treatment; when these conditions no longer apply, physical object image generation returns to non-show-through processing.
  • the direction of the vector from the visual point to the subject and the vector from the physical object to the subject essentially coincide.
  • the angle defined by the two vectors is relatively small.
  • this angle is compared to a reference angle; thus, if the reference angle setting is made small enough to determine overlap, it may be accurately determined whether the physical object should overlap the subject.
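  • As a rough illustration of this angle test, a sketch in Python follows (the patent contains no code; the names and the 80-degree default for the reference angle, chosen from the 70 to 90 degree range mentioned later, are assumptions):

```python
import math

def interior_angle_deg(v1, v2):
    """Interior angle in degrees between two 2-D vectors given as (x, z) pairs."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norms = math.hypot(*v1) * math.hypot(*v2)
    if norms == 0.0:
        return 0.0
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norms))))

def may_overlap(obstacle_xz, subject_xz, viewpoint_xz, reference_angle=80.0):
    """True when the obstacle may hide the subject as seen from the visual point.

    Vector OR runs from the obstacle's projection point towards the subject's,
    vector CR from the visual point's projection point towards the subject's;
    when both point in roughly the same direction the interior angle is small,
    so an angle within the reference angle is treated as a possible overlap.
    """
    vec_or = (subject_xz[0] - obstacle_xz[0], subject_xz[1] - obstacle_xz[1])
    vec_cr = (subject_xz[0] - viewpoint_xz[0], subject_xz[1] - viewpoint_xz[1])
    return interior_angle_deg(vec_or, vec_cr) <= reference_angle
```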
  • the input device 11 is provided with control levers which are operated with the player's left and right hands in order to control the movement of the robot. Codes associated with various control attitudes of the control levers are transmitted as control signals to an I/O interface 106.
  • the output device 12 is provided with various types of lamps which notify the player of the operational status of the unit.
  • the TV monitor 13 displays the combat game image; a head mounted display (HMD), projector, or the like may be used in place of a TV monitor.
  • HMD: head mounted display
  • the RAM 103 temporarily stores data required for polygon data coordinate conversion and other functions, and stores various commands written for the geometalyzer (such as object display commands), the results of matrix operations during conversion processing, and other data.
  • the sound device 104 is connected to the speakers 14 through a power amplifier 105. Audio signals output by the sound device 104 are amplified by the power amplifier 105 and delivered to the speakers 14.
  • the ROM 109 stores shape data for physical objects (buildings, obstacles, topographical features, and the like) required to make overlap determinations as to whether a subject (object) should be obscured by an obstacle or other topographical feature, and collision determinations as to whether a subject should collide with another topographical feature.
  • topographical feature data might include an ID for each surface which defines a topographical feature, and the relationship between this ID and the topographical feature surface is put in table form and stored in the ROM 111.
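  • A minimal sketch of such an ID-to-surface table follows (the IDs, surface data, and layout are hypothetical; the actual ROM format is not described):

```python
# Hypothetical table relating a surface ID to the data describing that
# topographical feature surface, standing in for the table held in ROM.
TERRAIN_SURFACE_TABLE = {
    0x01: {"vertices": ((0, 0, 0), (10, 0, 0), (10, 0, 10), (0, 0, 10)), "kind": "ground"},
    0x02: {"vertices": ((10, 0, 0), (10, 5, 0), (10, 5, 10), (10, 0, 10)), "kind": "wall"},
}

def surface_for_id(surface_id):
    """Look up the topographical feature surface registered under an ID."""
    return TERRAIN_SURFACE_TABLE[surface_id]
```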
  • a coordinate system that indicates the relative positions of objects, obstacles, and other physical objects in a virtual space must be converted to a two-dimensional coordinate system (visual point coordinate system) that represents the virtual space as viewed from a designated visual point (for example, a camera or the like).
  • the visual point is set at some prescribed position (for example, diagonally above the subject) from which the subject to be controlled is visible.
  • the position relationship between the visual point and the subject may change in the course of the game.
  • the coordinates which represent subject position are sent as control signals from the input device 11 to the CPU 101.
  • virtual images are generated by a conversion process which involves projection onto a two-dimensional plane which constitutes the field of vision, re-creating images of the physical objects present in this virtual space as they would be observed from a given visual point (for example, camera photography).
  • This is termed perspective projection.
  • the coordinate conversion performed through matrix operations for perspective projection is termed perspective conversion. It is the geometalyzer 110 that executes perspective conversion to produce the virtual image which is actually displayed.
  • the geometalyzer 110 is connected to the shape data ROM 111 and to the displaying device 112.
  • the geometalyzer 110 is provided by the CPU 101 with data indicating the polygon data required for perspective conversion, as well as with the matrix data required for perspective conversion.
  • the geometalyzer 110 performs perspective conversion on the polygon data stored in the shape data ROM 111 to produce data converted from the three-dimensional coordinate system in virtual space to the visual point coordinate system.
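  • A minimal Python sketch of that perspective conversion follows; it stands in for the matrix operations handed to the geometalyzer and is not the hardware's actual implementation (the camera parameters are assumptions):

```python
import math

def perspective_project(point, camera_pos, yaw=0.0, pitch=0.0, focal_length=1.0):
    """Project a world-space point (x, y, z) onto the 2-D plane in front of a visual point.

    The point is first expressed in the visual-point coordinate system
    (translation, then rotation about the y and x axes), then divided by
    its depth to obtain two-dimensional view-plane coordinates.
    """
    # Translate into a camera-relative frame.
    x = point[0] - camera_pos[0]
    y = point[1] - camera_pos[1]
    z = point[2] - camera_pos[2]
    # Rotate about the y axis (yaw), then about the x axis (pitch).
    cy, sy = math.cos(yaw), math.sin(yaw)
    x, z = cy * x - sy * z, sy * x + cy * z
    cp, sp = math.cos(pitch), math.sin(pitch)
    y, z = cp * y - sp * z, sp * y + cp * z
    if z <= 0.0:
        return None  # behind the visual point; not projected
    # Perspective divide onto the view plane at distance focal_length.
    return (focal_length * x / z, focal_length * y / z)
```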
  • polygon data for the explosion image is used.
  • the scroll data processor 107 computes text and other scroll screen data (stored in ROM 102).
  • the image synthesis device 116 superimposes the text data output by the scroll data processor 107 onto the image data provided by the aforementioned frame buffer 115 and re-synthesizes the image.
  • the re-synthesized image data is output to the TV monitor 13 through the D/A converter 117.
  • in step S1, the CPU 101 performs the initialization necessary for displaying an obstacle. Specifically, when a new control signal is supplied by the input device 11, the CPU 101 uses the movement assigned to the control signal to compute the coordinates of the destination to which the player-controlled object is to be moved. Once the object's destination has been determined, a new location is determined for the visual point from which the object will be observed as the subject.
  • in step S2, when no obstacle or other physical object to be displayed is present in the visual field of the virtual space observed from the visual point (step S2: NO), the CPU 101 provides the geometalyzer 110 with the conversion matrix data for perspective projection from the new visual point and completes processing. Since a plurality of obstacles or other physical objects is usually contained within the visual field (step S2: YES), the overlap determination process described below is performed in sequence for each obstacle or other physical object contained within the visual field.
  • vector magnitude is usually unimportant; thus, these vectors are usually given a prescribed magnitude.
  • the vector CR which extends from the visual point projection point C towards the object projection point R is computed from the coordinates of projection point C in the x-z plane (x1, z1) and the coordinates of projection point R in the x-z plane (x2, z2).
  • in step S5, the CPU 101 compares the reference angle specified by the program with the interior angle computed in step S4.
  • when the angle formed by vector OR and vector CR is within the reference angle (step S5: YES), processing proceeds to step S6.
  • in step S6, the height of the reference point for the subject (its distance in the y direction) is compared with the height of the reference point for the obstacle.
  • in step S7, a code that prescribes non-show-through processing (the usual display mode for obstacles) is provided to the geometalyzer 110.
  • vector 70 is selected as vector OR.
  • the line-of-sight vector CR which extends from the point of projection C of the virtual camera (which serves as the visual point) onto the x-z plane to projection point R is given as shown in Fig. 3A.
  • the determination is made that the angle θ1 formed by the vector OR and vector CR is smaller than the reference angle, and that object R' and the obstacle O can overlap (see Fig. 6).
  • the system then proceeds to step S6.
  • the current object coordinates are used to compute the height (y coordinate) H of the object R' with respect to the virtual ground surface 80.
  • This height H is compared with the height of the obstacle, and where the height (H1) of the first origin of the object (the bottom edge of the object) is higher than the height (H0) of the second origin of the obstacle (the top edge of the obstacle), it is determined that the entire object is visible from visual point C' and that the object and obstacle do not overlap, whereupon the obstacle O image is generated in the usual manner, without show-through processing.
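  • Combining the angle test with this height comparison, the per-obstacle decision of steps S5 through S8 could be sketched as follows (field names are hypothetical; may_overlap() is the helper sketched earlier):

```python
def choose_obstacle_display(obstacle, subject, viewpoint, reference_angle=80.0):
    """Return "show_through" or "normal" for one obstacle in the visual field.

    The obstacle is only rendered show-through when the interior angle of
    vectors OR and CR is within the reference angle AND the subject's
    bottom edge does not clear the obstacle's top edge; in every other
    case the obstacle keeps its usual, non-show-through display.
    """
    if not may_overlap(obstacle["xz"], subject["xz"], viewpoint["xz"], reference_angle):
        return "normal"          # step S5: NO -> step S7
    if subject["bottom_y"] > obstacle["top_y"]:
        return "normal"          # entire subject visible above the obstacle
    return "show_through"        # step S8: instruct "mesh" processing
```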
  • the angle formed by the vectors is used as the basis for making the overlap determination for an object and an obstacle for the following reason.
  • where an object is positioned behind an obstacle when viewed from the visual point, as shown in Fig. 3A, both the vector OR and the vector CR lie in essentially the same direction when the object is viewed from the back surface 700 of the obstacle. In such cases, the interior angle formed by the two vectors tends to be small.
  • in Fig. 3B, by contrast, the vector OR lies in the direction extending from the back surface 700 to the front surface 740 of the obstacle, while the vector CR lies in the direction extending from the front surface 740 to the back surface 700 of the obstacle. Since these two directions are opposite from each other, the interior angle formed by the two vectors tends to be greater than it is in Fig. 3A.
  • by selecting a reference angle that is suitable for distinguishing between the state depicted in Fig. 3A and the state depicted in Fig. 3B, and comparing the actual interior angle formed by the two vectors with this reference angle, it becomes possible to distinguish between Fig. 3A and Fig. 3B.
  • the reference angle will differ depending on factors such as the angle formed by the visual point and the object and the distance, but favorably ranges from 70° to 90°.
  • in step S6, the heights of the physical objects, that is, their y coordinates in the world coordinate system of the virtual space, are used as the reference because the y coordinate for the visual point is always greater (higher) than the y coordinate for obstacles.
  • a comparison of the magnitude of the x coordinate for each physical object may be used in place of the "height" comparison.
  • when show-through processing has been instructed (step S8), the displaying device 112 performs "mesh" processing when applying texture to the obstacle in question on the basis of texture data. Where show-through processing has been instructed for a plurality of physical objects, "mesh" processing is performed for each of those physical objects.
  • This mesh processing refers to a process in which pixels are selected from among the pixel array for displaying the obstacle in question, and these pixels for displaying the obstacle are replaced by inserting pixels for displaying the background in accordance with a prescribed pattern. Any type of pattern that renders the background and the object equally recognizable and that does not excessively change the look of the obstacle may be used as the prescribed pattern. For example, a pattern in which obstacle pixels and background pixels are disposed in alternating fashion is favorable.
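  • A sketch of such mesh processing with an alternating (checkerboard) pattern follows (buffer and function names are assumptions):

```python
def apply_mesh(obstacle_pixels, background_pixels):
    """Alternate obstacle and background pixels so both stay recognizable.

    Both arguments are equally sized 2-D lists of pixel values; pixels whose
    (row + column) index is even keep the obstacle color, the rest show
    the background that the obstacle would otherwise hide.
    """
    rows = len(obstacle_pixels)
    cols = len(obstacle_pixels[0]) if rows else 0
    return [
        [obstacle_pixels[r][c] if (r + c) % 2 == 0 else background_pixels[r][c]
         for c in range(cols)]
        for r in range(rows)
    ]
```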
  • depending on the exterior size of the physical objects, the angle formed by the vector OR and the line-of-sight vector CR when overlap occurs will differ. This angle also differs with the distance between the visual point and each physical object.
  • the reference angle used for the comparison in step S5 may be varied in accordance with the size of the exterior of the physical objects and the distance between the visual point and the physical objects.
  • This vector is computed as a vector extending from a prescribed reference point on the obstacle O towards a reference point on the subject R'.
  • the reference point is, for example, the center point of the subject or obstacle.
  • Center point refers to a point corresponding to the center of gravity of the solid form envelope of the physical object, as viewed in geometric terms. In game units, objects and the visual point move in extreme fashion; thus, it is not necessary to compute the center point of the physical objects in an overly precise fashion; it is sufficient merely to store the position as center point coordinates in the ROM 109 or the like.
  • the reference points for determining the height of a physical object can be the same physical object center point used for vector computation; for obstacles, the height of the top surface of the obstacle may be used, and for objects which serve as subjects, the height of the bottommost area of the object may be used for reference.
  • mesh processing, whereby pixels are modified on a per-dot basis, was used for the show-through processing performed by the image generation apparatus; however, the pixels may be replaced in accordance with other patterns. Specifically, it would be possible to perform pixel replacement every two dots, or to display the background and obstacle in striped fashion. It would also be possible to use show-through processing whereby the obstacle display is rendered translucent, rather than "mesh" processing, in which pixels are replaced. To render the obstacle translucent, various operations (addition, multiplication, or the like) can be performed on the color information (RGB) for the image displaying the obstacle and the color information for the image displaying the background, so that portions of the background obscured by the obstacle become recognizable.
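  • For the translucent alternative, one simple per-channel operation on the two sets of color information is a weighted sum; a sketch follows (the 0.5 weight is an assumed value, not one taken from the description):

```python
def blend_translucent(obstacle_rgb, background_rgb, opacity=0.5):
    """Combine obstacle and background colors so the obscured background shows through.

    Each channel becomes a weighted sum of the obstacle and background
    values; opacity = 1.0 reproduces the fully opaque obstacle.
    """
    return tuple(
        int(round(opacity * o + (1.0 - opacity) * b))
        for o, b in zip(obstacle_rgb, background_rgb)
    )
```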
  • RGB: color information
  • the show-through processing may be performed on selected areas in the O-2 portion only.
  • Fig. 11, which is depicted from the same direction as Fig. 7, is a diagram showing the two obstacles O-1 and O-2 viewed from the y direction.
  • Fig. 11 indicates that the following relationships hold between the projection points R-1 and R-2 of the two objects onto the x-z plane, the visual point projection point C-1 for R-1, and the visual point projection point C-2 for R-2:
  • the ROM may be provided with a status flag register indicating whether overlap determination is necessary for individual obstacles. For example, where the height of an obstacle is lower than that of an object such that almost the entire object is not obscured by the obstacle even when the position of the visual point changes, a "1" is placed in the flag to indicate that overlap determination is unnecessary.
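  • Such a flag can be modelled as a per-obstacle boolean consulted before the angle and height tests are run; a small sketch under assumed identifiers:

```python
# Hypothetical per-obstacle flags: True means the obstacle can never obscure
# the subject (e.g. it is lower than the subject), so no overlap check is run.
OVERLAP_CHECK_UNNECESSARY = {
    "low_wall": True,
    "tall_tower": False,
}

def needs_overlap_check(obstacle_id):
    """Only obstacles not flagged as harmless go through the overlap determination."""
    return not OVERLAP_CHECK_UNNECESSARY.get(obstacle_id, False)
```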
  • Fig. 4 is a diagram which illustrates position relationships in a case where no overlap between a subject and an obstacle occurs.
  • the virtual camera C' which observes the subject R' views the virtual space from above the subject R', centered on the subject R'.
  • the obstacle O is located behind the subject R' as viewed from the camera C', so the angle formed by the vector extending from the visual point to the subject and the vector extending from the obstacle to the subject is greater than the reference angle, and it is therefore determined that no overlapping state exists. Therefore, show-through processing is not performed on the obstacle O, and the usual virtual image depicted in Fig. 4C is displayed on the monitor.
  • images are generated in such a way that figures normally generated without show-through processing are rendered as show-through figures when necessary, thereby affording a virtual image generation apparatus in which the look of the game is not impaired, and which does not require omitting obstacle display, displaying obstacles as wire frames from the start, or similar means.
  • Position relationships in which a subject is obscured by a physical object are determined to be overlapping states, whereupon show-through processing is applied to the physical object which obscures the subject.
  • the subject image is therefore adequately visible to allow the player to control and discern the status of the subject without difficulty.
  • overlap determinations are performed on the basis of the angle formed by a vector extending from the visual point to the subject and a vector extending from the obstacle to the subject, allowing for easy and accurate determination of whether the obstacle obscures the subject.
  • Overlap determinations may also be made by comparing the position of a subject and a physical object.
  • the image can be made show-through by means of relatively simple processing without impairing the color, shape, or look of physical objects and subjects.
  • by displaying physical object display pixels and background display pixels in alternating fashion, sufficiently discernible virtual images of both the physical object and the subject can be obtained in an overlapping state.
EP96926620A 1995-08-10 1996-08-09 Vorrichtung und verfahren zur erzeugung von virtuellen bildern Expired - Lifetime EP0786742B1 (de)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP07204849A JP3141737B2 (ja) 1995-08-10 1995-08-10 仮想画像生成装置及びその方法
JP204849/95 1995-08-10
JP20484995 1995-08-10
PCT/JP1996/002268 WO1997006511A1 (fr) 1995-08-10 1996-08-09 Appareil et procede pour former une image virtuelle

Publications (3)

Publication Number Publication Date
EP0786742A1 (de) 1997-07-30
EP0786742A4 EP0786742A4 (de) 1998-12-09
EP0786742B1 EP0786742B1 (de) 2002-12-18

Family

ID=16497420

Family Applications (1)

Application Number Title Priority Date Filing Date
EP96926620A Expired - Lifetime EP0786742B1 (de) 1995-08-10 1996-08-09 Vorrichtung und verfahren zur erzeugung von virtuellen bildern

Country Status (9)

Country Link
US (2) US6377277B1 (de)
EP (1) EP0786742B1 (de)
JP (2) JP3141737B2 (de)
KR (1) KR100439931B1 (de)
CN (1) CN1158632C (de)
BR (1) BR9606580A (de)
CA (1) CA2201755C (de)
DE (1) DE69625455T2 (de)
WO (1) WO1997006511A1 (de)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0893149A3 (de) * 1997-07-25 2001-03-07 Konami Co., Ltd. Videospielgerät, Videospielbildverarbeitungsmethode, und rechnerlesbares Aufzeichnungsmedium mit gespeichertem Videospielprogramm
EP1086729A2 (de) * 1999-09-24 2001-03-28 Konami Corporation Schiessvideospielsystem und Bilddarstellungsverfahren in einem Schiessvideospiel
EP1149617A2 (de) * 2000-04-28 2001-10-31 Konami Computer Entertainment Japan Inc. Spielsystem, Verfahren zur Generierung einer Bewertungstabelle und computerlesbares Aufzeichungsmedium für Spielprogramm
EP1350545A2 (de) * 2002-04-03 2003-10-08 Nintendo Co., Limited Spielvorrichtung und Spielprogramm
US8180103B2 (en) 2005-07-19 2012-05-15 Fujitsu Limited Image determining method, image determining apparatus, and recording medium having recorded therein program for causing computer to execute image determining method

Families Citing this family (117)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3141737B2 (ja) 1995-08-10 2001-03-05 株式会社セガ 仮想画像生成装置及びその方法
JP2902352B2 (ja) * 1996-05-15 1999-06-07 コナミ株式会社 ビデオゲーム装置
US5769718A (en) * 1996-05-15 1998-06-23 Rieder; William R. Video game apparatus and medium readable by a computer stored with video game program
CN1640519B (zh) * 1997-02-18 2010-06-09 世嘉股份有限公司 图像处理装置和图像处理方法
JP3145059B2 (ja) 1997-06-13 2001-03-12 株式会社ナムコ 情報記憶媒体及び画像生成装置
JP3183632B2 (ja) 1997-06-13 2001-07-09 株式会社ナムコ 情報記憶媒体及び画像生成装置
JP4035867B2 (ja) * 1997-09-11 2008-01-23 株式会社セガ 画像処理装置及び画像処理方法並びに媒体
EP1829590A3 (de) * 1997-11-25 2008-12-17 Sega Enterprises, Ltd. Gerät zur Bilderzeugung für ein Ego-Shooter-Spiel
JPH11207029A (ja) 1998-01-28 1999-08-03 Konami Co Ltd ビデオゲーム装置、ビデオゲームにおける画面表示方法及び画面表示プログラムが格納された可読記録媒体
JP4114824B2 (ja) * 1998-04-24 2008-07-09 株式会社バンダイナムコゲームス 画像生成装置及び情報記憶媒体
JP4258824B2 (ja) * 1998-05-20 2009-04-30 株式会社セガ 画像処理装置、画像処理方法及び記録媒体
US6552722B1 (en) * 1998-07-17 2003-04-22 Sensable Technologies, Inc. Systems and methods for sculpting virtual objects in a haptic virtual reality environment
JP4188463B2 (ja) 1998-09-03 2008-11-26 株式会社バンダイナムコゲームス 画像生成装置、画像生成方法および記憶媒体
JP2001276420A (ja) 2000-03-30 2001-10-09 Namco Ltd ゲーム装置および情報記憶媒体
JP3345600B2 (ja) * 2000-04-10 2002-11-18 コナミ株式会社 ゲームシステムおよびコンピュータ読取可能な記憶媒体
JP2002202770A (ja) * 2000-12-28 2002-07-19 Genki Kk コンピュータゲームの逐次読込方法並びに逐次読込方法を用いた記録媒体
JP2002360920A (ja) * 2001-06-05 2002-12-17 Atlus Co Ltd ゲーム画像制御装置
JP3764070B2 (ja) * 2001-06-07 2006-04-05 富士通株式会社 オブジェクト表示プログラムおよびオブジェクト表示装置
JP4707080B2 (ja) * 2001-07-30 2011-06-22 株式会社バンダイナムコゲームス 画像生成システム、プログラム及び情報記憶媒体
JP3594915B2 (ja) * 2001-08-03 2004-12-02 株式会社ナムコ プログラム、情報記憶媒体及びゲーム装置
US7133083B2 (en) * 2001-12-07 2006-11-07 University Of Kentucky Research Foundation Dynamic shadow removal from front projection displays
JP4115188B2 (ja) * 2002-07-19 2008-07-09 キヤノン株式会社 仮想空間描画表示装置
JP2005108108A (ja) * 2003-10-01 2005-04-21 Canon Inc 三次元cg操作装置および方法、並びに位置姿勢センサのキャリブレーション装置
US7502036B2 (en) * 2004-03-03 2009-03-10 Virtual Iris Studios, Inc. System for delivering and enabling interactivity with images
US7542050B2 (en) 2004-03-03 2009-06-02 Virtual Iris Studios, Inc. System for delivering and enabling interactivity with images
JP2007535733A (ja) * 2004-03-03 2007-12-06 バーチャル アイリス スタジオ,インク. 画像の配信および対話型操作を可能にするシステム
JP4412715B2 (ja) * 2004-05-07 2010-02-10 株式会社バンダイナムコゲームス プログラム、情報記憶媒体および画像生成システム
JP3877077B2 (ja) 2004-08-31 2007-02-07 任天堂株式会社 ゲーム装置および画像処理プログラム
JP4636908B2 (ja) * 2005-03-14 2011-02-23 キヤノン株式会社 画像処理装置、画像処理方法
JP4717622B2 (ja) * 2005-12-15 2011-07-06 株式会社バンダイナムコゲームス プログラム、情報記録媒体および画像生成システム
JP5154775B2 (ja) 2006-08-18 2013-02-27 任天堂株式会社 ゲームプログラムおよびゲーム装置
US20080231627A1 (en) * 2007-03-20 2008-09-25 Robert Allen Shearer Using Ray Tracing to Enhance Artificial Intelligence Character Behavior
JP4489800B2 (ja) 2007-08-30 2010-06-23 株式会社スクウェア・エニックス 画像生成装置及び方法、並びにプログラム及び記録媒体
JP5016443B2 (ja) * 2007-10-25 2012-09-05 ティーオーエー株式会社 カメラ設置シミュレータプログラム
CA2720834A1 (en) 2008-05-29 2009-12-03 Tomtom International B.V. Displaying route information on a digital map image
WO2014055924A1 (en) * 2012-10-04 2014-04-10 Disney Enterprises, Inc. Interactive objects for immersive environment
JP2010033252A (ja) * 2008-07-28 2010-02-12 Namco Bandai Games Inc プログラム、情報記憶媒体および画像生成システム
JP2010033285A (ja) * 2008-07-28 2010-02-12 Namco Bandai Games Inc プログラム、情報記憶媒体および画像生成システム
US8898574B2 (en) * 2008-12-19 2014-11-25 International Business Machines Corporation Degrading avatar appearances in a virtual universe
US9633465B2 (en) 2009-02-28 2017-04-25 International Business Machines Corporation Altering avatar appearances based on avatar population in a virtual universe
US9440591B2 (en) * 2009-05-13 2016-09-13 Deere & Company Enhanced visibility system
CN101819678A (zh) * 2010-03-16 2010-09-01 昆明理工大学 驾驶模拟系统三维虚拟图像的标定方法
JP5614211B2 (ja) * 2010-09-30 2014-10-29 株式会社セガ 画像処理プログラム及びコンピュータ読み取り可能な記録媒体
CN102456200A (zh) * 2010-10-25 2012-05-16 鸿富锦精密工业(深圳)有限公司 商品广告插播系统及方法
JP5677050B2 (ja) * 2010-11-26 2015-02-25 株式会社カプコン ゲームプログラム及びゲーム装置
JP5670253B2 (ja) * 2011-05-18 2015-02-18 日立アロカメディカル株式会社 超音波診断装置
JP6085411B2 (ja) * 2011-06-02 2017-02-22 任天堂株式会社 画像処理装置、画像処理方法、および画像処理装置の制御プログラム
WO2013005868A1 (en) 2011-07-01 2013-01-10 Empire Technology Development Llc Safety scheme for gesture-based game
CN103764235B (zh) 2011-08-31 2016-03-23 英派尔科技开发有限公司 用于基于姿势的游戏系统的位置设置
KR101567591B1 (ko) 2011-12-02 2015-11-20 엠파이어 테크놀로지 디벨롭먼트 엘엘씨 제스처 기반 게임 시스템을 위한 안전 체계
WO2013126071A1 (en) 2012-02-24 2013-08-29 Empire Technology Development Llc Safety scheme for gesture-based game system
US10528135B2 (en) 2013-01-14 2020-01-07 Ctrl-Labs Corporation Wearable muscle interface systems, devices and methods that interact with content displayed on an electronic display
JP6148887B2 (ja) * 2013-03-29 2017-06-14 富士通テン株式会社 画像処理装置、画像処理方法、及び、画像処理システム
US10152082B2 (en) 2013-05-13 2018-12-11 North Inc. Systems, articles and methods for wearable electronic devices that accommodate different user forms
US9666427B2 (en) * 2013-06-21 2017-05-30 Lam Research Corporation Method of collapse-free drying of high aspect ratio structures
US11921471B2 (en) 2013-08-16 2024-03-05 Meta Platforms Technologies, Llc Systems, articles, and methods for wearable devices having secondary power sources in links of a band for providing secondary power in addition to a primary power source
US20150124566A1 (en) 2013-10-04 2015-05-07 Thalmic Labs Inc. Systems, articles and methods for wearable electronic devices employing contact sensors
US11426123B2 (en) 2013-08-16 2022-08-30 Meta Platforms Technologies, Llc Systems, articles and methods for signal routing in wearable electronic devices that detect muscle activity of a user using a set of discrete and separately enclosed pod structures
US10042422B2 (en) 2013-11-12 2018-08-07 Thalmic Labs Inc. Systems, articles, and methods for capacitive electromyography sensors
US10188309B2 (en) 2013-11-27 2019-01-29 North Inc. Systems, articles, and methods for electromyography sensors
US9788789B2 (en) 2013-08-30 2017-10-17 Thalmic Labs Inc. Systems, articles, and methods for stretchable printed circuit boards
CN106102504A (zh) 2014-02-14 2016-11-09 赛尔米克实验室公司 用于弹性电力缆线的系统、制品和方法以及采用弹性电力缆线的可佩戴电子装置
US10199008B2 (en) 2014-03-27 2019-02-05 North Inc. Systems, devices, and methods for wearable electronic devices as state machines
US20150325202A1 (en) * 2014-05-07 2015-11-12 Thalmic Labs Inc. Systems, devices, and methods for wearable computers with heads-up displays
US9880632B2 (en) 2014-06-19 2018-01-30 Thalmic Labs Inc. Systems, devices, and methods for gesture identification
US9874744B2 (en) 2014-06-25 2018-01-23 Thalmic Labs Inc. Systems, devices, and methods for wearable heads-up displays
JP2016093362A (ja) * 2014-11-14 2016-05-26 株式会社コロプラ ゲーム空間内における仮想カメラの制御プログラム及びゲームシステム。
US9807221B2 (en) 2014-11-28 2017-10-31 Thalmic Labs Inc. Systems, devices, and methods effected in response to establishing and/or terminating a physical communications link
JP6122047B2 (ja) * 2015-02-10 2017-04-26 株式会社カプコン ゲームプログラムおよびゲーム装置
KR20170139509A (ko) 2015-02-17 2017-12-19 탈믹 랩스 인크 웨어러블 헤드-업 디스플레이 내의 아이박스 확장을 위한 시스템, 장치, 및 방법
US9958682B1 (en) 2015-02-17 2018-05-01 Thalmic Labs Inc. Systems, devices, and methods for splitter optics in wearable heads-up displays
US10078435B2 (en) 2015-04-24 2018-09-18 Thalmic Labs Inc. Systems, methods, and computer program products for interacting with electronically displayed presentation materials
US10133075B2 (en) 2015-05-04 2018-11-20 Thalmic Labs Inc. Systems, devices, and methods for angle- and wavelength-multiplexed holographic optical elements
US10078219B2 (en) 2015-05-28 2018-09-18 Thalmic Labs Inc. Wearable heads-up display with integrated eye tracker and different optical power holograms
JP2018528475A (ja) 2015-09-04 2018-09-27 サルミック ラブス インコーポレイテッド ホログラフィック光学素子を眼鏡レンズに統合するシステム、製品、及び方法
WO2017059285A1 (en) 2015-10-01 2017-04-06 Thalmic Labs Inc. Systems, devices, and methods for interacting with content displayed on head-mounted displays
US10937332B2 (en) * 2015-10-20 2021-03-02 The Boeing Company Systems and methods for providing a virtual heads up display in a vehicle simulator
US9904051B2 (en) 2015-10-23 2018-02-27 Thalmic Labs Inc. Systems, devices, and methods for laser eye tracking
US10802190B2 (en) 2015-12-17 2020-10-13 Covestro Llc Systems, devices, and methods for curved holographic optical elements
US10303246B2 (en) 2016-01-20 2019-05-28 North Inc. Systems, devices, and methods for proximity-based eye tracking
US10151926B2 (en) 2016-01-29 2018-12-11 North Inc. Systems, devices, and methods for preventing eyebox degradation in a wearable heads-up display
JP6002346B1 (ja) * 2016-04-07 2016-10-05 株式会社Cygames ゲームにおけるオブジェクト画像表示のためのプログラム、方法、電子装置及びシステム
US20170294135A1 (en) * 2016-04-11 2017-10-12 The Boeing Company Real-time, in-flight simulation of a target
JP2019518979A (ja) 2016-04-13 2019-07-04 ノース インコーポレイテッドNorth Inc. レーザプロジェクタの焦点を合わせるためのシステム、デバイス、及び方法
US11216069B2 (en) 2018-05-08 2022-01-04 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
WO2020112986A1 (en) 2018-11-27 2020-06-04 Facebook Technologies, Inc. Methods and apparatus for autocalibration of a wearable electrode sensor system
CN110300542A (zh) 2016-07-25 2019-10-01 开创拉布斯公司 使用可穿戴的自动传感器预测肌肉骨骼位置信息的方法和装置
US11635736B2 (en) 2017-10-19 2023-04-25 Meta Platforms Technologies, Llc Systems and methods for identifying biological structures associated with neuromuscular source signals
US10277874B2 (en) 2016-07-27 2019-04-30 North Inc. Systems, devices, and methods for laser projectors
WO2018027326A1 (en) 2016-08-12 2018-02-15 Thalmic Labs Inc. Systems, devices, and methods for variable luminance in wearable heads-up displays
US10345596B2 (en) 2016-11-10 2019-07-09 North Inc. Systems, devices, and methods for astigmatism compensation in a wearable heads-up display
US10459220B2 (en) 2016-11-30 2019-10-29 North Inc. Systems, devices, and methods for laser eye tracking in wearable heads-up displays
US10365492B2 (en) 2016-12-23 2019-07-30 North Inc. Systems, devices, and methods for beam combining in wearable heads-up displays
JP6681352B2 (ja) * 2017-01-06 2020-04-15 任天堂株式会社 情報処理システム、情報処理プログラム、情報処理装置、情報処理方法、ゲームシステム、ゲームプログラム、ゲーム装置、及びゲーム方法
JP6606791B2 (ja) 2017-01-12 2019-11-20 株式会社コナミデジタルエンタテインメント ゲーム装置及びプログラム
US10437074B2 (en) 2017-01-25 2019-10-08 North Inc. Systems, devices, and methods for beam combining in laser projectors
WO2019079894A1 (en) 2017-10-23 2019-05-02 North Inc. MULTIPLE LASER DIODE MODULES WITH FREE SPACES
JP6669783B2 (ja) * 2017-12-25 2020-03-18 ガンホー・オンライン・エンターテイメント株式会社 端末装置、システム、プログラム及び方法
US11481030B2 (en) 2019-03-29 2022-10-25 Meta Platforms Technologies, Llc Methods and apparatus for gesture detection and classification
US10937414B2 (en) 2018-05-08 2021-03-02 Facebook Technologies, Llc Systems and methods for text input using neuromuscular information
US11961494B1 (en) 2019-03-29 2024-04-16 Meta Platforms Technologies, Llc Electromagnetic interference reduction in extended reality environments
US11907423B2 (en) 2019-11-25 2024-02-20 Meta Platforms Technologies, Llc Systems and methods for contextualized interactions with an environment
US11150730B1 (en) 2019-04-30 2021-10-19 Facebook Technologies, Llc Devices, systems, and methods for controlling computing devices via neuromuscular signals of users
US11493993B2 (en) 2019-09-04 2022-11-08 Meta Platforms Technologies, Llc Systems, methods, and interfaces for performing inputs based on neuromuscular control
WO2019186622A1 (ja) * 2018-03-26 2019-10-03 楽天株式会社 表示装置、表示方法、プログラム、ならびに、非一時的なコンピュータ読取可能な情報記録媒体
US10843089B2 (en) * 2018-04-06 2020-11-24 Rovi Guides, Inc. Methods and systems for facilitating intra-game communications in a video game environment
US10592001B2 (en) 2018-05-08 2020-03-17 Facebook Technologies, Llc Systems and methods for improved speech recognition using neuromuscular information
US11709370B2 (en) * 2018-05-08 2023-07-25 Apple Inc. Presentation of an enriched view of a physical setting
WO2020047429A1 (en) 2018-08-31 2020-03-05 Ctrl-Labs Corporation Camera-guided interpretation of neuromuscular signals
WO2020061451A1 (en) 2018-09-20 2020-03-26 Ctrl-Labs Corporation Neuromuscular text entry, writing and drawing in augmented reality systems
WO2020189319A1 (ja) * 2019-03-19 2020-09-24 ソニー株式会社 情報処理装置、情報処理方法及びプログラム
US11756259B2 (en) 2019-04-17 2023-09-12 Rakuten Group, Inc. Display controlling device, display controlling method, program, and non-transitory computer-readable information recording medium
JP7001719B2 (ja) 2020-01-29 2022-02-04 グリー株式会社 コンピュータプログラム、サーバ装置、端末装置、及び方法
JPWO2021215246A1 (de) * 2020-04-21 2021-10-28
CN112807684A (zh) * 2020-12-31 2021-05-18 上海米哈游天命科技有限公司 一种遮挡物信息获取方法、装置、设备及存储介质
US11868531B1 (en) 2021-04-08 2024-01-09 Meta Platforms Technologies, Llc Wearable device providing for thumb-to-finger-based input gestures detected based on neuromuscular signals, and systems and methods of use thereof
CN117163302B (zh) * 2023-10-31 2024-01-23 安胜(天津)飞行模拟系统有限公司 飞行器仪表显示方法、装置、设备及存储介质

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0145321A2 (de) * 1983-11-15 1985-06-19 Motorola, Inc. Einrichtung und Verfahren zum Ändern eines Aspektes eines Objektes einer Vielheit von koinzidierenden visuellen Objekten in einem Videoanzeigegenerator

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5202672A (en) * 1987-12-30 1993-04-13 Namco Ltd. Object display system
US6054991A (en) * 1991-12-02 2000-04-25 Texas Instruments Incorporated Method of modeling player position and movement in a virtual reality system
DE69315969T2 (de) * 1992-12-15 1998-07-30 Sun Microsystems Inc Darstellung von Informationen in einem Anzeigesystem mit transparenten Fenstern
JPH06215150A (ja) * 1993-01-18 1994-08-05 Toshiba Corp 三次元画像表示装置
CA2109681C (en) * 1993-03-10 1998-08-25 Donald Edgar Blahut Method and apparatus for the coding and display of overlapping windows with transparency
JPH06290254A (ja) * 1993-03-31 1994-10-18 Toshiba Corp 三次元図形の表示処理装置
US5638501A (en) * 1993-05-10 1997-06-10 Apple Computer, Inc. Method and apparatus for displaying an overlay image
US5491510A (en) * 1993-12-03 1996-02-13 Texas Instruments Incorporated System and method for simultaneously viewing a scene and an obscured object
WO1996020470A1 (en) * 1994-12-23 1996-07-04 Philips Electronics N.V. Single frame buffer image processing system
JP3734045B2 (ja) * 1995-08-10 2006-01-11 株式会社セガ 仮想画像生成方法及びその装置
JP3141737B2 (ja) 1995-08-10 2001-03-05 株式会社セガ 仮想画像生成装置及びその方法

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0145321A2 (de) * 1983-11-15 1985-06-19 Motorola, Inc. Einrichtung und Verfahren zum Ändern eines Aspektes eines Objektes einer Vielheit von koinzidierenden visuellen Objekten in einem Videoanzeigegenerator

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO9706511A1 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0893149A3 (de) * 1997-07-25 2001-03-07 Konami Co., Ltd. Videospielgerät, Videospielbildverarbeitungsmethode, und rechnerlesbares Aufzeichnungsmedium mit gespeichertem Videospielprogramm
EP1086729A2 (de) * 1999-09-24 2001-03-28 Konami Corporation Schiessvideospielsystem und Bilddarstellungsverfahren in einem Schiessvideospiel
EP1149617A2 (de) * 2000-04-28 2001-10-31 Konami Computer Entertainment Japan Inc. Spielsystem, Verfahren zur Generierung einer Bewertungstabelle und computerlesbares Aufzeichungsmedium für Spielprogramm
EP1149617A3 (de) * 2000-04-28 2002-11-27 Konami Computer Entertainment Japan Inc. Spielsystem, Verfahren zur Generierung einer Bewertungstabelle und computerlesbares Aufzeichungsmedium für Spielprogramm
US6688981B2 (en) 2000-04-28 2004-02-10 Konami Computer Entertainment Japan, Inc. Game system, judgment table producing method, and computer-readable storage medium carrying game program
EP1350545A2 (de) * 2002-04-03 2003-10-08 Nintendo Co., Limited Spielvorrichtung und Spielprogramm
EP1350545A3 (de) * 2002-04-03 2005-01-05 Nintendo Co., Limited Spielvorrichtung und Spielprogramm
US7513829B2 (en) 2002-04-03 2009-04-07 Nintendo Co., Ltd. Game machine and game program for rendering a mark image of a player character which may be hidden behind an object
US8180103B2 (en) 2005-07-19 2012-05-15 Fujitsu Limited Image determining method, image determining apparatus, and recording medium having recorded therein program for causing computer to execute image determining method

Also Published As

Publication number Publication date
JP3769747B2 (ja) 2006-04-26
JP3141737B2 (ja) 2001-03-05
CA2201755C (en) 2004-07-06
EP0786742A4 (de) 1998-12-09
DE69625455D1 (de) 2003-01-30
MX9702603A (es) 1998-05-31
BR9606580A (pt) 1998-07-07
CN1158632C (zh) 2004-07-21
KR100439931B1 (ko) 2005-04-06
CN1161096A (zh) 1997-10-01
CA2201755A1 (en) 1997-02-20
KR970706554A (ko) 1997-11-03
USRE41414E1 (en) 2010-07-06
JPH0950541A (ja) 1997-02-18
WO1997006511A1 (fr) 1997-02-20
JP2000348217A (ja) 2000-12-15
DE69625455T2 (de) 2003-10-23
EP0786742B1 (de) 2002-12-18
US6377277B1 (en) 2002-04-23

Similar Documents

Publication Publication Date Title
EP0786742B1 (de) Vorrichtung und verfahren zur erzeugung von virtuellen bildern
EP0802508B1 (de) Eingabegerät für bilderzeugungsverfahren und -gerät
JP5597837B2 (ja) プログラム、情報記憶媒体、及び、画像生成装置
JP5390115B2 (ja) プログラム、ゲームシステム
JP6448196B2 (ja) 画像生成システム及びプログラム
WO2012043009A1 (ja) 画像処理プログラム及びコンピュータ読み取り可能な記録媒体
JPWO2004045734A1 (ja) ゲーム画像処理プログラム及び記憶媒体
KR20000064948A (ko) 화상 처리 장치 및 화상 처리 방법
JP2000350859A (ja) マーカ配置方法及び複合現実感装置
JP2000350860A (ja) 複合現実感装置及び複合現実空間画像の生成方法
US6878058B1 (en) Image processor and game device with image processor
EP0797172A3 (de) Bildprozessor und damit ausgerüstetes spielgerät
US7666097B2 (en) Method for generating image animation effect
JP3413129B2 (ja) 画像処理方法及び画像処理装置
JP2006061717A (ja) ゲーム画像の表示制御プログラム及びゲーム装置並びに記憶媒体
JPH11467A (ja) ゲーム装置
JP3769531B2 (ja) プログラム、記憶媒体、ゲーム装置及び画像処理方法
EP1151771B1 (de) Spielsystem, Verfahren zur Bilderzeugung in das Spielsystem, und Computerlesbares Speichermedium mit Spielprogramm
JP4022847B2 (ja) ゲーム装置
JP2018171309A (ja) シミュレーションシステム及びプログラム
JP5054908B2 (ja) プログラム、情報記憶媒体、及び画像生成システム
MXPA97002603A (en) Apparatus and method for the generation of virt image
JPH08131653A (ja) 3次元ゲーム装置および3次元ゲーム画像生成方法
JP4782631B2 (ja) プログラム、情報記憶媒体及び画像生成システム
JPH11306383A (ja) 画像生成装置及び情報記憶媒体

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 19970409

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): DE ES FR GB IT

RBV Designated contracting states (corrected)

Designated state(s): DE ES FR GB IT

A4 Supplementary search report drawn up and despatched

Effective date: 19981023

AK Designated contracting states

Kind code of ref document: A4

Designated state(s): DE ES FR GB IT

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

RIC1 Information provided on ipc code assigned before grant

Free format text: 7A 63F 13/10 A, 7G 06T 15/00 B

17Q First examination report despatched

Effective date: 20020312

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE ES FR GB IT

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REF Corresponds to:

Ref document number: 69625455

Country of ref document: DE

Date of ref document: 20030130

Kind code of ref document: P


PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20030627

ET Fr: translation filed
PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20030919

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: IT

Payment date: 20100824

Year of fee payment: 15

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20110823

Year of fee payment: 16

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20110809

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20130301

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 69625455

Country of ref document: DE

Effective date: 20130301

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20140820

Year of fee payment: 19

Ref country code: FR

Payment date: 20140821

Year of fee payment: 19

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20150809

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20160429

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150809

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20150831