US6500069B1 - Image processor, image processing method, game machine and recording medium - Google Patents

Image processor, image processing method, game machine and recording medium

Info

Publication number
US6500069B1
US6500069B1 (application US09/011,267)
Authority
US
United States
Prior art keywords
data
cursor
display
virtual space
player
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
US09/011,267
Other languages
English (en)
Inventor
Noriyoshi Ohba
Kenichi Ono
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sega Corp
Original Assignee
Sega Enterprises Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sega Enterprises Ltd filed Critical Sega Enterprises Ltd
Assigned to KABUSHIKI KAISHA SEGA ENTERPRISES reassignment KABUSHIKI KAISHA SEGA ENTERPRISES ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: OHBA, NORIYOSHI, ONO, KENICHI
Priority to US10/295,996 (published as US20030119587A1)
Application granted
Publication of US6500069B1
Priority to US11/896,989 (published as US7573479B2)
Anticipated expiration
Current status: Expired - Fee Related

Classifications

    • A63F13/10
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525Changing parameters of virtual cameras
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45Controlling the progress of the video game
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/305Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display for providing a graphical or textual hint to the player
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/306Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display for displaying a marker associated to an object or location in the game field
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308Details of the user interface
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/6045Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/64Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/65Methods for processing data by generating or executing the game program for computing the condition of a game character
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • A63F2300/6684Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dynamically adapting its position to keep a game object in its viewing frustrum, e.g. for tracking a character or a ball
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S345/00Computer graphics processing and selective visual display systems
    • Y10S345/949Animation processing method
    • Y10S345/958Collision avoidance
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S345/00Computer graphics processing and selective visual display systems
    • Y10S345/949Animation processing method
    • Y10S345/959Object path adherence
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10STECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S707/00Data processing: database and file management or data structures
    • Y10S707/99941Database schema or data structure
    • Y10S707/99943Generating database or data structure, e.g. via user interface

Definitions

  • This invention relates to a graphics processing technique for generating an image, analogous to one observed from a prescribed viewing point, of a virtual space which is defined by three-dimensional coordinates and which contains segments (terrain features, human figures moving about in the virtual space, and the like) disposed therein, and more particularly to a method for shifting this viewing point in an effective manner.
  • In simulation games of this kind, the game consists of a plurality of stages.
  • The display image contains a designated movable segment which can be controlled by the player, other segments which move under the control of the program, and further segments depicting terrain features (hereinafter, the movable human figure or other player-controlled segment will be referred to as the “player-controlled character,” and segments which move under the control of the program will be referred to as “enemy characters”).
  • The player controls the player-controlled character to fight the enemy characters and thereby “beat” (complete) the various stages.
  • Some simulation games of this type display an introductory screen which introduces the plurality of stages upon the issuance of a game start instruction, but these have several drawbacks.
  • Some simulation games which use a cursor to facilitate control allow the player to move the cursor to a desired location within the display, at which point data describing the terrain feature segment displayed at that location is presented. However, only data for the location selected with the cursor is shown; relationships to terrain features adjacent to the cursor are not indicated (Conventional Example 2).
  • Conventional simulation games also include those which use topographical mapping data.
  • This topographical mapping data is defined two-dimensionally; none of these simulation games employ topographical mapping data defined three-dimensionally. Accordingly, no game processing exists which utilizes terrain features rendered as images capable of simulating events likely to happen in ordinary three-dimensional terrain, for example, a character sustaining injury by falling from a cliff. Even if such a game existed, the orientation and extent of an injury would be predetermined factors, and the inability of the injury to change in a manner dependent on the terrain results in a lack of realism (Conventional Example 3).
  • The display is two-dimensionally defined, and the resulting display is unavoidably lacking in realism when compared to the real world, which is three-dimensional.
  • A representation in which the display for each stage is rendered on the basis of three-dimensionally defined topographical mapping data, the position of the viewing point can be shifted vertically and horizontally, and the player is presented with a more three-dimensional display would serve to facilitate understanding of terrain features.
  • The assignment for the UP switch on the pad is such that when it is pushed, the player-controlled character moves FORWARD.
  • This is not a problem where the viewing point is such that the player-controlled character is viewed from behind: pushing the UP switch causes the player-controlled character to move in the z-axis direction of the viewing point coordinate system, which coincides with the character's forward direction.
  • Where the viewing point has been shifted, however, so that the player-controlled character is viewed from the side, for example, pushing the UP switch will still cause the player-controlled character to move in the z-axis direction of the viewing point coordinate system, i.e., in the sideways direction with respect to the player-controlled character.
  • FORWARD should instead move the player-controlled character in its own forward direction.
  • This invention was developed to address the problems noted earlier, and is intended to provide a graphics processing device which allows the viewing point in a three-dimensionally-defined virtual space to be shifted arbitrarily, and to present a suitable operating environment (Object 1).
  • This invention was developed to address the problems noted earlier, and is intended to provide a graphics processing device which allows information for the surroundings of the cursor-selected position to be displayed, and to present a suitable operating environment (Object 2).
  • This invention was developed to address the problems noted earlier, and is intended to account for the effects of a three-dimensionally-defined terrain feature on a player-controlled segment, and to present a suitable operating environment (Object 3).
  • This invention was developed to address the problems noted earlier, and is intended to align the orientation of a player-controlled segment in virtual space with the direction of the line of sight for visual field conversion, and to present a suitable operating environment (Object 4).
  • An embodiment of the invention provides a graphics processing device for generating a display wherein segments defined three-dimensionally within a virtual space are viewed from a viewing point located within the virtual space, comprising viewing point shifting means for shifting the viewing point over the predetermined three-dimensional paths established within the virtual space.
  • The segments are representations of terrain features, human figures, and the like, and are constructed from polygons, for example. Two-dimensional representations of the polygons observed from a viewing point in the virtual space are displayed. In contrast to movement confined to two dimensions, the paths are designed to allow movement while changing position in a third dimension (such as the height direction).
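  • As an illustration only (not the patent's own implementation), such a predetermined three-dimensional path can be represented as a list of waypoints in virtual-space coordinates, with the viewing point interpolated between them each frame. The structure and waypoint handling below are assumptions made for the sketch.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch: a viewing point (camera) that travels along a
// predetermined 3D path defined by waypoints in the virtual space.
struct Vec3 { float x, y, z; };

struct CameraPath {
    std::vector<Vec3> waypoints;   // includes height (y), not just a 2D route

    // Camera position for a path parameter t in [0, 1], using linear
    // interpolation between consecutive waypoints.
    Vec3 positionAt(float t) const {
        if (waypoints.empty()) return {0.0f, 0.0f, 0.0f};
        if (waypoints.size() == 1) return waypoints.front();
        float scaled = t * static_cast<float>(waypoints.size() - 1);
        std::size_t i = static_cast<std::size_t>(scaled);
        if (i >= waypoints.size() - 1) return waypoints.back();
        float f = scaled - static_cast<float>(i);
        const Vec3& a = waypoints[i];
        const Vec3& b = waypoints[i + 1];
        return { a.x + (b.x - a.x) * f,
                 a.y + (b.y - a.y) * f,
                 a.z + (b.z - a.z) * f };
    }
};
```

  • Raising or lowering the y component of successive waypoints is what produces the height changes described above, which a purely two-dimensional scroll cannot provide.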
  • An embodiment of the invention provides a graphics processing device, wherein display locations for displaying predetermined messages are established along a path, and the viewing point shifting means displays messages at these display locations.
  • Locations for message display may include, for example, locations where enemy characters are positioned, locations of prescribed objects, location of characteristic terrain features such as cliffs or precipices, and other locations where information useful to the player in proceeding through the game should be placed.
  • Messages may be displayed in a prescribed message window, for example.
  • the message window need not be three-dimensional; a two-dimensional display may be used.
  • An embodiment of the invention provides a graphics processing device, wherein the path is configured such that each of the plurality of segments can be represented from different viewing point positions.
  • The paths may also be designed to achieve movie effects such as pan, zoom, and the like, for example, paths set up such that the camera can be adjusted continuously from extreme zoom-out to close-up in order to focus on a particular point.
  • An embodiment of the invention provides a graphics processing device, wherein the viewing point shifting means holds a reference point for the viewing point in a predetermined location when shifting the viewing point along a path.
  • An embodiment of the invention provides a game machine designed with a plurality of stages, comprising a graphics processing device whereby virtual terrain features are defined three-dimensionally within virtual space for each stage, and representations thereof are displayed as viewed from the viewing point.
  • An embodiment of the invention provides a graphics processing method for generating representations of segments defined three-dimensionally within virtual space displayed as viewed from the viewing point, comprising the step of shifting the viewing point over the predetermined three-dimensional paths established within the virtual space.
  • An embodiment of the invention provides a graphics processing method, wherein display locations for displaying predetermined messages are selected along a path, and the step for shifting the viewing point displays the messages at the display locations.
  • An embodiment of the invention provides a graphics processing device for generating representations of segments defined three-dimensionally within virtual space displayed as viewed from the viewing point, comprising cursor generation means for generating a cursor, cursor moving means for moving the cursor through operation by the player, data generating means for acquiring data concerning segments located peripherally around the cursor and generating display data, and data display means for producing data displays on the basis of the display data.
  • An embodiment of the invention provides a graphics processing device, wherein the data generating means, on the basis of conditions of motion applied to the cursor and data concerning segments located peripherally thereto, makes decisions as to whether cursor movement should be enabled, computing the load required for movement where movement is enabled, and the data display means displays a “movement not enabled” indicator in directions in which cursor movement is not enabled, as well as displaying a “movement enabled” indicator in directions in which cursor movement is enabled, together with the load required therefor.
  • An embodiment of the invention provides a graphics processing device, wherein the data generating means acquires attribute data concerning segments located peripherally around the cursor and generates display data, and the data display means displays the display data next to the segment(s) in question.
  • An embodiment of the invention provides a graphics processing device, wherein the cursor generating means changes the cursor display with reference to the attributes of the segments.
  • An embodiment of the invention provides a game machine designed with a plurality of stages, comprising a graphics processing device whereby virtual terrain features are defined three-dimensionally within virtual space for each stage, and a cursor is displayed in the display of each stage.
  • An embodiment of the invention provides a graphics processing method for generating a display of segments defined three-dimensionally within a virtual space and portrayed as viewed from a viewing point located within the virtual space, comprising a cursor moving step in which the cursor is moved through player control, a data generation step in which data pertaining to segments located peripherally around the cursor is acquired and display data is generated, and a data display step in which a data display is produced on the basis of the display data.
  • An embodiment of the invention provides a graphics processing device for generating a display of segments defined three-dimensionally within a virtual space and portrayed as viewed from a viewing point located within the virtual space, comprising attribute modification value generating means which, where a segment has moved, computes an attribute modification value for the segment on the basis of its status prior to moving, after moving, or both, and attribute modifying means for modifying the attributes of the segment on the basis of the attribute modification value.
  • An embodiment of the invention provides a graphics processing device, wherein the attribute modification value generating means computes the attribute modification value on the basis of the difference in distance of the segment prior to and after moving.
  • An embodiment of the invention provides a graphics processing device, wherein the attribute modification value generating means computes the attribute modification value on the basis of the status defined for the terrain feature segment located at the current position of a segment which has moved.
  • An embodiment of the invention provides a game machine designed with a plurality of stages, comprising a graphics processing device for defining virtual terrain features three-dimensionally within a virtual space and modifying segment attributes for each stage.
  • An embodiment of the invention provides a graphics processing method for generating a display of segments defined three-dimensionally within a virtual space and portrayed as viewed from a viewing point located within the virtual space, comprising an attribute modification value generating step wherein, where a segment has moved, an attribute modification value is computed for the segment on the basis of its status prior to moving, after moving, or both, and an attribute modifying step wherein the attributes of the segment are modified on the basis of the attribute modification value.
  • An embodiment of the invention provides a graphics processing device for generating a display of segments defined three-dimensionally within a virtual space and portrayed as viewed from a viewing point located within the virtual space, comprising segment moving means for moving prescribed segments through control by the player, coordinate alignment determining means for determining if the direction in which a designated segment in virtual space is facing is aligned with the direction of the line of sight extending from the viewing point, and association modifying means for modifying the association of the control direction instructed through player control and the direction of movement of the segment where the coordinate alignment determining means has determined that these are not aligned.
  • An embodiment of the invention provides a graphics processing device, further comprising control input type determining means for determining whether a control input by the player pertains to movement of a segment, and control direction setting means for setting the direction instructed through control by the player to a predefined direction in the event that it is determined by the control input type determining means that the input does not pertain to movement of a segment.
  • Cases where a determination that a particular control input does not pertain to movement of a segment would be made include, for example, specification of an action not directly related to movement of a segment but rather performed on a terrain feature, tree, rock, or other object in the virtual space, or of some modification of attributes (equipment, weapons, tools, etc.) including those of segments.
  • Cases where a determination that a control input does not pertain to the virtual space would be made include, for example, operations performed in display screens not directly related to virtual space coordinates (such as game setting, segment setting, and other initial screens, setting screens for modifying parameters during the course of the game, message windows, and the like).
  • “Predefined direction” refers to some direction defined with reference to the display screen (for example, UP, DOWN, LEFT, or RIGHT).
  • An embodiment of the invention provides a graphics processing device, further comprising control input reference determining means for determining whether a control input by the player is an operation to be performed on the display screen which displays the virtual space, and control direction setting means for setting the direction instructed through control by the player to a predefined direction in the event that it is determined by the control input reference determining means that the operation is not one to be performed on the display screen which displays the virtual space.
  • An embodiment of the invention provides a game machine designed with a plurality of stages and comprising a graphics processing device for defining virtual terrain features three-dimensionally within a virtual space for each stage, and for moving the segments.
  • An embodiment of the invention provides a graphics processing method for generating a display of segments defined three-dimensionally within a virtual space and portrayed as viewed from a viewing point located within the virtual space, comprising a segment moving step in which a designated segment is moved through control by the player, a coordinate alignment determining step in which a determination is made as to whether the direction in which the designated segment in virtual space is facing is aligned with the direction of the line of sight extending from the viewing point, and an association modifying step in which the association of the control direction instructed through player control and the direction of movement of the segment is modified in the event that the coordinate alignment determining means has determined that these are not aligned.
  • An embodiment of the invention provides a graphics processing method, further comprising a control input type determining step in which a determination is made as to whether a control input by the player pertains to movement of a segment, and a control direction setting step in which the direction instructed through control by the player is set to a predefined direction in the event that it is determined by the control input type determining means that the input does not pertain to movement of a segment.
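  • As a sketch of this idea only (the function and parameter names are assumptions, not the patent's), the association between a pad direction and a movement direction in the virtual space can be modified by rotating the input by the camera's horizontal orientation, while inputs that do not pertain to movement in the virtual space are interpreted directly in screen terms.

```cpp
#include <cmath>

// Hypothetical sketch: map the pad's UP/DOWN/LEFT/RIGHT input to a movement
// direction in the virtual space, rotated by the camera's yaw so that UP
// always means "away from the camera", i.e. forward as seen on the screen.
struct Vec2 { float x, z; };

Vec2 padToWorldDirection(float padX, float padZ, float cameraYawRadians) {
    // padX/padZ are the raw input in screen terms (UP gives padZ = +1, say).
    float c = std::cos(cameraYawRadians);
    float s = std::sin(cameraYawRadians);
    // Rotate the input vector into the virtual-space (world) coordinate system.
    return { padX * c - padZ * s,
             padX * s + padZ * c };
}

// When the input does not pertain to movement of a segment in the virtual
// space (a menu, setting screen, or message window, for example), the
// direction is taken as a predefined screen direction with no rotation.
Vec2 screenDirection(float padX, float padZ) {
    return { padX, padZ };
}
```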
  • An embodiment of the invention provides in a computer a machine-readable storage medium for storing a program which embodies a graphics processing method for generating a display of segments defined three-dimensionally within a virtual space and portrayed as viewed from a viewing point located within the virtual space, and which executes a step whereby the viewing point is shifted over predetermined three-dimensional paths established within the virtual space.
  • An embodiment of the invention provides in a computer a machine-readable storage medium for storing a program which embodies a graphics processing method for generating a display of segments defined three-dimensionally within a virtual space and portrayed as viewed from a viewing point located within the virtual space, and which executes a cursor movement step wherein the cursor is moved through control by the player, a data generation step wherein data pertaining to segments located peripherally around the cursor is acquired and display data is generated, and a data display step in which a data display is produced on the basis of the display data.
  • An embodiment of the invention provides in a computer a machine-readable storage medium for storing a program which embodies a graphics processing method for generating a display of segments defined three-dimensionally within a virtual space and portrayed as viewed from a viewing point located within the virtual space, and which executes an attribute modification value generating step wherein, where a segment has moved, an attribute modification value is computed for the segment on the basis of its status prior to moving, after moving, or both, and an attribute modifying step wherein the attributes of the segment are modified on the basis of the attribute modification value.
  • An embodiment of the invention provides in a computer a machine-readable storage medium for storing a program which embodies a graphics processing method for generating a display of segments defined three-dimensionally within a virtual space and portrayed as viewed from a viewing point located within the virtual space, and which executes a segment moving step in which a designated segment is moved through control by the player, a coordinate alignment determining step in which a determination is made as to whether the direction in which the designated segment in virtual space is facing is aligned with the direction of the line of sight extending from the viewing point, and an association modifying step in which the association of the control direction instructed through player control and the direction of movement of the segment is modified in the event that the coordinate alignment determining means has determined that these are not aligned.
  • Storage media examples include floppy disks, magnetic tape, magnetooptical disks, CD-ROM, DVD, ROM cartridges, RAM memory cartridges equipped with battery packs, flash memory cartridges, nonvolatile RAM cartridges, and the like.
  • “Storage medium” refers to a component capable of storing data (mainly digital data and programs) by some physical means and of enabling a computer, dedicated processor, or other processing device to perform prescribed functions.
  • Wired communications media such as phone lines, wireless communications media such as microwave circuits, and other communications media are included as well.
  • the Internet is also included in this definition of communications media.
  • FIG. 1 is an exterior view of a game machine employing the graphics processing device of Embodiment 1 of this invention
  • FIG. 2 is a functional block diagram of a game machine employing the graphics processing device of Embodiment 1 of this invention
  • FIG. 3 is a flow chart illustrating the operation of the graphics processing device of Embodiment 1 of this invention.
  • FIG. 4 is a plan view of a stage illustrative of the operation of Embodiment 1 of this invention.
  • FIG. 5 is a sectional view of a stage illustrative of the operation of Embodiment 1 of this invention.
  • FIG. 6 is a diagram showing a camera shift path illustrative of the operation of Embodiment 1 of this invention.
  • FIG. 7 is a diagram showing a camera shift path illustrative of the operation of Embodiment 1 of this invention.
  • FIG. 8 is an example of a display screen illustrative of the operation of Embodiment 1 of this invention.
  • FIG. 9 is an example of another display screen illustrative of the operation of Embodiment 1 of this invention.
  • FIG. 10 is a diagram showing a camera shift path and the orientation thereof illustrative of the operation of Embodiment 1 of this invention.
  • FIG. 11 is a flow chart depicting the operation of the graphics processing device of Embodiment 2 of this invention.
  • FIG. 12 is a plan view of a cursor and icons displayed by the graphics processing device of Embodiment 2 of this invention.
  • FIG. 13 is a plan view of another cursor and icons displayed by the graphics processing device of Embodiment 2 of this invention.
  • FIG. 14 is a perspective view of the cursor, icons, and grid in a stage, illustrative of the operation of Embodiment 2 of this invention.
  • FIG. 15 is a plan view of the cursor, icons, and grid in a stage, illustrative of another operation of Embodiment 2 of this invention.
  • FIG. 16 is a flow chart depicting the operation of the graphics processing device of Embodiment 3 of this invention.
  • FIG. 17 is a flow chart depicting the operation of the graphics processing device of Embodiment 4 of this invention.
  • FIG. 18 shows an example of a display screen illustrative of the operation of Embodiment 4 of this invention.
  • FIG. 19 shows an example of another display screen illustrative of the operation of Embodiment 1 of this invention.
  • FIG. 1 is an exterior view of a video game machine employing the graphics processing device which pertains to Embodiment 1 of this invention.
  • The video game console 1 has a shape approximating a box, and houses the boards for game processing and the like.
  • Two connectors 2 a are provided on the front panel of the video game console 1 ; the pads 2 b , which serve as the input devices for game control, are connected to these connectors 2 a through cables 2 c .
  • Two pads 2 b are used for two-player play.
  • The console is also provided with a cartridge I/F 1 a for connecting a ROM cartridge and a CD-ROM drive 1 b for reading CD-ROMs.
  • the back panel of the video game console 1 is provided with a video output terminal and an audio output terminal.
  • the video output terminal is hooked up to the video input terminal of a television receiver 5 via a cable 4 a .
  • the audio output terminal is hooked up to the audio input terminal of a television receiver 5 via a cable 4 b .
  • FIG. 2 is a block diagram showing the scheme of the TV game machine which pertains to Embodiment 1 of the invention.
  • This graphics processing device comprises a CPU block 10 for controlling the entire device, a video block 11 for controlling game screen displays, a sound block 12 for generating effect sounds, and a sub-system 13 for reading CD-ROMs and the like.
  • The CPU block 10 comprises an SCU (system control unit) 100 , a main CPU 101 , RAM 102 , ROM 103 , a cartridge I/F 1 a , a sub-CPU 104 , a CPU bus 105 , and so on.
  • the main CPU 101 is designed to control the entire device. This main CPU 101 incorporates a processing function (not shown) similar to a DSP (digital signal processor) and is designed for rapid execution of application software.
  • the RAM 102 is configured to serve as the work area for the main CPU 101 .
  • An initialization program for the initialization process and so on are written to the ROM 103 , making it possible for the device 10 to boot up.
  • the SCU 100 controls the buses 105 , 106 , and 107 to enable data exchange among the main CPU 101 , the VDPs 120 and 130 , the DSP 140 , the CPU 141 , and other components.
  • the SCU 100 is provided internally with a DMA controller, and is designed such that during the game, image data for the display elements which make up the segments (polygon data and the like) can be transferred to the VRAM in the video block 11 .
  • the cartridge I/F 1 a is designed to transfer program data and image data from the storage medium (provided in the form of a ROM cartridge) to the CPU block.
  • The sub-CPU 104 is called an SMPC (system manager & peripheral controller), and is designed to acquire control data from the peripheral devices 2 b via the connector 2 a shown in FIG. 1 in response to requests from the main CPU 101 .
  • On the basis of control signals received from the sub-CPU 104 , the main CPU 101 performs, for example, display control (changing character rotation, changing perspective, and other elements) on the game screen.
  • the connectors 2 a are designed to allow connections to any peripheral device such as a pad, joystick, keyboard, or the like.
  • the sub-CPU 104 has the function of automatically recognizing the type of peripheral device plugged into the connectors 2 a (console terminals) and acquiring control signals and the like in accordance with a particular communication mode corresponding to the type of peripheral device.
  • the video block 11 comprises a first VDP (video display processor) 120 , VRAM (DRAM) 121 , frame buffers 122 and 123 , a second VDP 130 , VRAM 131 , and a frame memory 132 .
  • The first VDP 120 houses a system register, is connected to the VRAM (DRAM) 121 and to the frame buffers 122 and 123 , and is designed to enable generation of segments (characters) consisting of polygons for the TV game.
  • The second VDP 130 houses a register and color RAM, is connected to the VRAM 131 and the frame memory 132 , and is designed to enable various processes such as rendering background images, synthesizing segment image data and background image data according to display priority, clipping, display color designation, and the like.
  • the VRAM 121 is designed to store polygon data (collections of apex point coordinates) for TV game character representation transferred from the main CPU 101 and to store conversion matrix data for shifting the visual field.
  • the frame buffers 122 and 123 are designed to hold the image data (generated in 16 or 8 bits per pixel format, for example) generated by the first VDP 120 on the basis of polygon data, etc.
  • the VRAM 131 is designed to store background image data supplied by the main CPU 101 through the SCU 100 .
  • The frame memory 132 is designed to store the final display data generated by the second VDP 130 by synthesizing, according to display priority, the texture-mapped polygon image data sent from the first VDP 120 with the background image data.
  • the encoder 160 is designed to generate video signals by attaching sync frames and so on to the display data, and to output these to the TV receiver.
  • the sound block 12 comprises a DSP 140 for synthesizing sounds by the PCM format or FM format, and a CPU 141 for controlling this DSP 140 .
  • the DSP 140 is designed to convert audio signals to 2-channel signals through a D/A converter 170 and to output these to the two speakers 5 a.
  • the sub-system 13 comprises a CD-ROM drive 1 b , a CD I/F 180 , MPEG AUDIO 182 , MPEG VIDEO 183 , and so on.
  • This sub-system 13 has the function of reading application software provided in CD-ROM format, reproducing video, and so on.
  • the CD-ROM drive 1 b reads data from the CD-ROM.
  • the CPU 181 is designed to control the CD-ROM drive 1 b and to perform error correction on read data and other such processes. Data read out from a CD-ROM is delivered to the main CPU 101 through the CD I/F 180 , bus 106 , and SCU 100 and is used as the application software.
  • The MPEG AUDIO 182 and MPEG VIDEO 183 are devices for restoring data which has been compressed in accordance with MPEG (Motion Picture Expert Group) standards. By using the MPEG AUDIO 182 and MPEG VIDEO 183 to restore MPEG-compressed data written on a CD-ROM, it is possible to reproduce the video images.
  • FIG. 4 is a drawing illustrating processing operations of the device of Embodiment 1 of this invention.
  • FIG. 5 is a cross section of plane A—A in FIG. 4, viewed in the direction indicated by the arrows.
  • FIG. 4 is a plan view of a stage in the game containing terrain feature segments which have been generated within virtual space on the basis of three-dimensionally defined topographical data; the virtual space is viewed from above the horizontal plane in which these terrain features lie.
  • 50 indicates the path over which the viewing point shifts (to facilitate understanding, the viewing point is discussed below in terms of a camera)
  • 51 indicates a player-controlled character controlled by the player
  • 52 indicates an enemy character
  • 53 represents an obstacle (stone monolith) located on the course
  • 54 and 55 indicate sloped surfaces (cliffs)
  • 56 and 57 indicate plateaus.
  • The game process flow is one in which the character 51 drives away enemy characters 52 blocking a path that is surrounded by the sloped surfaces 54 and 55 in order to reach the exit located at the right in the drawing. Exiting through the exit leads the player to the next stage.
  • a stage is introduced by moving the camera along the shifting path 50 to show all the terrain features in the stage.
  • prescribed messages are displayed in association with display images at points P 1 through P 4 .
  • At point P 1 , the camera points up at the sloped surface 55 from below to allow the player to make a visual estimation of the steepness of the sloped surface, and a message such as “climbing up this sloped surface is tough” or “if you slip down this sloped surface you will get hurt” is displayed.
  • A description of the enemy characters is provided at point P 2 , and a description of the obstacle 53 is provided at point P 3 .
  • At point P 4 , in contrast to point P 1 , the camera points down the sloped surface, allowing the player to make a visual estimation of the steepness of the sloped surface from above.
  • Points P 1 through P 4 are preset. Alternatively, arbitrary position setting through player control could be enabled.
  • The operation of the device of Embodiment 1 will be described with reference to the flow chart in FIG. 3 .
  • In this embodiment, a viewing point shifting method is provided during the introduction of the simulation game; specifically, varying camera angles are employed to describe important features of the terrain. Most processing is done by the CPU 101 .
  • The camera can move not only through the horizontal plane but also in the vertical direction to provide three-dimensional motion, thereby giving an impression of three-dimensional terrain even in the two-dimensional images produced by visual field conversion. Demonstrations of “objects” actually encountered by the player-controlled character are also presented for the stage.
  • The way in which the camera moves may be set arbitrarily by the designer at the programming stage.
  • The designated sites for message display may be set as well.
  • Topographical mapping data check mode refers to a mode in which the entire stage can be observed to provide an understanding of conditions in each stage prior to play. Upon entering this mode, a predetermined camera path 50 is readied and the camera begins to move along this path. “Battles” between the player-controlled character and enemy characters do not occur in this mode.
  • the camera position begins to move. As depicted in FIG. 6 and FIG. 7, the camera starts to move from the left side in the drawings and proceeds until reaching the obstacle 53 located on the right side. It then turns around, passing over the plateau 57 and returning to the starting position. During this time the line of sight of the camera faces downward and towards the front.
  • This camera direction has been preset, but can be freely modified or rendered selectable by the player.
  • The camera height can be changed during motion. For example, it can be raised immediately in front of the obstacle 53 and subsequently dropped down rapidly to close in on the obstacle 53 .
  • it can be raised to allow the entire terrain of the stage to be viewed; when looking down on a sloped surface 55 , it can be dropped to closely approach the sloped surface.
  • The scenes displayed through these various movements are full of variation and visual interest. In this way, movie-like effects such as pan, zoom, and close-up can be produced by freely moving the camera within the virtual space.
  • While the camera is moving (step ST 2 ), a determination is made as to whether to display a message. Where a display is not indicated (NO), the system returns to step ST 2 and the camera continues to move. On the other hand, where there is a message to be displayed (YES), the system proceeds to step ST 4 and the message is displayed.
  • points P 1 through P 4 for message display have been preset. Accordingly, in this example, four messages are displayed.
  • the message is displayed. For example, as the camera gradually approaches point P 3 , the camera pulls up rapidly to give a bird's-eye view like that depicted in FIG. 8 . From this camera position, the overall size and shape of the obstacle 53 may be discerned. As the camera reaches point P 3 , a message is displayed through the determination made in step ST 3 . The camera position at this time is in proximity to the obstacle 53 . As the obstacle 53 is shown in close-up, a message window 60 appears on the screen, and a sub-window 61 showing the face of the character appears therein. Various messages can be displayed in the message window 60 , for example, “(character name) there seems to be some kind of trap here,” or some similar message.
  • the window can be made transparent so as to avoid hiding the background scenery.
  • multiple windows can be opened to enable a simulated conversation among a plurality of characters.
  • the camera does not move during the time that the message is being displayed. This allows the player time to discern the content of the message.
  • the camera is released from the suspended state and resumes movement along the path 50 .
  • the message can be designed to display for a predetermined period of time without waiting for a command from the player.
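  • A minimal sketch of the flow through steps ST 2 to ST 4 described above, assuming a per-frame update function; the state structure and helper names (reachedMessagePoint, openMessageWindow) are illustrative and not taken from the patent.

```cpp
// Hypothetical per-frame update for topographical mapping data check mode.
// The camera advances along its path, halts while a message window is open,
// and resumes when the player confirms (or after a timeout, if so designed).

// Hooks assumed to be supplied elsewhere by the game program.
bool reachedMessagePoint(float pathParam, int pointIndex);
void openMessageWindow(int pointIndex);

struct CheckModeState {
    float pathParam = 0.0f;      // progress along the camera path, 0..1
    bool  messageOpen = false;   // true while a message window is displayed
    int   nextMessagePoint = 0;  // index of the next preset point (P1..P4)
};

void updateCheckMode(CheckModeState& st, bool playerConfirmed, float speed) {
    if (st.messageOpen) {
        // The camera does not move while the message is displayed.
        if (playerConfirmed) st.messageOpen = false;   // release and resume
        return;
    }
    st.pathParam += speed;                                        // step ST2
    if (reachedMessagePoint(st.pathParam, st.nextMessagePoint)) { // step ST3
        openMessageWindow(st.nextMessagePoint);                   // step ST4
        st.messageOpen = true;
        ++st.nextMessagePoint;
    }
}
```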
  • Thus, for a simulation game in which terrain segments are represented three-dimensionally, the device of Embodiment 1 of this invention allows the player to move a camera in three-dimensional fashion in order to view the entire terrain, rather than simply scrolling through a display, thereby allowing the player to experience a sense of realism from the three-dimensionally constructed terrain.
  • the player can view terrain features from a multitude of camera positions during topographical mapping data check mode. Scene displays from camera positions that are not commonly employed (for example, an overall view from a very high position, looking upward from the ground, getting really close to a cliff) are also possible, producing a display that has impact and that stimulates the interest of the player.
  • the possibility of battles in three-dimensional space can be suggested by indicating the action space for the three-dimensionally constructed player-controlled character.
  • In the description above, the direction in which the camera faces was assumed to be fixed; however, the device of Embodiment 1 of this invention is not limited to this configuration.
  • For example, as shown in FIG. 10 , the camera line of sight may be made to follow a designated target (shown as a triangle in the drawing) as the camera travels along the path 50 .
  • The target is a vehicle, for example.
  • Camera movement is not limited to the horizontal plane and may take place in a vertical plane.
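  • Where the line of sight follows a designated target in this way, the camera direction can be recomputed each frame from the camera and target positions; the following is a sketch under assumed names, not the patent's own code.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Hypothetical sketch: unit direction vector from the camera toward the
// target it is tracking. The visual field conversion would use this as the
// line-of-sight direction while the camera moves along its path.
Vec3 lineOfSightToTarget(const Vec3& camera, const Vec3& target) {
    Vec3 d { target.x - camera.x, target.y - camera.y, target.z - camera.z };
    float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    if (len > 0.0f) { d.x /= len; d.y /= len; d.z /= len; }
    return d;
}
```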
  • FIG. 11 is a simple flow chart showing the operation of this device.
  • FIGS. 12 through 15 are diagrams illustrating the operation of this device.
  • FIG. 12 shows the cursor 63 and the icons 64 displayed around its perimeter.
  • the cursor 63 is shown on the basic screen display and on the movement select screen display.
  • Displayed icon shapes comprise arrows and X's, and each of these has a particular meaning.
  • An X icon 64 a display indicates that the character cannot advance in the direction in which the X is located (upward in the drawing).
  • a single arrow icon 64 b display indicates that the character can move in the direction indicated by the arrow (downward in the drawing) and that the cost entailed in doing so (this refers to a parameter such as the point score required to continue game play) is equivalent to one arrow's worth.
  • a double arrow icon 64 c or triple arrow icon 64 d display respectively indicate that the associated movement costs two times and three times that of a single arrow.
  • FIG. 13 is an example of another cursor 63 and icon 64 display format.
  • Shading 63 s and 64 s is depicted below the cursor 63 and the icons 64 .
  • This display is used when the character associated with the cursor has the ability to fly. Applying shading gives the impression that the cursor is flying through the air, thereby calling attention to the functional characteristics with which the character is endowed.
  • the cursor display can also be changed to reflect character function. For example, the cursor color could be blue or red depending on whether the sex of the player-controlled character is male or female. Another possibility would be to have a fat cursor for a powerful character and a thin cursor for a weak character.
  • Shadows may simulate light rays coming from some position in the virtual sky, or may be portrayed as conforming to the shapes of terrain features. Alternatively, shadows may be produced by simply adopting a double display for the cursor 63 and the icons 64 .
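  • Purely as an illustration of the preceding points (the attribute names, color assignments, and threshold are assumptions, not the patent's), the cursor's appearance might be selected from the associated character's attributes, with the shadow produced by a simple offset double display.

```cpp
// Hypothetical sketch: choosing a cursor style from character attributes.
enum class CursorColor { Blue, Red };
enum class CursorWidth { Thin, Fat };

struct CharacterAttributes {
    bool isFemale = false;
    bool canFly   = false;
    int  strength = 0;
};

struct CursorStyle {
    CursorColor color;
    CursorWidth width;
    bool        drawShadow;   // draw the cursor a second time, offset below
};

CursorStyle cursorStyleFor(const CharacterAttributes& a) {
    CursorStyle s;
    s.color      = a.isFemale ? CursorColor::Red : CursorColor::Blue; // assumed mapping
    s.width      = (a.strength >= 10) ? CursorWidth::Fat : CursorWidth::Thin;
    s.drawShadow = a.canFly;   // shading suggests the cursor is airborne
    return s;
}
```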
  • FIG. 14 shows an example screen in which the cursor 63 is displayed.
  • the cursor 63 moves over a grid 65 which reflects the shape of a terrain feature.
  • the icon display surrounding the cursor 63 changes depending on whether the ground over which the cursor 63 is positioned is flat ground, the sloped surface 54 , or the plateau 56 .
  • the device of Embodiment 2 of this invention relates to a cursor which is used in the context of a simulation game for controlling characters and the like, for determining the shape, qualities, and so on of a terrain feature at any location, and for displaying attribute information concerning enemy characters. It also brings up displays of data for terrain features located adjacently to the cursor. Specifically, the cursor provides information not only for a selected terrain feature but also for terrain features located adjacently to the terrain feature in question, thereby affording a display which facilitates understanding of relationships among continuous terrain features.
  • Data for the grid 65 adjacent to the cursor 63 is acquired.
  • Terrain feature data for the position of the cursor 63 and data for surrounding terrain features is acquired, and a decision as to whether a certain movement is possible is made on the basis thereof. Where a movement is possible, the extent of the cost required is also computed.
  • the cursor 63 can be moved to various positions along the grid 65 . When the cursor 63 is located over flat ground, conditions at the cursor and its surroundings are not significantly different. On the other hand, when the cursor 63 is located over the sloped surface 54 , conditions change significantly in the direction of the slope, while conditions in the direction orthogonal to the direction of the slope do not change significantly. This affords information regarding surrounding terrain features, which change in various ways depending on the position of the cursor 63 .
  • Grid direction conditions are computed.
  • the cost entailed in moving is determined by the slope between the two points traversed.
  • Slope can be expressed as the difference between the height of the grid at the cursor 63 and the height of an adjacent grid.
  • the height of each grid is predetermined; a quantitative index thereof is created on the basis of a fixed reference value.
  • the relationship between the steepness of a slope and its height index can be classified as follows.
  • Climbing ability type is classified as follows with reference to the action capabilities with which a character is endowed. Numerical values represent action capabilities. “Extreme”, “strong”, “normal”, and “weak” represent action settings for a player-controlled character; “strong”, “normal”, and “weak” represent walking strength.
  • A determination is made as to whether all grids have been processed. Where the cursor 63 has a square shape, as in this embodiment, a four-fold processing iteration is required.
  • The conditions are indicated by displaying icons around the cursor. For example, if the value in the previous example is “x”, an “X” icon 64 a is displayed; if “1”, a single arrow icon 64 b is displayed; if “2”, a double arrow icon 64 c is displayed; and if “3” or more, a triple arrow icon 64 d is displayed. Icons comprising four or more arrows may be used as well.
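  • A sketch of the per-grid check described above: for each of the four grids adjacent to the square cursor, the height differential relative to the cursor's grid is compared with the character's climbing ability and translated into an icon, as below. The thresholds and names are assumptions made for illustration, not the patent's actual classification table.

```cpp
#include <cstdlib>

// Hypothetical icon values corresponding to FIG. 12.
enum class MoveIcon { Blocked /* "X" */, OneArrow, TwoArrows, ThreeArrows };

// Hypothetical sketch: classify movement from the cursor's grid into an
// adjacent grid using the height differential and the character's climbing
// ability. The thresholds are illustrative only.
MoveIcon classifyMove(int cursorHeight, int neighborHeight, int climbingAbility) {
    int diff = std::abs(neighborHeight - cursorHeight);
    if (diff > climbingAbility) return MoveIcon::Blocked;  // cannot advance
    if (diff <= 1) return MoveIcon::OneArrow;              // cost of one arrow
    if (diff == 2) return MoveIcon::TwoArrows;             // twice the cost
    return MoveIcon::ThreeArrows;                          // three times or more
}
```

  • The four results, one per side of the square cursor, are then drawn as the icons 64 around the cursor 63 .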
  • The icons 64 for display around the cursor 63 are selected on the basis of the height difference (slope) with respect to the surrounding area; however, the invention is not limited thereto, and terrain conditions around the cursor may be represented, for example, through selection on the basis of the conditions of the ground in the surrounding area (rough terrain, grassy terrain, pavement, and so on). Selection may be made on the basis of both this type of condition and the height differential.
  • Embodiment 2 of the present invention is designed such that information pertaining to the height differential between a location selected through the cursor and the adjacent terrain is determined and the results thereof are displayed around the cursor, affording a display which facilitates understanding of relationships with adjacent terrain features.
  • The cursor can be positioned arbitrarily by the player, and the display responds in a corresponding manner as the underlying terrain changes.
  • Cursor forms other than that illustrated are possible; any form that indicates to the player the climbing power required for movement would be acceptable.
  • A display like that depicted in FIG. 15 would also be possible.
  • The human power (HP) and magical power (MP) possessed by a character 51 present in the area surrounding the cursor are displayed as numerical values (in the drawing, the actual numerical values are not shown).
  • Information for characters present in the eight frames surrounding the cursor 63 can be displayed in this way.
  • Information for characters located further away (for example, the character 52 represented by the “X”) is not displayed.
  • The player can acquire information about characters in the surrounding area by moving the cursor to any desired position.
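  • As a rough illustration of this display, the following sketch gathers the characters standing in the eight grid cells surrounding the cursor so that their HP and MP can be shown; the character records and coordinate layout are assumptions made for illustration, not structures taken from the patent.

    # Illustrative sketch only: character records and grid coordinates are assumed.
    def characters_around_cursor(characters, cursor_pos):
        # Collect characters in the eight cells surrounding the cursor so their
        # HP (human power) and MP (magical power) can be displayed on screen.
        cx, cy = cursor_pos
        nearby = []
        for ch in characters:
            dx, dy = ch["pos"][0] - cx, ch["pos"][1] - cy
            if (dx, dy) != (0, 0) and abs(dx) <= 1 and abs(dy) <= 1:
                nearby.append("%s  HP:%d  MP:%d" % (ch["name"], ch["hp"], ch["mp"]))
        return nearby  # characters located further away are not included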
  • The device of Embodiment 3 of this invention is used in simulation games in which terrain feature segments are constituted three-dimensionally; where a character or the like falls during the game ("falling" refers to movement in the direction opposite the height of a particular terrain feature), it can vary the effects (damage) on the character and the direction of movement in accordance with this height differential.
  • The height differential between the starting point and adjacent terrain features is determined to select the direction of fall, and the amount of damage is varied in accordance with the height differential between the fall endpoint and the starting point.
  • In Step ST 21, the damage inflicted is proportional to the height of the fall.
  • The height of the fall and terrain conditions at the fall destination are computed.
  • The difference (H2 − H1) between the height H2 of the character prior to the fall and the height H1 after the fall is computed.
  • An index S indicating conditions at the fall destination is also computed. This index S has been predefined in the topographical data.
  • The index S will differ for rough terrain, grassy terrain, concrete, and so on. In general, the index S is greater (greater damage) the harder and more rugged the ground.
  • The amount of damage is computed.
  • The amount of damage is computed using the following equation, for example.
  • This condition occurs when a flying character lands, or when a character has been bounced into the air by an enemy.
  • Conditions at the fall destination may be taken into account through computation using the following equation, for example.
  • k is a proportional coefficient, which may be constant or which may vary for individual stages or individual characters.
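  • The equations themselves are not reproduced in this extract. Given that the damage is stated to be proportional to the height of the fall, that S indexes conditions at the fall destination, and that k is a proportional coefficient, one plausible reading (an assumption, not the patent's own formula) is:

    damage = k × (H2 − H1)            (fall height only)
    damage = k × S × (H2 − H1)        (taking fall-destination conditions into account)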
  • Character attributes are modified.
  • The attributes of the character are modified to reflect the amount of damage computed in Step ST 22 (a sketch follows below). This involves reducing the human power HP of the character; where the damage is significant, the character may die (at which point further game play is disabled). For example, if a character stands in a location where an indicator warning of the danger of falling is displayed, the character will fall and die unless it is flying; a flying character is unaffected.
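  • A minimal sketch of this attribute update, assuming hypothetical field names (hp, alive, flying) that do not appear in the patent:

    # Illustrative sketch only: attribute names and the flying check are assumptions.
    def apply_fall_damage(character, damage):
        # Flying characters take no fall damage.
        if character.get("flying"):
            return
        character["hp"] -= damage           # reduce the character's human power (HP)
        if character["hp"] <= 0:
            character["alive"] = False      # heavy damage: the character dies and
                                            # can no longer take part in play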
  • In Embodiment 3 of this invention, in the event that a character or the like falls during the game, the height differential between the position prior to the fall and the position after the fall is computed, and the damage inflicted on the player and the direction of movement are changed accordingly.
  • the player-controlled character can thus be damaged by elements other than attacks by enemy characters. Since the extent of damage can be increased or reduced through modification to topographical mapping data, the game designer is provided with increased latitude in terms of designing an interesting game. Since the player must take into consideration damage caused by falls in addition to attacks by enemy characters during play, the interest of the game is enhanced. Gravity and acceleration may be simulated in the representation of the fall, thereby enhancing the realism in the game.
  • FIG. 18 depicts the same game stage shown in FIG. 3.
  • The camera is located towards the top of the entrance, and the camera direction is inclined downward toward the exit.
  • FIG. 19 shows a display screen example of the same stage in which the camera is located above the plateau 57 in the sideways direction, with the camera direction facing the plateau 56 on the opposite side.
  • A message window 60 is displayed together with the screen display. The player can select any of a plurality of messages (in the drawing, there are two types, "1" and "2").
  • A triangle symbol represents the message selection.
  • The arrows shown to the right in the drawings are provided to facilitate the description. Each arrow corresponds to a direction button on the pad 2b.
  • The labels UP, DOWN, RIGHT, and LEFT indicate the directions assigned to the direction buttons on the pad 2b.
  • The labels in parentheses, (FORWARD), (BACK), (LEFT), and (RIGHT), indicate the directions in which the character will move on the screen (i.e., within the virtual space of this stage) when direction buttons are pressed.
  • The arrows in FIG. 18 indicate that the character will move FORWARD, BACK, LEFT, and RIGHT (as viewed from the direction in which the character is facing) within the virtual space when the UP, DOWN, RIGHT, and LEFT buttons are pressed, respectively.
  • The associations for the arrows in FIG. 19 are created only when the process depicted in the flow chart of FIG. 17 is performed. If this process is not performed, pushing the UP, DOWN, RIGHT, and LEFT direction buttons will result, for example, in the character advancing RIGHT, LEFT, DOWN, and UP within the virtual space as viewed from the direction in which the character is facing; these associations are not intuitive.
  • The message window 60 display is the same in both FIG. 18 and FIG. 19, so if the direction button assignments differ between the two cases, the intuitiveness of the interface is lost.
  • The flow chart shown in FIG. 17 takes this into consideration.
  • The type of control input is determined. This determination is made because key assignments differ between message inputs and character control inputs. If the input is a character control input, the system proceeds to Step ST 31; if it is a message input, it proceeds to Step ST 35.
  • The angle formed by the direction in which the character is facing and the direction of the line of sight is computed. For example, in the case depicted in FIG. 18, it would be determined that the two directions are aligned, while in the case depicted in FIG. 19, it would be determined that the direction of the line of sight is rotated 90° to the left with respect to the direction in which the character is facing. Specifically, the direction of the line of sight reflects counter-clockwise rotation of the viewing point coordinate system around the axis representing height in the virtual space (the z axis), with the angle of rotation equal to 90°.
  • Key assignments are set to default settings. For example, settings for a rotation angle of 0° are used.
  • Key assignments are set to default settings, since the message window 60 display is the same regardless of the angle of rotation. For example, settings for a rotation angle of 0° are used.
  • FIG. 17 depicts one example of a flow chart. Key assignment may be accomplished by other processes as long as message window operations are distinguished from character control when making the respective key assignments for the modes. For example, if there is no need to modify existing key assignments, it is not necessary to perform Steps ST 33 and ST 34 . The order of Steps ST 30 and ST 31 may be reversed.
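  • A minimal sketch of the character-control branch of this kind of key assignment, assuming a top-down 2D representation of the facing and line-of-sight directions and snapping the rotation to quarter turns; the names and conventions below are illustrative assumptions and do not come from the patent. For message inputs, the default mapping would be kept regardless of the angle, as described above.

    import math

    # Illustrative sketch only: vector conventions, quarter-turn snapping, and the
    # direction of the remapping are assumptions.
    BUTTONS = ["UP", "RIGHT", "DOWN", "LEFT"]                  # clockwise order
    DEFAULT = {"UP": "FORWARD", "RIGHT": "RIGHT", "DOWN": "BACK", "LEFT": "LEFT"}

    def remap_direction_buttons(character_facing, line_of_sight):
        # Angle of the line of sight relative to the direction the character is
        # facing, measured around the height (z) axis of the virtual space.
        fx, fy = character_facing
        sx, sy = line_of_sight
        angle = math.degrees(math.atan2(fx * sy - fy * sx, fx * sx + fy * sy))
        steps = round(angle / 90.0) % 4                        # snap to quarter turns
        if steps == 0:
            return dict(DEFAULT)                               # default settings (0° case)
        # Shift each button's meaning by the number of quarter turns so that the
        # on-screen movement stays intuitive after the viewing point has moved.
        return {BUTTONS[i]: DEFAULT[BUTTONS[(i + steps) % 4]] for i in range(4)}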
  • Embodiment 4 of this invention allows direction button assignments to be made on the basis of the angle formed by the direction the character is facing and the direction of the line of sight when the camera position changes, thereby allowing the assignments to be modified to suit the player's control perspective when the viewing point position has changed. Accordingly, intuitive operation can be continued regardless of the modified viewing point position.
  • The character may be readily seen or difficult to see depending on the camera position, so the ability to modify the camera position is important.
  • The present invention creates key assignment settings that offer intuitive control, so play can continue without any unnatural feel.
  • The player can modify the camera position to a position allowing the entire terrain to be readily discerned, and this modification of the camera position has no adverse effect on ease of control.
  • In the embodiment described above, a distinction between character control inputs and message inputs was made in the control input determination, but the invention is not limited thereto. For example, determinations could be made regarding whether a control input relates to character movement or to the virtual space.
  • A display screen not directly related to virtual space coordinates could be displayed.
  • This invention allows the viewing point within a virtual space defined in three dimensions to be shifted arbitrarily, and affords a favorable game environment.
  • This invention further provides displays of information regarding the area surrounding the cursor-selected position, affording a favorable game environment.
  • This invention still further takes into account the effects of three-dimensionally-defined terrain features on player-controlled segments, affording a favorable game environment.
  • This invention still further coordinates the orientation of a player-controlled segment in virtual space with the direction of the line of sight for modifying the visual field, affording a favorable game environment.
US09/011,267 1996-06-05 1997-06-05 Image processor, image processing method, game machine and recording medium Expired - Fee Related US6500069B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US10/295,996 US20030119587A1 (en) 1996-06-05 2002-11-18 Graphics processing device, graphics processing method, game machine, and storage medium
US11/896,989 US7573479B2 (en) 1996-06-05 2007-09-07 Graphics processing device, graphics processing method, game machine, and storage medium

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP8-143337 1996-06-05
JP14333796 1996-06-05
PCT/JP1997/001912 WO1997046970A1 (fr) 1996-06-05 1997-06-05 Processeur d'image, procede de traitement d'images, jeu electronique et support d'enregistrement

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP1997/001912 A-371-Of-International WO1997046970A1 (fr) 1996-06-05 1997-06-05 Processeur d'image, procede de traitement d'images, jeu electronique et support d'enregistrement

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US10/295,996 Division US20030119587A1 (en) 1996-06-05 2002-11-18 Graphics processing device, graphics processing method, game machine, and storage medium

Publications (1)

Publication Number Publication Date
US6500069B1 true US6500069B1 (en) 2002-12-31

Family

ID=15336445

Family Applications (3)

Application Number Title Priority Date Filing Date
US09/011,267 Expired - Fee Related US6500069B1 (en) 1996-06-05 1997-06-05 Image processor, image processing method, game machine and recording medium
US10/295,996 Abandoned US20030119587A1 (en) 1996-06-05 2002-11-18 Graphics processing device, graphics processing method, game machine, and storage medium
US11/896,989 Expired - Fee Related US7573479B2 (en) 1996-06-05 2007-09-07 Graphics processing device, graphics processing method, game machine, and storage medium

Family Applications After (2)

Application Number Title Priority Date Filing Date
US10/295,996 Abandoned US20030119587A1 (en) 1996-06-05 2002-11-18 Graphics processing device, graphics processing method, game machine, and storage medium
US11/896,989 Expired - Fee Related US7573479B2 (en) 1996-06-05 2007-09-07 Graphics processing device, graphics processing method, game machine, and storage medium

Country Status (10)

Country Link
US (3) US6500069B1 (de)
EP (3) EP1498844B1 (de)
JP (1) JP3706393B2 (de)
KR (2) KR100592456B1 (de)
CN (1) CN1134748C (de)
DE (2) DE69731715T2 (de)
ES (1) ES2332271T3 (de)
HK (1) HK1072825A1 (de)
TW (1) TW346612B (de)
WO (1) WO1997046970A1 (de)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020187832A1 (en) * 2001-06-06 2002-12-12 Konami Corporation Game system and game program
US20020193161A1 (en) * 2000-01-25 2002-12-19 Katsuhiro Ishii Game system, program and image generation method
US20030003978A1 (en) * 2001-06-29 2003-01-02 Square Co., Ltd. Video game with distinctive attributes for enemy characters, predetermined characters, and candidate characters
US6932705B2 (en) * 2001-03-29 2005-08-23 Square Enix Co., Ltd. Video game with sub-display for tracking target
US20060246974A1 (en) * 2004-12-21 2006-11-02 Jumpei Tsuda Program for controlling the movement of group of characters, recorded medium, and game device thereof
US20060246968A1 (en) * 2005-04-28 2006-11-02 Nintendo Co., Ltd. Storage medium having game program stored therein and game apparatus
US20070270215A1 (en) * 2006-05-08 2007-11-22 Shigeru Miyamoto Method and apparatus for enhanced virtual camera control within 3d video games or other computer graphics presentations providing intelligent automatic 3d-assist for third person viewpoints
US20080070684A1 (en) * 2006-09-14 2008-03-20 Mark Haigh-Hutchinson Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting
US20080125224A1 (en) * 2006-09-26 2008-05-29 Pollatsek David Method and apparatus for controlling simulated in flight realistic and non realistic object effects by sensing rotation of a hand-held controller
US20090181736A1 (en) * 2007-08-17 2009-07-16 Nintendo Of America Inc. System and method for lock on target tracking with free targeting capability
US20090259976A1 (en) * 2008-04-14 2009-10-15 Google Inc. Swoop Navigation
US20090313341A1 (en) * 1999-07-29 2009-12-17 Electronic Arts Inc. Electronic in-application postcards
US20100216550A1 (en) * 2008-02-15 2010-08-26 Sony Computer Entertainment Inc. Game device, game control method, and game control program
US20100240457A1 (en) * 2008-02-18 2010-09-23 Sony Computer Entertainment Inc. Game device, game control method, and game control program
US20110300948A1 (en) * 2008-12-16 2011-12-08 Masashi Takehiro Game device, game processing method, information recording medium, and program
US20120190451A1 (en) * 2009-08-05 2012-07-26 Ncsoft Corporation Device and method for controlling the movement of a game character
US8626434B1 (en) 2012-03-01 2014-01-07 Google Inc. Automatic adjustment of a camera view for a three-dimensional navigation system
US20150024843A1 (en) * 2011-06-14 2015-01-22 Nintendo Co., Ltd. Methods and/or systems for designing virtual environments
US10140765B2 (en) 2013-02-25 2018-11-27 Google Llc Staged camera traversal for three dimensional environment
US10525354B2 (en) 2016-06-10 2020-01-07 Nintendo Co., Ltd. Game apparatus, game controlling method and storage medium for determining a terrain based on a distribution of collision positions
US10697996B2 (en) 2006-09-26 2020-06-30 Nintendo Co., Ltd. Accelerometer sensing and object control
US11498004B2 (en) * 2020-06-23 2022-11-15 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11151376A (ja) 1997-11-20 1999-06-08 Nintendo Co Ltd ビデオゲーム装置およびその記憶媒体
JPH11207029A (ja) 1998-01-28 1999-08-03 Konami Co Ltd ビデオゲーム装置、ビデオゲームにおける画面表示方法及び画面表示プログラムが格納された可読記録媒体
US6712703B2 (en) 1998-11-19 2004-03-30 Nintendo Co., Ltd. Video game apparatus and information storage medium for video game
JP2000153061A (ja) 1998-11-19 2000-06-06 Nintendo Co Ltd ビデオゲーム装置およびビデオゲーム用情報記憶媒体
JP2000153062A (ja) * 1998-11-19 2000-06-06 Nintendo Co Ltd ビデオゲーム装置およびビデオゲーム用情報記憶媒体
US6612930B2 (en) 1998-11-19 2003-09-02 Nintendo Co., Ltd. Video game apparatus and method with enhanced virtual camera control
EP1080756A3 (de) * 1999-09-02 2004-10-27 Sony Computer Entertainment Inc. Unterhaltungssystem, Unterhaltungsvorrichtung, Aufzeichnungsmedium und Programm
JP3479522B2 (ja) 2001-07-12 2003-12-15 コナミ株式会社 3次元画像処理プログラム、3次元画像処理方法及び装置
JP4560711B2 (ja) * 2004-06-22 2010-10-13 株式会社セガ 画像処理
US7620530B2 (en) * 2004-11-16 2009-11-17 Nvidia Corporation System with PPU/GPU architecture
KR100759364B1 (ko) * 2006-05-02 2007-09-19 한국과학기술원 사용자 반응형 실시간 그래픽스와 고품질 애니메이션 영상합성 방법
US9649551B2 (en) 2008-06-03 2017-05-16 Tweedletech, Llc Furniture and building structures comprising sensors for determining the position of one or more objects
US8602857B2 (en) 2008-06-03 2013-12-10 Tweedletech, Llc Intelligent board game system with visual marker based game object tracking and identification
US8974295B2 (en) * 2008-06-03 2015-03-10 Tweedletech, Llc Intelligent game system including intelligent foldable three-dimensional terrain
US10155156B2 (en) 2008-06-03 2018-12-18 Tweedletech, Llc Multi-dimensional game comprising interactive physical and virtual components
WO2009149112A1 (en) 2008-06-03 2009-12-10 Tweedletech, Llc An intelligent game system for putting intelligence into board and tabletop games including miniatures
EP2613855A4 (de) 2010-09-09 2014-12-31 Tweedletech Llc Brettspiel mit dynamischer merkmalsverfolgung
CN102547126A (zh) * 2012-01-17 2012-07-04 苏州佳世达电通有限公司 监视系统及其控制方法
JP6298613B2 (ja) * 2013-10-17 2018-03-20 株式会社ソニー・インタラクティブエンタテインメント ゲームシステム、ゲーム制御方法、及びゲーム制御プログラム
KR101608172B1 (ko) 2014-12-22 2016-03-31 주식회사 넥슨코리아 객체를 제어하는 방법 및 장치
KR101674378B1 (ko) * 2015-03-25 2016-11-10 (주) 지오씨엔아이 홍수 범람 시뮬레이션을 위한 실시간 카메라 시점 이동에 따른 3차원 모델링 및 렌더링 자동화 방법
CN106139590B (zh) * 2015-04-15 2019-12-03 乐线韩国股份有限公司 控制对象的方法和装置

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4600200A (en) 1982-01-14 1986-07-15 Ikegami Tsushinki Co., Ltd. Three-dimensional image display system
US5261820A (en) * 1990-12-21 1993-11-16 Dynamix, Inc. Computer simulation playback method and simulation
WO1994008309A1 (en) 1992-09-30 1994-04-14 Marshall Paul S Virtual reality generator for use with financial information
JPH06274577A (ja) 1993-03-19 1994-09-30 Hitachi Ltd 動作履歴出力機能を有する情報処理装置
WO1994022544A1 (en) 1993-03-26 1994-10-13 Namco Ltd. Video game machine
JPH0775689A (ja) 1993-06-16 1995-03-20 Namco Ltd ビデオゲーム装置
US5415549A (en) 1991-03-21 1995-05-16 Atari Games Corporation Method for coloring a polygon on a video display
WO1995035140A1 (fr) 1994-06-20 1995-12-28 Sega Enterprises Ltd. Procede et dispositif de commande du sens de deplacement d'un objet
JPH08117440A (ja) 1994-10-24 1996-05-14 Namco Ltd 3次元シミュレータ装置及び画像合成方法
US5588914A (en) * 1994-06-28 1996-12-31 The Walt Disney Company Method and system for guiding a user in a virtual reality presentation
US5734807A (en) * 1994-07-21 1998-03-31 Kabushiki Kaisha Sega Enterprises Image processing devices and methods
US5808614A (en) * 1995-06-16 1998-09-15 Sony Corporation Device and method for displaying a guide picture in virtual reality
US5830066A (en) * 1995-05-19 1998-11-03 Kabushiki Kaisha Sega Enterprises Image processing device, image processing method, and game device and storage medium using the same
US5920304A (en) * 1997-02-18 1999-07-06 International Business Machines Corporation Random bounce cursor mode after cessation of user input
US5952995A (en) * 1997-02-10 1999-09-14 International Business Machines Corporation Scroll indicating cursor
US5995102A (en) * 1997-06-25 1999-11-30 Comet Systems, Inc. Server system and method for modifying a cursor image
US6144378A (en) * 1997-02-11 2000-11-07 Microsoft Corporation Symbol entry system and methods
US6166718A (en) * 1996-06-18 2000-12-26 Konami Co., Ltd. Video game system with vertical array of cursor images

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3154418B2 (ja) * 1990-06-21 2001-04-09 キヤノン株式会社 半導体光増幅装置、光通信システム、双方向光通信システム、光通信ネットワーク、及び集積型光ノード
JP3089251B2 (ja) 1990-09-03 2000-09-18 株式会社電脳商会 マルチメディア情報処理方法
JP3352475B2 (ja) 1992-10-13 2002-12-03 有限会社ジーティービー 画像表示装置
US5484926A (en) * 1993-10-07 1996-01-16 Agouron Pharmaceuticals, Inc. HIV protease inhibitors
JP2729137B2 (ja) 1993-01-20 1998-03-18 株式会社ナムコ シューティング型ゲーム装置
US5463722A (en) * 1993-07-23 1995-10-31 Apple Computer, Inc. Automatic alignment of objects in two-dimensional and three-dimensional display space using an alignment field gradient
JP3421746B2 (ja) * 1993-12-21 2003-06-30 株式会社セガ ゲーム機における球技のパス先選択方法
JP3544268B2 (ja) * 1995-10-09 2004-07-21 任天堂株式会社 三次元画像処理装置およびそれを用いた画像処理方法
US6022274A (en) * 1995-11-22 2000-02-08 Nintendo Co., Ltd. Video game system using memory module
US6139433A (en) * 1995-11-22 2000-10-31 Nintendo Co., Ltd. Video game system and method with enhanced three-dimensional character and background control due to environmental conditions
US5909218A (en) * 1996-04-25 1999-06-01 Matsushita Electric Industrial Co., Ltd. Transmitter-receiver of three-dimensional skeleton structure motions and method thereof
US5769718A (en) * 1996-05-15 1998-06-23 Rieder; William R. Video game apparatus and medium readable by a computer stored with video game program
US6535215B1 (en) * 1999-08-06 2003-03-18 Vcom3D, Incorporated Method for animating 3-D computer generated characters
US6622500B1 (en) * 2002-05-08 2003-09-23 Delphi Technologies, Inc. Energy-efficient capacity control method for an air conditioning compressor
US7129951B2 (en) * 2004-05-06 2006-10-31 Valve Corporation Method and system for performing speculative collisions for a video game

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4600200A (en) 1982-01-14 1986-07-15 Ikegami Tsushinki Co., Ltd. Three-dimensional image display system
US5261820A (en) * 1990-12-21 1993-11-16 Dynamix, Inc. Computer simulation playback method and simulation
US5415549A (en) 1991-03-21 1995-05-16 Atari Games Corporation Method for coloring a polygon on a video display
WO1994008309A1 (en) 1992-09-30 1994-04-14 Marshall Paul S Virtual reality generator for use with financial information
JPH06274577A (ja) 1993-03-19 1994-09-30 Hitachi Ltd 動作履歴出力機能を有する情報処理装置
WO1994022544A1 (en) 1993-03-26 1994-10-13 Namco Ltd. Video game machine
US5704837A (en) 1993-03-26 1998-01-06 Namco Ltd. Video game steering system causing translation, rotation and curvilinear motion on the object
JPH0775689A (ja) 1993-06-16 1995-03-20 Namco Ltd ビデオゲーム装置
WO1995035140A1 (fr) 1994-06-20 1995-12-28 Sega Enterprises Ltd. Procede et dispositif de commande du sens de deplacement d'un objet
EP0714685A1 (de) 1994-06-20 1996-06-05 Sega Enterprises, Ltd. Verfahren und vorrichtung zur richtungssteuerung eines objektes
US5588914A (en) * 1994-06-28 1996-12-31 The Walt Disney Company Method and system for guiding a user in a virtual reality presentation
US5734807A (en) * 1994-07-21 1998-03-31 Kabushiki Kaisha Sega Enterprises Image processing devices and methods
JPH08117440A (ja) 1994-10-24 1996-05-14 Namco Ltd 3次元シミュレータ装置及び画像合成方法
US5830066A (en) * 1995-05-19 1998-11-03 Kabushiki Kaisha Sega Enterprises Image processing device, image processing method, and game device and storage medium using the same
US5808614A (en) * 1995-06-16 1998-09-15 Sony Corporation Device and method for displaying a guide picture in virtual reality
US6166718A (en) * 1996-06-18 2000-12-26 Konami Co., Ltd. Video game system with vertical array of cursor images
US5952995A (en) * 1997-02-10 1999-09-14 International Business Machines Corporation Scroll indicating cursor
US6144378A (en) * 1997-02-11 2000-11-07 Microsoft Corporation Symbol entry system and methods
US5920304A (en) * 1997-02-18 1999-07-06 International Business Machines Corporation Random bounce cursor mode after cessation of user input
US5995102A (en) * 1997-06-25 1999-11-30 Comet Systems, Inc. Server system and method for modifying a cursor image

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Adventure Review, Daggerfall Review website by Andy Backer, pp. 1-5. Printed off the Internet at http://www.cdmag.com/adventure_vault/daggerfall_review, Dec. 1996.* *
Game Revolution, Daggerfall Review website by George la Tourette, Jr. pp. 1-4. Printed off the Internet at: http://www.game-revolution.com/games/pc/daggerfall.htm, Nov. 1996.* *
Ollinger, J., "Microsoft Space Simulator by BAO," Games Bytes Magazine, Online! (1994).

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7870479B2 (en) * 1999-07-29 2011-01-11 Electronic Arts Inc. Electronic in-application postcards
US20090313341A1 (en) * 1999-07-29 2009-12-17 Electronic Arts Inc. Electronic in-application postcards
US20020193161A1 (en) * 2000-01-25 2002-12-19 Katsuhiro Ishii Game system, program and image generation method
US6932705B2 (en) * 2001-03-29 2005-08-23 Square Enix Co., Ltd. Video game with sub-display for tracking target
US20020187832A1 (en) * 2001-06-06 2002-12-12 Konami Corporation Game system and game program
US6860813B2 (en) * 2001-06-06 2005-03-01 Konami Corporation Game ethod and system for indicating input device orientation relative to game space
US6860807B2 (en) * 2001-06-29 2005-03-01 Kabushiki Kaisha Square Enix Video game with distinctive attributes for enemy characters, predetermined characters, and candidate characters
US20030003978A1 (en) * 2001-06-29 2003-01-02 Square Co., Ltd. Video game with distinctive attributes for enemy characters, predetermined characters, and candidate characters
US20060246974A1 (en) * 2004-12-21 2006-11-02 Jumpei Tsuda Program for controlling the movement of group of characters, recorded medium, and game device thereof
US20060246968A1 (en) * 2005-04-28 2006-11-02 Nintendo Co., Ltd. Storage medium having game program stored therein and game apparatus
US7585224B2 (en) * 2005-04-28 2009-09-08 Nintendo Co., Ltd. Storage medium having game program stored therein and game apparatus
US20070270215A1 (en) * 2006-05-08 2007-11-22 Shigeru Miyamoto Method and apparatus for enhanced virtual camera control within 3d video games or other computer graphics presentations providing intelligent automatic 3d-assist for third person viewpoints
US9327191B2 (en) 2006-05-08 2016-05-03 Nintendo Co., Ltd. Method and apparatus for enhanced virtual camera control within 3D video games or other computer graphics presentations providing intelligent automatic 3D-assist for third person viewpoints
US20080070684A1 (en) * 2006-09-14 2008-03-20 Mark Haigh-Hutchinson Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting
US8277316B2 (en) * 2006-09-14 2012-10-02 Nintendo Co., Ltd. Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting
US9339724B2 (en) 2006-09-14 2016-05-17 Nintendo Co., Ltd. Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting
US9789391B2 (en) 2006-09-14 2017-10-17 Nintendo Co., Ltd. Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting
US10697996B2 (en) 2006-09-26 2020-06-30 Nintendo Co., Ltd. Accelerometer sensing and object control
US20080125224A1 (en) * 2006-09-26 2008-05-29 Pollatsek David Method and apparatus for controlling simulated in flight realistic and non realistic object effects by sensing rotation of a hand-held controller
US20090181736A1 (en) * 2007-08-17 2009-07-16 Nintendo Of America Inc. System and method for lock on target tracking with free targeting capability
US8834245B2 (en) 2007-08-17 2014-09-16 Nintendo Co., Ltd. System and method for lock on target tracking with free targeting capability
US8267782B2 (en) 2008-02-15 2012-09-18 Sony Computer Entertainment Inc. Game device, game control method, and game control program
US20100216550A1 (en) * 2008-02-15 2010-08-26 Sony Computer Entertainment Inc. Game device, game control method, and game control program
US8251817B2 (en) 2008-02-18 2012-08-28 Sony Computer Entertainment Inc. Game device, game control method, and game control program
US20100240457A1 (en) * 2008-02-18 2010-09-23 Sony Computer Entertainment Inc. Game device, game control method, and game control program
WO2009128899A1 (en) * 2008-04-14 2009-10-22 Google Inc. Swoop navigation
US20090259976A1 (en) * 2008-04-14 2009-10-15 Google Inc. Swoop Navigation
US20110300948A1 (en) * 2008-12-16 2011-12-08 Masashi Takehiro Game device, game processing method, information recording medium, and program
US8444488B2 (en) * 2009-08-05 2013-05-21 Ncsoft Corporation Device and method for controlling the movement of a game character
US20120190451A1 (en) * 2009-08-05 2012-07-26 Ncsoft Corporation Device and method for controlling the movement of a game character
US20150024843A1 (en) * 2011-06-14 2015-01-22 Nintendo Co., Ltd. Methods and/or systems for designing virtual environments
US9814983B2 (en) * 2011-06-14 2017-11-14 Nintendo Co., Ltd Methods and/or systems for designing virtual environments
US8626434B1 (en) 2012-03-01 2014-01-07 Google Inc. Automatic adjustment of a camera view for a three-dimensional navigation system
US10140765B2 (en) 2013-02-25 2018-11-27 Google Llc Staged camera traversal for three dimensional environment
US10525354B2 (en) 2016-06-10 2020-01-07 Nintendo Co., Ltd. Game apparatus, game controlling method and storage medium for determining a terrain based on a distribution of collision positions
US11498004B2 (en) * 2020-06-23 2022-11-15 Nintendo Co., Ltd. Computer-readable non-transitory storage medium having instructions stored therein, game apparatus, game system, and game processing method

Also Published As

Publication number Publication date
US7573479B2 (en) 2009-08-11
EP1498844A2 (de) 2005-01-19
US20080074425A1 (en) 2008-03-27
CN1194705A (zh) 1998-09-30
KR19990036168A (ko) 1999-05-25
EP1498844B1 (de) 2009-09-30
JP3706393B2 (ja) 2005-10-12
CN1134748C (zh) 2004-01-14
US20030119587A1 (en) 2003-06-26
EP0844587B1 (de) 2004-11-24
EP2096600A3 (de) 2009-12-09
WO1997046970A1 (fr) 1997-12-11
DE69731715T2 (de) 2005-11-03
EP2096600A2 (de) 2009-09-02
KR20040097392A (ko) 2004-11-17
DE69731715D1 (de) 2004-12-30
KR100524327B1 (ko) 2005-10-28
TW346612B (en) 1998-12-01
ES2332271T3 (es) 2010-02-01
EP0844587A4 (de) 2002-04-03
EP1498844A3 (de) 2005-09-21
DE69739609D1 (de) 2009-11-12
EP0844587A1 (de) 1998-05-27
HK1072825A1 (en) 2005-09-09
KR100592456B1 (ko) 2006-08-30

Similar Documents

Publication Publication Date Title
US6500069B1 (en) Image processor, image processing method, game machine and recording medium
EP0841640B1 (de) Bildverarbeitungsgerät, spielmaschine mit diesem bildverarbeitungsgerät, bildverarbeitungsverfarhen und -medium
KR100300832B1 (ko) 화상처리장치및화상처리방법
JP3859084B2 (ja) 画像処理装置、画像処理方法及びこれを用いたゲーム装置並びに記憶媒体
US6949024B2 (en) Image processing apparatus for a game, method and program for image processing for a game, and computer-readable medium
US8259112B2 (en) Image generating apparatus, method of generating image, program, and recording medium
US6724385B2 (en) Method of replaying game, recording medium, program, and entertainment system
EP1132122A2 (de) Verfahren zur Wiederholung eines Spiels, Aufzeichnungsmedium, Programm und Unterhaltungssystem
US6878058B1 (en) Image processor and game device with image processor
JP2955989B2 (ja) ゲーム装置
JP3968586B2 (ja) ゲーム装置、画像処理方法、及び記録媒体
JP3951246B2 (ja) ゲーム装置
JP3763220B2 (ja) ゲーム装置

Legal Events

Date Code Title Description
AS Assignment

Owner name: KABUSHIKI KAISHA SEGA ENTERPRISES, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:OHBA, NORIYOSHI;ONO, KENICHI;REEL/FRAME:009041/0491

Effective date: 19980109

CC Certificate of correction
FPAY Fee payment

Year of fee payment: 4

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20101231