MXPA99010643A - Video game device and information storage medium for video game - Google Patents

Video game device and information storage medium for video game

Info

Publication number
MXPA99010643A
MXPA/A/1999/010643A MX9910643A MXPA99010643A
Authority
MX
Mexico
Prior art keywords
program
data
code
player object
sound
Prior art date
Application number
MXPA/A/1999/010643A
Other languages
Spanish (es)
Inventor
Miyamoto Shigeru
Koizumi Yoshiaki
Yamada Yoichi
Iwawaki Toshio
Original Assignee
Nintendo Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nintendo Co Ltd
Publication of MXPA99010643A


Abstract

A video game apparatus is provided that includes a CPU which detects a program control code included in a terrain object in the vicinity of a player object and then determines the class of the terrain object. If the terrain object is a "hole", the CPU executes a hole operation subroutine. Similarly, if the terrain object is a "wall", "door" or "ladder", the CPU performs a "wall operation", "door operation" or "ladder operation" subroutine, respectively.

Description

VIDEO GAME DEVICE AND INFORMATION STORAGE MEDIUM FOR VIDEO GAME FIELD OF THE INVENTION This invention relates to a video game apparatus and to a game program memory medium for the same, and more particularly to a video game apparatus which generates, and supplies to a screen, an image signal for displaying a player object that exists on a terrain object in a virtual three-dimensional space on the basis of, for example, player object data and terrain object data, and to a game program memory medium to be used by the same.
DESCRIPTION OF THE PRIOR ART In a conventional video game machine, when a player wishes a player object, for example, to jump, the player presses a jump button on a controller so that the CPU causes the player object to jump in response to the operation of the jump button. That is, when the player object is to be made to jump over an obstacle, such as a recess or hole, the player is required to press the jump button with proper timing in front of the recess or hole while manipulating a movement direction instruction means such as a joystick or a cross button. However, there may be cases in which the player object does not succeed in jumping over the obstacle, because the timing of pressing the jump button, or the position of the player object at the time the jump button is operated, is off. That is, skillful operation of the jump button has been required to make the player object jump over an obstacle. Meanwhile, complicated button operations have been needed to cause the player object to perform actions other than jumping, for example, opening and closing a door, ascending stairs, and so on. The player may find it difficult to enjoy the progress of the game because his attention is directed to the manipulation of buttons. Such games, called action games, become harder to play year after year and are too difficult for the player. In particular, there is a tendency for beginners to avoid games of this kind.
BRIEF DESCRIPTION OF THE INVENTION Therefore, it is a primary object of the present invention to provide a novel video game apparatus and a game program memory means to be used by it. Another object of the present invention is to provide a novel video game apparatus with which it is easy for a player to make the player object act, and a game program memory means to be used therein. A further object of the present invention is to provide a video game apparatus with which a player object can overcome obstacles without difficulty, and a game program memory means to be used therein. Another object of the present invention is to provide a video game apparatus in which it is possible for a player object, a virtual camera or the like to automatically perform a required operation such as jumping, switching cameras or the like, and a game program memory means to be used therein. Another object of the present invention is to provide a video game apparatus which can carry out complicated control with a simple program, and a game program memory means to be used therein.
A video game apparatus according to the present invention is for generating, and supplying to a screen, an image signal for displaying a player object that exists on a terrain object in a virtual three-dimensional space by processing image data for the player object and the terrain object according to a program. The video game apparatus comprises: a player object image data generating means for generating player object image data to display a player object; and a terrain object image data generating means for generating terrain object image data to display a terrain object; wherein the terrain object image data includes a program control code. The video game apparatus further comprises a program control code detecting means for detecting the program control code in relation to a position of the player object, and an image changing means for causing the image signal to change based on the detected program control code. The program control code includes an action code for controlling an action of the player object, and the image changing means includes an animation data transmission means for automatically transmitting animation data to cause the player object to perform an action in accordance with the action code.
Specifically, when the terrain object is a recess or hole and the action code is "jump", the animation data transmission means transmits animation data to cause the player object to perform a jump action over the recess or hole. In one embodiment, the video game apparatus has a controller, in association therewith, which includes a direction instruction means for instructing a direction of movement of the player object so that the player object moves in the instructed direction. The video game apparatus further comprises: a movement speed detecting means for detecting the movement speed of the player object, and a jump distance calculating means for calculating a jump distance of the player object based on the movement speed; the animation data transmission means transmits animation data to cause the player object to perform a jump action according to the jump distance. When the terrain object is a wall surface and the action code is "climb", the animation data transmission means transmits animation data so that the player object performs an action of climbing the wall surface. However, when the action code is not "climb", a wall surface height calculating means is also used to calculate the height of the wall surface, and the animation data transmission means transmits animation data so that the player object performs an optimal action according to the height of the wall surface. In one embodiment of the present invention, the program control code includes a camera control code, and the image changing means includes a camera control means for controlling a virtual camera that is provided in the virtual three-dimensional space. Incidentally, when the virtual camera includes a plurality of virtual cameras, the camera control code includes a camera switching code, and the camera control means includes a camera switching means for switching between the plurality of virtual cameras on the basis of the camera switching code. When the program control code includes a sound code, the video game apparatus further comprises a sound data generating means for generating sound data, and a sound control means for controlling the sound that is output from the sound data generating means, based on the sound code. When the sound data generating means can generate sound data for a plurality of sounds, the sound code includes a sound change code and the sound control means includes a sound changing means to change the sound data depending on the sound change code. Incidentally, it is also possible to control only the sound according to a program control code.
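As an illustration of the relationship just described between the movement speed, the jump distance and the resulting jump, the following is a minimal sketch in C. All names and constants (jump_distance, JumpOutcome, the scaling factor, the edge-grab threshold) are assumptions introduced for illustration only and do not appear in the patent.

    /* Minimal sketch: selecting a jump outcome from the player object's
       movement speed and the width of the recess or hole ahead.
       The scaling factor and thresholds are assumed values.             */
    typedef enum {
        JUMP_CLEAR,       /* clears the hole (large jump)                 */
        JUMP_GRAB_EDGE,   /* hangs onto the opposite edge (medium jump)   */
        JUMP_FALL         /* falls into the hole (small jump)             */
    } JumpOutcome;

    /* Jump distance assumed proportional to the current movement speed. */
    static float jump_distance(float move_speed)
    {
        const float scale = 4.0f;      /* assumed virtual-cm per speed unit */
        return scale * move_speed;
    }

    static JumpOutcome choose_jump(float move_speed, float hole_width)
    {
        float d = jump_distance(move_speed);
        if (d >= hole_width)
            return JUMP_CLEAR;
        if (d >= 0.75f * hole_width)   /* close enough to catch the edge    */
            return JUMP_GRAB_EDGE;
        return JUMP_FALL;
    }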
In this case, a video game apparatus for generating and supplying to a screen an image signal for displaying a player object that exists on a terrain object in a virtual three-dimensional space, by processing image data for the player object and the terrain object according to a program, and which also supplies a sound signal to a sound output means by processing sound data according to a program, comprises: a player object image data generating means for generating player object image data to display a player object; and a terrain object image data generating means for generating terrain object image data to display a terrain object; wherein the terrain object image data includes a program control code; the video game apparatus further comprises a program control code detecting means for detecting the program control code in relation to a position of the player object, and a sound changing means for causing the sound signal to change according to the detected program control code.
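A corresponding sketch of the sound-only control described in this paragraph is given below. The sound identifiers, the code values and the helper name are hypothetical; the sketch merely shows the sound signal being changed when the detected program control code carries a sound change code.

    /* Minimal sketch: changing the background melody when the terrain
       object under the player object carries a sound change code.
       Sound identifiers and code values are assumed.                    */
    typedef enum { SOUND_BGM_1, SOUND_BGM_2 } SoundId;

    static SoundId current_sound = SOUND_BGM_1;

    /* Called after the program control code of the nearby terrain object
       has been detected; 0 means "no sound change code present".        */
    static void apply_sound_code(int sound_change_code)
    {
        if (sound_change_code == 1 && current_sound != SOUND_BGM_1)
            current_sound = SOUND_BGM_1;   /* switch back to BGM 1 */
        else if (sound_change_code == 2 && current_sound != SOUND_BGM_2)
            current_sound = SOUND_BGM_2;   /* switch to BGM 2      */
        /* any other value: keep reproducing the current sound data */
    }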
In addition, the video game apparatus generally uses a memory means for pre-memorizing a game program and image data. A memory medium according to the present invention is applicable to a video game apparatus for generating and supplying to a screen an image signal for displaying a player object that exists on a terrain object in a virtual three-dimensional space, by processing image data for the player object and the terrain object according to a program, and is memorized with a program to be processed by an information processing means included in the video game apparatus. The memory means comprises: a player object image data generating program for generating player object image data to display a player object; and a terrain object image data generating program for generating terrain object image data to display a terrain object; wherein the terrain object image data includes a program control code; and further comprises a program control code detecting program for detecting the program control code in relation to a position of the player object, and an image changing program for causing the image signal to change based on the detected program control code. The game program memory medium is formed with an image data area. The image data area is memorized with player object data and terrain object data. The player object data includes polygon data to represent a shape and animation data to represent an action state. The terrain object data includes polygon data to represent a shape, and attribute data. The attribute data include a program control code that includes an action code, a camera code and a sound code. The game memory means further includes a program for processing the image data. The video game apparatus advances the game according to the image data and the programs, taking controller data into consideration as required. In response, a game image having a player object that exists on a terrain object in a virtual three-dimensional space is displayed on a display screen. When the player object approaches or exists on a relevant terrain object, a program control code contained in the terrain object image data is detected by the detecting means. Consequently, the image changing means changes the image, in a manner different from the usual program execution, and controls an action of the player object or a virtual camera. When an action code is contained in the terrain object data representing a terrain object on which or in the vicinity of which a player object exists, the action code is detected by an action code detecting means (action code detecting program). The animation data transmission means (animation data transmission program) transmits animation data in a manner that causes the player object to perform an action in accordance with the detected action code. It is therefore possible for the player object to automatically perform an optimal action in accordance with the detected action code. Specifically, when the terrain object is a recess or hole and the action code is "jump", the animation data transmission means (animation data transmission program) transmits animation data to cause the player object to perform an action of jumping over the recess or hole.
When the player object is moved in accordance with a direction instruction means of the controller, the movement speed detecting means (movement speed detection program) detects a movement speed of the player object while the jump distance calculating means (jump distance calculation program) calculates a jump distance of the player object. In this case, the animation data transmission means (animation data transmission program) transmits animation data to cause the player object to perform a jump action depending on the jump distance. When the terrain object is a wall surface and the action code is "climb", the animation data transmission means (animation data transmission program) transmits animation data to cause the player object to climb the wall surface. If a wall surface height is calculated by a wall surface height calculating means (wall height calculation program), it is determined within which of the intervals 0 < H ≤ 25, 25 < H ≤ 50, 50 < H ≤ 100 or 100 < H ≤ 150 the wall surface height (H) lies. The animation data transmission means (animation data transmission program) transmits animation data to cause an optimal action according to the wall surface height. In addition, when the program control code is a camera code, for example, one of a plurality of virtual cameras arranged appropriately in the virtual three-dimensional space is selectively activated by the camera code (camera switching code). Also, when the program control code is a sound code, for example, the sound data to be produced by the sound data generating means is switched.
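The four wall-surface height intervals mentioned above can be pictured as a simple classification, as in the following C sketch. The interval boundaries come from the text; the particular action assigned to each interval and every identifier are assumptions made only for illustration.

    /* Minimal sketch: choosing a wall action from the calculated wall
       surface height H (virtual cm).  The interval limits follow the
       text; the action assigned to each interval is an assumption.      */
    typedef enum {
        WALL_STEP_UP,      /* 0   < H <= 25  : very low wall              */
        WALL_HOP_UP,       /* 25  < H <= 50                               */
        WALL_CLIMB_LIGHT,  /* 50  < H <= 100                              */
        WALL_CLIMB_HARD,   /* 100 < H <= 150                              */
        WALL_TOO_HIGH      /* H > 150 : no climbing action                */
    } WallAction;

    static WallAction wall_action_for_height(float h)
    {
        if (h > 0.0f   && h <= 25.0f)  return WALL_STEP_UP;
        if (h > 25.0f  && h <= 50.0f)  return WALL_HOP_UP;
        if (h > 50.0f  && h <= 100.0f) return WALL_CLIMB_LIGHT;
        if (h > 100.0f && h <= 150.0f) return WALL_CLIMB_HARD;
        return WALL_TOO_HIGH;
    }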
According to the present invention, the required control can be performed automatically in accordance with a control code contained in the terrain object image data, including, for example, control of the player object's action, image change control such as switching the virtual camera, and sound control such as changing the sound data. Even in a case where the player object or the camera is to be controlled in a complicated manner according to the position at which the player object exists, it is easy to design a program. For example, if the program control code is an action code, it is very easy for a player to manipulate a player object, as sketched below. If the action code is "jump", the player object automatically jumps. Therefore, the player object can easily cross such an obstacle as a recess or gap. If the action code is "climb", the player object can automatically climb a wall surface. When the action code is "door", a door is automatically opened and the player object is allowed to enter through the door. In addition, if the action code is "ladder", the player object automatically climbs a ladder. The objects described above and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken together with the accompanying drawings.
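A compact way to picture the automatic control summarized above is a dispatch on the detected action code, as in the sketch below. The enumeration values and the handler names (do_hole_action and so on) are placeholders standing for the subroutines described later in this document, not the patent's actual routine names.

    /* Minimal sketch: dispatching on the action code detected in the
       terrain object on which or near which the player object exists.   */
    enum ActionCode { ACT_NONE, ACT_JUMP, ACT_CLIMB, ACT_DOOR, ACT_LADDER };

    /* Placeholders for the subroutines of Figures 10, 15, 22 and 24. */
    void do_hole_action(void);
    void do_wall_action(void);
    void do_door_action(void);
    void do_ladder_action(void);

    void handle_action_code(enum ActionCode code)
    {
        switch (code) {
        case ACT_JUMP:   do_hole_action();   break; /* jump over a recess or hole */
        case ACT_CLIMB:  do_wall_action();   break; /* climb the wall surface     */
        case ACT_DOOR:   do_door_action();   break; /* open the door and enter    */
        case ACT_LADDER: do_ladder_action(); break; /* climb the ladder           */
        default:                             break; /* no code: usual processing  */
        }
    }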
BRIEF DESCRIPTION OF THE DRAWINGS Figure 1 is a schematic illustrative view showing a video game system of one embodiment of this invention; Figure 2 is a block diagram showing in detail a video game machine of the system of Figure 1; Figure 3 is a block diagram showing in detail a controller control circuit of the video game machine of Figure 2; Figure 4 is a block diagram showing in detail a controller and a controller pack for the video game machine of Figure 2; Figure 5 is an illustrative view showing a memory map of an external ROM for the video game machine of Figure 2; Figure 6 is an illustrative view showing a memory map of a RAM of the video game machine of Figure 2; Figure 7 is a flow chart showing the overall operation of the embodiment of Figure 1; Figure 8 is a flow chart showing in detail the terrain object process in the flow chart of Figure 7; Figure 9 is a flow chart showing in detail a part of the action determination process of the flow chart of Figure 7; Figure 10 is a flow chart showing in detail an action determination process for the case of a hole in the flow chart of Figure 9; Figure 11 is an illustrative view showing an example of a jump action (large jump) obtained in the flow chart of Figure 10; Figure 12 is an illustrative view showing an example of a jump action (medium jump) obtained in the flow chart of Figure 10; Figure 13 is an illustrative view showing an example of a jump action (small jump) obtained in the flow chart of Figure 10; Figure 14 is an illustrative view showing an example of an action (not falling) in the flow chart of Figure 10; Figure 15 is a flow chart showing in detail an action determination process for the case of a wall surface in the flow chart of Figure 9; Figure 16 is an illustrative view showing an example of a wall climbing action obtained by the flow chart of Figure 15; Figure 17 is an illustrative view showing an example of a step-up action obtained by the flow chart of Figure 15; Figure 18 is an illustrative view showing an upward jump action obtained by the flow chart of Figure 15; Figure 19 is an illustrative view showing an example of a light climbing action obtained by the flow chart of Figure 15; Figure 20 is an illustrative view showing an example of a usual climbing action obtained by the flow chart of Figure 15; Figure 21 is an illustrative view showing an example of a difficult climbing action obtained by the flow chart of Figure 15; Figure 22 is a flow chart showing in detail a door action in the flow chart of Figure 9; Figure 23 is an illustrative view showing an example of a door action carried out by the flow chart of Figure 22; Figure 24 is a flow chart showing in detail a ladder climbing action in the flow chart of Figure 9; Figure 25 is an illustrative view showing an example of a ladder climbing action obtained by the flow chart of Figure 24; Figure 26 is a flow chart showing in detail a player object process in the flow chart of Figure 7; Figure 27 is a flow chart showing in detail a camera determination process in the flow chart of Figure 7; Figure 28 is an illustrative view showing an example of a camera arrangement as a premise for the camera determination process of the flow chart of Figure 27; Figure 29 is a flow chart showing in detail a first camera control program in the flow chart of Figure
27; Figure 30 is an illustrative view showing a player object taken by the first camera according to the flow chart of Figure 29; Figure 31 is a flow chart showing in detail a second camera control program (fifth camera) in the flow chart of Figure 27; Figure 32 is a flow chart showing in detail a third camera control program of the flow chart of Figure 27; Figure 33 is an illustrative view showing a player object taken by a third camera according to the flow chart of Figure 32; Figure 34 is a flow chart showing in detail a control program of a fourth camera in the flow chart of Figure 27; Figure 35 is an illustrative view showing a player object taken by a fourth camera according to the flow chart of Figure 34; Figure 36 is an illustrative view showing a player object taken by the fourth camera according to the flow chart of Figure 32; and Figure 37 is a flow chart showing in detail a sound process in the flow chart of Figure 7.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS With reference to Figure 1, a video game apparatus in this embodiment includes a video game machine 10, a ROM cartridge 20 as an example of an information memory medium, a display unit 30 connected to the video game machine 10, and a controller 40. The controller 40 is removably mounted with a controller pack 50. The controller 40 is structured with a plurality of switches or buttons provided on a housing 41 shaped so that it can be held with both hands or one hand. Specifically, the controller 40 includes handles 41L, 41C, 41R extending downward respectively from the left end, the right end and the center portion of the housing 41, providing an operating area on a top surface of the housing 41. In the operating area, an analog-operable control lever (hereinafter referred to as "analog control lever") 45 is provided in a central lower portion thereof, a cross-shaped digital direction switch (hereinafter referred to as "cross switch") 46 on the left side, and a plurality of button switches 47A, 47B, 47C, 47D, 47E and 47F on the right side. The analog control lever 45 is used to input a movement direction and/or a movement speed or amount of movement of the player object (an object operated by the player through the controller), as determined by the amount and direction of inclination of the lever. The cross switch 46 is used to designate a direction of movement of the player object, instead of the control lever 45. The button switches 47A and 47B are used to designate a movement of the player object. The button switches 47C-47F are used to switch the viewpoint of the three-dimensional image camera or to adjust the speed or the like of the player object. A start switch 47S is provided almost in the center of the operating area. This start switch 47S is operated when a game is started. A switch 47Z is provided on the back side of the central handle 41C. This switch 47Z is used, for example, as a trigger switch in a shooting game. Switches 47L and 47R are provided on the upper left and right sides of a side surface of the housing 41. Incidentally, the button switches 47C-47F mentioned above can also be used to control the movement and/or movement speed (for example, acceleration or deceleration) of the player object in a shooting or action game, in addition to the purpose of changing the viewpoint of the camera. However, the functions of these switches 47A-47F, 47S, 47Z, 47L and 47R can be defined arbitrarily based on the game program. Figure 2 is a block diagram of the video game system of Figure 1. The video game machine 10 incorporates therein a central processing unit (hereinafter referred to as "CPU") 11 and a coprocessor (reality coprocessor; hereinafter referred to as "RCP") 12. The RCP 12 includes a bus control circuit 121 for controlling buses, a signal processor (reality signal processor; hereinafter referred to as "RSP") 122 for performing polygon coordinate transformation, shading treatment and so on, and a display processor (reality display processor; hereinafter referred to as "RDP") 123 for rasterizing polygon data into an image to be displayed and converting it into data (dot data) that can be stored in a frame memory.
The RCP 12 is connected to a cartridge connector 13 for detachably mounting the ROM cartridge 20 having the external ROM 21 incorporated therein, to a disk drive connector 197 for detachably mounting a disk drive 29, and to the RAM 14. In addition, the RCP 12 is connected to DACs (digital/analog converters) 15 and 16 for respectively outputting a sound signal and a video signal to be processed by the CPU 11. Furthermore, the RCP 12 is connected to a controller control circuit 17 for serially transferring operating data to and from one or a plurality of controllers 40 and/or controller packs 50.
The bus control circuit 121 included in the RCP 12 performs parallel/serial conversion of an instruction supplied as a parallel signal from the CPU 11 by means of a bus, so as to supply a serial signal to the controller control circuit 17. In addition, the bus control circuit 121 converts a serial signal input from the controller control circuit 17 into a parallel signal, which it outputs to the CPU 11 via the bus. Data representative of an operating state (operation signals or operation data) read from the controllers 40A-40D is processed by the CPU 11, temporarily stored in the RAM 14, and so on. In other words, the RAM 14 includes a storage site for temporarily storing the data to be processed by the CPU 11, so that it is used for smooth reading and writing of data through the bus control circuit 121. The sound DAC 15 is connected to a connector 19a provided on the rear face of the video game machine 10. The video DAC 16 is connected to a connector 19b provided on the rear face of the video game machine 10. The connector 19a is connected to a loudspeaker 31 of a display 30, while the connector 19b is connected to a display 30 such as a TV receiver or CRT.
The controller control circuit 17 is connected to a controller connector 18 provided on the front face of the video game machine 10. The connector 18 is detachably connected with a controller 40 through a connecting jack. Connecting the controller 40 to the connector 18 places the controller in electrical connection with the video game machine 10, whereby transmission/reception or transfer of data between them is enabled. The controller control circuit 17 is used to transmit and receive serial data between the RCP 12 and the connector 18. The controller control circuit 17 includes, as shown in Figure 3, a data transfer control circuit 171, a transmitting circuit 172, a receiving circuit 173 and a RAM 174 for temporarily storing transmission and reception data. The data transfer control circuit 171 includes a parallel/serial conversion circuit and a serial/parallel conversion circuit in order to convert the data format during transfer, and further performs write/read control of the RAM 174. The serial/parallel conversion circuit converts serial data supplied from the RCP 12 into parallel data, supplying it to the RAM 174 or to the transmitting circuit 172. The parallel/serial conversion circuit converts parallel data supplied from the RAM 174 or the receiving circuit 173 into serial data, to be supplied to the RCP 12. The transmitting circuit 172 converts instructions for reading signals from the controller 40 and write data (parallel data) for the controller pack 50 into serial data to be supplied to channels CH1-CH4 corresponding to the respective controllers. The receiving circuit 173 receives, as serial data, operating state data of the controllers input through the corresponding channels CH1-CH4 and data read from the controller pack 50, and converts them into parallel data to be supplied to the data transfer control circuit 171. The data transfer control circuit 171 writes into the RAM 174 data transferred from the RCP 12, controller data received by the receiving circuit 173, or data read from the RAM of the controller pack 50, and reads data from the RAM 174 based on an instruction from the RCP 12 so as to transfer it to the RCP 12. The RAM 174, although not shown, includes memory sites for the respective channels CH1-CH4. Each of the memory sites is stored with an instruction for the channel, transmission data and/or reception data. Figure 4 is a detailed circuit diagram of the controller 40 and the controller pack 50. The housing of the controller 40 incorporates an operation signal processing circuit 44, etc. in order to detect an operating state of the control lever 45, the switches 46, 47, etc., and to transfer the detected data to the controller control circuit 17. The operation signal processing circuit 44 includes a receiving circuit 441, a control circuit 442, a switch signal detection circuit 443, a counter circuit 444, a control lever port control circuit 446, a reset circuit 447 and a NOR gate 448. The receiving circuit 441 converts a serial signal, such as a control signal transmitted from the controller control circuit 17 or data to be written to the controller pack 50, into a parallel signal to be supplied to the control circuit 442.
The control circuit 442 generates a reset signal to reset (to 0), through the NOR gate 448, the count values of an X-axis counter 444X and a Y-axis counter 444Y within the counter 444, when the control signal transmitted from the controller control circuit 17 is a signal for resetting the X, Y coordinates of the control lever 45. The control lever 45 includes X-axis and Y-axis photo-interrupters in order to decompose the lever inclination into X-axis and Y-axis components, generating pulses in an amount proportional to the inclination. The pulse signals are respectively supplied to the counter 444X and the counter 444Y. The counter 444X counts the number of pulses generated in response to the amount of tilt when the control lever 45 is tilted in the X-axis direction. The counter 444Y counts the number of pulses generated in response to the amount of tilt when the control lever 45 is tilted in the Y-axis direction. Consequently, the resultant X-axis and Y-axis vector determined by the count values of the counters 444X and 444Y serves to determine a direction of movement and a coordinate position of the player object or hero character, or of a cursor. Incidentally, the counters 444X and 444Y are reset when a reset signal is supplied from the reset signal generating circuit 447 upon power-on, or when a reset signal is supplied from the switch signal detection circuit 443 upon simultaneous depression of two predetermined switches. The switch signal detection circuit 443 responds to a switch state output instruction supplied at a constant period interval (e.g., a 1/30-second interval as a TV frame period) from the control circuit 442, to read a signal that varies depending on the state of depression of the cross switch 46 and the switches 47A-47Z. The read signal is supplied to the control circuit 442. The control circuit 442 responds to an operating state data read instruction from the controller control circuit 17 by supplying, in a predetermined data format, the operating state data of the switches 47A-47Z and the count values of the counters 444X and 444Y to the transmitting circuit 445. The transmitting circuit 445 converts the parallel signal output from the control circuit 442 into a serial signal, and transfers it to the controller control circuit 17 via a converting circuit 43 and a signal line 42. The control circuit 442 is connected to a control lever port control circuit 446 by means of an address bus and a data bus as well as a port connector 46. The control lever port control circuit 446 performs data input/output (or transmission/reception) control in accordance with instructions from the CPU 11 when the controller pack 50 is connected to the port connector 46. The controller pack 50 is structured by connecting a RAM 51 to the address bus and the data bus and by connecting the RAM 51 to a battery 52. The RAM 51 is for storing backup data relating to a game, and retains the backup data through the supply of power from the battery 52 even if the controller pack 50 is removed from the port connector 46.
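The X-axis and Y-axis count values described above can be combined into a single movement vector, as sketched below in C. The structure names and the conversion are illustrative assumptions; the patent only states that the resultant vector of the count values of the counters 444X and 444Y determines a movement direction and coordinate position.

    #include <math.h>

    /* Minimal sketch: converting the X-axis and Y-axis count values of
       counters 444X and 444Y into a movement direction and speed.       */
    typedef struct { int x_count; int y_count; } StickCounts;
    typedef struct { float direction_rad; float speed; } StickVector;

    static StickVector decode_stick(StickCounts c)
    {
        StickVector v;
        float x = (float)c.x_count;             /* proportional to X-axis tilt */
        float y = (float)c.y_count;             /* proportional to Y-axis tilt */
        v.direction_rad = atan2f(y, x);         /* direction of the resultant  */
        v.speed         = sqrtf(x * x + y * y); /* magnitude -> movement speed */
        return v;
    }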
Figure 5 is a memory map illustrating a memory space of the external ROM 21 built into the ROM cartridge 20 (Figure 1, Figure 2). The external ROM 21 includes a plurality of memory areas (hereinafter sometimes referred to simply as "areas"), i.e., a program area 22, an image data area 23 and a sound memory area 24, which are previously memorized and fixed with various programs. The program area 22 is memorized with a program necessary for processing game images, game data suited to the game content, etc. Specifically, the program area 22 includes memory areas 22a-22i for pre-memorizing and setting an operating program of the CPU 11. A main program area 22a is memorized with a processing program for the main routine of the game shown in Figure 7, etc., referred to below. A controller data determination program area 22b is memorized with a program for processing operation data of the controller 40. A terrain object program area 22c is memorized with a program for displaying and controlling a terrain object on which or in the vicinity of which a player object exists. A player object program area 22d is memorized with a program for displaying and controlling an object to be operated by the player (hereinafter referred to simply as "player object"). The program area 22 further includes a control code detection program area 22e. This area 22e is memorized with a program for detecting a control code contained in the terrain object image data (described below). A camera control program area 22f is memorized with a camera control program for controlling from which direction and/or position a moving object, including the player object, and the background objects are to be viewed in the three-dimensional space. In the embodiment, a plurality of virtual cameras are provided in the three-dimensional space. Accordingly, the camera control program area 22f includes a first camera control program, a second camera control program, ..., an Nth camera control program for individually controlling the first to Nth respective virtual cameras. An action control program area 22g is memorized with a program for reading animation data contained in the player object image data, in order to cause the player object to act in accordance with a control code detected by the control code detection program. The action control program, specifically, includes several calculation programs. The calculation programs include a movement speed detection program for detecting the movement speed of the player object, a jump distance calculation program for calculating a jump distance of the player object based on the movement speed, and a wall height calculation program for calculating the height of a wall. This action control program determines an action for the player object according to an action code, a control code or a calculation program, and reads the animation data from the image data area 23 on the basis of the determined action. Accordingly, the action control program 22g cooperates with the image data area 23 to constitute an animation data output program. An image buffer and Z buffer write program area 22h is stored with a write program by means of which the CPU 11 causes the RCP 12 to perform writing into an image buffer and a Z buffer. For example, the write program area 22h is memorized with a program for writing color data into a frame memory area (Figure 6) of the RAM, and a program for writing depth data into a Z
buffer area 204 (Figure 6), as image data based on texture data for a plurality of moving objects or background objects to be displayed in one display scene.
Incidentally, a sound process program area 22i is memorized with a program for generating messages through sound effects, melodies or voices. The image data area 23 includes, as shown in Figure 5, two memory areas 23a and 23b. The memory area 23a is memorized with image data, such as coordinate data and animation data of a plurality of polygons, on an object-by-object basis, in order to display a player object, and with a display control program for displaying an object in a predetermined fixed position or movably. The memory area 23b is memorized with image data, such as a plurality of polygon data and attribute data, on an object-by-object basis, for displaying a terrain object, and with a display control program for displaying the terrain object. The attribute data includes an action code representative of an action to be performed by the player object (e.g. jumping, climbing a wall, opening and closing a door, climbing a ladder, etc.), a class code representative of a class of terrain polygon (hole, ice, sand, lava, etc.), a melody code representative of the BGM class, an enemy code representative of whether or not an enemy exists and of what kind, and a camera code for instructing switching between cameras. These codes are collectively referred to as "control codes". The control codes are previously established within the polygon data of each polygon constituting the terrain objects, as required. Incidentally, the relevant terrain objects are considered to include a terrain object on which the player object exists, a terrain object in the vicinity of which the player object exists, and so on. A sound memory area 24 is memorized with sound data, such as phrases, sound effects and game melodies, for each scene, in order to output a message such as the above in a manner suitable for the relevant scene. Specifically, BGM 1 and BGM 2 are memorized as game melodies, and sound data such as "crying" as sound effects. Incidentally, the memory medium or external memory may use an arbitrary memory medium, such as a CD-ROM or a magnetic disk, in place of or in addition to the ROM cartridge 20. In such a case, a disk drive (not shown) must be provided in order to read or write, as required, the various data for a game (including program data and image display data) from or to the memory medium formed of the optical or magnetic disk, such as a CD-ROM or a magnetic disk. This disk drive reads the data memorized on the magnetic disk or optical disk, which has been magnetically or optically stored with program data similar to that of the external ROM 21, and transfers the data to the RAM 14. In this way, the program area 22 is provided with the program so that the game image signal can be created by processing the image data placed in the image data area 23 in a manner similar to a conventional video game apparatus, and a sound signal may be produced by processing the sound data placed in the sound memory area 24. In this embodiment, furthermore, a program control code is placed in the image data stored in the image data area 23, that is, in the image data of the terrain object. When a program control code is detected in dependence upon a position of the player object, the animation for the player object is varied, the virtual camera is switched and the sound signal is also changed in accordance with the detected program control code. Therefore, the program control code serves as a program control factor or a program change factor.
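One possible in-memory layout for the terrain object attribute data and its control codes described above is sketched below. The field names, types and widths are assumptions made for illustration; the patent only specifies that each terrain polygon carries attribute data containing an action code, a class code, a melody code, an enemy code and a camera code.

    #include <stdint.h>

    /* Hypothetical layout of the attribute data attached to each terrain
       polygon; names and widths are assumptions.                        */
    typedef struct {
        uint8_t class_code;   /* hole, ice, sand, lava, ...              */
        uint8_t action_code;  /* jump, climb, door, ladder, ...          */
        uint8_t melody_code;  /* which BGM class to play                 */
        uint8_t enemy_code;   /* whether/which enemy exists here         */
        uint8_t camera_code;  /* which virtual camera to switch to       */
    } TerrainAttributes;

    typedef struct {
        float             vertex[3][3];  /* polygon (triangle) coordinates */
        TerrainAttributes attr;          /* control codes for this polygon */
    } TerrainPolygon;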
Because of this, when a program control code is detected, the player object changes its animation or the camera is switched, and it is possible to provide an image change different from that obtained by the execution of a usual program. In addition, if the sound signal is changed when a program control code is detected, it is possible to cause a sound change different from that of the execution of a usual program. Incidentally, the control code is explained here in more detail. As mentioned above, the terrain object data includes attribute data, and the control code is included in the attribute data. The attribute data is a predetermined number of data bits representative of what the terrain object represents, i.e., a class of object such as a hole, floor, wall surface, stairs, grass or the like. Therefore, the CPU 11 can determine the class of a terrain object by detecting the attribute data. The control code is configured by one, two or more bits in the attribute data. The attribute data is included within each polygon constituting a terrain object. As a result, the control code is included in each polygon. The control code represents, by one, two or more bits, the control content of, for example, "jump", "climb", "enter through the door", "climb ladder", "change camera", "change sound", etc. Incidentally, in the above explanation, a class of terrain object is determined with reference to the attribute data. However, the method for detecting a terrain object may be as follows. For example, a terrain object on which the player object moves can be detected as a floor object, so that a terrain object provided at 90 degrees (vertically) with respect to the floor object is detected as a wall or wall surface object. In this case, a terrain object that exists above the player object will be detected as a ceiling object. That is, a class of terrain object can be determined by a relation of position, angle or the like with respect to the player object. In any case, a program control code (which includes a control code, an action code, a camera code, a sound code and so on) is placed in the attribute data. Figure 6 is a memory map illustrating the entire memory space of the RAM 14. The RAM 14 includes several memory areas 201-209. For example, the RAM 14 includes a display list area 201, a program area 202, a frame buffer area 203 (or image buffer) for temporarily storing one frame of image data, a Z buffer area 204 for storing, dot by dot, depth data for the frame memory area data, an image data area 205, a sound memory area 206, a controller data area 207 for storing controller operating state data, a working memory area 208 and a register/flag area 209. The memory areas 201-209 are memory spaces that are accessed through the bus control circuit 121 by the CPU 11 or directly by the RCP 12, and are assigned an arbitrary capacity (or memory space) for the game used. Meanwhile, the image data area 205 and the sound memory area 206 are for temporarily storing image data or sound data required to execute the program transferred to the program area 202, which program is a part of the game program for an entire game scene (scenario) memorized in the program area 22 of the ROM 21, for example the game program required for one course or stage.
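The alternative classification by position and angle described above could look roughly like the following sketch. The 45-degree threshold and the use of a height test for the ceiling case are assumptions, since the patent only gives the 0-degree (floor) and 90-degree (wall surface) relations and the "above the player" condition.

    /* Minimal sketch: classifying a terrain polygon as floor, wall or
       ceiling from its relation to the player object rather than from
       attribute data.  Thresholds are assumed values.                   */
    typedef enum { TERRAIN_FLOOR, TERRAIN_WALL, TERRAIN_CEILING } TerrainClass;

    static TerrainClass classify_terrain(float angle_to_floor_deg,
                                         float height_above_player)
    {
        if (height_above_player > 0.0f)   /* polygon lies above the player  */
            return TERRAIN_CEILING;
        if (angle_to_floor_deg < 45.0f)   /* roughly parallel: a floor      */
            return TERRAIN_FLOOR;
        return TERRAIN_WALL;              /* roughly vertical (about 90 deg)*/
    }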
In this way, if the program and part of the data required for a certain scene are memorized in the areas 202, 205, 206, it is possible to improve the efficiency of data processing and therefore the speed of image processing, compared with processing that reads directly from the ROM 21 whenever the CPU requires it. Specifically, the frame memory area 203 has a memory capacity corresponding to the number of picture elements (pixels or dots) of the screen 30 (Figure 1) multiplied by the number of bits of color data per pixel, to store dot-by-dot color data corresponding to the pixels on the screen 30. The frame memory area 203 temporarily memorizes dot-by-dot color data when displaying a moving object, such as a player object, a companion object, an enemy object, a boss object, etc., or various other objects such as a terrain object, a background (or stationary) object, etc., which are memorized in the image data area 205. The Z buffer area 204 has a memory capacity corresponding to the number of picture elements (pixels or dots) of the screen 30 multiplied by the number of depth data bits per pixel, to memorize dot-by-dot depth data corresponding to each pixel on the screen 30. The Z buffer area 204 temporarily stores the dot-by-dot depth data when displaying a moving and/or stationary object, i.e. a moving object such as a player object, a companion object, an enemy object, a boss object or the like, and various other objects such as a terrain object, a background (or stationary) object or the like, which are memorized in the image data area 205. The image data area 205 is for storing coordinate data and texture data of polygons, in a plurality of sets for each of the stationary and/or movable objects to be displayed, which are memorized in the ROM 21; the data for one course or stage, for example, is transferred from the ROM 21 in advance of its image processing. Incidentally, this image data area 205 also memorizes animation data that have been read, as required, from the image data area 23 of the external ROM 21. The sound memory area 206 receives a part of the sound data (phrase data, melody and sound effect data) memorized in the sound memory area of the ROM 21, and temporarily memorizes it as sound data to be reproduced through a sound producing unit 32. The controller data memory area 207 temporarily memorizes operating state data representative of an operating state read from the controller 40. The working memory area 208 temporarily memorizes data such as parameters during the execution of a program by the CPU 11. The register/flag area 209 includes the register area 209r and the flag area 209f. The register area 209r, although not shown, is comprised of a plurality of registers that are individually loaded with data. The flag area 209f, although not shown, is constituted by a plurality of flags that are set or reset separately. Figure 7 is a main flow chart of the video game system in this embodiment. When the power is turned on, in a first step S1, the CPU 11 sets the video game machine 10 to a predetermined initial state. For example, the CPU 11 transfers a start program, from among the game programs stored in the program area 22 of the external ROM 21, to the program area 202 of the RAM 14, sets the parameters to their initial values, and then sequentially executes the steps of Figure 7. The operation of the main flow chart of Figure 7 is carried out, for example, every frame (1/60 second) or every 2 or 3 frames.
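Before the individual steps are described, the per-frame main loop of Figure 7 can be summarized by the following sketch. Every helper function is a placeholder standing for one of the steps S1-S14; none of these names comes from the patent.

    /* Placeholders for the steps of the main flow chart of Figure 7. */
    void init_course(void);              /* step S1              */
    void read_controller(void);          /* step S2              */
    void process_terrain_objects(void);  /* step S3              */
    void determine_player_action(void);  /* step S4              */
    void process_player_object(void);    /* step S5              */
    void determine_camera(void);         /* step S6              */
    void process_camera(void);           /* step S7              */
    void render_scene(void);             /* step S8              */
    void process_sound(void);            /* step S9              */
    void output_frame_and_sound(void);   /* steps S10-S11        */
    int  course_cleared(void);           /* step S12             */
    int  game_over(void);                /* step S13             */
    void game_over_process(void);        /* step S14             */

    /* Minimal sketch of one course of the main loop. */
    void run_course(void)
    {
        init_course();
        for (;;) {
            read_controller();
            process_terrain_objects();
            determine_player_action();
            process_player_object();
            determine_camera();
            process_camera();
            render_scene();
            process_sound();
            output_frame_and_sound();
            if (course_cleared())
                break;                   /* proceed to the next course */
            if (game_over()) {
                game_over_process();
                break;
            }
        }
    }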
Steps S2-S12 are repeatedly executed until the entire course has been cleared. If the game ends without the course being successfully cleared, a game-over process is performed in step S14 subsequent to step S13. If the course is successfully cleared, the process returns from step S12 to step S1. That is, in step S1 a game course screen and/or a course selection screen is displayed. However, if the game starts just after the power has been turned on, a screen of the first course is displayed. If the first course is cleared, a subsequent course is set. In the subsequent step S2, a controller process is carried out. In this process, it is detected which of the control lever 45, the cross switch 46 and the switches 47A-47Z of the controller 40 have been operated. The operating state detection data (controller data) is read, and the controller data thus read is written into the controller data area 207 of the RAM 14. In step S3, a terrain object process is carried out. This process, explained in detail below with reference to the subroutine of Figure 8, includes a calculation of the display position and shape of the terrain object based on a program partially transferred from the memory area 22c and terrain object polygon data transferred from the memory area 23b (Figure 5). In step S4, a process is performed to determine an action for the player object. Specifically, as explained below with reference to Figure 9 through Figure 26, a determination is made regarding an action for the player object in accordance with the control code or action code explained above. In step S5 a process is performed to display the player object. This process is basically a procedure for causing changes in position, direction, shape and location based on the operating state of the control lever 45 (controller data) operated by the player and the presence or absence of an enemy attack. For example, the polygon data after the change is determined by calculation based on the program transferred from the memory area 22d (Figure 5) of the external ROM 21, the player object polygon data read from the memory area 23a, and the controller data, i.e. the operating state of the control lever 45. Colors are provided by applying texture data to the plurality of polygons obtained by the above. Step S6 is a step for carrying out a camera determination process. Specifically, it is determined which virtual camera of a plurality of virtual cameras is to be used for taking images of an object in the virtual three-dimensional space, according to a switching code (control code) contained in the terrain object data explained above. This will be explained in more detail below with reference to Figures 27 through 36. In step S7, a camera process is carried out. For example, a viewpoint coordinate for the object is calculated so that a line of sight or field of view, as seen through the lens of the virtual camera, is placed at the angle designated by the player through the control lever. In step S8, the RSP 122 performs a rendering process. That is, the RCP 12, under the control of the CPU 11, performs transformation (coordinate transformation and frame memory rendering) on the image data in order to display moving objects and stationary objects, based on the texture data for the moving objects, such as an enemy object or the like, and for the stationary objects, such as the background, memorized in the image data area 205 of the RAM 14.
Specifically, colors are provided for a plurality of polygons for each of the plurality of moving objects and stationary objects. In step S9, the CPU 11 performs a sound process based on sound data such as messages, melodies, sound effects, etc. In particular, the BGM and the like is switched according to the melody code (control code) previously established in the terrain object, as shown in the subroutine of Figure 37. In the next step S10, the CPU 11 reads out the image data stored in the frame memory area 203 of the RAM 14 according to a result of the rendering process of step S8. Accordingly, a player object, a moving object, a stationary object, an enemy object and the like are displayed on the display screen of the display 30 (Figure 1, Figure 2). In step S11, the RCP 12 reads out the sound data obtained as a result of the sound process of step S9, thereby outputting sound such as melody, sound effects, conversation, etc. In step S12, it is determined whether or not the course has been cleared (course clear detection). If the course has not been cleared, it is determined in step S13 whether the game is over or not. If the game is not yet over, the process returns to step S2 to repeat steps S2-S13 until a game-over condition is detected. If a game-over condition is detected, i.e. the number of mistakes allowed to the player reaches a predetermined number or the life of the player object has been consumed by a predetermined amount, then a game-over process, such as selecting whether to continue the game or to store backup data, is carried out in step S14. Incidentally, if in step S12 a course clear condition is detected (for example, defeating a boss, etc.), the course clear process is carried out, and then the process returns to step S1. Figure 8 shows the subroutine of the terrain object process of step S3 of Figure 7. In a first step S301, the CPU 11 (Figure 2) reads the polygon data of a terrain object required at that time, which is transferred from the image data area 23 (Figure 5) of the external ROM 21 to the image data area 205 (Figure 6) of the internal RAM 14. These polygon data have a control code previously established therein as required, as explained above. Accordingly, in step S301, the control code is read at the same time. Incidentally, the read polygon data containing a control code (action code, camera switching code, sound code or the like) are temporarily held in the display list area 201 of the internal RAM 14. In step S302, texture data corresponding to the terrain object are read and transferred to the image data area 205 of the internal RAM 14. In step S303, the camera data corresponding to the terrain object are similarly read from the image data area 205. These texture data and camera data are memorized in the display list area 201, similarly to the polygon data. Then, in step S304, the terrain object is memorized in the display list area 201. It is determined in step S305 whether or not the process from step S301 to step S304 has been executed for all terrain objects. If the determination is "NO", the process is executed again from step S301. If the process has been completed for all terrain objects, that is, if "YES" is determined, the subroutine of Figure 8 is ended and the process returns to the main routine. The action determination process in step S4 of Figure 7 is carried out according to the flow chart shown in Figure 9. That is, in the first step S401, the CPU 11 (Figure 2) detects the state of the player object.
That is, it is detected whether or not the player object is in the course of an action. If the player object is in the course of an action, "YES" is determined in step S402, and the process advances to the next step S403. In step S403, the CPU 11 refers to the register/flag area 209 of the RAM 14 shown in Figure 6, and detects a control code or action code contained in the object data of a terrain object that exists at the feet of the player object. The control code or action code, as explained above, has been previously established within the terrain object area 23b of the external ROM 21 shown in Figure 5, and has previously been transferred to the image data area 205. The terrain object data is read into the display list area 201 every frame. Accordingly, the CPU 11 detects an action code in the display list area 201. Subsequently, the CPU 11 in step S404 detects whether or not the player object is falling. That is, it is determined whether the action the player object was found to be performing in the preceding step S402 is a "falling" action or not. If the player object is falling, then the CPU 11, in the next step S405, detects the height of the player object at that moment above the terrain object. The CPU 11 in step S406 determines that the player object is to land when the height of the player object above the terrain object is at a predetermined height, that is, when the height is low enough. At this time, the CPU 11 in the next step S407 causes the player object to begin a landing action. That is, the CPU 11 in this step S407 causes the player object to change form based on the landing action animation data memorized in the player object data area 23a of the external ROM 21, and controls the RCP 12 to write color data into the frame memory area 203. Incidentally, these animation data are data representative of the skeleton movement of the player object. The player object is displayed by a combination of the animation data and the polygon data, similarly to the other objects. Consequently, even with the same polygon data, if the animation data differ, the player object performs a different action. Because of this, by reading the animation data for the "landing action" in this step S407, the player object can be caused to perform a landing action. If it is determined in the previous step S402 that the player object is not "in the course of an action", the CPU 11 in step S408 detects, from the display list area 201, a control code or action code of a terrain object existing in the vicinity of (in front of or at the feet of) the player object, similarly to step S403. In the next step S409, the CPU 11 refers to the attribute data of the terrain object at the feet of the player object, whereby it determines whether the terrain object is a "recess" or "hole". Alternatively, it can be determined that the terrain object at that time is a recess or hole if there is a floor object located at zero degrees (parallel or horizontal) with respect to the direction of movement of the player object and the floor object is formed with a downward step. When the terrain object is a "recess" or "hole", the CPU 11 in the next step S410 executes a "hole action" subroutine shown in Figure 10. If "NO" is determined in step S409, then it is determined in step S411 whether or not the terrain object is a "wall surface", by the attribute code. Alternatively, as stated above, a wall surface object can be detected by an angle (90 degrees) with respect to the direction of advance of the player object or to the floor object.
If the terrain object is a "wall surface", the CPU 11 in the successive step S412 executes a subroutine of "wall surface action" which is shown in figure 16. If in step S411"NO" is determined , then it is determined in step S413 whether the terrain object is a "door" by the attribute code or an angle with respect to the floor object. When the object of land ee a "door", the CPU in the successive step S414 executes a "gate action" subroutine shown in FIG. 23. If "NO" is determined in step S413, then it is determined in step S415 whether the terrain object is a "ladder" of hands "or not, by means of the attribute code or by an angle with respect to the floor object. When the terrain object is a "ladder", the CPU 11, in the subsequent step S416 executes a subroutine of "ladder action" shown in figure 25. The explanation here is performed in an "action" fyi-FiHn "with reference to Figure 10, as well as Figure 11 to Figure 15 related thereto. In the first step S417 of figure 10, reference is made to an exhibition list area 201 (figure 6) to detect an action code or control code for the ground object at the feet of the player object in front of the hole. More specifically, if the attribute data of a floor object constituting a "hole" includes one or two bits or more of a control code and the control code is "0", the control code is set, by default , to jump". Meanwhile, the control codes of a floor object constituting a hole include, in addition to this, "bottomless space", "scene change", "no fall", "step out" and so on. In the control code or action code detected in step S418 if a "do not drop" code is not used, ie when the control code or the action code is "skip", a "NO" is determined in the Stage S418. The CPU 11 in the next step S419 determines a height of the player object at that moment from a ground object, in a manner similar to the previous step S405. It is determined in step S420 whether the calculated height of the player object is less than a predetermined height, eg, 200 cm or not. If you note that "cm" is a unit of virtual length within a virtual three-dimensional space, as it applies in the following. If "NO" is determined in this step S420, the CPU 11 in the next step S421 calculates a movement speed of the player object at that time. In step S422, the CPU 11 calculates a distance over which the player object will raise in bae at the height calculated in step S419 and the speed calculated in the speed S421. In the next step S423 of a jump action ee starts according to the jumping distance. Figure 11 shows an example of such jumping action that a player object can jump through a hole to an opposite bank due to the short distance Ll of the hole. Figure 12 shows an example of such jumping action that because the hole is a little larger in the distance L2, the player object can not jump through the hole but can place his hand on the opposite bench. Figure 13 shows an example of such a jump where the hole distance L3 is too large for the player object to jump through the hole or to place his hand on the opposite bench, resulting in a drop in the hole. In any case, the required jump action is automatically carried out according to a jump code contained in an existing terrain object on it. The distance that the player object must jump through is correlated with a movement speed of the player object. That is, if the player object runs quickly, it can jump through a large hole similar to distance L. 
However, when the player object is moving at a walking pace, it can happen that the player object cannot jump across the hole even though the "jump" control code has been set. Consequently, when the player object is walking, it does not clear the hole but instead falls into it, or is made to end up in a hanging position with only one hand placed on the opposite edge. Such jump actions are obtained by reading the corresponding animation data from the player object data area 23a of the external ROM 21, as explained before.
Incidentally, "YES" is determined in step S418, that is, if the control code or action code of a ground object in the front of the hole is not a "do not fall" code, the CPU 11 in step S424 it causes the player object to begin an action of not falling. In this case, the player object begins to fall into the hole but assumes a hanging position downward where only one hand is on the opulete edge. While both, in step S420 of the player object is determined less than 200 cm, it is determined that no jump should be made. In step S425, CPU 11 begins to cause a player object to perform an action to fall. That is, if the height or depth of the hole is greater than 200 cm (virtual length), a jump action is performed as mentioned above. If it is less than 200 cm, the player object is caused to walk back into the hole as it would do without ealting, as shown in figure 14. If "NO" is determined in step S409, in the stage S411 refers to the attribute data or to an angle, so a ground object clause is determined as a "wall surface" or not. If "YES" is determined in this step S411, the CPU 11 in step S412 starts a "wall surface action" action, which is performed when the player object faces the wall surface. This wall surface action is executed, specifically according to a flow diagram shown in Figure 15. In the first step S426 of Figure 15, the CPU 11 determines whether or not there is a control code or a code of action contained in a terrain object of "wall surface" in the vicinity of the player object which is "prohibited", that is, that prohibits the player object from advancing on a wall surface. If a "prohibited" code exists, the process returns to the main routine. When a control code or an action code contained in each polygon constituting the wall surface is "tripar", the CPU 11, in step S428 causes the player object to perform an action of climbing up the wall surface, as ee shown in figure 16. In the example of figure 16, the player object is brought into contact with a wall that is placed on the surface of the wall so that it moves on the surface of the wall in response to the operation of the player's joystick 45. Turning up the control lever 45 causes the player object to climb up the wall surface, while rotating down causes the player object to move downward. If the player object moves upward to a wall surface position where a "climb" control code is not established, the player object can no longer be found on the wall surface, resulting in a fall. That is, if the object of the wall surface oriented with the player object is established with a "climb" action code, the player object automatically performs an option to climb up to the wall surface. However, the direction of movement of the player object ee can be determined through the control lever 45. When the control code or the action code of the wall surface object is not "forbidden" and does not "climb" and in addition , a floor object is facing the wall surface object and is set by default with the control code "jump", the CPU 11 in step S429 calculates the height of the wall surface. Therefore, the player object automatically performs its optimal action according to the calculated wall surface height, as described in the following. First, the CPU 11 determines, in step S430 and i, the height of the calculated wall surface lies within a range from 0 to 25 cm, ie, 0 < H _ = 25 or not. The height in this interval eignifies a very low wall surface. In this case, the player object can advance on the wall surfaces as if walking up stairs. 
Accordingly, in the next step S431 the CPU 11 reads the required animation data from the external ROM 21, or from the RAM 14, so that the player object begins a "stair ascending" action, which is shown in Figure 17. In the example of Figure 17, the wall surface to be overcome is low. Consequently, the player object can move up onto the wall surface as onto a stair step, by the action of walking up it, according to the "jump" control code set in the floor object. In this case, the "jump" control code has been set beforehand in the floor object in front of the wall surface object, as shown in Figure 17. The CPU 11 in the following step S432 determines whether or not the wall surface height is in the range of 25 to 50 cm, that is, 25 < H ≤ 50. A height in this range indicates a low wall surface. In this case, the player object can get over the wall surface by jumping. Accordingly, the CPU 11 in the next step S433 reads the animation data from the ROM 21, or the RAM 14, to cause the player object to start a "jump" action, which is shown in Figure 18. In the example of Figure 18, the player object jumps in front of the wall surface and lands on top of it, thereby getting over the wall surface. Also in this case, a "jump" control code has been set beforehand in the terrain object, or floor object, in front of the wall surface object, as shown in Figure 18. In step S434, the CPU 11 determines whether or not the wall surface height is within the range of 50 cm to 100 cm, that is, 50 < H ≤ 100. A height in this range means a comparatively high wall surface. In this case, the player object can get over the wall surface by light climbing. Accordingly, in the next step S435 the CPU 11 reads the animation data required to cause the player object to begin a "light climbing" action, shown in Figure 19. In the "light climbing" example of Figure 19, the player object places its hands on the top of the wall surface object and pushes its body up onto the top of the wall surface using the pull of its arms together with the spring of its feet. In this case as well, a "jump" control code has been set beforehand in the floor on this side of the wall surface, as shown in Figure 19. In step S436, the CPU 11 determines whether or not the wall surface height is in the range from 100 cm to 150 cm, that is, 100 < H ≤ 150. A height in this range means a high wall surface. In this case, the player object can get over the wall surface by ordinary climbing. Accordingly, the CPU 11 in the next step S437 reads the animation data required to cause the player object to start an "average climbing" action, which is shown in Figure 20. In the "average climbing" example of Figure 20, the player object, responding to a "jump" code contained in the floor object in front of the wall, jumps slightly toward the wall surface and places its hands on its upper end. At this moment the player object's feet are off the ground, so that its body is pulled up to the upper end of the wall only by the force of its arms. In step S438, the CPU 11 determines whether or not the wall surface height is in the range of 150 cm to 250 cm, that is, 150 < H ≤ 250. A height in this range means an extremely high wall surface. In this case, the player object can get over the wall surface by hard climbing.
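Summarizing the height ranges of steps S430 to S438, together with the hard-climbing range whose action is described next, the selection can be sketched as a simple threshold table. The numeric ranges are the ones given in the text; the enumeration and function names are illustrative only.

    /* Sketch of the height-based selection of the wall traversal action. */
    enum WallAction {
        WALL_WALK_UP,      /* 0   < H <= 25  cm */
        WALL_JUMP_ONTO,    /* 25  < H <= 50  cm */
        WALL_LIGHT_CLIMB,  /* 50  < H <= 100 cm */
        WALL_AVERAGE_CLIMB,/* 100 < H <= 150 cm */
        WALL_HARD_CLIMB,   /* 150 < H <= 250 cm */
        WALL_NO_ACTION     /* higher walls are not overcome this way */
    };

    enum WallAction choose_wall_action(float h)
    {
        if (h > 0.0f   && h <= 25.0f)  return WALL_WALK_UP;
        if (h > 25.0f  && h <= 50.0f)  return WALL_JUMP_ONTO;
        if (h > 50.0f  && h <= 100.0f) return WALL_LIGHT_CLIMB;
        if (h > 100.0f && h <= 150.0f) return WALL_AVERAGE_CLIMB;
        if (h > 150.0f && h <= 250.0f) return WALL_HARD_CLIMB;
        return WALL_NO_ACTION;
    }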
Accordingly, the CPU 11 in the next step S439 causes the player object to begin a "hard climbing" action, shown in Figure 21. In the "hard climbing" example of Figure 21, the player object responds to a "jump" control code in the floor object in front of the wall surface object and performs a high jump so as to place its hands on the upper end of the wall. The player object's feet are then off the ground, so that its body is lifted to the upper end of the wall only by the pulling force of its arms. In this way, the CPU 11 detects a control code or action code contained in the object data of the terrain object on which, or in the vicinity of which, the player object exists, and causes the player object to perform an action according to that control code or action code, in this case getting over the wall surface. It should be noted that when the control code or action code contained in the wall surface object is "climb", the wall surface is advanced over and overcome by "climbing" instead of "jumping", as explained before. If, however, a "prohibited" code is embedded in the wall surface object, the player object is not allowed to advance over the wall surface. Figure 22 shows the subroutine for the "door action" of step S414 of Figure 9. In the first step S440 of Figure 22, the CPU 11 refers to the controller data area 207 of the RAM 14 and determines whether or not the A button 47A (Figure 1) has been operated by the player. If the A button 47A has not been operated, it is determined that the player does not intend to make the player object enter through the door, and the process returns to the main routine.
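The gate condition of step S440 can be sketched as a single check of the controller data. The button bit mask below is an assumption for illustration, not the actual controller data format.

    /* Sketch of the step S440 check: the scripted door sequence starts only
       when the A button is pressed while the player object is at a terrain
       object carrying the "door" code. */
    #include <stdbool.h>
    #include <stdint.h>

    #define BUTTON_A 0x8000u  /* assumed bit position in the controller word */

    bool should_start_door_action(uint16_t controller_buttons, bool on_door_coded_object)
    {
        return on_door_coded_object && (controller_buttons & BUTTON_A) != 0;
    }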
If "YES" is determined in step S440, the CPU 11 in the next step S411 makes the reference to the image data area 205 (FIG. 6) and detects a gate position from a coordinate position of a polygon that constitutes the door. In the following S442, the player object is corrected in position so that the player object is correctly positioned in a position to open the door, in baee in the door poem detected in step S441. The corrected player object position is recorded in the image data area 205. Subsequently, in step S443, a door action is carried out. That is, the CPU reads the required polygon data or the animation data from the image data area 205, to make the player object perform a series of gate actions (i.e., gimmick a knob, open the door , enter through the door and close the door). Figure 23 illustrates a state that is in the door action of the player object that will perform to enter a door. That is, the "gate action" that is to be executed in step S413 of figure 9, the player object is automatically provoked to perform an action of opening and closing a door according to a "door code" established previously in a terrain object that is shown in figure 23. Incidentally, in the above embodiment, an explanation is made that the "door" is placed as a control code or action code in a floor object immediately in the Front of the door. Contrary to this, the "door" code can be placed on the door object. Fig. 24 shows a detail of a "ladder" action that is executed in step S416 of Fig. 9. In the first step 444, of Fig. 24, it refers to the image data area 205 (Fig. 6). ) to detect a ladder position from a coordinate position of a polygon that constitutes the ladder. In the next step S445, the player object is corrected in position so that the player object is placed at the piee of the ladder, based on the ladder position detected by step S444. The corrected player object position is recorded in the image data area 205. Subsequently, in step S446, a ladder action is carried out. That is, the CPU 11 reads the polygon data required or the animation data from the image data 205 and causes the player object to perform a series of steps to climb a ladder, that is, to put hands and loe feet on the stairs and alternately move left and right hands and feet. rights over the ladder. Figure 25 illustrates a state in which the player object ascends the ladder haeta the middle part thereof. This ee, for the "stepladder action" to be executed in step S416 of figure 9, the player object is automatically triggered so that it performs an ascender on a ladder according to the "ladder" code "previously placed in a terrain object shown in Fig. 25, that is, a wall surface object. Explained in more detail, in the embodiment of FIG. 25, a "inactivated stage" control code has not been placed on the floor object in the front of the wall surface object constituting the ladder. Accordingly, the player object can perform a "climb ladder" action according to a "ladder" control code for the wall surface object. This does not affect the action of climbing a ladder unless the control code on the floor object is on "inactivated step". If a "ladder" code is established on the surface of the wall, the player object, when facing the wall surface with a ladder, automatically evens the ladder. Later, the player object ascends by the ladder if the player tilts the control lever 45 (figure 1) upwards, and descends if he inclines it downwards. 
If the player object reaches the uppermost position of the ladder, the player object automatically steps onto the nearby floor. Meanwhile, if the player object approaches from the floor at the top of the ladder, a "ladder" code is detected in the floor object behind the top of the ladder, so that the player object can descend the ladder according to that code. In this way, according to this embodiment, the player object can be caused to perform automatically a different action depending on the control code or action code previously contained in the terrain object where the player object exists. Consequently, it is very easy to set up the program that controls the actions of the player object. Incidentally, the flow chart shown in Figure 26 represents the player object processing operation of step S5 of the main routine of Figure 7. In the first step S501, the CPU 11 determines whether or not the player object is in the course of an action. If it is in the course of an action, the position and pose of the player object are determined in step S502 so that the player object continues its action. The pose is determined by animation data, as explained above. If the player object is not in the course of an action, the CPU 11 in the next step S503 detects the state of operation of the joystick 45 (Figure 1, Figure 4) included in the controller 40. Subsequently, the direction of movement, the movement speed, and the position and pose of the player object are respectively determined in the following steps according to the operating state of the joystick 45. In step S507, the player object is registered in the display list area 201 (Figure 6) of the RAM 14, in the same way as after passing through step S502. In response, the player object is displayed according to the operating state of the joystick 45. The camera determination process of step S6 of the main routine is explained in detail with reference to Figure 27 as well as the related figures. In the first step S601 of Figure 27, the CPU 11 refers to the data in the image data area 205 and detects a control code (camera code) previously set in the object data of the terrain object existing below the player object. In steps S602, S604, S606, S608 and S610, respectively, it is determined whether the detected control code is a first camera code, a second camera code, a third camera code, a fourth camera code or a fifth camera code. The first to fifth cameras placed in the virtual three-dimensional space in this embodiment are now explained based on Figure 28. In the example of Figure 28, a longitudinal wall is provided at approximately the center of a space that is rectangular in plan, and a door is formed in a part of the wall. A third camera is fixedly provided on one side of the door (one side of the doorway), directed toward the door. On the opposite side of the door, a fourth camera is placed. This fourth camera is provided as a close-up camera for when the player object opens the door and enters through it. In addition, a second camera and a fifth camera are individually fixed at two respective corners of the space. The first camera is provided as a movable camera which is allowed to move so as to follow the player object. The camera control is explained below on the assumption that this embodiment has these five virtual cameras in the three-dimensional space, as indicated above.
Needless to say, however, the number, arrangement and function or role of the cameras (fixed, moving, close-up, etc.) may be modified appropriately as required. Note that in Figure 28 the labels "first camera", "second camera", ..., "fifth camera" given to the blocks (rectangular meshes) respectively represent control codes, or camera codes, that have been placed beforehand in the terrain objects of this three-dimensional space. Consequently, when the player object exists in a given block, the player object is shot by the camera corresponding to the camera code set in that block. Referring again to Figure 27, if a first camera code is detected in step S602, then in the next step S603 a first camera control program is selectively set. The camera control programs, as explained above, are stored in the camera control program area 22f (Figure 5) of the external ROM 21 and are transferred as required to the program area 202 of the internal RAM 14. Accordingly, the CPU 11 in step S603 reads the first camera control program from the program area 202 (Figure 6). The first camera control program is the control program for the first camera, and the first camera is arranged to move following the player object, as described above. In the first camera control program detailed in Figure 29, the image data area 205 (Figure 6) is first referenced in step S612 to detect the position of the player object. In the next step S613, the CPU 11 determines a position of the first camera such that the distance from the player object to the first camera remains constant. In step S614, the first camera is directed so as to take an image in the direction of the player object. Consequently, the first camera shoots the player object from behind at a constant distance. In the second camera control program to be executed in step S605 (Figure 27), the position of the player object is detected in the first step S615, as shown in Figure 31, similarly to the earlier step S612 (Figure 29). Next, in step S616, the second camera is directed so as to take an image in the direction of the player object. That is, the second camera shoots the player object from the fixed position shown in Figure 28. Incidentally, because the fifth camera is a fixed camera in the same way as the second camera, the fifth camera control program to be selected in step S611 is similar to the second camera control program of Figure 31. The third camera is fixedly provided in front of the door, as shown in Figure 28. Consequently, the third camera simply shoots, from a point at a constant distance, the player object entering or leaving through the door. Because of this, the third camera control program of step S607 (Figure 27) includes step S617 of Figure 32. In this step S617, the third camera is directed so as to take an image of the door. Accordingly, the manner in which the player object enters or leaves through the door is captured by the third camera, as shown in Figure 33. Figure 34 shows a detail of the fourth camera control program executed in step S609 of Figure 27. The fourth camera is selected, as will be well understood from Figure 28, when a fourth camera code is detected, that code having been placed in a block which the player object has entered. In the first step S618 of Figure 34, the number of frames elapsed since the fourth camera code was detected and step S609 was entered, that is, since the camera changeover, is detected.
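As an illustration, the camera selection of steps S602 to S611 and the follow-camera update of Figure 29 might be sketched as follows. The camera-code values, the follow distance and the vector handling are assumptions made for the sketch, not details taken from the text.

    /* Sketch of camera-code dispatch and the first (follow) camera update. */
    #include <math.h>

    typedef struct { float x, y, z; } Vec3;
    typedef struct { Vec3 position; Vec3 look_at; } Camera;

    enum CameraCode { CAM_FIRST = 1, CAM_SECOND, CAM_THIRD, CAM_FOURTH, CAM_FIFTH };

    /* First-camera program (steps S612-S614): keep a constant distance from
       the player object and aim at it. */
    static void update_follow_camera(Camera *cam, Vec3 player, float follow_distance)
    {
        Vec3 d = { cam->position.x - player.x,
                   cam->position.y - player.y,
                   cam->position.z - player.z };
        float len = sqrtf(d.x * d.x + d.y * d.y + d.z * d.z);
        if (len > 0.0f) {                 /* rescale so the distance stays constant */
            float s = follow_distance / len;
            cam->position.x = player.x + d.x * s;
            cam->position.y = player.y + d.y * s;
            cam->position.z = player.z + d.z * s;
        }
        cam->look_at = player;            /* always shoot toward the player object */
    }

    void select_and_update_camera(enum CameraCode code, Camera *cam, Vec3 player)
    {
        switch (code) {
        case CAM_FIRST:
            update_follow_camera(cam, player, 300.0f);  /* assumed follow distance */
            break;
        default:
            /* fixed cameras (second, fifth) simply aim at the player; the third
               aims at the door and the fourth has its own program (not shown). */
            cam->look_at = player;
            break;
        }
    }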
The frame count is needed because the fourth camera captures the player object in different ways depending on how much time has elapsed since the changeover. If the number of frames is less than a predetermined number, that is, immediately after the camera change, "YES" is determined in step S619. In this case, the CPU 11 in step S620 controls the fourth camera so that the fourth camera captures, from a predetermined position, the player object entering through the door. The player object as captured by the fourth camera in step S620 is illustrated in Figure 35. As will be understood from Figure 35, the fourth camera is fixedly provided at the position shown in Figure 28 and, in step S620, immediately after the camera change, takes a distant shot of the player object entering through the door. That is, the fourth camera captures a comparatively wide range that includes the player object. Consequently, when the player object enters through the door as in this embodiment, the player can easily grasp from the overall view where the player object, that is, the hero, is appearing at that moment. Until a predetermined number of frames, that is, a predetermined time, has elapsed since the camera change, but once it is no longer immediately after the change, "NO" is determined in step S621. In this case, in the next step S622, the CPU 11 causes the fourth camera to zoom so as to take a close-range view of the player object, as shown in Figure 36. This image is captured over a comparatively narrow range that still includes the player object. Once the predetermined number of frames has elapsed, "YES" is determined in step S621. In this case, the CPU 11 changes over from the fourth camera to the first camera, as shown in step S623. In this way, according to this embodiment, the camera that captures the player object, and its function, can be changed over automatically depending on a control code, or camera code, previously contained in the terrain object where the player object exists. Consequently, even when a troublesome camera change is necessary, it is very easy to set it up in the program. Meanwhile, if the camera were switched depending simply on the position of the player object (its X-Y coordinate position), the camera would change whenever the same X-Y coordinates were reached, without regard to the Z coordinate, or height. In contrast, in the method of this embodiment the camera change codes are embedded in the terrain objects. Consequently, for the same X-Y position but a different height (Z), it is possible to place a different terrain object, that is, a different camera code, and therefore a different camera. That is, in this embodiment the camera change can be made in a three-dimensional manner. Incidentally, after finishing any of steps S620, S622 and S623, the process returns to the main routine. With reference to Figure 37, a detailed explanation is now given of step S9 of the main routine of Figure 7, that is, the sound processing. This sound processing also uses the control codes explained above. Accordingly, in the first step S624 of Figure 37, reference is made to the image data area 205 to detect a control code, or melody code, set in the terrain object where the player object exists. In step S625, it is determined whether or not the control code, or melody code, is for BGM1. The melody code BGM1 is a code for selecting a first BGM (background music). Accordingly, when "YES" is determined in step S625, the CPU 11 in step S626 reads the melody or sound data of the first BGM from the sound memory area 206 shown in Figure 6 and transmits it to the common link control circuit 121 (Figure 2).
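The fourth-camera sequence of steps S618 to S623 described above reduces to a frame-counted choice among three modes. A minimal sketch follows; the frame thresholds are assumptions, not values from the text.

    /* Sketch of the fourth-camera sequence: wide shot, then zoom, then hand-off. */
    enum FourthCameraMode { FOURTH_WIDE_SHOT, FOURTH_ZOOM_IN, FOURTH_HAND_OFF_TO_FIRST };

    enum FourthCameraMode fourth_camera_mode(int frames_since_change)
    {
        const int wide_frames  = 30;   /* assumed: "immediately after" the change   */
        const int total_frames = 120;  /* assumed: the predetermined number of frames */

        if (frames_since_change < wide_frames)
            return FOURTH_WIDE_SHOT;          /* step S620, Figure 35 */
        if (frames_since_change < total_frames)
            return FOURTH_ZOOM_IN;            /* step S622, Figure 36 */
        return FOURTH_HAND_OFF_TO_FIRST;      /* step S623 */
    }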
Returning to the sound processing of Figure 37: if "NO" is determined in step S625, it is determined in step S627 whether or not the control code, or melody code, is for BGM2. The melody code BGM2 is a code for selecting a second BGM. Accordingly, when "YES" is determined in step S627, the CPU 11 in step S628 reads the melody or sound data of the second BGM from the sound memory area 206 and transmits it to the common link control circuit 121. When "NO" is determined in both steps S625 and S627, the CPU 11 in step S629 determines whether or not the control code, or melody code, is for "crying". The "crying" melody code is a code for generating a crying sound effect. Accordingly, when "YES" is determined in step S629, the CPU 11 in step S630 reads the sound data for "crying" from the sound memory area 206 and transmits it to the common link control circuit 121. Incidentally, when the control code, melody code or sound code is different from those described above, other sound data are set in step S631. In this way, according to this embodiment, the sound to be generated can be changed automatically according to a control code, or sound code, contained in the terrain object where the player object exists. Consequently, even when troublesome control is necessary to change the sound, it is easy to set it up in the program. Although the present invention has been described and illustrated in detail, it is clearly understood that the description is given by way of illustration and example only and is not to be taken as limiting; the spirit and scope of the present invention are limited only by the terms of the appended claims. It is noted that, in relation to this date, the best method known to the applicant for carrying out the aforementioned invention is the conventional one for the manufacture of the objects or products to which it refers.
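The sound selection of steps S624 to S631 is, in effect, a lookup keyed by the sound code read from the terrain object. A minimal sketch with illustrative code values and names follows; none of these identifiers come from the patent.

    /* Sketch of the sound selection: melody code picks the sound data to send. */
    enum SoundCode { SND_BGM1 = 1, SND_BGM2 = 2, SND_CRY = 3 };

    const char *select_sound_data(int sound_code)
    {
        switch (sound_code) {
        case SND_BGM1: return "bgm1";        /* first background music  (step S626) */
        case SND_BGM2: return "bgm2";        /* second background music (step S628) */
        case SND_CRY:  return "cry_effect";  /* crying sound effect     (step S630) */
        default:       return "other";       /* other sound data        (step S631) */
        }
    }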

Claims (21)

CLAIMS Having described the invention as above, the content of the following claims is claimed as property:
1. A video game apparatus for generating and supplying to a screen an image signal for displaying a player object that exists on a terrain object in a virtual three-dimensional space by processing image data for the player object and the terrain object according to a program, the video game apparatus being characterized in that it comprises: a player object image data generating means for generating player object image data to display a player object; and a terrain object image data generating means for generating terrain object image data to display a terrain object; wherein the terrain object image data include a program control code; the video game apparatus further comprising a program control code detecting means for detecting the program control code in relation to a position of the player object, and an image changing means for causing the image signal to change based on the detected program control code.
2. The video game apparatus according to claim 1, characterized in that the program control code includes an action code for controlling an action of the player object, and the image changing means includes an animation data transmitting means for transmitting animation data to automatically cause the player object to perform an action in accordance with the action code.
3. The video game apparatus according to claim 2, characterized in that, when the terrain object is a hole or recess and the action code is "jump", the animation data transmitting means transmits animation data to cause the player object to perform a "jump over a hole or recess" action.
4. The video game apparatus according to claim 3, characterized in that the video game apparatus has a controller associated therewith that includes a direction instructing means for instructing a direction of movement of the player object so that the player object moves in the direction of movement, the video game apparatus further comprising: a movement speed detecting means for detecting the movement speed of the player object, and a jump distance calculating means for calculating a jump distance of the player object based on the movement speed, the animation data transmitting means transmitting animation data to cause the player object to perform a jump action according to the jump distance.
5. The video game apparatus according to claim 2, characterized in that, when the terrain object is a wall surface and the action code is "climb", the animation data transmitting means transmits such animation data that the player object performs a climbing action on the wall surface.
6. The video game apparatus according to claim 5, characterized in that, when the action code is not "climb", a wall surface height calculating means is additionally included to calculate a height of the wall surface, the animation data transmitting means transmitting such animation data that the player object performs an optimal action according to the wall surface height.
7. The video game apparatus according to claim 1, characterized in that the program control code includes a camera control code, and the image changing means includes a camera control means for controlling a virtual camera provided in the virtual three-dimensional space.
8. The video game apparatus according to claim 7, characterized in that the virtual camera includes a plurality of virtual cameras, the camera control code includes a camera change code, and the camera control means includes a camera changing means for changing among the plurality of virtual cameras depending on the camera change code.
9. The video game apparatus according to claim 1, characterized in that the program control code includes a sound code, and further comprising: a sound data generating means for generating sound data; and a sound control means for controlling the sound output from the sound data generating means depending on the sound code.
10. The video game apparatus according to claim 9, characterized in that the sound data generating means can generate sound data for a plurality of sounds, the sound code includes a sound change code, and the sound control means includes a sound changing means for changing the sound data depending on the sound change code.
11. A video game apparatus for generating and supplying to a screen an image signal for displaying a player object that exists on a terrain object in a virtual three-dimensional space by processing image data for the player object and the terrain object according to a program, and for also supplying a sound signal to a sound output means by processing sound data according to a program, the video game apparatus being characterized in that it comprises: a player object image data generating means for generating player object image data to display a player object; and a terrain object image data generating means for generating terrain object image data to display a terrain object; wherein the terrain object image data include a program control code; the video game apparatus further comprising a program control code detecting means for detecting the program control code in relation to a position of the player object, and a sound changing means for causing the sound signal to change according to the detected program control code.
12. A memory medium applicable to a video game apparatus for generating, and supplying to a screen, an image signal for displaying a player object that exists on a terrain object in a virtual three-dimensional space by processing image data for the player object and the terrain object according to a program, the memory medium storing a program to be processed by an information processing means included in the video game apparatus and comprising: a player object image data generating program for generating player object image data to display a player object; and a terrain object image data generating program for generating terrain object image data to display a terrain object; wherein the terrain object image data include a program control code; and further comprising: a program control code detecting program for detecting the program control code in relation to a position of the player object, and an image changing program for causing the image signal to change based on the detected program control code.
13. The memory medium according to claim 12, characterized in that the program control code includes an action code for controlling an action of the player object, and the image changing program includes an animation data transmitting program for transmitting animation data to automatically cause the player object to perform an action depending on the action code.
14. The memory medium according to claim 13, characterized in that the terrain object image data generating program generates a terrain object of a hole or recess and a "jump" action code, and the animation data transmitting program transmits animation data to cause the player object to perform an action of jumping over the hole or recess.
15. The memory medium according to claim 14, characterized in that the video game apparatus has a controller associated therewith that includes a direction instructing means for instructing a direction of movement of the player object so that the player object moves in the direction of movement, the memory medium further comprising: a movement speed detecting program for detecting the movement speed of the player object, and a jump distance calculating program for calculating a jump distance of the player object based on the movement speed, the animation data transmitting program transmitting animation data to cause the player object to perform a jump action according to the jump distance.
16. The memory medium according to claim 13, characterized in that the terrain object image data generating program generates a terrain object of a wall surface and a "climb" action code, and the animation data transmitting program transmits such animation data that the player object performs a climbing action on the wall surface.
17. The memory medium according to claim 16, characterized in that, when the action code is not "climb", a wall surface height calculating program is additionally included to calculate the wall surface height, the animation data transmitting program transmitting such animation data that the player object performs an optimal action depending on the height of the wall surface.
18. The memory medium according to claim 12, characterized in that the terrain object image data generating program generates terrain object image data including a camera control code, and the image changing program includes a camera control program for controlling a virtual camera provided in the virtual three-dimensional space.
19. The memory medium according to claim 18, characterized in that the virtual camera includes a plurality of virtual cameras, the camera control code includes a camera change code, and the camera control program includes a camera changing program for changing among the plurality of virtual cameras.
20. The memory medium according to claim 12, characterized in that the terrain object image data generating program generates a terrain object including a sound code as a program control code, further comprising: a sound data generating program for generating sound data, and a sound control program for controlling the sound output from the sound data generating program depending on the sound code.
21. The memory medium according to claim 20, characterized in that the sound data generating program can generate sound data for a plurality of sounds, the sound code includes a sound change code, and the sound control program includes a sound changing program for changing the sound data depending on the sound change code.
MXPA/A/1999/010643A 1998-11-19 1999-11-18 Video game device and media storage device for vi game MXPA99010643A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP10-329805 1998-11-19

Publications (1)

Publication Number Publication Date
MXPA99010643A 2000-08-01
