US20010016511A1 - Entertainment system, entertainment apparatus, recording medium, and program - Google Patents


Info

Publication number
US20010016511A1
US20010016511A1 (application US09/784,895)
Authority
US
United States
Prior art keywords
action
hypothetical
special action
control
special
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US09/784,895
Inventor
Akihiro Hino
Kentaro Motomura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Computer Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Inc filed Critical Sony Computer Entertainment Inc
Assigned to SONY COMPUTER ENTERTAINMENT INC. Assignment of assignors' interest (see document for details). Assignors: HINO, AKIHIRO; MOTOMURA, KENTARO
Publication of US20010016511A1


Classifications

    • A63F13/10
    • A63F13/422: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment, by mapping the input signals into game commands, automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
    • A63F13/44: Processing input control signals of video game devices involving timing of operations, e.g. performing an action within a time slot
    • A63F13/45: Controlling the progress of the video game
    • A63F13/5375: Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD], using indicators for graphically or textually suggesting an action, e.g. by displaying an arrow indicating a turn in a driving game
    • A63F13/822: Strategy games; role-playing games
    • A63F13/833: Hand-to-hand fighting, e.g. martial arts competition
    • A63F2300/303: Output arrangements for displaying additional data, e.g. simulating a head-up display
    • A63F2300/6054: Mapping control signals received from the input arrangement into game commands by generating game commands automatically to assist the player, e.g. automatic braking in a driving game
    • A63F2300/638: Controlling the execution of the game in time according to the timing of operation or a time limit
    • A63F2300/8029: Fighting without shooting

Definitions

  • the present invention relates to an entertainment system having at least an entertainment apparatus for executing various programs, a manual controller, and a display unit, an entertainment apparatus for executing various programs, a recording medium storing a program and data that are used by the entertainment apparatus, and a program for realizing the entertainment system.
  • Some entertainment systems including entertainment apparatus such as video game machines display video game images based on video game data stored in a recording medium such as a CD-ROM or the like on the display screen of a television receiver while allowing the user or game player to play the video game with commands entered via a manual controller.
  • the entertainment apparatus and the manual controller are usually connected to each other by a serial interface.
  • the manual controller sends key switch information based on the user's control entries in synchronism with the clock signal.
  • Some manual controllers have vibration generating means for applying vibrations to the user based on a request from an external apparatus such as an entertainment apparatus, for example. While a video game is in progress, the vibration generating means applies various different kinds of vibrations to the user in response to the user's different control entries.
  • Games that are played on video game machines include role-playing games, shooting games, driving games, combat games, etc.
  • a combat game is designed for a principal character to attack and knock out an opponent, and offers a variety of attack patterns available for the principal character to employ.
  • the user or game player selects an attack pattern depending on the situation in which the principal character is placed.
  • the user or game player wages an attack on an opponent in the selected attack pattern at an optimum time to cause substantial damage to the opponent for thereby placing the principal character in an advantageous position in the progress of the game. Therefore, the user or game player can enjoy the excitement of the combat game.
  • According to an aspect of the present invention, there is provided an entertainment system comprising an entertainment apparatus for executing various programs, at least one manual controller for entering control requests from the user into the entertainment apparatus, a display unit for displaying images outputted from the entertainment apparatus, normal action rendering means for displaying a normal action image in which at least two hypothetical characters are making a normal action, and special action rendering means for displaying a special action image in which at least one of the hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by the one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance.
  • According to another aspect of the present invention, there is provided an entertainment apparatus for connection to a manual controller for outputting a control request from the user and a display unit for displaying images, comprising normal action rendering means for displaying an image in which at least two hypothetical characters are making a normal action, and special action rendering means for displaying an image in which at least one of the hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by the one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance.
  • According to still another aspect of the present invention, there is provided a recording medium storing a program and data for use in an entertainment apparatus for connection to a manual controller for outputting a control request from the user, and a display unit for displaying images, the program comprising the steps of displaying a normal action image in which at least two hypothetical characters are making a normal action, and displaying a special action image in which at least one of the hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by the one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance.
  • According to yet another aspect of the present invention, there is provided a program readable and executable by a computer for use in an entertainment apparatus for connection to a manual controller for outputting a control request from the user, and a display unit for displaying images, the program comprising the steps of displaying a normal action image in which at least two hypothetical characters are making a normal action, and displaying a special action image in which at least one of the hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by the one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance.
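The claimed flow can be pictured as a small state machine: rendering stays in a normal-action state until the predetermined condition is met, switches to a special-action state, and the special action is completed only while the user's control actions keep satisfying the guidance. The following sketch is illustrative only; the names (BattleState, step_battle) and the boolean inputs are hypothetical and not taken from the patent.

```python
from enum import Enum, auto

class BattleState(Enum):
    NORMAL = auto()    # both hypothetical characters make normal actions
    SPECIAL = auto()   # one character has changed to a special action

def step_battle(state, condition_satisfied, guidance_input_ok, guidance_finished):
    """Advance the claimed flow by one decision and report completion.

    condition_satisfied -- the predetermined condition holds during the normal action
    guidance_input_ok   -- the user's control action satisfies the control
                           condition indicated in the guidance
    guidance_finished   -- every prompt in the guidance has been met
    Returns (next_state, special_action_completed).
    """
    if state is BattleState.NORMAL:
        return (BattleState.SPECIAL, False) if condition_satisfied else (BattleState.NORMAL, False)
    if not guidance_input_ok:
        return BattleState.NORMAL, False      # a miss returns to the normal action
    if guidance_finished:
        return BattleState.NORMAL, True       # the special action is completed
    return BattleState.SPECIAL, False
```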
  • In the normal action, i.e., the normal battle action, when a certain condition is satisfied, e.g., when one of the hypothetical characters stands with its back facing a wall or puts a foot into a step, at least one of the hypothetical characters changes from the normal battle action to a special battle action.
  • the special battle action is an action to deliver a dead blow such as a succession of attacking techniques in a combat game, for example.
  • the special action rendering means or step may comprise icon rendering means for, or the step of, displaying an icon indicative of the special action when the hypothetical character changes from the normal action to the special action. Therefore, the user is aware of the entry into the special action, i.e., can be mentally prepared to enter the special action, and hence can subsequently make smooth control actions to perform the special action.
  • The icon is thus effective in allowing even children and elderly people to enjoy the video game.
  • the special action rendering means or step may comprise guidance rendering means for, or the step of, rendering an indicia representing at least one of control members on the manual controller as the guidance, and special action continuation determining means for, or the step of, comparing a time to prompt the user to press the control member corresponding to the indicia with a time when the user presses the corresponding control member, and determining whether the hypothetical character is to make the special action based on a compared result.
  • the user thus only needs to operate the control member corresponding to the indicia at the prompting time indicated by the guidance.
  • the special action continuation determining means or step may comprise means for, or the step of, transferring control over to the normal action rendering means when the time when the user presses the corresponding control member deviates from the time to prompt the user to press the control member corresponding to the indicia beyond an allowable range. Therefore, if the time when the user presses the corresponding control member deviates from the time to prompt the user to press the control member corresponding to the indicia while the hypothetical character is making the special action, then the hypothetical character which is making the special action returns to the normal action.
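A minimal sketch of the timing comparison described above, assuming times are counted in frames; the tolerance value and the function name are assumptions, not values from the patent.

```python
def special_action_continues(prompt_frame, press_frame, tolerance=6):
    """Return True if the user's press is close enough to the prompt time.

    prompt_frame -- frame at which the guidance prompts the press
    press_frame  -- frame at which the user actually pressed the control
                    member, or None if no press was registered
    tolerance    -- allowable deviation in frames (an assumed value)

    If the deviation exceeds the allowable range, control is handed back
    to the normal action rendering, i.e. the special action is aborted.
    """
    if press_frame is None:
        return False
    return abs(press_frame - prompt_frame) <= tolerance

assert special_action_continues(120, 123)        # within the allowable range
assert not special_action_continues(120, 130)    # deviates too far
```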
  • the special action rendering means or step may comprise means for, or the step of, changing the number of indicia and the time to prompt the user to press the control member corresponding to each of the indicia, depending on characteristics of the hypothetical character other than the hypothetical character which makes the special action, among the at least two hypothetical characters. For example, if the hypothetical character which is making the special action is a principal character and the other hypothetical character is a monster, then the number of indicia and the time to prompt the user to press the control member corresponding to each of the indicia may be changed depending on characteristics of the monster. Accordingly, elements of fun are added to battle scenes which would tend to be monotonous, keeping the user interested in the video game.
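As one way of picturing how the number of indicia and their prompt times could vary with the opponent's characteristics, the sketch below derives a prompt schedule from a hypothetical monster descriptor; the trait names (toughness, agility) and every constant are assumptions, not values from the patent.

```python
import random

BUTTONS = ["circle", "triangle", "cross", "square"]   # stand-ins for the indicia

def build_guidance(monster, frame_rate=60, rng=None):
    """Return a list of (button, prompt_frame) pairs for one guidance.

    A tougher monster yields more indicia; a more agile monster shortens
    the interval between prompts. Both rules are illustrative assumptions.
    """
    rng = rng or random.Random()
    count = 2 + monster["toughness"]                      # number of indicia
    interval = max(10, frame_rate - 5 * monster["agility"])
    schedule, frame = [], interval
    for _ in range(count):
        schedule.append((rng.choice(BUTTONS), frame))
        frame += interval + rng.randint(-5, 5)            # slight random spread
    return schedule

print(build_guidance({"toughness": 3, "agility": 1}, rng=random.Random(0)))
```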
  • the special action rendering means or step may comprise bar rendering means for, or the step of, rendering a bar which progresses randomly toward the indicia, and the time to prompt the user to press the control member corresponding to the indicia may represent a time when a leading end of the bar reaches the indicia. Therefore, the user may press the control member corresponding to the indicia at the time when the leading end of the bar reaches the indicia.
  • the special action rendering means or step may comprise means for, or the step of, changing a progressing pattern of the bar depending on characteristics of the hypothetical character other than the hypothetical character which makes the special action, among the at least two hypothetical characters.
  • If the bar were displayed in a lower region or a corner of the display screen, the user might overlook a special action made by the hypothetical character displayed in a central area of the display screen, because the user would be distracted by the progress of the bar.
  • Accordingly, the bar may be displayed as a semitransparent image of a large size in the center of the display screen, rather than in a lower region or a corner of the display screen.
  • the user can see a special action made by the hypothetical character and press control members on the manual controller while simultaneously confirming the progressing pattern of the bar. Therefore, the user can see and enjoy various special actions made by the hypothetical character.
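The semitransparent display described here is commonly realized by alpha blending the guidance image over the battle scene. The per-pixel sketch below assumes 8-bit RGB values and a single global alpha of 0.5; the patent does not specify the blending math, so both are assumptions.

```python
def blend_pixel(src_rgb, dst_rgb, alpha=0.5):
    """Blend one guidance pixel (src) over one scene pixel (dst).

    alpha=0.5 gives a semitransparent look; the exact factor is assumed.
    """
    return tuple(int(alpha * s + (1.0 - alpha) * d) for s, d in zip(src_rgb, dst_rgb))

# Example: a white bar pixel over a dark background pixel
print(blend_pixel((255, 255, 255), (40, 40, 60)))   # (147, 147, 157)
```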
  • FIG. 1 is a block diagram of a circuit arrangement of an entertainment system according to the present invention
  • FIG. 2 is a perspective view of a manual controller
  • FIG. 3 is a plan view of the manual controller
  • FIG. 4 is a block diagram of a circuit arrangement of the manual controller
  • FIG. 5 is a view showing a displayed image of a normal battle action of a principal character
  • FIG. 6 is a view showing a displayed image similar to FIG. 5, with an icon displayed indicating that the principal character is about to wage a special attack;
  • FIG. 7 is a view showing a displayed image similar to FIG. 5, with the principal character waging a special attack as a bar progresses in a guidance;
  • FIG. 8 is a view showing another guidance
  • FIG. 9 is a functional block diagram of a battle mode processing means according to the present invention.
  • FIG. 10 is a flowchart of a processing sequence of the battle mode processing means, particularly, a process of rendering a normal action of a monster with a normal action rendering means;
  • FIGS. 11 through 14 are flowcharts of a process of rendering a normal action of a principal character with the normal action rendering means
  • FIG. 15 is a flowchart of a processing sequence of an icon rendering means of a special action rendering means
  • FIG. 16 is a flowchart of a processing sequence of a guidance rendering means of the special action rendering means
  • FIG. 17 is a flowchart of a processing sequence of a special action continuation determining means of the special action rendering means.
  • FIG. 18 is a flowchart of a processing sequence of the special action rendering means.
  • An entertainment system, an entertainment apparatus, a recording medium, and a program according to the present invention as applied to a video game apparatus will be described below with reference to FIGS. 1 through 18.
  • an entertainment system 10 basically comprises an entertainment apparatus 12 for executing various programs, a memory card 14 detachably connected to the entertainment apparatus 12 , a manual controller 16 detachably connected to the entertainment apparatus 12 , and a display monitor 18 such as a television receiver which is supplied with video and audio output signals from the entertainment apparatus 12 .
  • The entertainment apparatus 12 reads a game program recorded in a mass storage medium such as the optical disk 20 (e.g., a CD-ROM or the like), and executes a game, for example, based on the program depending on commands supplied from the user, e.g., the game player, via the manual controller 16 .
  • the execution of the game mainly represents controlling the progress of the game by controlling the display of images and the generation of sounds on the display monitor 18 based on manual input actions entered from the manual controller 16 .
  • the entertainment apparatus 12 generally comprises a control system 200 , a graphic generating system 204 connected to the control system 200 via a system bus 202 , a sound generating system 206 connected to the control system 200 via the system bus 202 , and an optical disk control system 208 connected to the control system 200 via the system bus 202 .
  • a communication controller 210 for controlling data to be inputted to and outputted from the manual controller 16 and the memory card 14 is also connected to the control system 200 via the system bus 202 .
  • the manual controller 16 supplies commands (including control data) from the user via a communication controller 150 (see FIG. 4) of the manual controller 16 and the communication controller 210 to the entertainment apparatus 12 .
  • the optical disk control system 208 includes an optical disk drive 212 in which the optical disk 20 , which may comprise a CD-ROM or the like as a specific example of a recording medium according to the present invention, is loaded.
  • the control system 200 controls motions of characters displayed on the display monitor 18 based on a program and data read from the optical disk 20 and commands supplied from the manual controller 16 .
  • The control system 200 includes an MPU 220 for controlling the entertainment apparatus 12 , a main memory 222 used for operating various programs and storing various data, a peripheral device controller 224 for controlling interrupts and direct memory access (DMA) data transfer, a read-only memory (ROM) 226 which stores various programs such as an operating system program for managing the graphic generating system 204 , the sound generating system 206 , etc., and which has an OSD function to execute a control program such as a kernel, and a real-time clock (RTC) 228 having a calendar and clock function.
  • the main memory 222 is capable of executing at least the game program thereon.
  • the MPU 220 controls the entertainment apparatus 12 in its entirety by executing the operating system program stored in the ROM 226 .
  • the MPU 220 comprises a 32-bit RISC-CPU, for example.
  • the MPU 220 executes the operating system program stored in the ROM 226 to start controlling the graphic generating system 204 , the sound generating system 206 , etc.
  • When the operating system program is executed, the MPU 220 initializes the entertainment apparatus 12 in its entirety for confirming its operation, and thereafter controls the optical disk control system 208 to execute an application program such as a game program recorded in the optical disk 20 .
  • the MPU 220 controls the graphic generating system 204 , the sound generating system 206 , etc. depending on commands entered by the user for thereby controlling the display of images and the generation of music sounds and sound effects.
  • the graphic generating system 204 comprises a vector calculating unit 230 for performing floating point vector calculations required for geometry processing, an image processor 232 for generating image data under the control of the MPU 220 and outputting the generated image data to the display monitor 18 , which is a CRT in this embodiment, a graphic interface (GIF) 234 for arbitrating transfer paths between the MPU 220 , the vector calculating unit 230 , and the image processor 232 , and an image decoder 236 for decoding image data compressed and encoded by an orthogonal transform such as a discrete cosine transform.
  • the image processor 232 comprises a rendering engine 240 , a memory interface 242 , an image memory 244 , and a display controller 246 such as a programmable CRT controller.
  • the rendering engine 240 renders (stores) image data into the image memory 244 via the memory interface 242 according to a rendering command supplied from the MPU 220 .
  • a first bus 248 is connected between the memory interface 242 and the rendering engine 240
  • a second bus 250 is connected between the memory interface 242 and the image memory 244 .
  • the first and second buses 248 , 250 each have a 128-bit width to allow the rendering engine 240 to render (store) image data at a high speed into the image memory 244 .
  • The rendering engine 240 is capable of rendering image data of 320 × 240 pixels or 640 × 480 pixels according to the NTSC or PAL standards on a real-time basis, i.e., ten-odd to several tens of times in a period of time ranging from 1/60 to 1/30 second.
  • the image memory 244 is of a unified memory structure capable of specifying a texture rendering area and a display rendering area as one area.
  • the display controller 246 serves to write texture data read from the optical disk 20 via the optical disk drive 212 and texture data generated in the main memory 222 into the texture rendering area of the image memory 244 via the memory interface 242 , and read image data stored in the display rendering area of the image memory 244 via the memory interface 242 and output the read image data to the display monitor 18 , which displays an image thereon based on the image data.
  • the sound generating system 206 comprises a sound processing unit (SPU) 260 for generating music sounds, sound effects, etc. based on instructions from the MPU 220 , and a sound buffer 262 for storing music sounds, sound effects, etc. generated by the SPU 260 .
  • Audio signals representing music sounds, sound effects, etc. generated by the SPU 260 are supplied to audio terminals of the display monitor 18 .
  • the display monitor 18 has a speaker 264 which radiates music sounds, sound effects, etc. generated by the SPU 260 based on the supplied audio signals.
  • The SPU 260 has an ADPCM (adaptive differential PCM) demodulating function for reproducing 16-bit audio data which has been encoded as a 4-bit differential signal by ADPCM, a reproducing function for reproducing waveform data stored in the sound buffer 262 to generate sound effects, etc., and a modulating function for modulating and reproducing the waveform data stored in the sound buffer 262 .
  • the sound generating system 206 with these functions can be used as a sampling sound source which generates music sounds, sound effects, etc. based on the waveform data stored in the sound buffer 262 according to instructions from the MPU 220 .
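To make the differential-coding idea concrete, the sketch below expands 4-bit signed deltas into 16-bit PCM with a fixed step size. This is a deliberately simplified differential-PCM illustration, not the SPU's actual ADPCM format, whose adaptive step sizes and prediction filters are not described in this document.

```python
def decode_dpcm(nibbles, step=512):
    """Expand 4-bit signed deltas into clamped 16-bit PCM samples.

    nibbles -- iterable of integers in the range 0..15 (4-bit codes)
    step    -- fixed scale for each delta (assumed; real ADPCM adapts it)
    """
    sample, out = 0, []
    for code in nibbles:
        delta = code - 16 if code >= 8 else code     # signed 4-bit value
        sample = max(-32768, min(32767, sample + delta * step))
        out.append(sample)
    return out

print(decode_dpcm([1, 2, 3, 15, 14, 0]))   # a small rise followed by a dip
```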
  • the optical disk control system 208 comprises an optical disk drive 212 for reproducing application programs and data recorded on the optical disk 20 , a decoder 270 for decoding programs and data that are recorded with an error correcting code added thereto, and a buffer 272 for temporarily storing data read from the optical disk drive 212 so as to allow the data from the optical disk 20 to be read at a high speed.
  • An auxiliary CPU 274 is connected to the decoder 270 .
  • Audio data recorded on the optical disk 20 which is read by the optical disk drive 212 includes PCM data converted from analog sound signals, in addition to the ADPCM data.
  • the ADPCM data which is recorded as 4-bit differential data of 16-bit digital data, is decoded by the decoder 270 , supplied to the SPU 260 , converted thereby into analog data, and applied to drive the speaker 264 .
  • the PCM data which is recorded as 16-bit digital data, is decoded by the decoder 270 and then applied to drive the speaker 264 .
  • the manual controller 16 has first and second control pads 34 , 36 , an L (Left) button 38 L, an R (Right) button 38 R, a start button 40 , and a selection button 42 .
  • the manual controller 16 also has joysticks 44 , 46 for inputting analog control actions, a mode selection switch 48 for selecting control modes of the joysticks 44 , 46 , and a mode indicator 50 for indicating a selected control mode.
  • the mode indicator 50 comprises a light-emitting element such as a light-emitting diode or the like.
  • the manual controller 16 has a housing 104 comprising an upper member 100 and a lower member 102 which are mated and joined to each other by fasteners such as screws.
  • a pair of left and right grips 106 , 108 projects from one side of respective opposite ends of the housing 104 .
  • the left and right grips 106 , 108 are shaped so as to be gripped by the palms of left and right hands of the user or game player when the manual controller 16 is connected to the entertainment apparatus 12 and information retrieval is carried out or the game is played thereby, for example.
  • the left and right grips 106 , 108 are progressively spaced away from each other toward their distal ends.
  • the left and right grips 106 , 108 are tapered from their joint with the housing 104 toward their distal ends, and have arcuate outer peripheral surfaces and arcuate distal end surfaces.
  • the first control pad 34 is disposed on one end of the housing 104 and comprises a first pressable control member (up button) 110 a , a second pressable control member (right button) 110 b , a third pressable control member (down button) 110 c , and a fourth pressable control member (left button) 110 d .
  • the first through fourth pressable control members 110 a , 110 b , 110 c , 110 d project on an upper surface of the housing 104 and are arranged in a crisscross pattern.
  • the first control pad 34 includes switch elements as signal input elements associated respectively with the first through fourth pressable control members 110 a , 110 b , 110 c , 110 d .
  • the first control pad 34 functions as a directional controller for controlling the direction of movement of a displayed game character, for example.
  • When the game player selectively presses the first through fourth pressable control members 110 a , 110 b , 110 c , 110 d to turn on or off the switch elements associated respectively with the first through fourth pressable control members 110 a , 110 b , 110 c , 110 d , the displayed game character moves in the direction corresponding to the pressed one of the first through fourth pressable control members 110 a , 110 b , 110 c , 110 d.
  • The second control pad 36 is disposed on the other end of the housing 104 and comprises a first pressable control member (Δ button) 112 a , a second pressable control member (□ button) 112 b , a third pressable control member (× button) 112 c , and a fourth pressable control member (○ button) 112 d .
  • the first through fourth pressable control members 112 a , 112 b , 112 c , 112 d project on the upper surface of the housing 104 and are arranged in a crisscross pattern.
  • the first through fourth pressable control members 112 a , 112 b , 112 c , 112 d are constructed as independent members, and associated with respective switch elements disposed in the second control pad 36 .
  • the second control pad 36 serves as a function setting/performing unit for setting functions for a displayed game character assigned to the pressable control members 112 a - 112 d or performing functions of a displayed game character when the switch elements associated with the pressable control members 112 a - 112 d are turned on.
  • the L button 38 L and the R button 38 R are disposed on a side of the housing 104 remote from the left and right grips 106 , 108 and positioned respectively at the opposite ends of the housing 104 .
  • the L button 38 L has a first left pressable control member (L 1 button) 114 a and a second left pressable control member (L 2 button) 114 b
  • The R button 38 R has a first right pressable control member (R 1 button) 116 a and a second right pressable control member (R 2 button) 116 b .
  • the L button 38 L and the R button 38 R have respective switch elements associated respectively with the pressable control members (the L 1 button 114 a , the L 2 button 114 b , the R 1 button 116 a , and the R 2 button 116 b).
  • the L button 38 L and the R button 38 R serve as respective function setting/performing units for setting functions for a displayed game character assigned to the pressable control members 114 a , 114 b and 116 a , 116 b or performing functions of a displayed game character when the switch elements associated with the pressable control members 114 a , 114 b and 116 a , 116 b are turned on.
  • the manual controller 16 also has first and second analog control pads 118 , 120 disposed respectively at confronting corners defined between the housing 104 and the proximal ends of the left and right grips 106 , 108 which are joined to the housing 104 .
  • The first and second analog control pads 118 , 120 have the respective joysticks 44 , 46 which can be tilted in all directions (360°) about control shafts thereof, and respective signal input elements such as variable resistors or the like which are operable by the respective joysticks 44 , 46 .
  • the control shafts of the left and right joysticks 44 , 46 are normally urged to return to their neutral positions by biasing members.
  • the left and the right joysticks 44 , 46 can be freely tilted in all directions (360°) about the axes of the control shafts.
  • the first and second analog control pads 118 , 120 can move a displayed game character while rotating the same or while changing its speed, and can make an analog-like action such as to change the form of a displayed character, when the game player manipulates the joysticks 44 , 46 . Therefore, the first and second analog control pads 118 , 120 are used as a control unit for entering command signals for a displayed character to perform the above movement or action.
  • Analog input values produced when the user moves the left and right joysticks 44 , 46 are available in a vertical range from an uppermost vertical value of “0” to a lowermost vertical value of “255” and a horizontal range from a leftmost horizontal value of “0” to a rightmost horizontal value of “255”.
  • the first and second analog control pads 118 , 120 output other signals different from those signals in the horizontal and vertical ranges.
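The 0-255 ranges described above map naturally onto signed axis values for game logic. A small sketch follows; the dead-zone size is an assumption, since the document does not mention one.

```python
def stick_to_axis(raw, dead_zone=0.08):
    """Map a raw joystick byte (0..255, 128 treated as neutral) to -1.0..1.0."""
    value = (raw - 128) / 127.0 if raw >= 128 else (raw - 128) / 128.0
    return 0.0 if abs(value) < dead_zone else value

print(stick_to_axis(0), stick_to_axis(128), stick_to_axis(255))   # -1.0 0.0 1.0
```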
  • the mode selection switch 48 When the mode selection switch 48 is pressed, it can select a control mode for allowing a command signal to be inputted from the first and second analog control pads 118 , 120 or a control mode for inhibiting a command signal from being inputted from the first and second analog control pads 118 , 120 .
  • the mode selection switch 48 When the mode selection switch 48 is pressed, the functions of the first through fourth pressable control members 112 a , 112 b , 112 c , 112 d of the second control pad 36 , and the functions of the pressable control members 114 a , 114 b and 116 a , 116 b of the L button 38 L and the R button 38 R are changed depending on the control mode selected by the pressed mode selection switch 48 .
  • the mode indicator 50 flickers and changes its indication light.
  • the manual controller 16 comprises a communication controller 150 , a CPU 152 , a program memory 154 , a working RAM 156 , a digital input block 158 , an analog input block 160 , a left motor driver 170 L, a left motor 130 L, a right motor driver 170 R, and a right motor 130 R.
  • the digital input block 158 functions as a manual input controller for the pressable control members 110 a - 110 d of the first control pad 34 and the pressable control members 112 a - 112 d of the second control pad 36 .
  • the analog input block 160 functions as a manual input controller for the left and right joysticks 44 , 46 .
  • the digital input block 158 and the analog input block 160 allow the user to enter various items of information into the manual controller 16 .
  • the communication controller 150 has a function to effect serial communications with an external device.
  • the communication controller 150 is electrically connectable to the communication controller 210 (see FIG. 1) of the entertainment apparatus 12 , for example, for data communications with the entertainment apparatus 12 .
  • The entertainment system 10 has a characteristic function: when a certain condition is satisfied while at least two hypothetical characters are making a normal action, at least one of the hypothetical characters changes from the normal action to a special action, and when a control action made by the user satisfies a certain control condition indicated by a guidance displayed on the display monitor, the special action performed by the hypothetical character is completed.
  • One of the two hypothetical characters displayed on the display monitor 18 is used as a principal character 300 that moves under control commands from the user, and the other hypothetical character is used as a monster 302 confronting the principal character 300 .
  • The normal battle action is an action controlled through the control members 110 a - 110 d and 112 a - 112 d of the manual controller 16 ; e.g., when the user presses the ○ button 112 d , the principal character 300 attacks the monster 302 with a sword 304 , and when the user presses the × button 112 c , the principal character 300 defends himself from the monster 302 .
  • When a certain condition is satisfied, the principal character 300 changes from the normal battle action to a special battle action.
  • the special battle action is an action to deliver a dead blow such as a succession of attacking techniques in a combat game, for example. When such a dead blow strikes the monster 302 , the HP of the monster 302 is instantaneously eliminated, and the monster 302 falls down.
  • an icon 306 indicative of the special battle action is displayed for 2 seconds, for example, in the vicinity of the principal character 300 .
  • the user can recognize that the principal character 300 enters the special battle action.
  • a guidance 310 is displayed substantially centrally on the display screen of the display monitor 18 .
  • The guidance 310 comprises an area 312 for displaying a bar 320 (hereinafter referred to as “bar display area 312 ”), and an area 316 for displaying indicia 314 representing the four control members, i.e., the ○ button 112 d , the Δ button 112 a , the × button 112 c , and the □ button 112 b (hereinafter referred to as “indicia display area 316 ”).
  • At least the bar display area 312 is displayed in a semitransparent fashion. In the example shown in FIG. 7, the indicia display area 316 is displayed as a strip. As shown in FIG. 8, however, the indicia display area 316 with the indicia 314 displayed therein may be positioned above the bar display area 312 , and strips 318 may be displayed in the bar display area 312 at respective positions aligned with the respective indicia 314 .
  • a bar 320 progresses or moves from the left end of the bar display area 312 toward the right end thereof with certain timing. While the bar 320 is progressing, the principal character 300 performs the special battle action different from the normal battle action, as shown in FIG. 7. In FIG. 7, the principal character 300 jumps backward, and at the same time cuts the monster 302 with the sword 304 which the principal character 300 is holding.
  • The software for performing the above function comprises a battle mode processing means 400 .
  • The battle mode processing means 400 can be supplied to the entertainment apparatus 12 from a randomly accessible recording medium such as the optical disk 20 or the memory card 14 , or via a network. It is assumed in the present embodiment that the battle mode processing means 400 is read from the optical disk 20 into the entertainment apparatus 12 .
  • the battle mode processing means 400 is downloaded in advance from the optical disk 20 played back by the entertainment apparatus 12 into the main memory 222 in the control system 200 thereof according to a predetermined process, and executed by the MPU 220 of the control system 200 .
  • the battle mode processing means 400 comprises a normal action rendering means 402 for displaying a normal action image in which at least two hypothetical characters (the principal character 300 and the monster 302 ) are doing a normal action, a special action rendering means 404 for displaying a special action image in which at least one of the hypothetical characters (the principal character 300 ) changes from a normal action to a special action when a certain condition is satisfied, and the special action made by the hypothetical character (the principal character 300 ) is completed when a control action made by the user satisfies a certain control condition in the guidance 310 , and an image display means 406 for outputting image data stored in the image memory 244 to the display monitor 18 to enable the display monitor 18 to display an image based on the image data.
  • The special action rendering means 404 comprises a condition determining means 410 for determining whether a condition to enter the special action is satisfied or not, an icon rendering means 412 for displaying the icon 306 (see FIG. 6) indicative of the special action when the hypothetical character (the principal character 300 ) changes from the normal action to the special action, a guidance rendering means 414 for rendering the guidance 310 (see FIG. 7) which indicates at least one of the control members of the manual controller 16 , a bar rendering means 416 for rendering the bar 320 which progresses toward the indicia 314 , and a special action continuation determining means 418 for determining, from the timing of the user's control inputs relative to the guidance 310 , whether the principal character 300 is to continue the special action.
  • the guidance rendering means 414 has an indicia selecting means 420 for changing the number of indicia 314 displayed in the guidance 310 and an array of those indicia 314 depending on the characteristics of the opponent, i.e., the monster 302 .
  • the bar rendering means 416 has a progressing pattern selecting means 422 for changing a progressing pattern of the bar 320 depending on the characteristics of the opponent, i.e., the monster 302 .
  • a processing sequence of the battle mode processing means 400 will be described below with reference to FIGS. 10 through 18 .
  • the normal action rendering means 402 of the battle mode processing means 400 performs its own processing sequence.
  • In step S 1 , the normal action rendering means 402 stores initial values “0” respectively in an index register i used to update the number of times that a special attack is continued, an index register j used to update the display of the icon 306 , and an index register k used to cause the principal character 300 to perform a normal attack action, thus initializing these registers i, j, k.
  • In step S 2 , the normal action rendering means 402 generates a random number.
  • In step S 3 , the normal action rendering means 402 selects an attack action pattern corresponding to the generated random number from a plurality of attack action patterns.
  • Each of the attack action patterns comprises an array of plural action data, and object data of the monster 302 is rendered based on the array of plural action data for thereby making one attack action on the display monitor 18 .
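Steps S2 through S7 amount to picking one attack pattern at random and stepping through its action data frame by frame. The sketch below illustrates that loop; the data layout and names are hypothetical stand-ins for the vertex-level action data described in the patent.

```python
import random

# Each attack action pattern is an ordered array of per-frame action data;
# here each entry is a label standing in for the vertex/pose updates.
ATTACK_PATTERNS = [
    ["raise_claw", "swing_claw", "recover"],
    ["crouch", "lunge", "bite", "recover"],
]

def play_random_attack(rng=None):
    """Pick one pattern at random (steps S2-S3) and yield its action data in order."""
    rng = rng or random.Random()
    pattern = rng.choice(ATTACK_PATTERNS)
    for m, action_data in enumerate(pattern):   # index register m, steps S4-S7
        yield m, action_data                    # would rewrite the monster's object data

for m, action in play_random_attack(random.Random(1)):
    print(m, action)
```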
  • In step S 4 , the normal action rendering means 402 stores an initial value “0” in an index register m used to cause the monster 302 to make an attack action, thus initializing the index register m. Thereafter, in step S 5 , the normal action rendering means 402 stores a background image based on the coordinates of a viewpoint in the image memory 244 .
  • In step S 6 , the normal action rendering means 402 reads mth action data from the selected attack action pattern.
  • In step S 7 , the normal action rendering means 402 determines whether there is action data or not. If there is action data, then control proceeds to step S 8 in which the normal action rendering means 402 rewrites vertex data of the object data of the monster 302 based on the content of the action data.
  • In step S 9 , the normal action rendering means 402 renders three-dimensional image data of the monster 302 based on the object data of the monster 302 , and stores the three-dimensional image data of the monster 302 in the image memory 244 .
  • In step S 10 shown in FIG. 11, the normal action rendering means 402 determines whether a special attack is being continued or not based on whether a special attack flag is set to “1” or not. If the special attack flag is “0”, indicating that a special attack is not presently waged, then control goes to step S 11 in which the normal action rendering means 402 determines whether a control input from the manual controller 16 is different from the preceding control input or not.
  • A different control input is entered when the user presses one of the direction buttons 110 a - 110 d or the ○ button 112 d , for example, after the user has not pressed any control members or buttons, or when the user presses the right button 110 b , for example, after the user has pressed the left button 110 d .
  • If a control input from the manual controller 16 is judged as being different from the preceding control input in step S 11 , then control goes to step S 12 in which the normal action rendering means 402 stores an initial value “0” in the index register k, thus initializing the index register k. Thereafter, in step S 13 , the normal action rendering means 402 calculates an action of the principal character 300 based on the present control input, thereby generating an action pattern comprising a plurality of action data. If a control input from the manual controller 16 is judged as being the same as the preceding control input in step S 11 , then control goes to step S 14 in which the normal action rendering means 402 increments the value of the index register k by “1”.
  • In step S 15 , the normal action rendering means 402 determines whether the present action pattern is an attack pattern or not based on whether the control input is an input corresponding to the ○ button 112 d , for example, which indicates an attack.
  • If the present action pattern is an attack pattern, then control goes to step S 16 in which the normal action rendering means 402 reads kth action data from the present action pattern. In step S 17 , the normal action rendering means 402 determines whether there is action data or not, i.e., whether a normal attack action is finished or not.
  • If there is action data, i.e., if a normal attack action is not finished, then control goes to step S 18 in which the normal action rendering means 402 rewrites vertex data of the object data of the principal character 300 based on the content of the present action data.
  • In step S 19 , the normal action rendering means 402 renders three-dimensional image data of the principal character 300 based on the object data of the principal character 300 , and stores the three-dimensional image data of the principal character 300 in the image memory 244 .
  • A hidden surface removal process for the background image and the three-dimensional images of the principal character 300 and the monster 302 is carried out according to a Z-buffering process.
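For reference, Z buffering keeps, per pixel, the depth of the nearest fragment written so far and discards anything farther away. A minimal sketch follows; the buffer layout and the convention that smaller z is nearer are assumptions.

```python
def zbuffer_write(color_buf, depth_buf, x, y, z, color):
    """Write a fragment only if it is nearer than what is already stored."""
    if z < depth_buf[y][x]:          # smaller z = nearer (assumed convention)
        depth_buf[y][x] = z
        color_buf[y][x] = color

W, H = 4, 3
color = [[(0, 0, 0)] * W for _ in range(H)]
depth = [[float("inf")] * W for _ in range(H)]
zbuffer_write(color, depth, 1, 1, 0.8, (90, 90, 90))    # background fragment
zbuffer_write(color, depth, 1, 1, 0.3, (255, 0, 0))     # nearer character fragment wins
print(color[1][1])                                       # (255, 0, 0)
```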
  • If the present action pattern is not an attack pattern in step S 15 , then control proceeds to step S 20 shown in FIG. 13 in which the normal action rendering means 402 reads kth action data from the present action pattern. Thereafter, in step S 21 , the normal action rendering means 402 determines whether there is action data or not, i.e., whether a normal action other than an attack action, e.g., a direction changing action or a defensive action, is finished or not.
  • If there is action data, i.e., if a normal action other than an attack action is not finished, then the processing in step S 18 and step S 19 shown in FIG. 11 is performed, and three-dimensional image data of the principal character 300 is stored in the image memory 244 .
  • If a normal action other than an attack action is finished in step S 21 , then control returns to step S 12 shown in FIG. 13, and the normal action rendering means 402 generates an action pattern again.
  • When the rendering process in step S 19 is finished, control goes to step S 22 shown in FIG. 12 in which the normal action rendering means 402 determines whether the icon 306 is being presently displayed or not based on whether an icon display flag is set to “1” or not. If the icon 306 is not being displayed, then control goes to step S 23 in which the condition determining means 410 determines whether a condition to wage a special attack is satisfied or not.
  • The condition determining means 410 determines whether a condition to wage a special attack is satisfied or not based on whether the monster 302 is put in a disadvantageous position, e.g., whether the monster 302 stands with its back facing a wall or puts a foot into a step, or whether the HP (hit point) of the principal character 300 becomes equal to or lower than a certain value.
  • If a condition to wage a special attack is not satisfied, then control goes via a decision process in step S 24 to step S 25 in which the image display means 406 outputs image data stored in the image memory 244 to the display monitor 18 , which displays a corresponding image on the display screen.
  • When the processing in step S 25 is finished, control goes back to step S 5 in FIG. 10, and the processing from step S 5 is repeated.
  • If there is no action data and an attack action by the monster 302 is finished in step S 7 , then control goes to step S 26 in which the normal action rendering means 402 calculates an effect imposed on the principal character 300 by the attack from the monster 302 based on the hitting ratio of the attack from the monster 302 , the level difference between the monster 302 and the principal character 300 , the attacking power of the monster 302 , the physical strength of the principal character 300 , and the protective gear that the principal character 300 wears. If the principal character 300 is not hit by the attack from the monster 302 , then the effect imposed on the principal character 300 by the attack from the monster 302 is “0”.
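Step S26 folds several character statistics into a single damage figure, but the patent does not give the formula. The sketch below is only one plausible combination; every coefficient, and the way the statistics are weighted, is an assumption.

```python
import random

def attack_effect(hit_ratio, attacker_level, defender_level,
                  attack_power, defender_strength, armor, rng=None):
    """Return the HP reduction inflicted by one attack (illustrative only).

    A missed attack returns 0, matching the description that the effect
    is "0" when the attack does not hit.
    """
    rng = rng or random.Random()
    if rng.random() > hit_ratio:                   # the attack misses
        return 0
    level_bonus = max(0, attacker_level - defender_level)
    raw = attack_power + 2 * level_bonus           # assumed weighting
    mitigation = defender_strength // 4 + armor    # assumed weighting
    return max(1, raw - mitigation)

print(attack_effect(0.9, 12, 10, 30, 20, 3, rng=random.Random(0)))
```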
  • In step S 27 , the normal action rendering means 402 reduces the HP of the principal character 300 based on the calculated effect.
  • In step S 28 , the normal action rendering means 402 determines whether the HP of the principal character 300 is equal to “0” or not.
  • If the HP of the principal character 300 is not equal to “0”, then control goes back to step S 2 to render image data for a next attack action by the monster 302 . If the HP of the principal character 300 is equal to “0”, then control goes to step S 29 in which the normal action rendering means 402 performs a game-over process, e.g., displays a message indicative of a game over. The processing sequence of the battle mode processing means 400 is now put to an end.
  • If there is no action data, i.e., if a normal attack action by the principal character 300 is finished, in step S 17 shown in FIG. 11, then control goes to step S 30 shown in FIG. 14 in which the normal action rendering means 402 calculates an effect imposed on the monster 302 by the attack from the principal character 300 based on the hitting ratio of the attack from the principal character 300 , the level difference between the principal character 300 and the monster 302 , the attacking power of the principal character 300 , the physical strength of the principal character 300 , and the defending ratio of the monster 302 .
  • In step S 31 , the normal action rendering means 402 reduces the HP of the monster 302 based on the calculated effect.
  • In step S 32 , the normal action rendering means 402 determines whether the HP of the monster 302 is equal to “0” or not.
  • If the HP of the monster 302 is not equal to “0”, then the normal action rendering means 402 initializes the index register k in step S 33 . Thereafter, control goes back to step S 16 in FIG. 11 to render image data for a next attack action by the principal character 300 . If the HP of the monster 302 is equal to “0”, then control goes to step S 34 in which the normal action rendering means 402 displays an image showing a victory action, e.g., a triumphant gesture, made by the principal character 300 , and an acquisition of a prize from the monster 302 . The processing sequence of the battle mode processing means 400 is now put to an end.
  • If a condition to wage a special attack is satisfied in step S 23 in FIG. 12, then control goes via the decision process in step S 24 to a processing sequence of the icon rendering means 412 of the special action rendering means 404 .
  • The icon rendering means 412 sets the icon display flag to “1” in step S 35 shown in FIG. 15, and then renders the icon 306 indicative of the beginning of a special attack in the vicinity of the principal character 300 in step S 36 .
  • In step S 37 , the icon rendering means 412 increments the value of the index register j by “1”. Thereafter, in step S 38 , the icon rendering means 412 determines whether the icon 306 has been displayed for a predetermined time, e.g., 2 seconds, or not, based on whether or not the value of the index register j is equal to or greater than a number A which corresponds to the number of frames for 2 seconds.
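Steps S37 and S38 time the icon with a frame counter. A small sketch, assuming a 60 Hz display so that the number A corresponding to 2 seconds is 120 frames (the patent does not state the frame rate).

```python
FRAME_RATE = 60     # assumed display rate
A = 2 * FRAME_RATE  # number of frames corresponding to the 2-second icon display

def icon_finished(j):
    """Mirror of the step S38 test: has the icon been shown long enough?

    j plays the role of the index register incremented once per frame.
    """
    return j >= A

j = 0
while not icon_finished(j):
    j += 1          # one rendered frame with the icon displayed
print(j)            # 120 frames, i.e. 2 seconds at 60 Hz
```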
  • If the icon 306 has not been displayed for 2 seconds, then control goes to step S 25 in FIG. 12 to display image data. At this time, the icon 306 is displayed in the vicinity of the principal character 300 which is doing a normal attack action.
  • If the icon 306 has been displayed for 2 seconds, then, in step S 39 , the icon rendering means 412 initializes the icon display flag, and sets a special attack flag to “1”, after which control goes to step S 25 in FIG. 12 to display image data.
  • After the special attack flag is set to “1”, control goes via step S 10 shown in FIG. 11 to step S 40 shown in FIG. 16 in which the guidance rendering means 414 reads image data of the bar display area 312 and the indicia display area 316 of the guidance 310 .
  • step S 41 the indicia selecting means 420 generates image data of the guidance 310 based on the number of indicia 314 and an array of those indicia 314 depending on the characteristics of the monster 302 with which the principal character 300 is presently fighting.
  • step S 42 the indicia selecting means 420 stores the generated image data of the guidance 310 as semitransparent image data in the image memory 244 .
  • step S 43 the guidance rendering means 414 determines whether the present cycle is a first cycle or not based on whether the value of the index register i is “0” or not. If the present cycle is a first cycle, then control goes to step S 44 in which the guidance rendering means 414 stores an initial value “0” in an index register n used to display the bar 320 , thus initializing the index register n.
  • step S 45 the progressing pattern selecting means 422 reads a bar display progressing pattern corresponding to the present monster 302 from a plurality of bar display progressing patterns.
  • Each of the bar display progressing patterns comprises a plurality of registered progress data each representing the interval which the bar progresses per frame, then control goes to step S 46 in which the progressing pattern selecting means 422 increments the value of the index register i by “1”.
  • the bar rendering means 416 performs its processing sequence. First, in step S 47 , the bar rendering means 416 reads nth progress data from the bar display progressing pattern read in step S 45 . Then, in step S 48 , the bar rendering means 416 determines whether there is progress data or not. If there is progress data, then control goes to step S 49 in which the bar rendering means 416 renders the bar 320 as a semitransparent bar whose leading end 320 a is in substantially alignment with the position represented by the progress data.
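  • The per-frame handling of the progress data in steps S45 through S49 can be pictured with the hypothetical sketch below; the type and function names, the use of a pixel x coordinate, and the exhaustion check are assumptions made for illustration rather than details of the disclosed program.

      #include <cstddef>
      #include <vector>

      // Hypothetical sketch of steps S45-S49: a per-monster "bar display
      // progressing pattern" is a table of progress data, one entry per frame,
      // giving the position of the leading end 320a of the bar 320.
      struct BarProgressPattern {
          std::vector<int> progressData;   // x coordinate of the leading end per frame
      };

      struct BarRenderer {
          const BarProgressPattern* pattern = nullptr;
          std::size_t n = 0;               // index register n

          void start(const BarProgressPattern& p) { pattern = &p; n = 0; }  // steps S44-S45

          // Returns false when the pattern is exhausted, i.e., the "no progress
          // data" branch of step S48 (see the paragraph on step S64 below).
          bool advancePerFrame(int& leadingEndX) {
              if (pattern == nullptr || n >= pattern->progressData.size()) return false;
              leadingEndX = pattern->progressData[n];  // step S47: read nth progress data
              ++n;                                     // advance to the next frame's data
              return true;                             // step S49: render the bar up to leadingEndX
          }
      };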
  • Thereafter, the special action continuation determining means 418 performs its processing sequence. First, in step S50 shown in FIG. 17, the special action continuation determining means 418 determines whether the leading end 320 a of the bar 320 has reached the indicia display area 316 (see FIG. 7) or one of the strips 318 (see FIG. 8), i.e., whether the time to prompt the user to press the control member has been reached or not, based on the coordinates of the leading end 320 a of the bar 320 and the range in which the indicia display area 316 or the strip 318 is displayed.
  • In step S51, the special action continuation determining means 418 determines whether there is a matching control input from the manual controller 16 or not, i.e., whether a control member represented by the indicia 314 corresponding to the indicia display area 316 or the strip 318 is pressed by the user or not. If there is a matching control input from the manual controller 16, then control goes to step S52, in which the special action continuation determining means 418 sets a matching flag to "1".
  • In step S53, the special action continuation determining means 418 stores an initial value "0" in an index register p used to cause the principal character 300 to perform a special action, thus initializing the index register p.
  • In step S54, the special action continuation determining means 418 selects a special action pattern corresponding to the indicia 314, for example, from a plurality of special action patterns.
  • Each of the special action patterns comprises an array of plural action data, and object data of the principal character 300 is rendered based on the array of plural action data for thereby making one special action on the display monitor 18.
  • If there is no matching control input in step S51, then control goes to step S55, in which the special action continuation determining means 418 determines whether a matching control input from the manual controller 16 has already been entered or not based on whether the matching flag is set to "1" or not. If a matching control input from the manual controller 16 has already been entered, then control goes to step S56, in which the special action continuation determining means 418 increments the value of the index register p by "1".
  • In step S57, the special action continuation determining means 418 sets the matching flag to "0", thus resetting the matching flag.
  • In step S58, the special action continuation determining means 418 increments the value of the index register p by "1".
  • When the processing in step S54, the processing in step S56, or the processing in step S58 is finished, control goes to step S59 shown in FIG. 18, in which the special action continuation determining means 418 reads pth action data from the selected special action pattern. Thereafter, control goes to step S60, in which the special action continuation determining means 418 rewrites vertex data of the object data of the principal character 300 based on the content of the present action data. Then, in step S61, the special action continuation determining means 418 renders three-dimensional image data of the principal character 300 based on the object data of the principal character 300, and stores the three-dimensional image data of the principal character 300 in the image memory 244.
  • When the processing in step S61 is finished, control goes to step S25 shown in FIG. 12, in which the image display means 406 outputs image data stored in the image memory 244 to the display monitor 18, which displays an image thereon based on the image data. After the processing in step S25, control returns to step S5 shown in FIG. 10 to repeat the processing from step S5.
  • The processing of steps S5 through S10, steps S40 through S61, and step S25 is repeated to display an image in which the principal character 300 makes a special action to deliver a succession of attacking techniques to the monster 302.
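  • The branching of steps S50 through S58 that decides whether the special action continues can be condensed into the hypothetical per-frame check below. The condensation, the enum of control members and all names are assumptions for illustration; in particular, the handling of the frames before the prompting time is simplified compared with the flowchart.

      // Hypothetical condensation of the continuation test of FIG. 17: when the
      // leading end 320a lies inside the region of an indicia 314, the user must
      // press the matching control member, otherwise the special action ends.
      enum class Button { Circle, Cross, Triangle, Square, None };

      enum class Outcome { ContinueSpecial, BackToNormal };

      Outcome checkContinuation(bool promptReached, Button pressed, Button expected,
                                bool& matchingFlag) {
          if (!promptReached)                  // prompting time not reached yet: keep going
              return Outcome::ContinueSpecial;
          if (pressed == expected) {           // steps S51-S52: matching control input
              matchingFlag = true;
              return Outcome::ContinueSpecial; // steps S53-S54: next technique is selected
          }
          if (matchingFlag) {                  // steps S55-S57: a match was already entered
              matchingFlag = false;            //   for this indicia; carry on playing it
              return Outcome::ContinueSpecial;
          }
          return Outcome::BackToNormal;        // steps S62-S63: miss, end the special action
      }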
  • If a matching control input from the manual controller 16 has not already been entered in step S55, then control goes from step S55 shown in FIG. 17 to step S62, in which the special action continuation determining means 418 initializes the index register i.
  • In step S63, the special action continuation determining means 418 stores "0" in the special attack flag, thus resetting the special attack flag.
  • Control then returns to step S5 shown in FIG. 10, in which the normal action rendering means 402 performs its processing sequence. That is, the principal character 300 changes from the special action back to the normal action.
  • If a matching input is entered with respect to each of all the indicia 314 displayed in the guidance 310, then control goes from step S48 shown in FIG. 16 to step S64, in which the special action continuation determining means 418 initializes the index register i. Thereafter, in step S65, the special action continuation determining means 418 stores "0" in the special attack flag, thus resetting the special attack flag. Control then goes to step S34 shown in FIG. 14, in which the special action continuation determining means 418 displays an image showing a victory action, e.g., a triumphant gesture, made by the principal character 300, and an acquisition of a prize from the monster 302. The processing sequence of the battle mode processing means 400 is now put to an end.
  • As described above, the battle mode processing means 400, as applied to a battle scene in a role-playing game or a combat game, for example, first performs a normal action, i.e., a normal battle action, between the principal character 300 and the monster 302.
  • When a certain condition is satisfied in the normal battle action, the principal character 300 changes from the normal battle action to a special battle action.
  • The special battle action is an action to deliver a dead blow such as a succession of attacking techniques in a combat game, for example.
  • When the principal character 300 changes from the normal action to the special action, the change is indicated by the icon 306. Therefore, the user is aware of the entry into the special action, i.e., can be mentally prepared to enter the special action, and hence can subsequently make smooth control actions to perform the special action.
  • The icon 306 is thus effective to allow children and old people to enjoy the video game.
  • The guidance 310 includes an indicia 314 rendered to indicate at least one control member on the manual controller 16.
  • The time to prompt the user to press a control member corresponding to the indicia 314 and the time at which the user presses the control member are compared with each other, and it is determined whether the principal character 300 is to make a special action or not based on the compared result. The user thus only needs to operate the control member corresponding to the indicia 314 at the prompting time indicated by the guidance 310.
  • The prompting time is the time when the leading end 320 a of the bar 320, as it progresses randomly from the left end to the right end of the bar display area 312, reaches the indicia display area 316 or the strip 318 corresponding to the indicia 314. Therefore, the user may press the control member corresponding to the indicia 314 at the time when the leading end 320 a of the bar 320 reaches the indicia display area 316 or the strip 318.
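  • In geometric terms, the prompting time reduces to a coordinate test on the leading end 320 a. The small sketch below shows one hypothetical form such a test could take; the pixel values and the function name are invented for illustration and do not appear in the embodiment.

      // Hypothetical coordinate test for the prompting time: the leading end of the
      // bar is "at" an indicia while its x coordinate lies inside the horizontal
      // range occupied by the indicia display area 316 or by one of the strips 318.
      struct Range { int left; int right; };   // pixel range of one strip or indicia area

      bool isPromptTime(int leadingEndX, const Range& indiciaRange) {
          return leadingEndX >= indiciaRange.left && leadingEndX <= indiciaRange.right;
      }

      // Example: for a strip spanning x = 180..220, isPromptTime(200, {180, 220})
      // is true and isPromptTime(230, {180, 220}) is false.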
  • Accordingly, the user can deliver a succession of attacking techniques which are highly difficult to achieve, after being trained to a certain extent in a relatively short period of time. Consequently, the user will not easily give up the video game while playing the video game.
  • If the guidance 310 is displayed in a lower region or a corner of the display screen of the display monitor 18, then the user may possibly overlook a special action made by the principal character 300 displayed in a central area of the display screen, because the user may be distracted by the progress of the bar 320 displayed in the lower region or the corner of the display screen.
  • However, the guidance 310 and the bar 320 are displayed as semitransparent images, and hence can be displayed in a large size in the center of the display screen, as shown in FIG. 7, rather than in a lower region or a corner of the display screen.
  • As a result, the user can see a special action made by the principal character 300 and press control members on the manual controller 16 while simultaneously confirming the progressing pattern of the bar 320. Therefore, the user can see and enjoy various special actions made by the principal character 300.

Abstract

An entertainment system has a normal action rendering means for displaying a normal action image in which at least two hypothetical characters are making a normal action, and a special action rendering means for displaying a special action image in which at least one of the hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by the one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance, and an image display means for outputting image data stored in an image memory to a display monitor to enable the display monitor to display an image based on the image data.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to an entertainment system having at least an entertainment apparatus for executing various programs, a manual controller, and a display unit, an entertainment apparatus for executing various programs, a recording medium storing a program and data that are used by the entertainment apparatus, and a program for realizing the entertainment system. [0002]
  • 2. Description of the Related Art [0003]
  • Some entertainment systems including entertainment apparatus such as video game machines display video game images based on video game data stored in a recording medium such as a CD-ROM or the like on the display screen of a television receiver while allowing the user or game player to play the video game with commands entered via a manual controller. [0004]
  • In those entertainment systems, the entertainment apparatus and the manual controller are usually connected to each other by a serial interface. When a clock signal is supplied from the entertainment apparatus to the manual controller, the manual controller sends key switch information based on the user's control entries in synchronism with the clock signal. [0005]
  • Recently developed manual controllers incorporate a vibration generating means for applying vibrations to the user based on a request from an external apparatus such as an entertainment apparatus, for example. While a video game is in progress, the vibration generating means applies various different kinds of vibrations to the user in response to user's different control entries. [0006]
  • Games that are played on video game machines include role-playing games, shooting games, driving games, combat games, etc. [0007]
  • When the game player plays a role-playing game, which is constructed around one basic story, the game progresses as a principal character travels to various regions in the story, and experiences events or searches for hidden items in various scenes. Since the role-playing game is primarily focused on experiencing events and searching for hidden items, battle scenes, if any, in the game tend to be boring to the user or game player. [0008]
  • A combat game is designed for a principal character to attack and knock out an opponent, and offers a variety of attack patterns available for the principal character to employ. The user or game player selects an attack pattern depending on the situation in which the principal character is placed. The user or game player wages an attack on an opponent in the selected attack pattern at an optimum time to cause substantial damage to the opponent for thereby placing the principal character in an advantageous position in the progress of the game. Therefore, the user or game player can enjoy the excitement of the combat game. [0009]
  • In order to deliver a succession of attacking techniques in the combat game, however, the user or game player has to be well aware of sequences of pressing a plurality of control members on the manual controller and times to press those control buttons. It takes a considerable period of time until the user or game player gets used to well manipulating the manual controller and enjoys the combat game to a satisfactory extent. [0010]
  • SUMMARY OF THE INVENTION
  • It is therefore an object of the present invention to provide an entertainment system, an entertainment apparatus, a recording medium, and a program which are capable of turning battle scenes, which would tend to be monotonous in role-playing games, for example, into realistic scenes, allows the user or game player to easily deliver a succession of attacking techniques by operating a manual controller according to a guidance, and keeps the user or game player interested in a video game if applied to the video game. [0011]
  • According to an aspect of the present invention, there is provided an entertainment system comprising an entertainment apparatus for executing various programs, at least one manual controller for entering control requests from the user into the entertainment apparatus, a display unit for displaying images outputted from the entertainment apparatus, normal action rendering means for displaying a normal action image in which at least two hypothetical characters are making a normal action, and special action rendering means for displaying a special action image in which at least one of the hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by the one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance. [0012]
  • According to another aspect of the present invention, there is provided an entertainment apparatus for connection to a manual controller for outputting a control request from the user and a display unit for displaying images, comprising normal action rendering means for displaying an image in which at least two hypothetical characters are making a normal action, and special action rendering means for displaying an image in which at least one of the hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by the one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance. [0013]
  • According to still another aspect of the present invention, there is provided a recording medium storing a program and data for use in an entertainment apparatus for connection to a manual controller for outputting a control request from the user, and a display unit for displaying images, the program comprising the steps of displaying a normal action image in which at least two hypothetical characters are making a normal action, and displaying a special action image in which at least one of the hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by the one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance. [0014]
  • According to yet another aspect of the present invention, there is provided a program readable and executable by a computer, for use in an entertainment apparatus for connection to a manual controller for outputting a control request from the user, and a display unit for displaying images, the program comprising the steps of displaying a normal action image in which at least two hypothetical characters are making a normal action, and displaying a special action image in which at least one of the hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by the one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance. [0015]
  • If the present invention is applied to a battle scene in a role-playing game or a combat game, for example, a normal action, i.e., a normal battle action, is performed between the two hypothetical characters. When a certain condition is satisfied in the normal battle action, e.g., when one of the hypothetical characters stands with its back facing a wall or puts a foot into a step, at least one of the hypothetical characters changes from the normal battle action to a special battle action. The special battle action is an action to deliver a dead blow such as a succession of attacking techniques in a combat game, for example. [0016]
  • When the control input from the user satisfies a control condition indicated by the guidance displayed on the display monitor, the special battle action is completed, and the opponent, e.g., a monster, falls down. [0017]
  • Therefore, battle scenes, which would tend to be monotonous in role-playing games, for example, are turned into realistic scenes, and the user or game player is allowed to easily deliver a succession of attacking techniques by operating the manual controller according to the guidance. If the entertainment system according to the present invention is applied to play a video game, then the user or game player is continuously interested in the video game. [0018]
  • The special action rendering means or step may comprise icon rendering means for, or the step of, displaying an icon indicative of the special action when the hypothetical character changes from the normal action to the special action. Therefore, the user is aware of the entry into the special action, i.e., can be mentally prepared to enter the special action, and hence can subsequently make smooth control actions to perform the special action. The icon is thus effective to allow children and old people to enjoy the video game. [0019]
  • The special action rendering means or step may comprise guidance rendering means for, or the step of, rendering an indicia representing at least one of control members on the manual controller as the guidance, and special action continuation determining means for, or the step of, comparing a time to prompt the user to press the control member corresponding to the indicia with a time when the user presses the corresponding control member, and determining whether the hypothetical character is to make the special action based on a compared result. The user thus only needs to operate the control member corresponding to the indicia at the prompting time indicated by the guidance. [0020]
  • The special action continuation determining means or step may comprise means for, or the step of, transferring control over to the normal action rendering means when the time when the user presses the corresponding control member deviates from the time to prompt the user to press the control member corresponding to the indicia beyond an allowable range. Therefore, if the time when the user presses the corresponding control member deviates from the time to prompt the user to press the control member corresponding to the indicia while the hypothetical character is making the special action, then the hypothetical character which is making the special action returns to the normal action. [0021]
  • The special action rendering means or step may comprise means for, or the step of, changing the number of indicia and the time to prompt the user to press the control member corresponding to each of the indicia, depending on characteristics of the hypothetical character other than the hypothetical character which makes the special action, among the at least two hypothetical characters. For example, if the hypothetical character which is making the special action is a principal character and the other hypothetical character is a monster, then the number of indicia and the time to prompt the user to press the control member corresponding to each of the indicia may be changed depending on characteristics of the monster. Accordingly, elements of fun are added to battle scenes which would tend to be monotonous, keeping the user interested in the video game. [0022]
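  • As a purely illustrative picture of this idea, the number and order of indicia could simply be looked up per opponent type, as in the hypothetical sketch below; the monster names, table contents and function names are invented for illustration and are not part of the disclosure.

      #include <map>
      #include <string>
      #include <vector>

      enum class Button { Circle, Cross, Triangle, Square };

      // Hypothetical per-opponent table: a tougher monster demands more indicia,
      // i.e., more correctly timed presses, before the special action completes.
      const std::map<std::string, std::vector<Button>> kIndiciaByMonster = {
          {"slime",  {Button::Circle}},
          {"golem",  {Button::Circle, Button::Triangle, Button::Cross}},
          {"dragon", {Button::Circle, Button::Square, Button::Triangle,
                      Button::Cross, Button::Circle}},
      };

      // Returns the indicia array for the opponent the special action is aimed at;
      // unknown opponents fall back to a single indicia.
      std::vector<Button> selectIndicia(const std::string& monster) {
          auto it = kIndiciaByMonster.find(monster);
          return it != kIndiciaByMonster.end() ? it->second
                                               : std::vector<Button>{Button::Circle};
      }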
  • The special action rendering means or step may comprise bar rendering means for, or the step of, rendering a bar which progresses randomly toward the indicia, and the time to prompt the user to press the control member corresponding to the indicia may represent a time when a leading end of the bar reaches the indicia. Therefore, the user may press the control member corresponding to the indicia at the time when the leading end of the bar reaches the indicia. [0023]
  • The special action rendering means or step may comprise means for, or the step of, changing a progressing pattern of the bar depending on characteristics of the hypothetical character other than the hypothetical character which makes the special action, among the at least two hypothetical characters. [0024]
  • If the bar is displayed in a lower region or a corner of the display screen of the display unit, then the user may possibly overlook a special action made by the hypothetical character displayed in a central area of the display screen because the user may be distracted to the progress of the bar displayed in the lower region or the corner of the display screen. [0025]
  • If at least the bar is displayed as a semitransparent bar in front of the hypothetical characters, then the bar may be displayed as a semitransparent image of a large size in the center of the display screen, rather than in a lower region or a corner of the display screen. As a result, the user can see a special action made by the hypothetical character and press control members on the manual controller while simultaneously confirming the progressing pattern of the bar. Therefore, the user can see and enjoy various special actions made by the hypothetical character. [0026]
  • The above and other objects, features, and advantages of the present invention will become more apparent from the following description when taken in conjunction with the accompanying drawings in which a preferred embodiment of the present invention is shown by way of illustrative example. [0027]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a circuit arrangement of an entertainment system according to the present invention; [0028]
  • FIG. 2 is a perspective view of a manual controller; [0029]
  • FIG. 3 is a plan view of the manual controller; [0030]
  • FIG. 4 is a block diagram of a circuit arrangement of the manual controller; [0031]
  • FIG. 5 is a view showing a displayed image of a normal battle action of a principal character; [0032]
  • FIG. 6 is a view showing a displayed image similar to FIG. 5, with an icon displayed indicating that the principal character is about to wage a special attack; [0033]
  • FIG. 7 is a view showing a displayed image similar to FIG. 5, with the principal character waging a special attack as a bar progresses in a guidance; [0034]
  • FIG. 8 is a view showing another guidance; [0035]
  • FIG. 9 is a functional block diagram of a battle mode processing means according to the present invention; [0036]
  • FIG. 10 is a flowchart of a processing sequence of the battle mode processing means, particularly, a process of rendering a normal action of a monster with a normal action rendering means; [0037]
  • FIGS. 11 through 14 are flowcharts of a process of rendering a normal action of a principal character with the normal action rendering means; [0038]
  • FIG. 15 is a flowchart of a processing sequence of an icon rendering means of a special action rendering means; [0039]
  • FIG. 16 is a flowchart of a processing sequence of a guidance rendering means of the special action rendering means; [0040]
  • FIG. 17 is a flowchart of a processing sequence of a special action continuation determining means of the special action rendering means; and [0041]
  • FIG. 18 is a flowchart of a processing sequence of the special action rendering means. [0042]
  • DESCRIPTION OF THE PREFERRED EMBODIMENT
  • An entertainment system, an entertainment apparatus, a recording medium, and a program according to the present invention as applied to a video game apparatus will be described below with reference to FIGS. 1 through 18. [0043]
  • As shown in FIG. 1, an [0044] entertainment system 10 basically comprises an entertainment apparatus 12 for executing various programs, a memory card 14 detachably connected to the entertainment apparatus 12, a manual controller 16 detachably connected to the entertainment apparatus 12, and a display monitor 18 such as a television receiver which is supplied with video and audio output signals from the entertainment apparatus 12.
  • The [0045] entertainment apparatus 12 reads a game program recorded in a mass storage medium such as an optical disk 20 such as a CD-ROM or the like, and executes a game, for example, based on the program depending on commands supplied from the user, e.g., the game player, via the manual controller 16. The execution of the game mainly represents controlling the progress of the game by controlling the display of images and the generation of sounds on the display monitor 18 based on manual input actions entered from the manual controller 16.
  • The [0046] entertainment apparatus 12 generally comprises a control system 200, a graphic generating system 204 connected to the control system 200 via a system bus 202, a sound generating system 206 connected to the control system 200 via the system bus 202, and an optical disk control system 208 connected to the control system 200 via the system bus 202. A communication controller 210 for controlling data to be inputted to and outputted from the manual controller 16 and the memory card 14 is also connected to the control system 200 via the system bus 202.
  • The [0047] manual controller 16 supplies commands (including control data) from the user via a communication controller 150 (see FIG. 4) of the manual controller 16 and the communication controller 210 to the entertainment apparatus 12. The optical disk control system 208 includes an optical disk drive 212 in which the optical disk 20, which may comprise a CD-ROM or the like as a specific example of a recording medium according to the present invention, is loaded.
  • The [0048] control system 200 controls motions of characters displayed on the display monitor 18 based on a program and data read from the optical disk 20 and commands supplied from the manual controller 16.
  • The [0049] control system 200 includes an MPU 220 for controlling the entertainment apparatus 12, a main memory 222 used for operating various programs and storing various data, a peripheral device controller 224 for controlling interrupts and direct memory access (DMA) data transfer, a read-only memory (ROM) 226 which stores various programs such as an operating system program for managing the graphic generating system 204, the sound generating system 206, etc., and which has an OSD function to execute a control program such as kernel, and a real-time clock (RTC) 228 having a calendar and clock function. The main memory 222 is capable of executing at least the game program thereon.
  • The [0050] MPU 220 controls the entertainment apparatus 12 in its entirety by executing the operating system program stored in the ROM 226. The MPU 220 comprises a 32-bit RISC-CPU, for example.
  • When the [0051] entertainment apparatus 12 is turned on, the MPU 220 executes the operating system program stored in the ROM 226 to start controlling the graphic generating system 204, the sound generating system 206, etc.
  • When the operating system program is executed, the [0052] MPU 220 initializes the entertainment apparatus 12 in its entirety for confirming its operation, and thereafter controls the optical disc control system 208 to execute an application program such as a game program recorded in the optical disk 20.
  • As the application program such as a game program is executed, the [0053] MPU 220 controls the graphic generating system 204, the sound generating system 206, etc. depending on commands entered by the user for thereby controlling the display of images and the generation of music sounds and sound effects.
  • The [0054] graphic generating system 204 comprises a vector calculating unit 230 for performing floating point vector calculations required for geometry processing, an image processor 232 for generating image data under the control of the MPU 220 and outputting the generated image data to the display monitor 18, which is a CRT in this embodiment, a graphic interface (GIF) 234 for arbitrating transfer paths between the MPU 220, the vector calculating unit 230, and the image processor 232, and an image decoder 236 for decoding image data compressed and encoded by an orthogonal transform such as a discrete cosine transform.
  • The [0055] image processor 232 comprises a rendering engine 240, a memory interface 242, an image memory 244, and a display controller 246 such as a programmable CRT controller.
  • The rendering engine [0056] 240 renders (stores) image data into the image memory 244 via the memory interface 242 according to a rendering command supplied from the MPU 220.
  • A [0057] first bus 248 is connected between the memory interface 242 and the rendering engine 240, and a second bus 250 is connected between the memory interface 242 and the image memory 244. The first and second buses 248, 250 each have a 128-bit width to allow the rendering engine 240 to render (store) image data at a high speed into the image memory 244.
  • The rendering engine [0058] 240 is capable of rendering image data of 320×240 pixels or 640×480 pixels according to the NTSC or PAL standards on a real-time basis, i.e., ten-odd to several tens of times in a period of time ranging from 1/60 to 1/30 second.
  • The [0059] image memory 244 is of a unified memory structure capable of specifying a texture rendering area and a display rendering area as one area.
  • The [0060] display controller 246 serves to write texture data read from the optical disk 20 via the optical disk drive 212 and texture data generated in the main memory 222 into the texture rendering area of the image memory 244 via the memory interface 242, and read image data stored in the display rendering area of the image memory 244 via the memory interface 242 and output the read image data to the display monitor 18, which displays an image thereon based on the image data.
  • The [0061] sound generating system 206 comprises a sound processing unit (SPU) 260 for generating music sounds, sound effects, etc. based on instructions from the MPU 220, and a sound buffer 262 for storing music sounds, sound effects, etc. generated by the SPU 260. Audio signals representing music sounds, sound effects, etc. generated by the SPU 260 are supplied to audio terminals of the display monitor 18. The display monitor 18 has a speaker 264 which radiates music sounds, sound effects, etc. generated by the SPU 260 based on the supplied audio signals.
  • The [0062] SPU 260 has an ADPCM (adaptive differential PCM) demodulating function for reproducing 16-bit audio data which has been encoded as 4-bit differential signal by ADPCM, a reproducing function for reproducing waveform data stored in the sound buffer 262 to generate sound effects, etc., and a modulating function for modulating and reproducing the waveform data stored in the sound buffer 262.
  • The [0063] sound generating system 206 with these functions can be used as a sampling sound source which generates music sounds, sound effects, etc. based on the waveform data stored in the sound buffer 262 according to instructions from the MPU 220.
  • The optical [0064] disk control system 208 comprises an optical disk drive 212 for reproducing application programs and data recorded on the optical disk 20, a decoder 270 for decoding programs and data that are recorded with an error correcting code added thereto, and a buffer 272 for temporarily storing data read from the optical disk drive 212 so as to allow the data from the optical disk 20 to be read at a high speed. An auxiliary CPU 274 is connected to the decoder 270.
  • Audio data recorded on the [0065] optical disk 20 which is read by the optical disk drive 212 includes PCM data converted from analog sound signals, in addition to the ADPCM data.
  • The ADPCM data, which is recorded as 4-bit differential data of 16-bit digital data, is decoded by the [0066] decoder 270, supplied to the SPU 260, converted thereby into analog data, and applied to drive the speaker 264.
  • The PCM data, which is recorded as 16-bit digital data, is decoded by the [0067] decoder 270 and then applied to drive the speaker 264.
  • As shown in FIGS. 2 and 3, the [0068] manual controller 16 has first and second control pads 34, 36, an L (Left) button 38L, an R (Right) button 38R, a start button 40, and a selection button 42. The manual controller 16 also has joysticks 44, 46 for inputting analog control actions, a mode selection switch 48 for selecting control modes of the joysticks 44, 46, and a mode indicator 50 for indicating a selected control mode. The mode indicator 50 comprises a light-emitting element such as a light-emitting diode or the like.
  • As shown in FIG. 2, the [0069] manual controller 16 has a housing 104 comprising an upper member 100 and a lower member 102 which are mated and joined to each other by fasteners such as screws.
  • As shown in FIGS. 2 and 3, a pair of left and [0070] right grips 106, 108 projects from one side of respective opposite ends of the housing 104. The left and right grips 106, 108 are shaped so as to be gripped by the palms of left and right hands of the user or game player when the manual controller 16 is connected to the entertainment apparatus 12 and information retrieval is carried out or the game is played thereby, for example.
  • As shown in FIG. 3, the left and [0071] right grips 106, 108 are progressively spaced away from each other toward their distal ends. To allow the game player to grip the left and right grips 106, 108 comfortably for a long period of time, the left and right grips 106, 108 are tapered from their joint with the housing 104 toward their distal ends, and have arcuate outer peripheral surfaces and arcuate distal end surfaces.
  • As shown in FIGS. 2 and 3, the [0072] first control pad 34 is disposed on one end of the housing 104 and comprises a first pressable control member (up button) 110 a, a second pressable control member (right button) 110 b, a third pressable control member (down button) 110 c, and a fourth pressable control member (left button) 110 d. The first through fourth pressable control members 110 a, 110 b, 110 c, 110 d project on an upper surface of the housing 104 and are arranged in a crisscross pattern.
  • The [0073] first control pad 34 includes switch elements as signal input elements associated respectively with the first through fourth pressable control members 110 a, 110 b, 110 c, 110 d. The first control pad 34 functions as a directional controller for controlling the direction of movement of a displayed game character, for example. When the game player selectively presses the first through fourth pressable control members 110 a, 110 b, 110 c, 110 d to turn on or off the switch elements associated respectively with the first through fourth pressable control members 110 a, 110 b, 110 c, 110 d, the displayed game character moves in the direction corresponding to the pressed one of the first through fourth pressable control members 110 a, 110 b, 110 c, 110 d.
  • As shown in FIGS. 2 and 3, the [0074] second control pad 36 is disposed on the other end of the housing 104 and comprises a first pressable control member (Δ button) 112 a, a second pressable control member (□ button) 112 b, a third pressable control member (× button) 112 c, and a fourth pressable control member (∘ button) 112 d. The first through fourth pressable control members 112 a, 112 b, 112 c, 112 d project on the upper surface of the housing 104 and are arranged in a crisscross pattern.
  • The first through fourth [0075] pressable control members 112 a, 112 b, 112 c, 112 d are constructed as independent members, and associated with respective switch elements disposed in the second control pad 36.
  • The [0076] second control pad 36 serves as a function setting/performing unit for setting functions for a displayed game character assigned to the pressable control members 112 a-112 d or performing functions of a displayed game character when the switch elements associated with the pressable control members 112 a-112 d are turned on.
  • The [0077] L button 38L and the R button 38R are disposed on a side of the housing 104 remote from the left and right grips 106, 108 and positioned respectively at the opposite ends of the housing 104. As shown in FIGS. 2 and 4, the L button 38L has a first left pressable control member (L1 button) 114 a and a second left pressable control member (L2 button) 114 b, and the R button 38R has a first right pressable control member (R1 button) 116 a and second right pressable control member (R2 button) 116 b, respectively. The L button 38L and the R button 38R have respective switch elements associated respectively with the pressable control members (the L1 button 114 a, the L2 button 114 b, the R1 button 116 a, and the R2 button 116b).
  • The [0078] L button 38L and the R button 38R serve as respective function setting/performing units for setting functions for a displayed game character assigned to the pressable control members 114 a, 114 b and 116 a, 116 b or performing functions of a displayed game character when the switch elements associated with the pressable control members 114 a, 114 b and 116 a, 116 b are turned on.
  • As shown in FIGS. 2 and 3, the [0079] manual controller 16 also has first and second analog control pads 118, 120 disposed respectively at confronting corners defined between the housing 104 and the proximal ends of the left and right grips 106, 108 which are joined to the housing 104.
  • The first and second [0080] analog control pads 118, 120 have the respective joysticks 44, 46 which can be tilted in all directions (360°) about control shafts thereof, and respective signal input elements such as variable resistors or the like which are operable by the respective joysticks 44, 46. Specifically, the control shafts of the left and right joysticks 44, 46 are normally urged to return to their neutral positions by biasing members. The left and the right joysticks 44, 46 can be freely tilted in all directions (360°) about the axes of the control shafts.
  • The first and second [0081] analog control pads 118, 120 can move a displayed game character while rotating the same or while changing its speed, and can make an analog-like action such as to change the form of a displayed character, when the game player manipulates the joysticks 44, 46. Therefore, the first and second analog control pads 118, 120 are used as a control unit for entering command signals for a displayed character to perform the above movement or action.
  • Analog input values produced when the user moves the left and [0082] right joysticks 44, 46 are available in a vertical range from an uppermost vertical value of “0” to a lowermost vertical value of “255” and a horizontal range from a leftmost horizontal value of “0” to a rightmost horizontal value of “255”.
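  • For reference, raw stick values in this 0 to 255 range are commonly remapped to a signed range before use. The short sketch below shows one hypothetical remapping; the centering on 128, the dead-zone width and the function name are assumptions and are not taken from the manual controller 16.

      // Hypothetical remapping of an 8-bit stick axis (0..255, as described above)
      // to a floating-point range of roughly -1.0 .. +1.0 around a neutral value of 128.
      float normalizeAxis(unsigned char raw, int deadZone = 8) {
          int centered = static_cast<int>(raw) - 128;       // -128 .. +127
          if (centered > -deadZone && centered < deadZone)  // ignore small drift near neutral
              return 0.0f;
          return static_cast<float>(centered) / 128.0f;
      }

      // Example: normalizeAxis(0) is -1.0 (uppermost/leftmost), normalizeAxis(255)
      // is about +0.99 (lowermost/rightmost), and normalizeAxis(128) is 0.0 (neutral).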
  • When the [0083] joysticks 44, 46 are pushed in, the first and second analog control pads 118, 120 output other signals different from those signals in the horizontal and vertical ranges.
  • When the [0084] mode selection switch 48 is pressed, it can select a control mode for allowing a command signal to be inputted from the first and second analog control pads 118, 120 or a control mode for inhibiting a command signal from being inputted from the first and second analog control pads 118, 120.
  • When the [0085] mode selection switch 48 is pressed, the functions of the first through fourth pressable control members 112 a, 112 b, 112 c, 112 d of the second control pad 36, and the functions of the pressable control members 114 a, 114 b and 116 a, 116 b of the L button 38L and the R button 38R are changed depending on the control mode selected by the pressed mode selection switch 48. Depending on the control mode selected by the mode selection switch 48, the mode indicator 50 flickers and changes its indication light.
  • As shown in FIG. 4, the [0086] manual controller 16 comprises a communication controller 150, a CPU 152, a program memory 154, a working RAM 156, a digital input block 158, an analog input block 160, a left motor driver 170L, a left motor 130L, a right motor driver 170R, and a right motor 130R.
  • These components of the [0087] manual controller 16 are connected to a bus 162.
  • The digital input block [0088] 158 functions as a manual input controller for the pressable control members 110 a-110 d of the first control pad 34 and the pressable control members 112 a-112 d of the second control pad 36. The analog input block 160 functions as a manual input controller for the left and right joysticks 44, 46. The digital input block 158 and the analog input block 160 allow the user to enter various items of information into the manual controller 16.
  • The [0089] communication controller 150 has a function to effect serial communications with an external device. The communication controller 150 is electrically connectable to the communication controller 210 (see FIG. 1) of the entertainment apparatus 12, for example, for data communications with the entertainment apparatus 12.
  • A characteristic function of the [0090] entertainment system 10 according to the present embodiment will be described below with reference to FIGS. 5 through 18.
  • According to the characteristic function, when a certain condition is satisfied while at least two hypothetical characters are making a normal action, at least one of the hypothetical characters changes from the normal action to a special action, and when a control action made by the user satisfies a certain control condition indicated by a guidance displayed on the display monitor, the special action performed by the hypothetical character is completed. [0091]
  • The above function as applied to a scene of a battle between a principal character and a monster in a role-playing game will be described below with reference to FIGS. 5 through 8. [0092]
  • As shown in FIG. 5, one of two hypothetical characters displayed on the display monitor [0093] 18 is used as a principal character 300 that moves under control commands from the user and the other hypothetical character as a monster 302 confronting the principal character 300.
  • When the [0094] principal character 300 runs into the monster 302 and starts fighting against the monster 302, a normal battle action takes place. The normal battle action is such an action that when the user presses the control members 110 a-110 d, e.g., when the user presses the ∘ button 112 d, the principal character 300 attacks the monster 302 with a sword 304, and when the user presses the × button 112 c, the principal character 300 defends himself from the monster 302.
  • In the normal battle action, when a certain condition is satisfied, e.g., when the [0095] monster 302 is put in a disadvantageous position, e.g., when the monster 302 stands with its back facing a wall or puts a foot into a step, or when the HP (hit point) of the principal character 300 becomes equal to or lower than a certain value, the principal character 300 changes from the normal battle action to a special battle action. The special battle action is an action to deliver a dead blow such as a succession of attacking techniques in a combat game, for example. When such a dead blow strikes the monster 302, the HP of the monster 302 is instantaneously eliminated, and the monster 302 falls down.
  • When the [0096] principal character 300 enters the special battle action, as shown in FIG. 6, an icon 306 indicative of the special battle action is displayed for 2 seconds, for example, in the vicinity of the principal character 300. With the icon 306 displayed, the user can recognize that the principal character 300 enters the special battle action.
  • Thereafter, as shown in FIG. 7, a [0097] guidance 310 is displayed substantially centrally on the display screen of the display monitor 18. The guidance 310 comprises an area 312 for displaying a bar 320 (hereinafter referred to as “bar display area 312”), and an area 316 for displaying indicia 314 representing the four control members, i.e., the ∘ button 112 d, the Δ button 112 a, the × button 112 c, and the □ button 112 b, (hereinafter referred to as “indicia display area 316”). At least the bar display area 312 is displayed in a semitransparent fashion. In the example shown in FIG. 7, the indicia display area 316 is displayed as a strip. As shown in FIG. 8, however, the indicia display area 316 with the indicia 314 displayed therein may be positioned above the bar display area 312, and strips 318 may be displayed in the bar display area 312 at respective positions aligned with the respective indicia 314.
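  • Purely as an illustrative data structure, the layout just described could be summarized as follows; the field names and the use of pixel x coordinates are hypothetical and merely mirror the two variants of FIG. 7 and FIG. 8.

      #include <vector>

      enum class Button { Circle, Cross, Triangle, Square };

      // Hypothetical summary of the guidance 310: a bar display area 312 rendered
      // semitransparently, and an indicia display area 316 holding the indicia 314.
      // In the FIG. 8 variant, each indicia also has a strip 318 displayed inside
      // the bar display area at an aligned horizontal position.
      struct Indicia {
          Button button;       // control member represented by the indicia 314
          int    stripLeftX;   // left edge of the aligned strip 318 (FIG. 8 variant)
          int    stripRightX;  // right edge of the aligned strip 318
      };

      struct Guidance {
          int barAreaLeftX;              // bar display area 312
          int barAreaRightX;
          bool indiciaAreaAboveBar;      // FIG. 8: indicia display area above the bar area
          std::vector<Indicia> indicia;  // indicia 314, ordered from left to right
      };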
  • When the [0098] guidance 310 is displayed, a bar 320 progresses or moves from the left end of the bar display area 312 toward the right end thereof with certain timing. While the bar 320 is progressing, the principal character 300 performs the special battle action different from the normal battle action, as shown in FIG. 7. In FIG. 7, the principal character 300 jumps backward, and at the same time cuts the monster 302 with the sword 304 which the principal character 300 is holding.
  • When a [0099] leading end 320 a of the bar 320 progressing toward the right end of the bar display area 312 reaches the indicia display area 316 (see FIG. 7) or one of the strips 318 (see FIG. 8), if the user presses the control member, e.g., the ∘ button 112 d, indicated by the indicia 314 corresponding to the indicia display area 316 or the strip 318, then the special battle action made by the principal character 300 is continued.
  • When the user presses the control members corresponding to the [0100] indicia 314 displayed in the guidance 310 at correct times, the principal character 300 delivers a succession of attacking techniques until finally the monster 302 falls down.
  • However, when the [0101] leading end 320 a of the bar 320 displayed in progress reaches the indicia display area 316 or one of the strips 318, if the user fails to press the control member indicated by the indicia 314 or if the user presses a wrong control member or presses the correct control member at a wrong time, then the special battle action made by the principal character 300 is put to an end, and the principal character 300 enters the normal battle action again.
  • One example of software for performing the above characteristic function will be described below with reference to FIGS. 9 through 18. As shown in FIG. 9, the software comprises a battle mode processing means [0102] 400.
  • The battle mode processing means [0103] 400 can be supplied to the entertainment apparatus 12 from a randomly accessible recording medium such as the optical disk 20 or the memory card 14, or a network. It is assumed in the present embodiment that the battle mode processing means 400 is read from the optical disk 20 into the entertainment apparatus 12.
  • The battle mode processing means [0104] 400 is downloaded in advance from the optical disk 20 played back by the entertainment apparatus 12 into the main memory 222 in the control system 200 thereof according to a predetermined process, and executed by the MPU 220 of the control system 200.
  • As shown in FIG. 9, the battle mode processing means [0105] 400 comprises a normal action rendering means 402 for displaying a normal action image in which at least two hypothetical characters (the principal character 300 and the monster 302) are doing a normal action, a special action rendering means 404 for displaying a special action image in which at least one of the hypothetical characters (the principal character 300) changes from a normal action to a special action when a certain condition is satisfied, and the special action made by the hypothetical character (the principal character 300) is completed when a control action made by the user satisfies a certain control condition in the guidance 310, and an image display means 406 for outputting image data stored in the image memory 244 to the display monitor 18 to enable the display monitor 18 to display an image based on the image data.
  • The special action rendering means [0106] 404 comprises a condition determining means 410 for determining whether a condition to enter the special action is satisfied or not, an icon rendering means 412 for displaying the icon 306 (see FIG. 6) indicative of the special action when the hypothetical character (the principal character 300) changes from the normal action to the special action, a guidance rendering means 414 for rendering the guidance 310 (see FIG. 7) that comprises the semitransparent bar display area 312 and the indicia display area 316 and also rendering the indicia 314 representing at least one control member on the manual controller 16 in the indicia display area 316, a bar rendering means 416 for rendering the semitransparent bar 320 as it progresses randomly from the left end to right end of the bar display area 312, and a special action continuation determining means 418 for comparing the time to prompt the user to press the control member corresponding to the indicia 314, i.e., the time when the leading end 320 a of the bar 320 reaches the indicia display area 316 or one of the strips 318, with the time when the user presses the corresponding control member on the manual controller 16, and determining whether the principal character 300 is to make the special action based on the compared result.
  • The guidance rendering means [0107] 414 has an indicia selecting means 420 for changing the number of indicia 314 displayed in the guidance 310 and an array of those indicia 314 depending on the characteristics of the opponent, i.e., the monster 302. The bar rendering means 416 has a progressing pattern selecting means 422 for changing a progressing pattern of the bar 320 depending on the characteristics of the opponent, i.e., the monster 302.
  • A processing sequence of the battle mode processing means [0108] 400 will be described below with reference to FIGS. 10 through 18.
  • First, the normal action rendering means [0109] 402 of the battle mode processing means 400 performs its own processing sequence. In step S1 shown in FIG. 10, the normal action rendering means 402 stores initial values “0” respectively in an index register i used to update the number of times that a special attack is continued, an index register j used to update the display of the icon 306, and an index register k used to cause the principal character 300 to perform a normal attack action, thus initializing these registers i, j, k.
  • In step S2, the normal action rendering means 402 generates a random number. In step S3, the normal action rendering means 402 selects an attack action pattern corresponding to the generated random number from a plurality of attack action patterns. Each of the attack action patterns comprises an array of plural action data, and object data of the monster 302 is rendered based on the array of plural action data for thereby making one attack action on the display monitor 18. [0110]
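  • Steps S2 and S3 amount to picking one of several pre-authored attack action patterns at random. The sketch below is a minimal hypothetical illustration; the use of the C++ <random> facilities and the names are assumptions, and it presumes that at least one pattern is registered for the monster.

      #include <cstddef>
      #include <random>
      #include <vector>

      // One action datum per frame of a monster attack animation (simplified).
      struct ActionData { /* joint angles, vertex offsets, ... */ };
      using AttackActionPattern = std::vector<ActionData>;

      // Hypothetical version of steps S2-S3: generate a random number and use it
      // to select one of the monster's registered attack action patterns.
      const AttackActionPattern& selectAttackPattern(
              const std::vector<AttackActionPattern>& patterns, std::mt19937& rng) {
          std::uniform_int_distribution<std::size_t> pick(0, patterns.size() - 1);
          return patterns[pick(rng)];   // step S3: pattern chosen by the random number
      }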
  • In step S4, the normal action rendering means 402 stores an initial value "0" in an index register m used to wage an attack action on the monster 302, thus initializing the index register m. Thereafter, in step S5, the normal action rendering means 402 stores a background image based on the coordinates of a viewpoint in the image memory 244. [0111]
  • In step S6, the normal action rendering means 402 reads mth action data from the selected attack action pattern. [0112]
  • In step S7, the normal action rendering means 402 determines whether there is action data or not. If there is action data, then control proceeds to step S8 in which the normal action rendering means 402 rewrites vertex data of the object data of the monster 302 based on the content of the action data. In step S9, the normal action rendering means 402 renders three-dimensional image data of the monster 302 based on the object data of the monster 302, and stores the three-dimensional image data of the monster 302 in the image memory 244. [0113]
  • In step S10 shown in FIG. 11, the normal action rendering means 402 determines whether a special attack is being continued or not based on whether a special attack flag is set to "1" or not. If the special attack flag is "0" indicating that a special attack is not presently waged, then control goes to step S11 in which the normal action rendering means 402 determines whether a control input from the manual controller 16 is different from the preceding control input or not. A different control input is entered when the user presses one of the direction buttons 110 a-110 d or the ∘ button 112 d, for example, after the user has not pressed any control members or buttons, or when the user presses the right button 110 b, for example, after the user has pressed the left button 110 d, for example. [0114]
  • If a control input from the [0115] manual controller 16 is judged as being different from the preceding control input in step S11, then control goes to step S12 in which the normal action rendering means 402 stores an initial value “0” in the index register k, thus initializing the index register k. Thereafter, in step S13, the normal action rendering means 402 calculates an action of the principal character 300 based on the present control input, thereby generating an action pattern comprising a plurality of action data. If a control input from the manual controller 16 is judged as being the same as the preceding control input in step S11, then control goes to step S14 in which the normal action rendering means 402 increments the value of the index register k by “1”.
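  • In other words, the index register k advances while the same control input persists and is cleared whenever the input changes, at which point a new action pattern is generated. The hypothetical condensation below of steps S11 through S14 uses invented names and treats the control input as a plain bit mask.

      // Hypothetical condensation of steps S11-S14: k counts how long the same
      // control input has persisted; a changed input restarts the count and calls
      // for a new action pattern of the principal character to be calculated.
      struct InputTracker {
          unsigned previous = 0;   // preceding frame's control input bits
          unsigned k = 0;          // index register k

          // Returns true when a new action pattern should be generated (step S13).
          bool onFrame(unsigned current) {
              if (current != previous) {  // step S11: input differs from the preceding one
                  previous = current;
                  k = 0;                  // step S12: initialize the index register k
                  return true;            // step S13: calculate a new action pattern
              }
              ++k;                        // step S14: keep stepping through the pattern
              return false;
          }
      };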
  • When the processing in step S13 or the processing in step S14 is finished, control goes to step S15 in which the normal action rendering means 402 determines whether the present action pattern is an attack pattern or not based on whether the control input is an input corresponding to the ∘ button 112 d, for example, which indicates an attack. [0116]
• [0117] If the present action pattern is an attack pattern, then control goes to step S16 in which the normal action rendering means 402 reads kth action data from the present action pattern. In step S17, the normal action rendering means 402 determines whether there is action data or not, i.e., whether a normal attack action is finished or not.
• [0118] If there is action data, i.e., if a normal attack action is not finished, then control goes to step S18 in which the normal action rendering means 402 rewrites vertex data of the object data of the principal character 300 based on the content of the present action data. In step S19, the normal action rendering means 402 renders three-dimensional image data of the principal character 300 based on the object data of the principal character 300, and stores the three-dimensional image data of the principal character 300 in the image memory 244.
• [0119] A hidden surface removal process for the background image and the three-dimensional images of the principal character 300 and the monster 302 is carried out according to a Z buffering process.
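Z buffering keeps, for each pixel, only the fragment nearest to the viewpoint. The sketch below is a simplified, purely illustrative Python version of that comparison; actual hardware performs it per pixel during rasterization, and none of the names here come from the patent.

    def z_buffer_write(color_buf, depth_buf, x, y, z, color):
        # Keep the fragment only if it is nearer to the viewpoint than
        # whatever has already been drawn at (x, y).
        if z < depth_buf[y][x]:
            depth_buf[y][x] = z
            color_buf[y][x] = color

    W, H = 4, 3
    depth = [[float("inf")] * W for _ in range(H)]
    frame = [[(0, 0, 0)] * W for _ in range(H)]
    z_buffer_write(frame, depth, 1, 1, 5.0, (255, 0, 0))  # background fragment
    z_buffer_write(frame, depth, 1, 1, 2.0, (0, 255, 0))  # character fragment in front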
• [0120] If the present action pattern is not an attack pattern in step S15, then control proceeds to step S20 shown in FIG. 13 in which the normal action rendering means 402 reads kth action data from the present action pattern. Thereafter, in step S21, the normal action rendering means 402 determines whether there is action data or not, i.e., whether a normal action other than an attack action, e.g., a direction changing action or a defensive action, is finished or not.
• [0121] If there is action data, i.e., if a normal action other than an attack action is not finished, then the processing in step S18 and step S19 shown in FIG. 11 is performed, and three-dimensional image data of the principal character 300 is stored in the image memory 244.
• [0122] If a normal action other than an attack action is finished in step S21, then control returns to step S12 shown in FIG. 13, and the normal action rendering means 402 generates an action pattern again.
• [0123] When the rendering process in step S19 is finished, control goes to step S22 shown in FIG. 12 in which the normal action rendering means 402 determines whether the icon 306 is being presently displayed or not based on whether an icon display flag is set to “1” or not. If the icon 306 is not being displayed, then control goes to step S23 in which the condition determining means 410 determines whether a condition to wage a special attack is satisfied or not.
• [0124] Specifically, the condition determining means 410 determines whether a condition to wage a special attack is satisfied or not based on whether the monster 302 is put in a disadvantageous position, e.g., whether the monster 302 stands with its back facing a wall or puts a foot into a step, or whether the HP (hit point) of the principal character 300 becomes equal to or lower than a certain value.
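The decision of step S23 combines a positional test on the monster 302 with an HP threshold test on the principal character 300. The predicate below is a hypothetical illustration only; the field names and the threshold value are assumptions, since the patent does not specify concrete data structures or numbers.

    def special_attack_condition_satisfied(monster, hero, hp_threshold=30):
        # The condition is met when the monster is in a disadvantageous
        # position or the principal character's HP has dropped to or below
        # a certain value.
        disadvantaged = monster["back_to_wall"] or monster["foot_in_step"]
        return disadvantaged or hero["hp"] <= hp_threshold

    monster = {"back_to_wall": True, "foot_in_step": False}
    hero = {"hp": 80}
    print(special_attack_condition_satisfied(monster, hero))  # True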
• [0125] If a condition to wage a special attack is not satisfied, then control goes via a decision process in step S24 to step S25 in which the image display means 406 outputs image data stored in the image memory 244 to the display monitor 18, which displays a corresponding image on the display screen. When the processing in step S25 is finished, control goes back to step S5 in FIG. 10, and the processing from step S5 is repeated.
• [0126] The above processing is repeated to display a normal battle action of the principal character 300 and the monster 302 on the display monitor 18 as shown in FIG. 5.
• [0127] If there is no action data, i.e., if an attack action by the monster 302 is finished, in step S7, then control goes to step S26 in which the normal action rendering means 402 calculates an effect imposed on the principal character 300 by the attack from the monster 302 based on the hitting ratio of the attack from the monster 302, the level difference between the monster 302 and the principal character 300, the attacking power of the monster 302, the physical strength of the principal character 300, and the protective gear that the principal character 300 wears. If the principal character 300 is not hit by the attack from the monster 302, then the effect imposed on the principal character 300 by the attack from the monster 302 is “0”.
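The patent lists the factors that enter the effect calculation of step S26 (hitting ratio, level difference, attacking power, physical strength, protective gear) but gives no formula. The sketch below is one hypothetical way such factors might be combined; every coefficient and the overall shape of the formula are assumptions made for illustration only.

    import random

    def attack_effect(hit_ratio, attacker_level, defender_level,
                      attack_power, defender_strength, armor_value):
        # A miss imposes no effect (the effect "0" in the patent's terms).
        if random.random() > hit_ratio:
            return 0
        # Larger level advantage and attacking power raise the raw effect;
        # the defender's physical strength and protective gear reduce it.
        level_factor = 1.0 + 0.1 * (attacker_level - defender_level)
        raw = attack_power * max(level_factor, 0.1)
        mitigated = raw - 0.5 * defender_strength - armor_value
        return max(int(mitigated), 0)

    damage = attack_effect(0.8, 12, 10, 40, 20, 5)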
• [0128] In step S27, the normal action rendering means 402 reduces the HP of the principal character 300 based on the calculated effect. In step S28, the normal action rendering means 402 determines whether the HP of the principal character 300 is equal to “0” or not.
• [0129] If the HP of the principal character 300 is not equal to “0”, then control goes back to step S2 to render image data for a next attack action by the monster 302. If the HP of the principal character 300 is equal to “0”, then control goes to step S29 in which the normal action rendering means 402 performs a game-over process, e.g., displays a message indicating that the game is over. The processing sequence of the battle mode processing means 400 is now brought to an end.
• [0130] If there is no action data, i.e., if a normal attack action by the principal character 300 is finished, in step S17 shown in FIG. 11, then control goes to step S30 shown in FIG. 14 in which the normal action rendering means 402 calculates an effect imposed on the monster 302 by the attack from the principal character 300 based on the hitting ratio of the attack from the principal character 300, the level difference between the principal character 300 and the monster 302, the attacking power of the principal character 300, the physical strength of the principal character 300, and the defending ratio of the monster 302.
• [0131] In step S31, the normal action rendering means 402 reduces the HP of the monster 302 based on the calculated effect. In step S32, the normal action rendering means 402 determines whether the HP of the monster 302 is equal to “0” or not.
• [0132] If the HP of the monster 302 is not equal to “0”, then the normal action rendering means 402 initializes the index register k in step S33. Thereafter, control goes back to step S16 in FIG. 11 to render image data for a next attack action by the principal character 300. If the HP of the monster 302 is equal to “0”, then control goes to step S34 in which the normal action rendering means 402 displays an image showing a victory action, e.g., a triumphant gesture, made by the principal character 300, and an acquisition of a prize from the monster 302. The processing sequence of the battle mode processing means 400 is now brought to an end.
• [0133] If a condition to wage a special attack is satisfied in step S23 in FIG. 12, then control goes via the decision process in step S24 to a processing sequence of the icon rendering means 412 of the special action rendering means 404. In this processing sequence, the icon rendering means 412 sets the icon display flag to “1” in step S35 shown in FIG. 15 and then, in step S36, renders the icon 306 indicative of the beginning of a special attack in the vicinity of the principal character 300.
• [0134] In step S37, the icon rendering means 412 increments the value of the index register j by “1”. Thereafter, in step S38, the icon rendering means 412 determines whether the icon 306 has been displayed for a predetermined time, e.g., 2 seconds, or not, based on whether or not the value of the index register j is equal to or greater than a number A which corresponds to the number of frames for 2 seconds.
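Steps S37 to S39 implement a simple frame counter: the icon 306 remains displayed until the index register j reaches the number A of frames corresponding to about 2 seconds, at which point the icon display flag is cleared and the special attack flag is set. A minimal sketch, assuming a frame rate of 60 (a value not stated in the patent):

    FRAME_RATE = 60          # assumed frames per second
    A = 2 * FRAME_RATE       # number of frames corresponding to 2 seconds

    j = 0                    # index register j
    icon_display_flag = True
    special_attack_flag = False

    def advance_icon_timer():
        global j, icon_display_flag, special_attack_flag
        j += 1                               # step S37
        if j >= A:                           # step S38
            icon_display_flag = False        # step S39: clear the icon flag...
            special_attack_flag = True       # ...and set the special attack flag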
• [0135] If the icon 306 has not been displayed for 2 seconds, then control goes to step S25 in FIG. 12 to display image data. At this time, the icon 306 is displayed in the vicinity of the principal character 300, which is performing a normal attack action.
• [0136] If the icon 306 has been displayed for 2 seconds, then control goes to step S39 in which the icon rendering means 412 initializes the icon display flag, and sets a special attack flag to “1”, after which control goes to step S25 in FIG. 12 to display image data.
• [0137] After the special attack flag is set to “1”, control goes via step S10 shown in FIG. 11 to step S40 shown in FIG. 16 in which the guidance rendering means 414 reads image data of the bar display area 312 and the indicia display area 316 of the guidance 310. In step S41, the indicia selecting means 420 generates image data of the guidance 310 based on the number of indicia 314 and an array of those indicia 314 depending on the characteristics of the monster 302 with which the principal character 300 is presently fighting. Thereafter, in step S42, the indicia selecting means 420 stores the generated image data of the guidance 310 as semitransparent image data in the image memory 244.
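Steps S40 to S42 assemble the guidance 310 from a number and an array of indicia 314 that depend on the opposing monster 302. The sketch below illustrates one hypothetical way to express that dependence; the monster names, the indicia sequences, and the table itself are invented for illustration and are not taken from the patent.

    # Hypothetical per-monster guidance definitions: which control members
    # are shown as indicia, and in what order.
    GUIDANCE_TABLE = {
        "goblin": ["circle", "circle", "left"],
        "dragon": ["up", "circle", "right", "circle", "circle"],
    }

    def build_guidance(monster_name):
        # Step S41: choose the number and array of indicia for this monster.
        indicia = GUIDANCE_TABLE.get(monster_name, ["circle"])
        # Step S42: in the patent the resulting guidance image is stored as
        # semitransparent image data; here we only return the indicia list.
        return indicia

    print(build_guidance("dragon"))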
• [0138] In step S43, the guidance rendering means 414 determines whether the present cycle is a first cycle or not based on whether the value of the index register i is “0” or not. If the present cycle is a first cycle, then control goes to step S44 in which the guidance rendering means 414 stores an initial value “0” in an index register n used to display the bar 320, thus initializing the index register n.
• [0139] In step S45, the progressing pattern selecting means 422 reads a bar display progressing pattern corresponding to the present monster 302 from a plurality of bar display progressing patterns. Each of the bar display progressing patterns comprises a plurality of registered progress data each representing the interval by which the bar progresses per frame. Control then goes to step S46 in which the progressing pattern selecting means 422 increments the value of the index register i by “1”.
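On one reading, a bar display progressing pattern is a per-frame list of progress data, each entry giving the interval by which the bar 320 advances in that frame, so that the bar appears to progress irregularly rather than at a constant speed. The sketch below follows that reading; the pattern values and the cumulative-sum interpretation of the leading-end position are assumptions for illustration.

    # Hypothetical progressing pattern: per-frame advance of the bar's
    # leading end, in pixels. Irregular values make the bar progress
    # "randomly" rather than at a constant speed.
    BAR_PROGRESS_PATTERN = [2, 2, 5, 1, 0, 7, 3, 4]

    def bar_position(pattern, n):
        # Position of the leading end after frame n (steps S47-S49 read the
        # nth progress data and render the bar up to the resulting position).
        return sum(pattern[:n + 1])

    for n in range(len(BAR_PROGRESS_PATTERN)):
        x = bar_position(BAR_PROGRESS_PATTERN, n)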
• [0140] Then, the bar rendering means 416 performs its processing sequence. First, in step S47, the bar rendering means 416 reads nth progress data from the bar display progressing pattern read in step S45. Then, in step S48, the bar rendering means 416 determines whether there is progress data or not. If there is progress data, then control goes to step S49 in which the bar rendering means 416 renders the bar 320 as a semitransparent bar whose leading end 320 a is substantially in alignment with the position represented by the progress data.
• [0141] Then, the special action continuation determining means 418 performs its processing sequence. First, in step S50 shown in FIG. 17, the special action continuation determining means 418 determines whether or not the leading end 320 a of the bar 320 has reached the indicia display area 316 (see FIG. 7) or one of the strips 318 (see FIG. 8), i.e., whether the time to prompt the user to press the control member has been reached, based on the coordinates of the leading end 320 a of the bar 320 and the range in which the indicia display area 316 or the strip 318 is displayed.
• [0142] If the time to prompt the user to press the control member is reached, then control goes to step S51 in which the special action continuation determining means 418 determines whether there is a matching control input from the manual controller 16 or not, i.e., whether a control member represented by the indicia 314 corresponding to the indicia display area 316 or the strip 318 is pressed by the user or not. If there is a matching control input from the manual controller 16, then control goes to step S52 in which the special action continuation determining means 418 sets a matching flag to “1”.
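Steps S50 to S52 reduce to two tests: whether the leading end 320 a currently lies within an indicia display area or strip (the prompting time), and whether the control member pressed by the user matches the one represented by the indicia 314. A hypothetical sketch of both tests; the coordinate values and button names are illustrative only.

    def prompt_reached(leading_end_x, area_start_x, area_end_x):
        # Step S50: the prompting time is reached while the bar's leading
        # end lies within the indicia display area or strip.
        return area_start_x <= leading_end_x <= area_end_x

    def matching_input(pressed_button, indicia_button):
        # Step S51: the pressed control member must be the one represented
        # by the indicia for the special action to continue.
        return pressed_button == indicia_button

    matching_flag = 0
    if prompt_reached(128, 120, 140) and matching_input("circle", "circle"):
        matching_flag = 1        # step S52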
• [0143] In step S53, the special action continuation determining means 418 stores an initial value “0” in an index register p used to cause the principal character 300 to perform a special action, thus initializing the index register p. In step S54, the special action continuation determining means 418 selects a special action pattern corresponding to the indicia 314, for example, from a plurality of special action patterns. Each of the special action patterns comprises an array of plural action data, and object data of the principal character 300 is rendered based on the array of plural action data, thereby producing one special action on the display monitor 18.
• [0144] If there is no matching control input in step S51, then control goes to step S55 in which the special action continuation determining means 418 determines whether a matching control input from the manual controller 16 has already been entered or not based on whether the matching flag is set to “1” or not. If a matching control input from the manual controller 16 has already been entered, then control goes to step S56 in which the special action continuation determining means 418 increments the value of the index register p by “1”.
• [0145] If the time to prompt the user to press the control member is not reached, then control goes to step S57 in which the special action continuation determining means 418 sets the matching flag to “0”, thus resetting the matching flag. Thereafter, in step S58, the special action continuation determining means 418 increments the value of the index register p by “1”.
• [0146] When the processing in step S54, the processing in step S56, or the processing in step S58 is finished, control goes to step S59 shown in FIG. 18 in which the special action continuation determining means 418 reads pth action data from the selected special action pattern. Thereafter, control goes to step S60 in which the special action continuation determining means 418 rewrites vertex data of the object data of the principal character 300 based on the content of the present action data. Then, in step S61, the special action continuation determining means 418 renders three-dimensional image data of the principal character 300 based on the object data of the principal character 300, and stores the three-dimensional image data of the principal character 300 in the image memory 244.
• [0147] When the processing in step S61 is finished, control goes to step S25 shown in FIG. 12 in which the image display means 406 outputs image data stored in the image memory 244 to the display monitor 18, which displays an image thereon based on the image data. After the processing in step S25, control returns to step S5 shown in FIG. 10 to repeat the processing from step S5.
• [0148] The processing in steps S5-S10, steps S40-S61, and step S25 is repeated to display an image in which the principal character 300 makes a special action to deliver a succession of attacking techniques to the monster 302.
• [0149] If no matching input is entered when the leading end 320 a of the bar 320 reaches the indicia display area 316 or one of the strips 318 while the principal character 300 is making the special action, then control goes via step S55 shown in FIG. 17 to step S62 in which the special action continuation determining means 418 initializes the index register i. Thereafter, in step S63, the special action continuation determining means 418 stores “0” in the special attack flag, thus resetting the special attack flag. Thereafter, control goes to step S5 shown in FIG. 10 in which the normal action rendering means 402 performs its processing sequence. That is, the principal character 300 changes from the special action back to the normal action.
• [0150] If a matching input is entered with respect to each of the indicia 314 displayed in the guidance 310, then control goes from step S48 shown in FIG. 16 to step S64 in which the special action continuation determining means 418 initializes the index register i. Thereafter, in step S65, the special action continuation determining means 418 stores “0” in the special attack flag, thus resetting the special attack flag. Control then goes to step S34 shown in FIG. 14 in which the special action continuation determining means 418 displays an image showing a victory action, e.g., a triumphant gesture, made by the principal character 300, and an acquisition of a prize from the monster 302. The processing sequence of the battle mode processing means 400 is now brought to an end.
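Taken together, steps S48, S55 and S62 to S65 behave like a simple outcome rule: a missed prompt returns the principal character 300 to the normal action, while matching every indicia ends with the victory display of step S34. The sketch below summarizes that rule in hypothetical form; the function name and the string results are not from the patent.

    def special_action_outcome(prompts, inputs):
        # prompts: ordered control members the guidance asks for;
        # inputs:  what the user actually pressed at each prompting time
        #          (None for a missed prompt).
        for expected, pressed in zip(prompts, inputs):
            if pressed != expected:
                return "back_to_normal"   # steps S62-S63: reset, resume normal action
        return "victory"                  # steps S64-S65, then S34: the opponent falls

    print(special_action_outcome(["circle", "left"], ["circle", "left"]))  # victory
    print(special_action_outcome(["circle", "left"], ["circle", None]))    # back_to_normal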
• [0151] As described above, the battle mode processing means 400, as applied to a battle scene in a role-playing game or a combat game, for example, first performs a normal action, i.e., a normal battle action, between the principal character 300 and the monster 302. When a certain condition is satisfied in the normal battle action, e.g., when the monster 302 stands with its back facing a wall or puts a foot into a step, the principal character 300 changes from the normal battle action to a special battle action. The special battle action is an action to deliver a finishing blow, such as a succession of attacking techniques in a combat game, for example.
• [0152] When the control input from the user satisfies a control condition indicated by the guidance 310 displayed on the display monitor 18, the special battle action is completed, and the opponent, i.e., the monster 302, falls down.
• [0153] Therefore, battle scenes, which would tend to be monotonous in role-playing games, for example, are turned into realistic scenes, and the user or game player is allowed to easily deliver a succession of attacking techniques by operating the manual controller 16 according to the guidance 310. If the entertainment system according to the present invention is applied to play a video game, then the user or game player remains continuously interested in the video game.
• [0154] When the principal character 300 changes from the normal action to the special action, the change is indicated by the icon 306. Therefore, the user is aware of the entry into the special action, i.e., can be mentally prepared to enter the special action, and hence can subsequently make smooth control actions to perform the special action. The icon 306 is thus effective in allowing children and elderly people to enjoy the video game.
• [0155] The guidance 310 includes an indicia 314 rendered to indicate at least one control member on the manual controller 16. The time to prompt the user to press a control member corresponding to the indicia 314 and the time at which the user presses the control member are compared with each other, and it is determined whether the principal character 300 is to make a special action or not based on the compared result. The user thus only needs to operate the control member corresponding to the indicia 314 at the prompting time indicated by the guidance 310.
• [0156] In this embodiment, the prompting time is the time when the leading end 320 a of the bar 320, as it progresses randomly from the left end to the right end of the bar display area 312, reaches the indicia display area 316 or strip 318 corresponding to the indicia 314. Therefore, the user may press the control member corresponding to the indicia 314 at the time when the leading end 320 a of the bar 320 reaches the indicia display area 316 or strip 318. After being trained to a certain extent in a relatively short period of time, the user can deliver a succession of attacking techniques which would otherwise be highly difficult to achieve. Consequently, the user will not easily give up the video game while playing it.
• [0157] The number and array of indicia 314 displayed in the guidance 310 and the progressing pattern of the bar 320 displayed in the bar display area 312 of the guidance 310 are changed depending on the characteristics of the monster 302. Accordingly, elements of fun are added to battle scenes, which would tend to be monotonous, keeping the user interested in the video game.
• [0158] If the guidance 310 is displayed in a lower region or a corner of the display screen of the display monitor 18, then the user may possibly overlook a special action made by the principal character 300 displayed in a central area of the display screen, because the user may be distracted by the progress of the bar 320 displayed in the lower region or the corner of the display screen.
• [0159] According to the embodiment of the present invention, at least the guidance 310 and the bar 320 are displayed as semitransparent images, and hence can be displayed in a large size in the center of the display screen, as shown in FIG. 7, rather than in a lower region or a corner of the display screen. As a result, the user can see a special action made by the principal character 300 and press control members on the manual controller 16 while simultaneously confirming the progressing pattern of the bar 320. Therefore, the user can see and enjoy various special actions made by the principal character 300.
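Rendering the guidance 310 and the bar 320 as semitransparent images corresponds to alpha blending the overlay with the battle scene beneath it, so that the special action remains visible through the guidance. The per-pixel sketch below is illustrative only; the 50% alpha value is an assumption, not a figure given in the patent.

    def blend(overlay_rgb, scene_rgb, alpha=0.5):
        # Semitransparent overlay: the guidance/bar colour is mixed with the
        # underlying scene colour so the special action remains visible.
        return tuple(
            int(alpha * o + (1.0 - alpha) * s)
            for o, s in zip(overlay_rgb, scene_rgb)
        )

    print(blend((255, 255, 255), (40, 80, 120)))  # a guidance pixel over the scene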
• [0160] Although a certain preferred embodiment of the present invention has been shown and described in detail, it should be understood that various changes and modifications may be made therein without departing from the scope of the appended claims.

Claims (18)

What is claimed is:
1. An entertainment system comprising:
an entertainment apparatus for executing various programs;
at least one manual controller for entering control requests from the user into said entertainment apparatus;
a display unit for displaying images outputted from said entertainment apparatus;
normal action rendering means for displaying a normal action image in which at least two hypothetical characters are making a normal action; and
special action rendering means for displaying a special action image in which at least one of said hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by said one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance.
2. An entertainment system according to claim 1, wherein said special action rendering means comprises:
icon rendering means for displaying an icon indicative of said special action when said hypothetical character changes from the normal action to the special action.
3. An entertainment system according to claim 1, wherein said special action rendering means comprises:
guidance rendering means for rendering an indicia representing at least one of control members on said manual controller as said guidance; and
special action continuation determining means for comparing a time to prompt the user to press the control member corresponding to said indicia with a time when the user presses the corresponding control member, and determining whether said hypothetical character is to make the special action based on a compared result.
4. An entertainment system according to claim 3, wherein said special action continuation determining means comprises:
means for transferring control over to said normal action rendering means when the time when the user presses the corresponding control member deviates from the time to prompt the user to press the control member corresponding to said indicia beyond an allowable range.
5. An entertainment system according to claim 3, wherein said special action rendering means comprises:
means for changing the number of indicia and the time to prompt the user to press the control member corresponding to each of said indicia, depending on characteristics of the hypothetical character other than the hypothetical character which makes the special action, among said at least two hypothetical characters.
6. An entertainment system according to claim 3, wherein said special action rendering means comprises:
bar rendering means for rendering a bar which progresses randomly toward said indicia;
wherein said time to prompt the user to press the control member corresponding to said indicia represents a time when a leading end of said bar reaches said indicia.
7. An entertainment system according to claim 6, wherein said special action rendering means comprises:
means for changing a progressing pattern of said bar depending on characteristics of the hypothetical character other than the hypothetical character which makes the special action, among said at least two hypothetical characters.
8. An entertainment system according to claim 6, further comprising:
means for rendering at least said bar as a semitransparent bar in front of said hypothetical characters.
9. An entertainment apparatus for connection to a manual controller for outputting a control request from the user and a display unit for displaying images, comprising:
normal action rendering means for displaying an image in which at least two hypothetical characters are making a normal action; and
special action rendering means for displaying an image in which at least one of said hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by said one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance.
10. A recording medium storing a program and data for use in an entertainment apparatus for connection to a manual controller for outputting a control request from the user, and a display unit for displaying images, said program comprising the steps of:
displaying a normal action image in which at least two hypothetical characters are making a normal action; and
displaying a special action image in which at least one of said hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by said one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance.
11. A recording medium according to claim 10, wherein said step of displaying a special action image comprises the step of:
displaying an icon indicative of said special action when said hypothetical character changes from the normal action to the special action.
12. A recording medium according to claim 10, wherein said step of displaying a special action image comprises the steps of:
rendering an indicia representing at least one of control members on said manual controller as said guidance; and
comparing a time to prompt the user to press the control member corresponding to said indicia with a time when the user presses the corresponding control member, and determining whether said hypothetical character is to make the special action based on a compared result.
13. A recording medium according to claim 12, wherein said step of determining whether said hypothetical character is to make the special action based on a compared result comprises the step of:
transferring control over to said normal action rendering means when the time when the user presses the corresponding control member deviates from the time to prompt the user to press the control member corresponding to said indicia beyond an allowable range.
14. A recording medium according to claim 12, wherein said step of displaying a special action image comprises the step of:
changing the number of indicia and the time to prompt the user to press the control member corresponding to each of said indicia, depending on characteristics of the hypothetical character other than the hypothetical character which makes the special action, among said at least two hypothetical characters.
15. A recording medium according to claim 12, wherein said step of displaying a special action image comprises the step of:
rendering a bar which progresses randomly toward said indicia;
wherein said time to prompt the user to press the control member corresponding to said indicia represents a time when a leading end of said bar reaches said indicia.
16. A recording medium according to claim 15, wherein said step of displaying a special action image comprises the step of:
changing a progressing pattern of said bar depending on characteristics of the hypothetical character other than the hypothetical character which makes the special action, among said at least two hypothetical characters.
17. A recording medium according to claim 12, further comprising the step of:
rendering at least said bar as a semitransparent bar in front of said hypothetical characters.
18. A program readable and executable by a computer, for use in an entertainment apparatus for connection to a manual controller for outputting a control request from the user, and a display unit for displaying images, said program comprising the steps of:
displaying a normal action image in which at least two hypothetical characters are making a normal action; and
displaying a special action image in which at least one of said hypothetical characters changes from a normal action to a special action when a predetermined condition is satisfied, and the special action made by said one of the hypothetical characters is completed when a control action made by the user satisfies a predetermined control condition indicated in a guidance.
US09/784,895 2000-02-17 2001-02-14 Entertainment system, entertainment apparatus, recording medium, and program Abandoned US20010016511A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2000-40033 2000-02-17
JP2000040033 2000-02-17
JP2001015426A JP2001300132A (en) 2000-02-17 2001-01-24 Entertainment system, entertainment device, recording medium and program

Publications (1)

Publication Number Publication Date
US20010016511A1 true US20010016511A1 (en) 2001-08-23

Family

ID=26585597

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/784,895 Abandoned US20010016511A1 (en) 2000-02-17 2001-02-14 Entertainment system, entertainment apparatus, recording medium, and program

Country Status (3)

Country Link
US (1) US20010016511A1 (en)
EP (1) EP1127598A3 (en)
JP (1) JP2001300132A (en)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020165027A1 (en) * 2001-05-02 2002-11-07 Takehiro Kaminagayoshi Command processing method
US20030068602A1 (en) * 2001-10-05 2003-04-10 Francis Emmerson Mobile gaming
US20040259636A1 (en) * 2003-06-19 2004-12-23 Aruze Corp. Gaming machine and computer-readable program product
US20050021159A1 (en) * 2003-07-18 2005-01-27 Yojiro Ogawa Network game system and network game processing method
US20050032562A1 (en) * 2003-08-06 2005-02-10 Nintendo Co., Ltd. Game machine and storage medium having game program stored therein
US20060183521A1 (en) * 2005-01-04 2006-08-17 Aruze Corp. Game program, gaming apparatus, and recording medium
US20070060234A1 (en) * 2005-08-23 2007-03-15 Nintendo Co., Ltd. Storage medium having game program stored thereon and game apparatus
US20070167203A1 (en) * 2006-01-17 2007-07-19 Nintendo Co., Ltd. Storage medium having game program stored thereon and game apparatus
US20080248870A1 (en) * 2004-09-22 2008-10-09 Konami Digital Entertainment Co., Ltd. Game Device, Information Storage Medium, and Game Device Control Method
US20090318223A1 (en) * 2008-06-23 2009-12-24 Microsoft Corporation Arrangement for audio or video enhancement during video game sequences
US20100020083A1 (en) * 2008-07-28 2010-01-28 Namco Bandai Games Inc. Program, image generation device, and image generation method
US20120302403A1 (en) * 2009-10-20 2012-11-29 Felix Touret Device intended in particular for athletic training
US20130130791A1 (en) * 2011-02-18 2013-05-23 Konami Digital Entertainment Co., Ltd. Game device, game control method, program, recording medium and game management device
US20140018168A1 (en) * 2012-07-13 2014-01-16 DeNA Co., Ltd. Non-transitory information processing device-readable storage medium, and information processing device
US8715090B2 (en) * 2012-10-03 2014-05-06 DeNA Co., Ltd. Information processing device, and non-transitory computer-readable storage medium
US20150273324A1 (en) * 2012-12-25 2015-10-01 Konami Digital Entertainment Co., Ltd. Game machine, and control method and non-transitory computer readable storage medium employed thereupon
US9704350B1 (en) 2013-03-14 2017-07-11 Harmonix Music Systems, Inc. Musical combat game
CN109395398A (en) * 2018-09-27 2019-03-01 腾讯科技(深圳)有限公司 Character control method, calculates equipment and storage medium at device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3153761B2 (en) * 1996-03-06 2001-04-09 株式会社ナムコ Game screen display method and game device
US6217444B1 (en) * 1996-09-24 2001-04-17 Konami Co., Ltd. Simulative golf game system and a method for providing a simulative golf game and a storage medium for storing a simulative golf game program
JP3372832B2 (en) * 1997-07-25 2003-02-04 コナミ株式会社 GAME DEVICE, GAME IMAGE PROCESSING METHOD, AND COMPUTER-READABLE RECORDING MEDIUM CONTAINING GAME IMAGE PROCESSING PROGRAM
JPH11300044A (en) * 1998-04-16 1999-11-02 Sony Computer Entertainment Inc Recording medium and entertainment system

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020165027A1 (en) * 2001-05-02 2002-11-07 Takehiro Kaminagayoshi Command processing method
US20030068602A1 (en) * 2001-10-05 2003-04-10 Francis Emmerson Mobile gaming
US7275992B2 (en) * 2001-10-05 2007-10-02 Nokia Corporation Mobile gaming
US20040259636A1 (en) * 2003-06-19 2004-12-23 Aruze Corp. Gaming machine and computer-readable program product
US20050021159A1 (en) * 2003-07-18 2005-01-27 Yojiro Ogawa Network game system and network game processing method
US8287378B2 (en) * 2003-07-18 2012-10-16 Sega Corporation Network game system and network game processing method
US20050032562A1 (en) * 2003-08-06 2005-02-10 Nintendo Co., Ltd. Game machine and storage medium having game program stored therein
US8292736B2 (en) * 2004-09-22 2012-10-23 Konami Digital Entertainment Co., Ltd. Game device, information storage medium, and game device control method
US20080248870A1 (en) * 2004-09-22 2008-10-09 Konami Digital Entertainment Co., Ltd. Game Device, Information Storage Medium, and Game Device Control Method
US20060183521A1 (en) * 2005-01-04 2006-08-17 Aruze Corp. Game program, gaming apparatus, and recording medium
US20070060234A1 (en) * 2005-08-23 2007-03-15 Nintendo Co., Ltd. Storage medium having game program stored thereon and game apparatus
US8100765B2 (en) * 2005-08-23 2012-01-24 Nintendo Co., Ltd. Storage medium having game program and game apparatus
US7967680B2 (en) * 2006-01-17 2011-06-28 Nintendo Co., Ltd. Storage medium having game program stored thereon and game apparatus
US20070167203A1 (en) * 2006-01-17 2007-07-19 Nintendo Co., Ltd. Storage medium having game program stored thereon and game apparatus
US20090318223A1 (en) * 2008-06-23 2009-12-24 Microsoft Corporation Arrangement for audio or video enhancement during video game sequences
US20100020083A1 (en) * 2008-07-28 2010-01-28 Namco Bandai Games Inc. Program, image generation device, and image generation method
US20120302403A1 (en) * 2009-10-20 2012-11-29 Felix Touret Device intended in particular for athletic training
US20130130791A1 (en) * 2011-02-18 2013-05-23 Konami Digital Entertainment Co., Ltd. Game device, game control method, program, recording medium and game management device
US8808090B2 (en) * 2011-02-18 2014-08-19 Konami Digital Entertainment Co., Ltd. Game device, game control method, program, recording medium and game management device
US8814685B2 (en) * 2012-07-13 2014-08-26 DeNA Co., Ltd. Non-transitory information processing device storage medium, and information processing device for manually inputting consumption and recovery amount on a game screen
US20140018168A1 (en) * 2012-07-13 2014-01-16 DeNA Co., Ltd. Non-transitory information processing device-readable storage medium, and information processing device
US8715090B2 (en) * 2012-10-03 2014-05-06 DeNA Co., Ltd. Information processing device, and non-transitory computer-readable storage medium
US20150273324A1 (en) * 2012-12-25 2015-10-01 Konami Digital Entertainment Co., Ltd. Game machine, and control method and non-transitory computer readable storage medium employed thereupon
US9573055B2 (en) * 2012-12-25 2017-02-21 Konami Digital Entertainment Co., Ltd. Game machine, method and non-transitory computer readable storage medium for a music game in which touch operations are evaluated
US9704350B1 (en) 2013-03-14 2017-07-11 Harmonix Music Systems, Inc. Musical combat game
CN109395398A (en) * 2018-09-27 2019-03-01 腾讯科技(深圳)有限公司 Character control method, calculates equipment and storage medium at device

Also Published As

Publication number Publication date
EP1127598A3 (en) 2004-10-13
JP2001300132A (en) 2001-10-30
EP1127598A2 (en) 2001-08-29

Similar Documents

Publication Publication Date Title
US6375571B1 (en) Entertainment system, entertainment apparatus, recording medium, and program
US6402619B1 (en) Method for displaying a shoot of a character in a video game, storage medium for storing a video game program, and video game device
US6758756B1 (en) Method of controlling video game, video game device, and medium recording video game program
US7052397B2 (en) Entertainment system, recording medium and entertainment apparatus
EP0768105B1 (en) Method of assisting a player in entering commands in a video game, video game system and video game storage medium
US20010016511A1 (en) Entertainment system, entertainment apparatus, recording medium, and program
US6544125B2 (en) Video game device, method and medium with program for character control based on measured moving time and position
EP1019161B1 (en) Entertainment system, supply medium, and manual control input device
US7559835B2 (en) Video game processing apparatus, a method and a computer program product for processing a video game
US9539504B2 (en) Storage medium storing game program, game device, game system, and game process method
JP2001300142A (en) Video game device, new training preparation method and readable recording medium having new training preparation program recorded thereon
US20010003708A1 (en) Entertainment system, entertainment apparatus, recording medium, and program
US6881149B2 (en) Entertainment system, entertainment apparatus, recording medium, and program
JP3495029B2 (en) Command processing program, recording medium recording command processing program, command processing apparatus and method
JP3699415B2 (en) Character training program, character training method, and video game apparatus
JP3686071B2 (en) Program, information storage medium, and image generation system
JP3054947B1 (en) Image display method, image display device, recording medium, and game device
JP3053391B1 (en) Video game apparatus, video game play control method, and readable recording medium on which the method is recorded
US20030171146A1 (en) Quick passing feature for sports video games
JP2000126446A (en) Game device, storing of game item, and data recording medium
JP2000037562A (en) Game apparatus and information memory medium
JP2005319107A (en) Program, information storage medium, and game system
JP2930237B1 (en) Video game device and readable recording medium on which video game program is recorded
JP7009539B2 (en) Game processing equipment, terminals and programs
JP2003047767A (en) Program for operating a plurality of characters in display screen and external storage medium storing the program

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY COMPUTER ENTERTAINMENT INC., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HINO, AKIHIRO;MOTOMURA, KENTARO;REEL/FRAME:011761/0115;SIGNING DATES FROM 20010402 TO 20010406

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION