WO2007069373A1 - Game program, game machine, and game method - Google Patents


Info

Publication number
WO2007069373A1
WO2007069373A1 (PCT/JP2006/316224)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
data
character
gazing point
virtual camera
Prior art date
Application number
PCT/JP2006/316224
Other languages
French (fr)
Japanese (ja)
Inventor
Keiji Matsukita
Hiroaki Kobayashi
Original Assignee
Konami Digital Entertainment Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konami Digital Entertainment Co., Ltd. filed Critical Konami Digital Entertainment Co., Ltd.
Publication of WO2007069373A1 publication Critical patent/WO2007069373A1/en


Classifications

    • A63F13/10
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525Changing parameters of virtual cameras
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45Controlling the progress of the video game
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00Animation
    • G06T13/203D [Three Dimensional] animation
    • G06T13/403D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/20Perspective computation
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/812Ball games, e.g. soccer or baseball
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8011Ball

Definitions

  • the present invention relates to a game program, and more particularly to a game program for causing a computer to realize a game for displaying on a monitor an operation screen of a character photographed by a virtual camera.
  • the present invention also relates to a game device and a game method realized by the game program.
  • Non-Patent Document 1 describes a baseball game in which a player character displayed on a monitor is operated to play the game.
  • a player can select one baseball team to which each player character belongs, and can play against other players or computers that have selected other baseball teams (for example, Non-Patent Document 1).
  • for a player character that is a target object, such as a pitcher character or a batter character, an action screen showing, for example, the pitching action of the pitcher character or the batting action of the batter character, photographed by a plurality of virtual cameras placed behind the back net or in front of the stands, can be displayed on the monitor.
  • the camera position of the virtual camera is set to the front of the stand and the gaze point of the virtual camera is set to the face position of the batter character
  • the camera position of the virtual camera is a position where a virtual camera that does not actually exist is arranged, and specifically, a position determined by a three-dimensional coordinate position in the stadium.
  • a virtual camera's gazing point is the point on which the virtual camera, which does not exist in reality, focuses on the object; specifically, it is the point at which a straight line extending from the camera position of the virtual camera toward the object intersects the object, that is, a point determined by a three-dimensional coordinate position on the target object.
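The relationship between the camera position and the gazing point described above can be sketched in a few lines of Python; the coordinate values below are invented for illustration and do not come from the patent:

```python
import math

def view_direction(camera_pos, gaze_point):
    """Unit vector from the virtual camera's position toward its gazing point,
    i.e. the direction of the straight line whose intersection with the target
    object defines the gazing point."""
    d = [g - c for c, g in zip(camera_pos, gaze_point)]
    norm = math.sqrt(sum(v * v for v in d))
    return [v / norm for v in d]

# Hypothetical three-dimensional coordinates in the stadium: a camera placed
# in front of the stand, gazing at the batter character's face position.
camera = (0.0, 5.0, -20.0)
face_position = (0.0, 1.7, 0.0)
direction = view_direction(camera, face_position)
```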
  • Non-Patent Document 1 Professional Baseball Spirits 2 Official Guide Complete Edition, Japan, Konami Corporation, April 7, 2005
  • the camera position of the virtual camera and the gazing point of the virtual camera are set in advance for each player character or event. This makes it possible to capture the action of the player character with the optimal camera angle that matches each player character or event.
  • the camera position of the virtual camera is set to the front of the stand, and the gazing point of the virtual camera is set to the batter character's face position. By setting them in this way, the batter character's movement is shot at the optimal camera angle.
  • An object of the present invention is to realize a realistic game in a game program.
  • a game program according to claim 1 is a program for causing a computer capable of realizing a game for displaying a motion screen of a character photographed by a virtual camera on a monitor to realize the following functions.
  • a reference character data creation function for creating reference character data relating to a character photographed by a virtual camera.
  • a reference camera gazing point data creation function for creating reference camera gazing point data for a character to be photographed by a virtual camera.
  • a game condition determining function for determining whether or not a predetermined game condition regarding the game is satisfied.
  • a camera gazing point data creation function for creating camera gazing point data of the virtual camera, based on the reference camera gazing point data, when the game condition determining function determines that the predetermined game condition is satisfied.
  • a character data creation function for creating character data relating to a character based on the reference character data and the camera gazing point data.
  • a character motion display function that reads character data created by the character data creation function and displays a motion screen of the character photographed by the virtual camera.
  • in the reference character data creation function, reference character data relating to the character photographed by the virtual camera is created.
  • in the reference camera gazing point data creation function, reference camera gazing point data for the character to be photographed by the virtual camera is created.
  • in the game condition determining function, it is determined whether or not a predetermined game condition relating to the game is satisfied.
  • in the camera gazing point data creation function, when the game condition determining function determines that the predetermined game condition is satisfied, camera gazing point data of the virtual camera is created based on the reference camera gazing point data.
  • in the character data creation function, character data relating to the character is created based on the reference character data and the camera gazing point data.
  • in the character motion display function, the character data created by the character data creation function is read, and the motion screen of the character photographed by the virtual camera is displayed.
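The functions enumerated above form a per-frame pipeline. A minimal sketch in Python, with all names and data shapes invented for illustration:

```python
def run_display_cycle(reference_character_data, reference_gaze, condition_met, shake_offset):
    """One display cycle following the claimed flow: the camera gazing point is
    derived from the reference gazing point data only when the game condition
    holds, and the character data depends on both the reference character data
    and the gazing point."""
    if condition_met:
        x, y, z = reference_gaze
        gaze = (x, y + shake_offset, z)  # shake applied to the reference gazing point
    else:
        gaze = reference_gaze
    character_data = {"reference": reference_character_data, "gaze": gaze}
    return character_data  # the display function reads this and renders the screen
```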
  • for a player character that is a target object, such as a pitcher character or a batter character, a game is played in which an action screen showing, for example, the pitching action of the pitcher character or the batting action of the batter character, photographed by a plurality of virtual cameras placed behind the back net or in front of the stands, is displayed on the monitor.
  • reference character data relating to a character photographed by the virtual camera is created by the reference character data creation function.
  • the reference character data relating to the character is data used as a reference for generating the motion performed by the player character as a moving image, and is basic data that does not depend on the shaking of the virtual camera described later.
  • reference camera gazing point data for a character to be imaged by the virtual camera is created by the reference camera gazing point data creation function.
  • the reference camera gazing point data relating to the character is the point on which a virtual camera, which does not actually exist, focuses on the target character; it is the numerical data of the point at which a straight line extending from the virtual camera toward the character intersects the character, for example, the numerical data of a three-dimensional coordinate position on the character.
  • in the reference camera gazing point data creation function, the gazing point of the virtual camera is set to the batter character's face position, and the numerical data of the three-dimensional coordinate position of the batter character's face position is created as the reference camera gazing point data for the character.
  • the predetermined game condition is a condition under which the virtual camera is shaken, for example, due to the influence of wind generated in the stadium or an erroneous operation by the cameraman.
  • in the game condition determination function, when the predetermined game condition is satisfied, specifically, when the wind generated in the stadium is stronger than a predetermined value, or when the cameraman is deemed to have made an erroneous operation according to the probability of such an error, it is determined that the virtual camera has shaken: for example, the virtual camera swings up and down by a predetermined amount (swing width) in the predetermined swing direction given by the shake data of the virtual camera, such that the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions is maintained. If the predetermined game condition is not satisfied, it is determined that the virtual camera has not shaken.
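The condition test described above might be sketched as follows; the threshold and probability values are assumptions for illustration only:

```python
import random

def camera_should_shake(wind_strength, wind_threshold=10.0,
                        error_probability=0.05, rng=random):
    """True when the predetermined game condition holds: the wind generated in
    the stadium is at or above a threshold, or the cameraman is deemed to have
    made an erroneous operation with a given probability."""
    return wind_strength >= wind_threshold or rng.random() < error_probability
```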
  • in the camera gazing point data creation function, camera gazing point data of the virtual camera is created based on the reference camera gazing point data.
  • the camera gazing point data relating to a character is the point on which a virtual camera, which does not actually exist, focuses on the target character; it is the numerical data of the point at which a straight line extending from the virtual camera toward the character intersects the character, for example, the numerical data of a three-dimensional coordinate position on the character.
  • the camera gazing point data relating to the character is data depending on the shaking of the virtual camera in which the shaking data of the virtual camera is added to the reference camera gazing point data relating to the character.
  • when the game condition determination function determines that the virtual camera has shaken, swinging up and down by a predetermined amount (swing width) in the predetermined swing direction given by the shake data of the virtual camera, for example such that the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions is maintained, the camera gazing point data creation function creates numerical data of the three-dimensional coordinate position of a point swung up or down from the batter character's face position.
  • when the gazing point swings upward, a helmet position raised by a predetermined amount from the face position is used, and numerical data of the three-dimensional coordinate position of the batter character's helmet position is created as the camera gazing point data relating to the character.
  • when the gazing point swings downward, a uniform position lowered by a predetermined amount from the face position is used, and numerical data of the three-dimensional coordinate position of the batter character's uniform position is created as the camera gazing point data relating to the character.
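The gazing-point shift just described, up toward the helmet or down toward the uniform while the X and Z coordinates are maintained, can be written as a small helper; the swing width and coordinates are illustrative, not values from the patent:

```python
def shaken_gaze_point(face_position, swing_width, direction):
    """Move the virtual camera's gazing point up (helmet position) or down
    (uniform position) by a fixed swing width, keeping X and Z unchanged."""
    x, y, z = face_position
    if direction == "up":
        return (x, y + swing_width, z)   # helmet position
    if direction == "down":
        return (x, y - swing_width, z)   # uniform position
    return (x, y, z)                     # no shake
```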
  • character data relating to the character is created by the character data creation function based on the reference character data and the camera gazing point data.
  • the character data relating to the character is data for generating the motion performed by the player character as a moving image, and is data depending on the shaking of the virtual camera to which the shaking data of the virtual camera is added.
  • specifically, data is created for generating a series of action screens in which the batter character, photographed by the shaking virtual camera whose gazing point is set to the helmet position raised by a predetermined amount from the face position or the uniform position lowered by a predetermined amount from the face position, enters the batter box.
  • in the character motion display function, the character data created by the character data creation function is read out, and the motion screen of the character photographed by the virtual camera is displayed.
  • for example, the data for generating a series of action screens in which the batter character, photographed by the shaking virtual camera whose gazing point is set to the helmet position raised by a predetermined amount from the face position or the uniform position lowered by a predetermined amount from the face position, enters the batter box is read, and the series of action screens in which the batter character enters the batter box is displayed on the monitor.
  • the reference character data creation function and the reference camera gazing point data creation function create the reference character data relating to the character photographed by the virtual camera and the reference camera gazing point data for the character to be photographed by the virtual camera.
  • when the game condition determination function determines that the wind generated in the stadium is stronger than or equal to a predetermined value, or that the cameraman has made an erroneous operation according to the probability of such an error, it is determined that the virtual camera has shaken: for example, the virtual camera swings up and down by a predetermined amount of swing (swing width) in the predetermined swing direction given by the shake data, such that the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions is maintained.
  • when the game condition determination function determines that the predetermined game condition relating to the game is satisfied, the camera gazing point data of the virtual camera and the character data relating to the character are created by the camera gazing point data creation function and the character data creation function.
  • the character motion display function reads out the character data created by the character data creation function and displays the motion screen of the character photographed by the virtual camera.
  • in the reference camera gazing point data creation function, the gazing point of the virtual camera is set to the batter character's face position, and numerical data of the three-dimensional coordinate position of the batter character's face position is created as the reference camera gazing point data for the character.
  • when it is determined that the virtual camera swings up and down by a predetermined amount (swing width) in the predetermined swing direction given by the shake data of the virtual camera, such that the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions is maintained, the camera gazing point data creation function creates numerical data of the three-dimensional coordinate position of a point swung in the vertical direction from the batter character's face position.
  • when the gazing point swings upward, a helmet position raised by a predetermined amount from the face position is used, and numerical data of the three-dimensional coordinate position of the batter character's helmet position is created as the camera gazing point data relating to the character.
  • when the gazing point swings downward, a uniform position lowered by a predetermined amount from the face position is used, and numerical data of the three-dimensional coordinate position of the batter character's uniform position is created as the camera gazing point data relating to the character.
  • in the character data creation function, data is created for generating a series of action screens in which the batter character, photographed by the shaking virtual camera whose gazing point is set to the helmet position raised by a predetermined amount from the face position or the uniform position lowered by a predetermined amount from the face position, enters the batter box.
  • in the character action display function, the data created by the character data creation function for generating a series of action screens in which the batter character, photographed by the shaking virtual camera whose gazing point is set to the helmet position raised by a predetermined amount from the face position or the uniform position lowered by a predetermined amount from the face position, enters the batter box is read, and the series of action screens in which the batter character photographed by the shaking virtual camera enters the batter box is displayed on the monitor.
  • the camera gazing point data relating to the character, to which the shake data of the virtual camera is added, is created by the camera gazing point data creation function and the character data creation function, and a series of action screens in which the batter character photographed by the shaking virtual camera enters the batter box is displayed on the monitor. Therefore, in this game program, the shaking of a TV camera in a real-world baseball broadcast can be reproduced, making it possible to realize a baseball game with a reality close to that of a real-world baseball game.
  • a game program according to claim 2 is a program for causing the computer to further realize the following functions in the game program according to claim 1.
  • a camera position data creation function for creating camera position data of a virtual camera.
  • the character data creating function further creates character data relating to the character based on the camera position data of the virtual camera.
  • the camera position data of the virtual camera is further created by the camera position data creation function.
  • the camera position data of the virtual camera indicates the position where the virtual camera, which does not actually exist, is arranged, for example, a position determined by a three-dimensional coordinate position in the stadium. Specifically, when the virtual camera is placed in front of the stand, the camera position of the virtual camera is set to the front of the stand, and numerical data of the three-dimensional coordinate position of the camera position of the virtual camera is created.
  • the character data creation function creates character data related to the character based on the camera position data of the virtual camera, in addition to the reference character data and the camera gaze point data.
  • the camera position data creation function creates the camera position data of the virtual camera, and the character data relating to the character is created based on the created camera position data of the virtual camera, so that a realistic baseball game close to a real-world baseball broadcast can be realized.
  • the game program according to claim 3 is the game program according to claim 1 or 2, wherein the reference camera gazing point data is data capable of generating a waveform having a different amplitude for each predetermined period, and the camera gazing point data creation function creates the camera gazing point data by changing the amplitude of the reference camera gazing point data.
  • when the virtual camera swings up and down, such that the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions is maintained, the gazing point follows a waveform having a different amplitude (maximum or minimum amplitude on the Y axis) for each given period. For example, by creating camera gazing point data so that a waveform such as a sine wave has a different amplitude every 2π, a state in which the virtual camera swings up and down at predetermined intervals can be realized simply.
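One way to read the sine-wave description above is an oscillation whose amplitude is redrawn at every 2π period. The Python sketch below makes that concrete; the amplitude range is chosen arbitrarily for illustration:

```python
import math
import random

def vertical_gaze_offsets(periods, samples_per_period, seed=0):
    """Vertical gazing-point offsets forming a sine wave whose amplitude is
    redrawn for every 2*pi period, so the up-and-down shake differs from
    period to period while the X and Z coordinates stay untouched."""
    rng = random.Random(seed)
    offsets = []
    for _ in range(periods):
        amplitude = rng.uniform(0.1, 0.5)  # new amplitude for this period
        for i in range(samples_per_period):
            t = 2 * math.pi * i / samples_per_period
            offsets.append(amplitude * math.sin(t))
    return offsets
```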
  • the predetermined game condition regarding the game is a condition regarding shaking of the virtual camera
  • in the camera gazing point data creation function, the camera gazing point data of the virtual camera is created based additionally on camera shake data relating to the shaking of the virtual camera.
  • the predetermined game condition relating to the game is a condition relating to the shaking of the virtual camera; specifically, a condition that the wind generated in the stadium is stronger than or equal to a predetermined value, or that the cameraman is determined to have made an erroneous operation based on the probability of such an error.
  • the camera gazing point data creation function creates the camera gazing point data by shaking the virtual camera up and down by a predetermined amount (swing width) in the predetermined shaking direction given by the shake data of the virtual camera, such that the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions is maintained, thereby realizing a baseball game with a reality close to a real-world baseball broadcast.
  • the game program according to claim 5 is the game program according to claim 4, wherein the camera gazing point data is numerical data obtained by multiplying the reference camera gazing point data by a predetermined camera shake coefficient obtained from the camera shake data.
  • a state in which the virtual camera is shaken randomly can thereby be realized.
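The multiplication of the reference gazing point data by a camera shake coefficient might look like the following; how the coefficient is derived from the shake data is not specified here, so the random draw near 1.0 is purely an assumption:

```python
import random

def shaken_gaze(reference_gaze, rng):
    """Multiply the Y component of the reference camera gazing point data by a
    shake coefficient obtained (here: drawn randomly) from the camera shake
    data, so the virtual camera appears to shake randomly."""
    coefficient = 1.0 + rng.uniform(-0.1, 0.1)
    x, y, z = reference_gaze
    return (x, y * coefficient, z)
```

Passing a seeded `random.Random` instance as `rng` makes the shake reproducible for testing, while an unseeded one yields a different shake each call.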
  • a game device is a game device that realizes a game in which a motion screen of a character photographed by a virtual camera is displayed on a monitor.
  • the game apparatus includes reference character data creation means, reference camera gazing point data creation means, game condition determination means, camera gazing point data creation means, character data creation means, and character action display means.
  • in the reference character data creation means, reference character data relating to the character photographed by the virtual camera is created.
  • in the reference camera gazing point data creation means, reference camera gazing point data for the character to be photographed by the virtual camera is created.
  • in the game condition determining means, it is determined whether or not the predetermined game condition relating to the game is satisfied.
  • in the camera gazing point data creating means, when the game condition determining means determines that the predetermined game condition is satisfied, the camera gazing point data of the virtual camera is created based on the reference camera gazing point data.
  • in the character data creation means, character data relating to the character is created based on the reference character data and the camera gazing point data.
  • in the character motion display means, the character data created by the character data creation means is read, and the motion screen of the character photographed by the virtual camera is displayed.
  • a game method is a game method for realizing a game in which an operation screen of a character photographed by a virtual camera is displayed on a monitor.
  • the game method includes a reference character data creation step, a reference camera gazing point data creation step, a game condition determination step, a camera gazing point data creation step, a character data creation step, and a character action display step.
  • in the reference character data creation step, reference character data relating to the character photographed by the virtual camera is created.
  • in the reference camera gazing point data creation step, reference camera gazing point data for the character to be photographed by the virtual camera is created.
  • in the game condition determining step, it is determined whether or not a predetermined game condition relating to the game is satisfied.
  • in the camera gazing point data creation step, when it is determined that the predetermined game condition is satisfied, the camera gazing point data of the virtual camera is created based on the reference camera gazing point data.
  • in the character data creation step, character data relating to the character is created based on the reference character data and the camera gazing point data.
  • in the character action display step, the character data created in the character data creation step is read, and the action screen of the character photographed by the virtual camera is displayed.
  • FIG. 1 is a basic configuration diagram of a video game apparatus according to an embodiment of the present invention.
  • FIG. 2 is a functional block diagram of the video game apparatus.
  • FIG. 3 is a schematic diagram showing data stored in a storage unit of the video game apparatus.
  • FIG. 4 is a schematic diagram showing the change in vertical shaking of the virtual camera.
  • FIG. 5 is a schematic diagram showing the relationship between the camera position of the virtual camera and the camera gazing point with respect to the batter character of the virtual camera.
  • FIG. 6 is a television monitor diagram showing a batting operation display screen when the virtual camera is shaken upward.
  • FIG. 7 is a television monitor diagram showing a batting action display screen when the virtual camera is shaken.
  • FIG. 8 is a television monitor diagram showing a batting operation display screen when the virtual camera is shaken downward.
  • FIG. 9 is a flowchart for explaining a character action display process.
  • FIG. 10 is a flowchart for explaining camera gazing point data creation processing.
  • FIG. 1 shows a basic configuration of a game device according to an embodiment of the present invention.
  • a home video game apparatus will be described as an example of the video game apparatus.
  • the home video game apparatus includes a home game machine body and a home television.
  • the main body of the home game machine can be loaded with the recording medium 10, and game data is read from the recording medium 10 as appropriate to execute the game.
  • the content of the game executed in this way is displayed on the home television.
  • the game system of the home video game apparatus includes a control unit 1, a storage unit 2, an image display unit 3, an audio output unit 4, and an operation input unit 5, each of which is connected through a bus 6.
  • This bus 6 includes an address bus, a data bus, and a control bus.
  • in the following description, the control unit 1, the storage unit 2, the audio output unit 4, and the operation input unit 5 are included in the main body of the home video game device, and the image display unit 3 is included in the home television.
  • the control unit 1 is provided mainly for controlling the progress of the entire game based on the game program.
  • the control unit 1 includes, for example, a CPU 7 (Central Processing Unit), a signal processor 8, and an image processor 9.
  • the CPU 7, the signal processor 8 and the image processor 9 are connected to each other via the bus 6.
  • the CPU 7 interprets instructions from the game program and performs various data processing and control.
  • the CPU 7 instructs the signal processor 8 to supply image data to the image processor.
  • the signal processor 8 mainly performs calculations in a three-dimensional space, position conversion calculations from the three-dimensional space to a pseudo three-dimensional space, light source calculation processing, and generation and processing of image and audio data.
  • the image processor 9 mainly performs a process of writing the image data to be drawn into the RAM 12 based on the calculation results and processing results of the signal processor 8.
  • the storage unit 2 is provided mainly for storing program data, various data used in the program data, and the like.
  • the storage unit 2 includes, for example, a recording medium 10, an interface circuit 11, and a RAM 12 (Random Access Memory).
  • An interface circuit 11 is connected to the recording medium 10.
  • the interface circuit 11 and the RAM 12 are connected via the bus 6.
  • the recording medium 10 is used to record operation system program data, image data, audio data, and various game data.
  • the recording medium 10 is, for example, a ROM (Read Only Memory) cassette, an optical disk, a flexible disk, or the like, and stores operating system program data, game data, and the like.
  • the recording medium 10 also includes a card-type memory, and this card-type memory is mainly used for storing various game parameters at the time of interruption when the game is interrupted.
  • the RAM 12 is used for temporarily storing various data read from the recording medium 10 and for temporarily recording the processing results from the control unit 1. This RAM 12 stores various data and address data indicating the storage locations of the various data.
  • the image display unit 3 is provided mainly for outputting, as images, the image data written into the RAM 12 by the image processor 9, the image data read from the recording medium 10, and the like.
  • the image display unit 3 includes, for example, a television monitor 20, an interface circuit 21, and a D / A converter 22 (Digital-To-Analog converter).
  • A D/A converter 22 is connected to the television monitor 20, and an interface circuit 21 is connected to the D/A converter 22.
  • the bus 6 is connected to the interface circuit 21.
  • the image data is supplied to the D/A converter 22 via the interface circuit 21 and is converted into an analog image signal there. Then, the analog image signal is output as an image to the television monitor 20.
  • image data includes, for example, polygon data and texture data.
  • Polygon data is the coordinate data of vertices constituting a polygon.
  • the texture data is used to set a texture on the polygon, and consists of texture instruction data and texture color data.
  • the texture instruction data is data for associating polygons and textures.
  • the texture color data is data for designating the texture color.
  • polygon address data and texture address data indicating the storage position of each data are associated with the polygon data and the texture data.
  • the signal processor 8 takes the polygon data in three-dimensional space (3D polygon data) indicated by the polygon address data and, based on the movement amount data and rotation amount data of the screen itself (viewpoint), performs coordinate conversion and perspective projection conversion to replace it with polygon data in two-dimensional space (2D polygon data).
  • a polygon outline is composed of a plurality of pieces of two-dimensional polygon data, and the texture data indicated by the texture address data is written into the internal area of the polygon. In this way, it is possible to represent objects in which a texture is pasted onto each polygon, that is, various characters.
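The coordinate conversion and perspective projection conversion described above can be sketched as follows. This is a minimal illustration, not the actual console implementation; the rotation axis (Y), the camera parameters, and the function name are assumptions:

```python
import math

def project_polygon(vertices, yaw, camera_move, screen_dist=1.0):
    """Rotate 3D polygon vertices about the Y axis by `yaw` (the screen's
    rotation amount), translate by the camera movement amount, then apply
    perspective projection onto a 2D plane at distance `screen_dist`."""
    cos_a, sin_a = math.cos(yaw), math.sin(yaw)
    projected = []
    for x, y, z in vertices:
        # coordinate conversion: rotation about Y, then translation
        rx = cos_a * x + sin_a * z - camera_move[0]
        ry = y - camera_move[1]
        rz = -sin_a * x + cos_a * z - camera_move[2]
        # perspective projection: divide by depth
        projected.append((screen_dist * rx / rz, screen_dist * ry / rz))
    return projected
```

With the viewpoint unrotated and unmoved, a vertex at depth 4 is simply scaled by 1/4 onto the screen plane.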
  • the audio output unit 4 is provided mainly for outputting audio data read from the recording medium 10 as audio.
  • the audio output unit 4 includes, for example, a speaker 13, an amplifier circuit 14, a D/A converter 15, and an interface circuit 16.
  • An amplifier circuit 14 is connected to the speaker 13, a D/A converter 15 is connected to the amplifier circuit 14, and an interface circuit 16 is connected to the D/A converter 15.
  • the bus 6 is connected to the interface circuit 16.
  • the audio data is supplied to the D/A converter 15 through the interface circuit 16 and converted into an analog audio signal.
  • This analog audio signal is amplified by the amplifier circuit 14 and output from the speaker 13 as audio.
  • Audio data includes, for example, ADPCM (Adaptive Differential Pulse Code Modulation) data and PCM (Pulse Code Modulation) data.
  • For ADPCM data, the sound can be output from the speaker 13 directly by the processing method described above.
  • For PCM data, the sound can be output from the speaker 13 by the same processing method after the PCM data is converted into ADPCM data in the RAM 12.
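ADPCM stores quantized differences between successive samples with an adaptive step size, which is why PCM data must first be converted before the common playback path can be used. The following toy decoder illustrates the idea only; it is not the actual ADPCM format used by this machine (the 3-bit sign/magnitude code and the step-adaptation rule are invented for illustration):

```python
def decode_adpcm(codes):
    """Toy adaptive-delta decoder: each code carries a sign bit (bit 2)
    and a 2-bit magnitude; the step size widens after large differences
    and narrows after small ones, adapting to the signal."""
    step, sample, out = 1, 0, []
    for c in codes:
        sign = -1 if c & 4 else 1
        mag = c & 3
        sample += sign * mag * step
        if mag == 3:                   # large difference: widen step
            step *= 2
        elif mag <= 1:                 # small difference: narrow step
            step = max(1, step // 2)
        out.append(sample)
    return out
```

Decoding the code stream `[3, 3, 1, 5]` reconstructs the samples `[3, 9, 13, 11]`: the step doubles twice during the steep rise, then shrinks again as the differences become small.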
  • the operation input unit 5 mainly includes a controller 17, an operation information interface circuit 18, and an interface circuit 19.
  • An operation information interface circuit 18 is connected to the controller 17, and an interface circuit 19 is connected to the operation information interface circuit 18.
  • the bus 6 is connected to the interface circuit 19.
  • the controller 17 is an operation device used by the player to input various operation commands, and sends an operation signal corresponding to the operation of the player to the CPU 7.
  • The controller 17 is provided with a first button 17a, a second button 17b, a third button 17c, a fourth button 17d, an up key 17U, a down key 17D, a left key 17L, a right key 17R, an L1 button 17L1, an L2 button 17L2, an R1 button 17R1, an R2 button 17R2, a start button 17e, a select button 17f, a left stick 17SL, and a right stick 17SR.
  • The up key 17U, down key 17D, left key 17L, and right key 17R are used to give the CPU 7 commands to move a character or cursor up, down, left, or right on the screen of the television monitor 20.
  • the start button 17e is used when instructing the CPU 7 to load a game program from the recording medium 10.
  • the select button 17f is used to instruct the CPU 7 to select among the various game programs loaded from the recording medium 10.
  • the left stick 17SL and the right stick 17SR have almost the same structure as a so-called joystick.
  • This stick-type controller has an upright stick. The stick is designed to tilt through 360°, including forward, backward, left, and right, about a fulcrum.
  • the left stick 17SL and the right stick 17SR send x- and y-coordinate values, with the upright position as the origin, to the CPU 7 via the operation information interface circuit 18 and the interface circuit 19 as operation signals according to the tilt direction and tilt angle of the stick.
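A sketch of how a stick's tilt direction and tilt angle might map to the x- and y-coordinate operation signal with the upright position as the origin. The maximum tilt angle, the normalisation to magnitude 1.0, and the function name are assumptions for illustration:

```python
import math

def stick_signal(tilt_deg, direction_deg, max_tilt_deg=25.0):
    """Map a stick's tilt angle and direction to (x, y) values with the
    upright position as the origin; full tilt maps to magnitude 1.0."""
    r = min(tilt_deg / max_tilt_deg, 1.0)   # normalised tilt magnitude
    theta = math.radians(direction_deg)     # 0 deg = right, 90 deg = forward
    return (r * math.cos(theta), r * math.sin(theta))
```

An upright stick yields the origin (0, 0); a full forward tilt yields roughly (0, 1).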
  • the first button 17a, second button 17b, third button 17c, fourth button 17d, L1 button 17L1, L2 button 17L2, R1 button 17R1, and R2 button 17R2 are assigned various functions according to the game program loaded.
  • The buttons and keys of the controller 17, other than the left stick 17SL and the right stick 17SR, are on/off switches that are turned on when pressed from the neutral position by an external pressing force, and return to the neutral position and turn off when the pressing force is released.
  • The CPU 7 reads image data, audio data, and program data from the recording medium 10 based on the operating system stored in the recording medium 10. Some or all of the read image data, audio data, and program data are stored in the RAM 12. Then, the CPU 7 issues commands for the image data and audio data stored in the RAM 12 based on the program data stored in the RAM 12.
  • In the case of image data, based on a command from the CPU 7, first, the signal processor 8 performs the position calculation of the character in three-dimensional space and the light source calculation. Next, the image processor 9 writes the image data to be drawn into the RAM 12 based on the calculation results of the signal processor 8. The image data written to the RAM 12 is then supplied to the D/A converter 22 via the interface circuit 21, converted into an analog image signal by the D/A converter 22, and supplied to the television monitor 20 to be displayed as an image.
  • In the case of audio data, first, the signal processor 8 generates and processes the audio data based on commands from the CPU 7. Here, processing such as pitch conversion, noise addition, envelope setting, level setting, and reverb addition is applied to the audio data. Next, the audio data is output from the signal processor 8 and supplied to the D/A converter 15 via the interface circuit 16, where it is converted into an analog audio signal. The audio data is then output as audio from the speaker 13 via the amplifier circuit 14.
  • the game executed in this game machine is, for example, a baseball game.
  • This game machine can realize a game in which a character displayed on the television monitor 20 is operated by operating the controller 17.
  • FIG. 2 is a functional block diagram for explaining functions that play a major role in the present invention.
  • the control unit 1 mainly includes character display means 50, character action means 51, camera position data creation means 52, reference character data creation means 53, reference camera gazing point data creation means 54, game condition determination means 55, camera gazing point data creation means 56, character data creation means 57, and character action display means 58.
  • the character display means 50 has a function of displaying the batter character 70 on the television monitor 20.
  • the character display means 50 displays the batter character 70 shown in FIGS. 6 to 8 on the television monitor 20.
  • the batter image data corresponding to the batter character 70 is supplied from the storage unit 2, for example, the recording medium 10 to the RAM 12 and stored in the RAM 12 when the game program is loaded.
  • the batter image data is recognized by the control unit 1, for example, the CPU 7.
  • the batter coordinate data for displaying the batter image data on the television monitor 20 is supplied from the storage unit 2, for example, the recording medium 10, to the RAM 12 and stored in the RAM 12.
  • the batter coordinate data is recognized by the control unit 1, for example, the CPU 7.
  • the batter image data stored in the RAM 12 is supplied to the television monitor 20 via the image processor 9 based on an instruction from the CPU 7.
  • the batter image data is displayed at a predetermined position on the television monitor 20 based on the batter coordinate data.
  • In this way, the batter image data is displayed at a predetermined position on the television monitor 20 under the control of the CPU 7.
  • The character action means 51 has a function of operating the batter character 70; in the character action means 51, the batter character 70 is operated.
  • When a signal from the controller 17 for operating the batter character 70 is recognized by the control unit 1, for example, the CPU 7, the batter image data corresponding to the batter character 70 is processed by the control unit 1, for example, the signal processor 8 and the image processor 9, based on instructions from the CPU 7. The processed image data is supplied from the RAM 12 to the television monitor 20, and the swing motion of the batter character 70 is displayed on the television monitor 20 as a moving image.
  • the camera position data creating means 52 has a function of creating camera position data P1 to Pn (see FIG. 3) of the virtual camera 80 (see FIG. 5).
  • In the camera position data creation means 52, camera position data P1 to Pn of the virtual camera 80 are created.
  • The camera position data P1 to Pn of the virtual camera 80 are the positions where the virtual camera 80, which does not actually exist, is arranged, for example, positions determined by three-dimensional coordinate positions in the stadium.
  • the camera position P1 of the virtual camera 80 is set in front of the stand, and numerical data of the three-dimensional coordinate position of the camera position P1 of the virtual camera 80 is created and stored.
  • various data of the camera position data P 1 to Pn recognized by the camera position data creating means 52 are stored in the RAM 12.
  • the reference character data creation means 53 has a function of creating reference character data CB1 to CBn (see FIG. 3) relating to the character photographed by the virtual camera 80.
  • reference character data CBl to CBn relating to the character photographed by the virtual camera 80 is created.
  • the reference character data CBl to CBn related to the character is data used as a reference for generating the motion performed by the character as a moving image, and is basic data that does not depend on the shaking of the virtual camera 80.
  • For example, the reference character data CB1 relating to the batter character 70 that has entered the batter box is data for generating the batting-swing motion of the specific batter character 70 as a moving image, that is, data for generating a moving image in which the batter character 70 photographed by the unshaken virtual camera 80 performs the batting-swing motion.
  • Various data of the reference character data CB1 to CBn recognized by the reference character data creation means 53 are stored in the RAM 12.
  • the reference camera gazing point data creating means 54 has a function of creating reference camera gazing point data LBl to LBn (see FIG. 3) for the character to be photographed by the virtual camera 80.
  • In the reference camera gazing point data creation means 54, reference camera gazing point data LB1 to LBn for the character to be photographed by the virtual camera 80 are created.
  • The reference camera gazing point data LB1 to LBn for the character are the points at which the virtual camera 80, which does not actually exist, is focused on the target character, as shown in FIG.
  • Specifically, the straight lines extending straight from the virtual camera 80 toward the batter character 70 (the straight lines SL1 to SL3 connecting the tip of the virtual camera 80 with the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43, which are the camera gazing point data L1 to L3 described later) intersect the batter character 70, and the reference camera gazing point data are numerical data of those intersection points on the batter character 70, for example, numerical data of three-dimensional coordinate positions.
  • More specifically, they are numerical data of the points on the batter character 70 at which the straight lines SL1 to SL3, which extend straight from the virtual camera 80 placed in front of the stand toward the batter character 70 and connect the tip of the virtual camera 80 with the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43, respectively, intersect the batter character 70.
  • The reference camera gazing point data LB1 to LB3 correspond respectively to the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43, which are the camera gazing point data L1 to L3 described later; in particular, the second camera gazing point 42, which is the camera gazing point data L2 of the unshaken virtual camera 80, is the same as the reference camera gazing point data LB2.
  • various data of the reference camera gazing point data LBl to LBn recognized by the reference camera gazing point data creating means 54 is stored in the RAM 12.
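The gazing point is described above as the point on the batter character where a straight line from the virtual camera intersects the character. A minimal sketch of that intersection, approximating the character by a bounding sphere; the sphere model and the function name are assumptions, not the patent's actual method:

```python
import math

def gaze_point_on_character(camera_pos, gaze_target, center, radius):
    """Return the point where the straight line from the virtual camera
    toward the gaze target first intersects the character, approximated
    here by a bounding sphere (center, radius). Returns None on a miss."""
    # unit direction of the line from the camera toward the target
    d = [t - c for t, c in zip(gaze_target, camera_pos)]
    norm = math.sqrt(sum(v * v for v in d))
    d = [v / norm for v in d]
    # solve |camera + t*d - center|^2 = radius^2 for the nearest t
    oc = [c - s for c, s in zip(camera_pos, center)]
    b = 2 * sum(o * v for o, v in zip(oc, d))
    c0 = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c0
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return tuple(s + t * v for s, v in zip(camera_pos, d))
```

Aiming from the origin at a unit sphere centred ten units ahead yields the near intersection point at depth nine.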
  • the game condition determining means 55 has a function of determining whether or not a predetermined game condition relating to the game is satisfied.
  • In the game condition determining means 55, it is determined whether or not the predetermined game condition relating to the game is satisfied.
  • The predetermined game condition is a specific condition under which the virtual camera 80 is shaken, for example, due to the influence of wind generated in the stadium or an erroneous operation by the cameraman.
  • When the predetermined game condition is satisfied, specifically when the wind data W (see FIG. 3) is equal to or greater than a predetermined value, the game condition determining means 55 determines that the virtual camera 80 is shaken by a predetermined amount (swing width) in a predetermined direction given by the shake data of the virtual camera 80, for example, up and down so that the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions are maintained. Further, the game condition determining means 55 determines that the virtual camera 80 is not shaken when the predetermined game condition is not satisfied.
  • Various data of the determination result recognized by the game condition determination means 55, such as whether or not the virtual camera 80 is shaken, are stored in the RAM 12.
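The determination above, shaking the Y coordinate while keeping X and Z, can be sketched as follows. The wind threshold, the mistake probability, the fixed swing width, and the function name are assumptions; a real implementation would vary the offset over time:

```python
import random

def determine_shake(wind_w, wind_threshold, mistake_prob, swing_width,
                    camera_pos, rng=random.Random(0)):
    """Decide whether the virtual camera is shaken (wind at or above a
    threshold, or a randomly drawn cameraman mistake) and, if so, return
    the camera position with the Y coordinate moved by the swing width
    while the X and Z coordinates are maintained."""
    x, y, z = camera_pos
    if wind_w >= wind_threshold or rng.random() < mistake_prob:
        return True, (x, y + swing_width, z)
    return False, camera_pos
```

With strong wind the position is displaced only along Y; with calm wind and zero mistake probability the position is unchanged.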
  • the camera gazing point data creation means 56 has a function of creating the camera gazing point data L1 to Ln (see FIG. 3) of the virtual camera 80 based on the reference camera gazing point data LB1 to LBn.
  • In the camera gazing point data creation means 56, when the game condition determination means 55 determines that the predetermined game condition is satisfied, that is, when it is determined that the virtual camera 80 is shaken, the camera gazing point data L1 to Ln of the virtual camera 80 are created based on the reference camera gazing point data LB1 to LBn.
  • The camera gazing point data L1 to Ln relating to the character are the points at which the virtual camera 80, which does not actually exist, focuses on the target character, as shown in FIG.
  • Specifically, the straight lines SL1 to SL3 extending straight from the virtual camera 80 toward the batter character 70 (connecting the tip of the virtual camera 80 with the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43, respectively) intersect the batter character 70, and the camera gazing point data are numerical data of those intersection points on the batter character 70, for example, numerical data of three-dimensional coordinate positions on the character.
  • More specifically, they are numerical data of the points on the batter character 70 at which straight lines extending straight from the virtual camera 80 arranged in front of the stand toward the batter character 70 intersect the batter character 70; the camera gazing point data L1 to L3 coincide with the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43, respectively.
  • The camera gazing point data L1 to Ln relating to the character are data that depend on the shaking of the virtual camera 80, obtained by adding the camera shake data R (see FIG. 3) and the wind data W (see FIG. 3) of the virtual camera 80 to the reference camera gazing point data LB1 to LBn relating to the character.
  • For example, when the game condition determination means 55 determines that the virtual camera 80 is shaken up and down by a predetermined amount (swing width) in a predetermined swing direction given by the camera shake data R of the virtual camera 80, that is, so that the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions are maintained, numerical data of the three-dimensional coordinate positions obtained by swinging the face position of the batter character 70 up and down are created as the camera gazing point data L1 to Ln of the virtual camera 80.
  • For example, the second camera gazing point 42 of the virtual camera 80 in the state where the virtual camera 80 shown in FIGS. 5 and 7 is not shaken is set to the nose position of the batter character 70, and numerical data of the three-dimensional coordinate position of the nose position of the batter character 70 is created as the camera gazing point data L2 relating to the batter character 70.
  • When the game condition determining means 55 determines that the virtual camera 80 has shaken upward, the camera gazing point data creation means 56 sets the first camera gazing point 41 to the helmet position, raised by a predetermined amount above the nose position of the second camera gazing point 42, and creates numerical data of the three-dimensional coordinate position of the helmet position of the batter character 70 as the camera gazing point data L1 relating to the batter character 70. Further, when the game condition determining means 55 determines that the virtual camera 80 has shaken downward, the camera gazing point data creation means 56 sets the third camera gazing point 43 to the jaw position, lowered by a predetermined amount below the nose position of the second camera gazing point 42, and creates numerical data of the three-dimensional coordinate position of the jaw position of the batter character 70 as the camera gazing point data L3 relating to the batter character 70.
  • various data of the camera gazing point data Ll to Ln recognized by the camera gazing point data creating means 56 is stored in the RAM 12.
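The selection of the camera gazing point described above (nose when unshaken, helmet for an upward shake, jaw for a downward shake, with X and Z maintained) can be sketched as follows; the coordinate values (in centimetres), the offset, and the function name are illustrative assumptions:

```python
def camera_gaze_point(nose_pos, shake, offset):
    """Choose the camera gazing point from the batter's nose position:
    shaken upward -> helmet (nose + offset on Y), not shaken -> nose,
    shaken downward -> jaw (nose - offset on Y). X and Z are kept."""
    x, y, z = nose_pos
    if shake > 0:        # upward shake: first camera gazing point 41
        return (x, y + offset, z)
    if shake < 0:        # downward shake: third camera gazing point 43
        return (x, y - offset, z)
    return nose_pos      # unshaken: second camera gazing point 42
```

Only the Y component of the gazing point changes with the direction of the shake.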
  • The character data creation means 57 has a function of creating character data C1 to Cn (see FIG. 3) relating to the character based on the camera position data P1 to Pn created by the camera position data creation means 52, the reference character data CB1 to CBn created by the reference character data creation means 53, and the camera gazing point data L1 to Ln created by the camera gazing point data creation means 56. In the character data creation means 57, the character data C1 to Cn relating to the character are created based on these data.
  • The character data C1 to Cn relating to the character are data for generating, as moving images, the motions performed by the player character, and are data that depend on the shaking of the virtual camera 80 to which the camera shake data R and wind data W of the virtual camera 80 are added.
  • For example, based on the camera position data P1 to Pn created by the camera position data creation means 52, the reference character data CB1 to CBn created by the reference character data creation means 53, and the camera gazing point data L1 to Ln created by the camera gazing point data creation means 56, character data of the batter character 70 entering the batter box, as photographed by the virtual camera 80 to which the camera shake data R and wind data W of the virtual camera 80 are added while the camera gazing point moves from the nose position (the second camera gazing point 42 shown in FIGS. 5 and 7) to the helmet position raised by a predetermined amount (the first camera gazing point 41), is created.
  • That is, for the second camera gazing point 42, the reference character data CB1 relating to the batter character 70 is used as it is, and character data C2 relating to the batter character 70 identical to the reference character data CB1 is created.
  • For the first camera gazing point 41, character data C1 relating to the batter character 70 is created from the reference character data CB1 relating to the batter character 70; the character performs the same motion as in the reference character data CB1, but is photographed with the virtual camera 80 facing upward.
  • Similarly, for the third camera gazing point 43, character data C3 relating to the batter character 70 is created from the reference character data CB1 relating to the batter character 70; the character performs the same motion, but is photographed with the virtual camera 80 facing downward.
  • various data of the character data Cl to Cn recognized by the character data creating means 57 is stored in the RAM 12.
  • the character action display means 58 has a function of reading the character data Cl to Cn created by the character data creation means 57 and displaying the action screen of the character photographed by the virtual camera 80.
  • the character data C1 to Cn created by the character data creation means 57 are read, and the action screen of the character photographed by the virtual camera 80 is displayed.
  • For example, the character data C2 created by the character data creation means 57 is read out, and the batting action display screen 40, which is the batting action screen when the batter character 70 photographed by the unshaken virtual camera 80 enters the batter box, is displayed on the television monitor 20. Similarly, the character data C3 created by the character data creation means 57 is read out, and the batting action display screen 40 (see FIG. 7), which is the batting action screen when the batter character 70 photographed by the virtual camera 80 swaying downward enters the batter box, is displayed on the television monitor 20.
  • As described above, the camera gazing point data creation means 56 and the character data creation means 57 create the camera gazing point data L1 to Ln and the character data C1 to Cn relating to the batter character 70 to which the camera shake data R and wind data W of the virtual camera 80 are added, and the batting action display screen 40, which is the batting action screen when the batter character 70 photographed by the virtual camera 80 with the camera shake data R and wind data W added enters the batter box, is displayed on the television monitor 20.
  • This game program can reproduce the shaking of a TV camera like a real-world baseball broadcast, so it can realize a realistic baseball game similar to a real-world baseball broadcast.
  • Next, the character action display system and the camera gazing point data creation system in the baseball game of the present embodiment will be described using the schematic diagram showing the vertical shaking of the virtual camera 80 in FIG. 4 and the flowcharts shown in FIG. 9 and the following figure.
  • the virtual camera 80 swings in the vertical direction as shown in FIGS.
  • the + direction shown in FIGS. 4 and 5 means the upward direction
  • the − direction shown in FIGS. 4 and 5 means the downward direction.
  • The virtual camera 80 shakes with a predetermined cycle; the camera gazing point data L1 to Ln are determined so as to follow a waveform, such as a sine wave, whose amplitude (the maximum or minimum amplitude on the Y axis) differs every 2π, as shown in FIG. 4.
  • Specifically, the virtual camera 80 swings upward so that the maximum amplitude becomes +1 at the A position, returns to the unshaken state at the B position, and swings downward so that the maximum amplitude becomes −1 at the C position.
  • The A position shown in FIG. 4 corresponds to the first camera gazing point 41 of the virtual camera 80 in the state where the virtual camera 80 shown in FIGS. 5 and 6 is swung upward, the B position corresponds to the second camera gazing point 42 of the virtual camera 80 in the state where the virtual camera 80 shown in FIG. 5 is not shaken, and the C position shown in FIG. 4 corresponds to the third camera gazing point 43 of the virtual camera 80 in the state where the virtual camera 80 is swung downward.
  • Further, the virtual camera 80 swings upward so that the maximum amplitude becomes +3 at the D position, is not shaken at the F position, and swings downward so that the maximum amplitude becomes −3. Likewise, the virtual camera 80 swings upward so that the maximum amplitude becomes +2 at the G position, passes through the H position, and swings downward so that the maximum amplitude becomes −2 at the I position.
  • Here, −3 to −1 and +1 to +3 are relative ratios of the amplitude at which the virtual camera 80 swings.
  • That is, the shaking of the virtual camera 80 in the period from 2π to 4π is three times the shaking in the period from 0 to 2π. Accordingly, the first camera gazing point 41 in the period from 2π to 4π is placed above the position of the first camera gazing point 41 in the period from 0 to 2π, and the third camera gazing point 43 in the period from 2π to 4π is placed below the position of the third camera gazing point 43 in the period from 0 to 2π, expressing that the shaking of the virtual camera 80 is larger.
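The waveform described above, a sine wave whose amplitude changes every 2π (for example +1/−1 in the first period, +3/−3 in the second, and +2/−2 in the third), can be sketched as follows; the function name and the amplitude list are assumptions:

```python
import math

def camera_y_offset(t, period_amplitudes):
    """Vertical offset of the virtual camera at phase t (radians): a sine
    wave whose amplitude changes every 2*pi, e.g. +1/-1 in the first
    period, +3/-3 in the second, and +2/-2 in the third."""
    k = int(t // (2 * math.pi)) % len(period_amplitudes)  # which period
    return period_amplitudes[k] * math.sin(t)
```

The peak at π/2 in the first period is three times smaller than the peak at the same phase of the second period, matching the ratio described above.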
  • the batting action display screen 40 which is the batting action screen when the batter character 70 photographed by the virtual camera 80 swaying upward and downward enters the batter box, is displayed on the television monitor 20.
  • the character action display system that performs this will be described with reference to the flowchart shown in FIG.
  • First, the camera position data P1 to Pn of the virtual camera 80, which are the positions where the virtual camera 80, which does not actually exist, is arranged, are created (S1).
  • camera position data Pl to Pn of the virtual camera 80 which is position data determined by the three-dimensional coordinate position in the stadium, are created.
  • the camera position P1 of the virtual camera 80 is set in front of the stand, and the numerical data of the three-dimensional coordinate position of the camera position P1 of the virtual camera 80 is created.
  • Various data of the camera position data P 1 to Pn created by the camera position data creation process of step S 1 is stored in the RAM 12.
  • the process proceeds to the reference character data creation process (S2).
  • reference character data CBl to CBn related to the character photographed by the virtual camera 80 are created.
  • the reference character data CBl to CBn related to the character is data used as a reference for generating the motion performed by the character as a moving image, and is basic data that does not depend on the shaking of the virtual camera 80.
  • For example, the reference character data CB1 relating to the batter character 70 that has entered the batter box is data for generating the batting-swing motion of the specific batter character 70 as a moving image, that is, data for generating a moving image in which the batter character 70 photographed by the unshaken virtual camera 80 performs the batting-swing motion.
  • reference camera gazing point data LBl to LBn for the character to be photographed by the virtual camera 80 are created.
  • the reference camera gazing point data LBl to LBn relating to characters are points where the virtual camera 80 that does not actually exist focuses on the target character, as shown in FIG.
  • Specifically, they are numerical data of the points on the batter character 70 at which straight lines extending straight from the virtual camera 80 toward the batter character 70 intersect the batter character 70, for example, numerical data of three-dimensional coordinate positions on the character.
  • More specifically, they are numerical data of the points on the batter character 70 at which the straight lines SL1 to SL3, which extend straight from the virtual camera 80 placed in front of the stand toward the batter character 70 and connect the tip of the virtual camera 80 with the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43, respectively, intersect the batter character 70.
  • the reference camera gazing point data LB1 to LB3 and the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43 which are the camera gazing point data L1 to L3 described later, correspond to each other.
  • the second camera gazing point 42 that is the camera gazing point data L2 of the virtual camera 80 that is not shaken is the same as the reference camera gazing point data LB2.
  • the various data of the reference camera gazing point data LBl to LBn created in the reference camera gazing point data creation process in step S3 is stored in the RAM 12.
  • In the game condition determination process, it is determined whether or not a predetermined game condition relating to the game, that is, a specific condition under which the virtual camera 80 is shaken, is satisfied. Specifically, when the wind data W of the wind generated in the stadium (see FIG. 3) is equal to or greater than a predetermined value, or when it is determined, according to the probability that the cameraman makes an erroneous operation, that the cameraman has made a mistake, it is determined that the virtual camera 80 is shaken by a predetermined amount (swing width) given by the shake data of the virtual camera 80, for example, up and down as shown in the schematic diagram of FIG. 4, so that the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount (−3 to −1, +1 to +3 shown in FIG. 4) while the numerical data of the X-axis and Z-axis coordinate positions are maintained, and the process proceeds to the camera gazing point data creation process (S5). If the predetermined game condition is not satisfied, it is determined that the virtual camera 80 is not shaken, and the process proceeds to the character action display process (S7).
  • In the camera gazing point data creation process of step S5, first, the reference camera gazing point data LB1 to LBn created in the reference camera gazing point data creation process of step S3 are read from the RAM 12 (S11).
  • the camera shake data R (see Fig. 3) of the virtual camera 80 is read (S12).
  • the camera shake data R is numerical data between 0 and 1; the larger the value, the more the virtual camera 80 shakes.
  • camera shake random number data ER (see FIG. 3) of the virtual camera 80 is calculated (S13).
  • Camera shake random number data ER is numerical data of 0 or more and R or less, obtained by random number calculation; since R is at most 1, ER is also numerical data of 0 or more and 1 or less.
  • the wind power data W is read from the RAM 12 (S14).
  • wind influence data EW (see FIG. 3) is calculated (S15).
  • Wind influence data EW is numerical data representing the degree of wind influence calculated by dividing the wind data W by the maximum value of the wind data W, and is numerical data from 0 to 1.
  • the camera shake coefficient E is calculated (S16).
  • the camera shake coefficient E is numerical data obtained by adding the camera shake random number data ER calculated in the camera shake random number calculation process in step S13 to the wind influence data EW calculated in the wind influence data calculation process in step S15.
  • the camera gazing point data calculation process is performed (S17).
  • camera gazing point data L1 to Ln (see FIG. 3) of the virtual camera 80 are created using the reference camera gazing point data LB1 to LBn read out in the reference camera gazing point data reading process in step S11 and the camera shake coefficient E calculated in the camera shake coefficient calculation process in step S16.
  • as shown in the schematic diagram of FIG. 4, the amplitude S of the virtual camera 80 is obtained by multiplying the maximum value of the amplitude S of the virtual camera 80 (position D in FIG. 4) by the camera shake coefficient E calculated in the camera shake coefficient calculation process of step S16. If the calculated value exceeds the maximum value of the amplitude S (position D in FIG. 4), the amplitude S of the virtual camera 80 is set equal to that maximum value, so that the amplitude never exceeds it.
  • in step S6, character data C1 to Cn relating to the character are created based on the camera position data P1 to Pn created by the camera position data creation process in step S1, the reference character data CB1 to CBn created by the reference character data creation process in step S2, and the camera gazing point data L1 to Ln created by the camera gazing point data creation process of step S5.
  • the character data C1 to Cn relating to the character are data for generating the motions performed by the player character as moving images; they depend on the shaking of the virtual camera 80 to which the camera shake data R and the wind power data W of the virtual camera 80 have been added.
  • based on the camera position data P1 to Pn created by the camera position data creation process in step S1, the reference character data CB1 to CBn created by the reference character data creation process in step S2, and the camera gazing point data L1 to Ln created by the camera gazing point data creation process in step S5, the gazing point of the virtual camera 80 is set to the helmet position raised by a predetermined amount above the nose position (the first camera gazing point 41 shown in FIGS. 5 and 6), to the nose position (the second camera gazing point 42 shown in FIGS. 5 and 7), or to the jaw position lowered by a predetermined amount below the nose position (the third camera gazing point 43 shown in FIGS. 5 and 8).
  • the virtual camera 80, to which the camera shake data R and the wind power data W of the virtual camera 80 have been added, generates data for generating a batting action screen when the batter character 70 enters the batter box.
  • the reference character data CB2 regarding the batter character 70 corresponding to the second camera gazing point 42 is used.
  • the character data C2 relating to the batter character 70 is created.
  • at the first camera gazing point 41 of the virtual camera 80, with the virtual camera 80 swaying upward as shown in FIGS. 5 and 6,
  • based on the reference character data CB1 relating to the batter character 70 corresponding to the first camera gazing point 41, character data C1 relating to the batter character 70 is created that performs the same motion as the reference character data CB1 but with the virtual camera 80 facing upward. Further, at the third camera gazing point 43 of the virtual camera 80, with the virtual camera 80 swaying downward as shown in FIGS. 5 and 8, character data C3 relating to the batter character 70 is created, based on the reference character data CB3 corresponding to the third camera gazing point 43, that performs the same motion but with the virtual camera 80 facing downward. The various data of the character data C1 to Cn created by the character data creation process of step S6 are stored in the RAM 12. When the character data C1 to Cn have been created in the character data creation process in step S6, the process proceeds to the character action display process (S7).
  • in step S7, the character data C1 to Cn created by the character data creation process of step S6 are read, and the action screen of the character photographed by the virtual camera 80 is displayed.
  • in step S7, first, at the second camera gazing point 42 of the virtual camera 80, in the state where the virtual camera 80 is not shaken as shown in FIGS. 5 and 7, the character data C2 created by the character data creation process in step S6 is read out, and the batting motion display screen 40 is displayed on the television monitor 20.
  • next, at the first camera gazing point 41 of the virtual camera 80 swaying upward, the character data C1 created by the character data creation process of step S6 is read out, and the batting motion display screen 40 (see FIG. 6), which is the batting motion screen when the batter character 70 photographed by the upward-shaken virtual camera 80 enters the batter box, is displayed on the television monitor 20. Further, at the third camera gazing point 43 of the virtual camera 80 swaying downward as shown in FIGS. 5 and 8, the character data C3 created by the character data creation process in step S6 is read out, and the batting motion display screen 40 (see FIG. 8) is displayed on the television monitor 20 when the batter character 70 photographed by the downward-shaken virtual camera 80 enters the batter box.
  • when the virtual camera 80 is shaken, it is processed by the camera gazing point data creation process of step S5 and the character data creation process of step S6, and the batting action display screen 40, which is the batting action screen when the batter character 70 photographed by the shaken virtual camera 80 enters the batter box, is displayed on the television monitor 20.
  • This game program can reproduce the shaking of a TV camera like a real-world baseball broadcast, so a realistic baseball game similar to a real-world baseball broadcast can be realized.
  • although a home video game device has been described as an example of a computer to which the game program can be applied, the game device is not limited to the above-described embodiment; for example, it may be a game device in which a monitor is provided separately.
  • the present invention also includes a program for executing the game as described above, a method for executing the game as described above, and a computer-readable recording medium on which the program is recorded.
  • as the recording medium, for example, a computer-readable flexible disk, a semiconductor memory, a CD-ROM, a DVD, an MO, and a ROM cassette can be cited in addition to the cartridge.
  • in the above embodiment, the batting action of the batter character 70 photographed by the virtual camera 80 is displayed on the batting action display screen 40, but the virtual camera 80 may instead be arranged, for example, behind the back net, and the pitching motion of the pitcher character may be displayed.
  • the present invention is not limited to baseball games and can be applied to all other games, such as soccer games and shooting games.
  • in the above embodiment, the virtual camera 80 is swung only in the vertical direction, but it can be swung in any direction, such as the horizontal direction, an oblique direction, or a combination thereof.
  • the camera position data P1 to Pn of the virtual camera 80, the reference camera gazing point data LB1 to LBn, the camera gazing point data L1 to Ln, the camera shake data R, and their calculation methods are not limited to the above embodiment.
  • the reference character data CB1 to CBn, the wind data W, and their calculation methods are not limited to the above embodiment.
  • when it is determined that the virtual camera shakes, the camera gazing point data creation function and the character data creation function create camera gazing point data and character data for the character, to which the shake data is added, and a series of action screens in which the batter character photographed by the shaken virtual camera enters the batter box is displayed on the monitor. It is therefore possible to realize a realistic baseball game resembling a real-world baseball broadcast.
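The shake computation walked through in the bullets above (camera shake data R, random number data ER, wind influence data EW, coefficient E, amplitude S; steps S12 to S17) can be sketched as follows. This is a minimal illustration of the calculation as described, not code from the patent; the function and parameter names are assumptions made for readability.

```python
import random

def camera_shake_coefficient(R, W, W_max, rng=random.random):
    """Camera shake coefficient E (steps S12-S16).

    R  -- camera shake data, numerical data between 0 and 1
    ER -- camera shake random number data, a random value in [0, R]
    EW -- wind influence data, the wind data W divided by its maximum
    E  -- the sum ER + EW
    """
    ER = rng() * R   # camera shake random number calculation (S13)
    EW = W / W_max   # wind influence data calculation (S15)
    return ER + EW   # camera shake coefficient calculation (S16)

def shake_amplitude(E, S_max):
    """Amplitude S (step S17): the maximum amplitude S_max multiplied
    by the coefficient E, clamped so it never exceeds S_max."""
    return min(S_max * E, S_max)
```

For example, with R = 0.4, a fixed random draw of 0.5, W = 3 and W_max = 10, the coefficient is E = 0.2 + 0.3 = 0.5, and with a maximum amplitude of 3 the resulting amplitude is 1.5; a coefficient greater than 1 is clamped to the maximum amplitude.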

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • Processing Or Creating Images (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

A game program realizes a game with reality. A control section (1) mainly has camera position data creating means (52), reference character data creating means (53), reference camera viewpoint data creating means (54), game condition judging means (55), camera viewpoint data creating means (56), character data creating means (57), and character action display means (58). When the game condition judging means (55) judges that a virtual camera shakes, the camera viewpoint data creating means (56) and the character data creating means (57) create camera viewpoint data on a batter character to which camera shake data on a virtual camera is added and character data on the batter character. A hitting screen showing the batter character imaged by the virtual camera (80) to which the camera shake data is added is displayed on a television monitor.

Description

Specification

GAME PROGRAM, GAME DEVICE, AND GAME METHOD

Technical field
[0001] The present invention relates to a game program, and more particularly to a game program for causing a computer to realize a game that displays on a monitor an action screen of a character photographed by a virtual camera. The present invention also relates to a game device and a game method realized by the game program.
Background art
[0002] Various games have been proposed in the past. One of them is a competitive video game, for example a baseball game, in which a player character displayed on a monitor is operated to play a match. In this type of baseball game, a player selects one baseball team to which player characters belong and can play against other players or a computer that has selected another baseball team (for example, Non-Patent Document 1).
[0003] In such a baseball game, an action screen showing, for example, the pitching motion of a pitcher character or the batting motion of a batter character, photographed by a plurality of virtual cameras arranged behind the back net or in front of the stands, can be displayed on the monitor. For example, when a series of actions of a batter character entering the batter box is photographed by a virtual camera, the camera position of the virtual camera is set in front of the stands and the gazing point of the virtual camera is set at the face position of the batter character, so that the motion of the batter character is photographed at the optimum camera angle. Here, the camera position of the virtual camera is the position at which a virtual camera that does not actually exist is arranged, specifically a position determined by a three-dimensional coordinate position in the stadium. The gazing point of the virtual camera is the point at which the virtual camera, which does not actually exist, focuses on the object: the point on the object at which a straight line extending from the camera position of the virtual camera toward the object intersects the object, specifically a point determined by a three-dimensional coordinate position on the object. In this way, an action screen similar to the television screen produced by a TV camera in a real-world baseball broadcast can be displayed on the monitor, so that the characteristics of player characters resembling real-world baseball players can be reproduced in fine detail.
Non-Patent Document 1: Professional Baseball Spirits 2 Official Guide Complete Edition, Japan, Konami Corporation, April 7, 2005
Disclosure of the invention
[0004] In the conventional baseball game in which the action screen of a player character photographed by a virtual camera is displayed on a monitor, the camera position of the virtual camera and the gazing point of the virtual camera are set in advance for each player character or event, so that the action of the player character can be photographed at the optimum camera angle matching each player character or event. Specifically, when a series of actions of a batter character entering the batter box is photographed by the virtual camera, the camera position of the virtual camera is set in front of the stands and the gazing point of the virtual camera is set at the face position of the batter character, so that the motion of the batter character is photographed at the optimum camera angle.
[0005] However, in a real-world baseball broadcast, when a TV camera films a baseball player, the TV camera may shake up and down due to, for example, the influence of wind generated in the stadium or an operating error by the cameraman, so that the baseball player projected on the TV screen sways up and down. In contrast, in the conventional baseball game, the camera position and the gazing point of the virtual camera are set in advance to constant values for each player character or event to be photographed. The shaking of a TV camera in a real-world baseball broadcast therefore cannot be reproduced, and it is very difficult to realize a realistic baseball game close to a real-world baseball broadcast.
[0006] An object of the present invention is to realize a realistic game in a game program.
[0007] A game program according to claim 1 is a program for causing a computer capable of realizing a game in which an action screen of a character photographed by a virtual camera is displayed on a monitor to realize the following functions.
[0008] (1) A reference character data creation function for creating reference character data relating to a character photographed by the virtual camera.

[0009] (2) A reference camera gazing point data creation function for creating reference camera gazing point data for the character to be photographed by the virtual camera.
[0010] (3) A game condition determination function for determining whether or not a predetermined game condition relating to the game is satisfied.
[0011] (4) A camera gazing point data creation function for creating camera gazing point data of the virtual camera based on the reference camera gazing point data when the game condition determination function determines that the predetermined game condition is satisfied.
[0012] (5) A character data creation function for creating character data relating to the character based on the reference character data and the camera gazing point data.
[0013] (6) A character action display function for reading the character data created by the character data creation function and displaying the action screen of the character photographed by the virtual camera.
[0014] In the game realized by this program, the reference character data creation function creates reference character data relating to the character photographed by the virtual camera. The reference camera gazing point data creation function creates reference camera gazing point data for the character to be photographed by the virtual camera. The game condition determination function determines whether or not a predetermined game condition relating to the game is satisfied. When the game condition determination function determines that the predetermined game condition is satisfied, the camera gazing point data creation function creates camera gazing point data of the virtual camera based on the reference camera gazing point data. The character data creation function creates character data relating to the character based on the reference character data and the camera gazing point data. The character action display function reads the character data created by the character data creation function and displays the action screen of the character photographed by the virtual camera.
[0015] For example, consider realizing a baseball game in which player characters are operated. In this baseball game, an action screen showing the pitching motion of a pitcher character or the batting motion of a batter character, photographed by a plurality of virtual cameras arranged, for example, behind the back net or in front of the stands, is displayed on the monitor.
[0016] First, the reference character data creation function creates reference character data relating to the character photographed by the virtual camera. Here, the reference character data relating to the character is data serving as a reference for generating the motion performed by the player character as a moving image; it is basic data that does not depend on the shaking of the virtual camera described later.
[0017] Next, the reference camera gazing point data creation function creates reference camera gazing point data for the character to be photographed by the virtual camera. Here, the reference camera gazing point data relating to the character specifies the point at which the virtual camera, which does not actually exist, focuses on the target character: numerical data of the point on the character at which a straight line extending from the virtual camera toward the character intersects the character, for example numerical data of a three-dimensional coordinate position on the character. Specifically, when a series of actions of a batter character entering the batter box is photographed by the virtual camera, the reference camera gazing point data creation function sets the gazing point of the virtual camera at the face position of the batter character, and numerical data of the three-dimensional coordinate position of the face position of the batter character is created as the reference camera gazing point data relating to the character.
[0018] Next, the game condition determination function determines whether or not a predetermined game condition relating to the game is satisfied. Here, the predetermined game condition is a specific condition under which the virtual camera is shaken, for example the influence of wind generated in the stadium or an operating error by the cameraman. In the game condition determination function, when the predetermined game condition is satisfied, specifically when the wind force generated in the stadium is equal to or greater than a predetermined value, or when it is determined from the probability of an operating error that the cameraman has made a mistake, it is determined that the virtual camera has shaken by a predetermined amount (swing width) in a predetermined direction given by the shake data of the virtual camera; for example, the virtual camera swings up and down, and the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions are maintained. When the predetermined game condition is not satisfied, it is determined that the virtual camera has not shaken.
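The vertical shake described in paragraph [0018], in which the Y-axis coordinate is perturbed while the X-axis and Z-axis coordinates are maintained, can be illustrated by a small sketch. The coordinate tuples and the function name are assumptions made for illustration, not part of the patent:

```python
def apply_vertical_shake(position, offset):
    """Shake a camera position or gazing point vertically: keep the
    X and Z coordinates and add a signed offset (e.g. -3 to -1 or
    +1 to +3) to the Y coordinate."""
    x, y, z = position
    return (x, y + offset, z)
```

For example, `apply_vertical_shake((5.0, 1.5, -20.0), 2)` yields `(5.0, 3.5, -20.0)`: only the Y value changes.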
[0019] Next, when the game condition determination function determines that the predetermined game condition is satisfied, the camera gazing point data creation function creates camera gazing point data of the virtual camera based on the reference camera gazing point data. Here, the camera gazing point data relating to the character specifies the point at which the virtual camera, which does not actually exist, focuses on the target character: numerical data of the point on the character at which a straight line extending from the virtual camera toward the character intersects the character, for example numerical data of a three-dimensional coordinate position on the character. The camera gazing point data relating to the character is data depending on the shaking of the virtual camera, obtained by adding the shake data of the virtual camera to the reference camera gazing point data relating to the character. Specifically, when the game condition determination function determines that the virtual camera shakes up and down by a predetermined amount (swing width), increasing or decreasing the numerical data of the Y-axis coordinate position while maintaining the numerical data of the X-axis and Z-axis coordinate positions, the camera gazing point data creation function creates numerical data of the three-dimensional coordinate position at which the gazing point of the virtual camera has swung up or down from the face position of the batter character. Here, for example, the gazing point is set at a helmet position raised by a predetermined amount above the face position, and numerical data of the three-dimensional coordinate position of the helmet position of the batter character is created as camera gazing point data relating to the character. Alternatively, the gazing point is set at a uniform position lowered by a predetermined amount below the face position, and numerical data of the three-dimensional coordinate position of the uniform position of the batter character is created as camera gazing point data relating to the character.
[0020] Next, the character data creation function creates character data relating to the character based on the reference character data and the camera gazing point data. Here, the character data relating to the character is data for generating the motion performed by the player character as a moving image, and is data depending on the shaking of the virtual camera to which the shake data of the virtual camera is added. Specifically, based on the reference character data relating to the character created by the reference character data creation function and the camera gazing point data of the virtual camera created by the camera gazing point data creation function, data is created for generating a series of action screens in which the batter character photographed by the shaken virtual camera enters the batter box, with the gazing point of the virtual camera set, for example, at the helmet position raised by a predetermined amount above the face position or at the uniform position lowered by a predetermined amount below the face position.
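The character data creation described in this paragraph combines an unchanged reference motion with a camera orientation that depends on the (possibly shaken) gazing point. A much-simplified sketch of that pairing is given below; the patent's actual data structures are not disclosed, so the function name, the frame list, and the view-direction representation are all illustrative assumptions:

```python
def create_character_data(reference_motion, camera_position, gazing_point):
    """Pair each frame of the reference motion with the camera's view
    direction toward the (possibly shaken) gazing point: the character's
    motion is unchanged, only the camera orientation depends on the shake."""
    cx, cy, cz = camera_position
    gx, gy, gz = gazing_point
    view_dir = (gx - cx, gy - cy, gz - cz)  # vector from camera to gazing point
    return [(frame, view_dir) for frame in reference_motion]
```

With a gazing point raised toward the helmet, every frame of the same batting motion is rendered with the camera tilted slightly upward.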
[0021] Then, the character action display function reads the character data created by the character data creation function and displays the action screen of the character photographed by the virtual camera. Here, the data created by the character data creation function for generating the series of action screens in which the batter character, photographed by the shaken virtual camera whose gazing point has been set, for example, at the helmet position raised by a predetermined amount above the face position or at the uniform position lowered by a predetermined amount below the face position, enters the batter box is read out, and the series of action screens in which the batter character photographed by the shaken virtual camera enters the batter box is displayed on the monitor.
[0022] In this game program, the reference character data creation function and the reference camera gazing point data creation function create reference character data relating to the character photographed by the virtual camera and reference camera gazing point data for the character to be photographed. Next, when the wind force generated in the stadium is equal to or greater than a predetermined value, or when it is determined from the probability of an operating error that the cameraman has made a mistake, the game condition determination function determines that the virtual camera has shaken by a predetermined amount (swing width) in a predetermined direction given by the shake data of the virtual camera; for example, the virtual camera swings up and down, and the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions are maintained. Next, when the game condition determination function determines that the predetermined game condition relating to the game is satisfied, the camera gazing point data creation function and the character data creation function create camera gazing point data of the virtual camera and character data relating to the character. Then, the character action display function reads the character data created by the character data creation function and displays the action screen of the character photographed by the virtual camera.
[0023] Specifically, when the virtual camera photographs the series of motions of the batter character entering the batter's box, the reference camera gazing point data creation function sets the gazing point of the virtual camera to the face position of the batter character, and numerical data of the three-dimensional coordinate position of the batter character's face is created as the reference camera gazing point data for the character.
Next, when the game condition determination function determines that the virtual camera has shaken by a predetermined shake amount (shake width) in the predetermined shake direction given by the camera shake data, for example a vertical swing that increases or decreases the numerical data of the Y-axis coordinate position by a predetermined amount while maintaining the numerical data of the X-axis and Z-axis coordinate positions, the camera gazing point data creation function creates numerical data of the three-dimensional coordinate position obtained by swinging the gazing point vertically from the batter character's face position. For example, the gazing point is set to the helmet position, raised from the face position by a predetermined amount, and numerical data of the three-dimensional coordinate position of the batter character's helmet is created as the camera gazing point data for the character; or the gazing point is set to the uniform position, lowered from the face position by a predetermined amount, and numerical data of the three-dimensional coordinate position of the batter character's uniform is created as the camera gazing point data. Next, the character data creation function creates data for generating a series of motion screens of the batter character entering the batter's box, photographed by the shake-applied virtual camera whose gazing point is set to the helmet position or the uniform position.
Then, the character motion display function reads out the data created by the character data creation function, and the series of motion screens of the batter character entering the batter's box, photographed by the shake-applied virtual camera whose gazing point is set to the helmet position or the uniform position, is displayed on the monitor.
[0024] Here, when the game condition determination function determines that the virtual camera has shaken, the camera gazing point data creation function and the character data creation function create camera gazing point data and character data for the character to which the shake data of the virtual camera has been applied, and the series of motion screens of the batter character entering the batter's box, photographed by the shake-applied virtual camera, is displayed on the monitor. Therefore, this game program can reproduce the shaking of a television camera as in a real-world baseball broadcast, and can thus realize a baseball game with a reality close to that of a real-world baseball broadcast.
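The vertical shake described above, in which only the Y coordinate of the gazing point changes while the X and Z coordinates are maintained, can be sketched as follows. This is an illustrative sketch only, not part of the patent; the function name and coordinate values are assumptions.

```python
# Illustrative sketch (not part of the patent): the vertical camera
# shake described above perturbs only the Y coordinate of the gazing
# point while the X and Z coordinates of the reference gazing point
# are maintained. All names and values are assumed for illustration.

def shaken_gaze_point(reference_point, shake_amount):
    """Return a gazing point shifted vertically by shake_amount.

    reference_point: (x, y, z) reference camera gazing point, for
                     example the batter character's face position.
    shake_amount:    signed vertical offset; positive moves the gazing
                     point up (toward the helmet position), negative
                     moves it down (toward the uniform position).
    """
    x, y, z = reference_point
    return (x, y + shake_amount, z)

face = (0.0, 1.6, 18.0)                    # hypothetical face position
helmet = shaken_gaze_point(face, 0.25)     # gazing point raised upward
uniform = shaken_gaze_point(face, -0.5)    # gazing point lowered downward
# X and Z are unchanged in both cases; only Y differs.
```

Under this reading, applying a positive or negative shake amount to the reference gazing point yields the helmet-position or uniform-position gazing point described in the embodiment.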
[0025] A game program according to claim 2 is the game program according to claim 1, and is a program for causing the computer to further realize the following function.
[0026] (7) A camera position data creation function for creating camera position data of the virtual camera.
[0027] Further, in this game program, the character data creation function creates the character data for the character further based on the camera position data of the virtual camera.
[0028] In the game realized by this program, the camera position data creation function additionally creates camera position data of the virtual camera. Here, the camera position data of the virtual camera is the position at which the virtual camera, which does not actually exist, is placed, for example a position determined by a three-dimensional coordinate position in the stadium. Specifically, when the virtual camera is placed in front of the stands, the camera position of the virtual camera is set to the front of the stands, and numerical data of the three-dimensional coordinate position of the camera position of the virtual camera is created. Furthermore, in this game program, the character data creation function creates the character data for the character based on the camera position data of the virtual camera, in addition to the reference character data and the camera gazing point data.
[0029] Here, the camera position data creation function creates the camera position data of the virtual camera, and the character data for the character is created based on that camera position data as well, so a baseball game with a reality close to that of a real-world baseball broadcast can be realized.
[0030] A game program according to claim 3 is the game program according to claim 1 or 2, in which the reference camera gazing point data is data capable of generating a waveform having a different amplitude for each predetermined period, and the camera gazing point data creation function creates the camera gazing point data by changing the amplitude of the reference camera gazing point data.
[0031] Here, suppose for example that the virtual camera swings vertically, increasing or decreasing the numerical data of the Y-axis coordinate position by a predetermined amount while maintaining the numerical data of the X-axis and Z-axis coordinate positions. By creating camera gazing point data that generates a waveform having a different amplitude (the maximum or minimum Y-axis amplitude) for each predetermined period, for example a sine-like waveform whose amplitude differs every 2π, a state in which the virtual camera swings vertically at predetermined intervals can be realized by a simple method.
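The waveform described in this paragraph, a sine-like wave whose amplitude differs for each 2π period, could be generated along the following lines. This is a minimal sketch under assumed names and amplitudes; the patent does not specify an implementation.

```python
import math

# Sketch (assumptions, not the patented implementation): a sine-like
# waveform whose amplitude differs for each 2-pi period, used as the
# vertical (Y-axis) offset of the camera gazing point.

def shake_offset(t, amplitudes):
    """Vertical shake offset at phase t (radians).

    amplitudes holds one amplitude per 2-pi period; the waveform keeps
    its sine shape, but its peak height changes from period to period.
    """
    period = int(t // (2 * math.pi)) % len(amplitudes)
    return amplitudes[period] * math.sin(t)

amps = [0.2, 0.05, 0.35]   # assumed amplitudes for successive periods
# Within the first period the peak is about 0.2; at the same phase in
# the second period the offset reaches only about 0.05, and so on.
offsets = [shake_offset(k * 0.1, amps) for k in range(200)]
```

Cycling through a short list of per-period amplitudes is one simple way to realize "a waveform having a different amplitude for each predetermined period" without storing the whole waveform.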
[0032] In the game program according to claim 4, which is the game program according to any one of claims 1 to 3, the predetermined game condition relating to the game is a condition relating to the shaking of the virtual camera, and the camera gazing point data creation function creates the camera gazing point data of the virtual camera further based on camera shake data relating to the shaking of the virtual camera.
[0033] Here, the predetermined game condition relating to the game is a condition relating to the shaking of the virtual camera: specifically, a condition under which the wind force generated in the stadium is equal to or greater than a predetermined value, or under which it is determined from the cameraman's erroneous-operation probability that the cameraman has made an erroneous operation. Also, the camera gazing point data creation function creates camera gazing point data in which the virtual camera has shaken by a predetermined shake amount (shake width) in the predetermined shake direction given by the camera shake data, for example a vertical swing that increases or decreases the numerical data of the Y-axis coordinate position by a predetermined amount while maintaining the numerical data of the X-axis and Z-axis coordinate positions, so a baseball game with a reality close to that of a real-world baseball broadcast can be realized.
[0034] In the game program according to claim 5, which is the game program according to claim 4, the camera gazing point data is numerical data obtained by multiplying the reference camera gazing point data by a predetermined camera shake coefficient obtained from the camera shake data.
[0035] Here, for example, by using a predetermined camera shake coefficient obtained by multiplying the camera shake data by a predetermined random number, the virtual camera shakes randomly, so a baseball game with a reality close to that of a real-world baseball broadcast can be realized.
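Paragraph [0035] multiplies the reference camera gazing point data by a camera shake coefficient obtained from the camera shake data and a random number. One plausible reading is sketched below; the coefficient form, value ranges, and names are assumptions, not taken from the patent.

```python
import random

# Assumed sketch of [0034]/[0035]: the camera shake coefficient E is
# derived from the camera shake data R and a random number, and the
# camera gazing point data is the reference gazing point multiplied
# by E, so the virtual camera shakes randomly.

def shake_coefficient(shake_data, rng):
    # A random factor in [-1, 1] randomizes both shake direction and
    # size; the coefficient stays close to 1 so the gazing point is
    # only perturbed, not relocated.
    return 1.0 + shake_data * rng.uniform(-1.0, 1.0)

def shaken_gaze_y(reference_y, shake_data, rng):
    return reference_y * shake_coefficient(shake_data, rng)

rng = random.Random(0)          # seeded only to make the sketch repeatable
y = shaken_gaze_y(1.6, 0.1, rng)
assert 1.44 <= y <= 1.76        # stays within +/-10% of the reference Y
```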
[0036] A game device according to claim 5 is a game device that realizes a game in which a motion screen of a character photographed by a virtual camera is displayed on a monitor. This game device comprises reference character data creation means, reference camera gazing point data creation means, game condition determination means, camera gazing point data creation means, character data creation means, and character motion display means. The reference character data creation means creates reference character data for the character photographed by the virtual camera. The reference camera gazing point data creation means creates reference camera gazing point data for the character to be photographed by the virtual camera. The game condition determination means determines whether or not a predetermined game condition relating to the game is satisfied. When the game condition determination means determines that the predetermined game condition is satisfied, the camera gazing point data creation means creates camera gazing point data of the virtual camera based on the reference camera gazing point data. The character data creation means creates character data for the character based on the reference character data and the camera gazing point data.
In the character motion display means, the character data created by the character data creation means is read, and the motion screen of the character photographed by the virtual camera is displayed.
[0037] A game method according to claim 6 is a game method for realizing a game in which a motion screen of a character photographed by a virtual camera is displayed on a monitor. This game method comprises a reference character data creation step, a reference camera gazing point data creation step, a game condition determination step, a camera gazing point data creation step, a character data creation step, and a character motion display step. In the reference character data creation step, reference character data for the character photographed by the virtual camera is created. In the reference camera gazing point data creation step, reference camera gazing point data for the character to be photographed by the virtual camera is created. In the game condition determination step, it is determined whether or not a predetermined game condition relating to the game is satisfied. In the camera gazing point data creation step, when it is determined in the game condition determination step that the predetermined game condition is satisfied, camera gazing point data of the virtual camera is created based on the reference camera gazing point data. In the character data creation step, character data for the character is created based on the reference character data and the camera gazing point data.
In the character action display step, the character data created in the character data creation step is read, and the action screen of the character photographed by the virtual camera is displayed.
Brief Description of the Drawings
[0038] [FIG. 1] Basic configuration diagram of a video game device according to an embodiment of the present invention.
[FIG. 2] Functional block diagram of the video game device.
[FIG. 3] Schematic diagram showing the data stored in the storage unit of the video game device.
[FIG. 4] Schematic diagram showing changes in the vertical shaking of the virtual camera.
[FIG. 5] Schematic diagram showing the relationship between the camera position of the virtual camera and the camera gazing point of the virtual camera on the batter character.
[FIG. 6] Television monitor view showing the batting motion display screen when the virtual camera shakes upward.
[FIG. 7] Television monitor view showing the batting motion display screen when the virtual camera is not shaking.
[FIG. 8] Television monitor view showing the batting motion display screen when the virtual camera shakes downward.
[FIG. 9] Flowchart for explaining the character motion display process.
[FIG. 10] Flowchart for explaining the camera gazing point data creation process.
Explanation of Reference Numerals
1 control unit
7 CPU
17 controller
20 television monitor
40 batting motion display screen
41 first camera gazing point
42 second camera gazing point
43 third camera gazing point
50 character display means
51 character motion means
52 camera position data creation means
53 reference character data creation means
54 reference camera gazing point data creation means
55 game condition determination means
56 camera gazing point data creation means
57 character data creation means
58 character motion display means
70 batter character
80 virtual camera
C1 to Cn character data
CB1 to CBn reference character data
E camera shake coefficient
ER camera shake random number data
EW wind influence data
L1 to Ln camera gazing point data
LB1 to LBn reference camera gazing point data
P1 to Pn camera position data
R camera shake data
W wind data
BEST MODE FOR CARRYING OUT THE INVENTION
[0040] [Configuration and Operation of the Game Device]
FIG. 1 shows the basic configuration of a game device according to an embodiment of the present invention. Here, a home video game device will be described as an example of a video game device. The home video game device comprises a home game machine main body and a home television. The recording medium 10 can be loaded into the home game machine main body, and game data is read from the recording medium 10 as appropriate to execute the game. The content of the game executed in this way is displayed on the home television.
[0041] The game system of the home video game device comprises a control unit 1, a storage unit 2, an image display unit 3, an audio output unit 4, and an operation input unit 5, each connected via a bus 6. The bus 6 includes an address bus, a data bus, and a control bus. Here, the control unit 1, the storage unit 2, the audio output unit 4, and the operation input unit 5 are included in the home game machine main body of the home video game device, and the image display unit 3 is included in the home television.
[0042] The control unit 1 is provided mainly to control the progress of the entire game based on the game program. The control unit 1 comprises, for example, a CPU 7 (Central Processing Unit), a signal processor 8, and an image processor 9, which are connected to one another via the bus 6. The CPU 7 interprets instructions from the game program and performs various data processing and control; for example, it instructs the signal processor 8 to supply image data to the image processor. The signal processor 8 mainly performs calculations in three-dimensional space, position conversion from three-dimensional space to pseudo three-dimensional space, light source calculation, and generation and processing of image and audio data. The image processor 9 mainly writes the image data to be drawn into the RAM 12, based on the calculation and processing results of the signal processor 8.
[0043] The storage unit 2 is provided mainly to store program data and the various data used by the program data. The storage unit 2 comprises, for example, a recording medium 10, an interface circuit 11, and a RAM 12 (Random Access Memory). The interface circuit 11 is connected to the recording medium 10, and the interface circuit 11 and the RAM 12 are connected via the bus 6. The recording medium 10 records operating system program data, image data, audio data, and game data consisting of various program data. The recording medium 10 is, for example, a ROM (Read Only Memory) cassette, an optical disk, or a flexible disk, and stores operating system program data, game data, and the like. The recording medium 10 also includes a card-type memory, which is mainly used to save various game parameters at the point of interruption when the game is interrupted. The RAM 12 is used to temporarily store various data read from the recording medium 10 and to temporarily record processing results from the control unit 1. The RAM 12 stores, together with the various data, address data indicating the storage locations of those data, so that reading and writing are possible by designating an arbitrary address.
[0044] The image display unit 3 is provided mainly to output, as images, the image data written into the RAM 12 by the image processor 9 and the image data read from the recording medium 10. The image display unit 3 comprises, for example, a television monitor 20, an interface circuit 21, and a D/A converter 22 (Digital-to-Analog converter). The D/A converter 22 is connected to the television monitor 20, the interface circuit 21 is connected to the D/A converter 22, and the bus 6 is connected to the interface circuit 21. Image data is supplied to the D/A converter 22 via the interface circuit 21, converted there into an analog image signal, and output to the television monitor 20 as an image.
[0045] Here, image data includes, for example, polygon data and texture data.
Polygon data is the coordinate data of the vertices that constitute a polygon. Texture data is for setting a texture on a polygon, and consists of texture instruction data and texture color data: the texture instruction data associates polygons with textures, and the texture color data designates the color of each texture. Polygon address data and texture address data, indicating the storage locations of the respective data, are associated with the polygon data and the texture data. With such image data, the signal processor 8 performs coordinate conversion and perspective projection conversion on the three-dimensional polygon data indicated by the polygon address data, based on the movement amount data and rotation amount data of the screen itself (the viewpoint), replacing it with polygon data in two-dimensional space (two-dimensional polygon data). A polygon outline is then composed from a plurality of two-dimensional polygon data, and the texture data indicated by the texture address data is written into the interior region of the polygon. In this way, objects in which a texture is pasted onto each polygon, that is, various characters, can be represented.
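The coordinate conversion and perspective projection conversion described above, which map polygon vertices from three-dimensional space onto the two-dimensional screen as seen from the viewpoint, can be illustrated with the standard divide-by-depth projection. This is a generic sketch of the technique, not the actual processing of the signal processor 8.

```python
# Generic sketch of perspective projection (not the device's actual
# implementation): a polygon vertex given in viewpoint (camera) space
# is projected onto the 2D screen by dividing by its depth z.

def project_vertex(vertex, focal_length=1.0):
    """Project a 3D vertex (x, y, z) in camera space onto the screen."""
    x, y, z = vertex
    if z <= 0.0:
        raise ValueError("vertex is behind the viewpoint")
    return (focal_length * x / z, focal_length * y / z)

# A vertex twice as far from the viewpoint lands half as far from the
# screen centre, which is what makes the projected image look 3D.
near = project_vertex((1.0, 2.0, 2.0))   # (0.5, 1.0)
far = project_vertex((1.0, 2.0, 4.0))    # (0.25, 0.5)
```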
[0046] The audio output unit 4 is provided mainly to output, as sound, the audio data read from the recording medium 10. The audio output unit 4 comprises, for example, a speaker 13, an amplifier circuit 14, a D/A converter 15, and an interface circuit 16. The amplifier circuit 14 is connected to the speaker 13, the D/A converter 15 is connected to the amplifier circuit 14, the interface circuit 16 is connected to the D/A converter 15, and the bus 6 is connected to the interface circuit 16. Audio data is supplied to the D/A converter 15 via the interface circuit 16, converted there into an analog audio signal, amplified by the amplifier circuit 14, and output from the speaker 13 as sound. Audio data includes, for example, ADPCM (Adaptive Differential Pulse Code Modulation) data and PCM (Pulse Code Modulation) data. In the case of ADPCM data, sound can be output from the speaker 13 by the processing method described above. In the case of PCM data, the PCM data is converted into ADPCM data in the RAM 12, after which sound can be output from the speaker 13 by the same processing method.
[0047] The operation input unit 5 mainly comprises a controller 17, an operation information interface circuit 18, and an interface circuit 19. The operation information interface circuit 18 is connected to the controller 17, the interface circuit 19 is connected to the operation information interface circuit 18, and the bus 6 is connected to the interface circuit 19.
[0048] The controller 17 is an operation device used by the player to input various operation commands, and sends operation signals corresponding to the player's operations to the CPU 7. The controller 17 is provided with a first button 17a, a second button 17b, a third button 17c, a fourth button 17d, an up key 17U, a down key 17D, a left key 17L, a right key 17R, an L1 button 17L1, an L2 button 17L2, an R1 button 17R1, an R2 button 17R2, a start button 17e, a select button 17f, a left stick 17SL, and a right stick 17SR.
[0049] The up key 17U, down key 17D, left key 17L, and right key 17R are used, for example, to give the CPU 7 commands to move a character or cursor up, down, left, or right on the screen of the television monitor 20.
[0050] The start button 17e is used, for example, to instruct the CPU 7 to load the game program from the recording medium 10.
[0051] The select button 17f is used, for example, to instruct the CPU 7 to make various selections for the game program loaded from the recording medium 10.
[0052] The left stick 17SL and the right stick 17SR are stick-type controllers of substantially the same configuration as a so-called joystick. Each stick-type controller has an upright stick, which can be tilted about a fulcrum from the upright position over 360 degrees, including forward, backward, left, and right. According to the tilt direction and tilt angle of the stick, the left stick 17SL and the right stick 17SR send x-coordinate and y-coordinate values, with the upright position as the origin, to the CPU 7 as operation signals via the operation information interface circuit 18 and the interface circuit 19.
[0053] Various functions are assigned to the first button 17a, second button 17b, third button 17c, fourth button 17d, L1 button 17L1, L2 button 17L2, R1 button 17R1, and R2 button 17R2 according to the game program loaded from the recording medium 10.
[0054] Each button and key of the controller 17, other than the left stick 17SL and the right stick 17SR, is an on/off switch that turns on when pressed from its neutral position by an external pressing force and returns to the neutral position and turns off when the pressing force is released.
[0055] The general operation of the home video game apparatus configured as described above is as follows. When a power switch (not shown) is turned on and the game system 1 is powered up, the CPU 7 reads image data, audio data, and program data from the recording medium 10 in accordance with the operating system stored on the recording medium 10. Some or all of the read image data, audio data, and program data are stored in the RAM 12. The CPU 7 then issues commands for the image data and audio data stored in the RAM 12 on the basis of the program data stored in the RAM 12.
[0056] For image data, based on commands from the CPU 7, the signal processor 8 first performs calculations such as the character's position in three-dimensional space and light-source calculations. Next, the image processor 9 writes the image data to be drawn into the RAM 12 based on the calculation results of the signal processor 8. The image data written to the RAM 12 is supplied to the D/A converter 22 via the interface circuit 21, where it is converted into an analog video signal. The signal is then supplied to the television monitor 20 and displayed as an image.
[0057] For audio data, the signal processor 8 first generates and processes the audio data based on commands from the CPU 7. Here the audio data is subjected to processing such as pitch conversion, noise addition, envelope setting, level setting, and reverb addition. The audio data is then output from the signal processor 8 and supplied to the D/A converter 15 via the interface circuit 16, where it is converted into an analog audio signal. The signal is output as sound from the speaker 13 via the amplifier circuit 14.
[0058] [Outline of Various Processes in the Game Device]
The game executed on this game machine is, for example, a baseball game. The game machine can realize a game in which a character displayed on the television monitor 20 is operated by operating the controller 17. FIG. 2 is a functional block diagram for explaining the functions that play a major role in the present invention. The control unit 1 mainly includes character display means 50, character action means 51, camera position data creation means 52, reference character data creation means 53, reference camera gazing point data creation means 54, game condition judgment means 55, camera gazing point data creation means 56, character data creation means 57, and character action display means 58.
[0059] The character display means 50 has a function of displaying a batter character 70 on the television monitor 20. The character display means 50 displays the batter character 70 shown in FIGS. 6 to 8 on the television monitor 20.
[0060] With this means, the batter image data corresponding to the batter character 70 is supplied from the storage unit 2, for example the recording medium 10, to the RAM 12 and stored in the RAM 12 when the game program is loaded. At this time, the batter image data is recognized by the control unit 1, for example the CPU 7. Likewise, batter coordinate data for displaying the batter image data on the television monitor 20 is supplied from the storage unit 2, for example the recording medium 10, to the RAM 12 and stored in the RAM 12, and is recognized by the control unit 1, for example the CPU 7. The batter image data stored in the RAM 12 is then supplied to the television monitor 20 via the image processor 9 based on an instruction from the CPU 7, and is displayed at a predetermined position on the television monitor 20 based on the batter coordinate data. The instruction to display the batter image data at the predetermined position on the television monitor 20 is issued by the CPU 7.

[0061] The character action means 51 has a function of making the batter character 70 perform actions. The character action means 51 makes the batter character 70 perform actions.
[0062] With this means, when a signal from the controller 17 for making the batter character 70 act is recognized by the control unit 1, for example the CPU 7, the batter image data corresponding to the batter character 70 is processed by the control unit 1, for example by the signal processor 8 and the image processor 9, based on instructions from the CPU 7. The processed image data is then supplied from the RAM 12 to the television monitor 20, and the swing action of the batter character 70 is displayed on the television monitor 20 as a moving image.
[0063] The camera position data creation means 52 has a function of creating camera position data P1 to Pn (see FIG. 3) for the virtual camera 80 (see FIG. 5). The camera position data creation means 52 creates the camera position data P1 to Pn of the virtual camera 80. Here, the camera position data P1 to Pn of the virtual camera 80 are the positions at which the virtual camera 80, which does not actually exist, is placed, for example positions determined by three-dimensional coordinate positions in the stadium. Specifically, when the virtual camera 80 is placed in front of the stand, the camera position P1 of the virtual camera 80 is set in front of the stand, and numerical data for the three-dimensional coordinate position of the camera position P1 of the virtual camera 80 is created. The various data of the camera position data P1 to Pn recognized by the camera position data creation means 52 are stored in the RAM 12.
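A minimal sketch of camera position data held as three-dimensional coordinates, as described above; the coordinate values and the position labels other than P1 are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class CameraPosition:
    """Numerical data for one virtual-camera placement, e.g. P1 in front of
    the stand: a 3-D coordinate position in the stadium (values illustrative)."""
    x: float
    y: float
    z: float

# Hypothetical table of camera position data P1..Pn kept for lookup
# (in the disclosure these are stored in the RAM 12).
camera_positions = {
    "P1": CameraPosition(0.0, 15.0, -60.0),  # in front of the stand
    "P2": CameraPosition(40.0, 10.0, -20.0), # another placement, e.g. first-base side
}
```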
[0064] The reference character data creation means 53 has a function of creating reference character data CB1 to CBn (see FIG. 3) for a character photographed by the virtual camera 80. The reference character data creation means 53 creates the reference character data CB1 to CBn for the character photographed by the virtual camera 80. Here, the reference character data CB1 to CBn for a character are the base data for generating the character's actions as a moving image, and are basic data that do not depend on the shaking of the virtual camera 80. Specifically, the reference character data CB1 for the batter character 70 standing in the batter's box is data for generating the batting-swing action of a specific batter character 70 as a moving image, that is, data for generating a moving image in which the batter character 70 performs the batting swing while the virtual camera 80 is not shaking, as shown in FIG. 7. The various data of the reference character data CB1 to CBn recognized by the reference character data creation means 53 are stored in the RAM 12.
[0065] The reference camera gazing point data creation means 54 has a function of creating reference camera gazing point data LB1 to LBn (see FIG. 3) for the character to be photographed by the virtual camera 80. The reference camera gazing point data creation means 54 creates the reference camera gazing point data LB1 to LBn for the character to be photographed by the virtual camera 80. Here, the reference camera gazing point data LB1 to LBn for a character are the points at which the virtual camera 80, which does not actually exist, focuses on the character being photographed. As shown in FIG. 5, they are numerical data for the points on the batter character 70 at which straight lines extending from the virtual camera 80 toward the batter character 70 (straight lines SL1 to SL3 connecting the tip of the virtual camera 80 with a first camera gazing point 41, a second camera gazing point 42, and a third camera gazing point 43, which correspond to camera gazing point data L1 to L3 described later) intersect the batter character 70, for example numerical data for three-dimensional coordinate positions on the character. In FIG. 5, these are the points on the batter character 70 at which the straight lines SL1 to SL3 extending from the virtual camera 80 placed in front of the stand intersect the batter character 70. The reference camera gazing point data LB1 to LB3 correspond to the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43, which are the camera gazing point data L1 to L3 described later; in particular, the second camera gazing point 42, which is the camera gazing point data L2 of the virtual camera 80 when it is not shaking, is identical to the reference camera gazing point data LB2. The various data of the reference camera gazing point data LB1 to LBn recognized by the reference camera gazing point data creation means 54 are stored in the RAM 12.
[0066] The game condition judgment means 55 has a function of judging whether a predetermined game condition relating to the game is satisfied. The game condition judgment means 55 judges whether the predetermined game condition relating to the game is satisfied. Here, the predetermined game condition is a specific condition under which the virtual camera 80 shakes, for example the influence of wind arising in the stadium or a misoperation by the cameraman. When the predetermined game condition is satisfied, specifically when the wind data W (see FIG. 3) for wind arising in the stadium is at or above a predetermined value, or when the cameraman is judged, according to a misoperation probability, to have misoperated the camera, the game condition judgment means 55 judges that the virtual camera 80 has shaken by a predetermined shake amount (shake width) in a predetermined shake direction given by the shake data of the virtual camera 80; for example, the virtual camera 80 sways vertically, and the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions are maintained. When the predetermined game condition is not satisfied, the game condition judgment means 55 judges that the virtual camera 80 is not shaking. The various data of the judgment result recognized by the game condition judgment means 55, namely whether the virtual camera 80 is shaking or not, are stored in the RAM 12.
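A minimal sketch of the judgment just described: the camera is judged to shake when the wind data W reaches a threshold or a random cameraman misoperation occurs, and the shake moves only the Y coordinate while X and Z are maintained. The threshold value, the misoperation probability, and all names are illustrative assumptions.

```python
import random

WIND_THRESHOLD = 5.0       # assumed "predetermined value" for the wind data W
MISOPERATION_PROB = 0.02   # assumed probability of a cameraman misoperation

def camera_shakes(wind_w, rng=random.random):
    """Judge the predetermined game condition: True when the wind data W is at
    or above the threshold, or when a random misoperation is deemed to occur."""
    return wind_w >= WIND_THRESHOLD or rng() < MISOPERATION_PROB

def apply_shake(camera_pos, shake_amount):
    """Shake the camera vertically: keep the X- and Z-axis coordinate data and
    increase or decrease only the Y-axis coordinate by the shake amount."""
    x, y, z = camera_pos
    return (x, y + shake_amount, z)
```

Passing a fixed `rng` makes the misoperation branch deterministic for testing; in the game the default `random.random` would stand in for the misoperation probability.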
[0067] The camera gazing point data creation means 56 has a function of creating camera gazing point data L1 to Ln (see FIG. 3) for the virtual camera 80 on the basis of the reference camera gazing point data LB1 to LBn when the game condition judgment means 55 judges that the predetermined game condition is satisfied. When the game condition judgment means 55 judges that the predetermined game condition is satisfied, that is, when the virtual camera 80 is judged to be shaking, the camera gazing point data creation means 56 creates the camera gazing point data L1 to Ln of the virtual camera 80 on the basis of the reference camera gazing point data LB1 to LBn. Here, the camera gazing point data L1 to Ln for a character are the points at which the virtual camera 80, which does not actually exist, focuses on the character being photographed. As shown in FIG. 5, they are numerical data for the points on the batter character 70 at which straight lines extending from the virtual camera 80 toward the batter character 70 (straight lines SL1 to SL3 connecting the tip of the virtual camera 80 with the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43) intersect the batter character 70, for example numerical data for three-dimensional coordinate positions on the character; in FIG. 5, the camera gazing point data L1 to L3 coincide with the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43. The camera gazing point data L1 to Ln for a character are data dependent on the shaking of the virtual camera 80, obtained by adding the camera shake data R (see FIG. 3) and the wind data W (see FIG. 3) of the virtual camera 80 to the reference camera gazing point data LB1 to LBn for the character. Specifically, when the game condition judgment means 55 judges that the virtual camera 80 is shaking by a predetermined shake amount (shake width) in a predetermined shake direction given by the camera shake data R of the virtual camera 80, for example swaying vertically so that the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions are maintained, the camera gazing point data creation means 56 creates numerical data for the three-dimensional coordinate positions to which the camera gazing point data L1 to Ln of the virtual camera 80 have swung vertically relative to the face position of the batter character 70. First, the second camera gazing point 42 of the virtual camera 80 when it is not shaking, shown in FIGS. 5 and 7, is set at the nose position of the batter character 70, and numerical data for the three-dimensional coordinate position of the nose of the batter character 70 is created as the camera gazing point data L2 for the batter character 70. Next, when the game condition judgment means 55 judges that the virtual camera 80 has shaken upward, the first camera gazing point 41 of the upward-shaken virtual camera 80, shown in FIGS. 5 and 6, is set by the camera gazing point data creation means 56 at a helmet position raised by a predetermined amount above the nose position of the second camera gazing point 42, and numerical data for the three-dimensional coordinate position of the helmet of the batter character 70 is created as the camera gazing point data L1 for the batter character 70. Likewise, when the game condition judgment means 55 judges that the virtual camera 80 has shaken downward, the third camera gazing point 43 of the downward-shaken virtual camera 80, shown in FIGS. 5 and 8, is set by the camera gazing point data creation means 56 at a chin position lowered by a predetermined amount below the nose position of the second camera gazing point 42, and numerical data for the three-dimensional coordinate position of the chin of the batter character 70 is created as the camera gazing point data L3 for the batter character 70. The various data of the camera gazing point data L1 to Ln recognized by the camera gazing point data creation means 56 are stored in the RAM 12.

[0068] The character data creation means 57 has a function of creating character data C1 to Cn (see FIG. 3) for the character on the basis of the camera position data P1 to Pn created by the camera position data creation means 52, the reference character data CB1 to CBn created by the reference character data creation means 53, and the camera gazing point data L1 to Ln created by the camera gazing point data creation means 56. The character data creation means 57 creates the character data C1 to Cn for the character on the basis of these camera position data P1 to Pn, reference character data CB1 to CBn, and camera gazing point data L1 to Ln. Here, the character data C1 to Cn for a character are data for generating the player character's actions as a moving image, and are data that depend on the shaking of the virtual camera 80 to which the camera shake data R and wind data W of the virtual camera 80 have been added. Specifically, on the basis of the camera position data P1 to Pn, the reference character data CB1 to CBn, and the camera gazing point data L1 to Ln, the camera gazing point of the virtual camera 80 is set at the helmet position raised by a predetermined amount above the nose position (the first camera gazing point 41 shown in FIGS. 5 and 6, relative to the second camera gazing point 42 shown in FIGS. 5 and 7) or at the chin position lowered by a predetermined amount below the nose position (the third camera gazing point 43 shown in FIGS. 5 and 8), and data are created for generating the batting-action screen of the batter character 70 standing in the batter's box, as photographed by the virtual camera 80 to which the camera shake data R and wind data W have been added. First, for the second camera gazing point 42 of the virtual camera 80 when it is not shaking (FIGS. 5 and 7), character data C2 for the batter character 70, identical to the reference character data CB1 for the batter character 70, is created on the basis of the reference character data CB1 corresponding to the second camera gazing point 42. Next, for the first camera gazing point 41 of the upward-shaken virtual camera 80 (FIGS. 5 and 6), character data C1 for the batter character 70 is created on the basis of the reference character data CB1 corresponding to the first camera gazing point 41; the batter character performs the same action as in the reference character data CB1, but the virtual camera 80 is angled upward. Likewise, for the third camera gazing point 43 of the downward-shaken virtual camera 80 (FIGS. 5 and 8), character data C3 for the batter character 70 is created on the basis of the reference character data CB1 corresponding to the third camera gazing point 43; the batter character performs the same action as in the reference character data CB1, but the virtual camera 80 is angled downward. The various data of the character data C1 to Cn recognized by the character data creation means 57 are stored in the RAM 12.
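A minimal sketch of the gazing-point creation described above: the gazing point stays at the reference nose position when the camera is steady (L2 = LB2), and moves to the helmet or chin position when the camera shakes up or down. The coordinate values and offset amounts are illustrative assumptions.

```python
# Reference gazing point LB2: the batter character's nose position
# as a 3-D coordinate (values illustrative).
NOSE = (0.0, 1.60, 0.0)
HELMET_OFFSET = 0.15  # assumed upward shift when the camera shakes upward
CHIN_OFFSET = 0.12    # assumed downward shift when the camera shakes downward

def gaze_point(reference_point, shake_direction):
    """Create camera gazing point data from the reference gazing point:
    helmet position for an upward shake (L1), chin position for a downward
    shake (L3), and the reference point itself when there is no shake (L2)."""
    x, y, z = reference_point
    if shake_direction > 0:   # camera shaken upward -> gaze at helmet position
        return (x, y + HELMET_OFFSET, z)
    if shake_direction < 0:   # camera shaken downward -> gaze at chin position
        return (x, y - CHIN_OFFSET, z)
    return (x, y, z)          # no shake: L2 is identical to LB2
```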
[0069] The character action display means 58 has a function of reading the character data C1 to Cn created by the character data creation means 57 and displaying an action screen of the character photographed by the virtual camera 80. The character action display means 58 reads the character data C1 to Cn created by the character data creation means 57 and displays the action screen of the character photographed by the virtual camera 80. First, for the second camera gazing point 42 of the virtual camera 80 when it is not shaking (FIGS. 5 and 7), the character data C2 created by the character data creation means 57 is read, and a batting action display screen 40 (see FIG. 7), the batting action screen for the batter character 70 standing in the batter's box as photographed by the virtual camera 80, is displayed on the television monitor 20. Next, for the first camera gazing point 41 of the upward-shaken virtual camera 80 (FIGS. 5 and 6), the character data C1 created by the character data creation means 57 is read, and the batting action display screen 40 (see FIG. 6) for the batter character 70 standing in the batter's box, as photographed by the upward-shaken virtual camera 80, is displayed on the television monitor 20. Likewise, for the third camera gazing point 43 of the downward-shaken virtual camera 80 (FIGS. 5 and 8), the character data C3 created by the character data creation means 57 is read, and the batting action display screen 40 (see FIG. 8) for the batter character 70 standing in the batter's box, as photographed by the downward-shaken virtual camera 80, is displayed on the television monitor 20.
[0070] Here, when the game condition judgment means 55 judges that the virtual camera 80 has shaken, the camera gazing point data creation means 56 and the character data creation means 57 create the camera gazing point data L1 to Ln and the character data C1 to Cn for the batter character 70, to which the camera shake data R and wind data W of the virtual camera 80 have been added, and the batting action display screen 40, the batting action screen for the batter character 70 standing in the batter's box as photographed by the virtual camera 80 with the camera shake data R and wind data W added, is displayed on the television monitor 20. Because this game program can reproduce the shaking of a television camera, as in a real-world baseball broadcast, it can realize a realistic baseball game close to a real-world broadcast.
[0071] [Processing Flow When Executing the Character Action Display System and Batting Result Determination System in the Baseball Game]

The character action display system and camera gazing point data creation system in the baseball game of the present embodiment are explained using the schematic diagram of FIG. 4, which shows the change in the vertical shaking of the virtual camera 80, and the flowcharts shown in FIGS. 9 and 10.
[0072] First, the virtual camera 80 sways vertically, as shown in FIGS. 4 and 5. The + direction in FIGS. 4 and 5 means upward, and the − direction means downward. As shown in FIG. 4, when the numerical data of the Y-axis coordinate position is increased or decreased by a predetermined amount while the numerical data of the X-axis and Z-axis coordinate positions are maintained, the camera gazing point data L1 to Ln are determined so that the waveform has a different amplitude (the maximum or minimum Y-axis amplitude) in each predetermined period, for example a sine-like waveform whose amplitude differs every 2π. Specifically, as shown in FIG. 4, in the period from 0 to 2π, the virtual camera 80 sways upward to a maximum amplitude of +1 at position A, is not shaking at position B, and sways downward to a maximum amplitude of −1 at position C.
[0073] Here, position A in FIG. 4 corresponds to the first camera gazing point 41 of the virtual camera 80 shown in FIG. 5 and FIG. 6, where the camera has shaken upward; position B in FIG. 4 corresponds to the second camera gazing point 42 of the virtual camera 80 shown in FIG. 5 and FIG. 7, where the camera is not shaking; and position C in FIG. 4 corresponds to the third camera gazing point 43 of the virtual camera 80 shown in FIG. 5 and FIG. 8, where the camera has shaken downward. As shown in FIG. 4, in the period from 2π to 4π, the virtual camera 80 swings upward to a maximum amplitude of +3 at position D, is not shaking at position E, and swings downward to a maximum amplitude of -3 at position F. Likewise, in the period from 4π to 6π, the virtual camera 80 swings upward to a maximum amplitude of +2 at position G, is not shaking at position H, and swings downward to a maximum amplitude of -2 at position I. The values -3 to -1 and +1 to +3 are relative ratios of the amplitude with which the virtual camera 80 shakes; for example, the shaking of the virtual camera 80 in the period from 2π to 4π is three times as large as its shaking in the period from 0 to 2π. Accordingly, the first camera gazing point 41 for the period from 2π to 4π is placed above the position of the first camera gazing point 41 for the period from 0 to 2π, and the third camera gazing point 43 for the period from 2π to 4π is placed below the position of the third camera gazing point 43 for the period from 0 to 2π, which expresses that the shaking of the virtual camera 80 is larger.
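The period-dependent shaking described above can be sketched as a sine wave whose peak amplitude changes every 2π. The following is a minimal illustration; the function name and the list of amplitudes are assumptions for the sketch, not part of the embodiment:

```python
import math

def camera_y_offset(t, period_amplitudes):
    """Y offset of the virtual camera at time t (radians).

    X and Z are held fixed; only Y oscillates. Each 2*pi period has its
    own peak amplitude, so successive cycles shake by different amounts
    (+1, +3, +2 in the example of FIG. 4).
    """
    k = int(t // (2 * math.pi)) % len(period_amplitudes)
    return period_amplitudes[k] * math.sin(t)

# Peak amplitudes for the periods 0-2pi, 2pi-4pi, 4pi-6pi
amps = [1.0, 3.0, 2.0]
print(round(camera_y_offset(math.pi / 2, amps), 6))                # position A: 1.0
print(round(camera_y_offset(2 * math.pi + math.pi / 2, amps), 6))  # position D: 3.0
```

The second peak is three times the first, matching the relative ratios of FIG. 4.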
[0074] The character action display system, which displays on the television monitor 20 the batting action display screen 40, that is, the batting action screen shown when the batter character 70 photographed by the virtual camera 80 shaking upward and downward in this way enters the batter's box, will be described with reference to the flowchart shown in FIG. 9.
[0075] First, camera position data P1 to Pn of the virtual camera 80, which does not actually exist, are created as the positions at which the virtual camera 80 is placed (S1). In the camera position data creation process of step S1, the camera position data P1 to Pn of the virtual camera 80, which are position data determined by three-dimensional coordinate positions within the stadium, are created. Specifically, when the virtual camera 80 is placed in front of the stands, the camera position P1 of the virtual camera 80 is set in front of the stands, and numerical data of the three-dimensional coordinate position of the camera position P1 are created. The camera position data P1 to Pn created in the camera position data creation process of step S1 are stored in the RAM 12. When the camera position data creation process of step S1 has been performed, the flow proceeds to the reference character data creation process (S2).
[0076] In the reference character data creation process of step S2, reference character data CB1 to CBn for the character photographed by the virtual camera 80 are created. The reference character data CB1 to CBn are data serving as a reference for generating the actions performed by the character as a moving image, and are basic data that do not depend on the shaking of the virtual camera 80. Specifically, the reference character data CB1 for the batter character 70 in the batter's box are data for generating the batting swing action of a particular batter character 70 as a moving image, that is, data for generating a moving image in which the batter character 70 performs the batting swing action while the virtual camera 80 is not shaking, as shown in FIG. 7. The reference character data CB1 to CBn created in the reference character data creation process of step S2 are stored in the RAM 12. When the reference character data creation process of step S2 has been performed, the flow proceeds to the reference camera gazing point data creation process (S3).
[0077] In the reference camera gazing point data creation process of step S3, reference camera gazing point data LB1 to LBn for the character to be photographed by the virtual camera 80 are created. The reference camera gazing point data LB1 to LBn are the points at which the virtual camera 80, which does not actually exist, focuses on the target character. As shown in FIG. 5, they are numerical data of the points on the batter character 70 at which straight lines extending from the virtual camera 80 toward the batter character 70 (straight lines SL1 to SL3 connecting the tip of the virtual camera 80 with the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43, respectively) intersect the batter character 70, for example numerical data of three-dimensional coordinate positions on the character. In FIG. 5, these are the points on the batter character 70 where the straight lines SL1 to SL3, extending from the virtual camera 80 placed in front of the stands toward the batter character 70, intersect the batter character 70. Here, the reference camera gazing point data LB1 to LB3 correspond to the first camera gazing point 41, the second camera gazing point 42, and the third camera gazing point 43, which are the camera gazing point data L1 to L3 described later; in particular, the second camera gazing point 42, which is the camera gazing point data L2 of the virtual camera 80 when it is not shaking, is identical to the reference camera gazing point data LB2. The reference camera gazing point data LB1 to LBn created in the reference camera gazing point data creation process of step S3 are stored in the RAM 12. When the reference camera gazing point data creation process of step S3 has been performed, it is judged whether or not a predetermined game condition for the game is satisfied (S4).
[0078] In the game condition judgment process of step S4, it is judged whether or not a predetermined game condition for the game, that is, a specific condition under which the virtual camera 80 shakes, is satisfied. Specifically, when the wind power data W of the wind generated in the stadium (see FIG. 3) is equal to or greater than a predetermined value, or when it is judged from the probability of an erroneous operation that the cameraman has operated erroneously, the virtual camera 80 is judged to have shaken by a predetermined shake amount (shake width) in a predetermined shake direction given by the shake data of the virtual camera 80. For example, the virtual camera 80 swings in the vertical direction as in the schematic diagram of FIG. 4, with the numerical data of the Y-axis coordinate position increased or decreased by a predetermined amount (-3 to -1 and +1 to +3 in FIG. 4) while the numerical data of the X-axis and Z-axis coordinate positions are maintained, and the flow proceeds to the camera gazing point data creation process (S5). When the predetermined game condition is not satisfied in the game condition judgment process of step S4, it is judged that the virtual camera 80 is not shaking, and the flow proceeds to the character action display process (S7).
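The game condition judgment above can be sketched as follows. The wind threshold and the erroneous-operation probability are placeholder values, since the text says only that a predetermined wind value and an error probability exist:

```python
import random

# Both constants are illustrative assumptions, not values from the embodiment.
WIND_THRESHOLD = 0.5
CAMERAMAN_ERROR_PROB = 0.05

def camera_should_shake(wind_w, rng=random.random):
    """Game condition judgment: decide whether the virtual camera shakes.

    The camera is judged to have shaken when the in-stadium wind power
    data W reaches the predetermined value, or when a random draw says
    the (virtual) cameraman operated erroneously.
    """
    return wind_w >= WIND_THRESHOLD or rng() < CAMERAMAN_ERROR_PROB

print(camera_should_shake(0.8, rng=lambda: 1.0))  # strong wind: True
print(camera_should_shake(0.1, rng=lambda: 1.0))  # calm, no error: False
```

Passing `rng` explicitly makes the random branch deterministic for testing; in the game it would simply use `random.random`.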
In the camera gazing point data creation process of step S5, as shown in FIG. 10, the reference camera gazing point data LB1 to LBn created in the reference camera gazing point data creation process of step S3 are first read from the RAM 12 (S11). When the reference camera gazing point data reading process of step S11 has been performed, the camera shake data R (see FIG. 3) of the virtual camera 80 is read (S12). The camera shake data R is numerical data between 0 and 1 inclusive, and the larger the value, the larger the shaking of the virtual camera 80. When the camera shake data R has been read in step S12, camera shake random number data ER (see FIG. 3) of the virtual camera 80 is calculated (S13). The camera shake random number data ER is numerical data between 0 and R inclusive (and hence between 0 and 1), calculated by a random number operation. When the camera shake random number data ER has been calculated in the camera shake random number calculation process of step S13, the wind power data W is read from the RAM 12 (S14). When the wind power data reading process of step S14 has been performed, wind influence data EW (see FIG. 3) is calculated (S15). The wind influence data EW is numerical data representing the degree of influence of the wind, calculated by dividing the wind power data W by the maximum value of the wind power data W, and is numerical data between 0 and 1 inclusive. When the wind influence data EW has been calculated in the wind influence data calculation process of step S15, a camera shake coefficient E (see FIG. 3) is calculated (S16). The camera shake coefficient E is numerical data obtained by adding the camera shake random number data ER calculated in the camera shake random number calculation process of step S13 and the wind influence data EW calculated in the wind influence data calculation process of step S15. When the camera shake coefficient E has been calculated in the camera shake coefficient calculation process of step S16, the camera gazing point data calculation process is performed (S17). In the camera gazing point data calculation process of step S17, the camera gazing point data L1 to Ln (see FIG. 3) of the virtual camera 80 are created using the reference camera gazing point data LB1 to LBn read in the reference camera gazing point data reading process of step S11 and the camera shake coefficient E calculated in the camera shake coefficient calculation process of step S16. In the schematic diagram of FIG. 4, the amplitude S of the virtual camera 80 is obtained by multiplying the maximum value of the amplitude S (position D in FIG. 4) by the camera shake coefficient E calculated in the camera shake coefficient calculation process of step S16. The amplitude S of the virtual camera 80 never exceeds this maximum value; if the product exceeds the maximum value, the amplitude S is set equal to the maximum value. The shaking of the virtual camera 80, which follows the sine curve shown in FIG. 4, is then determined by multiplying the numerical data of the Y-axis coordinate position by the amplitude S and by sin(T) (where T is time), while the numerical data of the X-axis and Z-axis coordinate positions are maintained. The camera gazing point data calculation process of step S17 is repeated every period of 2π, and since a different amplitude S is obtained every period of 2π, the shaking of the virtual camera 80 follows a sine curve that differs every 2π, as shown in FIG. 4. The camera gazing point data L1 to Ln created in the camera gazing point data calculation process of step S17 are stored in the RAM 12. When the camera gazing point data L1 to Ln of the virtual camera 80 have been created in the camera gazing point data calculation process of step S17, the flow proceeds to the character data creation process of FIG. 9 (S6).
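The calculation chain of steps S12 to S17 can be sketched as below. The function names are assumptions for the sketch; the arithmetic (ER drawn from [0, R], EW equal to W divided by the maximum wind value, E equal to ER plus EW, and S capped at its maximum) follows the text:

```python
import math
import random

def shake_amplitude(r, wind_w, wind_w_max, s_max, rng=random.uniform):
    """Amplitude S for one 2*pi period of camera shake (steps S12-S17).

    ER: random number in [0, R]        (S13)
    EW: wind influence, W / max(W)     (S15)
    E : shake coefficient, ER + EW     (S16)
    S : S_max * E, capped at S_max     (S17)
    """
    er = rng(0.0, r)
    ew = wind_w / wind_w_max
    e = er + ew
    return min(s_max * e, s_max)

def camera_y(t, s):
    """Y-axis offset S * sin(T); the X and Z coordinates are unchanged."""
    return s * math.sin(t)

# With no randomness (R = 0) and half-strength wind, S = S_max * 0.5.
print(shake_amplitude(0.0, 0.5, 1.0, 2.0))  # 1.0
```

A new amplitude would be drawn once per 2π period, reproducing the cycle-to-cycle variation of FIG. 4.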
In the character data creation process of step S6, character data C1 to Cn for the character are created based on the camera position data P1 to Pn created in the camera position data creation process of step S1, the reference character data CB1 to CBn created in the reference character data creation process of step S2, and the camera gazing point data L1 to Ln created in the camera gazing point data creation process of step S5. The character data C1 to Cn are data for generating the actions performed by the player character as a moving image, and are data that depend on the shaking of the virtual camera 80 to which the camera shake data R and the wind power data W have been applied. Specifically, based on these data, the camera gazing point data L1 to Ln of the virtual camera 80 are set either to the helmet position (the first camera gazing point 41 shown in FIG. 5 and FIG. 6), raised by a predetermined amount from the nose position (the second camera gazing point 42 shown in FIG. 5 and FIG. 7), or to the chin position (the third camera gazing point 43 shown in FIG. 5 and FIG. 8), lowered by a predetermined amount from the nose position, and data are created for generating the batting action screen shown when the batter character 70 enters the batter's box, as photographed by the virtual camera 80 to which the camera shake data R and the wind power data W have been applied. First, for the second camera gazing point 42 of the virtual camera 80 in the unshaken state shown in FIG. 5 and FIG. 7, character data C2 for the batter character 70, identical to the reference character data CB1 for the batter character 70, is created based on the reference character data CB1 corresponding to the second camera gazing point 42. Next, for the first camera gazing point 41 of the virtual camera 80 shaken upward as shown in FIG. 5 and FIG. 6, character data C1 for the batter character 70 is created based on the reference character data CB1 corresponding to the first camera gazing point 41; the character performs the same action as in the reference character data CB1, but the virtual camera 80 faces upward. Likewise, for the third camera gazing point 43 of the virtual camera 80 shaken downward as shown in FIG. 5 and FIG. 8, character data C3 for the batter character 70 is created based on the reference character data CB1 corresponding to the third camera gazing point 43; the character performs the same action as in the reference character data CB1, but the virtual camera 80 faces downward. The character data C1 to Cn created in the character data creation process of step S6 are stored in the RAM 12. When the character data C1 to Cn have been created in the character data creation process of step S6, the flow proceeds to the character action display process (S7).
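The shifting of the gazing point among the helmet, nose, and chin positions can be sketched as below. The concrete offset amounts are placeholders, since the text says only that the point moves up or down by a predetermined amount:

```python
def shifted_gazing_point(nose_y, shake_y, helmet_offset=0.2, chin_offset=0.15):
    """Move the camera gazing point with the shake direction.

    With no shake the camera fixes on the nose (second gazing point 42);
    an upward shake raises the point to the helmet (first gazing point 41)
    and a downward shake lowers it to the chin (third gazing point 43).
    The offset amounts are assumed placeholder values.
    """
    if shake_y > 0:
        return nose_y + helmet_offset   # first camera gazing point 41
    if shake_y < 0:
        return nose_y - chin_offset     # third camera gazing point 43
    return nose_y                       # second camera gazing point 42

print(round(shifted_gazing_point(1.6, +1.0), 2))  # 1.8  (helmet)
print(round(shifted_gazing_point(1.6, 0.0), 2))   # 1.6  (nose)
print(round(shifted_gazing_point(1.6, -1.0), 2))  # 1.45 (chin)
```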
In the character action display process of step S7, the character data C1 to Cn created in the character data creation process of step S6 are read, and the action screen of the character photographed by the virtual camera 80 is displayed. First, for the second camera gazing point 42 of the virtual camera 80 in the unshaken state shown in FIG. 5 and FIG. 7, the character data C2 created in the character data creation process of step S6 is read, and the batting action display screen 40 (see FIG. 7), which is the batting action screen shown when the batter character 70 photographed by the virtual camera 80 enters the batter's box, is displayed on the television monitor 20. Next, for the first camera gazing point 41 of the virtual camera 80 shaken upward as shown in FIG. 5 and FIG. 6, the character data C1 created in the character data creation process of step S6 is read, and the batting action display screen 40 (see FIG. 6), showing the batter character 70 photographed by the upward-shaken virtual camera 80 entering the batter's box, is displayed on the television monitor 20. Likewise, for the third camera gazing point 43 of the virtual camera 80 shaken downward as shown in FIG. 5 and FIG. 8, the character data C3 created in the character data creation process of step S6 is read, and the batting action display screen 40 (see FIG. 8), showing the batter character 70 photographed by the downward-shaken virtual camera 80 entering the batter's box, is displayed on the television monitor 20.
[0082] Here, when the game condition judgment process of step S4 judges that the virtual camera 80 has shaken, the camera gazing point data creation process of step S5 and the character data creation process of step S6 create the camera gazing point data L1 to Ln and the character data C1 to Cn for the batter character 70 with the camera shake data R and the wind power data W applied, and the batting action display screen 40, which is the batting action screen shown when the batter character 70 photographed by the virtual camera 80 with the camera shake data R and the wind power data W applied enters the batter's box, is displayed on the television monitor 20. This game program can reproduce the shaking of a television camera as in a real-world baseball broadcast, and can therefore realize a baseball game with a level of realism close to a real-world baseball broadcast.
[0083] [Other Embodiments]
(a) In the above embodiment, a home video game apparatus was used as an example of a computer to which the game program can be applied. However, the game apparatus is not limited to the above embodiment, and the invention can likewise be applied to a game apparatus with a separately provided monitor, a game apparatus with an integrated monitor, and a personal computer or workstation that functions as a game apparatus by executing the game program.
[0084] (b) The present invention also includes a program for executing a game as described above, a method for executing a game as described above, and a computer-readable recording medium on which the program is recorded. As the recording medium, besides a cartridge, a computer-readable flexible disk, a semiconductor memory, a CD-ROM, a DVD, an MO, a ROM cassette, and the like can be cited.
[0085] (c) In the above embodiment, the batting action of the batter character 70 photographed by the virtual camera 80 is displayed on the batting action display screen 40; however, the virtual camera 80 may, for example, be placed behind the backstop to display the pitching action of a pitcher character. The present invention is not limited to baseball games and can be applied to any other game, such as a soccer game or a shooting game.
[0086] (d) In the above embodiment, the virtual camera 80 is shaken only in the vertical direction; however, it can be shaken in any direction, such as the horizontal direction, an oblique direction, or a combination thereof. Furthermore, the camera position data P1 to Pn of the virtual camera 80, the reference camera gazing point data LB1 to LBn, the camera gazing point data L1 to Ln, the camera shake data R, and the methods of calculating them are not limited to those of the above embodiment. Likewise, the reference character data CB1 to CBn, the wind power data W, and the methods of calculating them are not limited to those of the above embodiment.
Industrial Applicability
[0087] According to the present invention, in the game program, when the game condition judgment function judges that the virtual camera has shaken, the camera gazing point data creation function and the character data creation function create camera gazing point data and character data for the character with the shake data of the virtual camera applied, and a series of action screens showing the batter character photographed by the shake-applied virtual camera entering the batter's box are displayed on the monitor, so that a baseball game with a level of realism close to a real-world baseball broadcast can be realized.

Claims

[1] A game program for causing a computer capable of realizing a game in which an action screen of a character photographed by a virtual camera is displayed on a monitor to realize:
a reference character data creation function for creating reference character data for the character photographed by the virtual camera;
a reference camera gazing point data creation function for creating reference camera gazing point data for the character to be photographed by the virtual camera;
a game condition judgment function for judging whether or not a predetermined game condition for the game is satisfied;
a camera gazing point data creation function for creating camera gazing point data of the virtual camera based on the reference camera gazing point data when the game condition judgment function judges that the predetermined game condition is satisfied;
a character data creation function for creating character data for the character based on the reference character data and the camera gazing point data; and
a character action display function for reading the character data created by the character data creation function and displaying the action screen of the character photographed by the virtual camera.
[2] The game program according to claim 1, further causing the computer to realize a camera position data creation function for creating camera position data of the virtual camera,
wherein the character data creation function creates the character data for the character further based on the camera position data of the virtual camera.
[3] The game program according to claim 1 or 2, wherein the reference camera gazing point data is data capable of generating a waveform having a different amplitude in each predetermined period, and
the camera gazing point data creation function creates the camera gazing point data by changing the amplitude of the reference camera gazing point data.
[4] The game program according to any one of claims 1 to 3, wherein the predetermined game condition for the game is a condition relating to the shaking of the virtual camera, and
the camera gazing point data creation function creates the camera gazing point data of the virtual camera further based on camera shake data relating to the shaking of the virtual camera.
[5] The game program according to claim 4, wherein the camera gazing point data is numerical data obtained by multiplying the reference camera gazing point data by a predetermined camera shake coefficient obtained from the camera shake data.
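Claim 5 defines the camera gazing-point data as the reference gazing-point data multiplied by a shake coefficient obtained from the camera-shake data. A minimal sketch of that multiplication, assuming a simple clamped linear mapping from shake strength to coefficient (the mapping and all names are hypothetical, not specified in the application):

```python
def shake_coefficient(shake_strength, max_strength=10.0):
    """Hypothetical mapping from camera-shake data to a coefficient
    in [0, 1]; claim 5 only requires that some coefficient be derived
    from the shake data."""
    return max(0.0, min(shake_strength / max_strength, 1.0))

def camera_gazing_point_data(reference_values, shake_strength):
    """Per claim 5: multiply each reference gazing-point value by the
    camera shake coefficient to obtain the gazing-point data."""
    k = shake_coefficient(shake_strength)
    return [k * v for v in reference_values]
```

With a shake strength of 5.0 the coefficient is 0.5, so each reference value is halved; at or above `max_strength` the reference data passes through unscaled.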
[6] A game device that realizes a game displaying, on a monitor, a motion screen of a character photographed by a virtual camera, the game device comprising:
reference character data creation means for creating reference character data relating to the character photographed by the virtual camera;
reference camera gazing point data creation means for creating reference camera gazing point data for the character to be photographed by the virtual camera;
game condition determination means for determining whether or not a predetermined game condition relating to the game is satisfied;
camera gazing point data creation means for creating camera gazing point data of the virtual camera based on the reference camera gazing point data when the game condition determination means determines that the predetermined game condition is satisfied;
character data creation means for creating character data relating to the character based on the reference character data and the camera gazing point data; and
character motion display means for reading the character data created by the character data creation means and displaying a motion screen of the character photographed by the virtual camera.
[7] A game method for causing a computer to realize a game displaying, on a monitor, a motion screen of a character photographed by a virtual camera, the game method comprising:
a reference character data creation step of creating reference character data relating to the character photographed by the virtual camera;
a reference camera gazing point data creation step of creating reference camera gazing point data for the character to be photographed by the virtual camera;
a game condition determination step of determining whether or not a predetermined game condition relating to the game is satisfied;
a camera gazing point data creation step of creating camera gazing point data of the virtual camera based on the reference camera gazing point data when it is determined in the game condition determination step that the predetermined game condition is satisfied;
a character data creation step of creating character data relating to the character based on the reference character data and the camera gazing point data; and
a character motion display step of reading the character data created in the character data creation step and displaying a motion screen of the character photographed by the virtual camera.
PCT/JP2006/316224 2005-12-14 2006-08-18 Game program, game machine, and game method WO2007069373A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2005-360168 2005-12-14
JP2005360168A JP2007159817A (en) 2005-12-14 2005-12-14 Game program, game device and game method

Publications (1)

Publication Number Publication Date
WO2007069373A1 true WO2007069373A1 (en) 2007-06-21

Family

ID=38162689

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2006/316224 WO2007069373A1 (en) 2005-12-14 2006-08-18 Game program, game machine, and game method

Country Status (3)

Country Link
JP (1) JP2007159817A (en)
TW (1) TW200722154A (en)
WO (1) WO2007069373A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4881981B2 (en) * 2009-08-14 2012-02-22 株式会社コナミデジタルエンタテインメント Virtual space display device, viewpoint setting method, and program
JP6499728B2 (en) * 2017-07-18 2019-04-10 株式会社カプコン Game program and game system
CN110193194B (en) * 2019-07-01 2023-02-28 网易(杭州)网络有限公司 Method and device for simulating camera shake in game and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11175766A (en) * 1997-12-05 1999-07-02 Namco Ltd Image generating device and information storage medium
JP2000306115A (en) * 1999-04-26 2000-11-02 Namco Ltd Image forming device and information storage medium
JP2000331184A (en) * 1999-05-20 2000-11-30 Namco Ltd Image forming device and information storing medium
JP2002035409A (en) * 2000-07-28 2002-02-05 Namco Ltd Game system and information recording medium

Also Published As

Publication number Publication date
JP2007159817A (en) 2007-06-28
TW200722154A (en) 2007-06-16

Similar Documents

Publication Publication Date Title
JP4029102B2 (en) Video game program, video game apparatus, and video game control method
JP4589971B2 (en) GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD
JP4536098B2 (en) GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD
JP4463289B2 (en) GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD
US20120108303A1 (en) Game device, game control method and recording medium
JP4110186B2 (en) GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD
JP4521020B2 (en) GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD
JP3892889B1 (en) GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD
JP2009273865A (en) Game program, game machine, and game control method
JP5237312B2 (en) GAME PROGRAM, GAME DEVICE, GAME CONTROL METHOD
WO2007069373A1 (en) Game program, game machine, and game method
US20130137513A1 (en) Game machine, game system, game machine control method, and information storage medium
JP4122038B2 (en) GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD
JP4572245B2 (en) Image processing program, image processing apparatus, and image processing control method
JP2008237387A (en) Game program, game device and game control method
JP4543054B2 (en) GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD
JP2007167117A (en) Game program, game device, and method for controlling game
JP2007222398A (en) Game program, game device and game control method
JP5072937B2 (en) GAME PROGRAM, GAME DEVICE, GAME CONTROL METHOD
JP3965198B1 (en) GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD
JP4430686B2 (en) GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD
JP4659071B2 (en) Image processing program, image processing apparatus, and image control method
JP4705145B2 (en) Drawing processing program, drawing processing apparatus, and drawing processing method
JP4772079B2 (en) GAME PROGRAM, GAME DEVICE, AND GAME CONTROL METHOD
JP4932815B2 (en) Drawing processing program, drawing processing apparatus, and drawing processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 06796535

Country of ref document: EP

Kind code of ref document: A1