WO1995035555A1 - Procede et dispositif de traitement d'image - Google Patents
Procede et dispositif de traitement d'image (Method and apparatus for image processing)
- Publication number
- WO1995035555A1 WO1995035555A1 PCT/JP1995/001219 JP9501219W WO9535555A1 WO 1995035555 A1 WO1995035555 A1 WO 1995035555A1 JP 9501219 W JP9501219 W JP 9501219W WO 9535555 A1 WO9535555 A1 WO 9535555A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- matrix
- information
- camera
- data
- image processing
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/45—Controlling the progress of the video game
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/55—Controlling game characters or game objects based on the game progress
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/803—Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/24—Constructional details thereof, e.g. game controllers with detachable joystick handles
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1043—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being characterized by constructional details
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/20—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
- A63F2300/203—Image generating hardware
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/6045—Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
- A63F2300/6661—Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8017—Driving on land or water; Flying
Definitions
- The present invention relates to an image processing method and apparatus applicable to a game machine that displays images on a display device and advances the game in real time, and in particular to an image processing method and apparatus that effectively utilize the hardware resources of the game machine.
- A computer game machine that advances a game in real time comprises a game machine main body that executes game software stored in advance, an operation unit that provides operation signals for processing such as moving characters in the game, a display device that displays images of the game development executed on the game machine main body, and an audio device that generates the required sound as the game develops.
- Game machines capable of providing more realistic images on a clear screen are gaining in popularity.
- In particular, game machines that use polygon processing, which represents three-dimensional image data as a set of fixed units (polygons), are popular because the visual realism they provide is high.
- In such a polygon-processing game machine, realism can be further enhanced by increasing the number of polygons that constitute one object (a moving object on the screen) or by applying textures to the polygon surfaces.
- Seeking realism, however, demands that the hardware not only shorten processing time but also handle image data whose volume increases dramatically, which significantly increases the burden on the hardware. This demand is expected to grow further in the future, which will only increase the computational load.
- Moreover, game machines are required to perform real-time image processing based on player input, so the basic load on the central processing unit is inherently higher than that of ordinary computer graphics image processing.
- The present invention has been made in view of the above-mentioned problems of the related art, and an object of the invention is to provide an image processing method and apparatus capable of reducing the load of image processing while displaying a more realistic image.
- Another object of the present invention is to provide an image processing method and apparatus capable of reducing the load of image processing for an object to be displayed while displaying a more realistic image. Disclosure of the invention
- According to the present invention, an image processing method performs coordinate transformation of an object using the matrix of virtual camera information and a transformation matrix, and obtains the object display data by setting the rotation component of the resulting object matrix so that it forms the unit matrix.
- The matrix of camera information includes, for example, the position information of the camera and the rotation information of the camera.
- A virtual camera is the viewpoint and angle of view used when drawing computer graphics, likened to a camera.
- The settings of this virtual camera include position, optical axis direction (lens direction), angle of view (zoom to wide), and twist (rotation angle about the optical axis).
- A virtual camera refers to a viewpoint set virtually.
- The virtual camera can be understood as virtual view direction determining means for determining the view direction of an image displayed on the television monitor.
- The arrangement of a shape (object) in three-dimensional space is defined from its body coordinate system, which is a coordinate system unique to the object.
- The object, once model-transformed into the world coordinate system, is converted into the field-of-view coordinate system determined by this virtual camera (its position, angle, and so on) and is then displayed on the TV monitor 30. In the following description, M is the matrix holding the information of the virtual camera, X is the matrix holding the information of the object's movement in three-dimensional space, and T is the transformation matrix.
- An image processing apparatus according to the present invention is provided with processing means that performs coordinate transformation of an object based on the matrix of virtual camera information and sets the rotation component of the coordinate-transformed object matrix to the components forming the unit matrix.
- The processing means includes, for example, camera control matrix processing means for obtaining the matrix of virtual camera information, and object matrix processing means that multiplies that matrix by the transformation matrix to obtain a matrix containing the information of the moved point and converts the rotation component of that matrix into the unit matrix.
- Another image processing apparatus according to the present invention comprises a storage unit for storing the position coordinate information and angle information of the camera and the position coordinate information of the object, and object matrix processing means that calculates, from the camera's position coordinate information and angle information and the object's position coordinate information obtained from the storage unit, each angle about the axes of the three-dimensional coordinate system that directs the object in a desired direction.
- With these configurations, display data of the object always facing the gaze direction of the virtual camera is created.
- Regardless of the direction in which the virtual camera is pointed, the relationship that the object always faces the camera is maintained; only one piece of frontal data (two-dimensional data) is therefore required, and since there is no need to handle a large amount of data, the calculation processing becomes lighter.
- If the object is configured as a flat, board-like object and is always set at a predetermined position, a flat object such as a signboard can always be directed toward the line of sight.
- The matrix of camera information, such as camera position information and camera rotation information, can be selected as needed based on the operation information of the game machine. BRIEF DESCRIPTION OF THE FIGURES
- FIG. 1 is a perspective view schematically illustrating a game machine according to one embodiment to which the present invention is applied.
- FIG. 2 is a diagram explaining an object handled by the game machine.
- FIG. 3 is an electrical block diagram showing the outline of the game machine.
- FIG. 4 is a functional block diagram of the central processing unit of the game machine and its peripherals.
- FIG. 5 is a flowchart showing an outline of the processing of the central processing unit.
- FIGS. 6A to 6C are explanatory diagrams showing an example of the image processing.
- FIGS. 7A to 7C are explanatory diagrams showing another example of the image processing.
- FIG. 8 is a flowchart showing an outline of processing of a central processing unit of a game machine according to another embodiment to which the present invention is applied.
- FIG. 1 shows a game machine to which an image processing method and an image processing apparatus according to the present invention are applied.
- This game machine handles, for example, a tank game, and as the object according to the present invention, the pattern of the explosion that occurs when a shell fired from a tank hits a target (the explosion pattern) will be described as an example.
- the game machine shown in FIG. 1 includes a casing 1 forming a cockpit.
- The housing 1 includes a base 1A and a front wall 1B provided at one end of the base 1A.
- A cockpit 2 is provided on the base 1A, and the player sits in the cockpit 2 to operate the game machine.
- The game machine main body 10 is housed inside, and a handle 20A, an accelerator 20B, and a view change switch 20C are arranged below it, facing the cockpit 2.
- These constitute the operation unit 20, and a TV monitor 30 and a speaker 40 are provided at the upper front part.
- This game machine handles a tank game, and the handle 20A is the operating device that instructs the game machine to change direction.
- This tank game uses a tank as the running display object (vehicle).
- The tank 31 can be schematically represented as shown in FIG. 2, and has a hull 32 and a gun sight 33.
- The electrical block diagram of the game machine is shown in FIG. 3.
- The game machine main body 10 of this game machine comprises a central processing unit (CPU) 101, an auxiliary arithmetic processing unit 102, a polygon parameter memory 110, a coordinate transformation device 111 also called a geometrizer, a polygon data memory 112, a polygon paint device 113 also called a rendering device, and a frame memory 114.
- The central processing unit (CPU) 101 is connected to the auxiliary arithmetic processing unit 102, the program data ROM 103, the data RAM 104, the backup RAM 105, the input interface 106, the sound device 108, and the polygon parameter memory 110.
- An operation unit 20 and a dip switch 107 are connected to the input interface 106.
- The CPU 101 cooperates with the auxiliary processing unit 102 to read out the game program data stored in advance in the program data ROM 103 and execute it.
- This execution includes controlling the position, direction, and angle of the tank as an object displayed on the TV monitor 30, and controlling the position and angle of the virtual camera that determines the field of view of the display screen.
- FIG. 5 shows an overview of these controls.
- The sound device 108 is connected to the speaker 40 via the power amplifier 109.
- The acoustic signal formed by the sound device 108 is power-amplified by the power amplifier 109 and sent to the speaker 40.
- The read-out end of the polygon parameter memory 110 is connected to the coordinate transformation device 111, and the polygon parameters in this memory 110 are supplied to the coordinate transformation device 111.
- The coordinate transformation device 111 is also connected to the polygon data memory 112 and receives polygon data from that memory. In the coordinate transformation device 111, the coordinate values of the polygons to be displayed are converted from three-dimensional coordinates to two-dimensional perspective coordinates on the basis of the given polygon parameters and polygon data.
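- The three-dimensional to two-dimensional perspective conversion mentioned here is carried out by the dedicated coordinate transformation hardware; the snippet below is only a generic numpy sketch of such a perspective projection, with an assumed focal length `f` and screen size rather than values from the patent.

```python
import numpy as np

def perspective_project(points_xyz, f=1.0, screen_w=320, screen_h=240):
    """Project 3-D view-space points to 2-D screen coordinates.

    points_xyz: (N, 3) array already expressed in the camera's
    field-of-view coordinate system, with z > 0 in front of the camera.
    The focal length and screen size are illustrative defaults.
    """
    x, y, z = points_xyz[:, 0], points_xyz[:, 1], points_xyz[:, 2]
    # Pinhole projection: divide by depth and scale by the focal length.
    sx, sy = f * x / z, f * y / z
    # Map to pixel coordinates with the origin at the screen centre.
    px = screen_w / 2.0 + sx * screen_w / 2.0
    py = screen_h / 2.0 - sy * screen_h / 2.0
    return np.stack([px, py], axis=1)
```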
- The output end of the coordinate transformation device 111 is connected to the polygon paint device 113, and the polygon data converted to perspective coordinates is sent to the polygon paint device 113.
- The polygon paint device 113 applies the texture stored in the frame memory 114 to the received polygons to form the image data.
- The TV monitor 30 is connected to the output terminal of the polygon paint device 113 and displays the image formed by the paint device 113.
- In response to the player's operation, the accelerator 20B of the operation unit 20 outputs an electric signal of accelerator opening A, which is reflected in the running speed V of the object on the TV monitor 30. Similarly, the handle 20A outputs an electric signal of direction θ, which is reflected in the behavior of the object.
- The view change switch 20C is a switch by which the player specifies the position of the virtual camera that determines the field of view of the image displayed on the TV monitor 30.
- The central processing unit 101, cooperating with the auxiliary processing unit 102 and executing the image processing program according to the present invention stored in the program data ROM 103, functionally realizes the camera control matrix processing means 121, the object matrix processing means 122, and the object pattern display processing means 123 shown in FIG. 4.
- The program data ROM 103 also stores explosion model data MD composed of polygon vertices that determine the shape of the explosion pattern (object).
- The data RAM 104 holds the explosion pattern display matrix X, the camera matrix M, and the explosion coordinate data CD.
- The explosion coordinate data CD is the position data at which the explosion occurs, for example the center position coordinates of the explosion model data MD.
- In response to operation input from the operation unit 20, the central processing unit 101 controls the position and rotation of the virtual camera that determines the line of sight, and stores the resulting camera matrix M in the data RAM 104. It then takes in the camera matrix M and the explosion coordinate data CD, performs the coordinate transformation of the explosion pattern (object), and creates the explosion pattern display matrix X by executing the operation that sets the rotation component of that matrix to the unit matrix. The matrix X is temporarily stored in the data RAM 104.
- The central processing unit 101 then executes the display processing of the explosion pattern on the basis of the explosion model data MD stored in the program data ROM 103 and the explosion pattern display matrix X stored in the data RAM 104.
- The data thus obtained is supplied to the polygon parameter memory 110 and finally to the display device 30.
- The central processing unit 101 also calculates the center position of the explosion pattern as necessary, in accordance with the progress of the game, operation information, and the like, and updates the value of the explosion coordinate data CD in the data RAM 104 with each calculation.
- The operation of this embodiment will now be described with reference to FIGS. 5 to 7, focusing on the display processing for an explosion pattern as the object.
- The explosion pattern uses only the image data of its front face.
- The central processing unit 101 first executes predetermined initialization (step 201).
- Next, the central processing unit 101 reads operation information such as the handle angle and the accelerator opening operated by the player from the operation unit 20 (step 202), and, on the basis of the read information, executes the control process of the matrix of virtual camera information (camera matrix) M (step 203).
- In this control, the basic matrix of the virtual camera position is denoted E, the translation matrix T, and the rotation matrix R; the camera matrix M is calculated by combining these according to the player's operation.
- The camera matrix M calculated in this manner is stored in the data RAM 104 (step 204).
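- The patent names only a basic matrix E, a translation matrix T, and a rotation matrix R for building the camera matrix M, without reproducing the exact composition in this excerpt. The following numpy sketch shows one conventional way such a matrix can be assembled in the same row-vector convention used here (translation components in the fourth row); the single-axis `yaw` rotation and the composition order are illustrative assumptions.

```python
import numpy as np

def rotation_y(theta):
    """4x4 rotation about the y axis, row-vector convention
    (translation lives in the fourth row, as in the patent's matrices)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c, 0.0,  -s, 0.0],
                     [0.0, 1.0, 0.0, 0.0],
                     [  s, 0.0,   c, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

def translation(tx, ty, tz):
    """4x4 translation matrix with the offsets in the fourth row."""
    m = np.eye(4)
    m[3, :3] = [tx, ty, tz]
    return m

def camera_matrix(cam_pos, yaw):
    """Illustrative camera matrix M (world -> view), built as the inverse
    of the camera's own placement: undo the position, then undo the yaw."""
    cx, cy, cz = cam_pos
    return translation(-cx, -cy, -cz) @ rotation_y(-yaw)
```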
- Next, the central processing unit 101 reads out the explosion coordinate data CD and the camera matrix M from the data RAM 104 (step 205). It then executes the coordinate transformation of the explosion pattern as the explosion pattern processing (step 206). That is, on the basis of the current camera matrix M and the explosion coordinate data CD, the explosion pattern display matrix X for the movement in three-dimensional space is obtained, where T is the movement (translation) transformation matrix.
- The explosion pattern display matrix X is expressed by [Equation 5] as the product of the translation matrix T and the camera matrix M, where T has [1 0 0 0] as its first row, [0 1 0 0] as its second row, [0 0 1 0] as its third row, and the movement components in its fourth row.
- Next, the central processing unit 101 performs an operation that sets the rotation component of the matrix X to the components forming the unit matrix (FIG. 5, step 207). Each component ai of this matrix carries rotation information; if that rotation information is replaced by the values forming the unit matrix, as shown in [Equation 7] below, the rotation information is lost and the explosion pattern display matrix X inevitably indicates a non-rotating state. In other words, because the object (here, the explosion pattern) no longer rotates relative to the camera, performing the above operation allows the object to be directed toward the line of sight at the desired position at all times.
- In this way, an explosion pattern display matrix X can be formed for the object that always points in the desired direction along the camera's line of sight; no matter which direction the camera is pointed, the object always faces the camera.
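- As a minimal sketch of steps 205 to 207 described above: the translation matrix T for the explosion position is multiplied by the camera matrix M, and the 3×3 rotation block of the result is then overwritten with the unit matrix. The layout follows the patent's row-vector convention (translation in the fourth row); the function and variable names are illustrative, and `camera_matrix` refers to the earlier sketch.

```python
import numpy as np

def billboard_matrix(camera_matrix_m, explosion_pos):
    """Explosion pattern display matrix X (steps 205-207).

    camera_matrix_m: 4x4 camera matrix M, row-vector convention.
    explosion_pos:   explosion coordinate data CD, e.g. (x, y, z).
    """
    # Translation matrix T that moves the pattern to the explosion position.
    t = np.eye(4)
    t[3, :3] = explosion_pos

    # Coordinate-transform the pattern with the camera: X = T * M.
    x = t @ camera_matrix_m

    # Step 207: overwrite the rotation component with the unit matrix,
    # so the pattern never rotates away from the camera (cf. [Equation 7]).
    x[:3, :3] = np.eye(3)
    return x
```

- Because only the fourth row of X now differs from the unit matrix, a single frontal polygon is always drawn facing the viewpoint, which is the data-saving effect described above.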
- The explosion pattern display matrix X is temporarily stored in the data RAM 104 (step 208).
- The explosion pattern display matrix X is read from the data RAM 104, and the explosion model data MD is read from the ROM 103 (step 209).
- Then, the explosion pattern display processing is performed (step 210). In this display processing, the display data is created by multiplying the polygon vertex data read from the explosion model data MD by the explosion pattern display matrix X.
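- Continuing the sketch, step 210 amounts to multiplying each polygon vertex of the explosion model data MD, as a homogeneous row vector, by X. The vertex values below are placeholder examples, not data from the patent, and the helpers come from the sketches above.

```python
import numpy as np  # billboard_matrix and camera_matrix come from the sketches above

# Four vertices of a square frontal explosion polygon in model space
# (homogeneous row vectors, w = 1); the size 0.5 is an arbitrary example.
md_vertices = np.array([[-0.5, -0.5, 0.0, 1.0],
                        [ 0.5, -0.5, 0.0, 1.0],
                        [ 0.5,  0.5, 0.0, 1.0],
                        [-0.5,  0.5, 0.0, 1.0]])

x = billboard_matrix(camera_matrix((0.0, 1.0, -5.0), 0.3),
                     explosion_pos=(2.0, 0.0, 10.0))

# Step 210: display data = polygon vertex data multiplied by the display matrix X.
display_vertices = md_vertices @ x
```

- The resulting view-space vertices would then be perspective-projected (as in the earlier projection sketch) and painted by the rendering device.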
- Thereafter, the central processing unit 101 creates the polygon parameters and outputs the polygon data to the polygon parameter memory 110 (steps 211 and 212), after which the process returns to step 202.
- In this way, the matrix M of the desired camera information is obtained, the matrix X obtained by multiplying it by the transformation matrix is calculated, and display data in which the explosion pattern is always directed toward the line of sight is obtained.
- The data of the object, including the explosion pattern obtained in this way, is transferred to the coordinate transformation device via the polygon parameter memory 110.
- The coordinate transformation device (geometrizer) 111 creates display data in accordance with the required display coordinates and gives it to the polygon paint device (rendering device) 113.
- In the polygon paint device 113, the texture data obtained from the frame memory 114 is applied to the display data, converting the polygon data into display data in which decoration is applied to the polygon surfaces. This display data is given to the display device 30 and displayed.
- Steps 202 to 204 equivalently form the camera control matrix processing means 121 of FIG. 4, steps 205 and 206 form the object matrix processing means 122, and steps 207 to 210 form the object pattern display processing means 123.
- For comparison, another matrix operation that could be used in place of the operation on the matrix X in step 206 will now be described. In this embodiment this method is not employed, in order to reduce the number of operations.
- In this comparison method, the matrix X is obtained by multiplying the camera matrix M by an explicit rotation transformation matrix for each axis.
- The rotation transformation matrix Rx about the x axis, given by [Equation 8], has [1 0 0 0] in its first row, [0 cosθ sinθ 0] in its second row, [0 -sinθ cosθ 0] in its third row, and [0 0 0 1] in its fourth row. Calculating X = Rx · M gives a matrix whose rows have the form [a b c 0], [A B C 0], and so on; the upper-case elements of the result X are the parts affected by the rotation matrix Rx, so this coordinate transformation yields the matrix X of a virtual camera rotated about the x axis.
- Similarly, the rotation matrix Ry about the y axis has [cosθ 0 -sinθ 0] in its first row, [0 1 0 0] in its second row, [sinθ 0 cosθ 0] in its third row, and [0 0 0 1] in its fourth row; calculating X = Ry · M gives rows of the form [G H I 0], [d e f 0], and so on, where again the upper-case elements are those affected by Ry.
- The rotation matrix Rz about the z axis has [cosθ sinθ 0 0] in its first row, [-sinθ cosθ 0 0] in its second row, [0 0 1 0] in its third row, and [0 0 0 1] in its fourth row. Calculating X = Rz · M gives a matrix X with [M N O 0] in its first row, [P Q R 0] in its second row, [g h i 0] in its third row, and [j k l 1] in its fourth row; the upper-case elements are the parts affected by Rz, so the coordinate transformation yields the matrix X of a virtual camera rotated about the z axis.
- In either case, a matrix X having the required rotation components can be obtained.
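- For contrast, a rough sketch of this comparison approach, in which X is formed by multiplying explicit per-axis rotation matrices into M (the angles are illustrative, and `camera_matrix` comes from the earlier sketch); each object handled this way needs its own trigonometric evaluations and extra matrix products, which is the overhead the embodiment avoids.

```python
import numpy as np  # reuses camera_matrix from the earlier camera-matrix sketch

def rotation_x(theta):
    """Rotation about the x axis (row-vector convention), cf. [Equation 8]."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[1.0, 0.0, 0.0, 0.0],
                     [0.0,   c,   s, 0.0],
                     [0.0,  -s,   c, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

def rotation_z(theta):
    """Rotation about the z axis (row-vector convention)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c,   s, 0.0, 0.0],
                     [ -s,   c, 0.0, 0.0],
                     [0.0, 0.0, 1.0, 0.0],
                     [0.0, 0.0, 0.0, 1.0]])

# Obtaining X by rotating the camera matrix about individual axes,
# e.g. X = Rx(theta_x) . Rz(theta_z) . M, instead of zeroing the rotation block.
m = camera_matrix((0.0, 1.0, -5.0), 0.3)
x_alt = rotation_x(0.2) @ rotation_z(0.1) @ m
```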
- As shown in FIG. 6A, for example, an object 510 is arranged in the upper center of an area 500, and the viewpoint 520 of the virtual camera is directed from the lower left of the drawing toward the upper right.
- The camera matrix M for this viewpoint has its rotation components in the first three rows and the camera position in the fourth row, for example [G H I 0] in the third row and [a b c 1] in the fourth row, as shown in FIG. 7B. Therefore, when the object 510 is observed from the camera viewpoint 520 and X = R · T · M is calculated, a trapezoidal object 510' with a short left side and a long right side is obtained on the screen, as shown in FIG. 7C. In this way, the object 510 can be drawn on the basis of the information on the position and rotation of the camera, and a predetermined image is obtained.
- In FIGS. 6A and 7A, the object 510 is a rectangular polygon.
- FIG. 7A shows the state in which the camera has moved and rotated; the transformed vertices are P1', P2', P3', and P4'.
- In the present embodiment, the rotation information represented by A, B, C, D, E, F, G, H, and I in the camera matrix M shown in FIG. 7B is replaced by the unit matrix.
- As a result of this operation based on the unit matrix, the explosion pattern always faces the virtual camera squarely (at 90 degrees). For this reason, the entire explosion pattern can be expressed using only its frontal data. In other words, since movement according to the viewpoint can be performed using only the data of a predetermined area of the explosion pattern, the explosion can be represented three-dimensionally while handling only that data.
- In the above description a tank game is assumed as the game that handles the object of the image processing of the present invention, but the game may simply be one in which the player shoots at a target that moves automatically.
- Although the explosion pattern has been described as an example of an object, the object may be arbitrary, such as an enemy tank in the tank game, a specific background object, or something like a standing signboard; by always directing such an object toward the line of sight, the above-mentioned effect of reducing the computational load can be obtained.
- Another embodiment has the same configuration as the tank game embodiment described above, with the central processing unit 101 performing the processing shown in FIG. 8.
- In this embodiment, the position coordinates and angle information of the camera are obtained in the camera control matrix processing, and in the object matrix processing the rotation angle of the object about each axis of the three-dimensional coordinate system is calculated from the position of the object and the position and rotation angle of the camera.
- Specifically, in the camera control matrix processing, the central processing unit 101 obtains the position data (Cx, Cy, Cz) indicating the position of the camera relative to the origin of the three-dimensional coordinate system, together with the angle information (Ax, Ay, Az) of the camera (steps 301 to 303).
- Ax indicates how much the camera is rotated about the x axis of the three-dimensional coordinates, and Ay and Az likewise indicate the rotation about the y axis and the z axis.
- Next, the explosion coordinate data CD is taken from the data RAM 104 and the position coordinates of the object are obtained.
- In this way, the explosion pattern and other flat objects are displayed according to display data that is accurately oriented in the desired direction (the direction of the line of sight).
- A signboard as a flat object, for example, can always be directed toward the line of sight.
- Thus, from the position data of the object and the position data and angle data of the camera, the object can always be oriented accurately in the desired direction (the direction of the line of sight).
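- The excerpt does not reproduce the angle formulas used in this second embodiment, but a standard way to derive angles that turn a flat object toward a camera position is from the camera-to-object offset with atan2. The sketch below shows only that generic approach, using the camera position (Cx, Cy, Cz) idea from the text; it is not the patent's exact computation.

```python
import numpy as np

def face_camera_angles(obj_pos, cam_pos):
    """Yaw (about y) and pitch (about x) that turn a flat object's front
    face back along the line toward the camera position.
    Generic billboard-by-angles sketch, not the patent's exact formula."""
    dx, dy, dz = np.asarray(cam_pos, dtype=float) - np.asarray(obj_pos, dtype=float)
    yaw = np.arctan2(dx, dz)                  # rotation angle about the y axis
    pitch = np.arctan2(dy, np.hypot(dx, dz))  # rotation angle about the x axis
    return yaw, pitch

# Example: camera at (Cx, Cy, Cz), object at the explosion coordinates CD.
yaw, pitch = face_camera_angles(obj_pos=(2.0, 0.0, 10.0),
                                cam_pos=(0.0, 1.0, -5.0))
```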
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Processing Or Creating Images (AREA)
- Image Processing (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE69530824T DE69530824D1 (de) | 1994-06-20 | 1995-06-20 | Verfahren und gerät zur bildverarbeitung |
US08/596,324 US5971852A (en) | 1994-06-20 | 1995-06-20 | Image processing method and apparatus |
EP95921991A EP0715280B1 (en) | 1994-06-20 | 1995-06-20 | Method and apparatus for processing image |
KR1019960700837A KR100274444B1 (ko) | 1994-06-20 | 1995-06-20 | 화상 처리 방법 및 장치 |
JP8501948A JP2883737B2 (ja) | 1994-06-20 | 1995-06-20 | 画像処理方法及びその装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP16050794 | 1994-06-20 | ||
JP6/160507 | 1994-06-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1995035555A1 true WO1995035555A1 (fr) | 1995-12-28 |
Family
ID=15716448
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP1995/001219 WO1995035555A1 (fr) | 1994-06-20 | 1995-06-20 | Procede et dispositif de traitement d'image |
Country Status (6)
Country | Link |
---|---|
US (1) | US5971852A (ja) |
EP (1) | EP0715280B1 (ja) |
KR (1) | KR100274444B1 (ja) |
CN (2) | CN1087854C (ja) |
DE (1) | DE69530824D1 (ja) |
WO (1) | WO1995035555A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1186031A (ja) * | 1997-09-11 | 1999-03-30 | Sega Enterp Ltd | 画像処理装置及び画像処理方法並びに媒体 |
US6664965B1 (en) | 1998-08-07 | 2003-12-16 | Kabushiki Kaisha Sega Enterprises | Image processing device and information recording medium |
US7101284B2 (en) | 2001-12-18 | 2006-09-05 | Sony Computer Entertainment Inc. | Object display system in a virtual world |
US7277571B2 (en) | 2002-05-21 | 2007-10-02 | Sega Corporation | Effective image processing, apparatus and method in virtual three-dimensional space |
Families Citing this family (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6582308B1 (en) * | 1997-03-23 | 2003-06-24 | Kabushiki Kaisha Sega Enterprises | Image processing device and character aspect design device |
JP3103322B2 (ja) * | 1997-05-23 | 2000-10-30 | コナミ株式会社 | シューティングゲーム装置、シューティングゲームの画像表示方法及び可読記録媒体 |
JP3342393B2 (ja) * | 1998-03-19 | 2002-11-05 | 株式会社コナミコンピュータエンタテインメントジャパン | ビデオゲーム装置、コンピュータ読み取り可能な記録媒体 |
TW401699B (en) * | 1998-04-01 | 2000-08-11 | Koninkl Philips Electronics Nv | A method and device for generating display frames from a sequence of source frames through synthesizing one or more intermediate frames exclusively from an immediately preceding source frame |
EP0999523B1 (en) | 1998-05-20 | 2006-12-06 | Kabushiki Kaisha Sega doing business as Sega Corporation | Image processor, game machine, image processing method, and recording medium |
US6350199B1 (en) * | 1999-03-16 | 2002-02-26 | International Game Technology | Interactive gaming machine and method with customized game screen presentation |
EP1079332A1 (en) * | 1999-08-25 | 2001-02-28 | M.M. Multimedia A/S | Method of rendering states of a system |
JP3321570B2 (ja) * | 1999-09-14 | 2002-09-03 | 株式会社ソニー・コンピュータエンタテインメント | 動画作成方法、記憶媒体およびプログラム実行装置 |
JP3822776B2 (ja) * | 2000-03-29 | 2006-09-20 | 株式会社バンダイナムコゲームス | ゲーム装置、及び情報記憶媒体 |
JP3686920B2 (ja) * | 2002-05-21 | 2005-08-24 | コナミ株式会社 | 3次元画像処理プログラム、3次元画像処理方法及びビデオゲーム装置 |
JP4563266B2 (ja) * | 2005-06-29 | 2010-10-13 | 株式会社コナミデジタルエンタテインメント | ネットワークゲームシステム、ゲーム装置、ゲーム装置の制御方法及びプログラム |
JP4732925B2 (ja) * | 2006-03-09 | 2011-07-27 | イマグノーシス株式会社 | 医用画像の表示方法およびそのプログラム |
US8319771B2 (en) * | 2008-09-30 | 2012-11-27 | Disney Enterprises, Inc. | Computer modelled environment |
TWI493500B (zh) * | 2009-06-18 | 2015-07-21 | Mstar Semiconductor Inc | 使二維影像呈現出三維效果之影像處理方法及相關影像處理裝置 |
CN102238313A (zh) * | 2010-04-22 | 2011-11-09 | 扬智科技股份有限公司 | 产生影像转换矩阵的方法、影像转换方法及其装置 |
JP5905421B2 (ja) * | 2013-09-03 | 2016-04-20 | 株式会社ア−キテック | 三次元作図システム及びそのプログラム |
US11638872B1 (en) * | 2021-10-18 | 2023-05-02 | Electronic Arts Inc. | Videographer mode in online games |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6165368A (ja) * | 1984-09-07 | 1986-04-03 | Hitachi Ltd | 設計支援方法 |
JPS6182278A (ja) * | 1984-09-29 | 1986-04-25 | Toshiba Corp | 3次元座標変換装置 |
JPS6266382A (ja) * | 1985-09-19 | 1987-03-25 | Fujitsu Ltd | 回転図形の歪発生防止方式 |
JPS6295666A (ja) * | 1985-10-21 | 1987-05-02 | Sony Corp | 画像発生装置 |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4027403A (en) * | 1975-03-12 | 1977-06-07 | The Singer Company | Real-time simulation of point system having multidirectional points as viewed by a moving observer |
US4811245A (en) * | 1985-12-19 | 1989-03-07 | General Electric Company | Method of edge smoothing for a computer image generation system |
JPH0668758B2 (ja) * | 1986-01-07 | 1994-08-31 | 株式会社日立製作所 | カーソル制御方法及び3次元図形表示装置 |
US5003498A (en) * | 1986-01-13 | 1991-03-26 | Hitachi, Ltd. | Graphic display method |
US5001663A (en) * | 1989-05-03 | 1991-03-19 | Eastman Kodak Company | Programmable digital circuit for performing a matrix multiplication |
US5261820A (en) * | 1990-12-21 | 1993-11-16 | Dynamix, Inc. | Computer simulation playback method and simulation |
FR2675977B1 (fr) * | 1991-04-26 | 1997-09-12 | Inst Nat Audiovisuel | Procede de modelisation d'un systeme de prise de vues et procede et systeme de realisation de combinaisons d'images reelles et d'images de synthese. |
JP2760253B2 (ja) * | 1992-07-14 | 1998-05-28 | 住友電気工業株式会社 | 道路の動画像作成方法及びこの方法を適用した車載ナビゲーション装置 |
US5379370A (en) * | 1992-07-17 | 1995-01-03 | International Business Machines Corporation | Method and apparatus for drawing lines, curves, and points coincident with a surface |
JPH0812687B2 (ja) * | 1992-11-27 | 1996-02-07 | 日本電気株式会社 | 3次元地図上のシンボル表示方式 |
US5583977A (en) * | 1993-10-21 | 1996-12-10 | Taligent, Inc. | Object-oriented curve manipulation system |
-
1995
- 1995-06-20 DE DE69530824T patent/DE69530824D1/de not_active Expired - Lifetime
- 1995-06-20 WO PCT/JP1995/001219 patent/WO1995035555A1/ja active IP Right Grant
- 1995-06-20 KR KR1019960700837A patent/KR100274444B1/ko not_active IP Right Cessation
- 1995-06-20 US US08/596,324 patent/US5971852A/en not_active Expired - Fee Related
- 1995-06-20 EP EP95921991A patent/EP0715280B1/en not_active Expired - Lifetime
- 1995-06-20 CN CN95190555A patent/CN1087854C/zh not_active Expired - Fee Related
-
2002
- 2002-05-16 CN CNB021202141A patent/CN1205594C/zh not_active Expired - Fee Related
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6165368A (ja) * | 1984-09-07 | 1986-04-03 | Hitachi Ltd | 設計支援方法 |
JPS6182278A (ja) * | 1984-09-29 | 1986-04-25 | Toshiba Corp | 3次元座標変換装置 |
JPS6266382A (ja) * | 1985-09-19 | 1987-03-25 | Fujitsu Ltd | 回転図形の歪発生防止方式 |
JPS6295666A (ja) * | 1985-10-21 | 1987-05-02 | Sony Corp | 画像発生装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP0715280A4 * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1186031A (ja) * | 1997-09-11 | 1999-03-30 | Sega Enterp Ltd | 画像処理装置及び画像処理方法並びに媒体 |
US6664965B1 (en) | 1998-08-07 | 2003-12-16 | Kabushiki Kaisha Sega Enterprises | Image processing device and information recording medium |
US6980207B2 (en) | 1998-08-07 | 2005-12-27 | Kabushiki Kaisha Sega Enterprises | Image processing device and information recording medium |
US7471297B2 (en) | 1998-08-07 | 2008-12-30 | Kabushiki Kaisha Sega Enterprises | Image processing device and information recording medium |
US7557808B2 (en) | 1998-08-07 | 2009-07-07 | Kabushiki Kaisha Sega | Image processing device and information recording medium |
US7101284B2 (en) | 2001-12-18 | 2006-09-05 | Sony Computer Entertainment Inc. | Object display system in a virtual world |
KR100730455B1 (ko) | 2001-12-18 | 2007-06-19 | 가부시키가이샤 소니 컴퓨터 엔터테인먼트 | 가상 세계에서의 객체 표시 시스템 |
US7277571B2 (en) | 2002-05-21 | 2007-10-02 | Sega Corporation | Effective image processing, apparatus and method in virtual three-dimensional space |
Also Published As
Publication number | Publication date |
---|---|
CN1087854C (zh) | 2002-07-17 |
CN1129482A (zh) | 1996-08-21 |
CN1399230A (zh) | 2003-02-26 |
EP0715280A1 (en) | 1996-06-05 |
EP0715280B1 (en) | 2003-05-21 |
US5971852A (en) | 1999-10-26 |
DE69530824D1 (de) | 2003-06-26 |
CN1205594C (zh) | 2005-06-08 |
KR960704284A (ko) | 1996-08-31 |
EP0715280A4 (en) | 1996-08-21 |
KR100274444B1 (ko) | 2000-12-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO1995035555A1 (fr) | Procede et dispositif de traitement d'image | |
US20220091725A1 (en) | Method, apparatus and device for view switching of virtual environment, and storage medium | |
KR100393504B1 (ko) | 오브젝트방향제어방법및장치 | |
JP5300777B2 (ja) | プログラム及び画像生成システム | |
EP2105905A2 (en) | Image generation apparatus | |
US6196919B1 (en) | Shooting game apparatus, method of performing shooting game, and computer-readable recording medium storing shooting game program | |
JPH0836651A (ja) | 画像処理装置 | |
JP2012161604A (ja) | 空間相関したマルチディスプレイヒューマンマシンインターフェース | |
JP3245142B2 (ja) | ゲームシステム及び情報記憶媒体 | |
JP3215306B2 (ja) | 画像合成方法及び装置 | |
JP2000288248A (ja) | ゲーム装置及び情報記憶媒体 | |
JP2883737B2 (ja) | 画像処理方法及びその装置 | |
JP3431522B2 (ja) | ゲーム装置及び情報記憶媒体 | |
US6749509B1 (en) | Image processing method and apparatus | |
JPH11283049A (ja) | 画像処理方法及びその装置 | |
JP2002216167A (ja) | 画像生成システム、プログラム及び情報記憶媒体 | |
JP2001250128A (ja) | ゲームシステム及び情報記憶媒体 | |
JP4229316B2 (ja) | 画像生成システム、プログラム及び情報記憶媒体 | |
JP4698701B2 (ja) | 画像生成システム、プログラム及び情報記憶媒体 | |
JP5063022B2 (ja) | プログラム、情報記憶媒体及び画像生成システム | |
JP4159060B2 (ja) | 画像生成装置及び情報記憶媒体 | |
JP2002092640A (ja) | ゲームシステム及び情報記憶媒体 | |
JP2888828B1 (ja) | 画像生成装置及び情報記憶媒体 | |
JP3990050B2 (ja) | ゲーム装置及び情報記憶媒体 | |
JP2002042166A (ja) | ゲームシステム及び情報記憶媒体 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 95190555.4 Country of ref document: CN |
|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): CN JP KR RU US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): DE ES FR GB IT |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1995921991 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWP | Wipo information: published in national office |
Ref document number: 1995921991 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 08596324 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 1996 596324 Country of ref document: US Date of ref document: 19961022 Kind code of ref document: A |
|
WWG | Wipo information: grant in national office |
Ref document number: 1995921991 Country of ref document: EP |