WO2005101325A1 - Game program and game device having a large-surface-object display function - Google Patents
- Publication number
- WO2005101325A1 (PCT/JP2004/017152; JP2004017152W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mesh
- projection
- procedure
- large surface
- virtual camera
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- G06T15/40—Hidden part removal
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/20—Finite element generation, e.g. wire-frame surface description, tesselation
- G06T17/205—Re-meshing
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
- A63F2300/663—Methods for processing data by generating or executing the game program for rendering three dimensional images for simulating liquid objects, e.g. water, gas, fog, snow, clouds
Definitions
- Game software and game device having large surface object display function
- The present invention relates to game software and a game device capable of modeling and rendering a three-dimensional object in real time, and in particular to game software and a game device suited to representing a three-dimensional object having a large surface such as a sea, a lake, a river, a desert, or a jungle. Background art
- Here, “game software” is a concept that includes the program itself and, as necessary, various data associated with the program. “Game software” does not necessarily include associated data, but it always includes a program.
- The “various associated data” may be stored together with the program in a memory means such as a ROM disk, or may be stored in an external memory means so as to be readable via a communication medium such as the Internet.
- This type of game software uses 3D images to represent a 3D object with a large surface (hereinafter simply a “large surface object”), such as a sea, lake, river, desert, or jungle.
- Conventionally, the entire model of the 3D object to be represented is placed at a predetermined position in the virtual space, the placed overall model is divided equally into many small meshes, polygons are placed on each mesh, animation processing that changes the shape of each mesh (polygon) over time is applied, and rendering processing such as texture mapping, shadowing, and shading is performed on each polygon to generate the image to be displayed.
- An object of the present invention is to provide game software and a game apparatus that, when displaying a large surface object by real-time CG animation processing, eliminate wasted CPU computation and realize high-speed drawing while maintaining the quality of the video displayed on the display.
- The present invention is game software (GPR) having a program that causes a computer (16) to execute a procedure of acquiring, with a virtual camera (23), an image of the surface of a large surface object (21, 22) having a large surface arranged in a three-dimensional virtual space (31), and displaying it on a monitor (9).
- The game software further stores object data (OBD) of the large surface object in the computer.
- The lower surface (25e) of the visual field boundary (25) in the camera coordinates (26) of the virtual camera is set in the three-dimensional virtual space so as to intersect the plane (22a) on which the surface of the large surface object lies.
- A projection mesh setting procedure (BOD) projects the mesh generated on the projection plane by the mesh generation procedure onto the surface position (22a) of the large surface object indicated by the object data in the three-dimensional virtual space, and sets the result as the projection mesh (29A).
- A rendering procedure (CCP) renders the surface of the large surface object partially modeled by the partial modeling procedure and computes the image (22b) of the surface on the projection plane.
- The mesh (29) is set on the projection plane (25c) of the virtual camera (23), and the set mesh (29) is projected onto the surface position (22a) of the large surface object to set the projection mesh (29A). Consequently, the cells (29b) of the projected projection mesh (29A) can be made larger the farther they are from the virtual camera (23). As a result, a distant cell (29b) projected onto the large surface object (21, 22) occupies a large area on the surface (22a), and the modeling and subsequent rendering of each cell (29b) can be performed more simply, in proportion to the cell's area on the object, the farther the cell (29b) is.
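The core idea above — a uniform mesh on the projection plane whose projected cells grow with distance — can be sketched as a ray-plane intersection. The following Python sketch is illustrative only, not the patent's implementation; the pinhole camera model, the pitch-only rotation, and all names and parameters are assumptions.

```python
import math

def project_screen_grid_to_water(cam_pos, pitch_deg, fov_y_deg, aspect,
                                 nu, nv, water_y=0.0):
    """Project an evenly spaced grid of projection-plane points onto the
    water plane y = water_y by ray-plane intersection.

    Returns a dict mapping (i, j) grid indices to world-space hit points,
    skipping rays that never reach the water plane (above the horizon).
    """
    half_h = math.tan(math.radians(fov_y_deg) / 2.0)
    half_w = half_h * aspect
    pitch = math.radians(pitch_deg)          # negative = looking down
    hits = {}
    for j in range(nv + 1):
        for i in range(nu + 1):
            # Normalized device coordinates in [-1, 1]
            u = -1.0 + 2.0 * i / nu
            v = -1.0 + 2.0 * j / nv
            # Ray direction in camera space (camera looks along -z)
            d = (u * half_w, v * half_h, -1.0)
            # Rotate by pitch around the x axis into world space
            dy = d[1] * math.cos(pitch) - d[2] * math.sin(pitch)
            dz = d[1] * math.sin(pitch) + d[2] * math.cos(pitch)
            dir_world = (d[0], dy, dz)
            if dir_world[1] >= 0:            # ray goes up: no intersection
                continue
            t = (water_y - cam_pos[1]) / dir_world[1]
            hits[(i, j)] = (cam_pos[0] + t * dir_world[0],
                            water_y,
                            cam_pos[2] + t * dir_world[2])
    return hits
```

Because equal steps in v map to unequal steps in depth, rows of hit points farther from the camera are spaced farther apart — exactly the growth of the projected cells (29b) described above.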
- The mesh generation procedure may be configured to calculate and generate the mesh (29a) only for the portion of the projection plane closer to the virtual camera than the coordinate position calculated by the surface position calculation procedure.
- The mesh generation procedure may include an equal division procedure that generates the mesh by dividing the projection plane (25c) of the virtual camera equally in the horizontal and vertical directions.
- By generating the mesh so that the projection plane (25c) of the virtual camera is divided equally in the horizontal and vertical directions, the mesh can be generated easily and the computational load of mesh generation can be reduced.
- Alternatively, a mesh division procedure may divide the projection plane (25c) of the virtual camera so that the cells (29b) of the projection mesh (29A) set by the projection mesh setting procedure become larger with distance from the virtual camera, and generate the mesh (29) accordingly.
- Since the projection plane (25c) is divided so that the cells (29b) of the projection mesh (29A) grow with distance from the virtual camera, a distant cell (29b) can occupy a large area on the large surface object (21, 22) without noticeably affecting the image quality displayed on the monitor (9). As a result, the modeling of each cell (29b) and the subsequent rendering can be simplified for distant cells in proportion to their area on the object.
- The partial modeling procedure may include a polygon arrangement procedure that arranges a plate polygon (30) for each cell (29b) of the projection mesh (29A).
- Since one plate polygon (30) is arranged for each cell (29b) constituting the projection mesh (29A), cells (29b) and plate polygons (30) correspond one-to-one, and modeling can be done easily.
- The present invention may be configured such that the large surface object (21, 22) is an object (22) representing a sea, a lake, or a river.
- When the large surface object (21) is an object representing a sea, lake, or river, its surface is a water surface with a relatively simple shape. Therefore, even if the rendering of plate polygons (30) distant from the virtual camera (23) is performed simply over a large area of the water surface (22a), the simplification is not conspicuous as image-quality degradation, so the present invention can be used effectively.
- the present invention can also be configured as a game device controlled by a computer that executes game software having the above-described program.
- The game software may be stored in a hard disk or ROM in the game device.
- FIG. 1 is a control block diagram of a game machine to which the present invention is applied.
- FIG. 2 is a schematic diagram showing an example of a lake (large surface object) arranged in a three-dimensional virtual space and a virtual camera arranged to render a scene of the lake.
- FIG. 3 is a schematic diagram showing the relationship between the visual field boundary of the virtual camera of FIG. 2 and the water surface of the lake.
- FIG. 4 is a plan view of FIG.
- FIG. 5 is a schematic diagram showing an example of a mesh set on a projection plane of camera coordinates.
- FIG. 6 is a diagram showing an example of a water surface image displayed on the display.
- FIG. 7 is a flowchart showing the outline of the large surface object processing.
- the game device 20 executes a predetermined game in accordance with game software such as an action game recorded on a ROM disk 15 as a recording medium.
- The game device 20 includes a CPU 1 mainly composed of a microprocessor, a ROM (read-only memory) 2 and a RAM (random access memory) 3 as main storage for the CPU 1, an image processing device 4 and a sound processing device 6 together with their buffers 5 and 7, and a ROM disk reader 8.
- An operating system, the program necessary for overall operation control of the game machine, is written in the ROM 2.
- Game programs and data read from the ROM disk 15 as a storage medium are written into the RAM 3 as necessary.
- The image processing device 4 receives image data from the CPU 1, draws the game screen on the frame buffer 5, converts the drawn image data into a predetermined video reproduction signal, and outputs it to the monitor 9 at a predetermined timing.
- the sound processing device 6 reproduces data such as voice, musical sound and sound source data read from the ROM disk 15 and recorded in the sound buffer 7 and outputs them from the speaker 10.
- the ROM disk reader 8 reads a program or data recorded on the ROM disk 15 in accordance with an instruction from the CPU 1, and outputs a signal corresponding to the read content.
- the ROM disk 15 stores programs and data necessary for game execution.
- the monitor 9 generally uses a television receiver for home use, and the speaker 10 generally uses a built-in speaker of the television receiver.
- a communication control device 11 is connected to the CPU 1 via a bus 14, and a controller 12 and an auxiliary storage device 13 as input devices are detachably attached to the device 11 via appropriate connection ports.
- the controller 12 functions as an input device, and is provided with operation members such as operation keys for receiving operations by the player.
- The communication control device 11 scans the operation state of the controller 12 at a constant cycle (for example, every 1/60 second), and outputs a signal corresponding to the scanning result to the CPU 1.
- the CPU 1 determines the operation state of the controller 12 based on the signal.
- a plurality of controllers 12 and auxiliary storage devices 13 can be connected to the communication control device 11 in parallel.
- The components other than the monitor 9, the speaker 10, the controller 12, the ROM disk 15, and the auxiliary storage device 13 are integrally housed in a predetermined housing to constitute the game machine body 16.
- This game machine body 16 functions as a computer.
- The ROM disk 15 stores game software GPR in which a game progresses according to a predetermined scenario, such as an action game, a role-playing game, or an adventure game.
- When a predetermined initialization operation (for example, a power-on operation) is performed, the CPU 1 first executes a predetermined initialization process according to the program stored in the ROM 2. When initialization is completed, the CPU 1 starts reading the game software GPR stored in the ROM disk 15 and starts game processing according to its program. When the player performs a predetermined game start operation on the controller 12, the CPU 1 starts the various processes necessary for game execution according to the procedures of the game software GPR.
- The game apparatus 20 performs predetermined processing according to the read game software GPR, controls the display of images on the monitor 9, and controls the progress of a predetermined scenario.
- the game device 20 as a home game machine has been described as an example.
- The game device 20 may be a so-called portable game machine, or a game-dedicated device that can also play back general music and video recording media.
- The present invention is not limited to this; any computer that can run game software may be used, for example a personal computer or a mobile phone.
- the storage mode thereof is arbitrary.
- Although the game software GPR program is stored in the ROM disk 15 here, it may instead be stored in an external memory means such as a server independent of the game machine and downloaded to a memory such as the RAM 3 via a communication medium such as the Internet by a reading program provided with the game software GPR.
- The game of the game software GPR is set up as a so-called action game in which the player advances the scenario by fighting enemy characters while a character (not shown), operable via the controller 12, moves on a field FLD set in the three-dimensional virtual space 31 generated by the CPU 1 in the RAM 3 by the field generation program FPP of the game software GPR.
- For the game software GPR shown in FIG., only the software elements related to the present invention are described; in addition to these, the game software GPR stores the various programs and data necessary for executing a game using it.
- A scenario progress control program SAP controls the progress of the game scenario.
- Via the CPU 1, a large surface object 21 — an object representing a large surface such as a lake surface, sea surface, desert sand surface, jungle canopy, or river surface — may be arranged in the three-dimensional virtual space 31, as shown in FIG.
- Here, a lake 22 is arranged in the three-dimensional virtual space 31 as the large surface object 21.
- The scenario progress control program SAP, in accordance with the movement of a character (not shown) during the game, commands the camera control program CCP of the game software GPR, via the CPU 1 and the image processing device 4, to display a rendered image of the lake 22 on the monitor 9.
- The camera control program CCP reads the large surface object processing program BOP from the game software GPR via the CPU 1 and, based on the large surface object processing program BOP, performs the display processing on the monitor 9 of the water surface 22a, which is the surface of the lake 22.
- The lake 22 is not yet arranged as an object until the display processing of the lake 22 by the large surface object processing program BOP is performed. Therefore, the large surface object processing program BOP commands the field generation program FPP, via the CPU 1, to read from the object data file ODF of the game software GPR the object data OBD related to the lake 22 whose display processing was instructed by the camera control program CCP.
- The CPU 1 reads the object data OBD related to the lake 22 to be placed in the three-dimensional virtual space 31 from the object data file ODF, and stores it in a buffer memory (not shown) (step S1 in FIG. 7).
- The object data OBD of the lake 22 contains the data necessary for placing the lake 22 in the three-dimensional virtual space 31, such as placement data DPD regarding the location of the lake 22 in the 3D virtual space 31 and three-dimensional shape data TSD covering the shape of the lake 22 and the shape and depth of the water surface 22a.
- The field generation program FPP reads the object data OBD of the lake 22 from the object data file ODF via the CPU 1, and from the object data OBD can easily generate the lake 22 object and arrange it at a predetermined position in the three-dimensional virtual space 31.
- Next, the large surface object processing program BOP acquires, via the CPU 1, the current position of the virtual camera (viewpoint) 23 used to project the lake 22 object from the camera control program CCP, and calculates the positional relationship between the camera 23 and the lake 22 arranged in the three-dimensional virtual space 31 (step S2 in FIG. 7).
- The virtual camera 23 is arranged in the three-dimensional virtual space 31 by the camera control program CCP so that the Z-axis of its camera coordinate system 26 intersects the plane on which the water surface 22a of the lake 22 lies, as shown in FIGS. 2 to 4.
- The camera control program CCP sets a visual field boundary 25, which indicates the range of the three-dimensional virtual space 31 that can be captured from the virtual camera 23, for the lake 22 object — more precisely, for the object representing the water surface 22a.
- The visual field boundary 25 defines horizontal and vertical visual field ranges in the camera coordinates 26. Within this visual field boundary 25, a front clipping plane 25a and a rear clipping plane 25b are set to delimit the range in which objects in the three-dimensional virtual space 31 are projected, and a projection plane (view screen) 25c is set between the front clipping plane 25a and the rear clipping plane 25b. The region of the visual field boundary 25 between the front clipping plane 25a and the rear clipping plane 25b becomes the view volume 25d, the range of the three-dimensional virtual space 31 that is projected onto the projection plane 25c.
- Next, the large surface object processing program BOP calculates, as a coordinate position on the projection plane, the farthest part of the water surface 22a of the lake 22 object projected onto the projection plane 25c from the position of the virtual camera 23 — in the case of FIG. 3, the v coordinate, in the uv coordinates of the projection plane, of the intersection line CP between the rear clipping plane 25b and the plane of the water surface, shown in FIG. 5 (hereinafter the “horizontal line HL”) (step S3 in FIG. 7). This position can be calculated easily from the object data OBD of the lake 22, the position of the virtual camera 23 and the projection plane 25c, and the shape data of the view volume 25d.
- Since the water surface 22a of the lake 22 is normally set horizontally, the intersection line CP runs horizontally in the u-axis direction at a predetermined v coordinate on the projection plane 25c, as shown in FIG. 5.
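The patent computes the horizontal line HL as the intersection of the rear clipping plane with the water plane; for an infinite horizontal plane this approaches the true horizon, which has a simple closed form. The sketch below is illustrative, assuming a pinhole camera with a vertical field of view, NDC v in [-1, 1], and a pitch-only camera orientation — these conventions are not from the patent.

```python
import math

def horizon_v(pitch_deg, fov_y_deg):
    """v coordinate (NDC: -1 bottom to +1 top) of the horizon line on
    the projection plane, for a camera pitched by pitch_deg (negative =
    looking down) over an infinite horizontal water plane.

    The horizon is where the view ray becomes parallel to the plane.
    """
    half_h = math.tan(math.radians(fov_y_deg) / 2.0)
    v = -math.tan(math.radians(pitch_deg)) / half_h
    return max(-1.0, min(1.0, v))   # clamp to the visible screen
```

With a level camera (pitch 0) the horizon sits at screen center (v = 0); pitching down moves it up the screen until it leaves the top edge.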
- The large surface object processing program BOP then sets and calculates the mesh 29 over the portion of the projection plane 25c below the horizontal line HL — that is, the portion where the water surface 22a appears — dividing it equally in the u-axis (horizontal) and v-axis (vertical) directions (step S4 in FIG. 7). Note that this division does not necessarily need to be equal.
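The equal division of the screen region below HL can be sketched as follows. This is a minimal illustration, assuming NDC coordinates in [-1, 1] and the cell representation (four uv corners) as conventions of the sketch, not of the patent.

```python
def mesh_below_horizon(v_horizon, nu, nv):
    """Equally divide the part of the projection plane below the horizon
    line (v < v_horizon, with v_horizon in (-1, 1]) into an nu x nv grid,
    returning each cell as its four (u, v) corners in NDC, bottom row first.
    """
    cells = []
    for j in range(nv):
        v0 = -1.0 + (v_horizon + 1.0) * j / nv
        v1 = -1.0 + (v_horizon + 1.0) * (j + 1) / nv
        for i in range(nu):
            u0 = -1.0 + 2.0 * i / nu
            u1 = -1.0 + 2.0 * (i + 1) / nu
            cells.append(((u0, v0), (u1, v0), (u1, v1), (u0, v1)))
    return cells
```

Each returned cell corresponds to one cell of the mesh 29; projecting its corners onto the water plane yields the corresponding cell of the projection mesh 29A.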
- Next, the large surface object processing program BOP uses the CPU 1 to project the mesh 29 thus obtained onto the coordinate position where the water surface 22a of the large surface object 21 lies within the visual field boundary 25, as shown in FIG. 3, and thereby calculates the projection mesh 29A.
- Because the quadrangular-pyramid visual field boundary 25 of the virtual camera 23 is set so as to intersect the water surface 22a, the projection mesh 29A generated by projecting the mesh 29, which was generated evenly on the projection plane 25c, has cells 29b that become larger with distance from the virtual camera 23.
- The Z-axis of the camera coordinates 26 does not necessarily have to intersect the water surface 22a; for the water surface 22a to be projected onto the projection plane 25c, it suffices that the lower surface 25e of the quadrangular-pyramid visual field boundary 25 of the virtual camera 23 intersects the water surface 22a (more precisely, the plane in the three-dimensional virtual space 31 on which the water surface 22a is set).
- The projection mesh 29A is generated for the projection plane 25c onto which the water surface 22a is projected by the virtual camera 23, and, as shown in FIG. 4, it is set appropriately within the range of the view volume 25d within the horizontal viewing angle β — that is, within the range of the three-dimensional virtual space 31 displayed on the monitor 9.
- Next, the large surface object processing program BOP arranges a plate polygon 30 for each cell 29b of the projection mesh 29A projected to the position of the water surface 22a of the lake 22 in the three-dimensional virtual space 31, matched to the size of the cell 29b, and models with plate polygons the portion of the water surface 22a located in the view volume 25d of the virtual camera 23. No plate polygon 30 is arranged on the water surface 22a outside the view volume 25d. Compared with the conventional approach of placing many polygons over the whole water surface 22a of the large surface object 21 regardless of the range of the view volume 25d, only part of the water surface 22a needs to be modeled, so the calculation time required for arranging the plate polygons 30 can be greatly reduced.
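Building one plate polygon per projected cell, and skipping cells outside the view volume, can be sketched as below. The `hits` dictionary (grid corner index → world-space point on the water surface) is an assumed data shape for this sketch; corners that missed the water plane are simply absent, so their cells get no polygon, matching the behavior described above.

```python
def plate_polygons(hits, nu, nv):
    """Build one plate polygon (quad) per projection-mesh cell whose four
    corners all landed on the water surface. Cells with any corner outside
    the view volume (missing from hits) get no polygon at all.
    """
    quads = []
    for j in range(nv):
        for i in range(nu):
            corners = [(i, j), (i + 1, j), (i + 1, j + 1), (i, j + 1)]
            if all(c in hits for c in corners):
                quads.append([hits[c] for c in corners])
    return quads
```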
- Next, the large surface object processing program BOP performs animation processing and rendering processing on the arranged plate polygons 30, and acquires the image 22b of the water surface 22a of the lake 22 on the projection plane 25c of the virtual camera 23.
- The animation processing is processing suitable for real-time CG, for example processing that deforms each plate polygon 30 over time. The polygon image processing includes, for example, texture mapping, shadowing, shading, and reflection. In addition, projection processing is performed that perspective-transforms each plate polygon 30 of the projection mesh 29A onto the projection plane 25c using the virtual camera 23, via the camera control program CCP and the CPU 1.
- the rendering processing includes the above-described polygon image processing and perspective transformation processing to the projection surface 25c.
- The large surface object processing program BOP needs to perform the animation processing and polygon image processing only for the plate polygons 30 arranged within the range of the view volume 25d of the virtual camera 23, so the computational load can be greatly reduced.
- Each plate polygon 30 subject to polygon image processing becomes larger, and occupies a larger area on the water surface 22a of the lake 22, the farther it is from the virtual camera 23, because the cells 29a of the mesh 29 corresponding to the plate polygons 30 are set to the same size on the projection plane. Conversely, the farther away a plate polygon 30 is — that is, the larger its Z coordinate in camera coordinates — the smaller its projected image on the projection plane 25c.
- In other words, the reduction ratio of each plate polygon 30 with respect to the projection plane 25c increases with the polygon's Z coordinate. Therefore, even though a plate polygon 30 with a large Z coordinate (a cell 29b of the projection mesh 29A) is set larger than one with a small Z coordinate, as shown in FIG. 4, the result of the polygon image processing performed on such a large plate polygon 30 is shrunk by the large reduction ratio onto the projection plane 25c.
- Consequently, even if the polygon image processing for the large plate polygons 30 far from the virtual camera 23 — which cover a large area of the lake surface 22a — is performed in a greatly simplified manner per unit area compared with the small plate polygons 30 near the virtual camera 23, the effect on the projection plane 25c can be reduced to a negligible level.
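The reduction-ratio argument follows from the pinhole projection: on-screen size scales as the reciprocal of camera-space depth. A one-line illustration (the function name and the default focal length are assumptions of this sketch):

```python
def projected_height(world_height, z, focal=1.0):
    """On-screen height of a segment of world_height at camera depth z
    under pinhole projection: proportional to 1 / z."""
    return focal * world_height / z
```

So a plate polygon twice as deep in the scene projects to half the on-screen height, which is why coarse processing of distant, large polygons is not conspicuous.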
- Meanwhile, the water surface 22a near the virtual camera 23 is modeled by many small plate polygons 30, so detailed animation processing and polygon image processing can be performed there compared with the distant water surface 22a. That is, a realistic water surface 22a is expressed by performing advanced animation processing and polygon image processing, with high processing density per unit surface area of the object, only for the plate polygons 30 near the virtual camera 23, while the distant plate polygons 30 receive simple animation processing and polygon image processing with low processing density per unit area.
- Moreover, since the polygons constituting the distant water surface 22a are set larger than the nearby plate polygons 30, far fewer of them need to be processed than nearby plate polygons 30, and the animation processing and polygon image processing for the plate polygons 30 of the distant water surface 22a can be performed without imposing a heavy load on the CPU 1.
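To make the polygon-count saving concrete, the following sketch (camera model, height, pitch, and field of view are all assumed example values, not from the patent) computes the ground depth of each mesh-row boundary when the screen below the horizon is divided equally, and how many rows a uniform world-space grid would need to cover the same depth range at the nearest row's resolution.

```python
import math

def row_depths(cam_h, pitch_deg, fov_y_deg, nv):
    """Ground distance from the camera of each of the nv + 1 row
    boundaries when the screen below the horizon is divided into nv
    equal rows. Assumes a pinhole camera at height cam_h, pitched down
    steeply enough that the true horizon lies above the top of the
    screen (otherwise the top row would extend without bound).
    """
    half_h = math.tan(math.radians(fov_y_deg) / 2.0)
    pitch = math.radians(pitch_deg)                # negative = down
    v_top = min(1.0, -math.tan(pitch) / half_h)    # horizon v, clamped
    depths = []
    for j in range(nv + 1):
        v = -1.0 + (v_top + 1.0) * j / nv
        elev = pitch + math.atan(v * half_h)       # ray elevation angle
        depths.append(cam_h / math.tan(-elev))
    return depths

def rows_needed_uniform(depths):
    """Rows a uniform world-space grid would need to span the same
    depth range at the nearest projection-mesh row's spacing."""
    return (depths[-1] - depths[0]) / (depths[1] - depths[0])
```

For a camera 10 units up, pitched 40 degrees down with a 60-degree vertical field of view, 8 screen-space rows cover a depth range that a uniform world grid at the same near resolution would need several times as many rows to cover — the saving the paragraph above describes.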
- Naturally, outside the range of the view volume 25d of the virtual camera 23, no plate polygon 30 is arranged at all and the water surface 22a is not modeled, so neither animation processing nor polygon image processing is performed there.
- The large surface object processing program BOP performs the animation processing and polygon image processing for each plate polygon 30 arranged on the cells 29b of the projection mesh 29A via the CPU 1 and the camera control program CCP, and obtains the image 22b of the water surface 22a of the lake 22 as seen from the virtual camera 23, as shown in FIG. 6.
- The camera control program CCP displays the image 22b projected on the projection plane 25c on the monitor 9 via the CPU 1 and the image processing device 4 (step S8 in FIG. 7).
- In the image 22b displayed on the monitor 9, the nearby water surface 22a of the lake 22 is drawn in detail with fine plate polygons 30, while the distant water surface 22a is rendered in simplified form with large plate polygons 30 in a way that does not look unnatural.
- Although the case where the image 22b of the water surface 22a of the lake 22 is calculated and generated as the large surface object 21 has been described, as already noted the large surface object 21 may be any object having a relatively monotonous, large surface, such as a sea surface, a jungle canopy, a river surface, or desert sand.
- In the embodiment described above, the X axis of the camera coordinates 26 of the virtual camera 23 is set parallel to the XZ plane (horizontal plane) of the world coordinates 27, but it need not be parallel to the XZ plane of the world coordinates 27. That is, when the mesh 29 is set on the projection plane 25c and the projection mesh 29A is projected to the coordinate position where the surface of the large surface object 21 lies, it is preferable that the X axis of the virtual camera 23 be held parallel to the XZ plane of the world coordinates 27 when the image 22b is acquired by the virtual camera 23; however, the image 22b can also be obtained and generated with the X axis of the virtual camera 23 tilted with respect to the XZ plane of the world coordinates 27 and the projection plane 25c tilted, by perspective-transforming each polygon 30 arranged on the cells 29b of the projection mesh 29A and subjected to image processing.
- The CPU 1 constitutes a game control device, and the various means of the game control device are constituted by combinations of the CPU 1 and specific software. At least one of these means may be replaced with a logic circuit.
- The present invention is not limited to a home game system and may be configured as a game system of various scales. Industrial applicability
- the present invention can be used as an electronic game device using a computer and entertainment software to be executed by the computer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP04821888A EP1734481A4 (en) | 2004-03-31 | 2004-11-18 | GAME PROGRAM AND GAME DEVICE HAVING LARGE SURFACE OBJECT DISPLAY FUNCTION |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004106044A JP3853329B2 (ja) | 2004-03-31 | 2004-03-31 | ゲームプログラム及びゲーム装置 |
JP2004-106044 | 2004-03-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005101325A1 (ja) | 2005-10-27 |
Family
ID=35096935
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/017152 WO2005101325A1 (ja) | 2004-03-31 | 2004-11-18 | 大表面オブジェクト表示機能を有するゲームプログラム及びゲーム装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20050233805A1 (ja) |
EP (1) | EP1734481A4 (ja) |
JP (1) | JP3853329B2 (ja) |
WO (1) | WO2005101325A1 (ja) |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070150138A1 (en) | 2005-12-08 | 2007-06-28 | James Plante | Memory management in event recording systems |
US10878646B2 (en) | 2005-12-08 | 2020-12-29 | Smartdrive Systems, Inc. | Vehicle event recorder systems |
US8996240B2 (en) | 2006-03-16 | 2015-03-31 | Smartdrive Systems, Inc. | Vehicle event recorders with integrated web server |
US9201842B2 (en) | 2006-03-16 | 2015-12-01 | Smartdrive Systems, Inc. | Vehicle event recorder systems and networks having integrated cellular wireless communications systems |
US8649933B2 (en) * | 2006-11-07 | 2014-02-11 | Smartdrive Systems Inc. | Power management systems for automotive video event recorders |
US8989959B2 (en) | 2006-11-07 | 2015-03-24 | Smartdrive Systems, Inc. | Vehicle operator performance history recording, scoring and reporting systems |
US8868288B2 (en) | 2006-11-09 | 2014-10-21 | Smartdrive Systems, Inc. | Vehicle exception event management systems |
US8239092B2 (en) | 2007-05-08 | 2012-08-07 | Smartdrive Systems Inc. | Distributed vehicle event recorder systems having a portable memory data transfer system |
JP5087453B2 (ja) * | 2008-03-31 | 2012-12-05 | Capcom Co., Ltd. | Program, storage medium, and computer device |
JP5078712B2 (ja) * | 2008-04-01 | 2012-11-21 | Nintendo Co., Ltd. | Image processing program, image processing device, image processing system, and image processing method |
KR20100132605A (ko) * | 2009-06-10 | 2010-12-20 | Samsung Electronics Co., Ltd. | Hybrid rendering apparatus and method |
JP5627526B2 (ja) * | 2011-03-31 | 2014-11-19 | Capcom Co., Ltd. | Game program and game system |
US9728228B2 (en) | 2012-08-10 | 2017-08-08 | Smartdrive Systems, Inc. | Vehicle event playback apparatus and methods |
US9501878B2 (en) | 2013-10-16 | 2016-11-22 | Smartdrive Systems, Inc. | Vehicle event playback apparatus and methods |
US9610955B2 (en) | 2013-11-11 | 2017-04-04 | Smartdrive Systems, Inc. | Vehicle fuel consumption monitor and feedback systems |
US8892310B1 (en) | 2014-02-21 | 2014-11-18 | Smartdrive Systems, Inc. | System and method to detect execution of driving maneuvers |
US9663127B2 (en) | 2014-10-28 | 2017-05-30 | Smartdrive Systems, Inc. | Rail vehicle event detection and recording system |
US11069257B2 (en) | 2014-11-13 | 2021-07-20 | Smartdrive Systems, Inc. | System and method for detecting a vehicle event and generating review criteria |
JP6480215B2 (ja) | 2015-03-06 | 2019-03-06 | Tachi-S Co., Ltd. | Vehicle seat |
US9679420B2 (en) | 2015-04-01 | 2017-06-13 | Smartdrive Systems, Inc. | Vehicle event recording system and method |
CN107679015B (zh) * | 2017-09-08 | 2021-02-09 | 山东神戎电子股份有限公司 | Real-time monitoring range simulation method for a pan-tilt camera based on a three-dimensional map |
CN109509243B (zh) * | 2017-09-13 | 2022-11-11 | Tencent Technology (Shenzhen) Co., Ltd. | Liquid simulation method, liquid interaction method, and apparatus |
CN111862324B (zh) * | 2020-07-10 | 2023-07-28 | Perfect World (Beijing) Software Technology Development Co., Ltd. | Method and apparatus for baking a water system, storage medium, and electronic device |
CN111862291B (zh) * | 2020-07-10 | 2024-01-09 | Perfect World (Beijing) Software Technology Development Co., Ltd. | Method and apparatus for baking a water system, storage medium, and electronic device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0636013A (ja) * | 1992-07-14 | 1994-02-10 | Hitachi Ltd | Method and apparatus for creating terrain data |
JPH08234657A (ja) * | 1995-02-24 | 1996-09-13 | Nissan Motor Co Ltd | Vehicle route guidance apparatus |
JPH0944698A (ja) * | 1995-07-25 | 1997-02-14 | Hitachi Ltd | Method and apparatus for generating simulated ocean wave images |
JPH11250279A (ja) * | 1997-11-20 | 1999-09-17 | Real 3D | Anisotropic texture mapping using silhouette/footprint analysis in a computer image generation system |
JP2000222596A (ja) * | 1999-01-29 | 2000-08-11 | Toshiba Electronic Systems Co Ltd | Fractal level control method for computer-graphics three-dimensional sea surface images, and computer-readable recording medium recording a fractal level control program for computer-graphics three-dimensional sea surface images |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010040586A1 (en) * | 1996-07-25 | 2001-11-15 | Kabushiki Kaisha Sega Enterprises | Image processing device, image processing method, game device, and craft simulator |
US6879324B1 (en) * | 1998-07-14 | 2005-04-12 | Microsoft Corporation | Regional progressive meshes |
WO2003044720A1 (en) * | 2001-11-15 | 2003-05-30 | Nintendo Software Technology Corporation | System and method of simulating and imaging realistic water surface |
2004
- 2004-03-31 JP JP2004106044A patent/JP3853329B2/ja not_active Expired - Lifetime
- 2004-11-18 WO PCT/JP2004/017152 patent/WO2005101325A1/ja not_active Application Discontinuation
- 2004-11-18 EP EP04821888A patent/EP1734481A4/en not_active Ceased
- 2004-12-30 US US11/027,231 patent/US20050233805A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP1734481A4 * |
Also Published As
Publication number | Publication date |
---|---|
US20050233805A1 (en) | 2005-10-20 |
JP3853329B2 (ja) | 2006-12-06 |
EP1734481A1 (en) | 2006-12-20 |
EP1734481A4 (en) | 2008-02-20 |
JP2005293122A (ja) | 2005-10-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP3853329B2 (ja) | Game program and game device | |
JP4177381B2 (ja) | Image generation method, image generation device, and image generation program | |
US6781598B1 (en) | Entertainment apparatus, image generation method, and storage medium | |
EP1808207B1 (en) | Storage medium having game program stored thereon and game apparatus | |
JP3625184B2 (ja) | Three-dimensional image processing method and device for games, readable recording medium recording a three-dimensional image processing program for games, and video game device | |
JP2008250813A (ja) | Image generation device, image processing method, and program | |
JP4305903B2 (ja) | Image generation system, program, and information storage medium | |
JP4193979B2 (ja) | Shadow volume generation program and game device | |
JP4502678B2 (ja) | Program, information storage medium, and image generation system | |
JP3639286B2 (ja) | Game program and game device | |
US6483520B1 (en) | Image creating method and apparatus, recording medium for recording image creating program, and video game machine | |
JP3564440B2 (ja) | Moving image generation program, moving image generation method, and device | |
JP4447000B2 (ja) | Image generation system, program, and information storage medium | |
EP1235187A1 (en) | Image processor, image processing method, and storage medium | |
JP4469709B2 (ja) | Image processing program and image processing device | |
JP3737784B2 (ja) | Three-dimensional image processing program, three-dimensional image processing method, and video game device | |
KR20020013891A (ko) | Method and apparatus for generating images | |
JP3822882B2 (ja) | Game program and game device | |
JP2005275795A (ja) | Program, information storage medium, and image generation system | |
JP2005122479A (ja) | Program, information storage medium, and image generation device | |
JP2007159817A (ja) | Game program, game device, and game method | |
JP2011031050A (ja) | Image generation system, program, and information storage medium | |
JP3183636B2 (ja) | Three-dimensional game device and information storage medium | |
JP4391633B2 (ja) | Image generation system and information storage medium | |
JP2006068079A (ja) | Game program and game device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004821888 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWW | Wipo information: withdrawn in national office |
Country of ref document: DE |
|
WWP | Wipo information: published in national office |
Ref document number: 2004821888 Country of ref document: EP |