WO2009087936A1 - Image processing apparatus, image processing method, information recording medium, and program - Google Patents
Image processing apparatus, image processing method, information recording medium, and program
- Publication number
- WO2009087936A1 (PCT/JP2008/073835)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- polygon
- planar body
- image processing
- projection
- orientation
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/60—Shadow generation
Definitions
- The present invention relates to an image processing apparatus, an image processing method, and an information recording medium suitable for easily drawing an image, such as a shadow, that appears when an object in a virtual space approaches a planar object such as a floor or a wall in a game or the like, and to a program for realizing these on a computer.
- Patent Document 1 (JP 2007-195747 A) discloses a technique for realizing more realistic shadows by changing the transparency of a shadow according to changes in the position of a light source, the position and orientation of an object, and the like. There is also demand for applying such techniques not only to shadows but also to cases where the image of a light-emitting object is projected onto the floor.
- The present invention solves the above-described problems, and its object is to provide an image processing apparatus, an image processing method, an information recording medium, and a program to be realized on a computer that are suitable for easily drawing an image, such as a shadow, that appears when an object in a virtual space comes close to a floor or a wall in a game device or the like.
- An image processing apparatus according to one aspect of the present invention includes a storage unit, a selection unit, and a generation unit, and generates an image representing an object arranged in a virtual space, a planar body, and the image of the object projected onto the planar body.
- The storage unit stores the position of the viewpoint and the direction of the line of sight arranged in the virtual space, the shape, position, and orientation of the object and the planar body arranged in the virtual space, and the shape, position, and orientation of a plurality of covering polygons arranged so as to cover the object.
- The viewpoint corresponds to a virtual camera looking into the three-dimensional virtual space, and the line of sight is the direction in which the virtual camera faces.
- The two-dimensional image of the virtual space is generated by projecting the virtual space, as seen from the viewpoint in the line-of-sight direction, onto a two-dimensional plane (also called the projection plane).
- The planar body is another object, different from the object, and is assumed to be a wide surface such as the ground (a floor) or a wall.
- The shapes of the object and the planar body are defined (modeled) as surfaces composed of combinations of small polygons.
- A covering polygon is a polygon that expresses an image, such as a shadow or a reflection, projected onto a planar body that the object approaches (hereinafter, a shadow is mainly used as the example).
- A plurality of covering polygons form a polyhedron and are arranged so as to cover the object.
- A shadow is projected clearly onto a surface that is nearby. Therefore, the image processing apparatus of the present invention regards a planar body as being close to the object when the planar body comes into contact with a covering polygon or when a covering polygon is included in the planar body. Then, by selecting and drawing, as a projection polygon, a covering polygon that satisfies a predetermined condition such as being in contact with the planar body, a shadow of the object is generated on the planar body.
- A predetermined distance is maintained between an object and the covering polygons that cover it, and the appearance of the shadow differs depending on this distance. Depending on the desired appearance, the distance between the object and the covering polygons may be made larger or smaller.
- The selection unit selects, from among the plurality of covering polygons, those that satisfy a predetermined drawing condition as projection polygons. That is, as described above, the selection unit selects a covering polygon that satisfies a predetermined condition, such as being in contact with the planar body, as a projection polygon.
- The generation unit generates an image of the virtual space as viewed from the viewpoint in the direction of the line of sight, based on the shape, position, and orientation of the object and the planar body, the position of the viewpoint and the direction of the line of sight, and the shape, position, and orientation of the projection polygons. That is, the generation unit draws the projection polygons selected by the selection unit, thereby generating the shadow of the object on the planar body.
- The generation unit typically draws objects using the Z-buffer method or the like. That is, for each pixel of the image data to be drawn, the pixel is painted with the color of the texture information (or transparency information) of the polygon (of the object, the planar body, or a covering polygon) that is closest to the viewpoint, thereby performing hidden surface removal.
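- As a rough illustration only (not part of the original disclosure), the per-pixel hidden surface removal described above could be sketched as follows; the fragment list and its fields are assumptions made for the example.

```python
# Illustrative sketch: per-pixel Z-buffer hidden surface removal.
# "fragments" is assumed to be a list of (x, y, depth, color) samples produced by
# rasterizing the polygons of the object, the planar body, and the covering polygons.
def zbuffer_render(fragments, width, height, background=(0, 0, 0)):
    depth = [[float("inf")] * width for _ in range(height)]
    image = [[background] * width for _ in range(height)]
    for x, y, z, color in fragments:
        if 0 <= x < width and 0 <= y < height and z < depth[y][x]:
            depth[y][x] = z      # keep only the polygon closest to the viewpoint
            image[y][x] = color  # paint with its texture/transparency colour
    return image
```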
- Since collision determination processing is performed on the object, the object and the planar body are arranged so as not to overlap; that is, a state in which the object "sinks" into the planar body does not occur.
- On the other hand, the covering polygons and the planar body may be arranged so as to overlap; that is, as seen from the viewpoint, a projection polygon may be arranged "sunk into" the surface of the planar body.
- If a covering polygon is embedded in the planar body in this way, the projection polygon would be hidden behind the planar body and not drawn.
- Therefore, the generation unit may draw the planar body, the projection polygon, and the object in this order as seen from the viewpoint, thereby generating the two-dimensional image of the virtual space.
- When a certain covering polygon and the planar body are arranged so as to be in contact with each other, that covering polygon satisfies the predetermined drawing condition. That is, the condition for a covering polygon to be drawn as a projection polygon is that it is in contact with a polygon constituting the shape of the planar body. Whether a certain covering polygon and the planar body are in contact is determined, for example, by whether the two intersect.
- When a certain covering polygon is included in the planar body, that covering polygon may also satisfy the predetermined drawing condition. That is, a covering polygon that is contained in the planar body is also drawn as a projection polygon.
- Alternatively, among the covering polygons that are in contact with the planar body or included in it, only the covering polygon whose inclination with respect to the surface of the planar body is smallest may satisfy the predetermined drawing condition. That is, the selection unit adopts as the projection polygon the covering polygon that is closest in orientation to the surface of the planar body, regarding it as the covering polygon that most strongly affects that surface. As a result, a shadow-like expression is realized without a complicated calculation that takes the position of the light source into account.
- The storage unit may further store the position of a light source, and the generation unit may move the projection polygon by a predetermined distance so as to move away from the light source, and then draw the object and the planar body as being illuminated by the light source. That is, by moving the projection polygon in accordance with the position of the light source, the influence of the light source position is reflected in the projection polygon.
- When an intersection line exists between the plane including the projection polygon and the plane including the surface of the planar body, the generation unit may rotate the projection polygon about that intersection line; when no such intersection line exists, the generation unit may move the projection polygon along the normal of the two planes. In either case, the projection polygon is placed on the surface of the planar body before being drawn.
- In this case, even if the generation unit does not draw in the order of planar body, projection polygon, and object as described above, the projection polygon is drawn without being hidden by the planar body.
- A covering polygon that is closer to the viewpoint than the object may be deemed not to satisfy the predetermined drawing condition regardless of its arrangement with respect to the planar body. That is, in the image processing apparatus of the present invention, even if such a covering polygon is in contact with or included in the planar body, a covering polygon lying between the viewpoint and the object is not selected as a projection polygon. As a result, drawing is performed so that the shadow is always displayed behind the object (further back) as seen from the viewpoint.
- An image processing method according to another aspect of the present invention is executed by an image processing apparatus including a storage unit, a selection unit, and a generation unit, and generates an image representing an object arranged in a virtual space, a planar body, and the image of the object projected onto the planar body.
- The storage unit stores the position of the viewpoint and the direction of the line of sight arranged in the virtual space, the shape, position, and orientation of the object and the planar body arranged in the virtual space, and the shape, position, and orientation of a plurality of covering polygons arranged so as to cover the object.
- In a selection step, the selection unit selects, from among the plurality of covering polygons, those satisfying a predetermined drawing condition as projection polygons.
- In a generation step, the generation unit generates an image of the virtual space as viewed from the viewpoint in the direction of the line of sight, based on the shape, position, and orientation of the object and the planar body, the position of the viewpoint and the direction of the line of sight, and the shape, position, and orientation of the projection polygons.
- When a certain covering polygon and the planar body are arranged so as to be in contact with each other, that covering polygon satisfies the predetermined drawing condition.
- a program according to another aspect of the present invention is configured to cause a computer to function as the above-described image processing apparatus.
- the program of the present invention can be recorded on a computer-readable information recording medium such as a compact disk, flexible disk, hard disk, magneto-optical disk, digital video disk, magnetic tape, and semiconductor memory.
- the above program can be distributed and sold via a computer communication network independently of the computer on which the program is executed.
- the information recording medium can be distributed and sold independently of the computer.
- As described above, the present invention can provide an image processing apparatus, an image processing method, an information recording medium, and a program suitable for easily drawing an image, such as a shadow, that appears when an object in a virtual space approaches a floor or a wall in a game or the like.
- FIG. 1 is a schematic diagram showing a schematic configuration of a typical game device in which an image processing device according to one embodiment of the present invention is realized.
- The game apparatus 100 includes a CPU (Central Processing Unit) 101, a ROM (Read Only Memory) 102, a RAM (Random Access Memory) 103, an interface 104, a controller 105, an external memory 106, an image processing unit 107, a DVD (Digital Versatile Disk)-ROM drive 108, an NIC (Network Interface Card) 109, and an audio processing unit 110.
- A DVD-ROM storing the game program and data is loaded into the DVD-ROM drive 108, and the game apparatus 100 is turned on to execute the program, whereby the image processing apparatus of the present embodiment is realized.
- the CPU 101 controls the overall operation of the game apparatus 100 and is connected to each component to exchange control signals and data.
- the ROM 102 stores an IPL (Initial Program Loader) that is executed immediately after the power is turned on. When the CPU 101 executes this IPL, the program recorded on the DVD-ROM is read to the RAM 103, and execution by the CPU 101 is started.
- the ROM 102 stores an operating system program and various data necessary for operation control of the entire game apparatus 100.
- the RAM 103 is for temporarily storing data and programs, and holds programs and data read from the DVD-ROM and other data necessary for game progress and chat communication.
- the controller 105 connected via the interface 104 receives an operation input performed when the user executes the game.
- chat communication log data is rewritably stored in the external memory 106 detachably connected via the interface 104.
- the user can store these data in the external memory 106 as appropriate by inputting an instruction via the controller 105.
- the DVD-ROM loaded in the DVD-ROM drive 108 stores a program for realizing a game and image data and sound data associated with the game as described above. Under the control of the CPU 101, the DVD-ROM drive 108 performs a reading process on the DVD-ROM mounted on the DVD-ROM drive 108 to read out necessary programs and data. The read data is temporarily stored in the RAM 103 or the like.
- The image processing unit 107 processes the data read from the DVD-ROM using the CPU 101 or an image arithmetic processor (not shown) included in the image processing unit 107, and records the processed data in a frame memory (not shown).
- the image information recorded in the frame memory is converted into a video signal at a predetermined synchronization timing and output to a monitor (not shown) connected to the image processing unit 107. Thereby, various image displays are possible.
- The image arithmetic processor can execute two-dimensional image overlay calculations, transparency calculations such as α blending, and various saturation calculations at high speed.
- It can also rapidly execute an operation in which polygon information arranged in the three-dimensional virtual space and annotated with various texture information is rendered by the Z-buffer method, yielding a rendered image of the polygons in the three-dimensional virtual space as seen from a predetermined viewpoint position.
- Furthermore, the CPU 101 and the image arithmetic processor operate in a coordinated manner, so that character strings can be drawn as two-dimensional images in the frame memory or drawn on the surfaces of polygons according to font information that defines the character shapes.
- the font information is recorded in the ROM 102, but dedicated font information recorded in the DVD-ROM can also be used.
- the NIC 109 is for connecting the game apparatus 100 to a computer communication network (not shown) such as the Internet.
- The NIC 109 is composed of, for example, a device compliant with the 10BASE-T/100BASE-T standard used when configuring a LAN (Local Area Network), an analog modem for connecting to the Internet using a telephone line, an ISDN (Integrated Services Digital Network) modem, an ADSL (Asymmetric Digital Subscriber Line) modem, or a cable modem for connecting to the Internet using a cable television line, together with an interface (not shown) that connects these to the CPU 101.
- The current date and time can be obtained by connecting to an SNTP server on the Internet via the NIC 109 and acquiring the information from it.
- Various network game server devices may also be configured to perform the same function as an SNTP server.
- the audio processing unit 110 converts audio data read from the DVD-ROM into an analog audio signal and outputs it from a speaker (not shown) connected thereto. Further, under the control of the CPU 101, sound effects and music data to be generated during the progress of the game are generated, and sound corresponding to this is output from the speaker.
- The game apparatus 100 may also be configured to use a large-capacity external storage device such as a hard disk so that it performs the same functions as the ROM 102, the RAM 103, the DVD-ROM loaded in the DVD-ROM drive 108, and so on.
- the image processing apparatus 200 is realized on the game apparatus 100 or a portable game apparatus, but can also be realized on a general computer.
- A general computer, like the game apparatus 100, includes a CPU, RAM, ROM, a DVD-ROM drive, and an NIC, includes an image processing unit with simpler functions than that of the game apparatus 100, has a hard disk as an external storage device, and can also use a flexible disk, a magneto-optical disk, a magnetic tape, and the like.
- A keyboard and a mouse are used as input devices instead of a controller. After the program is installed, executing the program causes the computer to function as the image processing apparatus.
- In the following, the image processing apparatus will be described using the game apparatus 100 shown in FIG. 1.
- The elements of the image processing apparatus can be replaced with elements of a general computer as necessary, and such embodiments are also included in the scope of the present invention.
- FIG. 2 is a schematic diagram showing a schematic configuration of the image processing apparatus 200 according to the present embodiment.
- the image processing apparatus 200 is an apparatus that projects the shadow of an object on a planar body, and includes a storage unit 201, a selection unit 202, a generation unit 203, and the like as illustrated in FIG.
- each component of the image processing apparatus 200 will be described with reference to this drawing.
- the storage unit 201 stores various information for expressing the inside of the three-dimensional virtual space.
- the storage unit 201 stores shape information of each element (also referred to as an object or a model) in the three-dimensional virtual space.
- Each object is expressed as a surface defined by a combination of small polygons (for example, triangles or quadrangles), called polygons.
- the storage unit 201 stores the shape, position, and orientation of each object in the virtual space.
- The storage unit 201 manages these using a global coordinate system (world coordinate system) representing the entire virtual space and a local coordinate system fixed to each object. The surface shape of an object (that is, the shapes of the polygons constituting the object and the positions at which they are arranged) is defined with reference to a representative point of the object (for example, its center of gravity).
- the position of each object is determined, for example, by defining a representative point of the object based on the global coordinate system.
- the direction of the object is defined by, for example, the amount of rotation of a direction vector extending from the representative point of the object in the front direction of the object.
- The position information may be defined using an orthogonal coordinate system, or may be represented as (r, θ, φ) using a polar coordinate system with one radial distance and two angles.
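- As a small hedged illustration (the axis convention below is an assumption, not specified in the text), converting such a polar-coordinate position to orthogonal coordinates could look like this:

```python
import math

# Hypothetical helper: convert a polar position (r, theta, phi) to orthogonal (x, y, z),
# with theta measured from the +z axis and phi measured in the x-y plane.
def polar_to_cartesian(r, theta, phi):
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.sin(theta) * math.sin(phi)
    z = r * math.cos(theta)
    return (x, y, z)
```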
- the storage unit 201 stores information on the shape, position, orientation, and the like of a planar body such as a floor on which the shadow of the object is projected.
- the storage unit 201 stores the shape, orientation, and position of the covering polygon arranged so as to cover the object.
- the covering polygon is fixed with respect to the local coordinate system of the object covered by the covering polygon, and its shape, position, and orientation are defined. Therefore, if the object moves, the covering polygon also follows.
- FIG. 3 shows an example in which a plurality of covered polygons 320 are arranged in a rectangular parallelepiped shape so as to cover the cylindrical object 310.
- FIG. 3A is a perspective view of the object 310 and a plurality of covered polygons 320 corresponding to the object viewed obliquely from above
- FIG. 3B is a view seen from above
- FIG. 3C is a view seen from the side.
- six square covered polygons 320 are arranged at right angles to each other.
- However, the shape and number of the covering polygons are not limited to this; covering polygons of other polygonal shapes may be used as needed to cover the object.
- Typically, the polyhedron formed by the covering polygons is a convex polyhedron.
- the covering polygons are preferably arranged at a predetermined distance from the corresponding object, but they may be arranged adjacent to each other.
- the position and orientation of the viewpoint and the projection plane are also defined using the global coordinate system or the local coordinate system as described above.
- the position and orientation of the projection surface are updated when the position and orientation of the virtual camera are changed by a user operation or the like.
- the storage unit 201 also stores image data called texture to be pasted on the surface of each object, the position of the light source that illuminates the virtual space, and the like. By pasting the texture information on the surface shape of the object, it is possible to express the texture of the object.
- the storage unit 201 also stores transparency information for representing a shadow in association with each covering polygon.
- The transparency information is defined so as to be most opaque at the center of the covering polygon (that is, not to transmit light) and to become gradually transparent (that is, to transmit light) as the distance from the center increases. Typically, a dark color such as black is specified for the opaque portion in order to represent the shadow.
- FIG. 3D shows an example of the transparency information included in the covering polygon.
- the black part represents the opaque part
- the white part represents the transparent part.
- a square is specified around the transparency information, but this square is not actually drawn.
- the opaque portion at the center is circular, but a shape other than circular may be used according to the shape of the object.
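- A minimal sketch of transparency information of this kind (opaque at the center, fading outward) follows; the resolution and falloff are illustrative choices, not values given in the text.

```python
import math

# Build a square alpha map that is opaque (1.0) at the centre and fades to fully
# transparent (0.0) toward the edges, as described for the covering polygon.
def make_shadow_alpha(size=64, radius=0.45):
    centre = (size - 1) / 2.0
    alpha = []
    for y in range(size):
        row = []
        for x in range(size):
            d = math.hypot((x - centre) / size, (y - centre) / size)
            row.append(max(0.0, 1.0 - d / radius))
        alpha.append(row)
    return alpha
```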
- the information stored in the storage unit 201 is stored in advance in, for example, a DVD-ROM, and the CPU 101 reads out the DVD-ROM loaded in the DVD-ROM drive 108 and temporarily stores it in the RAM 103.
- the information stored in the external memory 106 may be read by the CPU 101 and temporarily stored in the RAM 103.
- the CPU 101 can update the temporarily stored information as needed, for example, according to the progress of the game.
- the CPU 101, the RAM 103, the DVD-ROM drive 108, and the like operate in cooperation to function as the storage unit 201.
- the selection unit 202 selects a plurality of covered polygons that satisfy a predetermined drawing condition as projection polygons.
- the predetermined drawing conditions will be described in detail in the description of the operation processing of the image processing apparatus.
- the selection unit 202 selects a projection polygon based on the shape, position, and orientation of the planar body and the covering polygon.
- In the present embodiment, the CPU 101 and the RAM 103 work together to function as the selection unit 202. Note that the series of processes performed to select a projection polygon can be realized using arithmetic functions for image processing provided by a graphics library or by dedicated hardware, so that the processing speed for drawing shadows can be improved.
- The generation unit 203 generates image data in which each object in the three-dimensional virtual space is projected onto the projection plane as seen from the viewpoint, based on the positions and orientations of the objects stored in the storage unit 201, the position of the viewpoint, and the position and orientation of the projection plane; this image data is displayed on a display device such as a monitor. Further, by drawing the projection polygons selected by the selection unit 202, the shadow of each object is generated on the planar body.
- In the present embodiment, perspective projection is performed by the one-point perspective method.
- the CPU 101, the RAM 103, and the image processing unit 107 work together to function as the generation unit 203.
- In the present embodiment, the game character moves along a wide surface such as a floor or a wall, and the shadow of the game character, which moves at high speed, is drawn on a planar body such as the floor or wall. Typically, the movement range of the game character is restricted, so the case where the character reaches the edge of the planar body and partially protrudes beyond it is not considered here.
- The present invention can also be applied when the character can move to the edge of the planar body. In that case, the shadow is projected in a form that differs from reality, but since it is rare for the character to stay at the edge for a long time, the user is unlikely to feel that anything is wrong.
- When the image processing apparatus is turned on and starts processing, necessary information (for example, the position and orientation of the virtual camera and the shapes, positions, and orientations of the objects) is read into the RAM 103 and the storage unit 201 is initialized (step S11).
- Using the controller 105, the user can instruct changes to parameters such as the position of the virtual camera (the viewpoint), the direction of the virtual camera (the line-of-sight direction), the shooting magnification (zoom ratio) of the virtual camera, and the position, orientation, and movement of an object.
- The CPU 101 determines whether or not the user has input an instruction to change the parameters related to the virtual camera (step S12). When there is an instruction input (step S12; Y), the CPU 101 updates the position and orientation of the virtual camera in the storage unit 201 according to the instruction (step S13). Furthermore, according to the updated position, orientation, and zoom ratio of the virtual camera, the CPU 101 calculates the position and orientation at which the projection plane, onto which the virtual space seen from the viewpoint in the line-of-sight direction is projected, is arranged in the virtual space (step S14).
- The CPU 101 calculates a direction perpendicular to the line-of-sight vector (the vector representing the direction of the line of sight) starting from the viewpoint, and sets this as the orientation of the projection plane. That is, the line-of-sight vector passes through the center of the projection plane and coincides with the normal vector of the projection plane. When zooming in, the projection plane is translated so as to approach the subject being photographed in the three-dimensional space (away from the viewpoint), and when zooming out, it is translated away from the subject (toward the viewpoint).
- When the direction of the line-of-sight vector is changed (that is, when the virtual camera is panned), the orientation of the projection plane is also changed according to the new direction of the line-of-sight vector. If the user does not input an instruction to change the parameters related to the virtual camera (step S12; N), the process proceeds to step S15.
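- As a hedged sketch of steps S13-S14 (the vector helpers and the zoom-to-distance rule are assumptions made for the example), updating the projection plane from the camera parameters could look like this:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

# Derive the projection plane from the virtual camera: the plane's normal is the
# line-of-sight vector, and zooming in moves the plane away from the viewpoint.
def update_projection_plane(eye, gaze, zoom, base_distance=1.0):
    forward = normalize(gaze)                       # unit line-of-sight vector
    distance = base_distance * zoom                 # assumed mapping from zoom ratio
    centre = tuple(e + distance * f for e, f in zip(eye, forward))
    return centre, forward                          # point on plane, plane normal
```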
- the CPU 101 determines whether or not there is an instruction input regarding the position and orientation of the object from the user (step S15).
- If there is an instruction input (step S15; Y), the CPU 101 translates and rotates the object in the three-dimensional space based on the instruction input, and updates the position and orientation of the object stored in the storage unit 201 (step S16). If there is a covering polygon associated with the object, the covering polygon is fixed to the object, so the position and orientation of the covering polygon are also recalculated in response to the instruction input for the movement of the object (step S17).
- If there is no such instruction input (step S15; N), these updates are skipped.
- parameters such as the position and orientation of the virtual camera, the shooting magnification, the position and orientation of the object may be given from a control program or the like.
- the parameter may be changed to a predetermined value in association with the passage of time or may be changed randomly.
- the selection unit 202 selects, for each object for which a shadow is to be drawn, a projection polygon to be drawn as a shadow from among a plurality of covered polygons covering the object (step S18). Processing for selecting the projection polygon by the selection unit 202 will be described with reference to FIG. 5A.
- First, the selection unit 202 extracts, from the vertices of the covering polygons covering the object, those vertices that are buried in the planar body (step S181). Whether a vertex of a covering polygon is buried in the planar body is determined by comparing the direction of the vector extending perpendicularly from the vertex toward the surface constituting the planar body with the direction of the normal vector of that surface. If the two vectors point in the same direction, the vertex is buried in the planar body; if they point in opposite directions, it is not.
- FIG. 6A is a diagram showing a state in which the object 310 and the covering polygon 320 shown in FIG. 3 are viewed from the side (that is, the same diagram as FIG. 3C).
- In this example, a part of the covering polygons is buried beneath the surface A of the planar body 430.
- To determine whether the vertex X1 is buried, the direction of the vector v1 extending perpendicularly from the vertex X1 toward the surface A is compared with the direction of the normal vector v2 of the surface A.
- In FIG. 6A, since the directions of both vectors are the same, it is determined that the vertex X1 is buried in the planar body 430. This process is performed for all vertices, and the buried vertices are extracted.
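- A brief sketch of this vertex test (step S181) follows; representing the surface A by a point on it and its outward unit normal is an assumption made for the example.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# A vertex is buried when the perpendicular from the vertex toward the surface points
# in the same direction as the outward normal, i.e. the vertex lies below the surface.
def is_buried(vertex, plane_point, plane_normal):
    signed = dot(tuple(v - p for v, p in zip(vertex, plane_point)), plane_normal)
    return signed < 0
```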
- the selection unit 202 selects all covered polygons that are in contact with the vertices extracted in step S181 as projection polygon candidates (step S182).
- FIG. 6B is a view of the object 310 shown in FIG. 6A as viewed from obliquely above.
- the vertices embedded in the planar body are determined to be X1, X2, X3, and X4.
- As shown in FIG. 6C, when the object 310 shown in FIG. 6A is viewed from above, the covering polygons 320A, 320B, 320C, and 320D, which are in contact with the vertices X1, X2, X3, and X4, and the polygon 320E, which forms the bottom surface, are selected as covering polygons in contact with the planar body.
- Next, from among the covering polygons selected in step S182, the selection unit 202 selects those located farther than the object as seen from the viewpoint in the line-of-sight direction (step S183).
- A covering polygon farther than the object can be selected by performing the following processing on each covering polygon: take the outward normal vector of the covering polygon's surface and the direction vector from the viewpoint to the representative point (for example, the center of gravity) of the covering polygon, calculate the angle between them using the inner product or the like, and regard the covering polygon as being farther than the object if the angle is within 90 degrees.
- Since the covering polygons are arranged so as to cover the corresponding object, a covering polygon whose outward normal points in a direction close to the direction vector from the viewpoint to its representative point is a covering polygon that lies behind the object.
- Alternatively, when the distance between the viewpoint and the representative point (for example, the center of gravity) of the object is smaller than the distance between the viewpoint and the representative point (for example, the center of gravity) of a covering polygon, that covering polygon may be regarded as being behind the object as seen from the viewpoint.
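- Both criteria are simple vector computations; a hedged sketch follows (the helper names are assumptions made for the example).

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Step S183, normal-based test: the covering polygon is behind the object when the angle
# between its outward normal and the viewpoint-to-centroid direction is within 90 degrees.
def is_behind(eye, poly_centroid, poly_outward_normal):
    to_poly = tuple(c - e for c, e in zip(poly_centroid, eye))
    return dot(to_poly, poly_outward_normal) > 0.0

# Alternative distance-based test mentioned in the text.
def is_behind_by_distance(eye, object_centroid, poly_centroid):
    return math.dist(eye, object_centroid) < math.dist(eye, poly_centroid)
```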
- As an example, consider the case where the object 310 is viewed from the direction of the viewpoint 440 (that is, viewed from the obliquely upward direction in FIG. 6C).
- In this case, the selection unit 202 determines that the covering polygon 320A is behind the object 310.
- Excluding the covering polygon 320C, it is determined that the covering polygon 320D (the covering polygon facing toward the reader in FIG. 6D), the covering polygon 320B arranged opposite 320D, and the covering polygon 320E arranged on the bottom surface are also arranged behind the object 310.
- The selection unit 202 then selects, from among the covering polygons selected in this way, the one covering polygon whose inclination is closest to that of the surface of the planar body, as the projection polygon (step S184). This selection uses, for example, the normal vectors of the covering polygons calculated in step S183.
- Specifically, the reverse of the normal vector of each covering polygon is compared with the normal vector of the planar body, and the covering polygon for which the directions of the two vectors are closest (that is, for which their inner product is closest to 1) is selected.
- As shown in FIG. 6D, it is determined that, among the covering polygons 320A, 320B, 320D, and 320E, the polygon 320E has the inclination closest to that of the surface of the planar body 430.
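- A compact sketch of step S184 (assuming unit-length normals and a simple candidate list) could be:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Pick the covering polygon whose reversed outward normal is most closely aligned with
# the planar body's normal (inner product closest to 1).
def select_projection_polygon(candidates, plane_normal):
    # candidates: list of (polygon_id, outward_unit_normal)
    best = max(candidates, key=lambda c: dot(tuple(-n for n in c[1]), plane_normal))
    return best[0]
```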
- steps S181 to S184 are performed on all objects for which a shadow is to be drawn, and a projection polygon is selected. Then, the selected projection polygon is stored in the RAM 103 of the image processing apparatus 200 in association with the corresponding object.
- the generation unit 203 performs the following steps S19 and S20 on all objects in the virtual space and all selected projection polygons, and draws a two-dimensional image of the virtual space.
- First, the generation unit 203 obtains the projection destination areas of the objects and of the selected projection polygons (step S19).
- In the present embodiment, as described above, each object is perspectively projected onto the projection plane by the one-point perspective method, so an object far from the viewpoint is projected small and a nearby object is projected large.
- However, parallel projection can be adopted instead of the one-point perspective method.
- When the projection destinations of the objects have been obtained, the corresponding area of the corresponding texture (for a projection polygon, the transparency information at the projection destination) is pasted (mapped) onto each projection destination area and drawn (step S20). The detailed processing of step S20 is shown in the flowchart of FIG. 5B.
- The generation unit 203 refers to the RAM 103 of the image processing apparatus 200 and the like, and determines whether or not a projection polygon has been selected for the object currently being processed (step S201). For an object for which no projection polygon has been selected (step S201; N), the generation unit 203 performs hidden surface processing using, for example, the Z-buffer method (step S202). In other words, for each pixel of the image data to be drawn, the generation unit 203 paints the pixel with the color of the texture information (or transparency information) of the polygon closest to the viewpoint (the projection plane).
- However, as in FIG. 6A, the projection polygon 320E is arranged below the surface of the planar body 430 as seen from the viewpoint. Therefore, if the hidden surface processing described above were simply performed, the projection polygon 320E would be completely hidden by the planar body 430. Likewise, when part of a projection polygon is arranged "sunk into" the planar body, the sunk portion would be hidden by the planar body.
- For an object for which a projection polygon has been selected, the generation unit 203 therefore first draws the planar body as seen from the viewpoint in the direction of the line of sight, then draws the projection polygon as seen from the viewpoint in the direction of the line of sight, and finally draws the object as seen from the viewpoint in the direction of the line of sight (step S203). In this way, drawing is always performed in the order of planar body, projection polygon, object, so the projection polygon is prevented from being hidden by the planar body.
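- A minimal sketch of this fixed drawing order (step S203) follows; the draw_* callbacks are assumed renderer hooks, not functions named in the text.

```python
# Draw in the fixed order planar body -> projection polygon -> object so that a
# projection polygon sunk into the planar body is not hidden by it.
def draw_with_shadow(draw_planar_body, draw_projection_polygon, draw_object,
                     projection_polygon=None):
    draw_planar_body()
    if projection_polygon is not None:
        draw_projection_polygon(projection_polygon)
    draw_object()
```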
- The generation unit 203 then waits until a vertical synchronization interrupt occurs (step S21). While waiting, other processing (for example, processing that updates the positions and orientations of the objects and the virtual camera stored in the RAM 103 based on the passage of time or on input from the user) may be executed as a coroutine.
- When the vertical synchronization interrupt occurs, the generation unit 203 transfers the contents of the drawn image data (usually stored in a frame buffer) to a monitor (not shown), displays the image (step S22), and the process returns to step S12.
- In this way, the covering polygon on the bottom surface is drawn as a projection polygon, and when viewed from the viewpoint direction a shadow such as that shown in FIG. 7 is obtained. That is, since the central portion of the transparency information is opaque, a shadow-like expression is realized.
- In the transparent portions of the projection polygon, the planar body drawn before it shows through. Note that the outer periphery of the projection polygon is not actually drawn; in this figure, the square forming the outer periphery of the projection polygon is shown explicitly for ease of understanding.
- In the first embodiment, taking into account that some covering polygons are arranged "sunk into" the planar body, the generation unit 203 looks at the virtual space from the viewpoint in the direction of the line of sight and draws in the order of planar body, covering polygon, object, so that the covering polygon is always drawn without being hidden by the planar body. In the present embodiment, the generation unit 203 instead changes the position of the covering polygon so that the covering polygon is displayed along the planar body.
- The configuration of the image processing apparatus according to the second embodiment is the same as that of the image processing apparatus 200 shown in FIG. 2, and its description is therefore omitted.
- FIG. 8 shows the flow of processing of the present embodiment.
- The image processing apparatus 200 performs the same processing as steps S11 to S18, S21, and S22 shown in FIG. 4 (steps S71 to S78, S82, and S83).
- In the present embodiment, the generation unit 203 of the image processing apparatus 200 additionally performs step S79, in which the covering polygon selected as the projection polygon is moved so as to lie along the planar body.
- FIG. 9A is a diagram illustrating a state in which the object 310 and a plurality of covered polygons 320 corresponding to the object 310 are viewed from the side, as in FIG. 6A.
- the bottom covering polygon 320E is selected as the projection polygon.
- the planar body 900 intersects with the covering polygon 320E at a line 910.
- FIG. 9B shows a state where a portion where the covering polygon 320E and the planar body 900 intersect is viewed from an oblique direction.
- FIG. 9B the plane including the projection polygon 320E and the surface of the planar body 900 intersect at a line 910.
- The generation unit 203 first rotates the projection polygon 320E about the line 910, at which it intersects the planar body 900, so that it becomes parallel to the surface of the planar body 900.
- FIG. 9C shows the projection polygon 320E′ obtained by rotating the projection polygon 320E about the line 910 in the direction of the arrow so that it becomes parallel to the planar body 900.
- Next, the projection polygon is translated by a predetermined distance (a distance at which it does not collide with the corresponding object 310) along the direction of the normal vector v of the surface of the planar body 900 (the projection polygon after this movement is denoted 320E″).
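- A rough sketch of step S79 is given below; the Rodrigues-style rotation helper and the small lift-off offset are assumptions for the example, and the sign of the rotation angle would in practice have to be chosen so that the polygon ends up on the correct side of the surface.

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(c / n for c in v)

# Rodrigues rotation of point p about a line through `origin` with unit direction `axis`.
def rotate_about_axis(p, origin, axis, angle):
    v = tuple(pi - oi for pi, oi in zip(p, origin))
    c, s = math.cos(angle), math.sin(angle)
    rotated = tuple(c * vi + s * ci + dot(axis, v) * (1 - c) * ai
                    for vi, ci, ai in zip(v, cross(axis, v), axis))
    return tuple(oi + ri for oi, ri in zip(origin, rotated))

# Rotate the projection polygon about its intersection line with the planar body until it
# is parallel to the surface, then slide it a small offset along the surface normal.
def flatten_onto_plane(vertices, poly_normal, plane_normal, line_point, line_dir, offset=0.01):
    angle = math.acos(max(-1.0, min(1.0, dot(normalize(poly_normal), normalize(plane_normal)))))
    axis = normalize(line_dir)
    n = normalize(plane_normal)
    rotated = [rotate_about_axis(v, line_point, axis, angle) for v in vertices]
    return [tuple(vi + offset * ni for vi, ni in zip(v, n)) for v in rotated]
```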
- In this way, unlike the first embodiment, it is not necessary to draw in the order of planar body, projection polygon, and object as seen from the viewpoint in the line-of-sight direction.
- The projection destination areas of the objects and of the covering polygons are obtained (step S80).
- The corresponding texture and transparency information are then pasted, and hidden surface processing such as the Z-buffer method is performed for drawing (step S81), so that the covering polygon is drawn without being hidden by the planar body.
- When an intersection line between the plane including the projection polygon and the surface of the planar body exists, the projection polygon is rotated about that line as described above. When no such intersection line exists (that is, when the projection polygon is already parallel to the surface of the planar body), the generation unit 203 does not rotate the projection polygon and only translates it by a predetermined distance (a distance at which it does not hit the corresponding object) along the direction of the normal vector of the surface of the planar body.
- processing for selecting a projection polygon is realized by using an arithmetic function for image processing provided by a graphics library or dedicated hardware. Therefore, it is possible to shorten the game development period and improve the processing speed for drawing shadows.
- In the above description, a planar body is present below the object.
- The present invention is also effective when a planar body is present to the side of or above the object.
- When a plurality of planar bodies are present, the present embodiment may be applied to each planar body.
- the present invention is not limited to the above-described embodiment, and various modifications and applications are possible. Moreover, it is also possible to freely combine the components of the above-described embodiments.
- the projection polygon may be moved so as to move away from the light source in consideration of the position of the light source.
- the projection polygon 1010 corresponding to the object 1000 in FIG. 10A may be moved away from the light source as shown in FIG. 10B.
- In this case, the position and orientation of the light source are stored in the storage unit 201, and after step S18 in FIG. 4, or after the position of the projection polygon has been determined in step S79 in FIG. 8, the projection polygon is further moved as follows based on the position of the light source. That is, as shown in FIG. 10C, let P1 be the point at which the line L, extending perpendicularly from the light source 1030 to the plane 1020 including the projection polygon 1010, intersects that plane, and let P2 be a representative point of the projection polygon (for example, its center of gravity); the projection polygon may then be translated by a predetermined distance along the direction vector V connecting P1 to P2.
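- A hedged sketch of this adjustment follows; the step distance is an assumed parameter, and the sketch assumes the light source is not directly above the polygon's representative point.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(c / n for c in v)

# Move the projection polygon away from the light source along the direction from P1
# (foot of the perpendicular from the light onto the polygon's plane) to P2 (its centroid).
def shift_away_from_light(vertices, centroid, plane_normal, light_pos, distance=0.1):
    n = normalize(plane_normal)
    d = dot(tuple(l - c for l, c in zip(light_pos, centroid)), n)
    p1 = tuple(l - d * ni for l, ni in zip(light_pos, n))          # point P1 on the plane
    v = normalize(tuple(c - p for c, p in zip(centroid, p1)))      # direction P1 -> P2
    return [tuple(x + distance * vi for x, vi in zip(vert, v)) for vert in vertices]
```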
- In the above embodiments, the projection polygon is used to draw a shadow, but it can also be used for other purposes.
- For example, it is possible to express the appearance of a reflection or mirror image on the surface of the planar body.
- In that case, a mirror image of the object as seen from the position of the covering polygon is pasted onto the shadow portion of the transparency information corresponding to the covering polygon.
- In the above embodiments, one covering polygon satisfying a predetermined condition is selected as the projection polygon.
- However, step S184 may be omitted, and a plurality of covering polygons satisfying the conditions up to step S183 may be selected as projection polygons.
- Further, in step S182, only the covering polygons that intersect the surface of the planar body may be selected, leaving covering polygons that are completely included in the planar body unselected.
- When this projection polygon selection process is applied to the object, covering polygons, and planar body shown in FIG., a plurality of covering polygons are selected as projection polygons.
- The selected projection polygons may be drawn as they are.
- Alternatively, the second embodiment may be applied, and they may be drawn after being moved so as to lie along the planar body. In this way, for example, a plurality of shadows are drawn as if the radiosity method had been used.
- As the shape of the planar body, not only a flat surface but also a curved surface such as a spherical surface can be used.
- In that case, a plane tangent to the curved surface in the vicinity of the object is considered, and that plane is treated as the "planar body" in the invention described above.
- For example, when the curved surface is a spherical surface, a tangent plane that is perpendicular to the straight line connecting the center of the sphere and the center of gravity of the object, and that touches the spherical surface, can be obtained.
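- A short sketch of that tangent-plane construction (helper names are assumptions made for the example):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(c / n for c in v)

# For a spherical "planar body": the tangent plane touches the sphere on the line from the
# sphere centre toward the object's centre of gravity; its normal is that line's direction.
def tangent_plane(sphere_centre, sphere_radius, object_centroid):
    n = normalize(tuple(o - c for o, c in zip(object_centroid, sphere_centre)))
    point = tuple(c + sphere_radius * ni for c, ni in zip(sphere_centre, n))
    return point, n
```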
Description
101 CPU
102 ROM
103 RAM
104 Interface
105 Controller
106 External memory
107 Image processing unit
108 DVD-ROM drive
109 NIC
110 Audio processing unit
200 Image processing apparatus
201 Storage unit
202 Selection unit
203 Generation unit
Claims (10)
- 1. An image processing apparatus (200) that generates an image representing an object arranged in a virtual space, a planar body, and the image of the object projected onto the planar body, comprising: a storage unit (201) that stores the position of a viewpoint and the direction of a line of sight arranged in the virtual space, the shapes, positions, and orientations of the object and the planar body arranged in the virtual space, and the shapes, positions, and orientations of a plurality of covering polygons arranged so as to cover the object; a selection unit (202) that selects, from among the plurality of covering polygons, those satisfying a predetermined drawing condition as projection polygons; and a generation unit (203) that generates an image of the virtual space as viewed from the viewpoint in the direction of the line of sight, based on the shapes, positions, and orientations of the object and the planar body, the position of the viewpoint and the direction of the line of sight, and the shapes, positions, and orientations of the projection polygons, wherein, when a certain covering polygon and the planar body are arranged so as to be in contact with each other, that covering polygon satisfies the predetermined drawing condition.
- 2. The image processing apparatus (200) according to claim 1, wherein the generation unit (203) generates the image of the virtual space as viewed from the viewpoint in the direction of the line of sight by drawing the planar body as viewed from the viewpoint in the direction of the line of sight, then drawing the projection polygons as viewed from the viewpoint in the direction of the line of sight, and then drawing the object as viewed from the viewpoint in the direction of the line of sight, based on the shapes, positions, and orientations of the object and the planar body, the position of the viewpoint and the direction of the line of sight, and the shapes, positions, and orientations of the projection polygons.
- 3. The image processing apparatus (200) according to claim 1, wherein, when a certain covering polygon and the planar body are arranged such that the covering polygon is included in the planar body, that covering polygon satisfies the predetermined drawing condition.
- 4. The image processing apparatus (200) according to claim 3, wherein, among the covering polygons that are each arranged so as to be in contact with the planar body or to be included in the planar body, only the covering polygon whose inclination with respect to the surface of the planar body is smallest satisfies the predetermined drawing condition.
- 5. The image processing apparatus (200) according to claim 1, wherein the storage unit (201) further stores the position of a light source, and the generation unit (203) moves the projection polygon by a predetermined distance so as to move away from the light source and then performs drawing on the assumption that the object and the planar body are illuminated by the light source.
- 6. The image processing apparatus (200) according to claim 5, wherein, when an intersection line exists between the plane including the projection polygon and the plane including the surface of the planar body, the generation unit (203) rotates the projection polygon about the intersection line, and when no such intersection line exists, the generation unit moves the projection polygon along the normal of the two planes, so that the projection polygon is placed on the surface of the planar body before being drawn.
- 7. The image processing apparatus (200) according to claim 1, wherein, when a certain covering polygon is closer to the viewpoint than the object, that covering polygon does not satisfy the predetermined drawing condition regardless of its arrangement with respect to the planar body.
- 8. An image processing method for generating an image representing an object arranged in a virtual space, a planar body, and the image of the object projected onto the planar body, the method being executed by an image processing apparatus (200) comprising a storage unit (201), a selection unit (202), and a generation unit (203), wherein the storage unit (201) stores the position of a viewpoint and the direction of a line of sight arranged in the virtual space, the shapes, positions, and orientations of the object and the planar body arranged in the virtual space, and the shapes, positions, and orientations of a plurality of covering polygons arranged so as to cover the object, the method comprising: a selection step in which the selection unit (202) selects, from among the plurality of covering polygons, those satisfying a predetermined drawing condition as projection polygons; and a generation step in which the generation unit (203) generates an image of the virtual space as viewed from the viewpoint in the direction of the line of sight, based on the shapes, positions, and orientations of the object and the planar body, the position of the viewpoint and the direction of the line of sight, and the shapes, positions, and orientations of the projection polygons, wherein, when a certain covering polygon and the planar body are arranged so as to be in contact with each other, that covering polygon satisfies the predetermined drawing condition.
- 9. An information recording medium on which is recorded a program that causes a computer to function as an image processing apparatus (200) that generates an image representing an object arranged in a virtual space, a planar body, and the image of the object projected onto the planar body, the program causing the computer to function as: a storage unit (201) that stores the position of a viewpoint and the direction of a line of sight arranged in the virtual space, the shapes, positions, and orientations of the object and the planar body arranged in the virtual space, and the shapes, positions, and orientations of a plurality of covering polygons arranged so as to cover the object; a selection unit (202) that selects, from among the plurality of covering polygons, those satisfying a predetermined drawing condition as projection polygons; and a generation unit (203) that generates an image of the virtual space as viewed from the viewpoint in the direction of the line of sight, based on the shapes, positions, and orientations of the object and the planar body, the position of the viewpoint and the direction of the line of sight, and the shapes, positions, and orientations of the projection polygons, wherein, when a certain covering polygon and the planar body are arranged so as to be in contact with each other, that covering polygon satisfies the predetermined drawing condition.
- 10. A program that causes a computer to function as an image processing apparatus (200) that generates an image representing an object arranged in a virtual space, a planar body, and the image of the object projected onto the planar body, the program causing the computer to function as: a storage unit (201) that stores the position of a viewpoint and the direction of a line of sight arranged in the virtual space, the shapes, positions, and orientations of the object and the planar body arranged in the virtual space, and the shapes, positions, and orientations of a plurality of covering polygons arranged so as to cover the object; a selection unit (202) that selects, from among the plurality of covering polygons, those satisfying a predetermined drawing condition as projection polygons; and a generation unit (203) that generates an image of the virtual space as viewed from the viewpoint in the direction of the line of sight, based on the shapes, positions, and orientations of the object and the planar body, the position of the viewpoint and the direction of the line of sight, and the shapes, positions, and orientations of the projection polygons, wherein, when a certain covering polygon and the planar body are arranged so as to be in contact with each other, that covering polygon satisfies the predetermined drawing condition.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/812,428 US20100277474A1 (en) | 2008-01-11 | 2008-12-26 | Image processing device, image processing method, information recording medium, and program |
EP08869278A EP2234068A1 (en) | 2008-01-11 | 2008-12-26 | Image processing device, image processing method, information recording medium, and program |
CN2008801243592A CN101911127B (zh) | 2008-01-11 | 2008-12-26 | Image processing device and image processing method |
KR1020107014655A KR101146660B1 (ko) | 2008-01-11 | 2008-12-26 | Image processing apparatus, image processing method, and information recording medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-004260 | 2008-01-11 | ||
JP2008004260A JP4852555B2 (ja) | 2008-01-11 | 2008-01-11 | Image processing apparatus, image processing method, and program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2009087936A1 (ja) | 2009-07-16 |
Family
ID=40853058
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2008/073835 WO2009087936A1 (ja) | 2008-01-11 | 2008-12-26 | Image processing apparatus, image processing method, information recording medium, and program |
Country Status (7)
Country | Link |
---|---|
US (1) | US20100277474A1 (ja) |
EP (1) | EP2234068A1 (ja) |
JP (1) | JP4852555B2 (ja) |
KR (1) | KR101146660B1 (ja) |
CN (1) | CN101911127B (ja) |
TW (1) | TWI393069B (ja) |
WO (1) | WO2009087936A1 (ja) |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120249544A1 (en) | 2011-03-29 | 2012-10-04 | Giuliano Maciocci | Cloud storage of geotagged maps |
US9396581B2 (en) | 2013-10-18 | 2016-07-19 | Apple Inc. | Contact shadows in visual representations |
JP6708213B2 (ja) * | 2015-08-12 | 2020-06-10 | Sony Corporation | Image processing apparatus, image processing method, program, and image processing system |
KR102325297B1 (ko) * | 2015-11-09 | 2021-11-11 | SK Telecom Co., Ltd. | Method for automatically placing AR content |
CN106161956A (zh) * | 2016-08-16 | 2016-11-23 | Shenzhen Gionee Communication Equipment Co., Ltd. | Method and terminal for processing a preview image during shooting |
JP6897442B2 (ja) * | 2017-09-12 | 2021-06-30 | JVCKenwood Corporation | Vehicle device, calibration result determination system, calibration result determination method, and program |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004280596A (ja) * | 2003-03-17 | 2004-10-07 | Nintendo Co Ltd | Shadow volume generation program and game device |
JP2007007064A (ja) * | 2005-06-29 | 2007-01-18 | Konami Digital Entertainment:Kk | Network game system, game device, game device control method, and program |
JP2007195747A (ja) | 2006-01-26 | 2007-08-09 | Konami Digital Entertainment:Kk | Game device, game device control method, and program |
Also Published As
Publication number | Publication date |
---|---|
KR20100090719A (ko) | 2010-08-16 |
US20100277474A1 (en) | 2010-11-04 |
EP2234068A1 (en) | 2010-09-29 |
JP2009169508A (ja) | 2009-07-30 |
TW200947345A (en) | 2009-11-16 |
CN101911127A (zh) | 2010-12-08 |
TWI393069B (zh) | 2013-04-11 |
JP4852555B2 (ja) | 2012-01-11 |
CN101911127B (zh) | 2012-07-04 |
KR101146660B1 (ko) | 2012-05-23 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| WWE | WIPO information: entry into national phase | Ref document number: 200880124359.2; Country of ref document: CN
| 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 08869278; Country of ref document: EP; Kind code of ref document: A1
| ENP | Entry into the national phase | Ref document number: 20107014655; Country of ref document: KR; Kind code of ref document: A
| WWE | WIPO information: entry into national phase | Ref document number: 12812428; Country of ref document: US; Ref document number: 2008869278; Country of ref document: EP
| NENP | Non-entry into the national phase | Ref country code: DE