MXPA01008800A - Method and apparatus for processing images - Google Patents

Method and apparatus for processing images

Info

Publication number
MXPA01008800A
MXPA01008800A (application MXPA/A/2001/008800A)
Authority
MX
Mexico
Prior art keywords
shadow
plane
light source
object
coordinates
Prior art date
Application number
MXPA/A/2001/008800A
Other languages
Spanish (es)
Inventor
Akio Ohba
Original Assignee
Sony Computer Entertainment Inc
Priority date
Filing date
Publication date
Application filed by Sony Computer Entertainment Inc
Publication of MXPA01008800A


Abstract

After a normal texture expression process (texture mapping and shading) has been carried out on a polygon to be processed, the polygon shadow rendered on a shadow plane (reference shadow plane (150)) is texture-mapped onto the polygon and rendered on a screen (176) (procedure 1), and thereafter the polygon shadow formed by the object is rendered on the shadow plane (150) (procedure 2). The above process is carried out by Z-sorting with the light source (52) as the viewpoint, as indicated by the arrow A.

Description

METHOD AND APPARATUS FOR PROCESSING IMAGES

TECHNICAL FIELD

The present invention relates to a method of and an apparatus for processing an image so as to express the shadow that an object, irradiated by light from a light source, casts on another object placed behind it, based on the distribution of several objects generated by three-dimensional modeling, and also relates to a recording medium that stores a program for performing such image processing, and to a program for performing such image processing.

BACKGROUND ART

Recently, various computer graphics (CG) processing techniques, including hidden line processing, hidden surface removal, smooth shading, and texture mapping, have made rapid progress in combination with advances in hardware. According to a general CG processing procedure, several three-dimensional figures (objects) are generated by three-dimensional CAD modeling, and colors and shades are applied to the generated objects. Then, optical characteristics including reflection, diffuse reflection, refraction, and transparency are added to the objects, and surface patterns are applied to them. In addition, surrounding conditions are rendered; for example, windows and scenery are reflected in surfaces, and rays of light are introduced.

Shading is governed by the directions of the normals to the polygons that make up an object and by the viewpoint for the light rays. A separate process expresses the shadow of an object cast on another object placed behind it, based on the distribution of a light source and the several objects. This latter process, unlike shading, cannot be carried out except by very expensive rendering approaches, such as ray tracing. Where such high costs cannot be incurred, as in real-time rendering, it has been usual to approximate the shadow by perspective projection onto a simple plane, or by rendering a simple figure such as a circle. Moreover, if the light source has a certain size, as a flame does, it is extremely difficult to express the shadows produced by that light source.

Accordingly, it is an object of the present invention to provide a method of and an apparatus for processing an image so as to express shadows among several objects placed in a complex distribution, or a shadow of an object having a complex shape, a recording medium that stores a program capable of expressing such a shadow or shadows in a simple manner, and a program for expressing such a shadow or shadows.
Another object of the present invention is to provide a method of and an apparatus for processing an image so as to be able to selectively express a shadow on an object, a recording medium that stores a program capable of selectively expressing a shadow on an object, and a program for selectively expressing a shadow on an object.

Another object of the present invention is to provide a method of and an apparatus for processing an image such that various effects, for example blurring, can easily be applied when expressing a shadow on an object, a recording medium that stores a program capable of easily applying such effects when expressing a shadow on an object, and a program for easily applying such effects when expressing a shadow on an object.

Another object of the present invention is to provide a method of and an apparatus for processing an image that can easily control shadow blur, among various shadow effects, to easily express a more realistic shadow, a recording medium that stores a program capable of easily expressing a more realistic shadow, and a program for easily expressing a more realistic shadow.

Another object of the present invention is to provide a method of and an apparatus for processing an image that can easily express a projected image of an extended light source, for example a flame, and a shadow produced by such an extended light source, a recording medium that stores a program capable of easily expressing such a projected image and shadow, and a program for easily expressing such a projected image and shadow.

DISCLOSURE OF THE INVENTION

An image processing method according to the present invention comprises the steps of establishing at least one virtual plane from the distribution of several objects generated by three-dimensional modeling, and expressing the shadow that an object projects onto the virtual plane, with a light source as the viewpoint, on another object that is farther from the light source than the first object. With this method, it is possible to easily express a shadow among several objects placed in a complex distribution, or a shadow on an object having a complex shape.

The method may further comprise the steps of defining, in light source processing attributes of the objects, a shadow expression attribute indicating whether the shadow is to be expressed on each object, and selectively expressing the shadow on the object based on the shadow expression attribute. In this way, the shadow can be expressed selectively for each object.

Specifically, the method may further comprise the steps of establishing a shadow plane that serves as a texture plane corresponding to the virtual plane, rendering on the shadow plane the shadow of the object formed by projection onto the virtual plane, and applying the shadow rendered on the shadow plane to the other object by texture mapping. The step of applying the shadow to the other object by texture mapping may include the step of applying the shadow based on the projected coordinates of the other object on the virtual plane, or with respect to each of the polygons of the other object.
The method may further comprise the steps of determining the coordinates of the objects with the light source as the viewpoint, determining the projected coordinates of the objects on the virtual plane successively in the direction away from the light source, and rendering on the shadow plane the shadow formed by each object, based on the projected coordinates, each time the texture mapping of one of the objects has finished.
The method may further comprise the steps of determining the coordinates of the objects and the projected coordinates of the objects on the virtual plane with respect to each of the polygons of the objects, recording the determined coordinates in a rendering list successively in the direction away from the light source, and successively reading the recorded coordinates from the rendering list to render the shadow on the shadow plane.

It is preferable to apply low-pass filtering to the shadow rendered on the shadow plane according to at least the distance from the light source, so as to blur the shadow according to at least the distance from the light source. In this way, various effects, such as blurring, can easily be applied to a shadow expressed on an object.

The method may further comprise the step of interpolating the shadow rendered on the generation shadow plane, when it is expressed on the object, in accordance with rendering, between the shadow before it is subjected to low-pass filtering and the shadow after it is subjected to low-pass filtering, according to the light source coordinates of the object to be processed, in order to control the fuzziness of the shadow. Thus, fuzziness can easily be controlled for an easier expression of a more realistic shadow.

The method may further comprise the steps of preparing a reference shadow plane and a generation shadow plane as the shadow plane and, each time the object to be processed changes from one to another, copying the shadow rendered on the generation shadow plane into the reference shadow plane and, each time the shadow on the reference shadow plane is applied by texture mapping with respect to each of the polygons of an object, rendering a projected image of the polygon on the virtual plane as a new shadow combined into the generation shadow plane. Whenever the shadow rendered on the generation shadow plane is copied into the reference shadow plane, low-pass filtering may be applied to the shadow rendered on the generation shadow plane. In this way, various effects, such as blurring, can easily be applied to a shadow expressed on an object.

The method may further comprise the steps of preparing, in addition to the reference shadow plane and the generation shadow plane as the shadow plane, a background shadow plane, which is a texture plane corresponding to a virtual background plane placed behind the object to be processed with the light source as the viewpoint, rendering on the background shadow plane the shadow formed when a shadow projected onto the virtual plane is projected onto the virtual background plane, and applying the expressed shadow to the object to be processed by texture mapping while interpolating the shadow, in accordance with rendering, based on the shadow rendered on the reference shadow plane, the shadow rendered on the background shadow plane, and the light source coordinates of the object. Thus, the blur can easily be controlled for an easy expression of a more realistic shadow.

The method may further comprise the steps of establishing an extended light source as an initial value for the shadow plane, and reflecting the extended light source on the object while forming its shadow on the object. The overall per-object loop implied by these steps is sketched below.
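As a rough illustration only, and not the patent's implementation, the two-procedure structure just described might look as follows in C++; every type and function name here (ShadowPlane, textureMapShadow, and so on) is hypothetical, and stubs stand in for the work done by the rendering hardware.

// Hypothetical sketch of the per-object shadow loop described above.
// All names are illustrative; the stubs stand in for the rendering engine.
#include <algorithm>
#include <vector>

struct ShadowPlane {                      // texture plane matching the virtual plane
    std::vector<unsigned char> texels;
};

struct Object {
    float distToLight = 0.0f;             // Z with the light source as the viewpoint
    bool  receivesShadow = true;          // shadow expression attribute (display = 1)
};

void textureMapShadow(const ShadowPlane&, Object&) {}       // procedure 1 (stub)
void renderProjectedShadow(ShadowPlane&, const Object&) {}  // procedure 2 (stub)

void shadeScene(std::vector<Object>& objects,
                ShadowPlane& reference, ShadowPlane& generation) {
    // Z-sort the objects with the light source as the viewpoint.
    std::sort(objects.begin(), objects.end(),
              [](const Object& a, const Object& b) {
                  return a.distToLight < b.distToLight;
              });
    for (Object& obj : objects) {
        // Each time the object changes, the shadows accumulated so far
        // become the reference texture for the next object.
        reference = generation;
        if (obj.receivesShadow)                  // selective expression by attribute
            textureMapShadow(reference, obj);    // procedure 1: shadow onto object
        renderProjectedShadow(generation, obj);  // procedure 2: add its own shadow
    }
}

The copy from the generation plane to the reference plane at each object change is what lets shadows accumulate from object to object along the light axis.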
An apparatus for processing an image according to the present invention comprises a first device for establishing at least one virtual plane from the distribution of several objects generated by three-dimensional modeling, and a second device for expressing the shadow that an object projects onto the virtual plane, with a light source as the viewpoint, on another object that is farther from the light source than the first object.

The first device may comprise a device for defining, in light source processing attributes of the objects, a shadow expression attribute indicating whether the shadow is to be expressed on each object, and the second device may comprise a device for selectively expressing the shadow on the object based on the shadow expression attribute. The second device may have a rendering device for establishing a shadow plane that serves as a texture plane corresponding to the virtual plane, for rendering on the shadow plane the shadow of the object formed by projection onto the virtual plane, and for applying the shadow rendered on the shadow plane to the other object by texture mapping. The rendering device may comprise a device for applying the shadow to the other object by texture mapping based on the projected coordinates of the other object on the virtual plane, or with respect to each of the polygons of the other object.

The second device may comprise a coordinate calculating device for determining the coordinates of the objects with the light source as the viewpoint and for determining the projected coordinates of the objects on the virtual plane successively in the direction away from the light source, and the rendering device may comprise a device for rendering on the shadow plane the shadow formed by each object, based on the projected coordinates, each time the texture mapping of one of the objects has finished. The second device may comprise a rendering list generating device for determining the coordinates of the objects and the projected coordinates of the objects on the virtual plane with respect to each of the polygons of the objects, and for recording the determined coordinates in a rendering list successively in the direction away from the light source; the rendering device may then comprise a device for successively reading the recorded coordinates from the rendering list to render the shadow on the shadow plane.

The rendering device may comprise a device for applying low-pass filtering to the shadow rendered on the shadow plane according to at least the distance from the light source, so as to blur the shadow according to at least the distance from the light source. The rendering device may comprise a device for interpolating the shadow rendered on the generation shadow plane, when it is expressed on the object, in accordance with rendering, between the shadow before being subjected to low-pass filtering and the shadow after being subjected to low-pass filtering, according to the light source coordinates of the object to be processed, in order to control the fuzziness of the shadow.

The rendering device may contain a device for preparing a reference shadow plane and a generation shadow plane as the shadow plane and, each time the object to be processed changes from one to another, copying the shadow rendered on the
generation shadow plane into the reference shadow plane and, each time the shadow on the reference shadow plane is applied by texture mapping with respect to each of the polygons of an object, rendering a projected image of the polygon on the virtual plane as a new shadow combined into the generation shadow plane. The rendering device may comprise, for example, a device for applying low-pass filtering to the shadow rendered on the generation shadow plane each time that shadow is copied into the reference shadow plane.

The rendering device may comprise a device for preparing, in addition to the reference shadow plane and the generation shadow plane as the shadow plane, a background shadow plane, which is a texture plane corresponding to a virtual background plane placed behind the object to be processed with the light source as the viewpoint, for rendering on the background shadow plane the shadow formed when a shadow projected onto the virtual plane is projected onto the virtual background plane, and for applying the expressed shadow to the object to be processed by texture mapping while interpolating the shadow, in accordance with rendering, based on the shadow rendered on the reference shadow plane, the shadow rendered on the background shadow plane, and the light source coordinates of the object. The rendering device may comprise a device for establishing an extended light source as the initial value for the shadow plane, and for reflecting the extended light source on the object while forming its shadow on the object.

A recording medium according to the present invention stores a program comprising the steps of (a) establishing at least one virtual plane from the distribution of several objects generated by three-dimensional modeling, and (b) expressing the shadow that an object projects onto the virtual plane, with a light source as the viewpoint, on another object that is farther from the light source than the first object. The recording medium with the stored program makes it possible to easily express a shadow among several objects placed in a complex distribution, or a shadow on an object having a complex shape.

Step (a) may comprise the step of defining, in light source processing attributes of the objects, a shadow expression attribute indicating whether the shadow is to be expressed on each object, and step (b) may comprise the step of selectively expressing the shadow on the object based on the shadow expression attribute. Step (b) may comprise steps (c) of establishing a shadow plane that serves as a texture plane corresponding to the virtual plane, rendering on the shadow plane the shadow of the object formed by projection onto the virtual plane, and applying the shadow rendered on the shadow plane to the other object by texture mapping.
Steps (c) may further comprise the step of applying the shadow to the other object by texture mapping based on the projected coordinates of the other object on the virtual plane, or with respect to each of the polygons of the other object. Step (b) may further comprise the steps of determining the coordinates of the objects with the light source as the viewpoint and determining the projected coordinates of the objects on the virtual plane successively in the direction away from the light source, and steps (c) may further comprise the step of rendering on the shadow plane the shadow formed by each object, based on the projected coordinates, each time the texture mapping of one of the objects has finished.

Step (b) may further comprise the steps of determining the coordinates of the objects and the projected coordinates of the objects on the virtual plane with respect to each of the polygons of the objects, and recording the determined coordinates in a rendering list successively in the direction away from the light source, and steps (c) may further comprise the step of successively reading the recorded coordinates from the rendering list to render the shadow on the shadow plane.

Steps (c) may further comprise the step of applying low-pass filtering to the shadow rendered on the shadow plane according to at least the distance from the light source, so as to blur the shadow according to at least the distance from the light source. Steps (c) may further comprise the step of interpolating the shadow rendered on the generation shadow plane, when it is expressed on the object, in accordance with rendering, between the shadow before being subjected to low-pass filtering and the shadow after being subjected to low-pass filtering, according to the light source coordinates of the object to be processed, in order to control the fuzziness of the shadow.

Steps (c) may further comprise the steps of preparing a reference shadow plane and a generation shadow plane as the shadow plane and, each time the object to be processed changes from one to another, copying the shadow rendered on the generation shadow plane into the reference shadow plane and, each time the shadow on the reference shadow plane is applied by texture mapping with respect to each of the polygons of an object, rendering a projected image of the polygon on the virtual plane as a new shadow combined into the generation shadow plane. Steps (c) may further comprise the step of applying low-pass filtering to the shadow rendered on the generation shadow plane whenever that shadow is copied into the reference shadow plane.
Steps (c) may further comprise the steps of preparing, in addition to the reference shadow plane and the generation shadow plane as the shadow plane, a background shadow plane, which is a texture plane corresponding to a virtual background plane placed behind the object to be processed with the light source as the viewpoint, rendering on the background shadow plane the shadow formed when a shadow projected onto the virtual plane is projected onto the virtual background plane, and applying a shadow expressed on the object to be processed by texture mapping while interpolating the shadow, in accordance with rendering, based on the shadow rendered on the reference shadow plane, the shadow rendered on the background shadow plane, and the light source coordinates of the object.

BRIEF DESCRIPTION OF THE DRAWINGS

Figure 1 is a block diagram of an entertainment apparatus to which a method of and an apparatus for processing an image according to the present invention are applied; Figure 2 is an illustrative diagram of a shading process in accordance with the present invention; Figure 3 is a functional block diagram of a shading device in accordance with the present invention; Figure 4 is a diagram showing details of an object information table; Figure 5 is a diagram showing details of a vertex data file; Figure 6 is a diagram showing details of a packet; Figure 7 is a functional block diagram of a various-settings processing device in a shading process in accordance with a first embodiment of the present invention; Figure 8 is a functional block diagram of a rendering list generating device in the shading process in accordance with the first embodiment of the present invention; Figure 9 is a functional block diagram of a rendering device in the shading process in accordance with the first embodiment of the present invention; Figure 10 is a diagram showing an effective area in which the shading process is effective with respect to a point light source; Figure 11 is a diagram illustrating the perspective transformation of an object onto a virtual plane; Figure 12 is a diagram showing a conceptual representation of the shading process according to the first embodiment of the present invention; Figure 13 is a flowchart of a sequence of the shading process according to the first embodiment of the present invention;
Figure 14 is a flowchart of an operating sequence of the various-settings processing device in the shading process according to the first embodiment of the present invention; Figures 15 and 16 are flowcharts of an operating sequence of the rendering list generating device in the shading process according to the first embodiment of the present invention; Figure 17 is a diagram illustrating the insertion of a packet into a rendering list; Figures 18 and 19 are flowcharts of an operating sequence of the rendering device in the shading process according to the first embodiment of the present invention; Figure 20 is a diagram illustrating the formation of umbra and penumbra regions on a virtual plane by an extended light source in a shading process according to a second embodiment of the present invention; Figure 21 is an illustrative diagram of the manner in which a generation shadow plane is subjected to low-pass filtering according to the distance from a light source to express the magnitude of a blur (penumbra) according to the distance, in the shading process according to the second embodiment of the present invention; Figure 22 is an illustrative diagram of the manner in which a generation shadow plane is subjected to low-pass filtering each time an object is processed, and in which a generation shadow plane is subjected to low-pass filtering at each constant distance, in the shading process in accordance with the second embodiment of the present invention; Figure 23 is an illustrative diagram of a trilinear process in a shading process in accordance with a third embodiment of the present invention; Figure 24 is a functional block diagram of a various-settings processing device in the shading process according to the third embodiment of the present invention; Figure 25 is a functional block diagram of a rendering device in the shading process in accordance with the third embodiment of the present invention; Figure 26 is a flowchart of an operating sequence of the various-settings processing device in the shading process according to the third embodiment of the present invention; Figures 27 and 28 are flowcharts of an operating sequence of the rendering device in the shading process according to the third embodiment of the present invention; Figure 29 is a diagram illustrating the expression of a shadow whose shape and color vary gradually along the depth of a polygon in the shading process according to the third embodiment of the present invention; and Figure 30 is a diagram illustrating the expression of a projected image of an extended light source such as a flame and a shadow cast by the extended light source on an object, in a shading process in accordance with a fourth embodiment of the present invention.

PREFERRED MODE OF THE INVENTION

Embodiments in which an image processing method, an image processing apparatus, a recording medium, and a program in accordance with the present invention are applied to an entertainment apparatus for performing three-dimensional CG processing will be described below with reference to Figures 1 to 30.
As shown in Figure 1, an entertainment apparatus 10 comprises a central processing unit (CPU) 12 for controlling the overall operation of the entertainment apparatus 10, a main memory 14 for storing various programs and data, an image processor 18 for generating image data under the control of the CPU 12 and sending the generated image data to a display unit (for example, a CRT) 16, and an input/output port 20 for sending data to and receiving data from external devices. The main memory 14, the image processor 18, and the input/output port 20 are connected to the CPU 12 by a bus 22. To the input/output port 20 are connected, for example, an input/output device 24 for entering data (key input data, coordinate data, etc.) into the entertainment apparatus 10, and an optical disc unit 26 for playing back an optical disc, such as a CD-ROM, which stores various programs and data (object-related data, texture data, etc.).

The image processor 18 comprises a rendering engine 30, a memory interface 32, an image memory 34, and a display controller 36 such as a programmable CRT controller. The rendering engine 30 renders image data in the image memory 34 through the memory interface 32 in response to rendering commands supplied from the CPU 12. A first bus 38 is connected between the memory interface 32 and the rendering engine 30, and a second bus 40 is connected between the memory interface 32 and the image memory 34. The first bus 38 and the second bus 40 each have a bus width of 128 bits, for example, to allow the rendering engine 30 to render image data in the image memory 34 at high speed. The rendering engine 30 is capable of rendering image data of 320x240 pixels or 640x480 pixels, according to NTSC or PAL, in real time, that is, more than ten to several tens of times per 1/60 to 1/30 of a second. The image memory 34 is of a unified memory structure in which a texture area 34a and a rendering area 34b (see Figure 3) can be specified within a single area. The display controller 36 writes texture data read by the optical disc unit 26 and texture data generated in the main memory 14 into the texture area of the image memory 34 through the memory interface 32, reads image data rendered in the rendering area of the image memory 34 through the memory interface 32, and sends the image data to the display unit 16 for display on its display screen.

The function of a distinguishing feature of the entertainment apparatus 10, that is, the function of a process that casts a shadow on an object (hereinafter referred to as the "shading process"), will be described in detail below. According to the shading process, as shown in Figure 2, at least one virtual plane 50 is established from the distribution of several objects Ob1, Ob2 generated by three-dimensional modeling, and a projected image 54 of the object Ob1, projected onto the virtual plane 50 by a light source 52, is expressed as a shadow 56 on the object Ob2, which is farther from the light source 52 than the object Ob1. A program for carrying out the shading process, that is, a shading device 100 (see Figure 3), is downloaded from a CD-ROM played back by the optical disc unit 26 into the main memory 14 of the entertainment apparatus 10. The downloaded program is then used in the entertainment apparatus 10 to perform the shading process.
The shading device 100 will now be described with reference to Figures 3 to 9. As shown in Figure 3, the shading device 100 has a various-settings processing device 102, a rendering list generating device 104, a rendering device 106, and an image display device 108. The various-settings processing device 102 generates an object information table 110, makes settings for the objects Ob1, Ob2, ..., a screen, and a light source 52, and establishes at least one virtual plane 50 from the distribution of the objects Ob1, Ob2, .... As shown in Figure 4, the object information table 110 holds as many records as the number of generated objects. Each of the records contains an initial address of a data file (vertex data file) holding the vertex data (object coordinates) of the polygons that constitute the corresponding object, the number M of polygons, an initial address of a texture table to be used, shading attribute information (such as Gouraud shading), topology information (such as mesh), a light source processing attribute, and object distribution information.
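For illustration only, one record of the object information table 110 could be laid out as below; the field names and types are hypothetical, chosen to mirror the elements listed above.

// Hypothetical layout of one record of the object information table 110.
#include <cstdint>

struct ObjectInfoRecord {
    uint32_t vertexFileAddress;    // initial address of the vertex data file 112
    uint32_t polygonCount;         // number M of polygons
    uint32_t textureTableAddress;  // initial address of the texture table used
    uint8_t  shadingAttribute;     // e.g. Gouraud shading
    uint8_t  topology;             // e.g. mesh
    uint8_t  lightSourceAttribute; // polygon shadow: display / do not display = 1/0
    float    distribution[3];      // object distribution (position) information
};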
The light source processing attribute defines information as to whether a polygon shadow should be displayed on the object or not (display/do not display = 1/0). The rendering list generating device 104 determines a screen coordinate system, a light source coordinate system, and projected coordinates on the virtual plane 50 (the coordinates of a polygon shadow) for each of the polygons of the objects, based on the vertex data files 112 of the objects, the distribution of the light source, etc., registers the determined coordinates in packets 114, Z-sorts the packets 114 in the direction away from the light source 52, and registers the packets 114 in a rendering list 116. As shown in Figure 5, the vertex data files 112 comprise as many files as the number of generated objects. Each file registers, in each record, the object coordinates PPij0 = (xij0, yij0, zij0), PPij1 = (xij1, yij1, zij1), PPij2 = (xij2, yij2, zij2) of one polygon of the object. As shown in Figure 6, each of the packets 114 stores the object number i of the object to which the polygon belongs, a Z-sorting pointer used as a pointer when the packet 114 is registered in the rendering list 116, the screen coordinates SPij0 = (Xsij0, Ysij0, Zsij0), SPij1 = (Xsij1, Ysij1, Zsij1), SPij2 = (Xsij2, Ysij2, Zsij2) of the polygon, the light source coordinates UPij0 = (Xuij0, Yuij0, Zuij0), UPij1 = (Xuij1, Yuij1, Zuij1), UPij2 = (Xuij2, Yuij2, Zuij2) of the polygon, and the projected coordinates UVij0 = (Uij0, Vij0), UVij1 = (Uij1, Vij1), UVij2 = (Uij2, Vij2) of the polygon on the virtual plane 50.
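Mirroring the Figure 6 description, a packet 114 might be sketched as the following struct; again the names are hypothetical.

// Hypothetical sketch of a packet 114 (Figure 6); one packet per polygon.
struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

struct Packet {
    int   objectNumber;   // number i of the object the polygon belongs to
    float zSortPointer;   // light-source Z of the vertex nearest the light
    Vec3  screen[3];      // SPij0..SPij2, screen coordinates
    Vec3  light[3];       // UPij0..UPij2, light source coordinates
    Vec2  projected[3];   // UVij0..UVij2, projected coordinates on plane 50
};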
The rendering device 106 successively takes the packets 114 from the rendering list 116 and, based on the various polygon data registered in the packets 114, renders the polygons and texture-maps the polygon shadows onto them. The image display device 108 reads the image data stored in the rendering area 34b of the image memory 34 and sends the read image data to the display controller 36.

As shown in Figure 7, the various-settings processing device 102 has a table generating device 120 for generating the object information table 110 based on data entered through the input/output device 24, a distribution information recording device 122 for recording information relating to the distribution of the objects, entered through the input/output device 24, in the object information table 110, a coordinate determination device 124 for determining the world coordinates of a screen, the light source 52, and the virtual plane 50 from information regarding the distribution of the screen, the light source 52, and the virtual plane 50 and recording the determined world coordinates in predetermined variable areas Z1-Z4, and a light source coordinate calculating device 126 for determining the light source coordinates of the virtual plane 50 based on the world coordinates of the virtual plane 50 and recording the Z coordinate in a predetermined variable area Z5.

As shown in Figure 8, the rendering list generating device 104 comprises a rendering list initialization device 130 for initializing the rendering list 116, a table record reading device 132 for reading records of information from the object information table 110, a file record reading device 134 for reading records of information from the corresponding vertex data files 112, a data storage device 136 for storing data in the packets 114, a coordinate calculation device 138 for calculating the screen coordinates, light source coordinates, and projected coordinates on the virtual plane 50 of the polygons registered in the vertex data files 112, a pointer determination device 140 for determining an insertion pointer (Z-sorting pointer) relative to the rendering list 116 based on the light source coordinates of the polygons, a packet insertion device 142 for inserting a packet 114 into the record corresponding to the pointer, and an end determination device 144 for determining whether or not the processing of the polygons constituting the object to be processed has finished.

As shown in Figure 9, the rendering device 106 is arranged to employ a reference shadow plane 150 and a generation shadow plane 152 corresponding to the virtual plane 50. The shadow planes 150, 152 are logically assigned to the texture area 34a of the image memory 34. The rendering device 106 sends commands to operate a texture expression processing device 154 incorporated in the rendering engine 30. The texture expression processing device 154 comprises a texture mapping device 156 and a shading device 158. The rendering device 106 has a shadow plane initialization device 160 for writing initial data Di, read from an initial data file 178, into the shadow planes that are employed.

A shading process according to a first embodiment of the present invention, which is carried out by the shading device 100, will be described below. Prior to describing the shading process itself, the operating concept of the shading device 100 will first be described with reference to Figures 2 and 10 to 12. Figure 2 shows the concept of the shading process using the virtual plane 50. In Figure 2, the virtual plane 50 is located between the object Ob1 that casts a shadow and the object Ob2 on which the shadow is cast. The position of the virtual plane 50 is determined by its size and by the extent of the space covered by the shading process. In the first embodiment, the object Ob1 is projected onto the virtual plane 50 by perspective transformation with the light source 52 as the viewpoint, and is written as a polygon shadow on the shadow plane (the reference shadow plane 150 and the generation shadow plane 152), which is a texture plane corresponding to the virtual plane 50. The shading process for casting a shadow on the object Ob2 is carried out by texture-mapping each of the polygons of the object Ob2 from the reference shadow plane 150, which serves as the texture pattern. The texture coordinates of each of the vertices of a polygon can be determined by perspective transformation with the light source 52 as the viewpoint. The formulas of the perspective transformation will now be described with reference to Figure 11.
In Figure 11, if the light source 52 is a point light source, then the perspective transformation of each of the vertices (xa, ya, za) of the object Ob1 onto the virtual plane 50 is represented by:

Xa = (xa * ScrZ) / za
Ya = (ya * ScrZ) / za

and the texture coordinates (Ub, Vb) of the shadow at each of the vertices (xb, yb, zb) of the object Ob2 are similarly represented, in accordance with the perspective transformation, by:

Ub = (xb * ScrZ) / zb
Vb = (yb * ScrZ) / zb

where ScrZ is the distance from the light source 52 to the virtual plane 50. If the light source 52 is a parallel light source, then (Xa, Ya) = (xa, ya) and (Xb, Yb) = (xb, yb). (These formulas are restated in code just before the sequence descriptions below.)

Figure 12 shows a conceptual representation of the shading process according to the first embodiment of the present invention. In the shading process shown in Figure 12, which is applied to each object, a polygon shadow rendered on a shadow plane (the reference shadow plane 150) is texture-mapped onto an object and displayed on a screen 176 (procedure 1), and then the polygon shadow formed by the object is rendered on the shadow plane 150 (procedure 2). The shading process described above is effected by Z-sorting with the light source 52 as the viewpoint, as indicated by the arrow A.

A sequence of the shading process according to the first embodiment of the present invention will be described below with reference to Figure 13. In step S1, the various-settings processing device 102 generates an object information table 110, makes distribution settings for the objects, the screen 176, and the light source 52, and establishes at least one virtual plane 50 from the distribution of the several objects (various-settings processing). Then, in step S2, the rendering list generating device 104 determines a screen coordinate system, a light source coordinate system, and projected coordinates on the virtual plane 50 (the coordinates of a polygon shadow) for each of the polygons of the objects, based on the vertex data files 112 of the objects, the distribution of the light source 52, etc., registers the determined coordinates in packets 114, and registers the packets 114 successively in the direction away from the light source 52 in a rendering list 116 (rendering list generation processing). Then, in step S3, the rendering device 106 successively takes the packets 114 from the rendering list 116 and, based on the various polygon data registered in the packets 114, renders the polygons and texture-maps the polygon shadows onto them (rendering processing). Then, in step S4, the image display device 108 reads the image data stored in the rendering area 34b of the image memory 34 and sends the read image data through the display controller 36 to the display unit 16. In this way, as shown in Figure 2, the shadow of the object Ob1 produced by the light source 52 is cast on the object Ob2, which is placed behind the object Ob1 with respect to the light source 52. After step S4, the shading process in accordance with the first embodiment ends.

Operating sequences of the various-settings processing device 102, the rendering list generating device 104, and the rendering device 106 will now be described with reference to Figures 14 to 19. First, an operating sequence of the various-settings processing device 102 will be described below with reference to Figure 14.
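As noted above, the Figure 11 formulas restate directly in code; this minimal sketch assumes light-source coordinates with the light source 52 at the origin and scrZ the depth of the virtual plane 50, and all names are illustrative.

// The Figure 11 projection in code: light-source coordinates are assumed,
// with the light source 52 at the origin and the virtual plane 50 at depth scrZ.
struct Point2 { float x, y; };
struct Point3 { float x, y, z; };

Point2 projectToVirtualPlane(const Point3& p, float scrZ, bool parallelLight) {
    if (parallelLight)                 // parallel source: orthographic projection
        return { p.x, p.y };
    // Point source: Xa = xa * ScrZ / za, Ya = ya * ScrZ / za (za != 0 assumed).
    return { p.x * scrZ / p.z, p.y * scrZ / p.z };
}
// The same projection applied to a vertex of a farther object yields the
// texture coordinates (Ub, Vb) at which its shadow is looked up.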
In step S101 shown in Figure 14, the table generating device 120 of the various-settings processing device 102 generates an object information table 110 based on data entered through the input/output device 24. As shown in Figure 4, the information elements recorded in the object information table 110 include shading attribute information (for example, Gouraud shading), topology information (such as mesh), and a light source processing attribute, among others. The light source processing attribute defines information as to whether a polygon shadow should be displayed or not (display/do not display = 1/0). In the object information table 110, an initial address of a vertex data file 112, the number of polygons, an initial address of a texture table to be used, etc., are recorded as an object is generated in accordance with CAD.

In step S102, the distribution information recording device 122 records information regarding the distribution of the objects, entered through the input/output device 24, in the object information table 110. In step S103, the coordinate determination device 124 calculates the world coordinates of the screen 176 based on information on the distribution of the screen 176, and stores the calculated world coordinates in a predetermined variable area Z1. In step S104, the coordinate determination device 124 calculates the world coordinates of the light source based on information relating to the distribution of the light source 52, and stores the calculated world coordinates in a predetermined variable area Z2. The coordinate determination device 124 also stores the type of the light source 52, entered through the input/output device 24, in a predetermined variable area Z3. In step S105, the coordinate determination device 124 establishes the distribution of the virtual plane 50 based on the position of the light source 52 stored in the variable area Z2 and the object distribution information recorded in the object information table 110, calculates the world coordinates of the virtual plane 50, and stores the calculated world coordinates in a predetermined variable area Z4. In step S106, the light source coordinate calculating device 126 calculates the light source coordinates of the virtual plane 50 based on the world coordinates of the virtual plane 50 stored in the variable area Z4 and the position of the light source 52, and stores the Z coordinate of the calculated coordinates in a predetermined variable area Z5. After step S106, the operating sequence of the various-settings processing device 102 ends.

An operating sequence of the rendering list generating device 104 will be described below with reference to Figures 15 and 16. In step S201 illustrated in Figure 15, the rendering list initialization device 130 initializes the rendering list 116. Then, in step S202, the rendering list generating device 104 stores an initial value "0" in an index register i employed to search for an object, thus initializing the index register i. In step S203, the table record reading device 132 reads the record (record i) indicated by the index register i from the object information table 110. In step S204, the rendering list generating device 104 stores an initial value "0" in an index register j used to search for a polygon, thus initializing the index register j. In step S205, the data storage device 136 stores an initial value in a packet 114, thereby initializing the packet 114.
In step S206, the data storage device 136 stores the object number i (the value of the index register i) in the packet 114. In step S207, the file record reading device 134 reads the record (record j) indicated by the index register j from the corresponding vertex data file 112, that is, reads the vertex data of the j-th polygon. The corresponding vertex data file 112 is the vertex data file corresponding to the initial address of the vertex data file 112 registered in the record i read from the object information table 110. In step S208, the coordinate calculation device 138 determines the screen coordinates SPij0, SPij1, SPij2 of the vertices of the j-th polygon based on the distribution information of the object registered in the record i in the object information table 110, the world coordinates of the screen 176 recorded in the variable area Z1, and the vertex data of the j-th polygon, and the data storage device 136 stores the determined screen coordinates in the packet 114. In step S209, the coordinate calculation device 138 determines the light source coordinates UPij0, UPij1, UPij2 of the vertices of the j-th polygon based on the distribution information of the object, the world coordinates of the light source 52 recorded in the variable area Z2, and the vertex data of the j-th polygon, and the data storage device 136 stores the determined light source coordinates in the packet 114. In step S210, the coordinate calculation device 138 determines the projected coordinates UVij0, UVij1, UVij2 of the vertices of the j-th polygon based on the object distribution information, the Z coordinate (light source coordinate) of the virtual plane 50 recorded in the variable area Z5, and the vertex data of the j-th polygon, and the data storage device 136 stores the determined projected coordinates in the packet 114.

In step S211, the pointer determination device 140 selects, from among the Z coordinates of the light source coordinates UPij0, UPij1, UPij2 of the vertices determined in step S209, the Z coordinate closest to the light source, and uses the selected Z coordinate as the Z-sorting pointer of the j-th polygon. In step S212 shown in Figure 16, the packet insertion device 142 searches the rendering list 116 and inserts the present packet 114 into the rendering list 116 in such a way that the packets 114 are placed there in increasing order of their Z coordinates (Z-sorting pointers), as shown in Figure 17. In step S213, the end determination device 144 increments the value of the index register j by 1. In step S214, the end determination device 144 determines whether or not the processing of all the polygons that make up the object has finished, by determining whether or not the value of the index register j is equal to or greater than the number M of polygons registered in the record i in the object information table 110. If the processing of all the polygons constituting the object has not finished, then control returns to step S205 shown in Figure 15 to calculate the various coordinates of the next polygon, store the calculated coordinates in a packet 114, and insert the packet 114 into the rendering list 116. The ordered insertion of steps S211 and S212 is sketched below.
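Steps S211 and S212 amount to an ordered insertion keyed on the nearest vertex depth. A minimal sketch, reusing the hypothetical Packet struct from above and letting std::list stand in for the rendering list 116:

// Hypothetical ordered insertion for steps S211-S212.
#include <algorithm>
#include <list>

float nearestLightZ(const Packet& p) {            // step S211
    return std::min({ p.light[0].z, p.light[1].z, p.light[2].z });
}

void insertPacket(std::list<Packet>& renderingList, Packet pkt) {
    pkt.zSortPointer = nearestLightZ(pkt);
    // Step S212: keep the list in increasing Z-sort-pointer order (Figure 17).
    auto pos = std::find_if(renderingList.begin(), renderingList.end(),
                            [&](const Packet& q) {
                                return q.zSortPointer > pkt.zSortPointer;
                            });
    renderingList.insert(pos, pkt);
}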
If the processing of all the polygons that make up the object has finished in step S214, then control proceeds to step S215, where the end determination device 144 increments the value of the index register i by 1. In step S216, the end determination device 144 determines whether or not the processing of all the objects has finished, by determining whether or not the value of the index register i is equal to or greater than the number N of records recorded in the object information table 110. If the processing of all the objects has not finished, then control returns to step S203 shown in Figure 15 to calculate the various coordinates of all the polygons of the next object, store the calculated coordinates in the respective packets 114, and insert the packets 114 into the rendering list 116 in increasing order of their Z-sorting pointers. If the processing of all the objects has finished in step S216, then the operating sequence of the rendering list generating device 104 comes to an end.

An operating sequence of the rendering device 106 will be described below with reference to Figures 18 and 19. In step S301 shown in Figure 18, the shadow plane initialization device 160 reads the initial data Di from the initial data file 178 and renders the initial data Di on the shadow planes that are employed (the reference shadow plane 150 and the generation shadow plane 152) to initialize the reference shadow plane 150 and the generation shadow plane 152. In step S302, the rendering device 106 stores an initial value "FF" in a register R that is used to store the object number i, and an initial value "0" in an index register k that is used to search the packets 114, thereby initializing the register R and the index register k. In step S303, the packet reading device 162 reads the packet 114 at the point (the k-th point) indicated by the index register k from the rendering list 116. In step S304, the packet reading device 162 reads the object number i from the read packet 114. In step S305, the object determination device 166 determines whether or not the present object number i is the same as the previous object number, by determining whether or not the value of the index register i is the same as the value of the register R. If the present object number i is different from the previous object number, then control proceeds to step S306, where the table record reading device 164 reads the record i from the object information table 110. In step S307, the shadow plane rendering device 168 copies the texture data (or the initial data Di) relating to the polygon shadow rendered on the generation shadow plane 152 into the reference shadow plane 150. In step S308, the object determination device 166 stores the object number i in the register R. After the completion of the processing in step S308, or if the present object number i is the same as the previous object number in step S305, control proceeds to step S309 shown in Figure 19, where the texture expression processing device 154 performs a normal texture expression process. Specifically, the texture expression processing device 154 performs a texture expression process, such as shading and texture mapping, based on the screen coordinates of the present polygon and the initial address of a texture table.
In step S310, the polygon shadow display determination device 170 determines whether or not a polygon shadow can be displayed on the object, based on the polygon shadow display attribute among the light source processing attributes recorded in the corresponding record of the object information table 110. If a polygon shadow can be displayed, then control proceeds to step S311, where the texture mapping device 156 of the texture expression processing device 154 applies the polygon shadow rendered on the reference shadow plane 150 to the polygon to be processed by texture mapping, while referring to the projected coordinates UVij0, UVij1, UVij2 of the polygon to be processed on the virtual plane 50. If only the initial data are rendered on the reference shadow plane 150, then the initial data Di are applied by texture mapping. After completion of the processing in step S311, or if the polygon shadow cannot be displayed in step S310, control proceeds to step S312, where the shadow plane rendering device 168 renders the polygon shadow of the present polygon, combined with the shadows of the previous polygons, on the generation shadow plane 152, based on the projected coordinates of the present polygon on the virtual plane 50, and paints the combined shadow black, (R, G, B, alpha) = (0, 0, 0, 100%). In step S313, the hidden surface removal processing device 172 writes the data of the present polygon to the rendering area 34b while performing hidden surface removal by Z-buffering, based on the screen coordinates of the present polygon. In step S314, the rendering device 106 increments the value of the index register k by 1. Then, in step S315, the end determination device 174 determines whether or not the processing of all the packets 114 has finished. If the processing of all the packets 114 has not finished, then control returns to step S303 to perform the normal texture expression process, the texture mapping of the polygon shadows, and the hidden surface removal for the polygon registered in the next packet 114. If the processing of all the packets 114 recorded in the rendering list 116 has been completed in step S315, then the operating sequence of the rendering device 106 comes to an end.

The processing in steps S303-S313 is repeated, with the following results. For the polygons of the object Ob1, placed closest to the light source 52, only the initial data Di are written in the reference shadow plane 150; if the initial data Di indicate transparency, then no polygon shadow is rendered on the polygons of the object Ob1. On the polygons of the object Ob2, which is the second object from the light source 52, those polygon shadows of the first object Ob1 that fall within the range represented by the projected coordinates of its polygons are rendered; when the processing of the second object Ob2 ends, the polygon shadow of the first object Ob1 has thus been rendered on the second object Ob2. Similarly, on an object Ob3, which is the third object from the light source 52, a combination of the polygon shadow of the first object Ob1 and the polygon shadow of the second object Ob2 is rendered.
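Condensed into code, steps S303-S313 behave roughly as below; this hypothetical sketch reuses the Packet and ShadowPlane types from the earlier sketches, with stubs standing in for the texture expression processing device 154 and the Z-buffer.

// Hypothetical condensation of steps S303-S313; stubs stand in for the
// rendering engine 30.
#include <list>

void normalTextureExpression(const Packet&) {}                   // step S309
bool shadowDisplayAllowed(int /*objectNumber*/) { return true; } // step S310
void applyShadowTexture(const ShadowPlane&, const Packet&) {}    // step S311
// Step S312: paint the projected polygon black, (R,G,B,alpha) = (0,0,0,100%).
void combineIntoGeneration(ShadowPlane&, const Packet&) {}
void zBufferedWrite(const Packet&) {}                            // step S313

void renderPackets(const std::list<Packet>& renderingList,
                   ShadowPlane& reference, ShadowPlane& generation) {
    int previousObject = -1;                  // register R, initialized to "FF"
    for (const Packet& pkt : renderingList) {             // steps S303-S304
        if (pkt.objectNumber != previousObject) {         // step S305
            reference = generation;                       // step S307
            previousObject = pkt.objectNumber;            // step S308
        }
        normalTextureExpression(pkt);                     // shading and texture
        if (shadowDisplayAllowed(pkt.objectNumber))       // attribute check
            applyShadowTexture(reference, pkt);           // shadow onto polygon
        combineIntoGeneration(generation, pkt);           // add its own shadow
        zBufferedWrite(pkt);                              // hidden surface removal
    }
}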
In the shading process according to the first embodiment, as described above, a virtual plane 50 is established from the distribution of several objects generated by three-dimensional modeling, and the polygon shadow of one of the objects, formed by projection onto the virtual plane with the light source as the viewpoint, is expressed on another object farther from the light source 52 than that object. Specifically, a reference shadow plane 150 and a generation shadow plane 152, which are texture planes corresponding to the virtual plane 50, are established; the shadow of an object formed by projection onto the virtual plane 50 is carried to the reference shadow plane 150 through the generation shadow plane 152, and the shadow rendered on the reference shadow plane 150 is applied to a following object by texture mapping. In this way, it is easy to express shadows among several objects placed in a complex distribution, or a shadow on an object having a complex shape. In this embodiment, since the shadow expression attribute indicating whether or not a polygon shadow is to be expressed on an object is defined in the light source processing attribute of each record in the object information table 110, the process of expressing a polygon shadow on an object can be carried out selectively. Therefore, the facial expression of a main character in a game, for example, can be prevented from being hidden by the shadow of another object.

A shading process in accordance with a second embodiment of the present invention will now be described with reference to Figures 9 and 20 to 22. The shading process according to the second embodiment is essentially the same as the shading process in accordance with the first embodiment, except that the rendering device has a bilinear processing device 190 (indicated in Figure 9) for blurring a polygon shadow according to the distance from the light source 52. As shown in Figure 20, if the light source 52 is not a point light source but an extended light source, then when the virtual plane 50 is located in a position close to the object Ob1, the object Ob1 casts an umbra Ss on the virtual plane 50. When the virtual plane 50 is located at a position distant from the object Ob1, the object Ob1 casts the umbra Ss and also a penumbra Sd, a blurred shadow surrounding the umbra Ss, on the virtual plane 50. The degree of blurring of the penumbra Sd increases with the distance from the light source 52 to the virtual plane 50. The shading process according to the second embodiment is arranged to reproduce these characteristics of the penumbra Sd. Specifically, as shown in Figure 18, after the shadow plane rendering device 168 has copied the texture data relating to the polygon shadow rendered on the generation shadow plane 152 into the reference shadow plane 150 in step S307, the bilinear processing device 190 performs low-pass filtering on the texture data relating to the polygon shadow rendered on the generation shadow plane 152, in a step S320 (indicated in Figure 18). Figure 21 shows how the polygon shadow rendered on the generation shadow plane 152 is filtered according to the distance from the light source 52 to express the degree of blur (penumbra according to the distance); a review of Figure 21 indicates that the projected shadow is sharper at a close distance from the light source 52 and more blurred at a greater distance from it.
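The low-pass filtering of step S320 can be pictured as a small box filter over the shadow plane texels. A minimal sketch, assuming a single-channel shadow plane; the engine's actual filter may differ.

// Hypothetical 3x3 box filter standing in for the low-pass filtering of
// step S320. Each pass blurs the penumbra a little more, so repeating it
// as the distance from the light source 52 grows matches Figures 21 and 22.
#include <vector>

void lowPassFilter(std::vector<float>& plane, int width, int height) {
    const std::vector<float> src = plane;     // filter from an unmodified copy
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            float sum = 0.0f;
            int   n = 0;
            for (int dy = -1; dy <= 1; ++dy) {
                for (int dx = -1; dx <= 1; ++dx) {
                    const int sx = x + dx, sy = y + dy;
                    if (sx >= 0 && sx < width && sy >= 0 && sy < height) {
                        sum += src[sy * width + sx];
                        ++n;
                    }
                }
            }
            plane[y * width + x] = sum / n;   // average of the neighborhood
        }
    }
}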
According to the second embodiment, if the present object number i is different from the previous object number in step S305 illustrated in Figure 18, then the texture data relating to the polygon shadow rendered on the generation shadow plane 152 are subjected to low-pass filtering in step S320. In this way, as shown in Figure 22, when the object Ob1 is processed at a stage P1, the polygon shadow (umbra) of the object Ob1 is rendered on the generation shadow plane 152, and when the object Ob2 is processed at a stage P3, the polygon shadow (umbra and penumbra) of the object Ob1 and the polygon shadow (umbra) of the object Ob2 are rendered on the generation shadow plane 152. Farther from the light source 52, each time the objects change, the polygon shadow rendered on the generation shadow plane 152 gradually becomes more blurred according to the distance from the light source 52. Alternatively, the polygon shadow rendered on the generation shadow plane 152 may be subjected to low-pass filtering each time a certain distance is covered, while the light source coordinates are monitored. In Figure 22, the polygon shadow rendered on the generation shadow plane 152 is subjected to low-pass filtering at the stages or points P1, P2, P3. In the shading process according to the second embodiment, since the polygon shadow rendered on the generation shadow plane 152 is subjected to low-pass filtering according to the distance from the light source 52, the polygon shadow becomes blurred according to the distance from the light source 52 and is therefore expressed realistically.

A shading process according to a third embodiment of the present invention will be described below with reference to Figures 23 to 29. In the shading process according to the third embodiment, when a polygon shadow rendered on the generation shadow plane 152 is expressed on an object through the reference shadow plane 150, it is interpolated, in accordance with rendering, between the polygon shadow before being subjected to low-pass filtering and the polygon shadow after being subjected to low-pass filtering, according to the light source coordinates of the object to be processed, in order to control the blur of the polygon shadow. As shown in Figure 23, the polygon shadow is rendered using two shadow planes, that is, the reference shadow plane 150 and the background shadow plane 192. The reference shadow plane 150 and the background shadow plane 192 are shadow planes on which the polygon shadow has been subjected to low-pass filtering at different times. The polygon shadow is rendered by a trilinear texture mapping process that interpolates between the two shadow planes according to the Z coordinate of the light source coordinates. As a result, the fuzziness of the shadow according to the distance from the light source 52 to the object can be controlled within a polygon, for a better shadow approximation; this lookup is sketched in code below. An arrangement for carrying out the shading process in accordance with the third embodiment, and its operation, will be described next. The shading process according to the third embodiment is essentially the same as the shading process according to the first embodiment, except that the various-settings processing device 102 and the rendering device 106 have partially different functions.
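The trilinear lookup of Figure 23 reduces to sampling the two planes and blending by the light-source Z of the point being shaded. In this hypothetical sketch a nearest-texel fetch stands in for the engine's bilinear sampler.

// Hypothetical sketch of the Figure 23 trilinear lookup.
#include <vector>

float sampleShadow(const std::vector<float>& plane, int width, int height,
                   float u, float v) {
    int x = static_cast<int>(u * (width  - 1) + 0.5f);
    int y = static_cast<int>(v * (height - 1) + 0.5f);
    x = x < 0 ? 0 : (x >= width  ? width  - 1 : x);
    y = y < 0 ? 0 : (y >= height ? height - 1 : y);
    return plane[y * width + x];
}

// Blend two shadow planes, filtered at different times, by the light-source
// Z of the point being shaded, so the blur varies along the polygon's depth.
float trilinearShadow(const std::vector<float>& planeNear, float zNear,
                      const std::vector<float>& planeFar,  float zFar,
                      int width, int height, float u, float v, float z) {
    const float a = sampleShadow(planeNear, width, height, u, v);
    const float b = sampleShadow(planeFar,  width, height, u, v);
    float t = (z - zNear) / (zFar - zNear);   // 0 at the near plane, 1 at the far
    t = t < 0.0f ? 0.0f : (t > 1.0f ? 1.0f : t);
    return a + (b - a) * t;
}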
Specifically, as shown in Figure 24, the multi-setting processing device 102 has a table generating device 120, a distribution information recording device 122, a coordinate determination device 124, and a light source coordinate calculating device 126, which have functions different from those described above. The light source coordinate calculating device 126 determines the light source coordinates of the virtual plane 50 based on the world coordinates of the virtual plane 50, and records the Z coordinate in a predetermined variable area Z5. The light source coordinate calculating device 126 also determines the light source coordinates of the first to n-th virtual background planes, which are virtually placed behind the respective objects, based on the object distribution information, and records the Z coordinates of the determined light source coordinates in predetermined variable areas Z21 to Z2n.

As shown in Figure 25, the rendering device 106 is arranged to employ, in addition to the reference shadow plane 150 and the generation shadow plane 152, a background shadow plane 192 logically assigned to the texture area 34a of the image memory 34 in association with the first to n-th virtual background planes. The rendering device 106 has the shadow plane initialization device 160, the packet reading device 162, the table record reading device 164, the object determination device 166, the shadow plane rendering device 168, the polygon shadow display determination device 170, the hidden surface removal processing device 172, and the end determination device 174. The shadow plane initialization device 160, the shadow plane rendering device 168, and the bilinear processing device 190 have functions different from those described above.

The shadow plane initialization device 160 writes initial data Di to the background shadow plane 192, as well as to the reference shadow plane 150 and the generation shadow plane 152, to initialize these shadow planes 150, 152, 192. The shadow plane rendering device 168 renders, on the background shadow plane 192, a polygon shadow formed when a polygon shadow projected onto the virtual plane 50 (a polygon shadow rendered on the reference shadow plane 150) is projected onto any of the first to n-th virtual background planes virtually placed behind the objects, based on the distance from the light source 52 to the virtual plane 50 (the Z coordinate of the virtual plane 50) and the distance from the light source 52 to the corresponding virtual background plane (the Z coordinate of that virtual background plane). The bilinear processing device 190 performs a low pass filtering on the polygon shadow rendered on the background shadow plane 192, as well as on the polygon shadow rendered on the generation shadow plane 152.

The operation of the shading process according to the third embodiment will be described below with reference to Figures 26 to 28. In steps S401 to S404 shown in Figure 26, the multi-setting processing device 102 generates an object information table 110, records information relative to the distribution of the objects in the object information table 110, records the world coordinates of the screen 176, and records the type of the light source 52 and the world coordinates of the light source 52, as in steps S101 to S104 performed by the multi-setting processing device 102 in the shading process according to the first embodiment.
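Determining the Z coordinate of a plane in light source coordinates amounts to transforming a world-space point into a frame whose origin is the light source. The following is a minimal sketch under assumed conventions (the light looking along a single forward axis); the actual device 126 would use the engine's own transform pipeline, and all names here are illustrative.

    struct Vec3 { float x, y, z; };

    // The light source coordinate system: the light sits at `origin` and
    // looks along `forward`, the axis on which plane depths are measured.
    struct LightBasis { Vec3 origin, forward; };

    // Light-space Z of a world-space point: the component of the offset
    // from the light source along the light's viewing direction.
    float lightSpaceZ(const LightBasis& light, const Vec3& worldPoint) {
        Vec3 d{ worldPoint.x - light.origin.x,
                worldPoint.y - light.origin.y,
                worldPoint.z - light.origin.z };
        return d.x * light.forward.x + d.y * light.forward.y
             + d.z * light.forward.z;
    }

Evaluating this for a point on the virtual plane 50 yields the value recorded in Z5; evaluating it for points on the first to n-th virtual background planes yields the values recorded in Z21 to Z2n.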
In step S405, the coordinate determination device 124 establishes the distribution of the virtual plane 50 and the first to n-th virtual background planes based on the position of the light source 52 stored in the variable area Z2 and the distribution information of the objects recorded in the object information table 110, calculates the world coordinates of the virtual plane 50 and the virtual background planes 1 to n, and stores the calculated world coordinates in predetermined variable areas Z4, Z11 to Z1n. In step S406, the light source coordinate calculating device 126 calculates the light source coordinates of the virtual plane 50 and the virtual background planes 1 to n based on the world coordinates of the light source 52 and the virtual plane 50 stored in the variable areas Z2, Z4, and stores the Z coordinates of the calculated coordinates in predetermined variable areas Z5, Z21 to Z2n.

The rendering list generation device 104 performs the same processing as in the shading process according to the first embodiment. Accordingly, the processing performed by the rendering list generation device 104 will not be described below.

Then, in step S501 illustrated in Figure 27, the shadow plane initialization device 160 of the rendering device 106 reads the initial data Di from the initial data file 178 and renders the initial data Di on the shadow planes that are employed (the reference shadow plane 150, the generation shadow plane 152, and the background shadow plane 192) to initialize these shadow planes 150, 152, 192. In step S502, the rendering device 106 stores an initial value "FF" in a register R used to store the object number, an initial value "0" in an index register k used to search the packets 114, and an initial value "0" in an index register n used to search the virtual background planes, to thereby initialize the register R and the index registers k, n.

In step S503, the packet reading device 162 reads the k-th packet 114, indicated by the index register k, from the rendering list 116. In step S504, the packet reading device 162 reads the object number i from the read packet 114. In step S505, the object determination device 166 determines whether the present object number i is the same as the previous object number or not, by determining whether the object number i is the same as the value of the register R or not. If the present object number i is different from the previous object number, then control proceeds to step S506, where the table record reading device 164 reads the record i from the object information table 110. In step S507, the shadow plane rendering device 168 copies the texture data relative to a polygon shadow rendered on the generation shadow plane 152 onto the reference shadow plane 150. In step S508, the bilinear processing device 190 performs a low pass filtering on the texture data relative to the polygon shadow rendered on the generation shadow plane 152. In step S509, the object determination device 166 stores the object number i in the register R. Then, in step S510, the rendering device 106 increments the value of the index register n by "+1".

After the processing in step S510 has finished, or if the present object number i is the same as the previous object number in step S505, control proceeds to step S511, shown in Figure 28, where the texture expression processing device 154 performs a normal texture expression process.
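Steps S503 to S510 amount to a loop over the rendering list that detects object changes and re-filters the shadow planes. The skeleton below is an illustration only, under the assumption that ShadowPlane and lowPassFilter are the ones sketched earlier; the real device interleaves this with the texture expression of steps S511 onward, which is elided here.

    #include <vector>

    // Minimal stand-ins, repeated so the skeleton is self-contained.
    struct ShadowPlane { int width = 0, height = 0;
                         std::vector<unsigned char> texels; };
    void lowPassFilter(ShadowPlane& plane);   // defined in the earlier sketch

    struct Packet { int objectNumber; /* polygon coordinates, etc. */ };

    void processRenderingList(const std::vector<Packet>& renderingList,
                              ShadowPlane& generationPlane,
                              ShadowPlane& referencePlane) {
        int registerR = 0xFF;  // S502: previous object number, initialized to "FF"
        int n = 0;             // S502: index register n for the background planes
        for (const Packet& packet : renderingList) {  // S503/S520: index register k
            if (packet.objectNumber != registerR) {   // S505: has the object changed?
                referencePlane = generationPlane;     // S507: generation -> reference copy
                lowPassFilter(generationPlane);       // S508: one blur pass per object change
                registerR = packet.objectNumber;      // S509: remember the object number
                ++n;                                  // S510: next virtual background plane
            }
            // S511-S519: texture expression, shadow texture mapping, and
            // hidden surface removal for the polygon in this packet (not shown).
        }
    }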
Specifically, the texture expression processing device 154 performs a texture expression process, such as shading and texture mapping, based on the screen coordinates of the present polygon and the initial address of a texture table 180. In step S512, the polygon shadow display determination device 170 determines whether a polygon shadow can be displayed on the object or not, based on the polygon shadow display attribute of the light source processing attribute registered in the corresponding record in the object information table 110.

If a polygon shadow can be displayed, then control proceeds to step S513, where the polygon shadow display determination device 170 determines whether the polygon shadow is to be displayed for the first time or not, based on whether the object number i is "0" or not. If it is not the first time, then control proceeds to step S514, in which the shadow plane rendering device 168 renders, on the background shadow plane 192, a polygon shadow formed when a polygon shadow projected onto the virtual plane 50 (a polygon shadow rendered on the reference shadow plane 150) is projected onto the n-th virtual background plane virtually placed behind the object, based on the Z coordinate of the virtual plane 50 and the Z coordinate of the n-th virtual background plane. In step S515, the bilinear processing device 190 performs a low pass filtering on the polygon shadow rendered on the background shadow plane 192, thereby making the polygon shadow blurred according to the distance from the light source.

In step S516, the texture mapping device 156 of the texture expression processing device 154 performs an interpolation, in accordance with a rendering based on the polygon shadow rendered on the reference shadow plane 150, the polygon shadow rendered on the background shadow plane 192, and the light source coordinates of the vertices of the polygon, and applies the projected polygon shadow to the polygon through texture mapping. At this time, as shown in Figure 29, the texture mapping is performed in such a way that the shape of the polygon shadow changes gradually, along the depth of a polygon 200, from the shape of a polygon shadow 204 rendered on the reference shadow plane 150 to the shape of a polygon shadow 206 rendered on the n-th background shadow plane 202n, and the color of the polygon shadow changes gradually, along the depth of the polygon 200, from the color of the polygon shadow 204 rendered on the reference shadow plane 150 to the color of the polygon shadow 206 rendered on the n-th background shadow plane 202n.

If the polygon shadow is displayed for the first time in step S513, then control proceeds to step S517, where the texture mapping device 156 applies the polygon shadow rendered on the reference shadow plane 150 to the polygon to be processed, through texture mapping, based on the projected coordinates of the polygon on the virtual plane 50.

After the processing in step S516 or step S517, or if a polygon shadow cannot be displayed, control proceeds to step S518, where the shadow plane rendering device 168 renders the polygon shadow of the present polygon, in combination with the polygon shadow already rendered, on the generation shadow plane 152, based on the projected coordinates of the present polygon on the virtual plane 50, and paints the combined shadow black, (R, G, B, alpha) = (0, 0, 0, 100%).
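The depth-dependent change of both shape and color described above is, per fragment, the same clamped blend as before, applied to full RGBA texels. A sketch under the same assumptions, where the bilinear fetches from the two planes are already done and zRef, zBg are the light-space depths of the virtual plane 50 and the n-th virtual background plane:

    #include <algorithm>

    struct RGBA { float r, g, b, a; };

    // Fragment-level shadow for step S516: blend the texel fetched from the
    // reference shadow plane 150 with the texel fetched from the background
    // shadow plane 192, weighted by the fragment's light-space depth. The
    // alpha channel carries the shadow's shape (coverage), the RGB channels
    // its color, so both change gradually along the depth of the polygon.
    RGBA shadowAtFragment(RGBA refTexel, RGBA bgTexel,
                          float zFragment, float zRef, float zBg) {
        float t = std::clamp((zFragment - zRef) / (zBg - zRef), 0.0f, 1.0f);
        return RGBA{ refTexel.r + t * (bgTexel.r - refTexel.r),
                     refTexel.g + t * (bgTexel.g - refTexel.g),
                     refTexel.b + t * (bgTexel.b - refTexel.b),
                     refTexel.a + t * (bgTexel.a - refTexel.a) };
    }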
In step S519, the hidden surface removal processing device 172 writes the data of the present polygon into the rendering area 34b while performing hidden surface removal by Z buffering, based on the screen coordinates of the present polygon. In step S520, the rendering device 106 increments the value of the index register k by "+1". Then, in step S521, the end determination device 174 determines whether the processing of all the packets 114 has finished or not. If the processing of all the packets 114 has not finished, then control returns to step S503 to perform the normal texture expression process, the texture mapping of the polygon shadow, and the hidden surface removal in relation to the polygon registered in the next packet 114. If the processing of all the packets 114 recorded in the rendering list 116 has finished in step S521, then the operation of the rendering device 106 comes to an end.

The processing in steps S503 to S519 is repeated, with the following results: in relation to the polygons of the object Ob1, placed closest to the light source 52, only the initial data Di is written on the reference shadow plane 150. If the initial data Di represents transparency, then no polygon shadow is rendered on the polygons of the object Ob1. On the polygons of the object Ob2, which is the second object from the light source 52, there is rendered whichever portion of the polygon shadows of the first object Ob1 from the light source falls within the range represented by the projected coordinates of the polygons of the object Ob2. When the processing of the second object Ob2 has finished, the polygon shadow of the first object Ob1 is rendered on the second object Ob2. At that time, the color of the polygon shadow projected on the object Ob2 is expressed as changing gradually along the depth of the object Ob2 by the rendering (trilinear processing) in step S516. Similarly, on an object Ob3, which is the third object from the light source 52, a combination of the polygon shadow of the first object Ob1 and the polygon shadow of the second object Ob2 is rendered. The color of this polygon shadow is also expressed as changing gradually along the depth of the object Ob3.

In the shading process according to the third embodiment, as described above, when a polygon shadow rendered on the generation shadow plane 152 is expressed on an object, the polygon shadow before being subjected to low pass filtering and the polygon shadow after being subjected to low pass filtering are interpolated, in accordance with a rendering that depends on the light source coordinates of the object to be processed, to thereby control the blur of the polygon shadow. Therefore, the polygon shadow can easily be expressed more realistically.

A shading process according to a fourth embodiment of the present invention will be described below with reference to Figure 30. In the shading process according to the fourth embodiment, as shown in Figure 30, an extended light source 210, such as a flame, is expressed on the objects Ob1, Ob2, Ob3. Such expression is achieved by projecting the extended light source 210 onto the virtual plane 50 in advance. Specifically, a projected image 212 of the extended light source 210 can be preset as the initial data Di that is employed by the rendering device 106 according to the first and second embodiments illustrated in Figure 9, or by the rendering device 106 according to the third embodiment illustrated in Figure 25.

In the rendering device 106 of the shading process according to the third embodiment, for example, the shadow plane initialization device 160 renders the initial data Di on the reference shadow plane 150, the generation shadow plane 152, and the background shadow plane 192 in step S501 shown in Figure 27, and then the polygon shadow is painted black on the object while the extended light source 210 is modulated as light in steps S503 to S519. The polygon shadow and the extended light source 210 are subjected to low pass filtering according to the distance from the extended light source 210 in step S515.

At an initial point P0, only the projected image 212 of the extended light source 210 is rendered on the generation shadow plane 152. When the processing of the object Ob1 has finished at the point P1, a projected image 212a of the extended light source 210, which has been subjected to low pass filtering once, and an umbra 214 of the object Ob1 are rendered on the generation shadow plane 152. When the processing of the object Ob2 has finished at a point P2, a projected image 212b of the extended light source 210, which has been subjected to low pass filtering twice, an umbra 214a of the object Ob1, which has been subjected to low pass filtering once, and an umbra 216 of the object Ob2 are rendered on the generation shadow plane 152.

Modulating the extended light source 210 as light means multiplying the color of the polygon, obtained after the ordinary light source calculations based on normal-line gradients and after texture mapping, by the color of the extended light source rendered on the reference shadow plane 150.
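A minimal sketch of this modulation step, assuming float RGBA texels as in the previous sketch; litColor is the polygon color after ordinary lighting and texture mapping, and extendedLight is the texel fetched from the reference shadow plane 150 at the polygon's projected coordinates. The function name is illustrative.

    struct RGBA { float r, g, b, a; };

    // Multiply the lit polygon color by the extended-light color stored in
    // the reference shadow plane. Where the plane holds the flame's projected
    // image, the polygon is tinted by it; where an object's shadow was
    // painted black, the product goes to black, so light and shadow are
    // carried by the same shadow planes.
    RGBA modulateByExtendedLight(const RGBA& litColor,
                                 const RGBA& extendedLight) {
        return RGBA{ litColor.r * extendedLight.r,
                     litColor.g * extendedLight.g,
                     litColor.b * extendedLight.b,
                     litColor.a };
    }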
In the shading process according to the fourth embodiment, since the projected image 212 of the extended light source 210, such as a flame, is preset as the initial data Di for the shadow planes 150, 152, 192, and the extended light source 210 is reflected on an object while shadows are cast onto it, the projected image of the extended light source 210 and the shadows produced by the extended light source 210 can be easily expressed.

In the above embodiments, hidden surface removal is effected by Z buffering; however, hidden surface removal may also be carried out by Z sorting based on a screen coordinate system.

The image processing method, the image processing apparatus, the recording medium, and the program according to the present invention offer the following advantages:
(1) It is possible to easily express shadows in relation to several objects placed in a complex distribution, or a shadow in relation to an object having a complex shape.
(2) The expression of a shadow in relation to an object can be performed selectively.
(3) Various effects, such as blurring, can be easily applied to a shadow expressed in relation to an object.
(4) Of the various effects on shadows, the blur can be easily controlled for a simple expression of a more realistic shadow.
(5) A projected image of an extended light source, such as a flame, and a shadow produced by the extended light source can be easily expressed.

Although certain preferred embodiments of the present invention have been illustrated and described in detail, it will be understood that various changes and modifications may be made therein without departing from the scope of the appended claims.

Claims (40)

1. A method for processing an image, comprising the steps of: establishing at least one virtual plane (50) from the distribution of several objects (Ob1, Ob2) generated by three-dimensional modeling; and expressing a shadow (56) of the first object (Ob1), projected onto said virtual plane (50) with a light source (52) as a viewpoint, on the second object (Ob2), which is farther from the light source (52) than the first object (Ob1).
2. A method according to claim 1, further comprising the steps of: defining a shadow expression attribute, as to whether the shadow (56) is to be expressed on the objects (Ob1, Ob2), in light source processing attributes of the objects (Ob1, Ob2); and selectively expressing the shadow (56) on the second object (Ob2) based on said shadow expression attribute.
3. A method according to claim 1 or claim 2, further comprising the steps of: establishing a shadow plane (150) serving as a texture plane corresponding to said virtual plane (50); rendering the shadow (56) of the first object (Ob1), formed by projection onto said virtual plane (50), on said shadow plane (150); and applying the shadow (56) rendered on said shadow plane (150) to the second object (Ob2) through texture mapping.
4. A method according to claim 3, wherein said step of applying the shadow (56) to the second object (Ob2) through texture mapping comprises the step of applying the shadow (56) to the second object (Ob2) through texture mapping based on projected coordinates of the second object (Ob2) on said virtual plane (50).
5. A method according to claim 3 or claim 4, wherein said step of applying the shadow (56) to the second object (Ob2) through texture mapping comprises the step of applying the shadow (56) to the second object (Ob2) through texture mapping with respect to each of the polygons of the second object (Ob2).
6. A method according to any of claims 3 to 5, further comprising the steps of: determining coordinates of said objects (Ob1, Ob2) with said light source (52) as a viewpoint; determining projected coordinates of said objects (Ob1, Ob2) on the virtual plane (50) successively in a direction away from said light source (52); and rendering the shadow (56) formed by the first object (Ob1) on said shadow plane (150) based on said projected coordinates each time the texture mapping ends on one of the objects (Ob1, Ob2).
7. A method according to claim 6, further comprising the steps of: determining the coordinates of said objects (Ob1, Ob2) and the projected coordinates of said objects (Ob1, Ob2) on the virtual plane (50) with respect to each of the polygons of the objects (Ob1, Ob2); registering the determined coordinates in a rendering list (116) successively in the direction away from said light source (52); and successively reading the registered coordinates from said rendering list (116) to render the shadow (56) on said shadow plane (150).
8. A method according to claim 6 or claim 7, further comprising the step of: performing a low pass filtering on the shadow (56) rendered on said shadow plane (150) according to at least the distance from said light source (52), to thereby apply a blur to said shadow (56) according to at least the distance from said light source (52).
9. A method according to claim 8, further comprising the step of: interpolating the shadow (56) rendered on said shadow plane (150), when it is expressed on the second object (Ob2), in accordance with a rendering that depends on the shadow before being subjected to low pass filtering, the shadow after being subjected to low pass filtering, and the light source coordinates of the object to be processed, to thereby control the blur of said shadow (56).
10. A method according to any of claims 6 to 9, further comprising the steps of: preparing a reference shadow plane (150) and a generation shadow plane (152) as said shadow plane; each time the objects to be processed change from one to another, copying the shadow (56) rendered on the generation shadow plane (152) onto said reference shadow plane (150); and, each time the shadow (56) on said reference shadow plane (150) is applied through texture mapping with respect to each of the polygons of an object, rendering a projected image (54) of the polygon on said virtual plane (50) as a new combined shadow (56) on said generation shadow plane (152).
11. A method according to claim 10, further comprising the step of: each time the shadow (56) rendered on said generation shadow plane (152) is copied onto said reference shadow plane (150), performing a low pass filtering on the shadow (56) rendered on said generation shadow plane (152).
12. A method according to claim 11, further comprising the steps of: preparing, in addition to the reference shadow plane (150) and the generation shadow plane (152) as said shadow plane, a background shadow plane (192), which is a texture plane corresponding to a virtual background plane placed behind the object to be processed, with the light source (52) as a viewpoint; rendering, on said background shadow plane (192), a shadow formed by projecting a shadow (204) projected onto said virtual plane (50) onto said virtual background plane; and applying a shadow expressed on the object to be processed through texture mapping while the shadow is interpolated in accordance with a rendering based on the shadow (204) rendered on said reference shadow plane (150), the shadow (206) rendered on said background shadow plane (192), and the light source coordinates of said object.
13. A method according to any of claims 3 to 12, further comprising the steps of: establishing an extended light source (210) as an initial value for said shadow plane (150); and reflecting said extended light source (210) and forming a shadow thereof on said object.
14. An apparatus for processing an image, comprising: a first device (102) for establishing at least one virtual plane (50) from the distribution of several objects (Ob1, Ob2) generated by three-dimensional modeling; and a second device (104) for expressing a shadow (56) of the first object (Ob1), projected onto said virtual plane (50) with a light source (52) as a viewpoint, on the second object (Ob2), which is farther from the light source (52) than the first object (Ob1).
15. An apparatus according to claim 14, wherein said first device (102) comprises a device for defining a shadow expression attribute, as to whether the shadow (56) is to be expressed on the objects (Ob1, Ob2), in light source processing attributes of the objects (Ob1, Ob2), and wherein said second device (104) comprises a device for selectively expressing the shadow (56) on the second object (Ob2) based on said shadow expression attribute.
16. An apparatus according to claim 14 or claim 15, wherein said second device (104) has a rendering device (106) for establishing a shadow plane (150) serving as a texture plane corresponding to said virtual plane (50), rendering the shadow (56) of the first object (Ob1), formed by projection onto said virtual plane (50), on said shadow plane (150), and applying the shadow (56) rendered on said shadow plane (150) to the second object (Ob2) through texture mapping.
17. An apparatus according to claim 16, wherein said rendering device (106) comprises a device for applying the shadow (56) to the second object (Ob2) through texture mapping based on projected coordinates of the second object (Ob2) on said virtual plane (50).
18. An apparatus according to claim 16 or claim 17, wherein said rendering device (106) comprises a device for applying the shadow (56) to the second object (Ob2) through texture mapping with respect to each of the polygons of the second object (Ob2).
19. An apparatus according to any of claims 16 to 18, wherein said second device (104) comprises a coordinate calculation device (138) for determining coordinates of said objects (Ob1, Ob2) with said light source (52) as a viewpoint, and for determining projected coordinates of said objects (Ob1, Ob2) on the virtual plane (50) successively in a direction away from said light source (52), and wherein said rendering device (106) comprises a device for rendering the shadow (56) formed by the first object (Ob1) on said shadow plane (150) based on said projected coordinates each time the texture mapping ends on one of the objects (Ob1, Ob2).
20. An apparatus according to claim 19, wherein said second device (104) comprises a rendering list generation device (142) for determining the coordinates of said objects (Ob1, Ob2) and the projected coordinates of said objects (Ob1, Ob2) on the virtual plane (50) with respect to each of the polygons of the objects (Ob1, Ob2), and for registering the determined coordinates in a rendering list (116) successively in the direction away from said light source (52), and wherein said rendering device (106) comprises a device for successively reading the registered coordinates from said rendering list (116) to render the shadow (56) on said shadow plane (150).
21. An apparatus according to claim 19 or claim 20, wherein said rendering device (106) comprises a device for performing a low pass filtering on the shadow (56) rendered on said shadow plane (150) according to at least the distance from said light source (52), to thereby apply a blur to said shadow (56) according to at least the distance from said light source (52).
22. An apparatus according to claim 21, wherein said rendering device (106) comprises a device for interpolating the shadow (56) rendered on said shadow plane (150), when it is expressed on the second object (Ob2), in accordance with a rendering that depends on the shadow before being subjected to low pass filtering, the shadow after being subjected to low pass filtering, and the light source coordinates of the object to be processed, to thereby control the blur of said shadow (56).
23. An apparatus according to any of claims 19 to 22, wherein said rendering device (106) comprises a device for preparing a reference shadow plane (150) and a generation shadow plane (152) as said shadow plane, for copying, each time the objects to be processed change from one to another, the shadow (56) rendered on the generation shadow plane (152) onto said reference shadow plane (150), and for rendering, each time the shadow (56) on said reference shadow plane (150) is applied through texture mapping with respect to each of the polygons of an object, a projected image (54) of the polygon on said virtual plane (50) as a new combined shadow (56) on said generation shadow plane (152).
24. An apparatus according to claim 23, wherein said rendering device (106) comprises a device for performing, each time the shadow (56) rendered on said generation shadow plane (152) is copied onto said reference shadow plane (150), a low pass filtering on the shadow (56) rendered on said generation shadow plane (152).
25. An apparatus according to claim 24, wherein said rendering device (106) comprises a device for preparing, in addition to the reference shadow plane (150) and the generation shadow plane (152) as said shadow plane, a background shadow plane (192), which is a texture plane corresponding to a virtual background plane placed behind the object to be processed, with the light source (52) as a viewpoint, for rendering, on said background shadow plane (192), a shadow formed by projecting a shadow (204) projected onto said virtual plane (50) onto said virtual background plane, and for applying a shadow expressed on the object to be processed through texture mapping while the shadow is interpolated in accordance with a rendering based on the shadow (204) rendered on said reference shadow plane (150), the shadow (206) rendered on said background shadow plane (192), and the light source coordinates of said object.
26. An apparatus according to any of claims 16 to 25, wherein said rendering device (106) comprises a device for establishing an extended light source (210) as an initial value for said shadow plane (150), and for reflecting said extended light source (210) and forming a shadow thereof on said object.
27. A recording medium for storing a program comprising the steps of: (a) establishing at least one virtual plane (50) from the distribution of several objects (Ob1, Ob2) generated by three-dimensional modeling; and (b) expressing a shadow (56) of the first object (Ob1), projected onto said virtual plane (50) with a light source (52) as a viewpoint, on the second object (Ob2), which is farther from the light source (52) than the first object (Ob1).
28. A recording medium according to claim 27, wherein said step (a) comprises the step of defining a shadow expression attribute, as to whether the shadow (56) is to be expressed on the objects (Ob1, Ob2), in light source processing attributes of the objects (Ob1, Ob2), and said step (b) comprises the step of selectively expressing the shadow (56) on the second object (Ob2) based on said shadow expression attribute.
29. A recording medium according to claim 27 or claim 28, wherein said step (b) comprises the steps of: (c) establishing a shadow plane (150) serving as a texture plane corresponding to said virtual plane (50), rendering the shadow (56) of the first object (Ob1), formed by projection onto said virtual plane (50), on said shadow plane (150), and applying the shadow (56) rendered on said shadow plane (150) to the second object (Ob2) through texture mapping.
30. A recording medium according to claim 29, wherein said steps (c) further comprise the step of applying the shadow (56) to the second object (Ob2) through texture mapping based on projected coordinates of the second object (Ob2) on said virtual plane (50).
31. A recording medium according to claim 29 or claim 30, wherein said steps (c) further comprise the step of applying the shadow (56) to the second object (Ob2) through texture mapping with respect to each of the polygons of the second object (Ob2).
32. A recording medium according to any of claims 29 to 31, wherein said step (b) further comprises the steps of determining coordinates of said objects (Ob1, Ob2) with said light source (52) as a viewpoint, and determining projected coordinates of said objects (Ob1, Ob2) on the virtual plane (50) successively in a direction away from said light source (52), and wherein said steps (c) further comprise the step of rendering the shadow (56) formed by the first object (Ob1) on said shadow plane (150) based on said projected coordinates each time the texture mapping ends on one of the objects (Ob1, Ob2).
33. A recording medium according to claim 32, wherein said step (b) further comprises the steps of determining the coordinates of said objects (Ob1, Ob2) and the projected coordinates of said objects (Ob1, Ob2) on the virtual plane (50) with respect to each of the polygons of the objects (Ob1, Ob2), and registering the determined coordinates in a rendering list (116) successively in the direction away from said light source (52), and wherein said steps (c) further comprise the step of successively reading the registered coordinates from said rendering list (116) to render the shadow (56) on said shadow plane (150).
34. A recording medium according to claim 32 or claim 33, wherein said steps (c) further comprise the step of performing a low pass filtering on the shadow (56) rendered on said shadow plane (150) according to at least the distance from said light source (52), to thereby apply a blur to said shadow (56) according to at least the distance from said light source (52).
35. A recording medium according to claim 34, wherein said steps (c) further comprise the step of interpolating the shadow (56) rendered on said shadow plane (150), when it is expressed on the second object (Ob2), in accordance with a rendering that depends on the shadow before being subjected to low pass filtering, the shadow after being subjected to low pass filtering, and the light source coordinates of the object to be processed, to thereby control the blur of said shadow (56).
36. A recording medium according to any of claims 29 to 35, wherein said steps (c) further comprise the steps of preparing a reference shadow plane (150) and a generation shadow plane (152) as said shadow plane, copying, each time the objects to be processed change from one to another, the shadow (56) rendered on the generation shadow plane (152) onto said reference shadow plane (150), and rendering, each time the shadow (56) on said reference shadow plane (150) is applied through texture mapping with respect to each of the polygons of an object, a projected image (54) of the polygon on said virtual plane (50) as a new combined shadow (56) on said generation shadow plane (152).
37. A recording medium according to claim 36, wherein said steps (c) further comprise the step of performing, each time the shadow (56) rendered on said generation shadow plane (152) is copied onto said reference shadow plane (150), a low pass filtering on the shadow (56) rendered on said generation shadow plane (152).
38. A recording medium according to claim 37, wherein said steps (c) further comprise the steps of preparing, in addition to the reference shadow plane (150) and the generation shadow plane (152) as said shadow plane, a background shadow plane (192), which is a texture plane corresponding to a virtual background plane placed behind the object to be processed, with the light source (52) as a viewpoint, rendering, on said background shadow plane (192), a shadow formed by projecting a shadow (204) projected onto said virtual plane (50) onto said virtual background plane, and applying a shadow expressed on the object to be processed through texture mapping while the shadow is interpolated in accordance with a rendering based on the shadow (204) rendered on said reference shadow plane (150), the shadow (206) rendered on said background shadow plane (192), and the light source coordinates of said object.
39. A recording medium according to any of claims 29 to 38, wherein said steps (c) further comprise the steps of establishing an extended light source (210) as an initial value for said shadow plane (150), and reflecting said extended light source (210) and forming a shadow thereof on said object.
40. A program comprising the steps of: (a) establishing at least one virtual plane (50) from the distribution of several objects (Ob1, Ob2) generated by three-dimensional modeling; and (b) expressing a shadow (56) of the first object (Ob1), projected onto said virtual plane (50) with a light source (52) as a viewpoint, on the second object (Ob2), which is farther from the light source (52) than the first object (Ob1).
MXPA/A/2001/008800A 1999-03-08 2001-08-31 Method and apparatus for processing images MXPA01008800A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP11/60884 1999-03-08

Publications (1)

Publication Number Publication Date
MXPA01008800A true MXPA01008800A (en) 2002-05-09


Similar Documents

Publication Publication Date Title
JP3599268B2 (en) Image processing method, image processing apparatus, and recording medium
US5592597A (en) Real-time image generation system for simulating physical paint, drawing media, and feature modeling with 3-D graphics
Winkenbach et al. Rendering parametric surfaces in pen and ink
Ebert et al. Texturing and modeling: a procedural approach
Westermann et al. Efficiently using graphics hardware in volume rendering applications
US7411592B1 (en) Graphical processing of object perimeter information
US7724258B2 (en) Computer modeling and animation of natural phenomena
US7081892B2 (en) Image with depth of field using z-buffer image data and alpha blending
US6717599B1 (en) Method, system, and computer program product for implementing derivative operators with graphics hardware
US7468730B2 (en) Volumetric hair simulation
Yuksel et al. Mesh colors
US20070139435A1 (en) Data structure for texture data, computer program product, and texture mapping method
US7173622B1 (en) Apparatus and method for generating 3D images
US6078333A (en) Images and apparatus for carrying out the method
US6784896B1 (en) Colorization of a gradient mesh
US7133052B1 (en) Morph map based simulated real-time rendering
EP1083521B1 (en) Method of and apparatus for rendering image, recording medium, and program
EP1844445A1 (en) Volumetric shadows for computer animation
KR100295709B1 (en) Spotlight characteristic forming method and image processor using the same
MXPA01008800A (en) Method and apparatus for processing images
Dobashi et al. Radiosity for point-sampled geometry
Papaioannou et al. Enhancing Virtual Reality Walkthroughs of Archaeological Sites.
Pastor et al. Graph-based point relaxation for 3d stippling
JPH07152925A (en) Image generating device
CN117292032A (en) Method and device for generating sequence frame and electronic equipment