EP1170701A2 - Image generating system - Google Patents

Image generating system

Info

Publication number
EP1170701A2
Authority
EP
European Patent Office
Prior art keywords
color
viewpoint
shifted
relation
image generating
Prior art date
Legal status
Withdrawn
Application number
EP01305716A
Other languages
German (de)
French (fr)
Other versions
EP1170701A3 (en)
Inventor
Junya Shimoda (c/o Sony Computer Entertainment Inc)
Hiroshi Yamamoto (c/o Sony Computer Entertainment Inc)
Current Assignee
Sony Interactive Entertainment Inc
Original Assignee
Sony Computer Entertainment Inc
Priority date
Filing date
Publication date
Application filed by Sony Computer Entertainment Inc filed Critical Sony Computer Entertainment Inc
Publication of EP1170701A2
Publication of EP1170701A3
Legal status: Withdrawn (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 - 3D [Three Dimensional] image rendering
    • G06T15/50 - Lighting effects
    • G06T15/60 - Shadow generation


Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention can display contours at high speed, and realize real-time three dimensional CG animation in a cel animation style. Among the surfaces constituting a 3D model, surfaces facing the back side as seen from the viewpoint are shifted in their normal directions and painted with a darker color than the original surfaces. To this end, an image generating system is provided, comprising:
  • a surface direction judging means for judging a direction of a surface constituting a three dimensional model, in relation to a viewpoint; and
  • a contour generating means for shifting an original surface that faces a back side in relation to the viewpoint, in a direction of a normal, and for painting the shifted surface with a color that is darker than a color of the original surface.

    Description

    The present invention relates to image generating systems.
    Owing to the recent improvement in hardware performance, even a consumer video game machine can generate images of three dimensional CG (Computer Graphics) in real time, realizing three dimensional CG animation.
    Further, according to the contents of a game or the like, it is sometimes desired to display such a three dimensional CG animation in a conventional cel animation style. In a cel animation, an image is generated by drawing lines and painting the inside of each drawing, and, as a result, a character or the like has contours. However, according to ordinary three dimensional CG, contours are not drawn. Thus, to realize display in a cel animation style, contours should be drawn in a 3D model.
    A 3D model is made of, for example, a plurality of surfaces. To draw contours in such a 3D model, it is necessary to draw a line between specific surfaces, namely between a surface that faces the front side and a surface that faces the back side, seen from a viewpoint.
    However, the above-mentioned method first requires information on connection between surfaces, and further it is required to check if a side shared between the adjacent surfaces is a side where a contour should be drawn, based on the information on the connection between those surfaces.
    Accordingly, when the number of polygons becomes larger, so many calculations are required that they cannot be processed in real time.
    Various aspects and features of the present invention are defined in the appended claims.
    Embodiments of the present invention can provide an image generating system that displays contours at high speed, and accordingly, can realize cel animation style display.
    The present invention provides in one aspect an image generating system, comprising: a surface direction judging means for judging a direction of a surface constituting a three-dimensional model, in relation to a viewpoint; and a contour generating means for shifting an original surface that faces a back side in relation to the viewpoint, in a direction of a normal, and for painting the shifted surface with a color that is darker than a color of the original surface.
    Embodiments of the present invention relate to three dimensional computer graphics (CG), and, in particular, to an image generating system for displaying a three dimensional (3D) model in a cel animation style.
    Further, an aspect of the present invention provides a method of generating an image, comprising steps of: judging a direction of a surface constituting a three dimensional model, in relation to a viewpoint; and shifting an original surface that faces a back side in relation to the viewpoint, in a direction of a normal, and painting the shifted surface with a color that is darker than a color of the original surface.
    Further, an aspect of the present invention provides a storage medium that stores an image generating program, and that program causes execution of steps of: judging a direction of a surface constituting a three dimensional model, in relation to a viewpoint; and shifting an original surface that faces a back side in relation to the viewpoint, in a direction of a normal, and painting the shifted surface with a color that is darker than a color of the original surface.
    In the above-mentioned cases, the shifted surface may be generated with a different quantity of shift or a different color for each three dimensional model. And, for example, the shifted surface may be generated with a smaller quantity of shift and with a color closer to the color of the original surface, as the three dimensional model exists more distantly from a screen.
    The above-mentioned program may be distributed by a portable storage medium such as a CD-ROM, a DVD-ROM, a memory card, etc. or through a network.
    The invention will now be described by way of example with reference to the accompanying drawings, throughout which like parts are referred to by like references, and in which:
  • Fig. 1 is a block diagram showing a configuration of an entertainment apparatus to which the present invention is applied;
  • Fig. 2 is a diagram showing internal structure of a graphics processor 110;
  • Fig. 3 is a diagram showing structure of a local memory 220;
  • Fig. 4 is a view showing an example of a 3D model;
  • Fig. 5 is a table showing data structure of the 3D model;
  • Fig. 6 is a view showing a state of respective normal vectors at vertices of the 3D model;
  • Fig. 7 is a view showing a state in which surfaces facing the back side are shifted in the directions of the normals;
  • Fig. 8 is a view showing a state in which shifted surfaces are painted with a darker color;
  • Figs. 9A and 9B are views showing how thicknesses of the contours change according to the direction of a surface;
  • Fig. 10 is a flowchart showing the flow of image generation for a 3D model; and
  • Fig. 11 is a view showing an example of an image generated by the image generating method according to the present invention.
    Now, embodiments of the present invention will be described in detail, referring to the drawings.
    First, an entertainment apparatus according to the present invention will be described. This entertainment apparatus executes an application, such as a game, provided from a storage medium such as a CD/DVD or through a network.
    Fig. 1 is a block diagram showing a configuration of the entertainment apparatus according to the present invention.
    As shown in the figure, this entertainment apparatus comprises an MPU 100, a graphics processor (GP) 110, an I/O processor (IOP) 120, a CD/DVD decoder 130, an SPU 140, an OSROM 150, a main memory 160, and an IOP memory 170.
    The MPU 100 and the graphics processor 110 are connected with each other through a bus 101. The MPU 100 and the IOP 120 are connected with each other through a bus 102. Further, the IOP 120, the CD/DVD decoder 130, the SPU 140, and the OSROM 150 are connected with a bus 103.
    Further, the MPU 100 is connected with the main memory 160, and the IOP 120 is connected with the IOP memory 170. Further, the IOP 120 is connected with a controller (PAD) 180.
    The MPU 100 is a main CPU of this entertainment apparatus. The MPU 100 executes a program stored in the OSROM 150 or a program loaded onto the main memory 160 from a CD or DVD, to perform certain processing.
    The graphics processor 110 is an image generating processor that realizes the rendering function of the present entertainment apparatus. The graphics processor 110 performs image generation, on the instructions of the MPU 100.
    The IOP 120 is an input-output sub processor that controls data transmission and reception between the MPU 100 and a peripheral device (the CD/DVD decoder 130, the SPU 140, or the like).
    The CD/DVD decoder 130 reads data from a CD or DVD mounted in a drive, and transfers the data to the main memory 160.
    The SPU 140 is a sound reproducing processor, and reproduces sound data (such as PCM data) stored in a sound buffer (not shown) at a predetermined sampling frequency, on a sound-producing instruction of the MPU 100.
    The OSROM 150 is a ROM that stores programs executed by the MPU 100 and the IOP 120 at start-up.
    The main memory 160 is the main memory for the MPU 100, and stores instructions executed by the MPU 100, data used by the MPU 100, and the like.
    The IOP memory 170 is a main memory for the IOP 120, and stores instructions executed by the IOP 120, data used by the IOP 120, and the like.
    The controller (PAD) 180 is an interface for transmitting a player's intention to an application or the like during execution of a game or the like.
    Fig. 2 is a diagram showing internal structure of the graphics processor 110. As shown in the figure, the graphics processor 110 comprises a host interface 200, an image generating function block 210, a local memory 220, and a CRTC part 230.
    The host interface 200 is an interface for transmitting and receiving data to and from the MPU 100.
    The image generating function block 210 is a logic circuit that performs rendering on the instructions of the MPU 100. The image generating function block 210 comprises sixteen digital differential analyzers (DDAs) and sixteen pixel engines, and can process up to sixteen 64-bit pixel data in parallel (32 bits of color information and 32 bits of Z value). Each DDA calculates RGB values, a Z value, a texture value, etc. Based on such data, the pixel engine generates final pixel data.
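    Purely as an illustration of the 64-bit pixel format mentioned above, the following Python sketch packs 32 bits of color and a 32-bit Z value into a single 64-bit word and unpacks them again. The RGBA byte order and the placement of the Z value in the upper half are assumptions made for the example; the patent does not specify the bit layout.

```python
def pack_pixel(r, g, b, a, z):
    """Pack 32 bits of color (assumed RGBA, 8 bits per channel) and a
    32-bit Z value into one 64-bit pixel word (layout assumed here)."""
    color = (r & 0xFF) | ((g & 0xFF) << 8) | ((b & 0xFF) << 16) | ((a & 0xFF) << 24)
    return ((z & 0xFFFFFFFF) << 32) | color

def unpack_pixel(pixel):
    """Recover the (r, g, b, a) color and the Z value from a 64-bit word."""
    color = pixel & 0xFFFFFFFF
    z = (pixel >> 32) & 0xFFFFFFFF
    r, g, b, a = (color & 0xFF, (color >> 8) & 0xFF,
                  (color >> 16) & 0xFF, (color >> 24) & 0xFF)
    return (r, g, b, a), z

# Sixteen such 64-bit words would correspond to one batch handled by the
# sixteen pixel engines in parallel.
rgba, z = unpack_pixel(pack_pixel(255, 128, 0, 255, 1000))
assert rgba == (255, 128, 0, 255) and z == 1000
```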
    The local memory 220 stores pixel data generated by the image generating function block 210, texture data transferred from the MPU 100, and the like.
    The CRTC part 230 outputs contents of a frame buffer area in the local memory 220, as a picture signal of a designated output format (NTSC, PAL, VESA format, or the like).
    Fig. 3 is a diagram showing structure of the local memory 220. As shown in the figure, the local memory 220 comprises the frame buffer area 250, a Z-buffer area 260, a texture buffer area 270, and a texture CLUT area 280.
    The frame buffer area 250 and the Z-buffer area 260 are object areas for image generation in the meaning that the frame buffer area 250 stores pixel data as a result of image generation, and the Z-buffer area 260 stores Z values as a result of the image generation.
    The texture buffer area 270 stores image data of texture, and the texture CLUT area 280 stores a color look-up table (CLUT) used when texture is an index color.
    Here, the areas 250 - 280 can be freely arranged at any addresses in any order in the local memory 220, by setting suitable values into a prescribed control register.
    Next, image generation of a 3D model performed by the entertainment apparatus having the above-described structure will be described. An application performs image generation based on a 3D model that is stored in a CD/DVD, as necessary. In the following, description is given with respect to a case where the present invention is applied to a polygon model. However, the present invention can be applied similarly to a spline model.
    In the following, a case where contours are displayed for the hexagonal prism shown in Fig. 4 is discussed. As shown in the figure, the hexagonal prism 40 consists of six surfaces ① to ⑥. Here, for the sake of clarity, the figure shows the contours of the polygon.
    Fig. 5 is a table showing an example of the data structure of a 3D model as shown in Fig. 4. Fig. 5 shows the data structure of a 3D model for a hexagonal prism having a diameter (width in the x-direction) of 1 and a height of 1. Here, it is assumed that the x-axis is directed toward the right in the figure, the y-axis is directed in the protruding direction from the figure, and the z-axis is directed upward in the figure.
    As shown in the figure, the 3D model data includes (x, y, z) coordinates of vertices of the surfaces ① to ⑥ that constitute the 3D model, and (x, y, z) components of a normal vector at each vertex.
    Thus, generally, a 3D model is accompanied with normal (vector) information that indicates directions of the surfaces constituting the model. Here, Gouraud shading is employed as a method of shading. Accordingly, a normal existing at each vertex is an average of normals of the surfaces that abut the vertex in question. Fig. 6 is a view showing a state of respective normal vectors at vertices of the hexagonal prism 40.
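    The following Python sketch is provided only to make the data layout concrete; it is not part of the patent disclosure. It builds the six side surfaces of a hexagonal prism of diameter 1 and height 1 (as in Fig. 5) and computes each vertex normal as the average of the normals of the surfaces that abut that vertex, as stated above for Gouraud shading. The helper names, the counter-clockwise winding, and the placement of the base at z = 0 are assumptions made for the illustration.

```python
import math

def hexagonal_prism(diameter=1.0, height=1.0):
    """Six side surfaces (quads) of a hexagonal prism, each listed as a
    per-surface vertex list (x right, y toward the viewer, z up)."""
    r = diameter / 2.0
    ring = [(r * math.cos(math.radians(60 * i)),
             r * math.sin(math.radians(60 * i))) for i in range(6)]
    faces = []
    for i in range(6):
        (x0, y0), (x1, y1) = ring[i], ring[(i + 1) % 6]
        # One quad per side surface, counter-clockwise seen from outside,
        # so the computed face normal points outward.
        faces.append([(x0, y0, 0.0), (x1, y1, 0.0),
                      (x1, y1, height), (x0, y0, height)])
    return faces

def face_normal(face):
    """Unit normal of a planar face from its first three vertices."""
    (ax, ay, az), (bx, by, bz), (cx, cy, cz) = face[0], face[1], face[2]
    ux, uy, uz = bx - ax, by - ay, bz - az
    vx, vy, vz = cx - ax, cy - ay, cz - az
    nx, ny, nz = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / length, ny / length, nz / length)

def vertex_normals(faces):
    """Per-vertex normal = normalized average of the normals of all
    faces abutting that vertex, as used for Gouraud shading."""
    sums = {}
    for face in faces:
        n = face_normal(face)
        for v in face:
            sx, sy, sz = sums.get(v, (0.0, 0.0, 0.0))
            sums[v] = (sx + n[0], sy + n[1], sz + n[2])
    normals = {}
    for v, (sx, sy, sz) in sums.items():
        length = math.sqrt(sx * sx + sy * sy + sz * sz)
        normals[v] = (sx / length, sy / length, sz / length)
    return normals

faces = hexagonal_prism()
normals = vertex_normals(faces)
```

    Because each side vertex of this open prism is shared by exactly two side surfaces, every vertex normal above is the normalized average of two face normals, which roughly corresponds to the situation depicted in Fig. 6.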
    When such a 3D model is to be displayed on a display screen, the MPU 100 transforms the coordinates of each vertex, based on the viewpoint or the like that corresponds to input from the controller 180, and instructs the graphics processor 110 to generate an image of each surface constituting the 3D model, while specifying color of each vertex, a shading method to be employed, and the like.
    Next, the method of generating contours according to a preferred embodiment will be described. The present method utilizes the normal information of the 3D model and the surfaces facing the back side as seen from the viewpoint, in order to display the contours.
    First, in the present method, the surfaces that face the back side seen from the viewpoint are shifted in the directions of their normal vectors, respectively.
    Fig. 7 is a view showing a state in which the surfaces facing the back side are shifted in the directions of their normals. As shown in the figure, in this example, the surfaces ③, ④, and ⑤ are surfaces facing the back side, and those surfaces ③, ④, and ⑤ are shifted in their normal directions.
    Then, the shifted surfaces are painted with a suitable color (for example, black). This generates a display just as if the contours of the model were drawn. When the surfaces to be shown as the contours are painted not with black but with a color that is darker to some degree (for example, about 50 %) than the color of the original surfaces, contours of soft coloring can be displayed, realizing a soft presentation.
    Fig. 8 is a view showing a state in which the shifted surfaces are painted with a darker color. In the present embodiment, hidden surfaces are removed by a rendering method using the Z-buffer. As a result, the surfaces that are shifted in their normal directions and painted with the darker color are, in fact, painted only in their parts that extend out of the surfaces existing in the foreground. Thus, display is generated as if the contours were drawn on the screen.
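    A minimal per-pixel Z-test, sketched below in Python, illustrates why only those parts of the shifted, darker surfaces that extend beyond the foreground surfaces remain visible: where a nearer original surface has already been drawn, the pixels of the shifted back surface fail the depth comparison and are discarded. The buffer layout and the convention that a smaller Z value is nearer are assumptions for the illustration, not details taken from the patent.

```python
def draw_pixel(frame, zbuf, x, y, z, color):
    """Write a pixel only if it is nearer than what the Z-buffer holds
    (assuming a smaller Z value means nearer to the viewpoint)."""
    if z < zbuf[y][x]:
        zbuf[y][x] = z
        frame[y][x] = color

# One-pixel illustration: the front (original) surface is drawn first,
# then the shifted darker surface lying behind it is rejected by the
# Z-test, so the dark color shows up only where no front surface covers it.
W = H = 1
frame = [[None] * W for _ in range(H)]
zbuf = [[float("inf")] * W for _ in range(H)]
draw_pixel(frame, zbuf, 0, 0, z=1.0, color="original color")
draw_pixel(frame, zbuf, 0, 0, z=2.0, color="darker contour color")
assert frame[0][0] == "original color"
```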
    Further, since the present method generates the image according to the directions of the surfaces (directions of their normals), the widths of the contours depend on the directions of the surfaces.
    Fig. 9 is a view showing how the thicknesses of the contours change according to the direction of a surface. As seen in the figures, Fig. 9A shows thicker contours than Fig. 9B. Thus, the thicknesses of the contours included in one object are not uniform. As a result, it is possible to generate lines of various thicknesses, looking like handwritten lines.
    Further, it is possible also to intentionally change thickness of contours, by multiplying values of normal vectors, which are used for shifting surfaces in the directions of their normals, by a predetermined coefficient. For example, multiplication by 2 makes the thicknesses of the contours twofold, and multiplication by 0.5 makes the thicknesses half.
    In other words, the present method can change coloring and thicknesses of contours by simple calculation, and thus, real-time control can be realized.
    The above-described techniques can be used in such a manner that, for example, a 3D model existing more distantly from the screen has thinner contours and a color closer to the original surface color. As a result, it is possible to provide an expression in which the more distant a 3D model is, the more it matches the background.
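    The patent does not specify how the quantity of shift and the color should fall off with distance; the Python sketch below merely shows one possible mapping in which a more distant model receives a thinner contour and a contour color closer to the original surface color. The 1 / (1 + distance) attenuation and the parameter names are assumptions made for the example.

```python
def contour_coefficients(distance, base_thickness=1.0, base_color_coeff=0.5):
    """Return (thickness coefficient, color coefficient) for a model at the
    given distance from the screen. Thickness shrinks toward 0 and the
    color coefficient approaches 1 (i.e. the original surface color) as the
    model moves away; the 1 / (1 + distance) fall-off is only an example."""
    attenuation = 1.0 / (1.0 + max(distance, 0.0))
    thickness = base_thickness * attenuation
    color_coeff = 1.0 - (1.0 - base_color_coeff) * attenuation
    return thickness, color_coeff

# Nearby model: full-width lines at about half the brightness of the surface.
print(contour_coefficients(0.0))   # (1.0, 0.5)
# Distant model: thin lines whose color is close to the surface color.
print(contour_coefficients(9.0))   # (0.1, 0.95)
```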
    Fig. 10 is a flowchart showing the flow of the above-described image generation. This processing is executed by the MPU 100.
    First, one surface is selected as the object of image processing out of the surfaces constituting the 3D model to be processed, and the direction of that surface is calculated (S1000).
    Next, it is judged if the surface, i.e., the object of the image processing, faces toward the screen (viewpoint) or not (S1002).
    As a result, when the surface as the object of the image processing faces toward the screen (S1002: YES), then, an instruction is given to the graphics processor 110 to simply generate an image of that surface (S1004).
    On the other hand, when the surface as the object of the image processing faces in the opposite direction to the screen (S1002: NO), then, an instruction is given to the graphics processor 110 to simply generate an image of that surface (S1006), and, in addition, an instruction is given to the graphics processor 110 to generate an image of a surface used for displaying contours (S1008). Namely, instructions are given to the graphics processor 110 to generate a surface defined by vertices obtained by adding (normals × the thickness coefficient of the lines) to the vertices defining the surface as the present object of the image processing, and to paint the generated surface with a color obtained by (the color of the original surface × the color coefficient of the lines).
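    The processing of steps S1000 to S1008 can be summarized in the following Python sketch. It is only an outline of the flow described above, not the patented implementation: the facing test uses the sign of the dot product between the face normal and the viewing direction, the callback draw_surface stands in for the image generation instruction given to the graphics processor 110, and per-vertex normals are assumed to be available as in Fig. 6.

```python
def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def render_model_with_contours(surfaces, view_dir, draw_surface,
                               thickness_coeff=0.05, color_coeff=0.5):
    """surfaces: list of (vertices, vertex_normals, face_normal, color).
    view_dir: unit vector pointing from the viewpoint toward the model.
    draw_surface(vertices, color): stands in for the image generation
    instruction given to the graphics processor 110."""
    for vertices, vertex_normals, face_normal, color in surfaces:
        # S1000 / S1002: judge the direction of the surface
        # in relation to the viewpoint.
        faces_viewpoint = dot(face_normal, view_dir) < 0.0

        if faces_viewpoint:
            # S1004: a front-facing surface is simply drawn.
            draw_surface(vertices, color)
        else:
            # S1006: the back-facing surface itself is also drawn
            # (omittable when all models have closed shapes).
            draw_surface(vertices, color)
            # S1008: generate the contour surface -- each vertex is shifted
            # by (normal x thickness coefficient) and the surface is painted
            # with (original color x color coefficient).
            shifted = [(v[0] + n[0] * thickness_coeff,
                        v[1] + n[1] * thickness_coeff,
                        v[2] + n[2] * thickness_coeff)
                       for v, n in zip(vertices, vertex_normals)]
            darker = tuple(c * color_coeff for c in color)
            draw_surface(shifted, darker)

# Example use with a stub renderer (one triangle facing away from the viewpoint):
surfaces = [([(0, 0, 0), (1, 0, 0), (0, 1, 0)],
             [(0, 0, 1)] * 3, (0, 0, 1), (0.8, 0.2, 0.2))]
render_model_with_contours(surfaces, view_dir=(0, 0, 1),
                           draw_surface=lambda verts, col: print(col, verts))
```

    When draw_surface is backed by a Z-buffered renderer, the darker shifted quads are overwritten by the nearer original surfaces everywhere except along the silhouette, which yields the contour display described above.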
    Here, when all (or almost all) 3D models have closed shapes (i.e., shapes whose inner surfaces cannot be seen), the simple image generation processing S1006 for the surfaces facing backward can be omitted.
    By performing the above-described processing for every surface constituting the 3D model, a resultant image having contours can be obtained.
    Fig. 11 is a view showing an example of an image generated by the present method of image generation. As shown in the figure, a character is displayed accompanied by contours, thus giving a display screen like a cel animation.
    According to the above-described method of image generation, every calculation can be performed for one surface at a time, without requiring consideration of another surface. Thus, a storage area for expressing correlation between surfaces is not necessary, and it is not required to check a relation between surfaces. Accordingly, the present method of image generation can realize high speed generation of contours.
    As described above in detail, the present invention can realize, at high speed, a display in which a 3D model appears to be accompanied by contours. Accordingly, real-time three dimensional CG animation in a cel animation style can be realized, for example.
    In so far as the embodiments of the invention described above are implemented, at least in part, using software-controlled data processing apparatus, it will be appreciated that a computer program providing such software control and a storage medium by which such a computer program is stored are envisaged as aspects of the present invention.
    Various different aspects and features of the present invention are defined in the appended claims. Combinations of features from the dependent claims may be combined with features of the independent claims as appropriate and not merely as explicitly set out in the claims.

    Claims (9)

    1. An image generating system, comprising:
      a surface direction judging means for judging a direction of a surface constituting a three dimensional model, in relation to a viewpoint; and
      a contour generating means for shifting an original surface that faces a back side in relation to the viewpoint, in a direction of a normal, and for painting the shifted surface with a color that is darker than a color of the original surface.
    2. The image generating system according to Claim 1, wherein:
      said contour generating means can generate said shifted surface with a different quantity of shift for each three dimensional model.
    3. The image generating system according to Claim 1, wherein:
      said contour generating means can paint said shifted surface with a different color for each three dimensional model.
    4. The image generating system according to Claim 1, wherein:
      said contour generating means can generate said shifted surface with a smaller quantity of shift and with a color closer to the color of the original surface, as the three dimensional model exists more distantly from a screen.
    5. The image generating system according to Claim 2, wherein:
      said contour generating means can generate said shifted surface with a smaller quantity of shift and with a color closer to the color of the original surface, as the three dimensional model exists more distantly from a screen.
    6. The image generating system according to Claim 3, wherein:
      said contour generating means can generate said shifted surface with a smaller quantity of shift and with a color closer to the color of the original surface, as the three dimensional model exists more distantly from a screen.
    7. A method of generating an image, comprising steps of:
      judging a direction of a surface constituting a three dimensional model, in relation to a viewpoint; and
      shifting an original surface that faces a back side in relation to the viewpoint, in a direction of a normal, and painting the shifted surface with a color that is darker than a color of the original surface.
    8. A storage medium that stores an image generating program, wherein said program causes a computer, which has read said program, to execute processes of:
      judging a direction of a surface constituting a three dimensional model, in relation to a viewpoint; and
      giving instructions of shifting an original surface that faces a back side in relation to the viewpoint, in a direction of a normal, and of painting the shifted surface with a color that is darker than a color of the original surface.
    9. A computer program for causing a computer, which has read said program, to execute processes of:
      judging a direction of a surface constituting a three dimensional model, in relation to a viewpoint; and
      giving instructions of shifting an original surface that faces a back side in relation to the viewpoint, in a direction of a normal, and of painting the shifted surface with a color that is darker than a color of the original surface.
    EP01305716A 2000-07-03 2001-07-02 Image generating system Withdrawn EP1170701A3 (en)

    Applications Claiming Priority (2)

    Application Number Priority Date Filing Date Title
    JP2000200784 2000-07-03
    JP2000200784 2000-07-03

    Publications (2)

    Publication Number Publication Date
    EP1170701A2 true EP1170701A2 (en) 2002-01-09
    EP1170701A3 EP1170701A3 (en) 2004-11-10

    Family

    ID=18698595

    Family Applications (1)

    Application Number Title Priority Date Filing Date
    EP01305716A Withdrawn EP1170701A3 (en) 2000-07-03 2001-07-02 Image generating system

    Country Status (2)

    Country Link
    US (1) US6914603B2 (en)
    EP (1) EP1170701A3 (en)

    Cited By (2)

    * Cited by examiner, † Cited by third party
    Publication number Priority date Publication date Assignee Title
    EP1249791A2 (en) * 2001-03-29 2002-10-16 Konami Computer Entertainment Osaka, Inc. 3-D game image processing method and device for drawing border lines
    EP2186062A1 (en) * 2007-08-02 2010-05-19 Disney Enterprises, Inc. Surface shading of computer-generated object using multiple surfaces

    Families Citing this family (4)

    * Cited by examiner, † Cited by third party
    Publication number Priority date Publication date Assignee Title
    JP2006331503A (en) * 2005-05-24 2006-12-07 Funai Electric Co Ltd Optical disk recording/reproducing device
    US20070011617A1 (en) * 2005-07-06 2007-01-11 Mitsunori Akagawa Three-dimensional graphical user interface
    KR100727034B1 (en) 2005-12-09 2007-06-12 한국전자통신연구원 Method for representing and animating 2d humanoid character in 3d space
    US9171396B2 (en) * 2010-06-30 2015-10-27 Primal Space Systems Inc. System and method of procedural visibility for interactive and broadcast streaming of entertainment, advertising, and tactical 3D graphical information using a visibility event codec

    Citations (1)

    * Cited by examiner, † Cited by third party
    Publication number Priority date Publication date Assignee Title
    US5966134A (en) * 1996-06-28 1999-10-12 Softimage Simulating cel animation and shading

    Family Cites Families (14)

    * Cited by examiner, † Cited by third party
    Publication number Priority date Publication date Assignee Title
    JP3330841B2 (en) 1997-04-30 2002-09-30 シャープ株式会社 Three-dimensional image generation method and apparatus
    JP3372832B2 (en) * 1997-07-25 2003-02-04 コナミ株式会社 GAME DEVICE, GAME IMAGE PROCESSING METHOD, AND COMPUTER-READABLE RECORDING MEDIUM CONTAINING GAME IMAGE PROCESSING PROGRAM
    JP3804328B2 (en) 1999-03-02 2006-08-02 株式会社セガ Image processing apparatus and image processing method
    JP2000250194A (en) 1999-03-03 2000-09-14 Lpl:Kk Black box and black box kit
    JP2000331175A (en) 1999-05-19 2000-11-30 Sony Computer Entertainment Inc Method and device for generating border line generating data, recording system, computer readable execution medium stored with data and entertainment system for adding outline to object according to data
    JP3262541B2 (en) 1999-07-08 2002-03-04 株式会社伊藤製作所 Hanging equipment such as safety nets and seats
    JP3352982B2 (en) 1999-09-14 2002-12-03 株式会社スクウェア Rendering method and device, game device, and computer-readable recording medium for storing program for rendering three-dimensional model
    JP3647691B2 (en) 1999-09-30 2005-05-18 コナミ株式会社 Polygon image display method, polygon image creation apparatus, and recording medium
    JP2001243493A (en) 1999-12-24 2001-09-07 Sony Computer Entertainment Inc Method and device for image plotting, recording medium, and program
    JP3807654B2 (en) 1999-12-28 2006-08-09 株式会社スクウェア・エニックス Computer-readable recording medium having recorded video game program, object drawing method in video game, and video game apparatus
    JP3604312B2 (en) 1999-12-28 2004-12-22 株式会社スクウェア・エニックス Computer-readable recording medium having recorded video game program, object drawing method in video game, and video game apparatus
    JP2001202530A (en) 2000-01-24 2001-07-27 Dainippon Printing Co Ltd Method for plotting contour of three-dimensional computer graphics
    JP3732386B2 (en) 2000-05-25 2006-01-05 大日本印刷株式会社 Method for creating outline of 3D computer graphics
    JP2002056404A (en) 2000-08-09 2002-02-22 Taito Corp Border line display method for polygon model in three- dimensional cg video game

    Patent Citations (1)

    * Cited by examiner, † Cited by third party
    Publication number Priority date Publication date Assignee Title
    US5966134A (en) * 1996-06-28 1999-10-12 Softimage Simulating cel animation and shading

    Non-Patent Citations (2)

    * Cited by examiner, † Cited by third party
    Title
    MARTIN D ET AL: "Alhambra: a system for producing 2D animation" COMPUTER ANIMATION, 1999. PROCEEDINGS GENEVA, SWITZERLAND 26-29 MAY 1999, LOS ALAMITOS, CA, USA,IEEE COMPUT. SOC, US, 26 May 1999 (1999-05-26), pages 38-47, XP010343896 ISBN: 0-7695-0167-2 *
    RASKAR R ET AL: "IMAGE PRECISION SILHOUETTE EDGES" PROCEEDINGS OF THE 1999 SYMPOSIUM ON INTERACTIVE 3D GRAPHICS. ATLANTA, GA, APRIL 26 - 28, 1999, PROCEEDINGS OF THE SYMPOSIUM ON INTERACTIVE 3D GRAPHICS, NEW YORK, NY : ACM, US, 26 April 1999 (1999-04-26), pages 135-140,231, XP001032571 ISBN: 1-58113-082-1 *

    Cited By (6)

    * Cited by examiner, † Cited by third party
    Publication number Priority date Publication date Assignee Title
    EP1249791A2 (en) * 2001-03-29 2002-10-16 Konami Computer Entertainment Osaka, Inc. 3-D game image processing method and device for drawing border lines
    EP1249791A3 (en) * 2001-03-29 2002-12-04 Konami Computer Entertainment Osaka, Inc. 3-D game image processing method and device for drawing border lines
    US6831639B2 (en) 2001-03-29 2004-12-14 Konami Computer Entertainment Osaka, Inc. Computer readable storage medium storing 3-D game image processing program, 3-D game image processing method, video game machine, and 3-D game image processing program
    EP2186062A1 (en) * 2007-08-02 2010-05-19 Disney Enterprises, Inc. Surface shading of computer-generated object using multiple surfaces
    EP2186062A4 (en) * 2007-08-02 2013-01-16 Disney Entpr Inc Surface shading of computer-generated object using multiple surfaces
    US8797320B2 (en) 2007-08-02 2014-08-05 Disney Enterprises, Inc. Surface shading of computer-generated object using multiple surfaces

    Also Published As

    Publication number Publication date
    EP1170701A3 (en) 2004-11-10
    US20020033822A1 (en) 2002-03-21
    US6914603B2 (en) 2005-07-05

    Similar Documents

    Publication Publication Date Title
    US6580430B1 (en) Method and apparatus for providing improved fog effects in a graphics system
    US7034828B1 (en) Recirculating shade tree blender for a graphics system
    US6980218B1 (en) Method and apparatus for efficient generation of texture coordinate displacements for implementing emboss-style bump mapping in a graphics rendering system
    EP1189173B1 (en) Achromatic lighting in a graphics system and method
    US20120249566A1 (en) Floating point computer system with frame buffer for storing color values during or after rasterization
    US6738061B2 (en) Method, apparatus, storage medium, program, and program product for generating image data of virtual three-dimensional space
    WO2005109345A1 (en) Display, displaying method, information recording medium, and program
    US7479961B2 (en) Program, information storage medium, and image generation system
    EP1312047B1 (en) Apparatus and method for rendering antialiased image
    US6914603B2 (en) Image generating system
    WO2006024873A2 (en) Image rendering
    US6828969B2 (en) Game system, program and image generating method
    EP1081654A2 (en) Method and apparatus for providing depth blur effects within a 3d videographics system
    US7796132B1 (en) Image generation system and program
    JP4159082B2 (en) Image generation system, program, and information storage medium
    JP3467259B2 (en) GAME SYSTEM, PROGRAM, AND INFORMATION STORAGE MEDIUM
    JP3474179B2 (en) Image drawing system
    US20070115279A1 (en) Program, information storage medium, and image generation system
    US7724255B2 (en) Program, information storage medium, and image generation system
    JP4391632B2 (en) Image generation system and information storage medium
    Nagy Real-time shadows on complex objects
    JP2006004364A (en) Program, information storage medium and image generation system
    JP2006244011A (en) Program, information storage medium and image generation system

    Legal Events

    Date Code Title Description
    PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

    Free format text: ORIGINAL CODE: 0009012

    AK Designated contracting states

    Kind code of ref document: A2

    Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

    AX Request for extension of the european patent

    Free format text: AL;LT;LV;MK;RO;SI

    PUAL Search report despatched

    Free format text: ORIGINAL CODE: 0009013

    AK Designated contracting states

    Kind code of ref document: A3

    Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR

    AX Request for extension of the european patent

    Extension state: AL LT LV MK RO SI

    RIC1 Information provided on ipc code assigned before grant

    Ipc: 7G 06T 15/70 A

    AKX Designation fees paid
    REG Reference to a national code

    Ref country code: DE

    Ref legal event code: 8566

    STAA Information on the status of an ep patent application or granted ep patent

    Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

    18D Application deemed to be withdrawn

    Effective date: 20050511