EP1190391A1 - Method and apparatus for generating outlines - Google Patents

Method and apparatus for generating outlines

Info

Publication number
EP1190391A1
Authority
EP
European Patent Office
Prior art keywords
data
generating
outline
true
computer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP00927771A
Other languages
English (en)
French (fr)
Inventor
Kazutoshi Nakashima
Tomozo Tamura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Computer Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Inc filed Critical Sony Computer Entertainment Inc
Publication of EP1190391A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/20: Drawing from basic elements, e.g. lines or circles
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/02: Non-photorealistic rendering
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects

Definitions

  • the present invention relates to a technique for adding an outline to an object in computer graphics.
  • Description of the Related Art: Conventionally, drawing methods using polygons have been used for generating virtual three-dimensional images in a relatively simple fashion. In particular, game systems employ the above image generating method using polygons because three-dimensional images must be generated instantaneously in accordance with player operation information. It is known that the image generating method using polygons is carried out through steps such as the following:
  • Step 1: An object is divided into polygon areas, thereby forming object data consisting of a collection of polygon apex data (x, y, z).
  • Step 2: The pattern, i.e., texture data, is set for the above object data.
  • Step 3: The player's point of view is set based on the operating information, and the coordinate values of each piece of polygon apex data in the above object data are changed based on this point of view.
  • Step 4: Brightness according to the distance from the set light source is set for each piece of polygon apex data.
  • Step 5: The determined polygon apex data is converted into two-dimensional data in predetermined units, e.g., every frame. This is generally referred to as perspective conversion.
  • Step 6: The data following perspective conversion is two-dimensional polygon data, whose coordinate values are only x and y.
  • Step 7: Texture data is applied to each of the above polygons, i.e., color and patterns are set in the memory area indicated by the polygon data. This is generally called texture mapping.
  • Step 8: The image data stored in the memory is read out and displayed on a television monitor.
  • a three-dimensional image can thus be obtained with simple processing by using the above procedure (a minimal sketch of steps 3 through 6 follows).
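  • For illustration only, the following minimal C++ sketch (not part of the original disclosure; the function names, the fixed camera model, and the brightness falloff are assumptions) walks one polygon through steps 3 to 6 above: a view-space transform driven by the point of view, a distance-based brightness, and perspective conversion down to (x, y).

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };   // polygon apex data (Step 1)
struct Vec2 { float x, y; };      // apex data after perspective conversion (Step 6)

// Step 3: move a world-space apex into the viewer's coordinate system.
// Here the point of view is only a translation; a real system would also
// rotate the apex according to the player's operating information.
Vec3 toViewSpace(const Vec3& p, const Vec3& eye) {
    return { p.x - eye.x, p.y - eye.y, p.z - eye.z };
}

// Step 4: brightness from the distance to a light source (simple falloff).
float brightness(const Vec3& p, const Vec3& light) {
    float dx = p.x - light.x, dy = p.y - light.y, dz = p.z - light.z;
    float d = std::sqrt(dx * dx + dy * dy + dz * dz);
    return 1.0f / (1.0f + d);     // closer apexes are brighter
}

// Step 5: perspective conversion -- project the 3D apex onto the screen
// plane, leaving only x and y (Step 6).
Vec2 perspectiveConvert(const Vec3& p, float focalLength) {
    return { focalLength * p.x / p.z, focalLength * p.y / p.z };
}

int main() {
    std::vector<Vec3> apexes = { {0, 0, 5}, {1, 0, 5}, {0, 1, 6} };  // one polygon
    Vec3 eye   = { 0, 0, -2 };
    Vec3 light = { 2, 2, 0 };
    for (const Vec3& a : apexes) {
        Vec3 v  = toViewSpace(a, eye);
        float b = brightness(v, light);
        Vec2 s  = perspectiveConvert(v, 256.0f);
        std::printf("apex -> screen (%.1f, %.1f), brightness %.2f\n", s.x, s.y, b);
    }
    // Steps 7 and 8 (texture mapping and display) would fill a frame buffer
    // from this two-dimensional data and show it on the television monitor.
}
```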
  • adding outlines to objects formed of polygons, however, has not been practiced so far. The reason is that, in order to add an outline to an object made of polygons, the edge of each polygon, i.e., the boundary line of each polygon area, must be detected. Inserting this edge detection into the above steps makes the amount of processing very great, which would cause deterioration in the response to the player's operations and render the game impractical.
  • the present invention has been made in order to achieve the above object, and has the following invention-specifying features.
  • the outline generating data generating method comprises a step of generating an object for generating an outline, which is in a homothetic relation with a true object for which normal vectors are set pointing outward from itself, and which is larger than the true object, based on input parameters.
  • the object for generating an outline may have inward-pointing normal vectors set thereto, and the parameters may comprise at least normal vector data and data indicating size with respect to the true object.
  • the outline generating object generating apparatus comprises: parameter input means for inputting at least normal vector data and data indicating size with respect to a true object for which normal vectors are set pointing outward from itself; and outline generating object data generating means for generating an object for generating an outline, which is in a homothetic relation with the true object and which is larger than the true object, based on the data from the parameter input means.
  • the outline generating object recording system comprises: parameter input means for inputting at least normal vector data and data indicating size with respect to a true object for which normal vectors are set pointing outward from itself; outline generating object data generating means for generating an object for generating an outline, which is in a homothetic relation with the true object and which is larger than the true object, based on the data from the parameter input means; and authoring means for recording the true object data and the outline generating object data to a master for generating computer-readable computer-executable media.
  • the computer-readable computer-executable medium for generating the outline generating object stores the following data in a computer-readable and computer-executable manner: true object data for which normal vectors are set pointing outward from itself; object data for generating an outline, which is in a homothetic relation with the true object and which is larger than the true object; and program data for using the object data.
  • the entertainment system comprises: reading means for reading data from a computer-readable computer-executable medium; operating means for inputting various types of operating information; a CPU for performing processing based on program data read from the computer-readable computer-executable medium and operating information input from the operating means; graphic processing means for generating images based on commands from the CPU; and output means for displaying output images from the graphic processing means on a television monitor; wherein the graphic processing means generates an outline for the true object, based on true object data for which normal vectors are set pointing outward from itself and on object data for generating an outline, which is in a homothetic relation with the true object and which is larger than the true object, this data having been stored in the computer-readable computer-executable medium, thus adding an outline to the object based on the outline generating data.
  • Figs. 1A-1F are conceptual diagrams for describing coloring based on normal lines, wherein:
  • Fig. 1A is an explanatory diagram illustrating outward-facing normal lines;
  • Fig. 1B is an explanatory diagram illustrating coloring based on outward-facing normal lines;
  • Fig. 1C is an explanatory diagram illustrating inward-facing normal lines;
  • Fig. 1D is an explanatory diagram illustrating coloring based on inward-facing normal lines;
  • Fig. 1E is a development (unfolded view) illustrating the outer plane of the cube; and
  • Fig. 1F is a development (unfolded view) illustrating the inner plane of the cube.
  • Fig. 2 is a conceptual diagram illustrating the formation of an outline, wherein: Fig. 2A is an explanatory diagram illustrating a true object OB1 set with normal lines facing outwards;
  • Fig. 2B is an explanatory diagram illustrating an object OB2, set with normal lines facing inwards for forming an outline, which is greater than the true object OB1;
  • Fig. 2C is an explanatory diagram illustrating that the outer planes of the object OB2 set with normal lines facing inwards are not colored; and
  • Fig. 2D is an explanatory diagram illustrating the object OB2, which has become an outline in the region where it does not overlap the object OB1 and is thus colored.
  • Fig. 3 is an explanatory diagram illustrating an example of an object drawn using the method shown in Fig. 2.
  • Fig. 4 is a block diagram illustrating a system for generating an outline generating object and recording it on a computer-readable computer-executable medium.
  • Figs. 5A-5B are diagrams illustrating an example of the data structure of an object to which an outline is added, recorded by the system shown in Fig. 4, wherein: Fig. 5A is a format diagram illustrating true object data; and
  • Fig. 5B is a format diagram illustrating outline generating object data.
  • Fig. 6 is a block diagram illustrating an example of an entertainment system for displaying an object with the outline shown in Fig. 5 added thereto.
  • the crux of the present invention is that a true object, to which an outline is to be added, and an object for adding the outline, which is in a homothetic relation with the true object and is larger than the true object, are overlaid, and further that the directions of the normal vectors are made to differ between these two objects.
  • Fig. 1A shows a certain object.
  • outwards-facing normal vectors N are set to the visible planes a1, b1, and c1 of this object.
  • the term outwards-facing means facing outwards from the object, and thus facing toward the player, i.e., the point of view.
  • in Fig. 1B, color is applied to the outer planes a1, b1, and c1 of this object, to which outwards-facing normal vectors N are set.
  • the object does not transmit light, and accordingly the inside thereof is not visible. It is like looking at a cubic box that has been painted solid with red paint, for example.
  • in Fig. 1C, inwards-facing normal vectors N are set to the visible planes a1, b1, and c1 of the object.
  • the term inwards-facing means facing inwards into the object, and thus facing away from the player, i.e., the point of view.
  • in Fig. 1D, color is applied to the inner planes a2, b2, c2, d2, e2, and f2 of this object, to which inwards-facing normal vectors N are set. At this time, color is not applied to the outer planes a1, b1, c1, d1, e1, and f1 of the object.
  • the color on the inside is viewed through the uncolored, i.e., transparent, outer planes. This is like looking at a cubic glass box that has been painted solid with red paint on the inside only, for example.
  • Figs. 1E-1F describe this with the example of a cardboard box.
  • Fig. 1E shows the outside of the cardboard box.
  • Fig. 1F shows the inside of the cardboard box. The inside is colored, but the outside is not colored. Representing this state on a television monitor displays the colored inner planes through the uncolored, transparent outer planes, as shown in Fig. 1D; the sketch below illustrates this visibility rule.
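  • The behavior shown in Figs. 1A-1F can be summarized as a visibility rule: a plane is colored only when its normal vector faces the point of view. The following C++ sketch is an illustration of that rule (it is not code from the patent; the vectors and the sign convention are assumptions chosen for this example):

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// A plane is colored when its normal has a component pointing back toward
// the viewer, i.e., against the viewing direction.
bool isColored(const Vec3& normal, const Vec3& viewDir) {
    return dot(normal, viewDir) < 0.0f;
}

int main() {
    Vec3 viewDir         = { 0, 0, 1 };   // the player looks along +z
    Vec3 frontOutwardN   = { 0, 0, -1 };  // front plane, outward-facing normal N
    Vec3 frontInwardN    = { 0, 0, 1 };   // same front plane, inward-facing normal
    Vec3 farPlaneInwardN = { 0, 0, -1 };  // inner side of the far plane

    // Outward-facing normals: the outer plane is colored (Fig. 1B).
    std::printf("outer plane colored: %d\n", isColored(frontOutwardN, viewDir));   // 1
    // Inward-facing normals: the outer plane stays transparent...
    std::printf("outer plane colored: %d\n", isColored(frontInwardN, viewDir));    // 0
    // ...and the colored inner plane shows through it (Fig. 1D).
    std::printf("inner plane colored: %d\n", isColored(farPlaneInwardN, viewDir)); // 1
}
```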
  • outwards-facing normal vectors N are set for each of the planes of the true object OB1, as shown in Fig. 2A.
  • an outline generating object OB2, which is in a homothetic relation with the true object OB1 and which is larger than the true object OB1, is generated, and inwards-facing normal vectors -N are set for each of the planes of the outline generating object OB2. Accordingly, a texture of color, pattern, etc., that has been specified beforehand is mapped onto the outer planes of the true object OB1, and a pre-specified color for generating the outline is set to the inner planes of the outline generating object OB2.
  • the portion of the true object OB1 is displayed entirely in the state of the predetermined texture of color, pattern, etc., having been mapped thereto, and the remaining portion, i.e., the portion of the outline generating object OB2 that does not overlap with the true object OB1, is displayed entirely in the predetermined color, as shown in Fig. 2D.
  • the color in this example is black, and this portion becomes the outline. What is important here is that the outer planes of the outline generating object OB2 are transparent, so the inner planes of the outline generating object OB2, colored black, are displayed, and these are viewed as the outline. A simple sketch of this overlay follows.
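  • As an illustration of Figs. 2A-2D (again, not code from the patent), the following sketch reduces both objects to their screen silhouettes, assumed here to be concentric circles with a homothetic ratio of 1.1, and walks a single scanline: wherever only the larger outline generating object OB2 is hit, its black inner plane shows through and forms the outline.

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const float trueRadius    = 10.0f;              // silhouette of the true object OB1
    const float ratio         = 1.1f;               // assumed homothetic ratio of OB2 to OB1
    const float outlineRadius = trueRadius * ratio; // silhouette of the outline object OB2

    // Walk one scanline through the center of both silhouettes.
    for (int x = -14; x <= 14; ++x) {
        float r = std::fabs(static_cast<float>(x));
        char pixel;
        if (r <= trueRadius)         pixel = 'T';   // OB1 covers OB2 here: texture shows
        else if (r <= outlineRadius) pixel = '#';   // only OB2: black inner plane = outline
        else                         pixel = '.';   // background
        std::putchar(pixel);
    }
    std::putchar('\n');   // prints a band of 'T' flanked by '#' outline pixels
}
```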
  • Fig. 3 is a display example wherein the above outline generating method has been applied to an actual character. As can be understood from Fig. 3, adding an outline visually makes a great difference as compared to normal characters formed with polygons.
  • the parameter input means 1 shown in Fig. 4 is for performing operations such as, at the least, specifying each piece of polygon apex data for the true object data, setting normal vectors for each plane of the generated object, specifying the homothetic relation with the true object and the size ratio to the true object, and so forth.
  • a keyboard, mouse, digitizer, etc., are used as the parameter input means 1.
  • the outline generating object data generating means 2 generates outline generating object data based on the parameters input from the parameter input means 1, i.e., the data on the direction of the normal vectors and the ratio of size relative to the true object.
  • the authoring means 3 is for recording the true object data, outline generating object data, texture data, program data, etc., on a master in a predetermined format.
  • the authoring means 3 includes register processing systems and the like for manufacturing a master such as a CD-ROM and a stamper.
  • the computer-readable computer-executable medium 4 is manufactured by the stamper cut from a glass master.
  • the polygon apex data for the true object is sequentially input via the parameter input means 1. Once all of the true object data has been input and outwards-facing normal vectors have been set for all polygons, the outline generating object generating process is started.
  • the size ratio data relative to the true object is input via the parameter input means 1.
  • the outline generating object data generating means 2 performs computation processing such that the value of each piece of polygon apex data (x, y, z) of the true object becomes a value based on the above ratio data, and the outline generating object is generated based on the newly computed polygon apex data, as sketched below.
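  • The following C++ sketch illustrates one possible form of that computation (the function name, the choice of the object's center as the center of the homothety, and the ratio value are assumptions for this example, not details taken from the patent):

```cpp
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

struct PolygonApex {
    Vec3 position;   // polygon apex data (x, y, z)
    Vec3 normal;     // normal line data
};

// Enlarge the true object homothetically by 'ratio' and reverse its normals,
// producing the data for the outline generating object.
std::vector<PolygonApex> makeOutlineObject(const std::vector<PolygonApex>& trueObject,
                                           float ratio) {  // ratio > 1.0, from parameter input means 1
    if (trueObject.empty()) return {};

    // Center of the true object, used here as the center of the homothety.
    Vec3 c = { 0.0f, 0.0f, 0.0f };
    for (const PolygonApex& a : trueObject) {
        c.x += a.position.x; c.y += a.position.y; c.z += a.position.z;
    }
    const float n = static_cast<float>(trueObject.size());
    c = { c.x / n, c.y / n, c.z / n };

    std::vector<PolygonApex> outlineObject;
    for (const PolygonApex& a : trueObject) {
        PolygonApex o;
        // Homothetic enlargement: same shape, slightly larger than the true object.
        o.position = { c.x + (a.position.x - c.x) * ratio,
                       c.y + (a.position.y - c.y) * ratio,
                       c.z + (a.position.z - c.z) * ratio };
        // Inward-facing normal vectors (-N).
        o.normal = { -a.normal.x, -a.normal.y, -a.normal.z };
        outlineObject.push_back(o);
    }
    return outlineObject;
}

int main() {
    std::vector<PolygonApex> face = {
        { { 1.0f,  1.0f, 1.0f }, { 0.0f, 0.0f, 1.0f } },
        { {-1.0f,  1.0f, 1.0f }, { 0.0f, 0.0f, 1.0f } },
        { {-1.0f, -1.0f, 1.0f }, { 0.0f, 0.0f, 1.0f } },
        { { 1.0f, -1.0f, 1.0f }, { 0.0f, 0.0f, 1.0f } },
    };
    std::vector<PolygonApex> hull = makeOutlineObject(face, 1.05f);
    std::printf("apex (1, 1, 1) -> (%.2f, %.2f, %.2f), normal z = %.0f\n",
                hull[0].position.x, hull[0].position.y, hull[0].position.z,
                hull[0].normal.z);
}
```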
  • Fig. 5 illustrates data per unit for the true object, and data per unit for the outline generating object.
  • both the data per unit for the true object, and data per unit for the outline generating object are made up of polygon apex data, normal line data, a CLUT (Color Look-Up Table) for specifying color, texture No. data, and so forth.
  • the difference between these two sets of data is as follows.
  • whereas the values x, y, and z of the polygon apex data in the data per unit for the true object (see Fig. 5A) are those of the true object itself, the corresponding values in the data per unit for the outline generating object (see Fig. 5B) have been respectively increased or decreased by predetermined amounts (±Δx, ±Δy, and ±Δz), so that the outline generating object is larger than the true object; one possible layout of such a data unit is sketched below.
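  • For illustration, one way the data per unit of Figs. 5A and 5B could be laid out is sketched below; the field names, the number of apexes per polygon, and the CLUT size are assumptions, since the text only names the kinds of data involved.

```cpp
#include <array>
#include <cstdint>
#include <cstdio>

struct Vec3 { float x, y, z; };

// One data unit as in Fig. 5: polygon apex data, normal line data,
// a CLUT (Color Look-Up Table) for specifying color, texture No. data, etc.
struct ObjectDataUnit {
    std::array<Vec3, 3>           polygonApex;  // apex data (x, y, z) for one polygon
    std::array<Vec3, 3>           normalLine;   // outward for the true object, inward (-N) for the outline object
    std::array<std::uint16_t, 16> clut;         // color look-up table entries
    std::uint16_t                 textureNo;    // texture number (the outline object only needs its CLUT color)
};

int main() {
    // The true object unit and the outline generating object unit share this
    // layout; they differ in that the outline object's apex values are offset
    // so the object is larger, and its normal vectors are reversed.
    std::printf("bytes per data unit in this sketch: %zu\n", sizeof(ObjectDataUnit));
}
```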
  • Fig. 6 illustrates an example of an entertainment system which reads in program data recorded in the computer-readable computer-executable medium 4, draws objects based on this program data and the operating information from the operating means 9, and adds an outline to the object based on true object data and outline generating object data.
  • the entertainment system shown in Fig. 6 is arranged such that a bus 8, made up of control, data, and address buses, is connected to a CPU 7, and connected to this bus 8 are: reading means 5 such as an optical disk drive; main memory 6; operating means 9 such as a controller, keyboard, and so forth; graphics processing means 10 for subjecting the object following perspective conversion to texture mapping processing so as to generate image data for display, based on commands supplied from the CPU 7; output means 11 for converting the image data processed here into standard television signals such as NTSC or PAL, for example; a television monitor 12 for displaying the picture signals from the output means as an image on its display screen; a D/A converter 13 for converting digital audio signals into analog audio signals; an amplifier circuit 14 for performing current amplification of the analog audio signals from the D/A converter 13; and a speaker 15 for outputting the audio signals from the amplifier circuit 14 as audio.
  • the program data, true object data, outline generating object data, texture data, etc. are read out by the reading means 5.
  • the program data, true object data, and outline generating object data are each stored in the main memory 6, and the texture data is held in the graphics processing means 10.
  • when the player operates the operating means 9, the operating information thereof, e.g., information for moving the object, is supplied to the CPU 7 via the bus 8.
  • the CPU 7 determines the point of view based on the above operating information, and changes each set of polygon apex data for the true object data, based on the point of view.
  • the CPU 7 also changes each set of polygon apex data for the outline generating object data.
  • brightness is obtained for each polygon apex, by performing light source calculations according to the position of a light source determined beforehand.
  • following perspective conversion, the polygon apex data for both the true object and the outline generating object consists only of x and y.
  • the CPU 7 supplies the polygon apex data (x, y) of the outline generating object following perspective conversion, the normal line data, and the CLUT to the graphics processing means 10.
  • the CPU 7 likewise supplies the polygon apex data (x, y) of the true object following perspective conversion, the normal line data, the CLUT, and the texture No. data to the graphics processing means 10; a sketch of these two kinds of draw data follows.
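  • The two kinds of data handed to the graphics processing means 10 can be pictured as two draw packets, one without and one with texture No. data. The following C++ sketch is purely illustrative (the struct, field names, and helper functions are assumptions, not an actual console interface):

```cpp
#include <cstdint>
#include <vector>

struct Vec2 { float x, y; };   // apex data after perspective conversion
struct Vec3 { float x, y, z; };

struct DrawPacket {
    std::vector<Vec2> apexes;          // polygon apex data (x, y)
    std::vector<Vec3> normals;         // normal line data (decides which planes get color)
    std::vector<std::uint16_t> clut;   // color look-up table
    bool hasTexture;                   // only the true object carries texture No. data
    std::uint16_t textureNo;
};

// The outline generating object is supplied without texture data; only the
// CLUT color (e.g., black) is needed for its inner planes.
DrawPacket makeOutlinePacket(std::vector<Vec2> apexes, std::vector<Vec3> inwardNormals,
                             std::vector<std::uint16_t> clut) {
    return { std::move(apexes), std::move(inwardNormals), std::move(clut), false, 0 };
}

// The true object is supplied together with its texture number so the
// graphics processing means can map color and pattern onto its outer planes.
DrawPacket makeTrueObjectPacket(std::vector<Vec2> apexes, std::vector<Vec3> outwardNormals,
                                std::vector<std::uint16_t> clut, std::uint16_t textureNo) {
    return { std::move(apexes), std::move(outwardNormals), std::move(clut), true, textureNo };
}

int main() {
    DrawPacket outline = makeOutlinePacket({{0, 0}, {10, 0}, {0, 10}},
                                           {{0, 0, 1}}, {0x0000});
    DrawPacket object  = makeTrueObjectPacket({{1, 1}, {9, 1}, {1, 9}},
                                              {{0, 0, -1}}, {0x7fff}, 3);
    // In the system of Fig. 6 both packets would now be sent over the bus 8
    // to the graphics processing means 10, outline object first.
    return (outline.apexes.size() == 3 && object.hasTexture) ? 0 : 1;
}
```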
  • the graphics processing means 10 sets the polygon area of the outline generating object to the internal frame buffer, and also performs coloring for the outline, based on the CLUT. At this time, the CPU 7 does not add color to the outer plane of this object, but only adds the color specified for the outline to the inner plane thereof, since the directions of the normal vectors indicated by the normal line data point inwards.
  • the graphics processing means 10 sets the polygon area of the true object to the internal frame buffer, maps the texture based on the texture No. data, and performs coloring based on the CLUT.
  • the CPU 7 adds color to the outer plane of the true object, since the directions of the normal vectors indicated by normal line data point outwards.
  • thus, an image with an outline added to the true object is drawn in the frame buffer.
  • This image data is supplied to the output means 11 and converted into picture signals, then supplied to the television monitor 12 and displayed on the display screen thereof as an image.
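  • The drawing order described above, outline generating object first and true object on top of it, can be illustrated with a toy two-dimensional frame buffer (an illustration only; the character grid stands in for real image memory):

```cpp
#include <cstdio>

// A tiny character "frame buffer" standing in for the real one.
const int W = 24, H = 11;
char frameBuffer[H][W + 1];

// Paint a filled axis-aligned rectangle; later draws overwrite earlier ones,
// just as the true object is drawn over the outline generating object.
void drawRect(int cx, int cy, int halfW, int halfH, char color) {
    for (int y = cy - halfH; y <= cy + halfH; ++y)
        for (int x = cx - halfW; x <= cx + halfW; ++x)
            if (x >= 0 && x < W && y >= 0 && y < H)
                frameBuffer[y][x] = color;
}

int main() {
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) frameBuffer[y][x] = '.';
        frameBuffer[y][W] = '\0';
    }
    // 1) outline generating object: slightly larger, colored black ('#').
    drawRect(W / 2, H / 2, 8, 4, '#');
    // 2) true object: drawn on top with its texture ('T').
    drawRect(W / 2, H / 2, 7, 3, 'T');
    // The '#' ring left around the 'T' block is the outline.
    for (int y = 0; y < H; ++y) std::printf("%s\n", frameBuffer[y]);
}
```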
  • as described above, outwards-facing normal vectors are set for the true object, inwards-facing normal vectors are set for the object for adding the outline, and the object for adding the outline is further made to be larger than the true object; these are overlaid, so an image with an outline added thereto can be obtained instantaneously without placing a load on the CPU 7, which is advantageous since an image with an outline added thereto can be obtained in games and the like with good response.
  • likewise, outwards-facing normal vectors are set for the true object, inwards-facing normal vectors are set for the object for adding the outline, and the object for adding the outline is further made to be larger than the true object; these are overlaid, thus yielding the advantage that an image with an outline added thereto can be obtained instantaneously without placing a load on the control system.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Image Generation (AREA)
  • Image Analysis (AREA)
  • Processing Or Creating Images (AREA)
EP00927771A 1999-05-19 2000-05-16 Method and apparatus for generating outlines Withdrawn EP1190391A1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP11139315A JP2000331175A (ja) 1999-05-19 1999-05-19 Method and apparatus for generating outline generating data, recording system, computer-readable executable medium on which the data is recorded, and entertainment system for adding an outline to an object based on the data
JP13931599 1999-05-19
PCT/JP2000/003111 WO2000072269A1 (en) 1999-05-19 2000-05-16 Method and apparatus for generating outlines

Publications (1)

Publication Number Publication Date
EP1190391A1 true EP1190391A1 (de) 2002-03-27

Family

ID=15242453

Family Applications (1)

Application Number Title Priority Date Filing Date
EP00927771A Withdrawn EP1190391A1 (de) 1999-05-19 2000-05-16 Verfahren und vorrichtung zur konturengenerierung

Country Status (11)

Country Link
EP (1) EP1190391A1 (de)
JP (1) JP2000331175A (de)
KR (1) KR20010113952A (de)
CN (1) CN1351736A (de)
AU (1) AU4613600A (de)
BR (1) BR0011207A (de)
CA (1) CA2371364A1 (de)
MX (1) MXPA01011799A (de)
NZ (1) NZ513799A (de)
RU (1) RU2001133349A (de)
WO (1) WO2000072269A1 (de)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6914603B2 (en) 2000-07-03 2005-07-05 Sony Computer Entertainment Inc. Image generating system
JP4488346B2 (ja) * 2004-05-10 2010-06-23 Bandai Namco Games Inc. Program, information storage medium, and image generation system
JP4584665B2 (ja) * 2004-10-01 2010-11-24 Konami Digital Entertainment Co., Ltd. Three-dimensional game image processing program, three-dimensional game image processing method, and video game device
JP5004148B2 (ja) * 2005-12-07 2012-08-22 Sammy Corporation Image generating device, gaming machine, image generating method, and program
JP4671431B2 (ja) * 2006-06-22 2011-04-20 Sammy Corporation Image generating method and device
JP4764381B2 (ja) * 2007-06-05 2011-08-31 Konami Digital Entertainment Co., Ltd. Image processing device, image processing method, and program
GB0805924D0 (en) 2008-04-02 2008-05-07 Hibbert Ralph Animation Storyboard creation system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61246877A (ja) * 1985-04-25 1986-11-04 Canon Inc Graphic conversion device
JPS6282472A (ja) * 1985-10-07 1987-04-15 Canon Inc Image processing method
US5966134A (en) * 1996-06-28 1999-10-12 Softimage Simulating cel animation and shading
US5767857A (en) * 1996-08-30 1998-06-16 Pacific Data Images, Inc. Method, apparatus, and software product for generating outlines for raster-based rendered images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO0072269A1 *

Also Published As

Publication number Publication date
KR20010113952A (ko) 2001-12-28
CN1351736A (zh) 2002-05-29
AU4613600A (en) 2000-12-12
BR0011207A (pt) 2002-02-26
WO2000072269A1 (en) 2000-11-30
MXPA01011799A (es) 2002-04-24
NZ513799A (en) 2001-09-28
RU2001133349A (ru) 2003-08-20
CA2371364A1 (en) 2000-11-30
JP2000331175A (ja) 2000-11-30

Similar Documents

Publication Publication Date Title
US6654020B2 (en) Method of rendering motion blur image and apparatus therefor
US5877769A (en) Image processing apparatus and method
US6144387A (en) Guard region and hither plane vertex modification for graphics rendering
JPH07146952A (ja) Three-dimensional image processing device
US6441818B1 (en) Image processing apparatus and method of same
JP3352982B2 (ja) Rendering method and device, game device, and computer-readable recording medium storing a program for rendering a three-dimensional model
US6774897B2 (en) Apparatus and method for drawing three dimensional graphics by converting two dimensional polygon data to three dimensional polygon data
EP1190391A1 (de) Method and apparatus for generating outlines
KR100542958B1 (ko) Method and apparatus for generating a composite image, and information processing system
JP3547250B2 (ja) Drawing method
JP3052839B2 (ja) Image processing device and processing method thereof
JP3745659B2 (ja) Image generating device and image generating program
JP3872056B2 (ja) Drawing method
JP3375879B2 (ja) Graphics processing method and device
JP4204114B2 (ja) Polygon data processing method
JP3642952B2 (ja) Image compositing method and device, and information processing system
JP3453410B2 (ja) Image processing device and method thereof
JPH07296188A (ja) Pseudo three-dimensional character drawing device
JPH08329280A (ja) Image compositing device
JPH11203486A (ja) Method for displaying translucent objects and image display device using the same
JPH11144074A (ja) Image processing device
JPS63247868A (ja) Three-dimensional graphics display device
JPH06203171A (ja) Image generating device and method
JP2000076480A (ja) Image generating device, image generating method, and storage medium
MXPA98004831A (en) Method and apparatus for generating a composite image and information processing system

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20010821

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE

17Q First examination report despatched

Effective date: 20021223

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20060925