WO1995027263A1 - Rendering 3-d scenes in computer graphics - Google Patents

Rendering 3-d scenes in computer graphics

Info

Publication number
WO1995027263A1
WO1995027263A1 PCT/GB1995/000746 GB9500746W
Authority
WO
WIPO (PCT)
Prior art keywords
faces
scanline
active
list
edges
Prior art date
Application number
PCT/GB1995/000746
Other languages
French (fr)
Inventor
Samuel Littlewood
Original Assignee
Argonaut Technologies Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Argonaut Technologies Limited filed Critical Argonaut Technologies Limited
Priority to JP7525511A priority Critical patent/JPH09511083A/en
Priority to EP95913278A priority patent/EP0753181A1/en
Priority to CA002185906A priority patent/CA2185906A1/en
Publication of WO1995027263A1 publication Critical patent/WO1995027263A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/003D [Three Dimensional] image rendering
    • G06T15/10Geometric effects
    • G06T15/40Hidden part removal
    • G06T15/405Hidden part removal using Z-buffer

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)

Abstract

A method of rendering a 2-D image includes the steps of analysing surfaces facing a viewing direction into scanline sequences which represent continuous surfaces, checking depth values of those surfaces relative to a viewing position and discarding without rendering those objects or surfaces lying behind a foremost surface. This has the effect of extending the scanline algorithm to reduce the amount of work managing and processing lists of faces by exploiting the fact that most 3-D scenes are constructed from continuous surfaces made up of adjoining faces.

Description


  
 



   Rendering 3-D scenes in Computer Graphics
This invention relates to 3-D computer graphics.



  Converting the information relating to a 3-D image into a 2-D projection for a computer requires an assessment of which objects are visible, and which are hidden by others.



  In doing this, it is conventional to analyse all surfaces of objects into smaller flat polygonal faces, often triangles, defined by the coordinates of their vertices. The images for an animation (e.g. for computer games) have to be produced in real time, i.e. at a rate that gives the impression of fairly smooth movement.



  The current techniques are as follows:
Z-Buffer - A depth value is kept for each image pixel.



  Each face in the scene is rendered, and at each pixel the new depth is compared with that already in the image. If the new depth is nearer to the observer than the old, the image pixel and depth are updated.
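The per-pixel depth comparison can be sketched as follows; this is a minimal illustration, not the patent's implementation, and the fragment format, image size and "smaller depth is nearer" convention are assumptions:

```python
# Minimal Z-buffer sketch: every face is rasterised, and a pixel is only
# overwritten when the new fragment is nearer to the observer.
WIDTH, HEIGHT = 4, 4

def render_z_buffer(fragments):
    """fragments: iterable of (x, y, depth, colour) produced by rasterising
    each face.  Smaller depth means nearer to the observer."""
    image = [[None] * WIDTH for _ in range(HEIGHT)]
    zbuf = [[float("inf")] * WIDTH for _ in range(HEIGHT)]
    for x, y, depth, colour in fragments:
        if depth < zbuf[y][x]:        # nearer than what is stored?
            zbuf[y][x] = depth
            image[y][x] = colour
    return image
```

The full-image `zbuf` array is exactly the "depth value kept for each image pixel" that the later discussion identifies as the technique's memory cost.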



  Painters algorithm - Each face in the scene is rendered into the image; the faces are visited in order from 'furthest' to 'nearest'. This order may be generated by sorting the faces at runtime, or by using a binary space partition calculated offline.
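A minimal sketch of the back-to-front ordering, assuming hypothetical (depth, pixels) faces and a "larger depth is further" convention:

```python
# Painters algorithm sketch: faces are drawn back-to-front, so nearer
# faces simply paint over further ones.
WIDTH, HEIGHT = 4, 4

def render_painters(faces):
    """faces: list of (depth, [(x, y, colour), ...]) pairs, where depth is
    a representative depth for the whole face.  Larger depth = further."""
    image = [[None] * WIDTH for _ in range(HEIGHT)]
    for depth, pixels in sorted(faces, key=lambda f: f[0], reverse=True):
        for x, y, colour in pixels:   # later (nearer) faces overwrite
            image[y][x] = colour
    return image
```

Sorting on a single representative depth per face is the "approximate sort" the disadvantages section mentions; faces whose depth ranges interleave can be ordered wrongly by it.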



  Scanline Z Buffer - The image is traversed in scanline order. As each scanline is processed, the faces that intersect this scanline are maintained in an active list.



  A set of depth values is maintained, corresponding to the pixels in a scanline. For each of the current active faces, the section that intersects the current scanline is rendered. At each pixel, a depth test is made, and the image pixel is only updated if the new pixel is nearer.
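The one-scanline depth buffer can be sketched like this (a minimal illustration with an assumed span format and constant-depth spans; real faces would interpolate depth across each span):

```python
# Scanline Z-buffer sketch: depth values are kept only for one scanline
# at a time, and the active faces contribute spans to that line.
WIDTH = 8

def render_scanline_z(scanline_spans):
    """scanline_spans: for one scanline, a list of (x0, x1, depth, colour)
    sections cut from the active faces (x1 exclusive, constant depth
    for simplicity)."""
    colours = [None] * WIDTH
    depths = [float("inf")] * WIDTH
    for x0, x1, depth, colour in scanline_spans:
        for x in range(x0, x1):
            if depth < depths[x]:     # per-pixel depth test, as before
                depths[x] = depth
                colours[x] = colour
    return colours
```

Only one line of depth values is live at a time, which is the memory saving over the full Z-buffer; the cost moves into maintaining the active face lists.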



  Scanline - The image is traversed in scanline order. As each scanline is processed, the faces that intersect this scanline are maintained in an active list. The faces in the active list are sorted along the horizontal axis by the first scanline pixel which they intersect. This sorted list is then processed to generate the sections of faces that are frontmost. Each of these visible sections is then rendered into the output image.
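A sketch of the horizontal sort, assuming each active face contributes an (x0, x1, depth, colour) span on the scanline; the per-pixel inner loop is for brevity only, as a real implementation walks the sorted span boundaries instead:

```python
# Classic scanline sketch: the faces active on one scanline are sorted by
# the first pixel they cover; the frontmost section wins at each pixel.
def visible_sections(active_faces, width):
    """active_faces: list of (x0, x1, depth, colour) spans on this
    scanline, x1 exclusive.  Returns the frontmost colour per pixel."""
    spans = sorted(active_faces, key=lambda s: s[0])  # by first pixel
    line = [None] * width
    for x in range(width):
        best = None
        for x0, x1, depth, colour in spans:
            if x0 > x:
                break                 # sorted: no later span can cover x
            if x < x1 and (best is None or depth < best[0]):
                best = (depth, colour)
        if best:
            line[x] = best[1]
    return line
```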



  These techniques suffer from various disadvantages:
Z Buffer - A large amount of memory is consumed by maintaining a depth value per image pixel. A test is performed per pixel to find out if it is obscured.



  Painters Algorithm - Fully correct sorting is time consuming. An approximate sort can be used, but this leads to visual artifacts. Binary Space Partitions can be used to accelerate the sorting, at the cost of making some or all of the 3-D scene unchangeable.



  Scanline Z Buffer - Extra work is required to maintain the active lists.



  All the above techniques suffer from the problem that the value for an image pixel may be generated several times, once for every face that covers that pixel. Only the 'nearest' value will survive into the output image. If there are complex calculations needed to generate a pixel's colour, the extra work can amount to a significant portion of the overall processing time.



  Scanline - Extra work is required to maintain and sort lists of active faces.



  The invention proposes a new technique which is designed to speed up rendering while reducing the processing power required.



  The invention proposes a method of rendering a 2-D image which includes the steps of analysing surfaces facing the camera into scanline sequences which represent continuous surfaces, checking depth values of those surfaces and discarding without rendering those objects or surfaces lying behind a foremost surface.



  This has the effect of extending the scanline algorithm to reduce the amount of work managing and processing lists of faces by exploiting the fact that most 3-D scenes are constructed from continuous surfaces made up of adjoining faces.



  The invention also extends to image generating apparatus for rendering images by the method herein disclosed. The apparatus includes the means necessary to carry out the described method steps and may be in hardware or software form or in any combination thereof. These means will be apparent to the skilled reader from the teachings herein.



  In order that the invention shall be fully understood, a more detailed example of the technique will now be described.



  A 3-D scene is fully described by defining for each object a series of faces (which together make up a surface) employing coordinates which define the vertices, edges connecting each two vertices, and faces formed within a set of connected edges.



  As a first step, the faces are examined to see whether the camera is in front of or behind each face. If the camera is behind a face, that face is on the back of the object, facing away from the camera, and can be ignored.



  The next step is to look in turn at all the faces which are facing the camera. One imagines going around the edges of each face once with a pen, and keeping a count of how many times one passes over each edge in doing so. Starting from zero, any edge at the silhouette of an object will accumulate a count of 1; an edge between two faces will count 2. Moreover, each edge is marked to say which face lies to the right or left of it.
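The pen-and-count idea can be sketched directly, assuming faces are given as ordered lists of vertex indices:

```python
# Edge-count sketch for silhouette detection: walking once around every
# front-facing face and counting edge traversals leaves silhouette edges
# with a count of 1 and interior (shared) edges with a count of 2.
from collections import defaultdict

def classify_edges(faces):
    """faces: list of faces, each an ordered list of vertex indices.
    Returns (silhouette_edges, interior_edges) as sets of vertex pairs."""
    count = defaultdict(int)
    for face in faces:
        for i in range(len(face)):
            a, b = face[i], face[(i + 1) % len(face)]
            count[frozenset((a, b))] += 1   # direction-independent key
    silhouette = {e for e, c in count.items() if c == 1}
    interior = {e for e, c in count.items() if c == 2}
    return silhouette, interior
```

For example, two triangles sharing one edge yield four silhouette edges and one interior edge.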



  Using this information, two lists are built up for each scanline. A first list identifies those visible silhouette edges which become active on that scanline; the second lists all other edges that become active.



  Now the scene is considered scanline by scanline. A third list is prepared of active surfaces (i.e. sequences of faces). As each new lefthand silhouette edge becomes active on the scanline, an active surface is logged. As other active edges are noted from the second list, each is added to one of the existing active surfaces in the third list by updating the adjoining faces to indicate that this edge is now their neighbour. Thus, each active surface (without regard to depth) is enumerated by starting at the lefthand silhouette edge and then following the neighbour references between edges and faces until the righthand edge is reached.
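The enumeration by neighbour references might be sketched as follows; `right_face_of` and `right_edge_of` are hypothetical stand-ins for the patent's neighbour marks on edges and faces:

```python
# Active-surface sketch: once edges record which face lies on each side,
# a surface on the current scanline can be enumerated by starting at its
# left-hand silhouette edge and following neighbour links rightwards.
def enumerate_surface(left_edge, right_face_of, right_edge_of):
    """left_edge: the left silhouette edge id.  right_face_of maps an
    edge to the face on its right (None at a silhouette edge);
    right_edge_of maps a face to its right-hand edge on this scanline."""
    faces = []
    edge = left_edge
    while True:
        face = right_face_of.get(edge)
        if face is None:              # reached the right-hand silhouette
            return faces
        faces.append(face)
        edge = right_edge_of[face]
```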



  This active list of surfaces is now processed to find the visible segments. The scanline is broken up into runs (groups of pixels separated by left or right hand edges of surfaces). These runs are enumerated in order. During this process, an active list of surfaces that span the run is maintained, sorted by nearest depth.
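Breaking a scanline into runs and sorting the spanning surfaces by nearest depth can be sketched like this, assuming each active surface contributes an (x_left, x_right, near_depth) span:

```python
# Run sketch: a scanline is broken into runs at every surface edge; each
# run is covered by a fixed set of surfaces, kept sorted by nearest depth.
def runs_for_scanline(surfaces, width):
    """surfaces: list of (x_left, x_right, near_depth) spans, x_right
    exclusive.  Yields (x0, x1, spanning) where spanning lists the
    surfaces covering the whole run, nearest first."""
    cuts = sorted({0, width}
                  | {s[0] for s in surfaces}
                  | {s[1] for s in surfaces})
    for x0, x1 in zip(cuts, cuts[1:]):
        spanning = [s for s in surfaces if s[0] <= x0 and s[1] >= x1]
        spanning.sort(key=lambda s: s[2])    # nearest depth first
        yield x0, x1, spanning
```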



  If there are no surfaces in the list for a run, then the section of the scanline is the background colour.



  If the furthest depth of the first surface is nearer than the nearest depth of any further surfaces on the list, then that surface is rendered, and the next run processed.
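The per-run decision then reduces to a comparison of depth intervals; a minimal sketch, assuming each spanning surface carries its nearest and furthest depth over the run:

```python
# Run-resolution sketch: with the spanning surfaces sorted nearest-first,
# a whole run can be settled at once when the first surface does not
# overlap the others in depth.
def resolve_run(spanning):
    """spanning: surfaces covering the run as (near_depth, far_depth, id)
    tuples, sorted by near_depth.  Returns the id to render,
    'background', or 'overlap' when a conventional per-pixel technique
    is needed."""
    if not spanning:
        return "background"
    near0, far0, face_id = spanning[0]
    rest_near = min((s[0] for s in spanning[1:]), default=float("inf"))
    if far0 < rest_near:          # first surface wholly in front
        return face_id            # everything behind it is bulk-rejected
    return "overlap"
```

The "overlap" case corresponds to the complex areas discussed below, which fall back to conventional per-pixel treatment.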



  This technique, although some more memory is required, makes it possible to perform bulk rejection of obscured parts of a scene, based on accepting whole surfaces formed by linked sequences of faces.

 

  Thus, considerably less processing is required than with simple scanline techniques.



  There are complex areas of a scene which do not lend themselves to this simplified treatment, for example where a run has two or more surfaces whose depths overlap.



  This may require that such areas of the scene are treated in more detail by conventional techniques. These areas will require more processing than usual and be slower, but they are usually a minority of the scene and the savings on the majority are greater.



  The disclosures in British patent application no. 9406509.1, from which this application claims priority, and in the abstract accompanying this application are incorporated herein by reference.

Claims

1. A method of rendering a 2-D image including the steps of analysing surfaces facing a viewing direction into scanline sequences which represent continuous surfaces, checking depth values of those surfaces relative to a viewing position and discarding without rendering those objects or surfaces lying behind a foremost surface.
2. A method according to claim 1, including the steps of obtaining coordinates of vertices of an or each object of the image and coordinates of edges connecting each two vertices, and defining faces within a set of connected edges, the faces forming one or more of said surfaces.
3. A method according to claim 2, comprising the steps of examining each face to determine whether the face is in front of or behind the viewing position relative to the viewing direction, and ignoring each face behind the viewing position.
4. A method according to claim 2 or 3, comprising the steps of scanning the image data by line and generating first and second lists for each scanline, the first list identifying those visible silhouette edges at the edge of an object which become active on that scanline; the second list identifying all other edges that become active.
5. A method according to claim 4, comprising the step of identifying as first silhouette edges the silhouette edges of an object first reached during scanning along a scanline.
6. A method according to claim 5, comprising the steps of generating a third list of active faces by the steps of logging an active face as each new first silhouette edge becomes active on the scanline, and, as other active edges are noted from the second list, associating each of said other active edges with one of the existing active faces in the third list by updating the adjoining faces to indicate that this edge is now their neighbour.
7. A method according to claim 6, comprising the step of processing the active list of faces to find the visible segments by dividing the scanline into runs, and enumerating the runs in order so as to sort an active list of faces that span the run by nearest depth relative to the viewing position.
8. A method according to claim 7, comprising the step of, when there are no faces in a list for a run, generating a background image or colour for the associated section of the scanline.
9. A method according to claim 7 or 8, wherein when the furthest depth of the first face is nearer than the nearest depth of any further faces of the list, the first face is rendered, and the next run processed.
10. Image generating apparatus operative to render a 2-D image by a method according to any preceding claim.
PCT/GB1995/000746 1994-03-31 1995-03-31 Rendering 3-d scenes in computer graphics WO1995027263A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP7525511A JPH09511083A (en) 1994-03-31 1995-03-31 3-D representation in computer graphics
EP95913278A EP0753181A1 (en) 1994-03-31 1995-03-31 Rendering 3-d scenes in computer graphics
CA002185906A CA2185906A1 (en) 1994-03-31 1995-03-31 Rendering 3-d scenes in computer graphics

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB9406509A GB9406509D0 (en) 1994-03-31 1994-03-31 Rendering 3-d scenes in computer graphics
GB9406509.1 1994-03-31

Publications (1)

Publication Number Publication Date
WO1995027263A1 true WO1995027263A1 (en) 1995-10-12

Family

ID=10752896

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1995/000746 WO1995027263A1 (en) 1994-03-31 1995-03-31 Rendering 3-d scenes in computer graphics

Country Status (5)

Country Link
EP (1) EP0753181A1 (en)
JP (1) JPH09511083A (en)
CA (1) CA2185906A1 (en)
GB (1) GB9406509D0 (en)
WO (1) WO1995027263A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0840915A1 (en) * 1995-07-26 1998-05-13 Raycer, Incorporated Method and apparatus for span sorting rendering system
US6410643B1 (en) 2000-03-09 2002-06-25 Surmodics, Inc. Solid phase synthesis method and reagent
USRE38078E1 (en) 1994-04-21 2003-04-15 Apple Computer, Inc. Graphical rendering system using simultaneous parallel query Z-buffer and method therefor
US9298311B2 (en) 2005-06-23 2016-03-29 Apple Inc. Trackpad sensitivity compensation

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7583263B2 (en) * 2003-12-09 2009-09-01 Siemens Product Lifecycle Management Software Inc. System and method for transparency rendering

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2406927A1 (en) * 1977-10-19 1979-05-18 Inst Avtomatiki Elektrometri DEVICE FOR THE PRODUCTION OF BLACK AND WHITE OR COLOR IMAGES OF THREE-DIMENSIONAL OBJECTS ON REAL-TIME TELEVISION SCREEN
EP0300703A2 (en) * 1987-07-20 1989-01-25 General Electric Company Depth buffer priority processing for real time computer image generating systems
EP0503251A2 (en) * 1991-03-12 1992-09-16 International Business Machines Corporation Direct display of CSG expression by use of depth buffers
EP0531157A2 (en) * 1991-09-06 1993-03-10 Canon Kabushiki Kaisha Three dimensional graphics processing

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2406927A1 (en) * 1977-10-19 1979-05-18 Inst Avtomatiki Elektrometri DEVICE FOR THE PRODUCTION OF BLACK AND WHITE OR COLOR IMAGES OF THREE-DIMENSIONAL OBJECTS ON REAL-TIME TELEVISION SCREEN
EP0300703A2 (en) * 1987-07-20 1989-01-25 General Electric Company Depth buffer priority processing for real time computer image generating systems
EP0503251A2 (en) * 1991-03-12 1992-09-16 International Business Machines Corporation Direct display of CSG expression by use of depth buffers
EP0531157A2 (en) * 1991-09-06 1993-03-10 Canon Kabushiki Kaisha Three dimensional graphics processing

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
USRE38078E1 (en) 1994-04-21 2003-04-15 Apple Computer, Inc. Graphical rendering system using simultaneous parallel query Z-buffer and method therefor
EP0840915A1 (en) * 1995-07-26 1998-05-13 Raycer, Incorporated Method and apparatus for span sorting rendering system
EP0840915A4 (en) * 1995-07-26 1998-11-04 Raycer Inc Method and apparatus for span sorting rendering system
US5977987A (en) * 1995-07-26 1999-11-02 Raycer, Incorporated Method and apparatus for span and subspan sorting rendering system
US6410643B1 (en) 2000-03-09 2002-06-25 Surmodics, Inc. Solid phase synthesis method and reagent
US9298311B2 (en) 2005-06-23 2016-03-29 Apple Inc. Trackpad sensitivity compensation

Also Published As

Publication number Publication date
JPH09511083A (en) 1997-11-04
EP0753181A1 (en) 1997-01-15
CA2185906A1 (en) 1995-10-12
GB9406509D0 (en) 1994-05-25

Similar Documents

Publication Publication Date Title
US5596685A (en) Ray tracing method and apparatus for projecting rays through an object represented by a set of infinite surfaces
US6529207B1 (en) Identifying silhouette edges of objects to apply anti-aliasing
Raskar et al. Image precision silhouette edges
EP0638875B1 (en) A 3-dimensional animation generating apparatus and a method for generating a 3-dimensional animation
EP1640915B1 (en) Method and system for providing a volumetric representation of a 3-dimensional object
US6204856B1 (en) Attribute interpolation in 3D graphics
US6774910B2 (en) Method and system for providing implicit edge antialiasing
JP4480895B2 (en) Image processing device
US7126600B1 (en) Method and apparatus for high speed block mode triangle rendering
JPH08251480A (en) Method and apparatus for processing video signal
EP0568358B1 (en) Method and apparatus for filling an image
EP0910044A3 (en) Method and apparatus for compositing colors of images with memory constraints
DE602004012341T2 (en) Method and system for providing a volume rendering of a three-dimensional object
US6906715B1 (en) Shading and texturing 3-dimensional computer generated images
US6501481B1 (en) Attribute interpolation in 3D graphics
EP0753181A1 (en) Rendering 3-d scenes in computer graphics
Schollmeyer et al. Efficient and anti-aliased trimming for rendering large NURBS models
US6839058B1 (en) Depth sorting for use in 3-dimensional computer shading and texturing systems
EP2249312A1 (en) Layered-depth generation of images for 3D multiview display devices
EP0725365B1 (en) Method and apparatus for shading three-dimensional images
EP0753182B1 (en) Texture mapping in 3-d computer graphics
JP2532055B2 (en) How to create a time series frame
Chow et al. Fast Display of Articulated Characters using Impostors.
Kurka Image-BASED Occluder Selection
KIM et al. Fast Image Generation Method for Animation

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): CA GB JP US

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2185906

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 1995913278

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 1996 716234

Country of ref document: US

Date of ref document: 19961226

Kind code of ref document: A

WWP Wipo information: published in national office

Ref document number: 1995913278

Country of ref document: EP

WWW Wipo information: withdrawn in national office

Ref document number: 1995913278

Country of ref document: EP