GB2356114A - Method for displaying a three dimensional object scene - Google Patents

Method for displaying a three dimensional object scene

Info

Publication number
GB2356114A
GB2356114A GB0015421A
Authority
GB
United Kingdom
Prior art keywords
objects
motion vector
buffer
motion
selected object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0015421A
Other versions
GB2356114B (en)
GB0015421D0 (en)
Inventor
Iliese Claire Chelstowski
Charles Ray Johns
Barry Minor
Jnr George Leopold White
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
International Business Machines Corp
Original Assignee
International Business Machines Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by International Business Machines Corp filed Critical International Business Machines Corp
Publication of GB0015421D0 publication Critical patent/GB0015421D0/en
Publication of GB2356114A publication Critical patent/GB2356114A/en
Application granted granted Critical
Publication of GB2356114B publication Critical patent/GB2356114B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

A method for displaying a three-dimensional scene in which some objects are blurred because of motion comprises the steps of categorising the objects in a scene into two sets, a first set containing objects to be blurred and a second set containing objects not to be blurred. The second set of objects is rendered directly to a first buffer. The render time period is then divided into a plurality of time slices, and for each time slice each object in the first set is rendered to the first buffer and the scene is accumulated from the first buffer into an accumulation buffer to be displayed. Thus, static objects (non-blurred) are rendered only once and moving objects (blurred) are rendered over a series of time slices. Preferably the categorising step includes determining if a motion vector is associated with an object.

Description

SYSTEM AND METHOD FOR DISPLAYING A THREE-DIMENSIONAL OBJECT SCENE
The invention relates to the field of information handling systems, and, more particularly, to a system and method for displaying a three-dimensional object scene. Still more particularly, the invention relates to using motion vectors to generate a blurring effect for an object.
BACKGROUND OF THE INVENTION
Three-dimensional (3D) graphics systems are used for a variety of applications, including computer-assisted drafting, architectural design, simulation trainers for aircraft and other vehicles, molecular modelling, virtual reality applications, and video games. Three-dimensional systems are often implemented on workstations and personal computers, which may or may not include 3D graphics hardware. In systems which include 3D graphics hardware, a graphics accelerator card typically facilitates the creation and display of the graphics imagery.
A software application program generates a 3D graphics scene, and provides the scene, along with lighting attributes, to an application programming interface (API). Current APIs include OpenGL, PHIGS, and Direct3D. A 3D graphics scene consists of a number of polygons which are delimited by sets of vertices. The vertices are combined to form larger primitives, such as triangles or other polygons. The triangles (or polygons) are combined to form surfaces, and the surfaces are combined to form an object. Each vertex is associated with a set of attributes, typically including: 1) material colour, which describes the colour of the object to which the vertex belongs; 2) a normal vector, which describes the direction to which the surface is facing at the vertex; and 3) a position, including three Cartesian coordinates x, y, and z. Each vertex may optionally be associated with texture coordinates and/or an alpha (i.e.
transparency) value. In addition, the scene typically has a set of attributes, including: 1) an ambient colour, which typically describes the amount of ambient light; and 2) one or more individual light sources. Each light source has a number of properties associated with it, including a direction, an ambient colour, a diffuse colour, and a specular colour.
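As a concrete illustration, the per-vertex and per-scene attributes listed above could be sketched as plain data structures. This is a minimal sketch for illustration only; the class and field names are not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple, List

@dataclass
class Light:
    # Each light source has a direction and three colour components.
    direction: Tuple[float, float, float]
    ambient: Tuple[float, float, float]
    diffuse: Tuple[float, float, float]
    specular: Tuple[float, float, float]

@dataclass
class Vertex:
    # Position: three Cartesian coordinates x, y, and z.
    position: Tuple[float, float, float]
    # Material colour of the object to which the vertex belongs.
    colour: Tuple[float, float, float]
    # Normal vector: the direction the surface faces at this vertex.
    normal: Tuple[float, float, float]
    # Optional per-vertex attributes.
    texcoords: Optional[Tuple[float, float]] = None
    alpha: float = 1.0  # transparency; fully opaque by default

@dataclass
class Scene:
    # Ambient colour plus zero or more individual light sources.
    ambient_colour: Tuple[float, float, float]
    lights: List[Light] = field(default_factory=list)
```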
Rendering is employed within the graphics system to create two-dimensional image projections of the 3D graphics scene for display on a monitor or other display device. Typically, rendering includes processing geometric primitives (e.g., points, lines, and polygons) by performing one or more of the following operations as needed: transformation, clipping, culling, lighting, fog calculation, and texture coordinate generation.
Rendering further includes processing the primitives to determine component pixel values for the display device, a process often referred to specifically as rasterization.
In some 3D applications, for example, computer animation and simulation programs, objects within the 3D graphics scene may be in motion.
In these cases, it is desirable to simulate motion blur for the objects that are in motion. Without motion blur, objects in motion may appear to move jerkily across the screen.
Similar techniques are also commonly used to blur objects when simulating depth of field. Objects which are within the "field of view" are left un-blurred, while objects which are closer or farther away are blurred according to their distance from the camera (i.e. viewer).
A prior art method for simulating object blur includes the use of an accumulation buffer. The accumulation buffer is a non-displayed buffer that is used to accumulate a series of images as they are rendered. An entire scene (i.e. each object, or primitive, in the scene) is repeatedly rendered into the accumulation buffer over a series of time slices. The entire scene is thus accumulated in the accumulation buffer, and then copied to a frame buffer for viewing on a display device.
A prior art method for using an accumulation buffer to simulate object blur is illustrated in Figure 1. As shown in Figure 1, a time period is divided into "n" time slices (step 100). The time period is the amount of time during which a scene is visible on a display device, and is analogous to the exposure interval, or shutter speed, of a video camera shutter. A longer shutter speed corresponds to a greater amount of blurring, whereas a shorter shutter speed corresponds to a lesser amount of blurring. A time-slice count is set to one (step 102). Next, an object (i.e. primitive) is selected for rendering (step 104). The location, colour, and all other per-vertex values are calculated for each vertex in the object for this particular time slice (step 106). The object is then rendered into a colour buffer (step 108). A check is made to determine if the object rendered is the last object in the scene (step 110). If not, the process loops back to step 104, and is repeated for each object in the scene.
If the last object in the scene has been rendered (i.e. the answer to the question in step 110 is "yes"), the scene is accumulated (step 112), meaning it is scaled (for example, by 1/n) and copied into the accumulation buffer. The time-slice count is checked to see if it is equal to n (step 114). If not, the time-slice count is incremented (step 116). The process then loops back to step 104, and is repeated for each time slice. If the time-slice count is equal to n (i.e. the answer to the question in step 114 is "yes"), then the accumulation buffer is scaled and copied to the frame buffer (step 120) and is displayed on a display screen (step 122).
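The prior-art loop of Figure 1 can be sketched in a few lines. This is a toy model, not real rasterization: a "render" here simply maps an object to a single pixel and colour, but it shows why the method is expensive: every object, static or not, is rendered in every one of the n passes. Step numbers from Figure 1 are noted in the comments.

```python
def render_object(obj, t):
    """Toy 'render': return the pixel and colour of an object at time t.
    Stands in for calculating per-vertex values (step 106) and
    rasterizing into the colour buffer (step 108)."""
    return obj["position"](t), obj["colour"]

def accumulate_scene_prior_art(objects, n):
    """Prior-art method (Figure 1): the ENTIRE scene is rendered n times,
    each pass scaled by 1/n and summed into the accumulation buffer."""
    accum = {}                           # accumulation buffer: pixel -> summed colour
    for t in range(n):                   # steps 102/114/116: loop over n time slices
        colour_buffer = {}
        for obj in objects:              # steps 104-110: render every object
            pixel, colour = render_object(obj, (t + 0.5) / n)
            colour_buffer[pixel] = colour
        for pixel, colour in colour_buffer.items():
            # step 112: accumulate, scaled by 1/n
            accum[pixel] = accum.get(pixel, 0.0) + colour / n
    return accum                         # step 120 would copy this to the frame buffer
```

A static object lands on the same pixel in every pass and accumulates to its full colour; a moving object spreads 1/n of its colour across each pixel it visits, producing the blur.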
The use of an accumulation buffer as described in Figure 1 is a computationally expensive process, as the entire scene (i.e. each object in the scene) is rendered "n" times for each time period. Consequently, it would be desirable to have a system and method for more efficiently simulating object blur in a three-dimensional graphics environment.
SUMMARY OF THE INVENTION
Accordingly, the present invention is directed to a system, method, and computer-usable medium for simulating object blur using motion vectors.
A motion vector, or array of motion vectors, may be specified on either a per-vertex or per-primitive (i.e. per-object) basis. A motion vector is opposite to the direction of the motion, and thus points in the direction of the blur. The magnitude of the motion vector represents the distance the vertex or the primitive (i.e. each vertex in the primitive) travels in one unit of time.
When a scene is rendered, only those objects which are in motion, or which are subject to depth of field blurring, are rendered over a series of time slices. All objects which are static (i.e. non-blurred) are rendered directly into a colour buffer, rather than being repeatedly rendered over a series of time slices. Thus, static (i.e. non-blurred) objects are rendered only once, while objects which are to be blurred are rendered over a series of time slices. This increases the efficiency of the rendering process while simulating object blur of the objects which are in motion and/or subject to depth of field blurring.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other features and advantages of the present invention will become more apparent from the detailed description for
carrying out the invention as rendered below. In the description to follow, reference will be made to the accompanying drawings, where like reference numerals are used to identify like parts in the various views and in which:
Figure 1 is a flow chart illustrating a prior art method for simulating object blur;
Figure 2 is a representative system in which the present invention may be implemented;
Figure 3 depicts a moving object, including a motion vector, within a static scene;
Figure 4 depicts a moving object, including an array of motion vectors, within a static scene; and
Figures 5A and 5B are flow charts illustrating a method for using motion vectors to simulate object blur in accordance with the present invention.
DETAILED DESCRIPTION OF A PREFERRED EMBODIMENT OF THE INVENTION
A representative system in which the present invention may be implemented is illustrated in Figure 2. Information handling system 200 includes one or more processors 202 coupled to a processor or host bus 204.
Cache memory 206 may also be coupled to host bus 204. A bridge/memory controller 208 provides a path between host bus 204 and system memory 210, as well as a path between host bus 204 and peripheral bus 212. Note that system memory 210 may include both read only memory (ROM) and random access memory (RAM). Accumulation buffer 214 is included within system memory 210. Alternately, accumulation buffer 214 can be included in graphics adapter 218. In one embodiment, peripheral bus 212 is a PCI bus or other bus suitable for supporting high performance graphics applications and hardware. Graphics adapter 218 is coupled to peripheral bus 212, and may include local memory portion 220, and frame buffer 221. System 200 may or may not include graphics adapter 218, and if graphics adapter 218 is not present in system 200, then frame buffer 221 may be included in system memory 210 or in video controller 216. Video controller 216 is coupled to display device 222, and is configured to refresh display device 222 with a graphics image stored in frame buffer 221. Note that graphics adapter 218 may be suitably integrated in a single device with video controller 216.
The present invention is a system, method, and computer-usable medium for simulating object blur using motion vectors. A motion vector, or array of motion vectors, may be specified on either a per-vertex or per-primitive (i.e. per-object) basis. A motion vector is opposite to the direction of the motion, and thus points in the direction of the blur. The magnitude of the motion vector represents the distance the vertex or the primitive (i.e. each vertex in the primitive) travels in one unit of time.
When a scene is rendered, only those objects which are in motion are rendered over a series of time slices. All objects which are static (i.e. non-moving) are rendered directly into a colour buffer, rather than being repeatedly rendered over a series of time slices. In the prior art, every object in a scene (whether static or in motion) is rendered over a series of time slices, and then accumulated, as discussed above in the Background Of The Invention section herein. The present invention renders static objects only once, and only performs rendering over a series of time slices for those objects which are in motion. This increases the efficiency of the rendering process while simulating object blur of the objects which are in motion. The present invention may also be used to simulate object blur associated with depth of field blurring.
Referring to Figure 3, an example of a linearly moving object within a static scene will now be described. While many objects within a typical scene may be in motion, for illustrative purposes Figure 3 depicts a single object 300 in motion within a static scene 302. Note that motion vector 304 is opposite to the direction of motion 306. The magnitude of motion vector 304 represents the distance over which object 300 moves in a predefined period of time. In the example shown in Figure 3, a single motion vector 304 has been specified for the object. Thus, motion vector 304 applies to each vertex of object 300. During the predefined time period, the vertices of object 300 have moved from points a1, b1, and c1 to points a2, b2, and c2 respectively. The magnitude of motion vector 304 is thus equal to a2-a1. The magnitude is also equal to b2-b1, and equal to c2-c1.
Of course, each vertex of object 300 does not have to be moving at the same velocity. It is possible to assign a different motion vector to each vertex of an object. Each vertex may be moving in a different direction and/or at a different rate of speed.
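Under the sign convention above (the vector points opposite to the motion, with magnitude equal to the distance travelled per unit of time), the position of a vertex at any earlier point in the blur interval can be recovered from its current position. A minimal sketch, assuming linear motion; the function name is illustrative, not from the patent:

```python
def vertex_position_at(p_now, motion_vector, f):
    """Position of a vertex at fraction f of one unit of time
    (f = 0.0: one time unit ago; f = 1.0: the current position).
    The motion vector points opposite to the motion, i.e. back toward
    where the vertex came from, so we add a share of it to step back."""
    x, y, z = p_now
    vx, vy, vz = motion_vector
    w = 1.0 - f  # how far back along the vector to travel
    return (x + vx * w, y + vy * w, z + vz * w)
```

For the Figure 3 example, a vertex now at a2 with motion vector a1 - a2 evaluates to a1 at f = 0 and back to a2 at f = 1; rendering the object at several intermediate values of f produces the blur trail.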
Referring to Figure 4, an example of a non-linearly moving object within a static scene will now be described. As in Figure 3, for illustrative purposes only, a single moving object 400 is depicted within a static scene 402. There are several motion vectors 404, 406, 408, and 410 associated with object 400. Each motion vector has a magnitude equal to a portion of the distance travelled by object 400 during a predefined time period. In the example shown, each motion vector 404, 406, 408, and 410 applies to every vertex of object 400. During the predefined time period, the vertices of object 400 move from points a1, b1, and c1 to points a2, b2, and c2 respectively. Each vertex moves uniformly with the other vertices; however, object 400 (and its vertices) does not move linearly.
Thus, each motion vector 404, 406, 408, and 410 includes a magnitude equal to the distance travelled during a portion of the predefined time period.
As in Figure 3, each motion vector is opposite to the direction of motion 412. Motion vectors 404, 406, 408, and 410 are referred to as an array of motion vectors. An array of motion vectors may be assigned to an object, as shown in Figure 4, in which case the array of motion vectors applies to every vertex in the object. Alternately, an array of motion vectors may be assigned to a vertex.
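A non-linear path described by an array of motion vectors might be walked as follows. The patent does not specify the ordering or time weighting of the array, so this sketch assumes each vector spans an equal share of the time period, with `vectors[0]` as the most recent segment; both choices are assumptions for illustration:

```python
def position_along_path(p_now, vectors, f):
    """Position at fraction f of the time period (0.0 = start of the
    period, 1.0 = the current position), reconstructed by walking back
    along an array of motion vectors. Each vector points opposite to
    the motion over its segment of the period."""
    n = len(vectors)
    x, y, z = p_now
    back = (1.0 - f) * n  # distance to walk back, measured in segments
    for i in range(n):
        # apply segment i fully, partially, or not at all
        step = min(1.0, max(0.0, back - i))
        vx, vy, vz = vectors[i]
        x, y, z = x + vx * step, y + vy * step, z + vz * step
    return (x, y, z)
```

Evaluating this at the midpoints of the n time slices gives the per-slice vertex positions for a non-linearly blurred object, analogous to the single-vector linear case above.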
An Application Programming Interface (API) is preferably provided in order to allow an application program to specify motion vectors for objects and vertices. An exemplary OpenGL API is depicted below. Note that this API is shown for illustrative purposes only, and is not meant to be limiting. Those skilled in the art will appreciate that motion vectors may be specified using a variety of programming techniques. Further, the use of OpenGL as an API is not meant to be limiting. The present invention may be implemented using various APIs, including, but not limited to PHIGS and Direct3D.
An exemplary OpenGL API is as follows:
Overview

This extension allows object blur to occur via a point, line, or edge along a specified motion vector. The motion vector is opposite to the direction of motion, thus it points in the direction of blur. The magnitude of the vector is the distance each vertex has travelled in one unit of time.
The "glMotionVector" routines allow the application to specify motion vectors or arrays of motion vectors on a per-vertex basis or a per-primitive basis. The "glMotionEnv" routines allow the application to specify the duration of motion, the degree of fade over time, and whether motion blur (i.e. object blur) is enabled or disabled.
Procedures And Functions

1. void glMotionVector[bsifd]IBM(T xcomponent, ycomponent, zcomponent)
Purpose: Specify a motion vector for an object
Variables: Three [b]ytes, [s]hortwords, [i]ntegers, [f]loating point numbers, or [d]ouble precision floats specifying a 3D vector

2. void glMotionVectorv[bsifd]IBM(T components)
Purpose: Specify a motion vector for a vertex
Variables: An array specifying a 3D vector

3. void glMotionVectorPointerIBM(int size, enum type, sizei stride, void *pointer)
Purpose: Specify an array of motion vectors for an object or vertex
Variables: The size, type, and stride of a list of motion vectors pointed to by the pointer variable

4. void glMotionEnv[if]IBM(GLenum pname, GLfloat param)
Purpose: Specifies the duration and degree of blur
Variables: If pname is equal to GL_MOTION_ENV_FADE_IBM, then param specifies the degree to which the object is faded over time. If pname is equal to GL_MOTION_ENV_DELTA_TIME_IBM, then param specifies the number of units of time to blur.
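The glMotionEnv state could be modelled as a small piece of per-context state. Everything below is an assumption for illustration: the enum values are made up, and the patent does not define the fade curve, so a linear fade toward the older end of the blur interval is assumed in the `slice_weight` helper:

```python
# Hypothetical enum values mirroring the extension's parameter names.
GL_MOTION_ENV_FADE_IBM = 0x1001
GL_MOTION_ENV_DELTA_TIME_IBM = 0x1002

class MotionEnv:
    """Toy model of glMotionEnv state: degree of fade and blur duration."""

    def __init__(self):
        self.fade = 0.0        # degree of fade over time (0 = no fade)
        self.delta_time = 1.0  # number of units of time to blur

    def set(self, pname, param):
        if pname == GL_MOTION_ENV_FADE_IBM:
            self.fade = float(param)
        elif pname == GL_MOTION_ENV_DELTA_TIME_IBM:
            self.delta_time = float(param)
        else:
            raise ValueError("unknown pname")

    def slice_weight(self, t, n):
        """Accumulation weight for time slice t of n: the usual 1/n,
        reduced for older slices by the fade factor (assumed linear)."""
        base = 1.0 / n
        age = 1.0 - (t + 0.5) / n  # ~0 for the newest slice, ~1 for the oldest
        return base * (1.0 - self.fade * age)
```

With fade left at zero every slice contributes equally (1/n); raising the fade makes the trailing end of the blur dimmer, which is one plausible reading of "the degree to which the object is faded over time".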
Referring to Figures 5A and 5B, a flow chart illustrating a method for using motion vectors to simulate object blur in accordance with the present invention will now be described. Note that the steps described in Figures 5A and 5B can be performed either in software or in hardware (e.g., by a graphics accelerator), or by a combination of software and hardware.
An object within a scene is defined for rendering (step 500), meaning that the location, colour, motion vector(s), and other attributes are defined for the object. The environment of the scene, along with any motion vectors associated with the object, are used to determine whether the object is static or in motion (step 502). Note that the determination in step 502 could further include a determination as to whether or not the object needs to be blurred due to its depth of field. If the object is not static (i.e. the answer to the question in step 504 is "no"), then the object is identified as an "in-motion" object (step 506). If the object is static (i.e. the answer to the question in step 504 is "yes"), then the object is rendered into a colour buffer. The colour buffer can be any displayed or non-displayed buffer area. For example, the colour buffer may be a portion of system memory 210 or local memory 220 on graphics adapter 218, as described above with reference to Figure 2.
Referring back to Figures 5A and 5B, a check is made to determine if the object is the last object in the scene (step 510). If not (i.e. the answer to the question in step 510 is "no"), then another object is defined for rendering in step 500. If the object is the last object in the scene (i.e. the answer to the question in step 510 is "yes"), then the in-motion objects (i.e. the objects that require blur) are processed. One skilled in the art will realize that if there are no "in-motion" objects in the scene, the colour buffer may be copied to the frame buffer at this point, and the scene may be displayed. For illustrative purposes, the process depicted in Figures 5A and 5B assumes a combination of static and "in-motion" objects in the scene.
A predetermined render time period is divided into "n" time slices (step 512). As discussed above with reference to Figure 1, the render time period is the amount of time during which a scene is visible on a display device, and is analogous to the exposure interval, or shutter speed, of a video camera shutter. A longer shutter speed corresponds to a greater amount of blurring, whereas a shorter shutter speed corresponds to a lesser amount of blurring. A time-slice count is set to one (step 514). Next, an "in-motion" object (i.e. an object identified as an "in-motion" object in step 506) is selected for rendering (step 516). The motion vector or vectors associated with the object are used, along with other motion variables, to calculate and/or modify the location, colour, and all other attributes for each vertex in the object (step 518). The object is then rendered into a colour buffer (step 520). A check is made to determine if the object rendered is the last "in-motion" object in the scene (step 522).
If not, the process loops back to step 516, and is repeated for each "in-motion" object in the scene.
If the last "in-motion" object in the scene has been rendered (i.e. the answer to the question in step 522 is "yes"), the scene is accumulated (step 524), meaning it is scaled (for example, by 1/n) and copied from the colour buffer into the accumulation buffer. The time-slice count is checked to see if it is equal to n (step 526). If not, the time-slice count is incremented (step 528), and the process then loops back to step 516, and is repeated for each time slice. If the time-slice count is equal to n (i.e. the answer to the question in step 526 is "yes"), then the accumulation buffer is scaled and copied to the frame buffer (step 532) and is displayed on a display screen (step 534).
Note that it is possible to define two entry points into the method described in Figures 5A and 5B, or alternately, it is possible to have two separate routines to execute the method described in Figures 5A and 5B.
For example, the determination as to whether an object is in motion (i.e.
needs to be blurred) or not can be made by an application program. If the application program determines that an object is static, the application program can call a routine which executes only steps 500 through 510 to render the static object. If the application program determines that an object needs to be blurred, the application program can call a routine which executes only steps 512 through 534 to render the in-motion object.
Although the invention has been described with a certain degree of particularity, it should be recognised that elements thereof may be altered by persons skilled in the art without departing from the spirit and scope of the invention. One of the implementations of the invention is as sets of instructions resident in the random access memory of one or more computer systems configured generally as described in Figure 2. Until required by the computer system, the set of instructions may be stored in another computer readable memory, for example in a hard disk drive, or in a removable memory such as an optical disk for eventual use in a CD-ROM drive, or a floppy disk for eventual use in a floppy disk drive. Further, the set of instructions can be stored in the memory of another computer and transmitted over a local area network or a wide area network, such as the Internet, when desired by the user. One skilled in the art will appreciate that the physical storage of the sets of instructions physically changes the medium upon which it is stored electrically, magnetically, or chemically so that the medium carries computer usable information. The invention is limited only by the following claims and their equivalents.

Claims (16)

1 A method for displaying a three-dimensional graphics scene, including a plurality of objects, comprising the steps of:
categorising the plurality of objects into a first set of objects and a second set of objects, wherein the first set of objects contains one or more objects to be blurred, and wherein the second set of objects contains one or more objects not to be blurred; rendering each object in the second set of objects directly into a first buffer; dividing a render time period into a plurality of time slices; for each time slice, performing the following steps:
rendering each object in the first set of objects into the first buffer; and accumulating the three-dimensional graphics scene from the first buffer into an accumulation buffer; and displaying the three-dimensional graphics scene.
2. A method according to claim 1, wherein said categorising step comprises the steps of:
determining if a motion vector is associated with a selected object; if a motion vector is associated with the selected object, assigning the selected object to the first set of objects; and if a motion vector is not associated with the selected object, assigning the selected object to the second set of objects.
3. A method according to claim 2, wherein said determining step comprises the step of determining if a motion vector is associated with a vertex of the selected object.
4. A method according to claim 2 or 3, wherein the motion vector is opposite to a direction of motion of the selected object.
5. A method according to claim 2, 3 or 4, wherein a length of the motion vector is proportional to a speed of motion of the selected object.
6. A method according to any one of claims 2 to 5, wherein the motion vector is opposite to a direction of blur of the selected object.
7. A method according to any one of claims 2 to 6, wherein a length of the motion vector is proportional to a speed of blur of the selected object.
8. A method according to any one of claims 1 to 7, wherein said displaying further comprises the steps of:
copying the accumulation buffer to a frame buffer; and displaying the frame buffer on a display device.
9. A graphics system, comprising:
a display means; a plurality of objects to be displayed as a three-dimensional graphics scene on said display means; means for categorising the plurality of objects into a first set of objects and a second set of objects, wherein the first set of objects contains one or more objects to be blurred, and wherein the second set of objects contains one or more objects not to be blurred; means for rendering each object in the second set of objects directly into a first buffer; means for dividing a render time period into a plurality of time slices; means for rendering each object in the first set of objects into the first buffer during each time slice; means for accumulating the three-dimensional graphics scene from the first buffer into an accumulation buffer during each time slice; and means for displaying the three-dimensional graphics scene on said display means.
10. A graphics system according to claim 9, wherein said means for categorising comprises:
means for determining if a motion vector is associated with a selected object; means for assigning the selected object to the first set of objects if a motion vector is associated with the selected object; and means for assigning the selected object to the second set of objects if a motion vector is not associated with the selected object.
11. A graphics system according to claim 10, wherein said means for determining comprises means for determining if a motion vector is associated with a vertex of the selected object.
12. A graphics system according to claim 10 or 11, wherein the motion vector is opposite to a direction of motion of the selected object.
13. A graphics system according to claim 10, 11 or 12, wherein a length of the motion vector is proportional to a speed of motion of the selected object.
14. A graphics system according to any one of claims 10 to 13, wherein the motion vector is opposite to a direction of blur of the selected object.
15. A graphics system according to any one of claims 10 to 14, wherein said means for displaying further comprises:
means for copying the accumulation buffer to a frame buffer; and means for displaying the frame buffer on said display means.
16. A computer program product on a computer usable medium, the computer usable medium having computer usable program means embodied therein for displaying a three-dimensional graphics scene on a display device, the computer usable program means comprising:
means for categorising a plurality of objects into a first set of objects and a second set of objects, wherein the first set of objects contains one or more objects to be blurred, and wherein the second set of objects contains one or more objects not to be blurred; means for rendering each object in the second set of objects directly into a first buffer; means for dividing a render time period into a plurality of time slices; means for rendering each object in the first set of objects into the first buffer during each time slice; means for accumulating the three-dimensional graphics scene from the first buffer into an accumulation buffer during each time slice; and means for displaying the three-dimensional graphics scene on the display device.
GB0015421A 1999-06-30 2000-06-26 System and method for displaying a three-dimensional object scene Expired - Fee Related GB2356114B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US34344399A 1999-06-30 1999-06-30

Publications (3)

Publication Number Publication Date
GB0015421D0 GB0015421D0 (en) 2000-08-16
GB2356114A true GB2356114A (en) 2001-05-09
GB2356114B GB2356114B (en) 2003-09-10

Family

ID=23346144

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0015421A Expired - Fee Related GB2356114B (en) 1999-06-30 2000-06-26 System and method for displaying a three-dimensional object scene

Country Status (3)

Country Link
JP (1) JP3286294B2 (en)
CA (1) CA2307352A1 (en)
GB (1) GB2356114B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012001587A1 (en) * 2010-06-28 2012-01-05 Koninklijke Philips Electronics N.V. Enhancing content viewing experience
US8390729B2 (en) 2007-09-05 2013-03-05 International Business Machines Corporation Method and apparatus for providing a video image having multiple focal lengths

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003018578A (en) * 2001-06-27 2003-01-17 Sony Corp Communication equipment and method therefor, communication system, recording medium and program
JP4596227B2 (en) 2001-06-27 2010-12-08 ソニー株式会社 COMMUNICATION DEVICE AND METHOD, COMMUNICATION SYSTEM, RECORDING MEDIUM, AND PROGRAM
CN114419099A (en) * 2022-01-18 2022-04-29 腾讯科技(深圳)有限公司 Method for capturing motion trail of virtual object to be rendered

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
INSPEC Abstract Accession No. 3812984 The accumulation buffer *
INSPEC Abstract Accession No. 5350135 Interactive real-time motion blur *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8390729B2 (en) 2007-09-05 2013-03-05 International Business Machines Corporation Method and apparatus for providing a video image having multiple focal lengths
WO2012001587A1 (en) * 2010-06-28 2012-01-05 Koninklijke Philips Electronics N.V. Enhancing content viewing experience
CN103003775A (en) * 2010-06-28 2013-03-27 Tp视觉控股有限公司 Enhancing content viewing experience

Also Published As

Publication number Publication date
GB2356114B (en) 2003-09-10
JP3286294B2 (en) 2002-05-27
CA2307352A1 (en) 2000-12-30
JP2001043398A (en) 2001-02-16
GB0015421D0 (en) 2000-08-16

Similar Documents

Publication Publication Date Title
Kolb et al. Hardware-based simulation and collision detection for large particle systems
US7362332B2 (en) System and method of simulating motion blur efficiently
Stamminger et al. Perspective shadow maps
Heidelberger et al. Detection of collisions and self-collisions using image-space techniques
Bartz et al. OpenGL-assisted occlusion culling for large polygonal models
Latta Building a million particle system
CA2225017C (en) Method and apparatus for rapidly rendering computer generated images of complex structures
JP3358169B2 (en) Mirror surface rendering method and apparatus
Kreeger et al. Mixing translucent polygons with volumes
US20130127895A1 (en) Method and Apparatus for Rendering Graphics using Soft Occlusion
Kurihara et al. Hair animation with collision detection
Shelley et al. Path specification and path coherence
Batagelo et al. Real-time shadow generation using bsp trees and stencil buffers
GB2356114A (en) Method for displaying a three dimensional object scene
CN1201268C (en) Image process for realizing moving fuzzification
KR100298789B1 (en) Clipping processing method of graphic processing method
US11367262B2 (en) Multi-dimensional acceleration structure
Rauwendaal Hybrid computational voxelization using the graphics pipeline
Rohmer et al. Tiled frustum culling for differential rendering on mobile devices
Mantler et al. The state of the art in realtime rendering of vegetation
Leith Computer visualization of volume data in electron tomography
Georgii et al. Interactive gpu-based collision detection
Tost et al. A definition of frame-to-frame coherence
Zakaria et al. Hybrid shear-warp rendering
Ahokas Shadow Maps

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20080626