AU744983B2 - System and computer-implemented method for modeling the three-dimensional shape of an object by shading of two-dimensional image of the object - Google Patents


Info

Publication number
AU744983B2
AU744983B2 AU67437/98A AU6743798A AU744983B2 AU 744983 B2 AU744983 B2 AU 744983B2 AU 67437/98 A AU67437/98 A AU 67437/98A AU 6743798 A AU6743798 A AU 6743798A AU 744983 B2 AU744983 B2 AU 744983B2
Authority
AU
Australia
Prior art keywords
updated
image
computer
operator
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU67437/98A
Other versions
AU6743798A (en)
Inventor
Rolf Herken
Tom-Michael Thamm
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nvidia ARC GmbH
Original Assignee
Mental Images GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mental Images GmbH filed Critical Mental Images GmbH
Publication of AU6743798A publication Critical patent/AU6743798A/en
Application granted granted Critical
Publication of AU744983B2 publication Critical patent/AU744983B2/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)

Description

WO 98/37515 PCT/IB98/00612
SYSTEM AND COMPUTER-IMPLEMENTED METHOD FOR MODELING THE THREE-DIMENSIONAL SHAPE OF AN OBJECT BY SHADING OF A TWO-DIMENSIONAL IMAGE OF THE OBJECT
FIELD OF THE INVENTION The invention relates generally to the field of computer graphics, computer-aided geometric design and the like, and more particularly to generating a three-dimensional model of an object.
BACKGROUND OF THE INVENTION In computer graphics, computer-aided geometric design and the like, an artist, draftsman or the like (generally referred to herein as an "operator") attempts to generate a three-dimensional model of an object, as maintained by a computer, from lines defining two-dimensional views of objects.
Conventionally, computer-graphical arrangements generate a three-dimensional model from, for example, various two-dimensional line drawings comprising contours and/or cross-sections of the object and by applying a number of operations to such lines which will result in two-dimensional surfaces in three-dimensional space, and subsequent modification of parameters and control points of such surfaces to correct or otherwise modify the shape of the resulting model of the object. After a three-dimensional model for the object has been generated, it may be viewed or displayed in any of a number of orientations.
In a field of artificial intelligence commonly referred to as robot vision or machine vision (which will generally be referred to herein as "machine vision"), a methodology referred to as "shape from shading" is used to generate a three-dimensional model of an existing object from one or more two-dimensional images of the object as recorded by a camera. Generally, in machine vision, the type of the object recorded on the image(s) is initially unknown by the machine, and the model of the object that is generated is generally used to, for example, facilitate identification of the type of the object depicted on the image(s) by the machine or another device.
In the shape from shading methodology, the object to be modeled is illuminated by a light source, and a camera, such as a photographic or video camera, is used to record the image(s) from which the object will be modeled. It is assumed that the orientation of the light source, the camera position and the image plane relative to the object are known. In addition, it is assumed that the reflectance properties of the surface of the object are also known. It is further assumed that an orthographic projection technique is used to project the surface of the object onto the image plane, that is, it is assumed that an implicit camera that is recording the image on the image plane has a focal length of infinity. The image plane represents the x,y coordinate axes (that is, any point on the image plane can be identified by coordinates (x,y)), and the z axis is thus normal to the image plane; as a result, any point on the surface of the object that can be projected onto the image plane can be represented by the coordinates (x,y,z). The image of the object as projected onto the image plane is represented by an image irradiance function I(x,y) over a two-dimensional domain Ω ⊂ R², while the shape of the object is given by a height function z(x,y) over the domain Ω. The image irradiance function I(x,y) represents the brightness of the object at each point in the image. In the shape from shading methodology, given I(x,y) for all points in the domain, the shape of the object, given by z(x,y), is determined.
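The irradiance model just described can be sketched numerically. The snippet below is a minimal illustration under the stated Lambertian and orthographic assumptions: it renders I(x,y) from a height field z(x,y). The function name `lambertian_image` and the axis conventions are assumptions for this sketch, not part of the patent.

```python
import numpy as np

def lambertian_image(z, light):
    """Render I(x, y) = <n(x, y), L> for a height field z over the domain."""
    light = np.asarray(light, dtype=float)
    light = light / np.linalg.norm(light)  # unit illumination vector
    p, q = np.gradient(z)                  # slopes of the height function
    # A surface z = z(x, y) has (unnormalized) normal (-p, -q, 1).
    norm = np.sqrt(p**2 + q**2 + 1.0)
    nx, ny, nz = -p / norm, -q / norm, 1.0 / norm
    # Lambertian brightness: cosine of the angle between normal and light.
    return np.clip(nx * light[0] + ny * light[1] + nz * light[2], 0.0, 1.0)
```

For a flat height field lit from directly overhead the rendered image is uniformly bright, and tilting the surface darkens it by the cosine of the tilt angle.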
In determining the shape of an object using the shape from shading methodology, several assumptions are made, namely, (i) the direction of the light source is known; (ii) the shape of the object is continuous; (iii) the reflectance properties of the surface of the object are homogeneous and known; and (iv) the illumination over at least the portion of the surface visible in the image plane is uniform.
Under these assumptions, the image irradiance function I(x,y) for each point on the image plane can be determined as follows. First, changes in surface orientation of the object are given by means of the first partial derivatives of the height function z(x,y) with respect to both x and y,

    p(x,y) = ∂z(x,y)/∂x and q(x,y) = ∂z(x,y)/∂y,

where p-q space is referred to as the "gradient space." Every point of the gradient space corresponds to a particular value for the surface gradient. If the surface is continuous, values for p and q are dependent on each other since the cross-partial-derivatives have to be equal, that is:

    ∂p(x,y)/∂y = ∂q(x,y)/∂x.

In accordance with one aspect of the present invention there is provided a computer graphics system for generating a structural model of a three-dimensional object by shading by an operator in connection with a two-dimensional image of the object, the image representing the object as projected onto an image plane, the computer graphics system including:
A. an operator input device configured to receive shading information provided by the operator, the shading information representing a change in brightness level of at least a portion of the image;
B. a model generator configured to receive the shading information from the operator input device and to generate in response thereto an updated structural model of the object, the model generator being configured to use the shading information to determine at least one structural feature of the updated structural model; and
C. an object display configured to display the image of the object as defined by the updated structural model.
In accordance with another aspect of the present invention there is provided a computer-implemented graphics method for generating a structural model of a three-dimensional object by shading provided by an operator in connection with a two-dimensional image of the object, the image representing the object as projected onto an image plane, the method including the steps of:
A. receiving shading information provided by the operator in connection with the image of the object, the shading information representing a change in brightness level of at least a portion of the image;
B. generating in response to the shading information an updated structural model of the object, the shading information being used to determine at least one structural feature of the updated structural model; and
C. displaying the image of the object as defined by the updated structural model.
In accordance with a further aspect of the present invention there is provided a computer graphics computer program product for use in connection with a computer for generating a structural model of a three-dimensional object by shading provided by an operator in connection with a two-dimensional image of the object, the image representing the object as projected onto an image plane, the computer graphics computer program product including a computer-readable medium having encoded thereon:
A. an operator input module configured to enable the computer to receive shading information provided by the operator in connection with the image of the object, the shading information representing a change in brightness level of at least a portion of the image;
B. a model generator module configured to enable the computer to receive the shading information from the operator input device and to generate in response thereto an updated structural model of the object, the model generator module being configured to enable the computer to use the shading information to determine at least one structural feature of the updated structural model; and
C. an object display module configured to enable the computer to display the image of the object as defined by the updated structural model.
In brief summary, the invention provides a computer graphics system for facilitating the generation of a three-dimensional model of an object in an interactive manner with an operator, such as an artist or the like. Generally, the operator will have a mental image of the object whose model is to be generated, and the operator will co-operate with the computer graphics system to develop the model. The computer graphics system will display one or more images of the object as currently modeled from rotational orientations, translational positions, and scaling or zoom settings as selected by the operator, and the operator can determine whether the object corresponds to the mental image.
In the model generation process, an initial model for the object is initialized and an image thereof is displayed to the operator by the computer graphics system. The image that is displayed will reflect a particular position of a light source and camera relative to the object, the position of the light source relative to the object defining an illumination direction, and the position of the camera relative to the object defining an image plane onto which the image of the object is projected. Any initial model, defining at least an infinitesimally small fragment of the surface for the object to be modeled, can be used, preferably occupying at least one pixel of the image plane. The initial model will identify, for the point or points on the image plane onto which the image of the surface fragment is projected, respective height values for the surface fragment defining the distance from the image plane for the surface fragment at that (those) point(s). The collection of height value(s) for the respective points on the image plane comprises a height field which defines the initial model for the object.
The initial model used in the model generation process may be one of a plurality of default models as provided by the computer graphics system itself, such as a model defining a hemispherical or ellipsoidal shape. Alternatively, the initial model may be provided by the operator by providing an initial shading of at least one pixel of the image plane, through an operator input device provided by the computer graphics system. If the initial model is provided by the operator, one of the points, or pixels, on the image plane is preferably selected to provide a "reference" portion of the initial surface fragment for the object, the reference initial surface fragment portion having a selected spatial position, rotational orientation, and height value with respect to the image plane, and the computer graphics system determines the initial model for the rest of the surface fragment (if any) in relation to shading (if any) applied to other pixels on the image plane. In one embodiment, the reference initial surface fragment portion is selected to be the portion of the surface fragment corresponding to the first point or pixel to which the operator applies shading. In addition, in that embodiment, the reference initial surface fragment portion is determined to be parallel to the image plane, so that a vector normal thereto is orthogonal to the image plane and it has a selected height value. In any case, the computer graphics system will display the image of the initial model, the image defining the shading of the object associated with the initial model as illuminated from the particular illumination direction and projected onto the image plane.
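As one way to picture the default-model case mentioned above, the sketch below builds a hemispherical height field over the image plane. The grid size, radius, and the name `hemisphere_height_field` are illustrative assumptions, not details from the patent.

```python
import numpy as np

def hemisphere_height_field(n_rows, n_cols, radius):
    """Height z(x, y) of a hemisphere of the given radius, centered on an
    n_rows x n_cols image plane; points outside the silhouette get height 0
    (i.e. they remain unshaded background)."""
    y, x = np.mgrid[0:n_rows, 0:n_cols]
    cx, cy = (n_cols - 1) / 2.0, (n_rows - 1) / 2.0
    r2 = radius**2 - (x - cx)**2 - (y - cy)**2
    return np.sqrt(np.clip(r2, 0.0, None))
```

The peak of the hemisphere sits at the center pixel with height equal to the radius, and the height falls to zero at the silhouette.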
After the initial model has been developed and the image for the object associated with the initial model as projected onto the image plane has been displayed, the operator can update the shading of the image on the image plane, using, for example, a conventional pressure sensitive pen and digitizing tablet. In updating the shading, the operator can increase or reduce the shading at particular points in the image, thereby to control the brightness, or intensity values, of the image at those points. In addition, the operator can add to the surface fragment by providing shading at points on the image plane proximate those points onto which the surface fragment is currently projected.
Furthermore, in an erasing mode of the shading operation, the operator can remove portions of the surface fragment by, for example, marking as unshaded the particular points on the image plane onto which the portions of the surface fragment that are to be removed are projected. After the shading of a point of the image plane has been updated, if the point is not marked as being unshaded, the computer graphics system will use the updated shading to generate an updated normal vector which identifies, for that point, the normal vector of the portion of the surface of the object as projected onto the respective point, and, using the updated normal vector field and the height field, will generate an updated height field for the object. The updated normal vector field and the updated height field define the updated model of the object, which corresponds to the updated shape of the object as updated based on the shading provided by the operator.
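The height-field update from an updated normal field can be sketched numerically. The simple cumulative-sum integration below is an illustrative assumption, not the patent's actual scheme, and `heights_from_normals` is a hypothetical name.

```python
import numpy as np

def heights_from_normals(normals, z00=0.0):
    """Recover a height field from per-pixel unit normals (..., 3) by
    integrating the implied slopes p = -nx/nz, q = -ny/nz: first down the
    first column, then along each row."""
    nx, ny, nz = normals[..., 0], normals[..., 1], normals[..., 2]
    p, q = -nx / nz, -ny / nz
    z = np.zeros(p.shape)
    z[0, 0] = z00
    z[1:, 0] = z00 + np.cumsum(p[1:, 0])               # integrate first column
    z[:, 1:] = z[:, :1] + np.cumsum(q[:, 1:], axis=1)  # then along rows
    return z
```

A constant normal field reproduces the plane with the corresponding constant slope, up to the free base height z00.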
After generating the updated model of the object, the computer graphics system can display an image of the object, as defined by the updated model, to the operator. If the updated model is satisfactory, the computer graphics system can save the updated model as the final model. On the other hand, if the updated model is not satisfactory, the operator can update the shading further, thereby to enable the computer graphics system to generate a further updated normal vector field and updated height field, thereby to generate a further updated model for the object. The computer graphics system and operator can repeat these operations until the operator determines that the object is satisfactory.
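The shade / update / redisplay cycle described above can be summarized schematically; every name below is a placeholder standing in for a system component, not code from the patent.

```python
def modeling_loop(model, get_shading_update, update_model, render, is_satisfactory):
    """Repeat: apply operator shading, rebuild the model, redisplay, until
    the operator judges the displayed object satisfactory."""
    image = render(model)
    while not is_satisfactory(image):
        shading = get_shading_update(image)   # operator edits brightness
        model = update_model(model, shading)  # new normal + height fields
        image = render(model)
    return model                              # the final, saved model
```

With toy closures (a counter for the model, identity rendering), the loop runs until the stopping predicate fires, mirroring the iterate-until-satisfactory flow in the text.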
A computer graphics system constructed in accordance with the invention avoids the necessity of solving partial differential equations, which is required in prior art systems which operate in accordance with the shape-from-shading methodology.
Embodiments of the invention also allow the operator to perform conventional computer graphics operations in connection with the object, including rotation and spatial translation of the object to facilitate projection of an image of the object onto an image plane from any of a number of rotational orientations and spatial positions, and scaling or zooming to facilitate enlargement or reduction of the object and/or the image. In such embodiments, the operator can update the shading of the image from any particular three-dimensional rotational and/or translational orientation and position, and from the scaling or zoom setting, as selected by the operator. In addition, embodiments of the invention allow the operator to trim, in a conventional manner, any surface fragment at any moment in time, or the updated final object, which may consist of a plurality of such surface fragments, by projecting two-dimensional trim curves onto the surface of the object. The operator can use the input device, operating in an appropriate drawing mode, to draw these trim curves on the image plane.
BRIEF DESCRIPTION OF THE DRAWINGS
This invention is pointed out with particularity in the appended claims. The above and further advantages of this invention may be better understood by referring to the following description taken in conjunction with the accompanying drawings, in which:
FIG. 1 depicts a computer graphics system for generating a three-dimensional model of an object by shading as applied by an operator or the like to a two-dimensional image of the object in the given state of its creation at any point in time, constructed in accordance with the invention;
FIGS. 2 through 6 are diagrams that are useful in understanding the operations performed by the computer graphics system depicted in FIG. 1 in determining the updating of the model of an object by shading as applied to the two-dimensional image of the object in its given state of creation at any point in time; and
FIG. 7 is a flow-chart depicting operations performed by the computer graphics system and operator in connection with the invention.
DETAILED DESCRIPTION OF AN ILLUSTRATIVE EMBODIMENT
FIG. 1 depicts a computer graphics system 10 for generating a three-dimensional model of an object by shading as applied by an operator or the like to a two-dimensional image of the object in the given state of its creation at any point in time, constructed in accordance with the invention.
With reference to FIG. 1, the computer graphics system includes a processor module 11, one or more operator input devices 12 and one or more display devices 13. The display device(s) 13 will typically comprise a frame buffer, video display terminal or the like, which will display information in textual and/or graphical form on a display screen to the operator. The operator input devices 12 for a computer graphics system 10 will typically include a pen 14 which is typically used in conjunction with a digitizing tablet 15, and a trackball or mouse device 16. Generally, the pen 14 and digitizing tablet will be used by the operator in several modes. In one mode, particularly useful in connection with the invention, the pen 14 and digitizing tablet are used to provide updated shading information to the computer graphics system. In other modes, the pen and digitizing tablet are used by the operator to input conventional computer graphics information, such as line drawing for, for example, surface trimming and other information, to the computer graphics system 10, thereby to enable the system 10 to perform conventional computer graphics operations. The trackball or mouse device 16 can be used to move a cursor or pointer over the screen to particular points in the image at which the operator can provide input with the pen and digitizing tablet. The computer graphics system 10 may also include a keyboard (not shown) which the operator can use to provide textual input to the system 10. The processor module 11 generally includes a processor, which may be in the form of one or more microprocessors, a main memory, and will generally include a mass storage subsystem including one or more disk storage devices. The memory and disk storage devices will generally store data and programs (collectively, "information") to be processed by the processor, and will store processed data which has been generated by the processor.
The processor module includes connections to the operator input device(s) 12 and the display device(s) 13, and will receive information input by the operator through the operator input device(s) 12, process the input information, and store the processed information in the memory and/or mass storage subsystem. In addition, the processor module can provide video display information, which can form part of the information obtained from the memory and disk storage device as well as processed data generated thereby, to the display device(s) for display to the operator. The processor module 11 may also include connections (not shown) to hardcopy output devices such as printers for facilitating the generation of hardcopy output, modems and/or network interfaces (also not shown) for connecting the system 10 to the public telephony system and/or a computer network for facilitating the transfer of information, and the like.
The computer graphics system 10 generates from input provided by the operator, through the pen and digitizing tablet and the mouse, information defining the initial and subsequent shape of a three-dimensional object, which information may be used to generate a two-dimensional image of the corresponding object for display to the operator, thereby to generate a model of the object. The image displayed by the computer graphics system 10 represents the image of the object as illuminated from an illumination direction and as projected onto an image plane, with the object having a spatial position and rotational orientation relative to the illumination direction and the image plane and a scaling and/or zoom setting as selected by the operator. The initial model used in the model generation process may be one of a plurality of default models as provided by the computer graphics system itself, such as a model defining a hemispherical or ellipsoidal shape. Alternatively, the initial model may be provided by the operator by providing an initial shading of at least one pixel of the image plane, using the pen 14 and digitizing tablet 15. If the initial model is provided by the operator, one of the pixels on the image plane is selected to provide a "reference" portion of the initial surface fragment for the object, the reference initial surface fragment portion having a selected spatial position, rotational orientation and height value with respect to the image plane, and the computer graphics system determines the initial model for the rest of the surface fragment (if any) in relation to shading (if any) applied to other pixels on the image plane. In one embodiment, the reference initial surface fragment portion is selected to be the portion of the surface fragment corresponding to the first pixel on the image plane to which the operator applies shading.
In addition, in that embodiment, the reference initial surface fragment portion is determined to be parallel to the image plane, so that a vector normal to the reference initial surface fragment portion is orthogonal to the image plane and the reference initial surface fragment portion has a height value as selected by the operator. In any case, the computer graphics system will display the image of the initial model, the image defining the shading of the object associated with the initial model as illuminated from the particular illumination direction and projected onto the image plane.
The operator, using the mouse and the pen and digitizing tablet, will provide updated shading of the image of the initial object, and/or extend the object by shading neighboring areas on the image plane, and the computer graphics system 10 will generate an updated model representing the shape of the object based on the updated shading provided by the operator. In updating the shading, the operator can increase or decrease the amount of shading applied to particular points on the image plane. In addition, the operator, using the mouse or trackball and the pen and digitizing tablet, can perform conventional computer graphics operations in connection with the image, such as trimming of the surface representation of the object defined by the model. The computer graphics system can use the updated shading and other computer graphic information provided by the operator to generate the updated model defining the shape of the object, and further generate from the updated model a two-dimensional image for display to the operator, from respective spatial position(s), rotational orientation(s) and scaling and/or zoom settings as selected by the operator. If the operator determines that the shape of the object as represented by the updated model is satisfactory, he or she can enable the computer graphics system 10 to store the updated model as defining the shape of the final object. On the other hand, if the operator determines that the shape of the object as represented by the updated model is not satisfactory, he or she can cooperate with the computer graphics system to further update the shading and other computer graphic information, in the process using three-dimensional rotation and translation and scaling or zooming as needed.
As the shading and other computer graphic information is updated, the computer graphics system 10 updates the model information, which is again used to provide a two-dimensional image of the object, from rotational orientations, translation or spatial position settings, and scale and/or zoom settings as selected by the operator. These operations can continue until the operator determines that the shape of the object is satisfactory, at which point the computer graphics system 10 will store the updated model information as representing the final object.
The detailed operations performed by the computer graphics system 10 in determining the shape of an object will be described in connection with FIGS. 2 through 7. With reference to FIG. 2, in the operations of the computer graphics system 10, it is assumed that the image of the object is projected onto a two-dimensional image plane 20 that is tessellated into pixels 21(i,j) having a predetermined number of rows and columns. The image plane 20 defines an x,y Cartesian plane, with rows extending in the x direction and columns extending in the y direction. The projection of the surface of the object, which is identified in FIG. 2 by reference numeral 22, that is to be formed is orthographic, with the direction of the camera's "eye" being in the z direction, orthogonal to the x,y image plane. Each point on the image plane corresponds to a picture element, or "pixel," represented herein by P(i,j), with i ∈ [1,N] and j ∈ [1,M], where N is the maximum number of columns (index i ranging over the columns in the image plane) and M is the maximum number of rows (index j ranging over the rows in the image plane). In the illustrative image plane 20 depicted in FIG. 2, the number of columns N is eight, and the number of rows M is nine. If the display device(s) 13 which are used to depict the image plane 20 to the operator are raster-scan devices, the rows may correspond to scan lines used by the device(s) to display the image. Each pixel P(i,j) corresponds to a particular point (x_i, y_j) of the coordinate system, and "N by M" identifies the resolution of the image. In addition, the computer graphics system 10 assumes that the object is illuminated by a light source having a direction L = (x_L, y_L, z_L), where L is a vector, and that the surface of the object is Lambertian. The implicit camera, whose image plane is represented by the image plane 20, is assumed to view the image plane 20 from a direction that is orthogonal to the image plane 20, as is represented by the arrow with the label "CAMERA."
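The pixel indexing just described (i ∈ [1,N] over columns, j ∈ [1,M] over rows, each pixel mapping to a point (x_i, y_j)) might be realized as follows; the particular linear mapping onto a unit-sized image plane is an assumption of this sketch, not a detail from the patent.

```python
def pixel_to_point(i, j, n_cols, n_rows, width=1.0, height=1.0):
    """Map 1-based pixel indices (i in [1, N], j in [1, M]) to the x, y
    coordinates of the pixel center on a width x height image plane."""
    x = (i - 0.5) * width / n_cols
    y = (j - 0.5) * height / n_rows
    return x, y
```

With N = 8 columns and M = 9 rows, as in the illustrative image plane of FIG. 2, the first pixel's center lands at (0.5/8, 0.5/9).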
As noted above, the computer graphics system 10 initializes the object with at least an infinitesimally small portion of the object to be modeled as the initial model. For each pixel P(i,j), the height value z(x,y) defining the height of the portion of the object projected onto the pixel is known, and is defined as a height field H(x,y) as follows:

    H(x,y) = {z(x,y) : ∀(x,y) ∈ Ω},

where "∀(x,y) ∈ Ω" refers to "for all points (x,y) in the domain Ω," with the domain Ω referring to the image plane 20. Furthermore, for each pixel P(i,j), the normal n(x,y) of the portion of the surface of the basic initial object projected thereon is also known and is defined as a normal field N(x,y) as follows:

    N(x,y) = {n(x,y) : ∀z(x,y) ∈ H(x,y)}.

In FIG. 2, the normal associated with the surface 22 of the object projected onto one of the pixels of the image plane 20 is represented by the arrow labeled n. After the computer graphics system 10 displays the image representing the object defined by the initial model, which is displayed to the operator on the display 13 as the image on image plane 20, the operator can begin to modify it (that is, the image) by updating the shading of the image using the pen 14 and digitizing tablet 15 (FIG. 1). It will be appreciated that the image of the initial model as displayed by the computer graphics system will itself be shaded to represent the shape of the object as defined by the initial model, as illuminated from the predetermined illumination direction and as projected onto the image plane. Each pixel P(i,j) on the image plane will have an associated intensity value I(x,y) (which is also referred to herein as a "pixel value") which represents the relative brightness of the image at the pixel P(i,j), and which, inversely, represents the relative shading of the pixel.
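The height-field/normal-field pairing above can be made concrete: given H(x,y) as a 2D array, per-pixel unit normals can be computed from finite differences. The function name and axis conventions are assumptions of this sketch.

```python
import numpy as np

def normal_field(H):
    """Unit normals n(x, y), one per entry of the height field H, returned
    as an array of shape H.shape + (3,)."""
    p, q = np.gradient(np.asarray(H, dtype=float))  # slopes of the height field
    norm = np.sqrt(p**2 + q**2 + 1.0)
    # A surface z = H(x, y) has (unnormalized) normal (-p, -q, 1).
    return np.stack([-p / norm, -q / norm, 1.0 / norm], axis=-1)
```

For a flat height field every normal points straight out of the image plane, i.e. (0, 0, 1) at each pixel.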
If the initial pixel value for each pixel P(i,j) is given by I₀(x,y), which represents the image intensity value or brightness of the respective pixel P(i,j) at location (x,y) on the image plane 20, and the pixel value after shading is represented by I₁(x,y), then the operator preferably updates the shading for the image such that, for each pixel P(i,j),

    |I₁(x,y) − I₀(x,y)| ≤ ε₁,

where ε₁ is a predetermined bound value selected so that, if this relation is satisfied for each pixel, the shape of the object can be updated based on the shading provided by the operator.
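The bound on the operator's shading update can be phrased as a simple predicate. ε₁ is left as a free parameter since the text does not fix its value, and the function name is an assumption of this sketch.

```python
import numpy as np

def shading_update_valid(I0, I1, eps1):
    """True if every updated pixel value I1 stays within eps1 of the
    corresponding original pixel value I0."""
    diff = np.abs(np.asarray(I1, dtype=float) - np.asarray(I0, dtype=float))
    return bool(np.all(diff <= eps1))
```

A system following this rule would accept an edit that nudges brightness within the bound and reject one that jumps past it.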
After the operator updates the shading for a pixel, the computer graphics system 10 will perform two general operations in generation of the updated shape for the object. In particular, the computer graphics system 10 will (i) first determine, for each pixel P(i,j) whose shading is updated, a respective new normal vector n₁, and (ii) after generating an updated normal vector n₁(x,y), determine a new height value z(x,y).
The computer graphics system 10 will perform these operations (i) and (ii) for each pixel P(i,j) whose shading is updated, as the shading is updated, thereby to provide a new normal vector field N(x,y) and height field H(x,y). Operations performed by the computer graphics system 10 in connection with updating of the normal vector n₁ (item (i) above) for a pixel P(i,j) will be described in connection with FIGS. 3 and 4, and operations performed in connection with updating of the height value z(x,y) (item (ii) above) for the pixel P(i,j) will be described in connection with FIGS. 5 and 6.
With reference initially to FIG. 3, that FIG. depicts a portion of the object, identified by reference numeral 30, after a pixel's shading has been updated by the operator. In the following, it will be assumed that the updated normal vector, identified by the arrow with legend "n₁," for a point z(x,y) on the surface of the object 30, is to be determined. The normal vector identified by legend "n₀" represents the normal to the surface prior to the updating. The illumination direction is represented by the line extending from the arrow identified by legend "L"; the vector L specifically represents an illumination vector whose direction is based on the direction of illumination from the light source illuminating the object, and whose magnitude represents the magnitude of the illumination on the object provided by the light source. In that case, based on the updating, the set of possible new normal vectors lie on the surface of the cone 31 which is defined by:

    ⟨n₁, L⟩ = I₁(x,y),

that is, the set of vectors for which the dot product with the illumination vector corresponds to the pixel value for the pixel after the updating of the shading as provided by the operator. In addition, since the normal vector n₁ is, as is the case with all normal vectors, normalized to have a predetermined magnitude value, preferably the value "one," the updated normal vector has a magnitude corresponding to:

    ‖n₁‖ = 1,

where "‖n₁‖" refers to the magnitude of the updated normal vector n₁. These two equations define a set of vectors, and the magnitudes of the respective vectors, one of which is the updated normal vector for the updated object at point z(x,y). The computer graphics system 10 will select one of the vectors from the set as the appropriate updated normal vector n₁ as follows. As noted above, the updated normal vector will lie on the surface of cone 31.
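The cone construction can be sketched concretely: pick the unit vector on the cone ⟨n₁, L⟩ = I₁ that lies in the plane spanned by L and the prior normal n₀, on the n₀ side of L, as the text prescribes. This sketch assumes L is nonzero, n₀ is not parallel to L, and |I₁| ≤ ‖L‖; the function name `updated_normal` is illustrative.

```python
import numpy as np

def updated_normal(n0, L, I1):
    """Unit vector n1 with <n1, L> = I1, lying in span{n0, L} on the n0
    side of L (the line of cone 31 toward the prior normal)."""
    n0 = np.asarray(n0, dtype=float)
    L = np.asarray(L, dtype=float)
    L_hat = L / np.linalg.norm(L)
    # In-plane direction perpendicular to L, pointing toward n0.
    perp = n0 - np.dot(n0, L_hat) * L_hat
    perp /= np.linalg.norm(perp)
    cos_a = I1 / np.linalg.norm(L)  # cosine of the cone's half-angle
    sin_a = np.sqrt(max(0.0, 1.0 - cos_a**2))
    return cos_a * L_hat + sin_a * perp
```

Because L_hat and perp are orthonormal, the result is automatically a unit vector satisfying both of the conditions stated above, and the positive perp coefficient selects the line of the cone toward n₀.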
It is apparent that, if the original normal vector n0 and the illumination vector L are not parallel, then they (that is, the prior normal vector n0 and the illumination vector L) will define a plane. This follows since the point z(x,y) at which the illumination vector L impinges on the object 30 and the origin of the normal vector n0 on the object 30 are the same point, and the tail of the illumination vector L and the head of the prior normal vector n0 provide two additional points which, with the point z(x,y), suffice to define a plane. Thus, if a plane, which is identified by reference numeral 32, is constructed on which both the illumination vector L and the prior normal vector n0 lie, that plane 32 will intersect the cone 31 along two lines, which are represented by lines 33 in FIG. 3. One of the lines 33 lies on the surface of the cone 31 which is on the side of the illumination vector L towards the prior normal vector n0, and the other line 33 lies on the surface of the cone 31 which is on the side of the illumination vector L away from the normal vector n0; the correct updated normal vector n1 is defined by the line on the cone 31 which is on the side of the illumination vector L towards the prior normal vector n0.
Based on these observations, the direction of the updated normal vector n1 can be determined from the equations above and the following. Since the prior normal vector n0 and the illumination vector L form a plane 32, their cross product, n0 × L, defines a vector that is normal to the plane 32. Thus, since the updated normal vector n1 also lies in the plane 32, the dot product of the updated normal vector n1 with the vector defined by the cross product between the prior normal vector n0 and the illumination vector L has the value zero, that is,

    n1 · (n0 × L) = 0   (10).

In addition, since the difference between the pixel values I0 and I1 provided by the prior shading and the updated shading is bounded (as noted above), the angle δ between the prior normal vector n0 and the updated normal vector n1 is also bounded by some maximum positive value εδ. As a result, equation (10) can be re-written as

    |(n1, n0 × L)| < εδ   (11).
This is illustrated diagrammatically in FIG. 4. FIG. 4 depicts a portion of the cone 31 depicted in FIG. 3, the updated normal vector n1, and a region, identified by reference numeral 34, that represents the maximum angle εδ from the prior normal vector n0 in which the updated normal vector n1 is constrained to lie.
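The selection just described reduces to simple vector algebra under the Lambertian shading assumption used throughout (pixel intensity given by n · L): the updated normal is the unit vector on the cone that lies in the plane of L and n0, on the side of L toward n0. The sketch below illustrates this; it assumes n0 and L are not parallel, and all function names are illustrative rather than taken from the patent.

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def add(a, b): return [x + y for x, y in zip(a, b)]
def scale(a, s): return [x * s for x in a]
def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]
def norm(a): return math.sqrt(dot(a, a))
def normalize(a): return scale(a, 1.0 / norm(a))

def updated_normal(n0, L, I1):
    """Pick the unit vector n1 with n1 . L = I1 (the cone constraint)
    that lies in the plane spanned by L and n0, on the side of L
    toward the prior normal n0."""
    L_hat = normalize(L)
    # In-plane direction perpendicular to L, pointing toward n0.
    p = normalize(add(n0, scale(L_hat, -dot(n0, L_hat))))
    a = I1 / norm(L)                       # component along L_hat
    b = math.sqrt(max(0.0, 1.0 - a * a))   # component toward n0
    return add(scale(L_hat, a), scale(p, b))
```

The returned vector satisfies both constraints at once: its dot product with L reproduces the updated pixel value, and it is orthogonal to n0 × L, so it lies in the plane through L and n0.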
The computer graphics system 10 (FIG. 1) will generate an updated normal vector n1 for each pixel σi,j in the image plane 20 based on the shading provided by the operator, thereby to generate an updated normal vector field N(x,y). After the computer graphics system 10 has generated the updated normal vector for a pixel, it can generate a new height value z(x,y) for that pixel, thereby to update the height field H(x,y) based on the updated shading. Operations performed by the computer graphics system 10 in connection with updating the height value z(x,y) will be described in connection with FIGS. 5 and 6. FIG. 5 depicts an illustrative updated shading for the image plane 20 depicted in FIG. 2. For the image plane 20 depicted in FIG. 5, the pixels σi,j have been provided with coordinates, with the rows being identified by numbers in the range from 1 through 8, inclusive, and the columns being identified by letters in the range A through I, inclusive. As shown in FIG. 5, in the updated shading, the pixels σE,1 through σE,3, σD,3 through σD,4 and σC,5 through σC,8 have all been modified, and the computer graphics system 10 is to generate an updated height value h(x,y) therefor for use as the updated height value for the pixel in the updated height field H(x,y). To accomplish that, the computer graphics system 10 performs several operations, which will be described below, to generate a height value for each pixel σi,j whose shading has been modified along a vertical direction, a horizontal direction, and two diagonal directions, and generates the final height value for the pixel as the average of the four height values (that is, the height values along the vertical, horizontal, and two diagonal directions).
The operations performed by the computer graphics system 10 in generating an updated height value will be described in connection with one of the modified pixels in the image plane 20, namely, pixel σD,4, along one of the directions, namely, the horizontal direction. Operations performed in connection with the other directions, and the other pixels whose shading is updated, will be apparent to those skilled in the art. In generating an updated height value, the computer graphics system 10 makes use of Bézier-Bernstein interpolation, which defines a curve P(t) of degree "n" as

    P(t) = Σ_{i=0..n} B_i C(n,i) t^i (1 − t)^(n−i)   (12),

where "t" is a numerical parameter on the interval between "zero" and "one," inclusive, C(n,i) denotes the binomial coefficient, and vectors B_i (defined by components (b_ix, b_iy, b_iz)) define control points for the curve P(t), with control points B_0 and B_n comprising the endpoints of the curve. The tangents of the curve P(t) at the endpoints correspond to the vectors B_0B_1 and B_{n−1}B_n. In one embodiment, the computer graphics system 10 uses a cubic Bézier-Bernstein interpolation

    P(t) = B_0 (1 − t)^3 + 3 B_1 t (1 − t)^2 + 3 B_2 t^2 (1 − t) + B_3 t^3   (13)

to generate the updated height value. The points B_0, B_1, B_2 and B_3 are control points for the cubic curve P_{n=3}(t).
Equation (13), as applied to the determination of the updated height value h1 for the pixel σD,4, corresponds to

    h1 = h_a (1 − t)^3 + 3 B_1 t (1 − t)^2 + 3 B_2 t^2 (1 − t) + h_b t^3   (14).
It will be appreciated from equation (14) that, for "t" equal to "zero," the updated height value h1 for pixel σD,4 corresponds to h_a, which is the height value for pixel σC,4, and for "t" equal to "one," the updated height value h1 for pixel σD,4 corresponds to h_b, which is the height value for pixel σE,4. On the other hand, for "t" having a value other than zero or one, the updated height value h1 is a function of the height values h_a and h_b of the pixels σC,4 and σE,4 and the height values for control points B_1 and B_2. As noted above, for an n-th degree curve P(t), the tangents at the endpoints B_0 and B_n correspond to the vectors B_0B_1 and B_{n−1}B_n. Thus, for the curve P_{n=3}(t) shown in FIG. 6, the vector B_1B_0 that is defined by endpoint B_0 and adjacent control point B_1 is tangent to the curve P_{n=3}(t) at endpoint B_0, and the vector B_2B_3 defined by endpoint B_3 and adjacent control point B_2 is tangent to the curve at endpoint B_3. Accordingly, the vector B_1B_0 is orthogonal to the normal vector n_a at pixel σC,4, and the vector B_2B_3 is orthogonal to the normal vector n_b at pixel σE,4. Thus,

    0 = (B_1 − B_0) · n_a and 0 = (B_2 − B_3) · n_b   (15),

which leads to

    0 = (B_1 − h_a) · n_a and 0 = (B_2 − h_b) · n_b   (16).
For the determination of the updated height value h1 for the horizontal direction (see FIG. 5), equation (14), which is in vector form, gives rise to the following equations for each of the dimensions "x" and "z" (the "z" dimension being orthogonal to the image plane):

    h_1x = h_ax (1 − t)^3 + 3 b_1x t (1 − t)^2 + 3 b_2x t^2 (1 − t) + h_bx t^3   (17)

and

    h_1z = h_az (1 − t)^3 + 3 b_1z t (1 − t)^2 + 3 b_2z t^2 (1 − t) + h_bz t^3   (18),

where the "x" and "z" subscripts in equations (17) and (18) indicate the respective "x" and "z" components for the respective vectors in equation (14). It will be appreciated that, for equations (17) and (18), only the value of the "z" component, h_1z, of the height value is unknown; the value of the "x" component, h_1x, will be a function of the position of the pixel whose height value is being determined, in this case pixel σD,4. In addition, equation (16) gives rise to the following two equations:

    0 = (b_1x − h_ax) n_ax + (b_1y − h_ay) n_ay + (b_1z − h_az) n_az   (19)

and

    0 = (b_2x − h_bx) n_bx + (b_2y − h_by) n_by + (b_2z − h_bz) n_bz   (20),

where subscripts "x," "y" and "z" in equations (19) and (20) indicate the respective x, y and z components for the respective vectors in equation (16).
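The orthogonality conditions of equation (16) can also be satisfied by construction: if an interior control point is displaced from its endpoint along a direction perpendicular to the endpoint's normal, the dot product vanishes identically. A small two-dimensional (x–z) sketch of that idea follows; the helper names and the scalar offset parameter s are illustrative assumptions, not part of the patent.

```python
def perp2(n):
    """A direction perpendicular to the 2-D normal n = (nx, nz),
    i.e. the surface tangent direction at the endpoint."""
    return (n[1], -n[0])

def interior_control_point(h, n, s):
    """Place a control point a scalar distance s from endpoint h along
    the tangent, so that (B - h) . n = 0, as equation (16) requires."""
    tx, tz = perp2(n)
    return (h[0] + s * tx, h[1] + s * tz)
```

Because (B − h) is then a multiple of the tangent, the constraint holds for any choice of s; the remaining freedom in s is what the full system of equations pins down.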
In addition, as noted above, there is a further constraint on the curve P_{n=3}(t), in particular the constraint that the updated normal n1 be normal to the curve at the point corresponding to pixel σD,4. If the vector B_012B_123 in FIG. 6 is tangent to the curve at the point corresponding to pixel σD,4, then the point h1, whose "z" component corresponds to the updated height value, also lies on the vector B_012B_123. Thus,

    0 = (B_012 − h1) · n1   (21)

and

    0 = (B_123 − h1) · n1   (22).
Based on the convex combination depicted in FIG. 6,

    B_012 = B_01 + t (B_12 − B_01)   (23)

and

    B_123 = B_12 + t (B_23 − B_12)   (24),

which lead to

    B_012 = B_0 + t (B_1 − B_0) + t [B_1 + t (B_2 − B_1) − B_0 − t (B_1 − B_0)]   (25)

and

    B_123 = B_1 + t (B_2 − B_1) + t [B_2 + t (B_3 − B_2) − B_1 − t (B_2 − B_1)]   (26).

Combining equations (21), (23) and (25),

    0 = (B_0 (1 − t)^2 + 2 B_1 t (1 − t) + B_2 t^2 − h1) · n1   (27),

which leads to

    0 = (b_0x (1 − t)^2 + 2 b_1x t (1 − t) + b_2x t^2 − h_1x) n_1x and 0 = (b_0z (1 − t)^2 + 2 b_1z t (1 − t) + b_2z t^2 − h_1z) n_1z   (28)

for the "x" and "z" components of the respective vectors. Similarly, for equations (22), (24) and (26),

    0 = (b_1x (1 − t)^2 + 2 b_2x t (1 − t) + b_3x t^2 − h_1x) n_1x and 0 = (b_1z (1 − t)^2 + 2 b_2z t (1 − t) + b_3z t^2 − h_1z) n_1z   (29)

for the "x" and "z" components of the respective vectors.
It will be appreciated that the eight equations (17) through (20), (28) and (29) are all one-dimensional in the respective "x," "y" and "z" components. For the equations (17) through (20), (28) and (29), there are six unknown values, namely, the value of parameter t, the values of the "x" and "z" components of the vector B_1 (that is, values b_1x and b_1z), the "x" and "z" components of the vector B_2 (that is, values b_2x and b_2z), and the "z" component of the vector h1 (that is, value h_1z) corresponding to the point on the curve P_{n=3}(t) for the pixel σD,4. The eight equations (17) through (20), (28) and (29) are sufficient to define a system of equations which will suffice to allow the values for the unknowns to be determined by methodologies which will be apparent to those skilled in the art.
The computer graphics system 10 will, in addition to performing the operations described above in connection with the horizontal direction (corresponding to the "x" coordinate axis), also perform corresponding operations for each of the vertical and two diagonal directions to determine the updated height value h1 for the pixel σD,4. After the computer graphics system 10 determines the updated height values for all four directions, it will average them together. The "z" component of the average of the updated height values corresponds to the height value for the updated model for the object.
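The final combination step is a plain arithmetic mean of the four directional estimates, with the "z" component retained as the height-field entry. A minimal sketch, with illustrative names:

```python
def average_height(estimates):
    """Average the height estimates obtained along the vertical,
    horizontal, and two diagonal directions; the z (last) component
    of each estimate is the quantity that enters the height field."""
    zs = [e[-1] for e in estimates]
    return sum(zs) / len(zs)
```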
The operations performed by the computer graphics system 10 will be described in connection with the flowchart in FIG. 7. Generally, it is anticipated that the operator will have a mental image of the object that is to be modeled by the computer graphics system. With reference to FIG. 7, the initial model for the object is determined (step 100), and the computer graphics system displays a two-dimensional image thereof to the operator based on a predetermined illumination direction, with the display direction corresponding to an image plane (reference image plane 20 depicted in FIG. 2) (step 101). As noted above, the initial model may define a predetermined default shape, such as a hemi-sphere or hemi-ellipsoid, provided by the computer graphics system, or alternatively a shape as provided by the operator. In any case, the shape will define an initial normal vector field N(x,y) and height field H(x,y), defining a normal vector and height value for each pixel in the image. After the computer graphics system 10 has displayed the initial model, the operator can select one of a plurality of operating modes, including a shading mode in connection with the invention, as well as one of a plurality of conventional computer graphics modes, such as erasure and trimming (step 102). If the operator selects the shading mode, the operator will update the shading of the two-dimensional image by means of, for example, the system's pen and digitizing tablet (step 103). While the operator is applying shading to the image in step 103, the computer graphics system can display the shading to the operator.
The shading that is applied by the operator will preferably be a representation of the shading of the finished object as it would appear illuminated from the predetermined illumination direction, and as projected onto the image plane as displayed by the computer graphics system 10. When the operator has updated the shading for a pixel in step 103, the computer graphics system 10 will generate an update to the model of the object. In generating the updated model, the computer graphics system 10 will first determine, for each pixel in the image, an updated normal vector, as described above in connection with FIGS. 3 and 4, thereby to provide an updated normal vector field for the object (step 104). Thereafter, the computer graphics system 10 will determine, for each pixel in the image, an updated height value, as described above in connection with FIGS. 5 and 6, thereby to provide an updated height field for the object (step 105).
After generating the updated normal vector field and updated height field, thereby to provide an updated model of the object, the computer graphics system 10 will display an image of the updated model to the operator from one or more directions and zooms as selected by the operator (step 106), in the process rotating, translating, scaling and/or zooming the image as selected by the operator (step 107). If the operator determines that the updated model is satisfactory (step 108), which may occur if, for example, the updated model corresponds to his or her mental image of the object to be modeled, he or she can enable the computer graphics system 10 to save the updated model as the final model of the object (step 109). On the other hand, if the operator determines in step 108 that the updated model is not satisfactory, he or she can enable the computer graphics system 10 to return to step 101.
Returning to step 102, if the operator in that step selects another operating mode, such as the erasure mode or a conventional operational mode such as the trimming mode, the computer graphics system will sequence to step 110 to update the model based on the erasure information, or the trimming and other conventional computer graphic information, provided to the computer graphics system 10 by the operator. The computer graphics system will then sequence to step 107 to display an image of the object based on the updated model. If the operator determines that the updated model is satisfactory (step 108), he or she can enable the computer graphics system 10 to save the updated model as the final model of the object (step 109). On the other hand, if the operator determines in step 108 that the updated model is not satisfactory, he or she can enable the computer graphics system 10 to return to step 101.
The operator can enable the computer graphics system 10 to perform steps 101, 103 through 107 and 110 as the operator updates the shading of the image of the object (step 103), or provides other computer graphic information (step 110), and the computer graphics system 10 will generate, in steps 104 and 105, the updated normal vector field and updated height field, or, in step 110, conventional computer graphic components, thereby to define the updated model of the object.
When the operator determines in step 108 that the updated model corresponds to his or her mental image of the object, or is otherwise satisfactory, he or she can enable the computer graphics system to store the updated normal vector field and the updated height field to define the final model for the object (step 109).
The invention provides a number of advantages. In particular, it provides an interactive computer graphics system which allows an operator, such as an artist, to imagine the desired shape of an object and how the shading on the object might appear with the object being illuminated from a particular illumination direction and as viewed from a particular viewing direction (as defined by the location of the image plane). After the operator has provided some shading input corresponding to the desired shape, the computer graphics system displays a model of the object, as updated based on the shading, to the operator. The operator can accept the model as the final object, or alternatively can update the shading further, from which the computer graphics system will further update the model of the object. The computer graphics system constructed in accordance with the invention avoids the necessity of solving partial differential equations, which is required in prior art systems which operate in accordance with the shape-from-shading methodology.
A further advantage of the invention is that it readily facilitates the use of a hierarchical representation for the model of the object that is generated. Thus, if, for example, the operator enables the computer graphics system 10 to increase the scale of the object or zoom in on the object, thereby to provide a higher resolution, it will be appreciated that a plurality of pixels of the image will display a portion of the image which, at the lower resolution, was associated with a single pixel.
In that case, if the operator updates the shading of the image at the higher resolution, the computer graphics system will generate the normal vector and height value for each pixel at the higher resolution for which the shading is updated as described above, thereby to generate and/or update the portion of the model associated with the updated shading at the increased resolution. The updated portion of the model at the higher resolution will be associated with the particular portion of the model which was previously defined at the lower resolution, thereby to provide the hierarchical representation, which may be stored. Thus, the object as defined by the model inherits a level of detail which corresponds to a higher resolution in the underlying surface representation.
Corresponding operations can be performed if the operator enables the computer graphics system to decrease the scale of the object or zoom out from the object, thereby providing a lower resolution.
It will be appreciated that a number of variations and modifications may be made to the computer graphics system 10 as described above in connection with FIGS. 1 through 7. For example, the computer graphics system 10 can retain the object model information, that is, the normal vector field information and height field information, for a number of updates of the shading as provided by the operator, which it (that is, system 10) may use in displaying models of the object for the respective updates. This can allow the operator to view images of the respective models to, for example, enable him or her to see the evolution of the object through the respective updates. In addition, this can allow the operator to return to a model from a prior update as the base which is to be updated. This will allow the operator to, for example, generate a tree of objects based on different shadings at particular models.
In addition, although the computer graphics system 10 has been described as making use of Bézier-Bernstein interpolation to determine the updated height field, it will be appreciated that other forms of interpolation, such as Taylor polynomials and B-splines, may be used. In addition, multiple forms of surface representations may be used with the invention. Indeed, since the model generation methodology used by the computer graphics system 10 is of general applicability, all free-form surface representations as well as piecewise linear surfaces consisting of, for example, triangles, quadrilaterals and/or pentagons can be used.
Furthermore, although the computer graphics system 10 has been described as making use of an orthogonal projection and a single light source, it will be appreciated that other forms of projection, including perspective projection, and multiple light sources may be used.
In addition, although the computer graphics system 10 has been described as providing the shape of an object by shading of an image of the object, it will be appreciated that it may also provide conventional computer graphics operations, such as trimming and erasure, through appropriate operational modes of the pen 14 and digitizing tablet.
Furthermore, although the computer graphics system has been described as generating a model of an object on the assumption that the object's surface is Lambertian, it will be appreciated that other surface treatments may be used for the object when an image of the object is rendered.
It will be appreciated that a system in accordance with the invention can be constructed in whole or in part from special purpose hardware or a general purpose computer system, or any combination thereof, any portion of which may be controlled by a suitable program. Any program may in whole or in part comprise part of or be stored on the system in a conventional manner, or it may in whole or in part be provided to the system over a network or other mechanism for transferring information in a conventional manner. In addition, it will be appreciated that the system may be operated and/or otherwise controlled by means of information provided by an operator using operator input elements (not shown) which may be connected directly to the system or which may transfer the information to the system over a network or other mechanism for transferring information in a conventional manner.
The foregoing description has been limited to a specific embodiment of this invention. It will be apparent, however, that various variations and modifications may be made to the invention, with the attainment of some or all of the advantages of the invention. It is the object of the appended claims to cover these and such other variations and modifications as come within the true spirit and scope of the invention.
What is claimed as new and desired to be secured by Letters Patent is:

Where the terms "comprise", "comprises", "comprised" or "comprising" are used in this specification, they are to be interpreted as specifying the presence of the stated features, integers, steps or components referred to, but not to preclude the presence or addition of one or more other feature, integer, step, component or group thereof.

Claims (34)

1. A computer graphics system for generating a structural model of a three-dimensional object by shading by an operator in connection with a two-dimensional image of the object, the image representing the object as projected onto an image plane, the computer graphics system including: A. an operator input device configured to receive shading information provided by the operator, the shading information representing a change in brightness level of at least a portion of the image; B. a model generator configured to receive the shading information from the operator input device and to generate in response thereto an updated structural model of the object, the model generator being configured to use the shading information to determine at least one structural feature of the updated structural model; and C. an object display configured to display the image of the object as defined by the updated structural model.
2. The computer graphics system as defined in claim 1, in which the operator input device includes a pen and digitizing tablet.
3. The computer graphics system as defined in claim 1 or claim 2, further including an updated model store configured to store the updated structural model as a final structural model for the object under control of the operator.
4. The computer graphics system as defined in any one of claims 1 to 3, further including an initial model generator configured to generate an initial structural model for the object, the object display initially displaying an initial image of the object as defined by the initial structural model to the operator.
5. The computer graphics system as defined in claim 4, in which the initial structural model includes a default initial structural model provided by the computer graphics system.
6. The computer graphics system as defined in claim 4 or claim 5, in which the initial structural model is generated in response to shading input provided by the operator for at least one reference pixel.
7. The computer graphics system as defined in any one of claims 1 to 6, in which the model generator includes: A. an updated normal vector generator configured to generate, from updating of the shading of the image as provided by the operator, an updated normal vector for at least a portion of the object; and B. an updated height value generator configured to generate from the updated normal vector an updated height value for the at least a portion of the object, the updated height value representing a height of the at least a portion of the object from the image plane, thereby to update the structural model of the object for the at least a portion of the object.
8. The computer graphics system as defined in claim 7, in which the updated normal vector generator is configured to select the updated normal vector n1 for the at least a portion of the object in accordance with n1 · L = I1, where "L" represents an illumination vector indicative of an illumination level and illumination direction for the object and "I1" represents brightness of the at least a portion of the object as displayed on the image plane.
9. The computer graphics system as defined in claim 8, in which the updated normal vector has a predetermined magnitude.

10. The computer graphics system as defined in claim 9, in which the predetermined magnitude is "one."
11. The computer graphics system as defined in any one of claims 8 to 10, in which the updated normal vector generator is further configured to select the
updated normal vector n1 for the at least the portion of the object in accordance with n1 · (n0 × L) = 0, where "n0" represents a normal vector for the at least a portion of the object prior to the shading.

12. The computer graphics system as defined in any one of claims 8 to 10, in which the updated normal vector generator is further configured to select the updated normal vector n1 for the at least the portion of the object in accordance with |(n1, n0 × L)| < εδ, where "εδ" is a predetermined value.

13. The computer graphics system as defined in any one of claims 7 to 12, in which the updated height value generator is configured to generate the updated height value in accordance with a Bézier-Bernstein interpolation methodology.

14. The computer graphics system as defined in claim 13, in which the updated height value generator is configured to generate the updated height value in relation to a plurality of height values along a plurality of directions along said image plane for the at least the portion of the object.

15. The computer graphics system as defined in any one of claims 1 to 14, in which said model generator is configured to generate a hierarchical surface representation of the structural model including a plurality of resolution levels.

16. The computer graphics system as defined in claim 15, the object display being configured to display said image in a plurality of image resolution levels, said model generator being configured to generate the hierarchical surface representation of the structural model in a plurality of hierarchical surface resolution levels each corresponding to respective image resolution levels.

17. The computer graphics system as defined in claim 16, in which the model generator is configured to generate the plurality of hierarchical surface resolution levels in response to the operator providing shading information at the respective image resolution levels.

18.
A computer implemented graphics method for generating a structural model of a three-dimensional object by shading provided by an operator in connection with a two-dimensional image of the object, the image representing the object as projected onto an image plane, the method including the steps of: A. receiving shading information provided by the operator in connection with the image of the object, the shading information representing a change in brightness level of at least a portion of the image; B. generating in response to the shading information an updated structural model of the object, the shading information being used to determine at least one structural feature of the updated structural model; and C. displaying the image of the object as defined by the updated structural model.

19. The method as defined in claim 18, further including the step of storing the updated structural model as a final structural model for the object under control of the operator.

20. The method as defined in claim 18 or claim 19, further including an initial model generation step in which an initial structural model for the object is generated and displayed to the operator.

21. The method as defined in claim 20, in which the initial structural model includes a default initial structural model.

22. The method as defined in claim 20 or claim 21, in which the initial structural model is generated in response to shading input provided by the operator for at least one reference pixel.

23. The method as defined in any one of claims 18 to 22, in which the model generation step includes the steps of: A. generating, from updating of the shading of the image as provided by the operator, an updated normal vector for at least a portion of the object; and B.
generating from the updated normal vector, an updated height value for the at least a portion of the object, the updated height value representing a height of the at least a portion of the object from the image plane, thereby to update the structural model of the object for the at least a portion of the object.

24. The method as defined in claim 23, in which the updated normal vector generation step includes the step of selecting the updated normal vector n1 for the at least a portion of the object in accordance with n1 · L = I1, where "L" represents an illumination vector indicative of an illumination level and illumination direction for the object and "I1" represents brightness of the at least a portion of the object as displayed on the image plane.
25. The method as defined in claim 24, in which the updated normal vector has a predetermined magnitude.
26. The method as defined in claim 25, in which the predetermined magnitude is "one."
27. The method as defined in any one of claims 24 to 26, in which the updated normal vector generation step further includes the step of selecting the updated normal vector n1 for the at least the portion of the object in accordance with n1 · (n0 × L) = 0, where "n0" represents a normal vector for the at least a portion of the object prior to the shading.
28. The method as defined in any one of claims 24 to 27, in which the updated normal vector generation step further includes the step of selecting the updated normal vector n1 for the at least the portion of the object in accordance with |(n1, n0 × L)| < εδ, where "εδ" is a predetermined value.
29. The method as defined in any one of claims 23 to 28, in which the updated height value generation step includes the step of generating the updated height value in accordance with a Bézier-Bernstein interpolation methodology.

30. The method as defined in claim 29, in which the updated height value generation step includes the step of generating the updated height value in relation to a plurality of height values along a plurality of directions along said image plane for the at least the portion of the object.
31. The method as defined in any one of claims 23 to 30, in which said model generation step includes the step of generating a hierarchical surface representation of the structural model including a plurality of resolution levels.
32. The method as defined in claim 31, the object display step including the step of displaying said image in a plurality of image resolution levels, said model generation step including the step of generating the hierarchical surface representation of the structural model in a plurality of hierarchical surface resolution levels each corresponding to respective image resolution levels.
33. The method as defined in claim 32, in which the model generation step includes the step of generating the plurality of hierarchical surface resolution levels in response to the operator providing shading information at the respective image resolution levels.
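Claims 31 to 33 call for a hierarchical surface representation with one surface resolution level per image resolution level. One conventional way to realize such a hierarchy, shown here purely as an assumption-laden sketch rather than the patented construction, is a pyramid of height grids produced by repeated 2x2 averaging.

```python
def height_pyramid(grid, levels):
    """Build a hierarchy of height grids, one per resolution level, by
    repeated 2x2 averaging; shading edits made at a given image
    resolution can then be applied to the matching surface level."""
    pyramid = [grid]
    for _ in range(levels - 1):
        g = pyramid[-1]
        rows, cols = len(g) // 2, len(g[0]) // 2
        pyramid.append([[(g[2 * r][2 * c] + g[2 * r][2 * c + 1] +
                          g[2 * r + 1][2 * c] + g[2 * r + 1][2 * c + 1]) / 4.0
                         for c in range(cols)]
                        for r in range(rows)])
    return pyramid
```

Each coarser level halves the resolution of the previous one, mirroring the correspondence between image resolution levels and surface resolution levels in claim 32.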
34. A computer graphics computer program product for use in connection with a computer for generating a structural model of a three-dimensional object by shading provided by an operator in connection with a two-dimensional image of the object, the image representing the object as projected onto an image plane, the computer graphics computer program product including a computer-readable medium having encoded thereon: A. an operator input module configured to enable the computer to receive shading information provided by the operator in connection with the image of the object, the shading information representing a change in brightness level of at least a portion of the image; B. a model generator module configured to enable the computer to receive the shading information from the operator input device and to generate in response thereto an updated structural model of the object, the model generator module being configured to enable the computer to use the shading information to determine at least one structural feature of the updated structural model; and C. an object display module configured to enable the computer to display the image of the object as defined by the updated structural model.
35. The computer graphics computer program product as defined in claim 34, further including an updated model store module configured to enable the computer to store the updated structural model as a final structural model for the object under control of the operator.
36. The computer graphics computer program product as defined in claim 34 or claim 35, further including an initial model generator module configured to enable the computer to generate an initial structural model for the object, the object display module initially enabling the computer to display an initial image of the object as defined by the initial structural model to the operator.
37. The computer graphics computer program product as defined in claim 36, in which the initial structural model includes a default initial structural model provided by the computer.
38. The computer graphics computer program product as defined in claim 36 or claim 37, in which the initial structural model is generated in response to shading input provided by the operator for at least one reference pixel.
39. The computer graphics computer program product as defined in any one of claims 34 to 38, in which said model generator module includes: A. an updated normal vector generator module configured to enable the computer to generate, from updating of the shading of the image as provided by the operator, an updated normal vector for at least a portion of the object; B. an updated height value generator module configured to enable the computer to generate from the updated normal vector an updated height value for the at least a portion of the object, the updated height value representing a height of the at least a portion of the object from the image plane, thereby to update the structural model of the object for the at least a portion of the object.
40. The computer graphics computer program product as defined in claim 39, in which the updated normal vector generator module is configured to enable the computer to select the updated normal vector n1 for the at least a portion of the object in accordance with

n1 · L = I

where "L" represents an illumination vector indicative of an illumination level and illumination direction for the object and "I" represents brightness of the at least a portion of the object as displayed on the image plane.
41. The computer graphics computer program product as defined in claim 40, in which the updated normal vector has a predetermined magnitude.
42. The computer graphics computer program product as defined in claim 41, in which the predetermined magnitude is "one."
42. The computer graphics computer program product as defined in claim 41, in which the predetermined magnitude is "one."

43. The computer graphics computer program product as defined in any one of claims 40 to 42, in which the updated normal vector generator module is further configured to enable the computer to select the updated normal vector n1 for the at least the portion of the object in accordance with

n1 · (n0 × L) = 0

where "n0" represents a normal vector for the at least a portion of the object prior to the shading.
44. The computer graphics computer program product as defined in any one of claims 40 to 43, in which the updated normal vector generator is further configured to enable the computer to select the updated normal vector n1 for the at least the portion of the object in accordance with

|n1 · (n0 × L)| < ε

where "ε" is a predetermined value.

45. The computer graphics computer program product as defined in any one of claims 39 to 44, in which the updated height value generator is configured to enable the computer to generate the updated height value in accordance with a Bézier-Bernstein interpolation methodology.

46. The computer graphics computer program product as defined in claim 45, in which the updated height value generator is configured to enable the computer to generate the updated height value in relation to a plurality of height values along a plurality of directions along said image plane for the at least the portion of the object.
47. The computer graphics computer program product as defined in any one of claims 34 to 46, in which said model generator module is configured to enable the computer to generate a hierarchical surface representation of the structural model including a plurality of resolution levels.
48. The computer graphics computer program product as defined in claim 47, in which the object display module is configured to enable the computer to display said image in a plurality of image resolution levels, said model generator module being configured to enable the computer to generate the hierarchical surface representation of the structural model in a plurality of hierarchical surface resolution levels each corresponding to respective image resolution levels.
49. The computer graphics computer program product as defined in claim 48, in which the model generator module is configured to enable the computer to generate the plurality of hierarchical surface resolution levels in response to the operator providing shading information at the respective image resolution levels.

50. A computer-implemented graphics system as claimed in claim 18, substantially as described herein with reference to the accompanying drawings.
51. A computer graphics computer program product as claimed in claim 34, substantially as described herein with reference to the accompanying drawings.

DATED this ... day of January, 2002
MENTAL IMAGES GMBH & CO. KG
By their Patent Attorneys
AU67437/98A 1997-02-21 1998-02-20 System and computer-implemented method for modeling the three-dimensional shape of an object by shading of two-dimensional image of the object Ceased AU744983B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US3888897P 1997-02-21 1997-02-21
US60/038888 1997-02-21
PCT/IB1998/000612 WO1998037515A2 (en) 1997-02-21 1998-02-20 System and computer-implemented method for modeling the three-dimensional shape of an object by shading of a two-dimensional image of the object

Publications (2)

Publication Number Publication Date
AU6743798A AU6743798A (en) 1998-09-09
AU744983B2 true AU744983B2 (en) 2002-03-07

Family

ID=21902479

Family Applications (1)

Application Number Title Priority Date Filing Date
AU67437/98A Ceased AU744983B2 (en) 1997-02-21 1998-02-20 System and computer-implemented method for modeling the three-dimensional shape of an object by shading of two-dimensional image of the object

Country Status (5)

Country Link
EP (1) EP0961992A2 (en)
JP (1) JP4138018B2 (en)
AU (1) AU744983B2 (en)
CA (1) CA2282240C (en)
WO (1) WO1998037515A2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8247965B2 (en) 2003-11-14 2012-08-21 Semiconductor Energy Laboratory Co., Ltd. Light emitting display device and method for manufacturing the same
CN105513054B (en) * 2015-11-26 2019-03-29 北京市计算中心 Inscription rubbing method based on 3-D scanning

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4888713B1 (en) * 1986-09-05 1993-10-12 Cdi Technologies, Inc. Surface detail mapping system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HANRAHAN P ET AL: "Direct WYSIWYG Painting and Texturing on 3D Shapes", SIGGRAPH 1990, 17th Annual ACM Conference on Computer Graphics and Interactive Techniques, Dallas, USA, 6-10 August 1990, Vol. 24, No. 4, pp. 215-223 *

Also Published As

Publication number Publication date
WO1998037515A3 (en) 1998-11-05
JP2001512602A (en) 2001-08-21
CA2282240C (en) 2009-12-29
EP0961992A2 (en) 1999-12-08
WO1998037515A2 (en) 1998-08-27
JP4138018B2 (en) 2008-08-20
AU6743798A (en) 1998-09-09
CA2282240A1 (en) 1998-08-27

Similar Documents

Publication Publication Date Title
EP0950988B1 (en) Three-Dimensional image generating apparatus
KR100415474B1 (en) Computer graphics system for creating and enhancing texture maps
US20070103466A1 (en) System and Computer-Implemented Method for Modeling the Three-Dimensional Shape of An Object by Shading of a Two-Dimensional Image of the Object
Hearn Computer graphics, C version
US5995110A (en) Method and system for the placement of texture on three-dimensional objects
JP5133418B2 (en) Method and apparatus for rendering a virtual object in a real environment
JP5603377B2 (en) Method and apparatus for rendering a virtual object in a real environment
EP0451875B1 (en) Image displaying system
US6724383B1 (en) System and computer-implemented method for modeling the three-dimensional shape of an object by shading of a two-dimensional image of the object
JPH0627930A (en) Method and apparatus for formation, storage generation of three-dimensional image font character and for execution of three- dimensional typesetting
JP2002520749A (en) Method and system for generating a fully textured three-dimensional model
EP1008112A1 (en) Techniques for creating and modifying 3d models and correlating such models with 2d pictures
JP2006120166A (en) Perspective editing tool for 2-d image
CN113593027B (en) Three-dimensional avionics display control interface device
AU2006332582A1 (en) Modeling the three-dimensional shape of an object by shading of a two-dimensional image
CN110428504B (en) Text image synthesis method, apparatus, computer device and storage medium
US5793372A (en) Methods and apparatus for rapidly rendering photo-realistic surfaces on 3-dimensional wire frames automatically using user defined points
AU744983B2 (en) System and computer-implemented method for modeling the three-dimensional shape of an object by shading of two-dimensional image of the object
US5821942A (en) Ray tracing through an ordered array
JP3149389B2 (en) Method and apparatus for overlaying a bitmap image on an environment map
Shen et al. Texture Mapping Volume Objects.
FI108679B (en) A 3D graphics arrangement
Srinivasan et al. Integrating volume morphing and visualization
JPH06231274A (en) Method and device for three-dimensional simulation
Applegate The use of interactive raster graphics in the display and manipulation of multidimensional data

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)