GB2256118A - Image synthesis and processing - Google Patents

Image synthesis and processing

Info

Publication number
GB2256118A
GB2256118A (application GB9110945A)
Authority
GB
United Kingdom
Prior art keywords
data
image
line
points
defining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB9110945A
Other versions
GB9110945D0 (en)
Inventor
Andrew Louis Berend
Mark Jonathan Williams
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cambridge Animation Systems Ltd
Original Assignee
Cambridge Animation Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cambridge Animation Systems Ltd filed Critical Cambridge Animation Systems Ltd
Priority to GB9110945A (GB2256118A)
Publication of GB9110945D0
Priority to JP4500061A (JPH06505817A)
Priority to AU89321/91A (AU8932191A)
Priority to AU90158/91A (AU9015891A)
Priority to PCT/GB1991/002124 (WO1992009966A1)
Priority to EP91920646A (EP0559708A1)
Priority to EP91920852A (EP0559714A1)
Priority to PCT/GB1991/002122 (WO1992009965A1)
Priority to JP4500477A (JPH06503663A)
Priority to US07/844,634 (US5692117A)
Priority to JP4510508A (JPH06507742A)
Priority to AU17934/92A (AU1793492A)
Priority to EP19920910492 (EP0586444A1)
Priority to PCT/GB1992/000928 (WO1992021096A1)
Priority to US08/150,100 (US5598182A)
Priority to JP4510509A (JPH06507743A)
Priority to AU17921/92A (AU1792192A)
Priority to PCT/GB1992/000927 (WO1992021095A1)
Priority to EP92910474A (EP0585298A1)
Publication of GB2256118A
Priority to US08/311,398 (US5611036A)
Priority to US08/643,322 (US5754183A)


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour

Abstract

A method of computer painting comprising defining the path of a brush stroke by parametric curve control data A, B; defining and interactively editing the attributes (for example, stroke width; stroke transparency and stroke colour) at predetermined points A1 to A4, B1, B2 along the path, using a supervisory display of the path and the points thereon; and generating the intervening points by interpolation, for storage in a frame store and subsequent reproduction as an image of the brush stroke.

Description

IMAGE SYNTHESIS AND PROCESSING

This invention relates to apparatus for, and a method of, image generation and processing, particularly but not exclusively for use in computer illustration or in cartoon animation.
Prior to the advent of digital computers, the tools available to the graphic artist included pencils or pens (generally producing a substantially uniform line), and brushes or airbrushes (producing a controllably nonuniform line, variable by varying the pressure and/or speed applied to the tool).
Digital image processing apparatus is known which provides simulations of these manually operable graphics tools. For example, British Patents 2059625, 2140257, 2113950 and 2147122 describe aspects of the "Paintbox" system available from Quantel Limited. With this system, an operator selects the characteristics of the graphics tool he wishes to imitate and then manipulates a pressure sensitive stylus over a digitising pad to input a desired line. As the stylus is moved over the tablet, the apparatus senses the stylus position and the pressure applied thereto, reads image data from a corresponding mapped area of an image store (e.g. a frame buffer), modifies the data in accordance with the sensed pressure, and writes it back into the store. The system is arranged and intended to simulate a conventional graphics tool such as a paintbrush or airbrush, and the artist exerts control over the parameters of the line "drawn" in the image store in the same way, so that the width and other attributes of the line are controlled as the stylus moves, and the stored image data comprises a direct representation of the line itself, corresponding to a manually painted line.
It is known in computer graphics to represent objects as parametric curves, the curve shape being specified and controlled by data representing the positions of points on the curve and the tangents thereat, as disclosed in, for example, "Interactive Computer Graphics", P Burger and D Gillies, 1989, Addison Wesley, ISBN 0-201-17439-1.
In "Hairy Brushes", Strassman, 1986 Siggraph Conference Proceedings (Vol 20, No 4, Page 225-232), a system for emulating paintbrushes of a particular kind is described in which a brush stroke is first defined by data specifying the linear trajectory of the brush (as point positions and tangents), and the pressure applied to the brush (which in turn specifies the width of the stroke normal to the line running along the trajectory), and then the colour profile laterally across the brush stroke is specified by the user defining a profile of individual bristle colours laterally across the brush stroke. It is suggested that profiles could be defined at the start and end of the stroke, and the colour profile along the stroke be interpolated from the end values.
As that system is intended to simulate particular types of existing brush, it makes a distinction between properties of the stroke (its trajectory and its pressure - dictated width) and those of the brush (its colour profile).
In one aspect, the present invention provides image processing apparatus and a method of image processing in which the width of an object, for example a representation of a brush stroke, can be varied independently of the path of that object. In another aspect, the invention provides a method and apparatus for image processing in which attributes of an object in an image to be generated are manipulable independently of its path, and are displayed symbolically to enable interactive manipulation thereof.
In yet another aspect, the invention provides a method and apparatus for drawing lines parametrically, in which one line is connectable to another so as to be movable therewith without altering the other to which it is connected.
In a further aspect, the invention provides a method and apparatus for parametric line drawing, in which parametric control points may be defined and selectively activated or de-activated.
Other aspects and preferred embodiments of the invention will be apparent from the following description and claims.
The invention will now be illustrated by way of example only with reference to the accompanying drawings, in which:

FIG 1 shows schematically the elements of apparatus according to an embodiment of the invention;
FIG 2 shows schematically the arrangement of an image store forming part of FIG 1;
FIGS 3A and 3B show schematically displays produced by the apparatus of FIG 1 on a monitor forming part thereof;
FIG 4 shows schematically the arrangement of data in a memory forming part of the apparatus of FIG 1;
FIG 5 shows schematically the functional elements of apparatus for generating the display of FIG 3B;
FIG 6 shows schematically the process by which the apparatus of FIGS 1 and 5 produces that display;
FIG 7 shows schematically the functional elements of the apparatus of FIG 1 for allowing input of data to produce that display;
FIG 8 shows the process performed by the apparatus of FIG 7 for inputting such data;
FIG 9 shows schematically the functional elements of the apparatus of FIG 1 for producing the display of FIG 3A;
FIG 10 illustrates schematically the relationship between the displays of FIGS 3A and 3B;
FIG 11 illustrates schematically the representation of data specifying colour in the display of FIG 3A;
FIG 12 illustrates schematically the arrangement of data representing opacity in the display of FIG 3A;
FIG 13 illustrates schematically the process performed by the apparatus of FIG 7 in editing the data shown in FIGS 10-12;
FIG 14 shows schematically the arrangement of apparatus forming part of the apparatus of FIG 1 for entering or editing the data represented in FIGS 11 and 12;
FIG 15 shows schematically the process performed by the apparatus of FIG 14;
FIG 16 shows schematically a line displayed upon the display of FIG 3A;
FIG 17 shows schematically the arrangement of attribute data within the table of FIG 4;
FIG 18 shows schematically the flow of operation of the apparatus of FIG 9;
FIG 19 shows schematically the arrangement of data within the memory of FIG 1 produced during the process of FIG 18;
FIG 20 shows schematically the positions from which that data is derived in the display of FIG 16;
FIG 21 shows schematically the arrangement of data corresponding to a further stage in the process of FIG 18;
FIG 22 shows schematically the relation between points held in the table of FIG 21 and image points stored within the generated image frame store of FIG 2;
FIG 23 shows schematically a problem encountered during the process of FIG 18;
FIGS 24A-E show a set of supervisory display representations of an object;
FIGS 25A-E show the corresponding generated images of the object;
FIGS 26A-C show attribute settings corresponding to the objects of FIGS 24B-D and 25B-D;
FIG 27 shows schematically a generated image containing further objects;
FIG 28 shows the corresponding supervisory image;
FIG 29 shows schematically a flow of operations of an embodiment of the invention for automatically varying the appearance of an object;
FIG 30 shows schematically a supervisory display indicating the effect of translating an object path;
FIGS 31A-C show schematically supervisory displays indicating the effects of scaling an object;
FIGS 32A-B show schematically a supervisory display indicating the effects of translating attributes of an object relative to its path;
FIG 33 shows schematically an illumination effect produced by an embodiment of the invention;
FIG 34 shows schematically the connection of objects in a further embodiment of the invention; and
FIG 35 shows schematically the arrangement of data within the memory 120 corresponding to the connected objects of FIG 34.
Referring to FIG 1, apparatus according to an embodiment of the invention comprises a computer 100 comprising a central processing unit 110, a memory device 120 for storing the program sequence for the CPU 110 and providing working read/write memory, a frame store 130 comprising a series of memory locations each associated with, or mapped to, a point in an image to be generated or processed, and an input/output controller 140 providing input and output ports for reading from and writing to external devices, all intercoupled through common parallel data and address buses 150.
A monitor 160 is connected to the computer 100 and its display updated from the frame store 130 under control of the CPU 110. At least one user input device 170a,170b is provided; typically a keyboard 170b for inputting commands or control signals for controlling peripheral operations such as starting, finishing and storing the results of an image generation or image processing operation, and a position sensitive input device 170a such as, in combination, a stylus and digitising tablet, or a "mouse", or a touch sensitive screen on the monitor 160, or a "trackerball" device or a joystick. A cursor symbol is generated by the computer 100 for display on the monitor 160 in dependence upon the signal from the position sensitive input device 170a to allow a user to inspect an image on the monitor 160 and select or designate a point or region of the image during image generation or processing.
A mass storage device 180 such as, for instance, a hard disk device is preferably provided as a long term image store, since the amount of data associated with a single image stored as a frame at an acceptable resolution is high. Preferably, the mass storage device 180 also or alternatively comprises a removable medium storage device such as a floppy disk drive, to allow data to be transferred into and out from the computer 100.
Also preferably provided, connected to the input/output controller 140, is a printer 190 for producing a permanent visual output record of the image generated. The output may be provided on a transparency or on a sheet of paper.
A picture input device (not shown) such as a scanner for scanning an image on, for example, a slide, may also be provided.
Referring to FIG 2, the frame store device 130 comprises a pair of image stores 130a,130b. The image store 130a stores the image point, or pixel, data for the image to be generated or processed. The second area, 130b, stores a supervisory or control image displayed during generation or processing of the image stored in the store 130a. The supervisory image may be represented at a lower resolution than the generated image and/or in monochrome, and hence the image store 130b may be smaller than the store 130a.
Referring to FIG 3A, the appearance of the contents of the generated image store 130a, when displayed on the monitor 160 or output by the printer 190, comprises, as shown, objects a,b each having a trajectory and an extent, against a background c. The objects a,b and background c also possess colour (or, in a monochrome system, brightness).
The supervisory display illustrated in FIG 3B comprises, associated with each object, a line A, B and, disposed upon the or each line, a number of points A1,A2,A3,A4,B1,B2.
The contents of the generated image frame store 130a therefore comprise a plurality of point data defining colour and/or intensity of each of a plurality of points to be displayed to form the display shown in FIG 3A, for example, 500 x 500 image point data, each comprising colour or brightness information as a multi-bit digital number. Typically, several bits representing each of Red (R), Green (G) and Blue (B) are provided. Preferably, the frame store 130a is of the type which additionally stores a transparency value for each image point to allow the generated image to be merged with another image. The address within the store 130a of given point data is related, or mapped, to its position in the display of FIG 3A, which will hereafter be referred to in X (horizontal position) and Y (vertical position) Cartesian co-ordinates.
Likewise, the contents of the supervisory image store 130b comprise point data for a plurality of points making up the image of FIG 3B; in this case, however, the display may comprise only a monochrome line and the point data may for each point merely comprise a single bit set to indicate either a dark point or a light point.
A line shown on the supervisory display image of FIG 3B is therefore represented by a plurality of pixel values at corresponding X,Y positions within the supervisory image store area 130b. However, this representation of the line is difficult to manipulate if the line is to be amended. A second representation of the line is therefore concurrently maintained in the working memory area 121 of the memory device 120. This representation comprises a plurality of data defining the curve in vector form. Conveniently, the curve is represented by the position of points ("control points") between which intervening curve values can be derived by calculation.
For each line A,B, a table 122 is provided in the working memory 121 storing the control point data for that line as shown in FIG 4. Conveniently, the curve connecting the points is a spline curve; particularly conveniently a cubic spline defined by

x = a_x t^3 + b_x t^2 + c_x t + d_x
y = a_y t^3 + b_y t^2 + c_y t + d_y    (equation 1)

where a, b, c, d are constants and t is a parameter allocated values between 0 and 1.
Conveniently, the data are stored in Bezier curve format, that is to say, as a plurality of point data each comprising point X,Y co-ordinates, data representing the slope of the tangent to the curve at those coordinates, and a tangent magnitude parameter indicating (broadly speaking) the extent to which the curve follows the tangent. This format is used, for example, in control of laser printer output devices.
The data may be stored as shown in FIG 4, as point coordinates x and y, and tangent angle and length (Hermite form), or alternatively as point coordinates x and y and tangent end coordinates (= x + r cos θ, y + r sin θ) (Bezier form). In the following, 'Bezier' format will be used to describe both. Full details will be found in "An Introduction to Splines For Use in Computer Graphics and Geometric Modelling", R H Bartels et al, especially at pages 211-245, published by Morgan Kaufmann, ISBN 0-934613-27-3.
Complex curved lines can be represented by a number of such control points, two (at least) for each inflexion in the line. The control points stored in the line table 122 each define, between adjacent points, a line segment described by a corresponding cubic equation, and are the values at which the parameter t in that equation is 0 and 1. As intervening points in the line (e.g. POINT 2) play a part in defining two neighbouring line segments, each is effectively two control points and consequently has two stored tangents (TAN2a,b, R2a,b). Although the above described Bezier format is particularly convenient, other parametric ways of representing a curve by control points may be employed, such as the B-spline form, in which the curve control points are not required to lie upon the curve which they characterise.
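To make the stored representation concrete, here is a minimal Python sketch of a Hermite-form control point and of deriving the cubic coefficients of equation 1 for one segment. It is illustrative only, not taken from the patent: the class and function names are invented, and treating the stored magnitude r directly as the Hermite tangent length is an assumption.

```python
import math
from dataclasses import dataclass

@dataclass
class ControlPoint:
    # Hermite form, as in the line table 122 of FIG 4: position plus
    # tangent angle and magnitude (field names are illustrative).
    x: float
    y: float
    theta: float  # tangent angle
    r: float      # tangent magnitude

    def bezier_handle(self):
        # Equivalent Bezier form: the tangent end coordinates
        # (x + r cos theta, y + r sin theta) described in the text.
        return (self.x + self.r * math.cos(self.theta),
                self.y + self.r * math.sin(self.theta))

def hermite_coefficients(p0: ControlPoint, p1: ControlPoint):
    """Derive the constants a, b, c, d of equation 1 for the segment
    from p0 (t = 0) to p1 (t = 1), per axis."""
    def axis(q0, q1, m0, m1):
        # Cubic Hermite basis rearranged into a*t^3 + b*t^2 + c*t + d.
        return (2*q0 - 2*q1 + m0 + m1,      # a
                -3*q0 + 3*q1 - 2*m0 - m1,   # b
                m0,                         # c
                q0)                         # d
    mx0, my0 = p0.r * math.cos(p0.theta), p0.r * math.sin(p0.theta)
    mx1, my1 = p1.r * math.cos(p1.theta), p1.r * math.sin(p1.theta)
    return axis(p0.x, p1.x, mx0, mx1), axis(p0.y, p1.y, my0, my1)
```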
SUPERVISORY DISPLAY

Path Display

Referring to FIG 5, to generate the supervisory display shown in FIG 3B, supervisory display generating means 111 reads the control point data from the corresponding line table 122 in the memory 120, and calculates the values of intervening points on the curve. It then accesses the supervisory display image store 130b and sets the values of the corresponding image points therein, to cause display of the generated line on the monitor 160. In practice, the line generating means 111 comprises the CPU 110 operating under control of a program stored in a program store area 129 of the memory 120.
If the display device onto which the supervisory display is to be shown is arranged to accept a vector input signal, of course, the supervisory display image store 130b is unnecessary and the generating means 111 merely outputs the vector information from the table 122 to the display, for example as a command in the "Postscript" graphics computer language.
Separate monitor devices 160a,160b could be provided, one for each of the supervisory display and generated display; for instance, the supervisory display monitor 160a may be a monochrome personal computer monitor provided with the computer 100, and the monitor 160b for the generated image a high resolution colour monitor.
Alternatively, the computer 100 may be arranged to alternately select one of the supervisory display and generated image display for display on the monitor 160, by alternately connecting the frame stores 130a or 130b thereto. Normally, the supervisory display would be shown, except where it is desired to view the effect of editing an object in the generated image.
Alternatively, a single monitor 160 could be arranged to display both displays adjacent or one overlaying the other as a window. In a further alternative, the outputs of both the frame stores 130a,130b may be connected so that the supervisory display overlies the generated image; in this case, the supervisory display may be indicated by dashed lines or in any other convenient manner so as not to be confusable with the generated image.
The general flow of operation in generating the path lines shown in the supervisory display on the display device 160, from the data held in the table 122, is shown in FIG 6.
In one method of generating the line, the position and tangent data for a pair of adjacent control points is read from the table 122, and the parameters a,b,c,d of equation 1 are derived therefrom. A large number of intervening values of the parameter t between 0 and 1 are then sequentially calculated to provide x,y coordinates of intervening points along the line, and these are quantised to reflect the number of image points available in the supervisory display, and corresponding point data in the supervisory image store 130b are set. Once all intervening points between that pair of control points have been calculated, the supervisory display generator 111 accesses the next pair of points in the table 122.
This method is relatively slow, however; faster methods will be found in the above Bartels reference.
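As a rough sketch of the naive stepping method described above (not the patent's code; the function name and fixed step count are assumptions, and the supervisory store is modelled as a 2-D array of single-bit values):

```python
def draw_path_segment(store, coeffs_x, coeffs_y, steps=100):
    """Plot one segment of equation 1 into a supervisory image store by
    stepping the parameter t from 0 to 1 and quantising each point."""
    ax, bx, cx, dx = coeffs_x
    ay, by, cy, dy = coeffs_y
    for i in range(steps + 1):
        t = i / steps
        x = ((ax * t + bx) * t + cx) * t + dx
        y = ((ay * t + by) * t + cy) * t + dy
        xi, yi = round(x), round(y)  # quantise to available image points
        if 0 <= yi < len(store) and 0 <= xi < len(store[0]):
            store[yi][xi] = 1        # set a dark point
```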
The curve or path vector data held within the line tables 122 may have been stored therein from different sources.
For instance, the data may be read from a file within the mass storage (for example disk) device 180. Alternatively, they could be derived by fitting a spline approximation to an input curve represented by data derived, for instance, from a scanner or from a user operated digitising pad and stylus. However, a particularly preferred method of allowing the input and modification of the point data will now be described.
Path Input and Editing

In this embodiment, referring to FIGS 7 and 8, a user manipulates the position sensing input device 170a, for example a "mouse", by moving the device 170a so as to generate a signal indicating the direction and extent of the movement. This signal is sensed by the device input/output controller 140, which provides a corresponding signal to a cursor position controller 112 (in practice, provided by the CPU 110 operating under stored program control) which maintains stored current cursor position data in x,y co-ordinates and updates the stored cursor position in accordance with the signal from the device input/output controller 140. The cursor position controller 112 accesses the supervisory display store area 130b and amends the image data corresponding to the stored cursor position to cause the display of a cursor position symbol D on the supervisory display shown on the monitor 160. The user may thus, by moving the input device 170a, move the position of the displayed cursor position symbol D.
In a preferred embodiment, the supervisory display line generator 111 is arranged not only to write data corresponding to the line A into the supervisory display store 130b, but also to generate a display of the control point data. Accordingly, for each control point A1,A2, the supervisory image generator 111 writes data representing a control point symbol (for example, a dark blob) into the image store 130b at address locations corresponding to the control point co-ordinates x,y.
Further, the supervisory image generator 111 preferably, for each control point, correspondingly generates a second control point symbol E1 located relative to the point A1 along a line defined by the control point tangent data, at a length determined by the control point magnitude data; preferably, a line between the two points A1 and E1 is likewise generated to show the tangent itself.
To enter a line A, the user signals an intention so to do (for example by typing a command on the keyboard 170b, or by positioning the cursor symbol at a designated area of a displayed control menu), positions the cursor symbol D at a desired point on the display 160 by manipulating the position sensitive input device 170a, and generates a control signal to indicate that the desired point has been reached. The cursor position controller 112 supplies the current cursor position data to the table 122 as control point position co-ordinates, and the supervisory display generator 111 correspondingly writes data representing a control point symbol into the image store 130b at address locations corresponding to the control point co-ordinates. The user then inputs tangent information, for example via the keyboard 170b, or in the manner described below. When a second path control point has been thus defined and stored in the table 122, the supervisory image generator 111 will correspondingly generate the line segment therebetween on the supervisory display by writing the intervening image points into the supervisory display store 130b.
To amend the shape or path of the line A displayed on the supervisory display, a user manipulates the input device 170a to move the cursor position symbol D to coincide with one of the control point symbols A1 or E1 on the display 160. To indicate that the cursor is at the desired position, the user then generates a control signal (for example, by "clicking" a mouse input device 170a). The device input/output controller 140 responds by supplying a control signal to the cursor position controller 112. The cursor position controller 112 supplies the cursor position data to a supervisory display editor 113 (comprising in practice the CPU 110 operating under stored program control), which compares the stored cursor position with, for each point, the point position (X,Y) and the position E of the end of the tangent (derived from X,Y, the tangent and the magnitude thereof).
When the cursor position is determined to coincide with any point position A1 or tangent end position E1, the display editor 113 is thereafter arranged to receive the updated cursor position from the cursor controller 112 and to amend the point data corresponding to the point A1 with which the cursor symbol coincides, so as to move that point to track subsequent motion of the cursor.
If the cursor is located at the point A1 on the curve A, manipulation by a user of the input device 170a amends the position data (X1,Y1) in the line table 122, but leaves the tangent data unaffected. If, on the other hand, the cursor is located at an end of tangent point E1, manipulation by a user of the input device 170a alters the magnitude and tangent data in the line table 122 within the memory 120, leaving the position data (x,y) unaffected.
In either case, after each such amendment to the contents of the line table 122, the supervisory display generator 111 regenerates the line segment affected by the control point in question within the supervisory display image store 130b so as to change the representation of the line on the supervisory display.
Once a line has been amended to a desired position, the user generates a further control signal (e.g by "clicking" the mouse input device 170a), and the supervisory display editor 113 thereafter ceases to amend the contents of the memory 120. The cursor controller 112 continues to update the stored cursor position.
This method of amending the line representation is found to be particularly simple and quick to use.
GENERATED IMAGE

The relationship between the contents of the supervisory image store 130b and the generated image store 130a will now be discussed in greater detail.
Referring to FIGS 3A and 3B, the display of FIG 3B represents only lines, and corresponds to, for example, the output of an animation program or a PostScript (TM) page design program. The objects a,b shown in FIG 3A correspond to the lines A,B shown in FIG 3B insofar as their general trajectory or path is concerned, but differ therefrom by displaying one or more of the following additional attributes:

Extent : Each object a,b in FIG 3A has a finite width.
This width is not necessarily constant along the object.
Colour : Each object a,b in FIG 3A may be coloured, and the colour may vary along the line of each object. The profile of colour across the width of the object may also be non-constant.
Opacity : An object a,b may be given the appearance of a semitransparent object positioned in front of the background c, by providing that the colour of a part of the object a be influenced by the colour of the background c, to an extent determined by an opacity parameter varying between 0 (for a transparent or invisible object, the colour of which is entirely dictated by the colour of the background c) and unity (for an entirely opaque object, the colour of which does not depend on that of the background c). The effect of the opacity of the object is significant when the object is moved, since parts of the object exhibiting some transparency will show an altered appearance depending upon the colour of the background c.
The manner in which these attributes of the generated line shown in the display of FIG 3A may be manipulated by a user will now be discussed in general terms.
The objects a,b to be displayed are represented within the frame store 130a in the same form in which they are normally represented within computer painting systems, as an array of stored image point data. However, changing the representation of attributes in this form requires a very large amount of data processing, since a large number of pixel values must be amended. Further, it is not possible to change the position or shape of a line whilst leaving the other above listed attributes unaffected.
Rather than storing colour information for every image point in the object a or b, this embodiment accordingly stores information corresponding to attributes of the object associated with predetermined points along the line A shown in the supervisory display, and the corresponding values at intervening points in the object a are generated therefrom by an image generator device 114 shown in FIG 9 comprising, in practice, the CPU 110 operating under stored program control, and the generated image data stored at corresponding positions within the generated image store 130a for display on the generated image display on the monitor 160.
Referring to FIGS 10-13, the nature of the attribute data will now be discussed in greater detail.
ATTRIBUTE DATA

Lateral Extent

Referring to FIG 10, a line A on the supervisory display and a corresponding object a on the generated image display are shown superimposed. The object a has, at any point along the line A, a lateral extent on either side of the line A. It is found that a good approximation to most object shapes can be provided by specifying the lateral extent only at a small number of points F1,F2,F3,F4 along the line A, and calculating the values of lateral extent at intervening points along the line by linear or other interpolation between the values at those points F1-F4. If the width of the line is to be variable along the line, at least two such points F1,F2 must be specified. Thus, for each such width or extent point, the data stored comprises an indication of the position of the point along the line A, and an indication of the width or extent on either side of the line e1, e2.
Conveniently, these widths are taken normal to the tangent to the line at the corresponding point F.
It is possible to specify two extent values both lying on the same side of the line. In this case, the generated image will not include the path of the line but will be offset from it. The extent values therefore define the positions of the edges of the object relative to the line, but not necessarily about the line. The data specifying the width or extent of a given line therefore comprises extent control data comprising extent values corresponding to spaced apart width control points F1-F4. It should particularly be noted that the positions of these width control points F1-F4 need not correspond to the curve control points A1, A2 specifying the curvature of the line A.
It is preferred to thus keep the number of curve control points at a minimum to keep the path smooth and unbroken, whilst providing sufficient intervening attribute control points to select a desired object appearance.
The control point coordinates and their associated extent data are stored together with the curve control point data in the line table 122. The image generating means 114 therefore comprises means for reading the curve control point data and generating curve points therebetween, as with the supervisory display generator 111, and further comprises means for reading the attribute control point data and for generating the attribute (e.g. extent) values at points in between, and means for storing, in the generated image store 130a, image data values corresponding to the object a comprising the line A surrounded by an area having the interpolated attribute values.
It is preferred that the interpolation between the stored extent values is performed in such a manner that discontinuities of curvature do not occur; for this purpose, the interpolation is preferably not linear but takes the form of a spline curve, for example, a cubic spline. Accordingly, together with the extent values e1, e2, further data controlling the curvature is stored; preferably, in the form of Bezier control data (i.e. as a tangent value and, optionally, a magnitude value, defining the value of the tangent to the curve at the ends of the lateral extents e1,e2). It would also be possible to specify a pair of tangent values at each extent value so as to permit a controlled discontinuity, if required. This also permits the curvature of the edges of the object to be controlled independently of that of the path defining the object.
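By way of illustration, a minimal Python sketch of interpolating the extents between width control points follows. It is not from the patent: the data layout and function name are assumptions, and linear interpolation is shown for brevity although, as just noted, a cubic spline is preferred.

```python
import bisect

def interpolate_extents(width_points, t):
    """width_points: list of (t_i, e1_i, e2_i) tuples, sorted by t_i,
    one per width control point (F1-F4 in FIG 10).  Returns the
    interpolated extents (e1, e2) at parametric position t."""
    ts = [p[0] for p in width_points]
    i = bisect.bisect_right(ts, t)
    if i == 0:                      # before the first control point
        return width_points[0][1:]
    if i == len(width_points):      # after the last control point
        return width_points[-1][1:]
    (t0, a1, a2), (t1, b1, b2) = width_points[i - 1], width_points[i]
    u = (t - t0) / (t1 - t0)
    return a1 + u * (b1 - a1), a2 + u * (b2 - a2)
```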
It would likewise be possible to specify the colour of the object a at spaced apart points along the line A.
However, this permits only linear colour changes along the line A. It is preferred to specify colour as data which permits a variation of colour across the line A as well as along it. In a preferred embodiment, referring to FIG 11, colour information is therefore stored representing the variation of colour through the width of the line at each of a plurality of points along the line A.
Rather than storing colour data for each point along the section through the line, preferably the colour data stored comprises the colour value for each of a small number of points along the cross-section (C1-C4), and the image generating means 114 correspondingly generates the colour values at the intervening points by interpolation therebetween. Colour values are set at points C1 and C4 corresponding to the greatest lateral extents of the object a at a colour control point. The positions therebetween of the intervening points could be predetermined but are preferably selectable by the user, in which case an indication of the position along the cross-section between the two extents e1,e2 is stored with each value C2,C3. Preferably, the position data stored comprises a fraction of the distance between the two extents e1,e2. Thus, in this embodiment, if the extent data are changed (for example, to reduce the width of the line), the colour profile is automatically repositioned. For ease of understanding, the profile shown in FIG 11 is on a monochrome scale from black to white; in general, colour value data are typically stored as a set of R, G, B values defining a colour by reference to primary colours.
Colour information specifying the variation in colour of the objects a,b is thus stored as a plurality of colour value data C1-C4, each associated with a lateral position transverse to the line A corresponding to that object a at a colour control point G1,G2. The colour control points G1,G2 need not be co-located with the width control points; the extent data e1,e2 may be derived by the image generator 114 during image generation, by interpolation between width control points F1,F2.
The colour control point data associated with a given line are stored, in the same manner as the width control data, in the line table 122.
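A hedged Python sketch of sampling such a colour profile at a fractional position across the stroke (names are invented; the paired-value representation for deliberate discontinuities, described later, is not handled here):

```python
def sample_colour_profile(profile, frac):
    """profile: list of (position, (r, g, b)) pairs across the stroke,
    position being a fraction 0..1 of the distance between the extents
    e1, e2 (C1 at 0.0, C4 at 1.0 in FIG 11), sorted by position.
    Returns the linearly interpolated colour at position frac."""
    if frac <= profile[0][0]:
        return profile[0][1]
    for (p0, c0), (p1, c1) in zip(profile, profile[1:]):
        if p0 <= frac <= p1:
            u = (frac - p0) / (p1 - p0) if p1 > p0 else 0.0
            return tuple(a + u * (b - a) for a, b in zip(c0, c1))
    return profile[-1][1]
```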
Opacity

Referring to FIG 12, opacity or transparency data, specifying, as a fraction, the relative dependence of the colour of the object a on the colour data for the object a relative to the colour data for the background c, is likewise stored in the line table 122, corresponding to opacity control points H1,H2, in the same manner as described above for colour data, except that an opacity value is stored rather than a colour value. It is therefore possible to vary the degree of transparency of the object across its lateral extent, as well as along its length.
The image generator 114 is therefore arranged preferably to initially derive the extent of the object a at either side of the line A using the extent data, by interpolating between extent data values at extent control points, then to derive colour data values by interpolating across the line A and along the line A between colour control points, and to do likewise to derive transparency values, and finally to set the colours of image points stored in the generated image store 130a by reading the stored background colour, and forming for each image point the interpolated colour value multiplied by the interpolated opacity value, together with the background colour value multiplied by unity less the interpolated opacity value.
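The compositing rule at the end of that pipeline is compact enough to state directly. A minimal sketch (the function name is illustrative):

```python
def composite(stroke_rgb, opacity, background_rgb):
    """Final image-point colour as described above: the interpolated
    stroke colour times the interpolated opacity, plus the stored
    background colour times unity less the opacity."""
    return tuple(opacity * s + (1.0 - opacity) * b
                 for s, b in zip(stroke_rgb, background_rgb))
```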
The process of setting and amending the values of the above attributes will now be discussed in greater detail.
Attribute Input and Editing

Referring to FIG 13, to set up the attribute values for an object a to be displayed on the monitor 160, the user generates a control signal (typically by typing an appropriate command on the keyboard 170b, or by positioning the cursor symbol on a specified part of the screen of the monitor 160 and "clicking" a mouse input device 170a) indicating that an attribute is to be input or added to the object.
The user positions the cursor symbol at a point on the line A shown on the supervisory display and generates a further control signal (e.g. by "clicking" the mouse 170a). The supervisory display editor 113 receives the cursor position from the cursor controller 112, and writes a corresponding attribute control point symbol into a corresponding position in the supervisory display image store 130b, which is subsequently displayed on the monitor 160.
The stored cursor position indicating the position along the line at which the control point is placed by the user is then processed for storage in the attribute line data within the line table 122 in the memory 120. The cursor position is not directly stored since, if the user subsequently repositioned the line as discussed above, the attribute control point would no longer lie on the line. Instead, an indication of the relative position along the line, between its two neighbouring curve control points, is derived and this indication is stored so that the position of the attribute control point is defined relative to the position of the line, regardless of subsequent redefinitions of the line position.
This may be achieved, for example, by accessing the line table, reading the Bezier control point information, deriving therefrom the cubic spline equation 1 above and solving for a value t at the cursor X,Y coordinates. If the cursor is not exactly on the line, the value of t at the closest point on the line is derived, for example so as to set (x - x_t)^2 + (y - y_t)^2 to a minimum. The value of the parameter t is then stored as an entry in the attribute data within the line table 122.
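A minimal Python sketch of this nearest-point search (not from the patent; a coarse sampling over t is an assumed simplification, since solving the quintic stationary-point equation exactly would be more precise):

```python
def closest_parameter(coeffs_x, coeffs_y, px, py, samples=200):
    """Return the t in [0, 1] on one cubic segment of equation 1 that
    minimises (px - x(t))^2 + (py - y(t))^2, for snapping the cursor
    position (px, py) onto the line."""
    def point(t):
        ax, bx, cx, dx = coeffs_x
        ay, by, cy, dy = coeffs_y
        return (((ax*t + bx)*t + cx)*t + dx,
                ((ay*t + by)*t + cy)*t + dy)
    best_t, best_d = 0.0, float("inf")
    for i in range(samples + 1):
        t = i / samples
        x, y = point(t)
        d = (px - x) ** 2 + (py - y) ** 2
        if d < best_d:
            best_t, best_d = t, d
    return best_t
```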
The values of colour, width and opacity are initially set to a predetermined flag value (indicating to the supervisory display generator 111 that the attribute control point has no effect), and are then alterable by the user.
Preferably the alteration is incremental and graphically illustrated; for example, to set the extent at a control point on either side of the line, the user moves the input device 170a so that the cursor controller 112 positions the cursor symbol at the control point, generates an appropriate control signal (by typing a command on the keyboard 170b) and thereafter causes the cursor symbol to move outwardly from the line A a desired distance to indicate the desired extent at that point.
The current cursor position is employed to calculate a line, normal to the line A, drawn from the attribute control point, to approximately track the cursor position. In this manner, the user may interactively increase or reduce the width of the object a by increasing or reducing the length of the displayed extent line by moving a control device 170a (for instance, a mouse). When the desired value is found, the user generates a further appropriate control signal (via the keyboard 170b).
The length e of the extent line is then calculated by the supervisory display editor 113, for example by calculating the square root of the sum of squares of the differences in X and Y coordinates of the attribute control point and of the end of the extent line. This value is then written into the extent data stored in the line table 122.
Colour and transparency data, when in the above described format, require the input both of positions along the extent lines at the control point and of colour or opacity values at those positions. Preferably, the positions are specified by placing the cursor symbol at the desired position along the displayed extent line and generating a control signal; the cursor position is converted to a fraction of the distance between the two extents at that attribute control point by the supervisory display editor 113, as discussed above, by calculating the ratio of the difference between X or Y coordinates of the cursor position and one such extent point, divided by the difference in X or Y coordinates between the two extent points. The position data is then written to the attribute data table 122. Colour or opacity numerical values at the selected points may for simplicity be typed in via the keyboard 170b.
Referring to FIGS 14 and 15, an alternative and preferred method of inputting opacity data is illustrated. When the user generates a control signal indicating a desire to input opacity data, a profile generating means 118 (comprising, conveniently, the CPU 110 acting under stored program control) causes the display 160 to display the contents of a profile display store 130c (as, for example, a display "window" overlying the supervisory display). The contents of the profile display store 130c comprise image data defining horizontal and vertical axes. The display represents the profile of opacity across the brush, corresponding to a cross-section along the line A. The horizontal axis represents position across the line A between the two lateral extents e1,e2.
The vertical axis represents opacity. Both axes are conveniently scaled between 0 and 1.
The cursor position controller 112 is arranged to write data into the profile display store 130c to cause the display of a cursor symbol D at a position therein defined by movements of the position sensitive input device 170a. By positioning the cursor symbol at a point between the axes, and generating a control signal, the user signals an opacity value at a given distance across the object a transverse to the line A. The corresponding position between the extents e1,e2 and the opacity value thereat are derived by the profile generator 118 from the current cursor position supplied by the cursor position controller 112, and are written into the attribute data held within the line data store 122. The profile generator 118 likewise causes the generation, at the current cursor position, of a point symbol. The cursor may then be repositioned, but the point symbol remains. When two or more different point symbols are displayed and, correspondingly, two or more opacity data values and positions are stored within the line table 122, the profile generator 118 preferably calculates by interpolation the coordinates of image data within the profile display store corresponding to intervening points along an interpolated line between the points for which opacity data is stored, and sets the value of those image points within the profile display store 130c so that, when displayed on the display device 160, they represent the profile which would be followed at that point. Generating a schematic cross-section display of this type is found to be of assistance to a user in visualising the transparency of, for example, an object corresponding to an airbrush stroke. The interpolation performed by the profile generator 118 is preferably the same as that which will be performed by the image generator 114.
To permit discontinuities in the colour or opacity across the extent of the object to be defined, preferably, the line table 122 is dimensioned to allow storage of two attribute values for each such lateral position C2,C3; as shown in FIG 12, one value is used to perform interpolation to one neighbouring point and the other to the other neighbouring point.
A corresponding profile display could be provided to allow the input and amendment of other attributes; for instance, brightness (of a monochrome object) or colour (of a coloured object).
Preferably, predefined attribute data specifying width, colour profiles and opacity profiles are also stored on the mass storage device 180 corresponding, for example, to particular paintbrushes or airbrushes, or to particular previously defined objects. Rather than manually enter and edit the attribute control data, the user may enter an appropriate command (via the keyboard 170b) to read such predetermined data from the mass storage device 180 into the line data table 122.
Preferably, the data stored for each attribute control point can specify all, or only a subset of the available attributes; for instance, it may be used to specify only width at a point, or only colour and opacity. Thus, the variations across the object a of these attributes may be separately controlled, and independently edited. In such a case, the default predetermined value assigned to each attribute is a flag indicating that the attribute is not set.
Image Generation

The process by which the image generator 114 generates or "renders" the image stored in the generated image store 130a, from the attribute and line data held in the line table 122 in the memory 120, will now be discussed in greater detail.
A typical line is shown in FIG 16, as it would be displayed upon the supervisory display on the monitor 160. The line A is terminated by a pair of points A1, A4 defining curvature and there are two further control points A2, A3, at intervening positions along the line to permit three points of inflexion. The points A2 and A3 each include two different stored tangent angles and magnitudes to permit the line to include discontinuities.
They thus act as two points A2a, A2b and A3a, A3b in controlling the path of the line.
Three extent or width control points F1, F2, F3 are provided; width points F1 and F3 are co-located with the curve control points A1 and A4, and the width control point F2 lies between the curve control points A2 and A3.
Colour control points G1,G2,G3 are provided; G1 is co-located with A1 and F1,G2 lies between A2 and A3 and G3 lies between A3 and A4.
Opacity control points H1,H2,H3 are provided; H1 is co-located with A1 and H3 with A4, and H2 lies between A2 and A3.
Although in this example attribute control points are co-located with the curve control points A1,A4 at the ends of the line, this is not in general necessary.
The data stored in the line table 122 for this line is shown in FIG 17. In this table the curve control points A1-A4 are represented as in FIG 4, and the attribute data is represented as defined at six attribute control points P1-P6. P1 corresponds to co-located width, colour and opacity control points F1, G1, H1. P2 corresponds to colour control point G2. P3 corresponds to opacity control point H2. P4 corresponds to width control point F2. P5 corresponds to colour control point G3. P6 corresponds to width and opacity control points F3, H3.
At each attribute control point P1-P6, therefore, some attribute data are set and some (marked by "-") are not set. The width data comprises a pair of extent values e1, e2 and, for reasons discussed above, a corresponding pair of tangent values and tangent magnitude values. The colour data comprises a list of colour values C1,C2,C3... and corresponding parametric position values X1,X2,X3... defining the position of the corresponding colour values as a fraction of the total length between the two extent values e1,e2 at that point.
The opacity data likewise comprises opacity values O1,O2,O3 and corresponding position values X1,X2,X3. A predetermined number of positional data could be stored (for example, three as shown), or the data could be stored as a list of variable length.
Referring to FIG 18, the first step performed by the image generator 114 when the image is to be generated in the generated image store 130a is to split the line A into a series of sections or segments, each bounded by a curve or attribute control point but not including any further such points, such that at the start and end of each section the curve or path control data and/or the attribute data are known. It is convenient for this purpose to split the line at each of the curve control points A1-A4 and, where different, each of the attribute control points G2,H2,F2,G3.
Accordingly, at each path control point all attribute values must be derived and at each attribute control point path control values must be derived together with values for those attributes not set. A table 126 (shown in FIG 19) is provided for storing data at each point.
At the point A1, all curve and attribute data is already available.
The first segment is defined between the point A1 and the point A2 (the tangent A2a). The length of this section is determined and stored in a working store 125 by, for example, determining the positions of 20 points at equally spaced values of the curve parameter along the line, and deriving between each point and the next the linear distance by taking the square root of the sum of the squares of the X,Y coordinate differences therebetween.
The next section lies between point A2 (tangent A2b) and G2. The path of the line between A2 and A3 is derived, in the form of equation 1, from the curve control points A2 and A3 (tangents A2b and A3a), and the value of the parameter t at the point G2 is substituted in to derive its X and Y coordinates. The constants a,b,c,d are then rederived to normalise the parameter values to run from 0 at A2 to 1 at G2, and these values are stored as curve data for that section.
The next curve segment lies between G2 and H2, and the constants a to d of the curve section between A2 and A3 are likewise rederived so as to normalise the parameter value between 0 at G2 and 1 at H2.
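A hedged Python sketch of these two operations, the chord-length estimate and the re-derivation of the constants for a sub-range of the parameter, follows (function names invented; the algebra simply substitutes t = t0 + u(t1 - t0) into equation 1):

```python
import math

def segment_length(point_fn, n=20):
    """Approximate segment length by summing straight-line distances
    between n points at equally spaced parameter values, as described
    above.  point_fn(t) returns the (x, y) position at parameter t."""
    pts = [point_fn(i / (n - 1)) for i in range(n)]
    return sum(math.hypot(x1 - x0, y1 - y0)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:]))

def renormalise(coeffs, t0, t1):
    """Rederive a, b, c, d so that the new parameter u runs from 0 at
    t0 to 1 at t1 of the original curve (e.g. from A2 to G2)."""
    a, b, c, d = coeffs
    s = t1 - t0
    return (a * s**3,
            (3*a*t0 + b) * s**2,
            (3*a*t0**2 + 2*b*t0 + c) * s,
            ((a*t0 + b)*t0 + c)*t0 + d)
```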
At this stage, therefore, the segment table 126 comprises a series of entries, one for each of the curve or attribute control points, each including data defining the curvature, position and all attributes of the object at that point.
Referring to FIGS 20 and 21, the next step is to split each segment (defined by a pair of adjacent points in the table 126) into a plurality of slices along the line A.
Each slice is processed in turn, to derive image data corresponding to the image area within the slice, and to write such image data into the generated image store 130a.
The number of slices within each segment is such that the curvature of the object over a slice is reasonably well approximated by a straight line. A predetermined number (for example, 20) of slices of equal length may be created, or the number of slices may be controlled in dependence upon the curvature of the segment (derived from the magnitude and angle of the tangents at the control points at either end of the segment). Preferably, however, the length of each slice is individually calculated in turn, starting from one end of the segment, to be the maximum length over which the curve does not deviate from a straight line by more than a predetermined amount. Thus, if the line is straight over a segment, only a single slice is generated, considerably reducing the amount of processing to be performed where the curvature of a line is small. Where the extent values stored include tangent values, so that the extents are interpolated as curves rather than as straight lines, the curvature of the extents is preferably also taken into account in determining the length of slices. The maximum deviation from a straight line is preferably set in dependence upon the resolution of the image to be generated, and hence the size of the generated image store 130a; preferably, the deviation is not more than one pixel width.
One way of doing this is as follows. The cubic parametric equations for the x and y coordinates are found for the curve, and for the extents. This gives six equations of the form

f(t) = at^3 + bt^2 + ct + d

with different constants a-d in each.
For each of the six equations the largest incremental length of the parameter t for which the deviation from linearity is below one pixel width E is found.
Starting from the end of the previous slice (or, for the first slice, from the start of the segment) at parameter value t0, one process for doing this is therefore as follows:

Find the maximum distance dt that can be travelled from t0 such that the left extent value does not deviate from a straight line approximation between t0 and (t0 + dt) by more than one pixel width (dt_e0).

Find the maximum distance dt that can be travelled from t0 such that the right extent value does not deviate from a straight line approximation between t0 and (t0 + dt) by more than one pixel width (dt_e1).

Find the minimum of dt_e0 and dt_e1 (dt_e).

Find the maximum of either extent between t0 and (t0 + dt_e) (e_max).

Find the maximum distance dt that can be travelled from t0 such that the x coordinate value does not deviate from a straight line approximation between t0 and (t0 + dt) by more than one pixel width divided by e_max (dt_x).

Find the maximum distance dt that can be travelled from t0 such that the y coordinate value does not deviate from a straight line approximation between t0 and (t0 + dt) by more than one pixel width divided by e_max (dt_y).

Find the minimum of dt_x and dt_y (dt_xy).

Find the minimum of dt_xy and dt_e (dt).

The value dt thus calculated is the maximum distance that can be travelled along the curve whilst guaranteeing that neither the path nor either extent will deviate from a straight line approximation between t0 and (t0 + dt) by more than one pixel.
In general, this method errs on the safe side and other methods are equally acceptable.
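To show how the steps above combine, here is a hedged Python sketch (names are invented; max_linear_dt is the per-equation helper sketched after the next passage, and e_max is found here by coarse sampling rather than exactly):

```python
def cubic(coeffs, t):
    """Evaluate a*t^3 + b*t^2 + c*t + d."""
    a, b, c, d = coeffs
    return ((a * t + b) * t + c) * t + d

def slice_length(path_x, path_y, ext_left, ext_right, t0, eps):
    """Combine the per-equation limits into one slice length dt.
    Each first argument is an (a, b, c, d) tuple; eps is one pixel."""
    dt_e0 = max_linear_dt(ext_left, t0, eps)
    dt_e1 = max_linear_dt(ext_right, t0, eps)
    dt_e = min(dt_e0, dt_e1)
    # Maximum of either extent over [t0, t0 + dt_e], by sampling.
    e_max = max(abs(cubic(ext, t0 + dt_e * i / 10.0))
                for ext in (ext_left, ext_right) for i in range(11))
    e_max = max(e_max, 1e-9)  # guard a degenerate zero-width stroke
    dt_x = max_linear_dt(path_x, t0, eps / e_max)
    dt_y = max_linear_dt(path_y, t0, eps / e_max)
    dt_xy = min(dt_x, dt_y)
    return min(dt_xy, dt_e)
```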
One method of finding, in the above processes, the maximum distance dt that can be travelled from t0 along some cubic parametric equation (either the line or the extents) such that the deviation from a straight line approximation is less than some error E is as follows.

Given f(t0) = a t0^3 + b t0^2 + c t0 + d, then

f(t0 + dt) = a(t0 + dt)^3 + b(t0 + dt)^2 + c(t0 + dt) + d

df = f(t0 + dt) - f(t0) = dt(c + b dt + a dt^2 + 2b t0 + 3a dt t0 + 3a t0^2)

Assuming that the best straight line approximation between f(t0) and f(t0 + dt) will be used, all constant and linear terms in dt can be ignored (t0 is a constant). Therefore, the deviation from the straight line approximation is:

de(dt) = dt^2 (b + 3a t0 + a dt)

de has stationary points at dt = 0 (designated dt0) and dt = -(2b + 6a t0)/(3a) (designated dt1).

If the value of dt1 is less than 0 or greater than unity, then the deviation from linearity is monotonic to the end of the segment (i.e. over the range dt = 0 to 1). In this case, if the magnitude of the deviation de at the end of the segment (dt = 1) is less than or equal to one pixel width E, then the slice can continue until the end of the segment and the maximum value of dt is therefore 1. If the error at the end of the segment exceeds one pixel width E, a straightforward binary search technique is employed to converge on the maximum value of the slice length dt for which the error is below one pixel width E.
If the error does not increase monotonically, but the value of the error at the maximum dt1 is less than or equal to one pixel width, and the value of the error at the end of the segment (dt = 1) is also less than or equal to one pixel width, then the slice extends to the end of the segment as above. If the value at the maximum dt1 is less than one pixel width but the value at the end of the segment is greater, then a binary search is conducted between the maximum and the end of the segment.
Finally, if the value at the maximum dt1 is greater than one pixel width, a binary search is performed over the range between t0 and the maximum dt1 to find the maximum value of dt such that the magnitude of de is less than or equal to one pixel width.
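A Python sketch of that deviation test follows. It implements de(dt) = dt²·(b + 3a·t0 + a·dt) and the three cases just described; one liberty taken is that the end of the segment is treated as dt = 1 - t0 (the remaining parameter range from t0), where the text above writes it simply as dt = 1. Names are illustrative.

```python
def deviation(cubic, t0, dt):
    # de(dt) = dt^2 * (b + 3*a*t0 + a*dt): deviation of the cubic from the
    # chord starting at t0, with constant and linear terms in dt discarded
    a, b = cubic[0], cubic[1]
    return dt * dt * (b + 3.0 * a * t0 + a * dt)

def max_linear_step(cubic, t0, eps, iters=40):
    """Largest dt, up to the segment end, with |de(dt)| <= eps."""
    a, b = cubic[0], cubic[1]
    dt_end = 1.0 - t0                # remaining parameter range of the segment
    if dt_end <= 0.0:
        return 0.0

    def search(lo, hi):
        # binary search for the largest admissible dt in [lo, hi];
        # assumes the test passes at lo and fails at hi
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            if abs(deviation(cubic, t0, mid)) <= eps:
                lo = mid
            else:
                hi = mid
        return lo

    # non-zero stationary point dt1 of de, if the equation is truly cubic
    dt1 = -(2.0 * b + 6.0 * a * t0) / (3.0 * a) if a != 0.0 else -1.0
    if not (0.0 < dt1 < dt_end):
        # deviation is monotonic over the range: test the segment end directly
        if abs(deviation(cubic, t0, dt_end)) <= eps:
            return dt_end
        return search(0.0, dt_end)
    if abs(deviation(cubic, t0, dt1)) <= eps:
        # within tolerance at the stationary point; check the segment end
        if abs(deviation(cubic, t0, dt_end)) <= eps:
            return dt_end
        return search(dt1, dt_end)
    # tolerance already exceeded before dt1: search between 0 and dt1
    return search(0.0, dt1)
```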
The next step is to determine, at each slice, the position of the four vertices marking the two extents of the width at each end of the slice. This is achieved for each successive slice by: finding, as above, the value of the parameter t at each end of the slice; deriving the position of the end of the slice along line A, and the tangent at that point; interpolating the width at that point, as a pair of extent values, between the values at the two control points defining the segment; deriving the positions of the extents of the width by calculating the ends of a straight line normal to the tangent to the curve; and storing these coordinates.
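As a small illustration of the vertex calculation, the following sketch offsets a path point along the unit normal to its tangent by the two interpolated extent values. The function name and the left/right sign convention are assumptions.

```python
import math

def slice_edge_vertices(point, tangent, e_left, e_right):
    """Two extent vertices at one end of a slice: the path point offset along
    the unit normal to the tangent by the left and right extents."""
    px, py = point
    tx, ty = tangent                       # tangent assumed non-zero
    n = math.hypot(tx, ty)
    nx, ny = -ty / n, tx / n               # unit normal to the path
    return ((px + e_left * nx, py + e_left * ny),
            (px - e_right * nx, py - e_right * ny))
```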
The colour and opacity data in the table 126 are likewise interpolated to derive position and value data for points along the extent lines bounding the slice.
Referring to FIG 21, the slice is therefore subdivided into a number of quadrilateral regions defined by four points at the vertices thereof, each point being defined by X,Y coordinates and, at each point, values of colour and opacity being defined. The image data to be stored in the generated image store 130a may now be calculated by linear interpolation from this data.
Referring to FIG 22, each quadrilateral region is therefore mapped onto a matrix of image points, depending upon the resolution of the image store 130a. The x,y co-ordinates of the corners of the quadrilateral are therefore calculated in units of the matrix of image points (but remain in higher precision, retaining their fractional part).
The interpolation is then performed, line by line of image points, by calculating the colour and opacity values at positions along the quadrilateral edges corresponding vertically to the ends of each line (by linear interpolation between the corner point values at the ends of each edge), and then calculating interpolated colour and opacity values at each image point along the line from the line end values. The calculated colour value is multiplied by the calculated opacity value. The colour value stored in the image store 130a is then read, multiplied by unity less the opacity value, and added to the above colour multiplied by opacity. The sum is then written back into the generated image store 130a.
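The per-scanline interpolation and compositing rule (new = colour × opacity + stored × (1 - opacity)) might be sketched as follows, using a single colour channel for brevity; all names are illustrative.

```python
def composite_scanline(store_row, x0, x1, c0, c1, o0, o1):
    """Interpolate colour/opacity along one scanline span of a quadrilateral
    and composite each image point over the generated image store."""
    n = max(x1 - x0, 1)
    for i, x in enumerate(range(x0, x1)):
        f = i / n
        c = c0 + (c1 - c0) * f       # interpolated colour at this image point
        o = o0 + (o1 - o0) * f       # interpolated opacity
        # new value = calculated colour * opacity + stored colour * (1 - opacity)
        store_row[x] = c * o + store_row[x] * (1.0 - o)
```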
The image points through which the edges of the quadrilateral pass do not lie wholly within the quadrilateral. They likewise do not lie wholly within neighbouring quadrilaterals. Preferably, for each such image point along the edges of a quadrilateral, a measure of the proportion of the area of the image point which lies within the quadrilateral is taken. The image point is then assigned a colour which depends upon the calculated colour for that area multiplied by the proportion of the image point lying within the quadrilateral, so that if the image point forms part of two neighbouring quadrilaterals its final colour depends upon those calculated for each quadrilateral in the ratio of the area of the image point which lies within each.
To avoid the appearance of a jagged, or rastered, line at the quadrilateral edges, well-known anti-aliasing techniques may be adopted, for example, similar to the "A buffer" technique described in "The A Buffer, An Anti-Aliased Hidden Surface Method", L Carpenter, Computer Graphics (SIGGRAPH 1984), Vol 18, No 3 (July 1984), pp 103-108, but not employing the hidden line removal feature of that technique.
More information on suitable methods of rendering image areas defined by quadrilaterals, in the above manner, will be found in "Procedural Elements For Computer Graphics", David F Rogers, published by McGraw Hill, ISBN 0-07-053534-5, at pages 70-73.
After all the image data of a line across the quadrilateral region have been written into the store 130a, the next line is processed in the same manner. After all the lines of a region have been written into the image store 130a, the next region within the slice is interpolated in the same manner. After all the regions of a slice have been processed, the coordinates of the next slice are calculated and placed in the slice table 127 to replace those for the existing slice. After all the slices of one segment have been processed, the next segment in the segment table 126 is processed. After all the segments in the segment table 126 have been processed, the representation of the object a in the generated image store 130a is complete.
Referring to FIG 23, when the curvature of the path is particularly sharp, relative to its width, a problem can occur; a similar problem is referred to as "the annoying bow-tie case" in the above referenced Strassman article.
Essentially, rather than defining a quadrilateral, a "bowtie" or figure of eight may occur. A consequence is that certain image points stored within the generated image store 130a are accessed twice by the image generator 114, in relation to different quadrilaterals Q1, Q2. This can lead to a noticeable discontinuity, particularly if one or both quadrilaterals have a degree of transparency, as the background colour is then doubly attenuated. Whilst this problem is generally not critical, one way of reducing its effect would be to provide a further store storing data defining the background colour, and to provide that, where transparency is specified, rather than reading the existing image point data in the frame store 130a and employing this as background colour data, the stored background colour data is employed instead. This would have the effect of making the object "opaque" to itself, but "transparent" to the stored background.
In an alternative method, an intermediate image store or image buffer is provided and the image generator 114 is arranged to calculate, as above, the interpolated colour and opacity values at each image point along each line of each quadrilateral. However, instead of calculating a final image point colour value based on both interpolated data and a background value, the interpolated colour and transparency values are instead stored in the image buffer for each point in the quadrilateral. Where a given image point in the image buffer is written to by two quadrilaterals one after the other, the later simply overwrites the earlier so that the stored data relates to the later written quadrilateral.
When the generation of the object is complete, the image generator 114 then merges the image from the buffer into the generated image store 130a, by setting the values of colour and opacity in the generated image store 130a as follows. Colour and opacity values C1, O1 from the image buffer for an image point are read, and colour and opacity values for the corresponding image point in the generated image store 130a (C2, O2) are likewise read. The new values of colour and opacity for that point to be stored in the image store 130a are calculated as follows:

C = C1·O1 + C2·(1 - O1); O = O1 + O2 - O1·O2

These values are written back into the generated image store 130a, and then the next image point in the image buffer is correspondingly processed.
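A one-function sketch of that merge, directly transcribing the two formulas; the function name is hypothetical.

```python
def merge_buffer_point(c1, o1, c2, o2):
    """Merge one buffered point (c1, o1) into the stored point (c2, o2):
    C = C1*O1 + C2*(1 - O1);  O = O1 + O2 - O1*O2."""
    return c1 * o1 + c2 * (1.0 - o1), o1 + o2 - o1 * o2
```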
Using this method, a given object is transparent to others but opaque to itself.
Where the invention is practiced upon general purpose digital processing apparatus, such as a microcomputer, rather than on dedicated image processing apparatus, this process is inevitably time consuming and preferably, therefore, the image generator 114 is operable thus to generate or amend the image in the generated image store 130a only on receipt of a control signal from the user, generated for example by pressing a key marked "RENDER" on the keyboard 170b. The image generator may be arranged to be capable of performing in two or more different modes; for example lower and higher resolution, or with and without anti-aliasing, to give a choice of speeds.
Referring to FIGS 24, 25 and 26A-C, the effects of various attributes will be graphically illustrated, separately and in combination.
FIG 24A shows the path of the object on a supervisory display. The path is defined by three path control points A1,A2,A3; the tangents at the control points are illustrated. No attribute values are set, and consequently the image shown in FIG 25A produced by the image generator 114 shows a line with substantially no width and dark colour.
Referring to FIG 24B and FIG 25B, a predetermined, constant extent is provided by defining the extent at two extent control points co-located with the curve control points A1, A3. Colour profiles G1, G4 are defined at these two points, and at two intervening colour control points G2, G3. The profiles are shown in FIG 26A.
Referring to FIG 24C, as above the same extent control points and values are employed as in FIG 24B. Opacity profiles are set at the two end curve control points H1, H4, and at two intervening points, H2,H3. The corresponding profiles are shown in FIG 26B. It will be seen from FIG 25C that the object is rendered entirely transparent towards its left hand end, so that the background grid previously stored in the frame store 130b is visible, and opaque at its right hand end, with intervening transparent and opaque stripes produced by interpolation through point H3.
Referring to FIG 24D, the lateral extent of the object is set in this case to different values at each of four control points F1-F4, to taper to 0 at the left hand end point F1, rise to a maximum value at F4, and thereafter taper to a predetermined level at F2, as shown in FIG 26C.
Referring to FIG 25E, when the colour profiles of FIG 26A, the opacity profiles of FIG 26B and the extent values of FIG 26C are combined, a complex object is produced resembling a sophisticated brush stroke. It will be seen that the invention allows easy separation and manipulation of attributes of objects, but can equally be used to create complex and artistic displays.
Referring to FIGS 27 and 28, further examples of the output produced by this embodiment of the invention are shown. Objects a-k of FIG 27 correspond to lines A-K on the supervisory display shown in FIG 28. Objects a and b show highlighted areas and shaded areas in substantially transparent objects, produced by relatively small numbers of transparency, colour, and extent control points.
Object c shows an "airbrush" effect produced by two control points (as shown in FIG 28) at which are specified extent values and colour and opacity profiles, the colour profile in this case being monochrome.
Referring to FIG 27, a linear object (a nail) is represented as a straight path D, with an effective illusion of illumination being provided by the colour profiles giving bright and shaded areas of the object. The object e shown in FIG 27 is also represented by a linear path E and a relatively small number of extent control points; setting colour profiles provides the illumination highlights and shading, the colours (blue for the handle and brown for the tip) of the different parts of the object e and the appearance of hairs within the brush tip.
The object f shows substantially the same object as e, but with the curve control points amended to create a curve.
The objects g-k (respectively coloured blue, yellow, mauve, green and red) show highlighting created by colour profile control points, as previously discussed.
Automatic Editing
Thus far, editing of attributes by a user has been described. Since the positions of the control points at which the attribute data are defined are themselves defined relative to a path A corresponding to the object, modifications of the path by the user will, to a large extent, not affect the validity of the attribute data.
Preferably, the apparatus according to a preferred embodiment is arranged to effect predetermined amendments to the path and attribute data in response to corresponding user commands.
Object Translation
To translate the entire object within the display, referring to FIGS 29 and 30, it is merely necessary to amend the X,Y co-ordinate data relating to the path or line A within the line table 122 shown in FIG 4. For example, if the object is to be shifted a distance corresponding to 50 units (for example millimetres) along and 50 up within a display, the supervisory display editor 113 reads each X,Y position within the line table 122, adds a corresponding increment of 50 to each co-ordinate and writes the co-ordinates back into the line table 122.
The new position for the object, or the distance the object is to be moved in each of the X and Y directions, may be specified by typing numeric values via the keyboard 170b, together with a key stroke indicating that a translation operation is to be performed.
Alternatively, on receipt of a control signal indicating a translation operation, the supervisory display editor 113 may be arranged to accept from the cursor tracker 112 the cursor X,Y position as an indication of either the position to which line A is to be moved or the amount of a translation. In this case, the user indicates the translation by manipulation of the mouse or other position sensitive input device 170a.
Rotation
To rotate the object within a display, a user initiates a control signal indicating that rotation is to be performed and inputs, in the same manner as described generally above, data indicating the point about which rotation is to be performed and the angle through which the object is to be rotated. The supervisory display editor 113 is arranged to recalculate the X,Y co-ordinates of each of the curve control points in the table 122, using straightforward trigonometrical algorithms, to derive co-ordinates of the rotated path and store these in the table 122. Further, the editor 113 is arranged to recalculate the tangent values at each point, to add the angle of rotation to the tangent angle.
Scaling
To scale the line A, to expand or contract the line relative to some point, the supervisory display editor 113 needs to recalculate, as above, the X,Y position co-ordinates of each of the curve control points in the table 122, so as to multiply the differences between each co-ordinate and the corresponding co-ordinate of the reference position by the desired scaling factor. At the same time, the magnitude of the tangent data is likewise multiplied by the scaling factor. The user therefore inputs the scaling factor and the position relative to which the scaling is to be calculated (which is, in many applications, one of the ends of the object).
However, since this type of scaling (shown in FIG 31B) leaves the extent data unmodified, the shape of the object (specifically, its aspect ratio) is altered. This is undesirable in many applications.
Accordingly, it is preferred that the supervisory display editor 113 should also amend the extent data, by multiplying each extent value in the line table 122 by the scaling factor and then storing the scaled extent data back in the table 122 as illustrated in FIG 31C.
Alternatively, in another embodiment, the supervisory display editor 113 is arranged to accept an instruction to amend the extent values by multiplying each with a scale factor, whilst leaving the co-ordinates of the curve or path control points unchanged. In this way, it is possible to make an object thinner or fatter without changing its length.
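The three operations just described reduce to simple arithmetic on the line table entries. The sketch below assumes a hypothetical representation in which each control point is a dict holding position (x, y), tangent angle and magnitude (theta, mag) and the two extents (e_left, e_right); these field names are inventions for illustration, not the patent's data layout.

```python
import math

def translate(points, dx, dy):
    """Shift every curve control point; tangents and extents are unaffected."""
    for p in points:
        p["x"] += dx
        p["y"] += dy

def rotate(points, cx, cy, angle):
    """Rotate control points about (cx, cy) and add the rotation to each
    stored tangent angle, as described above."""
    ca, sa = math.cos(angle), math.sin(angle)
    for p in points:
        x, y = p["x"] - cx, p["y"] - cy
        p["x"] = cx + x * ca - y * sa
        p["y"] = cy + x * sa + y * ca
        p["theta"] += angle

def scale(points, rx, ry, k, scale_extents=True):
    """Scale control points about (rx, ry); tangent magnitudes scale with the
    path, and (per FIG 31C) the extents scale too, unless the caller selects
    the variant that changes width without changing length."""
    for p in points:
        p["x"] = rx + (p["x"] - rx) * k
        p["y"] = ry + (p["y"] - ry) * k
        p["mag"] *= k
        if scale_extents:
            p["e_left"] *= k
            p["e_right"] *= k
```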
Translation Along A Path
Referring to FIG 32, it may be useful to move certain portions of an object along the path A about which the object is defined. For example, the opacity of the object may be set to zero towards the ends of the path, so that the object is not visible there.
To move a portion of the object (for example, the visible part), the supervisory display editor 113 is arranged to amend the attribute control point position data stored in the table 122, leaving other contents of the table unchanged. Where path or curvature control points lie between attribute control point positions, it is necessary to derive the line length as described above, and the actual X,Y positions of the attribute control points along the line, scale these positions and then recalculate the parametric attribute control point positions.
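A sketch of that remapping follows, assuming precomputed matched tables arc_t/arc_s of parameter values and cumulative arc length (strictly increasing) for the line: the parametric position of each attribute control point is converted to an arc length, shifted or scaled, and converted back by inverse lookup. All names are illustrative.

```python
import bisect

def remap_attribute_points(arc_t, arc_s, t_points, scale=1.0, offset=0.0):
    """Move attribute control points along the path via arc length."""
    total = arc_s[-1]
    out = []
    for t in t_points:
        # parametric position -> arc length, by linear interpolation
        i = min(max(bisect.bisect_left(arc_t, t), 1), len(arc_t) - 1)
        f = (t - arc_t[i - 1]) / (arc_t[i] - arc_t[i - 1])
        s = arc_s[i - 1] + f * (arc_s[i] - arc_s[i - 1])
        # shift/scale the physical position, clamped to the line's length
        s = min(max(s * scale + offset, 0.0), total)
        # arc length -> parametric position, by inverse lookup
        j = min(max(bisect.bisect_left(arc_s, s), 1), len(arc_s) - 1)
        g = (s - arc_s[j - 1]) / (arc_s[j] - arc_s[j - 1])
        out.append(arc_t[j - 1] + g * (arc_t[j] - arc_t[j - 1]))
    return out
```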
Automatic Lighting Effects
One particular use to which colour profiles can be put is in simulating the effect of illuminating the object, to give an impression of depth to the object, as previously discussed and shown in FIG 28. To achieve this, parts of the object which are intended to be represented as having an alignment such as to reflect light from the light source are represented as brighter, and others as darker; the impression of a curved surface is achieved by providing a gradation of brightness, setting the colour value at one point to represent bright white and that at another point to represent dark. The effect of coloured lighting is achievable similarly.
To provide such an effect automatically, referring to FIG 33, a user specifies a light source position LS (for a representation of illumination from a point source) or a direction (for a representation of the effects of a parallel illumination source). What is required is that the brightness at points in the object should be determined by the inclination of the object at those points to the direction to the defined light source.
Parts (N) of the object inclined normal to the direction to the defined light source position should be bright, whereas those parts (M) approaching parallel inclination thereto should be dark.
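One way to express this rule in code is to take the brightness as |sin θ|, where θ is the angle between the stroke tangent and the direction to the light source: 1 where they are perpendicular (the part faces the light) and 0 where they are parallel. A minimal sketch, with all names assumed:

```python
import math

def stroke_brightness(tangent, point, light_pos):
    """Brightness in [0, 1] from the inclination of the stroke at `point`
    to the direction towards the light source at `light_pos`."""
    tx, ty = tangent
    lx, ly = light_pos[0] - point[0], light_pos[1] - point[1]
    tn, ln = math.hypot(tx, ty), math.hypot(lx, ly)
    if tn == 0.0 or ln == 0.0:
        return 0.0
    # |cross product| of the unit vectors = |sin(angle between them)|
    return abs(tx * ly - ty * lx) / (tn * ln)
```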
Preferably, the edges or extents of the object (R) which are more distant from the light source LS are also darkened.
This may be achieved in several ways. Firstly, the supervisory display editor 113 may determine the inclination of the object at the existing points at which colour values are set, and amend those values.
Secondly, the supervisory display editor 113 may create a further set of colour or opacity control points, specifically to add highlighting and shading to the object as desired. Alternatively, the highlighting or shading may be executed during the process of generating the image by the image generator 114, at the point, for example, where the quadrilateral corner points have been established; the modification being effected on the colour values at each of the quadrilateral corner points prior to interpolating within the quadrilateral.
In a further embodiment, the table 122 may be dimensioned to store depth information at points in the object a, allowing the shading to be responsive also to the depth of the object in or out of the display 160. Such stored depth information could also or alternatively be employed for other purposes; for example, for determining, where two objects occupy an overlapping area of a display, which one is to be represented as hidden or partially obscured by the other. The depth information could be input by the user or alternatively, in the case where the object is derived from a three dimensional representation, supplied in accordance with that representation.
Composite Assemblies of Objects
Referring to FIG 34, a flag may be stored indicating that one object is linked to another object or forms an assembly therewith, and the two may be displayed connected on the supervisory display. In this case, upon a user moving, scaling or otherwise varying (for example by another affine transformation) one object, parameters of the other are automatically correspondingly amended by the supervisory display editor 113.
Preferably, the flags indicate a hierarchical structure to the object; for example, one object (A) may represent a toe and another (B) a foot to which the toe is connected. In this case, when a user changes one object (for example, by moving the object), the supervisory display editor 113 examines the flag of the object or objects to which it is connected. Any objects ranked lower in importance than the changed object are correspondingly changed; objects ranked higher are left unchanged. Thus, in the above example, movement of a toe would not move the foot, but movement of the foot moves all the toes. Likewise movement of the leg moves both the foot and the toe.
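A minimal sketch of that one-way propagation, with a `children` map standing in for the rank flags; the dict-based line table and all names are illustrative.

```python
def move_linked(name, dx, dy, tables, children):
    """Translate object `name` and recursively translate only the objects
    ranked below it: moving the foot moves its toes, but moving a toe
    leaves the foot alone."""
    for p in tables[name]:
        p["x"] += dx
        p["y"] += dy
    for child in children.get(name, []):
        move_linked(child, dx, dy, tables, children)

# moving the foot carries the toe with it; moving the toe does not move the foot
tables = {"foot": [{"x": 0.0, "y": 0.0}], "toe": [{"x": 5.0, "y": 0.0}]}
children = {"foot": ["toe"], "toe": []}
move_linked("foot", 2.0, 0.0, tables, children)
```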
Preferably, as shown, the objects which are linked together are displayed on the supervisory display with paths joined. At the join, a control point symbol is displayed. This symbol represents a path control point for the joining path, but has no effect upon the joined path. It is constrained to be moved along the joined path, to change the point of contact of the objects.
Data relating to this point within the display is therefore stored within two separate line tables. In the subsidiary object table, the stored data represents, as usual, position co-ordinates and tangent data, whereas in the table relating to the line to which it is connected, the data stored is a position value (relative to the line, and preferably a value of the interpolated parameter t at that position along the line) and the address within the memory 120 of the line table corresponding to the joining or subsidiary object. Accordingly, when the joining point position is changed by a user, the supervisory display editor is operable firstly to change the parametric position of the joining point within the joined object line table and, secondly, to access the subsidiary line table using the stored base address, calculate the actual positional co-ordinates of the joining point, and amend the x,y co-ordinates of the curve control points in the subsidiary object by a corresponding amount.
Whenever an object is moved or transformed in any way, the supervisory display editor is likewise operable in this embodiment to recalculate the actual positions of the joining points of that object, and access the line tables of the subsidiary objects joined at those points, and correspondingly amend the curve control position data therein.
OTHER MODIFICATIONS AND IMPROVEMENTS
Extent Angles
Although in the foregoing the extent of the object has been defined and represented normal to the line or path A of the object, where the object is to appear as, for example, a brush stroke, the extents could be defined at an angle inclined to the line A. In this case the line store 122 will also include data defining the angle of inclination at each extent control point, and the supervisory display generator and image generator will be arranged to process the inclination data correspondingly.
Mattes
Mattes, or masks, are used by artists to create an area within the display, inside or outside of which the object is rendered opaque. They therefore act as "windows", either allowing the object to be visible against the background or the background to be visible through the object. Where several intersecting path lines are provided, a matte may be defined as an area bounded by the paths. A matte table is provided within the memory 120 storing, for each matte, data specifying the lines within the line memory 122 defining the matte and data specifying the relative transparency within the matte.
The relative transparency may be specified at several points, and interpolated through the area of the matte, to provide a varying degree of transparency or opacity.
The image generator 114 is arranged to access this table, and derive the boundaries of each matte whilst generating the image, and to apply, in addition to the transparency data defined for each object, the transparency data defined at each point by the matte.
Dormant Curve Control Points
It may be convenient to provide, within the or each line table 122, entries for storing curve control points including an active/dormant flag region. By generating an appropriate control signal, the user can change the state of this flag. When the flag indicates that the point is active, the supervisory display and image generators 111, 114 treat the point as a curve control point and generate the respective images in dependence upon the value of the tangent at the point.
The x,y position of the point, on becoming active, is derived by interpolation between its neighbouring curve control points.
However, when the flag is set to indicate that the point is dormant, the only action taken in response to the point is to display a corresponding symbol on the supervisory display. Thus, a user may provide alternative configurations of a line and switch therebetween merely by changing the flag state of such a dormant point.
Libraries of Attribute Point Sets
Conveniently, the appearance of certain objects (for example, types of brush stroke) may be stored as a predetermined set of profiles and/or extent values on the mass storage device 180. On execution of an appropriate control signal, the corresponding attribute point data are read from the mass storage into a line table 122 at predetermined points along a line corresponding thereto, and may subsequently be amended by a user as described above.
Object Ends
In the particular case where the object to be represented is a brush stroke, it may be convenient to provide means for automatically generating a smooth end for the object in some convenient way.
Fills
As with mattes above, an area to be filled with a given colour may be designated as an area bounded by certain paths, and means are therefore preferably provided for setting to a predetermined colour the values of image points within the generated image store 130a which correspond to points within the area defined by these lines.
To achieve this, the means are arranged to calculate the coordinates of the intervening points along the lines between the control points defining the lines.
Texture
Interesting visual effects are produced if the outline of an object, represented by its extent lines, is used as a window through which a further image is visible. This may be achieved, for example, by modifying the process of FIG 18 so that image points within each quadrilateral are derived, not by interpolation, but by looking up an image point at a corresponding address within a texture image buffer which stores a texture image as a plurality of image points.
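A minimal sketch of the lookup, assuming the pixel's position within the quadrilateral has already been mapped to normalised texture coordinates (u, v); nearest-neighbour sampling is used for brevity, and the function name is an invention.

```python
def textured_pixel(u, v, texture):
    """Colour an image point from a texture buffer rather than by
    interpolation: clamped nearest-neighbour lookup at (u, v) in [0, 1]."""
    h, w = len(texture), len(texture[0])
    xi = min(max(int(u * (w - 1) + 0.5), 0), w - 1)
    yi = min(max(int(v * (h - 1) + 0.5), 0), h - 1)
    return texture[yi][xi]
```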
Preferably, however, rather than merely allowing the texture image to become visible through the object, it is processed in various ways. For example, in one preferred embodiment of this type, the orientation of the path of the object at each quadrilateral is taken into account when accessing the texture image buffer, so that the texture image is warped to follow the path of the object; the inclination at each quadrilateral is derivable by interpolation between the extent tangents for each segment. In another embodiment, the texture image is expanded or contracted (for example, by spatial subsampling) to scale the texture image to the extent of the object along its path.
The texture image may, rather than being stored in raster form in a texture image buffer, be an image characterised, as above, as objects each defined by path and attribute data at control points. In this case, a raster image may be reconstituted, as above, in the texture image buffer, or alternatively the operations of generating the texture image and the generated image may be combined into a single rendering operation.
One example of a texture image is a picture of fur, to enable a 'furry' line or stroke to be created.
Extent Representation
Various ways of representing the extent of the object have been described above. Where the path of each extent of the object is defined by a spline curve, specifically by a Bezier curve, it would be possible to dispense with a further line defining the path of the object, and to define the attributes by reference to one or both of the extent curves.

Claims (28)

1. Image processing apparatus for generating a visible output image including visually distinct objects, comprising: input means for receiving signals defining said objects; processing means for processing said input signals; and storage means for storing generated image data comprising a plurality of image point values associated with corresponding points in the image, in which the processing means includes means for storing data defining a path characterising the disposition of one or more said objects within the image, and data defining an extent of each said object within said image along a direction which is not parallel to said path, characterised in that said input means are arranged to receive signals to cause said processing means to vary said extent data independently of said path data, and/or vice versa.
2. Apparatus according to claim 1, in which said path data comprises data defining the position and curvature of a plurality of points along said path, and the processing means is arranged to calculate the positions of intervening points therefrom.
3. Apparatus according to claim 1 or claim 2, in which said extent data comprises data defining a length at a predetermined position along said path and at a predetermined angle thereto.
4. Apparatus according to claim 3 in which said predetermined angle is 90°.
5. Apparatus according to claim in which said input means and said processing means are arranged to accept signals defining said angle.
6. Apparatus according to any one of claims 3 to 5 in which said extent data further comprises data defining the tangent angle of the edge or extent of the object at said points relative to said path.
7. Apparatus according to any preceding claim, in which said extent data further comprises data defining the position of said points.
8. Image processing apparatus for generating an image which includes displayed objects, characterised in that it comprises means for generating an auxiliary display representing attributes of the object symbolically, and means for varying said attributes, prior to generation of said image.
9. Apparatus according to claim 8 in which said means for varying comprises means for receiving input signals specifying a variation of said attributes.
10. Apparatus according to claim 9 further comprising a keyboard device connected to said receiving means.
11. Apparatus according to claim 9 or claim 10 further comprising manually operable position sensitive input means coupled to said receiving means, and means for causing the display within said auxiliary display of a cursor symbol responsive to said input means.
12. Apparatus according to any one of claims 8 to 11 wherein a said object is represented within said auxiliary display as a line defining the disposition of said object within said image.
13. Apparatus according to claim 12, further comprising data storage means arranged to store data defining said line as vector data.
14. Apparatus according to claim 13 in which said vector data comprises data defining the positions of a plurality of spaced apart points along said line.
15. Apparatus according to claim 14 wherein said vector data comprises also data defining the inclination of said line at said positions.
16. Apparatus according to claim 15, wherein said data comprises Bezier control point data.
17. Apparatus according to any one of claims 12 to 16, further comprising attribute store means for storing values of said attributes corresponding to spaced apart points on said line.
18. Apparatus according to claim 17 wherein said auxiliary display generating means is arranged to display symbols representing said points.
19. Apparatus according to claim 18 when appended to claim 11 wherein said attributes are variable by operating said position sensitive input means to vary the position of said attribute points along the line.
20. Image processing apparatus comprising store means for storing data representing a plurality of point positions and interpolation means for generating intervening positions from said stored data to generate data defining a line image, in which said store means is arranged to store data defining a plurality of such line images, and data defining the spatial relationship therebetween, and there are provided input means for accepting a signal from a user to vary said stored data representing a first said line image, the arrangement being such that a second said line image generated by said interpolation means is also varied thereby.
21. Apparatus according to claim 20 in which there are provided editing means for editing the stored data representing said second line in dependence upon variations to said first line stored data.
22. Apparatus according to claim 20 or claim 21 in which said second line joins said first line and said relationship data comprises parametric data specifying the distance along said first line of said join, relative to the length of said first line.
23. Apparatus according to claim 22 when appended to claim 21, wherein said editing means is arranged to vary the stored data relating to said first line, calculate the position of said join in dependence upon said relationship data, and vary the stored position data of said second line in dependence thereon.
24. Image processing apparatus for generating a representation of a line in dependence upon stored position data, comprising means for storing said position data and means for storing control data comprising parametric data defining an intermediate position along a said line, and flag data associated with said parametric position data, said apparatus being operable in a first mode in response to a first setting of said flag data to display a representation of said control data and in a second mode in response to a second setting of said flag data to generate said line in dependence upon said control data.
25. Apparatus according to claim 24, further comprising input means for varying the value of said parametric position data.
26. A method of image processing for generating an image comprising the steps of: inputting parametric data specifying the path of a line defining the disposition in said image of an object; inputting data defining attributes (such as colour and width) of the object at specified points along the path; varying the said positions along the path; and subsequently calculating the appearance of the image in dependence upon the path and the defined attribute values, using image processing apparatus.
27. A method of image processing comprising the steps of: defining, using image processing apparatus, control data defining the spatial arrangement of an object within the image; defining values of visual attributes of that object; outputting, on image output apparatus, a symbolic representation of the said spatial arrangement and attributes; modifying the control data and/or attribute data if required; and generating an image in dependence upon the control data and attribute data using image generating apparatus.
28. A method of computer painting comprising: defining the path of a brush stroke by parametric curve control data; defining and interactively editing the attributes (for example, stroke width; stroke transparency and stroke colour) at predetermined points along the path, using a supervisory display of the path and the points thereon; and generating the intervening points by interpolation, for storage in a frame store and subsequent reproduction as an image of the brush stroke.
GB9110945A 1990-11-30 1991-05-21 Image synthesis and processing Withdrawn GB2256118A (en)

Priority Applications (21)

Application Number Priority Date Filing Date Title
GB9110945A GB2256118A (en) 1991-05-21 1991-05-21 Image synthesis and processing
JP4500061A JPH06505817A (en) 1990-11-30 1991-11-29 Image synthesis and processing
AU89321/91A AU8932191A (en) 1990-11-30 1991-11-29 Image synthesis and processing
AU90158/91A AU9015891A (en) 1990-11-30 1991-11-29 Animation
PCT/GB1991/002124 WO1992009966A1 (en) 1990-11-30 1991-11-29 Image synthesis and processing
EP91920646A EP0559708A1 (en) 1990-11-30 1991-11-29 Image synthesis and processing
EP91920852A EP0559714A1 (en) 1990-11-30 1991-11-29 Animation
PCT/GB1991/002122 WO1992009965A1 (en) 1990-11-30 1991-11-29 Animation
JP4500477A JPH06503663A (en) 1990-11-30 1991-11-29 Video creation device
US07/844,634 US5692117A (en) 1990-11-30 1991-11-29 Method and apparatus for producing animated drawings and in-between drawings
EP92910474A EP0585298A1 (en) 1990-11-30 1992-05-21 Animation
JP4510508A JPH06507742A (en) 1991-05-21 1992-05-21 Video creation device
AU17934/92A AU1793492A (en) 1991-05-21 1992-05-21 Animation
EP19920910492 EP0586444A1 (en) 1991-05-21 1992-05-21 Image synthesis and processing
PCT/GB1992/000928 WO1992021096A1 (en) 1990-11-30 1992-05-21 Image synthesis and processing
US08/150,100 US5598182A (en) 1991-05-21 1992-05-21 Image synthesis and processing
JP4510509A JPH06507743A (en) 1991-05-21 1992-05-21 Image synthesis and processing
AU17921/92A AU1792192A (en) 1991-05-21 1992-05-21 Image synthesis and processing
PCT/GB1992/000927 WO1992021095A1 (en) 1990-11-30 1992-05-21 Animation
US08/311,398 US5611036A (en) 1990-11-30 1994-09-23 Apparatus and method for defining the form and attributes of an object in an image
US08/643,322 US5754183A (en) 1991-05-21 1996-05-06 Image processing apparatus and method for producing pixel data in dependence upon the shape of a sectional line extending between boundary lines of an object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB9110945A GB2256118A (en) 1991-05-21 1991-05-21 Image synthesis and processing

Publications (2)

Publication Number Publication Date
GB9110945D0 GB9110945D0 (en) 1991-07-10
GB2256118A true GB2256118A (en) 1992-11-25

Family

ID=10695340

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9110945A Withdrawn GB2256118A (en) 1990-11-30 1991-05-21 Image synthesis and processing

Country Status (1)

Country Link
GB (1) GB2256118A (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1984002993A1 (en) * 1983-01-20 1984-08-02 Dicomed Corp Method and apparatus for representation of a curve of uniform width
US4897638A (en) * 1987-02-27 1990-01-30 Hitachi, Ltd. Method for generating character patterns with controlled size and thickness

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2297673A (en) * 1995-01-26 1996-08-07 Sony Corp Simulating ink spread in fibrous paper
US5784301A (en) * 1995-01-26 1998-07-21 Sony Corporation Method and apparatus for producing paper fiber structure data, and method and apparatus for drawing bled figure
GB2297673B (en) * 1995-01-26 1999-10-13 Sony Corp Figure drawing
US5940081A (en) * 1995-01-27 1999-08-17 Sony Corporation Method and apparatus for forming a font and the font produced method and apparatus for drawing a blurred figure
GB2338160A (en) * 1995-01-27 1999-12-08 Sony Corp Simulating a brush stroke with varying coverage

Also Published As

Publication number Publication date
GB9110945D0 (en) 1991-07-10


Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)