GB2317309A - An electronic graphic system - Google Patents


Publication number
GB2317309A
GB2317309A GB9618601A GB9618601A GB2317309A GB 2317309 A GB2317309 A GB 2317309A GB 9618601 A GB9618601 A GB 9618601A GB 9618601 A GB9618601 A GB 9618601A GB 2317309 A GB2317309 A GB 2317309A
Authority
GB
United Kingdom
Prior art keywords
data
image
store
image data
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9618601A
Other versions
GB9618601D0 (en)
GB2317309B (en)
Inventor
Colin John Wrey
Matthew Sumner
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Quantel Ltd
Original Assignee
Quantel Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Quantel Ltd filed Critical Quantel Ltd
Priority to GB9618601A priority Critical patent/GB2317309B/en
Publication of GB9618601D0 publication Critical patent/GB9618601D0/en
Publication of GB2317309A publication Critical patent/GB2317309A/en
Application granted granted Critical
Publication of GB2317309B publication Critical patent/GB2317309B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An electronic graphic system 1 comprises an image store 2 for storing data defining an image, a stylus and touch tablet device 5 for generating parameter data, and a monitor 4 for displaying at least a portion of the image represented by the data in the image store and for displaying a grid 33 representing the parameter data generated by way of the stylus and touch tablet device 5. The system also comprises a processor 3 for processing stored image data depending on the parameter data generated by way of the stylus and touch tablet device. The stylus and touch tablet device 5 also generates position data and the processor is arranged to process an area of image data from locations in the store determined by the position data and to output the processed area of image data for display as a processed image area 35 in the displayed image.

Description

AN ELECTRONIC GRAPHIC SYSTEM

The invention relates to an electronic graphic system.
Electronic graphic or image systems in which the painting or drawing of a colour image can be simulated, or a portion of one image can be merged into another by electronic means are well known. One such graphic system is described in our British patent number 2,089,625 and corresponding US patent number 4,514,818, the teachings of which are incorporated herein by reference. This system includes a user operable input device which may be used by the user to select from a range of colours and a range of intensities and to choose from a set of notional drawing implements for use in the painting or drawing.
When a colour is chosen by the user, values representing the components of the selected colour are stored in a colour register. An implement is chosen by selecting from among different implement representations displayed on a display screen and the selected implement is defined by parameters conforming to a 3-dimensional surface representing the profile of the implement. Generally speaking the implement profile will have a high centre falling away to a lower value toward the periphery of the profile although other profiles may, of course, be defined.
The implement profile represents the notional distribution of colour which would be applied by the implement to the image over the area of the image which it covers.
The user operable input device is preferably a touch tablet and stylus combination. The touch tablet is arranged to generate position signals designating the position of the stylus relative to the touch tablet when the stylus is brought into proximity.
When the stylus is applied to the touch tablet a pressure signal representing the pressure applied via the stylus to the touch tablet is output from the stylus and stored in a pressure signal register. For some implements, representing say paint brushes, position signals are allowed to be generated for each movement of the stylus by the distance between picture points or similar distance, whilst for other implements, say air brushes, position signals are generated at regular time intervals, even if the stylus is held stationary on the touch tablet.
When a position signal is produced, new video signals (pixels) are derived for every picture point in the patch covered by the selected implement. An image store is provided and each new pixel is written at the appropriate picture point in the store. Such new pixels are derived by a processing circuit in accordance with the selected colour data and the distribution of the selected implement, and in response also to the pressure applied to the stylus and to the value of the pixel previously stored at the respective picture point in the store.
The user, who it is envisaged would normally be an artist lacking experience in the use of computer based systems, paints or draws by choosing a desired colour and implement and then manipulating the stylus, causing the touch tablet to generate a series of position signals which define the path or positioning of the stylus. The processing circuit reads pixels from the image store for a patch of picture points in response to each position signal and these pixels are blended by the processor with signals representing the chosen colour in proportions depending upon the respective values of the brush profile and pressure.
The blend is then written back to the picture store replacing the pixels previously stored therein.
In general, the blending process is carried out a number of times for each picture point in the image store whether the implement is moving or stationary (assuming in the case of the moving implement that the patch covered by the implement is larger than the spacing between picture points). The final proportion will depend on the number of processing operations performed per pixel.
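The repeated read-modify-write described above can be sketched as follows. This is a minimal illustration, not code from the patent; the function names are assumptions. It shows why the final proportion of the chosen colour depends on the number of processing operations: each pass leaves a factor of (1-K) of the previous value, so after n passes only (1-K)^n of the original pixel remains.

```python
def blend(Io, C, K):
    """One read-modify-write pass over a pixel: I = K*C + (1-K)*Io."""
    return K * C + (1 - K) * Io

def after_n_passes(Io, C, K, n):
    """Apply the blend n times, as happens when a moving or stationary
    implement covers the same picture point repeatedly. The pixel
    converges toward the chosen colour C as n grows."""
    I = Io
    for _ in range(n):
        I = blend(I, C, K)
    return I
```

For example, with K = 0.5 three passes take a black pixel (0.0) toward a white colour (1.0) in the proportions 0.5, 0.75, 0.875.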
To enable the user to observe his creation, the stored picture is read repeatedly and the pixels are applied to a TV-type colour monitor, so that the build-up of the picture can be observed. Of course such systems are not limited to TV-type formats and any suitable video format may be adopted. The system described avoids the problem of jagged edges in the image, an unpleasant stepping appearance normally associated with lines not lying horizontally or vertically in a raster display.
Another system which enables a user to perform picture composition in addition to painting is described in our British Patent No. 2113950 and corresponding US Patent No. 4602286, the teachings of which are also incorporated herein. In this system stores are provided for storing data representing two independent pictures and a control image or stencil.
A stencil is produced for example by "drawing" data into the control image store. The stencil data is used to control the combining of the data representing the two independent pictures to produce data representing a composite picture. The data representing the composite picture is output continuously for display of the picture on a monitor.
Once the user is satisfied with the displayed composite picture the composite data is stored permanently for subsequent processing or printing for example.
According to one aspect of the invention there is provided an electronic graphic system comprising: an image store for storing data defining an image; user operable means for generating parameter data; a display for displaying at least a portion of the image represented by the data in the image store and for displaying a representation of the parameter data input by way of the user operable means; and a processor for processing stored image data depending on the parameter data generated by way of the user operable means.
According to another aspect of the invention there is provided a method of processing image data, the method comprising: storing data defining an image; generating parameter data; displaying at least a portion of the image represented by the stored image data together with a representation of the generated parameter data; and processing stored image data depending on the generated parameter data.
The above and further features of the invention are set forth with particularity in the appended claims and together with advantages thereof will become clearer from consideration of the following detailed description of an exemplary embodiment of the invention given with reference to the accompanying drawings.
In the drawings: Figure 1 is a schematic diagram of an electronic graphic system embodying the invention; Figure 2 represents an image displayable in the system of Figure 1; and Figure 3 shows part of the system of Figure 1 in greater detail.
Referring now to Figure 1 of the accompanying drawings, an electronic graphic system, indicated generally at 1, comprises a first image store 2 for storing data defining a first image, a display processor 3 and a monitor 4. The contents of the first image store 2 are read continuously in raster sequence via a serial access port by the display processor 3 and the thus scanned data is output by the display processor 3 for display of the image represented thereby on the monitor 4.
The system 1 also comprises a user operable stylus/touch tablet device 5 by which the user can modify the image data in the store 2, and hence the image represented thereby. As the stylus is drawn across the touch tablet signals representative of the instantaneous position X,Y of the stylus are output to a drawing processor 6. The display processor 3 and the drawing processor 6 are shown as separate entities in order to facilitate understanding by simplifying the following explanation. In practice the two processors 3, 6 may be provided as a single processing unit.
The position information X,Y is provided at a higher resolution than that of the image store 2.
That is to say, the spacing between adjacent addresses in the store 2 is significantly larger than the spacing between adjacent positions on the touch tablet 5. It follows that for a given pixel location in the store 2 there are a number of corresponding positions on the touch tablet 5. For example, the spacing between adjacent positions on the touch tablet may be say eight times smaller than that between adjacent addresses in the store 2, and thus there will be sixty four touch tablet positions corresponding to one pixel address in the store. The drawing processor 6 is arranged among other things to convert the instantaneous X,Y position information from the stylus/touch tablet 5 into data representing an equivalent location in the store 2. The equivalent location is defined in terms of a store address and an offset. The offset is calculated as the difference between the store address and the position X,Y of the stylus on the touch tablet. The offset has vertical and horizontal components each having a value of less than one pixel. In the above example the offset would have component values which are integer multiples of 1/8.
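The conversion from tablet position to store address plus sub-pixel offset can be sketched as below. This is an illustrative sketch under the example's assumption of eight tablet positions per store address; the names are not from the patent. Each offset component is an integer multiple of 1/8 and is strictly less than one pixel.

```python
SUBPIXEL = 8  # tablet positions per store address (the example's assumption)

def tablet_to_store(X, Y):
    """Convert a tablet position (X, Y) into an equivalent store
    location: an integer store address plus a fractional offset whose
    components are multiples of 1/SUBPIXEL and less than one pixel."""
    ax, ay = X // SUBPIXEL, Y // SUBPIXEL                  # store address
    ox, oy = (X % SUBPIXEL) / SUBPIXEL, (Y % SUBPIXEL) / SUBPIXEL  # offset
    return (ax, ay), (ox, oy)
```

With this scheme the sixty-four tablet positions (8 x 8) mapping to a single store address are distinguished only by the offset.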
As the user moves the stylus on the touch tablet the position data X,Y is continuously generated by the touch tablet 5 and delivered to the drawing processor 6 where it is converted into x,y data identifying patches of store addresses in the store 2. Each patch of addresses is centered over the location in the store equivalent to the corresponding X,Y position data generated by the touch tablet.
The stylus of the stylus/touch tablet device 5 also includes a pressure sensor that outputs a pressure related signal for storage as a parameter in a stylus pressure register 7. Modern stylus/touch tablet devices are also capable of generating data defining the orientation (twist) of the stylus and the angle of the stylus in relation to the touch tablet.
These parameters may be stored in the stylus register, as well as or instead of the pressure data, for use by the processor.
Notional drawing implements are predefined in the system 1 and are selectable by the user from a menu of options (not shown) generated by the display processor 3 and displayed on the monitor 4. When the user selects a particular implement, data defining a continuous three dimensional shape covering a patch of pixels and representing the profile of the implement, as described in our above mentioned patents, is stored in a brush store 8.
A selection of predefined colours is also provided in the displayed menu and the user may select one of these predefined colours or instead may define a colour of his own choosing. Data representing the selected colour is stored by the display processor 3 in a colour register 9.
The image store 2 also includes random access ports for random access writing or reading of data to and from the store 2 independently of the serial raster reading of data to the monitor 4. As the stylus is moved across the touch tablet, data at each addressed patch is read from the store 2 via the random access read port to the drawing processor 6.
At the same time, brush shape data from the brush store 8 and colour data from the colour register 9 are also input to the drawing processor 6. The reading of the brush patch data from the brush store 8 and the colour data from the colour register 9 is synchronised to the generation of individual addresses within the patch of addresses by the drawing processor 6, which outputs said x,y patch addresses to the brush store 8 and reads signals from the colour register 9.
In the drawing processor 6 the image data Io read from the image store 2 is processed with the colour data C, the brush data B and the stylus pressure data P to produce new image data I which is written back to the image store 2.
One way in which the drawing processor 6 may process the image data is to interpolate the image data Io and colour data C using the product of the pressure data and the brush data as an interpolation coefficient K to produce new data I in accordance with the algorithm I = KC + (1-K)Io. This processing serves to add data representing a patch of colour to the image data in the store. In the displayed image the patch appears as if an area of colour has been stamped into the image. The drawing processor 6 is arranged to combine colour data into the image data at regular intervals of time or distance. Thus, as the stylus is moved over the touch tablet data representing a series of overlapping patches of colour ("stamps") is added to the image data in the store and appears in the displayed image as a continuous line or stroke. In the following the read-modify-write operation will be referred to as "painting" or "stamping" as the context requires.
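A single stamp over a patch can be sketched as follows. This is a minimal sketch with assumed names, not the patent's implementation: the image is modelled as a mapping from (x, y) store addresses to pixel values, the brush is a 2-D profile, and for each picture point the coefficient K is the product of stylus pressure P and the brush profile value B, giving I = KC + (1-K)Io.

```python
def stamp(image, cx, cy, brush, C, P):
    """Blend colour C into `image` (a dict of (x, y) -> pixel value)
    over the patch covered by `brush` (a 2-D list of profile values in
    0..1), centred at store address (cx, cy), scaled by pressure P."""
    h, w = len(brush), len(brush[0])
    for j in range(h):
        for i in range(w):
            x, y = cx + i - w // 2, cy + j - h // 2
            K = P * brush[j][i]                 # interpolation coefficient
            Io = image.get((x, y), 0.0)         # previously stored pixel
            image[(x, y)] = K * C + (1 - K) * Io
    return image
```

Calling `stamp` at a series of overlapping centres as the stylus moves produces the continuous stroke described above.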
In addition to the above described painting operation, the system 1 is operable to perform image compositing. The system further comprises a second image store 10 for storing data defining a second image, and a stencil store 11 for storing data defining a control image or stencil. Conceptually the image stores 2 and 10 and the stencil store 11 are separate entities, but it will be appreciated that in practice they may be provided in a single large random access store with the capacity to store data relating to at least two colour images and one monochrome control image.
A colour image may be painted into either of the image stores 2, 10 and a monochrome image may be painted into the stencil store 11. Alternatively, image data from another source such as a camera or a scanner may be loaded directly into any or all of the two image stores 2, 10 and the stencil store 11. In the interest of clarity it will be assumed in the following that the image store 2 is used to store data representing a foreground image (F), the image store 10 is used to store data representing a background image (B), and the stencil store is used to store control data (K) defining how the foreground image is to be overlayed on the background image.
The system is arranged to interpolate the foreground image data (F) and the background image data (B) using the control image data (K) as an interpolation coefficient to produce data representing a composite image (Ic) in accordance with the compositing algorithm Ic = KF + (1-K)B. It will be appreciated that when the stencil data K has a value equal to 0 then the composite image data Ic will be equal to the background image data B. When the stencil data K has a value equal to 1 then the composite image data Ic will be equal to the foreground image data F. For values of K between 0 and 1 the composite image data Ic will contain contributions from both the foreground image data and the background image data.
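The compositing algorithm applied per pixel can be sketched as below; this is an illustrative sketch with assumed names, treating each image as a flat sequence of pixel values and the stencil as a sequence of coefficients in [0, 1].

```python
def composite(F, B, K):
    """Per-pixel compositing: Ic = K*F + (1-K)*B.
    K = 0 yields the background, K = 1 the foreground, and intermediate
    values blend contributions from both images."""
    return [k * f + (1 - k) * b for f, b, k in zip(F, B, K)]
```

For example, stencil values 0, 1 and 0.5 over a white foreground and black background give the background, the foreground, and an even mix respectively.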
In one mode of operation the system 1 can be arranged to enable the user to paint the foreground image into the first image store 2. In this first mode of operation as the foreground image data is painted into the image store 2, data representing a corresponding stencil is painted into the stencil store 11. Previously created background image data from an image data source (not shown) is stored in the second image store 10. Data from the two image stores 2, 10 and from the stencil store 11 are read continuously in raster sequence to the display processor 3. The display processor combines the foreground image data from the image store 2 with the background image data from the image store 10 using the control image data from the stencil store 11 in accordance with the above-mentioned compositing algorithm. The composite data is output to the monitor 4 for display of the composite image represented thereby. Thus, as the foreground image and the stencil are painted into the image store 2 and the stencil store 11 respectively, the corresponding change can be seen in the composite image displayed on the monitor 4.
In another mode of operation previously created data representing foreground and background images are written into the image stores 2 and 10 respectively and the drawing processor 6 is arranged to enable the user to paint a control image into the stencil store 11. Again, the two image stores 2, 10 and the stencil store 11 are read in raster sequence to the display processor 3. Thus, the composite image displayed on the monitor 4 changes as the control image is painted to enable the user to see the effect of his painting in combining the foreground image with the background image.
In both of the above-described modes of operation the image data in the two image stores 2, 10 are kept separate. Once the user is satisfied with the displayed composite image, the image data in the two stores is combined by the drawing processor 6 in accordance with the control image data in the store 11. The data is read a pixel at a time from both image stores 2, 10 and the stencil store 11 and combined in accordance with the above-discussed compositing algorithm. The composite data may be written back to one of the stores, say store 10, replacing the data previously stored therein to enable further processing thereof, and/or it may be written to a bulk store (not shown) for more permanent storage.
Turning now to Figure 2 of the accompanying drawings there is shown an example of a display on the monitor 4. The display comprises an image portion 31 and a menu portion 32. The display processor 3 is arranged to generate, in response to suitable commands from the user, a grid 33 representing a digital filter. The grid 33 is displayed in the menu portion 32 and enables the user to input parameters defining operation of a spatial digital filter. The grid 33 may be of any size, but conveniently a grid of 3 x 3 to 9 x 9 may be defined. Typically, parameters will be input to the grid as integer numbers from 0 upwards. Normally, the parameters entered into the grid will be normalised so that the sum of all parameters in the grid is equal to 1. Under some circumstances a non-normalised set of parameters may be required and the user is therefore free to override the normalisation process if desired.
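The normalisation applied to the grid parameters can be sketched as follows. This is a minimal sketch under the assumptions stated in the text (integer parameters from 0 upwards, scaled so the grid sums to 1); the function name is not from the patent. Normalising to unit sum keeps the filter gain-neutral, which is why the user may override it when a non-normalised set of parameters is required.

```python
def normalise(grid):
    """Scale the grid entries so they sum to 1. A grid summing to zero
    is returned unchanged (there is nothing meaningful to scale)."""
    total = sum(sum(row) for row in grid)
    if total == 0:
        return [row[:] for row in grid]
    return [[v / total for v in row] for row in grid]
```

A 3 x 3 grid of all 1s, for instance, normalises to a box filter with each coefficient equal to 1/9.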
Once all parameters have been defined in the grid 33 they are normalised (unless normalisation is not required) by the display processor 3 and transferred to the drawing processor 6. As shown in Figure 3 of the accompanying drawings the drawing processor 6 comprises a spatial digital filter 25 and a patch address generator 26. The parameters input to the grid 33 by the user are applied to the digital filter 25 for use thereby.
The design of spatial digital filters is per se well known and is described for example in co-pending British Patent Application No. 9518443.8. The display processor 3 is also arranged to generate data representing a cursor 34 for display in the image 31 on the monitor 4 at a position corresponding to the X,Y position of the stylus on the touch tablet. Once the parameters of the filter have been defined the display processor 3 is also arranged to define a preview area 35 attached to the cursor 34.
The preview area 35 is of a size sufficient to enable the effect of the filter 25 to be seen by the user. As will become apparent from the following description, it is necessary for the filtering operation to be applied to all pixels in the preview area 35 at a speed sufficient to enable the effect of the filtering to be seen in the preview area 35 as the cursor 34 is moved about over the image 31. The size of the preview area will therefore vary with the size of the grid 33; a filter with a large number of parameters will take longer to process data than will a filter with a smaller number of parameters. Data pertaining to the size of the grid 33 (and therefore the size of the preview area 35) is input to the patch address generator 26 by the display processor 3.
The patch address generator 26 responds to the position data X,Y from the stylus/touch tablet 5 by generating a patch of corresponding x,y addresses in the image store 2. The image data in the patch of addresses is read from the image store to the filter 25 where it is filtered and then input to the display processor 3. At the same time, image data outside the patch of addresses (i.e. outside the preview area 35) is read from the image store 2 to the display processor 3. The display processor combines the data from the image store 2 with the data from the filter 25 to display on the monitor an image in which the image portion within the preview area 35 is filtered. This enables the user to see the effect of the filtering in specific areas before committing to the filtering. Thus, the user is free to modify parameters within the grid 33 until he achieves a filtering effect with which he is satisfied.
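The filtering of the preview patch can be sketched as a straightforward 2-D convolution over the addressed area. This is an illustrative sketch with assumed names, reusing the dict-of-pixels image model; pixels read from outside the stored image default to 0, which is one simple edge-handling choice among several.

```python
def filter_patch(image, kernel, x0, y0, w, h):
    """Apply the square `kernel` (a 2-D list of coefficients) to the
    pixels of `image` (a dict of (x, y) -> value) inside the w-by-h
    preview area whose top-left corner is (x0, y0). Returns a dict of
    filtered values for those addresses only; the rest of the image is
    passed to the display unfiltered."""
    half = len(kernel) // 2
    out = {}
    for y in range(y0, y0 + h):
        for x in range(x0, x0 + w):
            acc = 0.0
            for j, row in enumerate(kernel):
                for i, k in enumerate(row):
                    acc += k * image.get((x + i - half, y + j - half), 0.0)
            out[(x, y)] = acc
    return out
```

A normalised kernel leaves a uniform region unchanged, which is the visual cue that the grid parameters sum to 1.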
Once the user is satisfied with the filtering effect he may apply it to the entire image data within the image store 2 or to selected portions thereof. In the latter case the filtering may be applied by stamping filtered data into the initial image data in a manner similar to the above-described painting.
A library of predefined filters may be provided within the system. The filters may be predefined by the user and/or may be installed before the system is supplied to the user. This enables the user to select from a range of predefined effects or to use the predefined effects as a starting point for designing a new filter to achieve a desired effect.
Having thus described the present invention by reference to a preferred embodiment it is to be well understood that the embodiment in question is exemplary only and that modifications and variations such as will occur to those possessed of appropriate knowledge and skills may be made without departure from the spirit and scope of the invention as set forth in the appended claims and equivalents thereof.

Claims (11)

CLAIMS:
1. An electronic graphic system comprising: an image store for storing data defining an image; user operable means for generating parameter data; a display for displaying at least a portion of the image represented by the data in the image store and for displaying a representation of the parameter data generated by way of the user operable means; and a processor for processing stored image data depending on the parameter data generated by way of the user operable means.
2. A system as claimed in claim 1, wherein the processor comprises a filter arranged to filter image data from the store depending on the input parameter data.
3. A system as claimed in claim 2, wherein the filter comprises a spatial filter.
4. A system as claimed in any of claims 1 to 3, wherein the user operable means is operable to generate position data, and the processor is arranged to process an area of image data from locations in the store determined by the position data and to output the processed area of image data for display as a processed image area in the displayed image.
5. A system as claimed in claim 4, wherein the user operable means comprises a stylus and touch tablet device arranged to output a sequence of position data, and the processor is responsive to the sequence of position data to process a different area of image data when the position identified by the sequence changes.
6. A method of processing image data, the method comprising: storing data defining an image; generating parameter data; displaying at least a portion of the image represented by the stored image data together with a representation of the parameter data; and processing stored image data depending on the generated parameter data.
7. A method as claimed in claim 6, wherein the processing comprises filtering stored image data.
8. A method as claimed in claim 7, wherein the filtering is spatial filtering.
9. A method as claimed in any of claims 6 to 8, further comprising generating position data, processing an area of the stored image data determined by the position data, and displaying the processed area of data as a processed image area in the displayed image.
10. A method as claimed in claim 9, further comprising generating a sequence of position data, and responding to the sequence of data by processing a different area of image data when the position identified by the sequence changes.
11. An electronic graphic system or a method of processing image data substantially as described herein with reference to the accompanying drawings.
GB9618601A 1996-09-06 1996-09-06 An electronic graphic system Expired - Fee Related GB2317309B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB9618601A GB2317309B (en) 1996-09-06 1996-09-06 An electronic graphic system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB9618601A GB2317309B (en) 1996-09-06 1996-09-06 An electronic graphic system

Publications (3)

Publication Number Publication Date
GB9618601D0 GB9618601D0 (en) 1996-10-16
GB2317309A true GB2317309A (en) 1998-03-18
GB2317309B GB2317309B (en) 2001-03-28

Family

ID=10799506

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9618601A Expired - Fee Related GB2317309B (en) 1996-09-06 1996-09-06 An electronic graphic system

Country Status (1)

Country Link
GB (1) GB2317309B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2322268B (en) * 1997-02-13 2001-07-25 Quantel Ltd An electronic graphic system
GB2381686A (en) * 2001-10-31 2003-05-07 Hewlett Packard Co Apparatus for recording and reproducing pointer positions on a document.
Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2232045A (en) * 1989-04-17 1990-11-28 Quantel Ltd "paintbox" has interleaved processor/display access, pipelined brush processor, interpolated pressure values, fractional zoom, on-screen control of parameters

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
'IMAGE ASSISTANT' (Software, PC and compatibles) V 1.10, Caere Corporation, 1990-1993 *
'Positive Image' (Software, Atari ST, TT, Falcon and compatibles), Floppy Shop *
Also Published As

Publication number Publication date
GB9618601D0 (en) 1996-10-16
GB2317309B (en) 2001-03-28

Similar Documents

Publication Publication Date Title
US5986665A (en) Electronic graphic system
US5142616A (en) Electronic graphic system
US4514818A (en) Video image creation system which simulates drafting tool
US5459529A (en) Video processing for composite images
US4524421A (en) Computerized graphics system and method using an electronically synthesized palette
EP0202014B1 (en) Improvements in video image creation systems
US6067073A (en) Electronic graphic system
US5412402A (en) Electronic graphic systems
US5289566A (en) Video image creation
US5175625A (en) Video image creation systems combining overlapping stamps in a frame period before modifying the image
GB2140257A (en) Video image creation
US5412767A (en) Image processing system utilizing brush profile
US4775858A (en) Video image creation
EP0399663A1 (en) An electronic image progressing system
EP0538056B1 (en) An image processing system
US6292167B1 (en) Electronic graphic system
US6606086B1 (en) Electronic graphic system
US5276786A (en) Video graphics systems separately processing an area of the picture before blending the processed area into the original picture
JP2552113B2 (en) Video image forming device
US6064400A (en) Video image processing system
EP0814429A2 (en) An image processing system
GB2317309A (en) An electronic graphic system
EP0425133B1 (en) An electronic graphic system
US6137496A (en) Electronic graphic system
EP0360432B1 (en) Video graphics system

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20050906