GB2113950A - Image composition system - Google Patents
- Publication number
- GB2113950A (application GB8300378A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- picture
- information
- source
- shape
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/50—Lighting effects
- G06T15/503—Blending, e.g. for anti-aliasing
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
Abstract
The image composition system includes framestores 30, 31 for receiving information from first and second picture sources. A processor 33 provides the composed image by using information from these sources. The processor is controlled by picture shape information made available from a third framestore 32. This shape information may be provided by a camera 26 receiving an image of a silhouette for example, or the shape can be manually generated via a touch tablet 38. The instantaneous value of the shape controls the blending of the pictures such that portions of the picture can be taken from a scene and inserted without noticeable degradation. Manipulation of the position, orientation or size of the inserted picture portion for example can also be effected.
Description
SPECIFICATION
Image composition system
The invention relates to image composition arrangements. Picture composition in which one image is keyed into a second picture is known, using simple switching techniques.
Unfortunately, the results are not altogether visually satisfactory and the manipulation of such images is limited.
The present invention is concerned with providing an arrangement which allows a more realistic resultant composed image to be provided and which provides greater flexibility than heretofore.
According to the invention there is provided an image composition system comprising, first input means for providing a first source of picture information, second input means for providing a second source of picture information, means for providing a third source of picture information, processing means for receiving information from both said first and second sources so as to provide the composed picture, and control means for varying the proportions of picture information used by said processing means to provide said composed image in dependence on the information from said third picture source.
Further according to the invention there is provided a method for image composition comprising, receiving a first source of picture information, receiving a second source of picture information, receiving a third source of picture information, processing the information from both said first and second sources by variably controlling the proportions of picture information used in the processing step in dependence on information from said third picture source.
The first and/or second information sources may comprise a picture source, colour information source or synthetically generated source.
The invention will now be described by way of example with reference to the accompanying
drawings, in which:
Figure 1 shows one embodiment of the present invention for composing an image produced from more than one image source,
Figure 2 shows visual representation of the image composition,
Figure 3 shows various parameter values illustrating the blending technique employed in the image composition,
Figure 4 shows an embodiment of the processor of Figure 1,
Figure 5 shows additional manipulation components,
Figure 6 shows an arrangement allowing the capability of moving the inserted picture,
Figure 7 shows an arrangement for artificially generating the insert shape,
Figure 8 shows an arrangement allowing the insert to be transferred to the original picture, and
Figure 9 shows an arrangement for providing a mask to aid picture build up.
As already discussed, established composite image generation techniques tend to produce unrealistic results which appear contrived or degraded. This degradation can be more pronounced when the data is in digital format, owing to the quantised nature of a digital television picture. To achieve enhanced results, the present invention is concerned with manipulating the picture information in such a way that the composed picture is a composite from more than one source, blended in a manner which visually results in an image generally indistinguishable from one composed originally as a single picture, while still allowing the composition of the picture to be manipulated.
Figure 1 shows one embodiment of the system of the present invention for producing the image composition. The first picture source is provided by camera 20 and passes via analogue to digital converter (ADC) 27 to framestore 30.
The second picture source is provided by camera 21 and passes via ADC 28 to framestore 31.
The outputs from framestores 30 and 31 are made available to processor 33, described in more detail below, and the result therefrom is available for display on a monitor 34, via DAC 39 as necessary, or for use elsewhere in analogue or digital form as desired. Thus the composed image from processor 33 can be considered as being comprised of picture information from both original picture sources. The way in which these sources are used within the processor is effectively controlled by a third source of picture information. This is shown as being provided by the additional framestore 32.
This further framestore 32 contains picture shape and blending information for use in controlling the processor 33. This information may have been derived via camera 26 and ADC 29 or touch tablet 38 as explained below and can be considered as a variable stencil for use in composing the picture.
The resultant manipulation is exemplified in Figure 2. The first and second images within framestores 30 and 31 respectively are received by processor 33. The control image from framestore 32 is used to cause the processor to compose the final image from the first image together with that portion of the second image corresponding to the shape of the control image. This allows only selected features from within the second image to be used in the final image. Thus the person can be transposed from the original indoor scene shown into an outdoor scene as shown on monitor 34. In practice the processor is also configured to manipulate the data in such a way that the insert is realistically blended into the picture to appear as if it were there in the original scene. The control image itself is the mechanism which directs this blending operation, using both its shape and instantaneous value as now explained.
The control image is arranged to effectively define the percentage used from one picture when processed with the other picture, with blending for example adjacent the picture insert interface. This value (K) is shown in the example of Figure 3c as varying initially from a minimum to a maximum adjacent boundary I and then subsequently decreasing to a minimum adjacent boundary II for that T.V.
line. In the Figure 2 example, this could correspond to a T.V. line towards the bottom of the frame. At the changeover, this technique avoids any sharp edges by providing a gradual increase in picture contribution spaced over one or more picture points. Thus, adjacent the first boundary, a small contribution is made from the picture about to be inserted (picture B) and this increases, with a corresponding decrease in picture A, until B completely replaces first picture source A. When the next boundary is approached, the operation repeats, this time in reverse. This technique results in blending of the pictures from the first and second sources, in this example, only in the marginal regions of their interface. Although the blending described can be considered as along a horizontal line, the same technique is employed vertically. At other parts of the picture the relationship will be different. Since the transition point may be displaced on subsequent scan lines, each line will result in slightly different values of K, as exemplified in Figure 3d. For any parts of the picture where no contribution is required from the second picture source B, K will be a minimum throughout the horizontal scanning line (see Figure 3a). Where the insert has a horizontal edge, then adjacent this boundary a value of K as shown in Figure 3b could be expected for the relevant scanning line. Adjacent lines would have an increasing value of K until the Figure 3c situation was reached, thus giving the blending technique vertically as well as horizontally.
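The scan-line profile just described can be sketched as follows. This is an illustrative reconstruction, not taken from the patent: the boundary positions, ramp width and function names are all assumed.

```python
def k_profile(line_len: int, b1: int, b2: int, ramp: int) -> list[float]:
    """K values along one T.V. line in the style of Figure 3c (assumed).

    K rises from minimum to maximum over `ramp` picture points after
    boundary I (b1), holds at maximum, then falls back towards
    boundary II (b2), giving the gradual changeover described above.
    """
    ks = []
    for x in range(line_len):
        if x < b1 or x > b2:
            ks.append(0.0)              # outside the insert: no contribution from B
        elif x < b1 + ramp:
            ks.append((x - b1) / ramp)  # ramp up adjacent boundary I
        elif x > b2 - ramp:
            ks.append((b2 - x) / ramp)  # ramp down adjacent boundary II
        else:
            ks.append(1.0)              # insert fully replaces picture A
    return ks
```

A line with no insert corresponds to Figure 3a (K minimal throughout); adjacent lines with progressively wider ramps give the vertical blending of Figure 3b-3d.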
The framestores 30 to 32 share common write/read addressing block 35, which is under the control of the input sync generator 36 and output sync generator 37 in normal manner.
The process within processor 33 required to achieve the blending is given by the equation:

OUTPUT = K x PICTURE 1 + (1 - K) x PICTURE 2, where K <= 1

An embodiment of the processor 33 is shown in Figure 4. The value of K from control shape store 32 for a given pixel location is received by multiplier 41 and its inverse (1 - K) is made available to multiplier 40 via inverter 43 to control the image composition. The outputs from multipliers 40, 41 are received by adder 42, the output of which can be passed to the monitor as described in Figure 1.
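As a minimal per-pixel sketch of the above (illustrative names; 8-bit integer picture values assumed):

```python
def blend_pixel(picture1: int, picture2: int, k: float) -> int:
    """OUTPUT = K x PICTURE 1 + (1 - K) x PICTURE 2, with 0 <= K <= 1.

    k is the control value read from store 32 for this pixel location:
    multiplier 41 applies K, multiplier 40 applies (1 - K) formed by
    inverter 43, and adder 42 sums the two contributions.
    """
    assert 0.0 <= k <= 1.0
    return round(k * picture1 + (1.0 - k) * picture2)
```

Intermediate values of K give the marginal blending at the insert boundaries; K at its extremes selects one picture source outright.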
Although the system is shown for simplicity as having single framestores 30 and 31 for handling monochrome only, it can be extended to colour by adding additional memory to these picture framestores and also adding parallel processing circuitry for the colour data.
Although the picture shape with associated values of K for each picture point within the frame for store 32 could be generated synthetically, a preferable way of providing these values is to use a visual shape mechanism. One approach shown in Figure 1 is to use a camera 26 with its output passed to store 32 via ADC 29. The insert shape can be a profile or silhouette which will in practice cause a slope to be produced in the analogue output level from the camera over a number of picture points as in
Figure 3c and thus when digitised and stored as 8 bit words for example will give the variation in K desired for smooth blending in the changeover region.
By providing shapes with intermediate intensity values for K throughout the insert, we have found that special effects such as transparent or translucent images for example can be included in the composed scene.
By incorporating a high resolution camera or by including filtering techniques, the number of pixels involved at changeover (horizontally and vertically), and thus the gradient of the slope, can be varied. Another approach shown in Figure 1 is to use a touch tablet 38 or other manually operable device to provide the desired insert shape and use this as the K data input to the store 32, using techniques extending from those described in UK Patent Publication No. 2089625 for example, as described with reference to Figure 9 below.
Although the picture inputs to the stores 30 and 31 have been described as coming from cameras, the system is flexible enough to allow other picture sources to be used. One specific aspect is to compose a picture containing graphic information. In this case store 30 can provide the background information (luminance or colour) and the graphic shape can be input to store 32 as before. These shapes could be derived from any suitable source but for realism the manner described in Figure 9 below is preferable. The shape could be a straight line, circle, alpha/numeric or other character if desired. In the present situation the store 31 could merely contain a fixed (or variable) colour or intensity which would be selected dependent on shape defined by store 32.
Now wherever the store 32 gives a value of zero, the output from framestore 30 will be passed to the monitor 34 without modification, but when the store 32 output equals one the colour as defined by the store 31 will appear on the monitor. For values between nought and one a proportionate mix of the colour value and the framestore 30 output will be applied to the monitor.
Where the system is used in conjunction with the image creation system mentioned above, the picture visible on the monitor directly simulates the effect which the artist will achieve on his picture when he finally decides to commit these lines and other graphic representations to it; alternatively he can use these lines as a guide as the picture is created.
Although the Figure 1 arrangement is shown with common addressing for framestores 30-32, so that there is a fixed pixel relationship between the images stored therein, an additional benefit can be achieved, as now described with reference to Figure 5, by allowing a changeable pixel relationship with additional manipulation so that, whilst retaining the original picture information, it is possible to change the location, size or orientation of the insert in the composed picture.
The Figure 5 arrangement is concerned with the manipulation of information from the framestores and for simplicity only the relevant blocks are included.
The input and output arrangements would be as in Figure 1. The address generator 35 is now used only with the first picture framestore 30. An additional address generator 44 is provided for use by both the second picture framestore 31 and the control shape image framestore 32. The outputs from framestores 31 and 32 now pass via interpolators 47 and 48 prior to receipt by processor 33. The read addressing of the framestores and the control of the interpolators is effected by manipulator 49 to give the required address selection and pixel interpolation to achieve the size change or orientation desired, for example. The mechanisms for interpolation and address manipulation are known in the art; see also US Patent No. 4163249 for example. Because the addressing block 44 is common to framestores 31 and 32, the pixel relationship is maintained. This ensures that the manipulation of the control image shape is duplicated for the image within framestore 31. Considering the images represented in Figure 2, the control image can be manipulated so that it is shrunk, for example, and the person in the second image will shrink also and be inserted in the picture at reduced size. Rotation manipulation of the control shape will cause the person to lie down in the final image, for example.
The requirement to produce a new shape representation each time is removed. Manual control of the position, size or orientation can be achieved using a trackerball or joystick, for example, in the usual manner.
Where there is a requirement to only move the control image location an alternative system can be used without employing interpolation as now described.
An example of the use of this technique is for 'cut and paste'. An artist often finds that he is painting the same pictures over again and it would greatly assist him if he could have a means of taking part of the previously drawn picture and pasting this into his new picture.
Prior to pasting together his pictures, the artist may require that the picture to be pasted can be moved around the original screen and viewed as if it existed in that picture, but without finally committing it to the picture. This can be achieved using a system as shown in Figure 6. The picture framestores 30, 31 now each have their own address generator 35, 44, which can be independently controlled. If the start address of generator 44 is varied, while the start address of address generator 35 is kept at 0,0, then using the shape as defined in store 32, the picture cut from picture B can be made to move around picture A until its desired placement is found. This movement can be controlled by joystick, tracker ball or other means.
In addition, a further shape store 45 is provided as a refinement to allow a foreground object to be defined in picture A. This store 45 is also driven by address generator 35 and contains the shape or blending information which defines the foreground object in picture A. The processor 33 has been modified to include a further multiplier 46 to cope with the additional manipulation. To produce the correct K factor for the processor, the outputs from stores 45 and 32 are first multiplied together before being applied to the processor as previously described. The result, as far as the artist or operator is concerned, will be that as he moves his cut picture from picture B around picture A it will seem to disappear behind the objects in picture A which are defined as being foreground.
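The combination performed by multiplier 46 can be sketched as follows. This is a hypothetical illustration: the convention that store 45 holds 0 over foreground objects is an assumption consistent with the behaviour described, not stated explicitly in the patent.

```python
def effective_k(k_insert: float, k_foreground: float) -> float:
    """Blend factor fed to processor 33 after multiplier 46.

    k_insert:     insert shape value from store 32 (0 = no insert, 1 = full)
    k_foreground: value from store 45, assumed 0 over foreground objects
                  in picture A and 1 over background.
    """
    # multiplier 46 combines the two shape values into one K factor
    return k_insert * k_foreground
```

Wherever picture A contains a foreground object the product is zero, so the cut picture from B disappears behind it regardless of its own shape value.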
The system is shown with the capability of scrolling both or either picture A or B using addressing blocks 35 or 44.
The common address generator 35 for stores 30 and 45 and the common address generator 44 for stores 31 and 32 ensure that correct picture relationships are maintained.
The ability to move and insert parts of the picture makes the system ideal for providing animation.
An image of an animal or parts thereof can be inserted and moved across the screen, and if a separate cut-out shape of the legs is stored then these can be manipulated and captured frame by frame under joystick control, for example, to simulate a walking movement whilst the animal traverses the picture.
Alternatively, the positions can be generated under computer control, where a range of repetitive movements can be made available, creating animation effects simply and less tediously.
As already mentioned, a touch tablet 38 or its equivalent can be used to enter the desired shape information into stores 32 or 45. In order to produce the blending effect desired, our research has shown that this can be provided by utilizing techniques derived from modified versions of the arrangements disclosed in UK Patent Publication No. 2089625. Thus the touch tablet 38 of Figure 1 is in practice used with additional system elements to provide the desired painting or drawing techniques, as now described in detail with reference to Figure 7.
The co-ordinates from the touch tablet 38 are received by an address generator 53 which provides addressing for the framestore 32 so as to correctly enter the data into the store locations. The framestore could alternatively be store 45 of Figure 6. The address generator controls the framestore to allow a 'read-modify-write' sequence to occur, the modification taking place in processor 50. The address generator 53 also controls the stores 51 and 52, which have a size corresponding to a designated number of picture points in a patch. The pencil intensity (or colour) and pencil shape provided by stores 51 and 52 are an ideal way of providing the insert shape, as the artist or operator can draw around the object of interest and then fill in the space inside the outline. Because the pencil shape is of the type that falls away at the edges, this will also cause the desired blending effect, as now described.
A patch of 16 x 16 pixels is shown as being large enough to encompass the desired pen shape in this example. The peak of the pen shape is central to the patch in this instance and will produce the maximum value of K at this point. The x and y co-ordinates provided by the touch tablet will correspond to the corner of the patch read out from the store, and processing of all points within this patch is effected in processor 50 and the modified data written back into the store 32 (or 45). During this processing the old luminance value and the designated intensity value are subtracted in subtractor 55 and the difference multiplied by coefficient K in multiplier 56, the value of K being dependent on where the particular picture point lies within the selected patch. The result is added to the earlier luminance data by adder 57. It is clear that some picture points at the periphery will remain unchanged in this example. Movement of the actual stylus on the touch pad by one picture point will cause a new patch to be read out from the store 32 which will contain most of the earlier picture points, but 16 new picture points will be present and naturally 16 others will have been omitted. The processing will again be carried out for the entire patch. It can be seen that during the second processing operation just described, the previous movement by 1 picture point will cause a proportion of the luminance information generated by the earlier processing operation to be used in the calculation of the new content for the updated patch.
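The read-modify-write sequence above can be sketched in outline as follows; the particular pen shape, the patch handling and all names are illustrative assumptions rather than the patent's own implementation:

```python
PATCH = 16  # patch side in picture points, as in the example above

def pen_shape(px: int, py: int) -> float:
    """An assumed pencil shape: maximal at the patch centre, falling
    away towards the edges so that peripheral points are barely touched."""
    cx = cy = (PATCH - 1) / 2
    dist = max(abs(px - cx), abs(py - cy))
    return max(0.0, 1.0 - dist / (PATCH / 2))

def apply_patch(store: list, x: int, y: int, intensity: float) -> None:
    """Read-modify-write the patch whose corner lies at (x, y).

    For each picture point: subtractor 55 forms (intensity - old),
    multiplier 56 scales it by K from the pen shape, and adder 57
    adds the result back to the old luminance value.
    """
    for py in range(PATCH):
        for px in range(PATCH):
            old = store[y + py][x + px]
            k = pen_shape(px, py)
            store[y + py][x + px] = old + k * (intensity - old)
```

Moving the stylus one picture point and repeating the call re-processes mostly the same points, so earlier strokes feed into the new patch contents exactly as described above.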
The number of processing steps for a given coordinate will depend on the size of the patch accessed.
Thus, if the patch was say 32 picture points wide and 32 high there are 32x32 or 1024 points to be processed for each movement of the stylus.
In this way, the insert shape is built up. At the edges the intensity will automatically fade away, allowing the blending effect to be achieved as the desired value of K will have been provided and entered into stores 32 or 45 during this operation. Shapes drawn with variable intensity other than at the boundary will cause variable blending elsewhere. The operator can view this shape during or after generation by feeding the data from the relevant store to the monitor 34, or by considering it as a graphic input and superimposing it on the original picture as described above.
Once the operator or artist has decided where he requires his cut picture to be placed, there is a need for actually transferring the cut picture from picture B to picture A. Once again, the cut picture must be blended in properly when interfaced to the original picture and this can be achieved using the arrangement shown in Figure 8. In this case, however, instead of the pencil colour and pencil shape together with the framestore 32 output being applied to the processor 50 as in Figure 7, the framestore 31 output, plus its insert shape as defined by store 32, are applied to the processor in place of the pencil colour and shape of the drawing implement.
As for the video path this processing path also requires that framestore 30 and framestore 31 have separate address generators 35, 39 as they have to be accessed from different addresses for a particular process. To achieve the foreground/background effect then the outputs from the stores 32 and 45 are multiplied in an additional multiplier 62 before being received by the other elements of the processor. A further multiplier 60 is provided which acts as an input for pressure provided from transducer 61 thus allowing the information from framestore 31 to be only partially applied onto the framestore 30 information if so desired.
A dedicated processor is required for block 50 in order to achieve reasonable processing speed. A full framestore of 584x768 pixels takes about 0.6 of a second to be pasted into the first framestore 30. Since in many cases the object to be cut from framestore 31 does not occupy the whole of this framestore, a saving in time can be made by only accessing a rectangle which is sufficient to enclose the object to be cut. This patch size can be controlled by a patch size selector 63 as shown, which in turn controls the incrementing of the two address generators. In practice this patch size generator could incorporate a similar addressing mechanism to that used for the address generation within block 53 of Figure 7 and described in more detail in the aforementioned patent application.
The output provided by this processing is given by the following:

OUTPUT = (1 - K1)FS1 + K1[K2 FS2 + (1 - K2)FS1]
       = [(1 - K1) + (K1 - K1K2)]FS1 + K1K2 FS2
       = (1 - K1K2)FS1 + K1K2 FS2

where

K1 is the output provided by store 45,
K2 is the output provided by store 32,
FS1 is the output provided by store 30,
FS2 is the output provided by store 31.
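The algebraic simplification in the expression above can be checked numerically; this sketch (assumed names) computes both the expanded and the simplified forms and confirms they agree:

```python
def paste_pixel(fs1: float, fs2: float, k1: float, k2: float) -> float:
    """OUTPUT = (1 - K1)FS1 + K1[K2 FS2 + (1 - K2)FS1]
              = (1 - K1K2)FS1 + K1K2 FS2

    k1 is the foreground value from store 45, k2 the insert shape from
    store 32, fs1 and fs2 the outputs of framestores 30 and 31.
    """
    expanded = (1 - k1) * fs1 + k1 * (k2 * fs2 + (1 - k2) * fs1)
    simplified = (1 - k1 * k2) * fs1 + k1 * k2 * fs2
    assert abs(expanded - simplified) < 1e-9  # the two forms agree
    return simplified
```

With k1 = 0 the foreground of picture A wins outright; with k1 = k2 = 1 the cut picture from B replaces it completely.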
Although the control image shape and intensity has been described as being generated in a single operation, in practice the shape can be first defined and the intensity or colour built up subsequently.
This allows special effects mentioned above to be readily altered by the operator.
From the artistic point of view this is similar to applying a form of masking tape or stencil to the picture, so defining the area within which the artist requires his paint applied to the paper. This is of particular use when using an airbrush but may also be used for any other painting media. The arrangement now described provides a means of producing the equivalent of a painting mask electronically, as shown in Figure 9. The processor is similar to that of Figures 7 and 8 and the system includes the store 32, which is driven in parallel with the framestore 30. When the pencil colour (or intensity), pencil shape and framestore value are read out, an additional value is read out from the store 32 which defines the mask. This is multiplied in multiplier 62 by the pressure from the stylus via transducer 61 before the further processing steps. Thus the store 32 modulates the pressure such that when the mask is one the pressure is allowed through and there is no effect upon the painting, but when the mask is zero the pressure is turned to zero and no painting will appear on the framestore.
Since the mask can have any value between nought and one a shape can be applied to it which will define the exact shape in which the artist requires his paint to be applied.
Once again, the mask provides a blend of the required paint into the original picture, so producing a very natural effect.
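The mask-gated painting step can be sketched per picture point as follows; folding the mask, pressure and pen shape into a single effective coefficient is an illustrative assumption about how the multipliers combine:

```python
def paint_point(old: float, colour: float, pen_k: float,
                pressure: float, mask: float) -> float:
    """One picture point of the electronically masked painting (Figure 9).

    The mask value from store 32 (applied in multiplier 62) modulates the
    stylus pressure from transducer 61: mask = 1 leaves painting
    unaffected, mask = 0 suppresses it entirely, and intermediate values
    blend the paint into the original picture.
    """
    effective = pen_k * pressure * mask
    return old + effective * (colour - old)
```

Because the mask can take any value between nought and one, a soft-edged mask shape yields exactly the blended, natural-looking boundary described.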
Thus the various embodiments described allow a composite picture to be generated from normal picture sources or by synthesis, which retains its realism by removing sharp edges from the interfaces and allowing additional manipulation such as relative movement to be provided, under the control of the image shape.
Claims (21)
1. An image composition system comprising:
first input means for providing a first source of picture information;
second input means for providing a second source of picture information;
means for providing a third source of picture information;
processing means for receiving information from both said first and second sources so as to provide the composed picture; and
control means for varying the proportions of picture information used by said processing means to provide said composed image in dependence on the information from said third picture source.
2. A system as claimed in Claim 1, wherein at least one of said first and second input means includes a frame store for holding said picture information in a plurality of pixel locations.
3. A system as claimed in Claim 2, wherein said control means includes a framestore for holding picture information from said third source to effect control of said picture composition.
4. A system as claimed in Claim 2 or 3 wherein address control means are provided to allow picture location access from one frame store to be modified relative to another frame store, whereby relative movement of a portion of the composite picture can be effected.
5. A system as claimed in one of Claims 1 to 4 wherein picture manipulation means are provided prior to said processing means to allow relative movement, orientation and size change to be effected.
6. A system as claimed in Claim 5, wherein said manipulation means includes an interpolator for manipulating information from more than one pixel to ensure that picture quality is maintained.
7. A system as claimed in any one of Claims 1 to 6, wherein said processing means includes an arithmetic processor adapted to produce an output whereby the information used from said first source is automatically increased as the information from said second source is decreased under control of said control means.
8. A system as claimed in any one of Claims 1 to 7, wherein said means for providing the third source of picture information is adapted to receive picture shape information to produce an output for said control means dependent on the instantaneous value of said shape information.
9. A system as claimed in any one of Claims 1 to 7 wherein said control means includes a generator for providing a blending parameter for use by said processing means.
10. A system as claimed in Claim 9, wherein said generator is adapted to derive said blending parameter from image content made available thereto from said third source of picture information.
11. A system as claimed in Claim 10, wherein graphics means for manually providing said shape information is connected to said parameter generator.
12. A system as claimed in Claim 11, wherein said graphics means includes a manually operable co-ordinate control and a processor for synthesising the stored blending parameter in dependence on drawing implement data made available thereto.
13. A system as claimed in any one of Claims 1 to 12 including modifier means for designating part of the picture information from said first or second source as a priority area whereby this part will be output in the composed picture in preference to other picture information.
14. A system as claimed in Claim 13, wherein said modifier means includes a frame store for storing information indicative of the shape of said priority area.
15. A system as claimed in any one of Claims 1 to 14, wherein combination processing means is provided to allow commitment of said composed picture to be effected following a separate previewing operation.
16. A method for image composition comprising:
receiving a first source of picture information,
receiving a second source of picture information,
receiving a third source of picture information,
processing the information from both said first and second sources by variably controlling the proportions of picture information used in the processing step in dependence on information from said third picture source.
17. A method as claimed in Claim 16 including the step of moving the picture information from one source relative to that from the other source to cause a shift in the picture composition.
18. A method as claimed in Claim 16 or 17 including manually generating graphics information used to variably control picture content proportions.
19. A method as claimed in Claim 16, 17 or 18 including generating shape information to designate the available area for information insertion from a given source.
20. An image composition system substantially as described herein.
21. A method of image composition substantially as described herein and with reference to the accompanying drawings.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB8300378A GB2113950B (en) | 1982-01-15 | 1983-01-07 | Image composition system |
HK29691A HK29691A (en) | 1982-01-15 | 1991-04-18 | Image composition system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB8201136 | 1982-01-15 | ||
GB8300378A GB2113950B (en) | 1982-01-15 | 1983-01-07 | Image composition system |
Publications (3)
Publication Number | Publication Date |
---|---|
GB8300378D0 GB8300378D0 (en) | 1983-02-09 |
GB2113950A true GB2113950A (en) | 1983-08-10 |
GB2113950B GB2113950B (en) | 1986-10-01 |
Family
ID=26281722
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB8300378A Expired GB2113950B (en) | 1982-01-15 | 1983-01-07 | Image composition system |
Country Status (2)
Country | Link |
---|---|
GB (1) | GB2113950B (en) |
HK (1) | HK29691A (en) |
1983
- 1983-01-07 GB GB8300378A patent/GB2113950B/en not_active Expired
1991
- 1991-04-18 HK HK29691A patent/HK29691A/en not_active IP Right Cessation
Cited By (64)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3341371A1 (en) * | 1982-11-16 | 1984-07-19 | Dainippon Ink And Chemicals, Inc., Tokio/Tokyo | METHOD AND DEVICE FOR GENERATING A COMBINED IMAGE SIGNAL |
GB2130838A (en) * | 1982-11-16 | 1984-06-06 | Dainippon Ink & Chemicals | Method and apparatus for forming a combined image signal |
GB2151100A (en) * | 1983-11-29 | 1985-07-10 | Yokogawa Medical Syst | Composite image display system for computerized tomographs and other image data |
FR2561053A1 (en) * | 1984-03-07 | 1985-09-13 | Quantel Ltd | SYSTEM AND METHOD FOR PROCESSING VIDEO SIGNALS |
US4827344A (en) * | 1985-02-28 | 1989-05-02 | Intel Corporation | Apparatus for inserting part of one video image into another video image |
FR2578130A1 (en) * | 1985-02-28 | 1986-08-29 | Rca Corp | APPARATUS AND METHOD FOR OVERLAYING VIDEO IMAGES |
GB2171875A (en) * | 1985-02-28 | 1986-09-03 | Rca Corp | Superimposing video images |
DE3606456A1 (en) * | 1985-02-28 | 1986-09-04 | Rca Corp., Princeton, N.J. | ARRANGEMENT AND METHOD FOR OVERLAYING TELEVISION PICTURES |
GB2174266B (en) * | 1985-04-15 | 1989-06-21 | Philips Nv | Television system and data generator and receiver suitable therefor |
GB2174266A (en) * | 1985-04-15 | 1986-10-29 | Philips Nv | Television system and data generator and receiver suitable therefor |
GB2220331A (en) * | 1985-10-07 | 1990-01-04 | Canon Kk | Image processing |
GB2183428B (en) * | 1985-10-07 | 1990-08-29 | Canon Kk | Image processing system |
GB2183428A (en) * | 1985-10-07 | 1987-06-03 | Canon Kk | Pattern forming apparatus |
GB2220331B (en) * | 1985-10-07 | 1990-09-12 | Canon Kk | Image processing system |
EP0219251A2 (en) * | 1985-10-11 | 1987-04-22 | Quantel Limited | Improvements in video image processing systems |
US4949180A (en) * | 1985-10-11 | 1990-08-14 | Quantel Limited | Video image processing systems |
EP0219251A3 (en) * | 1985-10-11 | 1989-12-06 | Quantel Limited | Improvements in video image processing systems |
EP0235902A1 (en) * | 1986-01-23 | 1987-09-09 | Crosfield Electronics Limited | Digital image processing |
EP0271995A3 (en) * | 1986-12-13 | 1990-07-18 | Quantel Limited | Stencils for video graphics systems |
EP0271995A2 (en) * | 1986-12-13 | 1988-06-22 | Quantel Limited | Stencils for video graphics systems |
EP0318149A1 (en) * | 1987-11-21 | 1989-05-31 | Quantel Limited | Video processing |
US5163125A (en) * | 1988-05-25 | 1992-11-10 | Canon Kabushiki Kaisha | Data processing apparatus |
DE3916984A1 (en) * | 1988-05-25 | 1989-11-30 | Canon Kk | DATA PROCESSING DEVICE |
EP0344976A1 (en) * | 1988-05-31 | 1989-12-06 | Crosfield Electronics Limited | Image generating apparatus |
US4954912A (en) * | 1988-05-31 | 1990-09-04 | Crosfield Electronics Limited | Image generating apparatus |
US5043923A (en) * | 1988-10-07 | 1991-08-27 | Sun Microsystems, Inc. | Apparatus for rapidly switching between frames to be presented on a computer output display |
GB2223651B (en) * | 1988-10-07 | 1993-03-31 | Sun Microsystems Inc | Apparatus for rapidly clearing the output display of a computer sytem |
GB2223651A (en) * | 1988-10-07 | 1990-04-11 | Sun Microsystems Inc | Overwriting display memory without clearing speeds computer animation |
GB2227902A (en) * | 1988-12-21 | 1990-08-08 | Bosch Gmbh Robert | Mixing device for video signals |
GB2227902B (en) * | 1988-12-21 | 1993-07-07 | Bosch Gmbh Robert | Mixing device for video signals |
GB2226468B (en) * | 1988-12-22 | 1993-08-11 | Rank Cintel Ltd | Image processing system |
GB2226468A (en) * | 1988-12-22 | 1990-06-27 | Rank Cintel Ltd | Image processing system |
EP0529743A1 (en) * | 1988-12-23 | 1993-03-03 | Rank Cintel Limited | Video special effects |
EP0396377A3 (en) * | 1989-05-01 | 1991-12-04 | EVANS & SUTHERLAND COMPUTER CORPORATION | Computer graphics dynamic control |
EP0396377A2 (en) * | 1989-05-01 | 1990-11-07 | EVANS & SUTHERLAND COMPUTER CORPORATION | Computer graphics dynamic control |
US5412767A (en) * | 1989-05-17 | 1995-05-02 | Quantel, Ltd. | Image processing system utilizing brush profile |
GB2233856A (en) * | 1989-05-17 | 1991-01-16 | Quantel Ltd | Electronic image processing |
GB2233856B (en) * | 1989-05-17 | 1993-10-27 | Quantel Ltd | Electronic image processing |
EP0423961A3 (en) * | 1989-09-28 | 1992-07-08 | Sony Corporation | Video special effects apparatus |
EP0423961A2 (en) * | 1989-09-28 | 1991-04-24 | Sony Corporation | Video special effects apparatus |
GB2238214A (en) * | 1989-10-13 | 1991-05-22 | Quantel Ltd | Electronic graphic system combines images using variable density stencil |
DE3936334A1 (en) * | 1989-10-30 | 1991-05-02 | Siemens Ag | DATA TRANSFER PROCEDURE |
EP0429869A2 (en) * | 1989-11-28 | 1991-06-05 | The State Of Israel Ministry Of Defence Israel Military Industries | A system for simulating X-ray scanners |
EP0429869A3 (en) * | 1989-11-28 | 1992-12-02 | The State Of Israel Ministry Of Defence Israel Military Industries | A system for simulating x-ray scanners |
GB2243044A (en) * | 1989-12-31 | 1991-10-16 | Samsung Electronics Co Ltd | Video editing system in a camcorder |
GB2243044B (en) * | 1989-12-31 | 1994-09-07 | Samsung Electronics Co Ltd | Editing system intergrated within a camcorder |
EP0454996A3 (en) * | 1990-03-30 | 1993-03-03 | Kabushiki Kaisha Toshiba | Signal dynamic increase for a multi-function digital ccd camera |
EP0454996A2 (en) * | 1990-03-30 | 1991-11-06 | Kabushiki Kaisha Toshiba | Signal dynamic increase for a multi-function digital CCD camera |
WO1991019242A3 (en) * | 1990-05-29 | 1992-10-15 | Eastman Kodak Co | Neighborhood-based merging of image data |
WO1991019242A2 (en) * | 1990-05-29 | 1991-12-12 | Eastman Kodak Company | Neighborhood-based merging of image data |
WO1992005664A1 (en) * | 1990-09-20 | 1992-04-02 | Spaceward Holdings Limited | Video image composition |
US5384899A (en) * | 1991-04-16 | 1995-01-24 | Scitex Corporation Ltd. | Apparatus and method for emulating a substrate |
EP0512839A2 (en) * | 1991-05-09 | 1992-11-11 | Quantel Limited | Image keying generator for video special effects |
EP0512839A3 (en) * | 1991-05-09 | 1993-08-18 | Quantel Limited | Image keying generator for video special effects |
EP0560624A2 (en) * | 1992-03-13 | 1993-09-15 | Quantel Limited | Electronic video system with simultaneous real-time processing |
JPH06121269A (en) * | 1992-03-13 | 1994-04-28 | Quantel Ltd | Electronic video storage apparatus and electronic video processing system |
EP0560624B1 (en) * | 1992-03-13 | 2002-06-26 | Quantel Limited | Electronic video system with simultaneous real-time processing |
US5649046A (en) * | 1992-12-07 | 1997-07-15 | Quantel Limited | Video processing system with random access framestore for video editing |
US5825967A (en) * | 1992-12-07 | 1998-10-20 | Ricoh Company, Ltd. | Video processing system with randow access frame store for video editing |
US5668639A (en) * | 1995-03-21 | 1997-09-16 | Comunicacion Integral | Method for video editing |
GB2317087A (en) * | 1996-09-06 | 1998-03-11 | Quantel Ltd | An electronic graphic system |
US5986665A (en) * | 1996-09-06 | 1999-11-16 | Quantel Limited | Electronic graphic system |
GB2317087B (en) * | 1996-09-06 | 2001-04-18 | Quantel Ltd | An electronic graphic system |
US6606086B1 (en) | 1996-09-06 | 2003-08-12 | Quantel Limited | Electronic graphic system |
Also Published As
Publication number | Publication date |
---|---|
GB8300378D0 (en) | 1983-02-09 |
GB2113950B (en) | 1986-10-01 |
HK29691A (en) | 1991-04-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US4602286A (en) | Video processing for composite images | |
US5459529A (en) | Video processing for composite images | |
GB2113950A (en) | Image composition system | |
GB2157122A (en) | Image composition system | |
Wood et al. | Multiperspective panoramas for cel animation | |
EP0403253B1 (en) | An electronic image composition system | |
US5142616A (en) | Electronic graphic system | |
US6590573B1 (en) | Interactive computer system for creating three-dimensional image information and for converting two-dimensional image information for three-dimensional display systems | |
Agrawala et al. | Artistic multiprojection rendering | |
JP2508513B2 (en) | Image generator | |
US5986665A (en) | Electronic graphic system | |
US5107252A (en) | Video processing system | |
JP4481166B2 (en) | Method and system enabling real-time mixing of composite and video images by a user | |
US5696892A (en) | Method and apparatus for providing animation in a three-dimensional computer generated virtual world using a succession of textures derived from temporally related source images | |
US5103217A (en) | Electronic image processing | |
US5412402A (en) | Electronic graphic systems | |
US5999194A (en) | Texture controlled and color synthesized animation process | |
JPH05506520A (en) | computer video processing system | |
US5412767A (en) | Image processing system utilizing brush profile | |
EP0399663A1 (en) | An electronic image progressing system | |
GB2244895A (en) | Computer graphics | |
CA2237266A1 (en) | Processing image data | |
GB2261803A (en) | Storing a high resolution image as several low resolution images | |
GB2157121A (en) | Image composition system | |
Durand | The “TOON” project: requirements for a computerized 2D animation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
775B | Specification amended (sect. 75/1977) | ||
748C | Proceeding under section 48 patents act 1977 | ||
SPAC | Amended specification published ** copy of the specification now available | ||
752 | Proceeding under section 52(1) patents act 1977 | ||
713B | Proceeding under section 13(1) patents act 1977 | ||
PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 20020107 |