GB2352950A - Image superposition including transparency and translucency parameters - Google Patents


Info

Publication number
GB2352950A
GB2352950A (Application GB9930710A)
Authority
GB
United Kingdom
Prior art keywords
pixel
image
parameters
translucent
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9930710A
Other versions
GB2352950B (en)
GB9930710D0 (en)
Inventor
Yuji Hisamatsu
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Publication of GB9930710D0 (en)
Publication of GB2352950A (en)
Application granted
Publication of GB2352950B (en)
Anticipated expiration
Expired - Fee Related


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/503 Blending, e.g. for anti-aliasing

Abstract

An image processing method and apparatus in which α blend processing, transparent processing and translucent processing can be performed with a small amount of input data, and transmittance control of respective parameters of each pixel can be performed independently. Image data of a higher priority image for superposition is input as YA, which is either data specifying transparence or translucence or data specifying luminance, and as CbA and CrA. CbA and CrA are data specifying the degree of translucency of the luminance and color of a lower priority image when translucence is specified by YA, and otherwise are chroma data and hue data. Decoder DE decides whether the data YA is the transparent or translucent specifying value or luminance data and outputs a translucent enable signal TLE and a transparent enable signal TPE according to the decision result. Using these signals, selectors SE1 and SE2 select the degree of translucency or a constant α blend rate to output it, and blenders BL1, BL2 and BL3 perform any of transparent, translucent or overlay processing.

Description

IMAGE PROCESSING METHOD AND APPARATUS

The present invention relates to an image processing method and apparatus. There will be described below, by way of example in illustration of the invention, an image processing method and apparatus for superposing a plurality of images in which the amount of input data necessary for the superposition processing of a plurality of images can be reduced and the bandwidth of the input data can be decreased.
Image processing includes processing for superposing and displaying a plurality of images in various ways. For example, there are some cases in which animations occupying a part of a picture area are composed into a certain background image, or in which characters or symbols are overlaid on normal animations in order to display them. In such cases, in superposing images of animations to be composed on a background image, or where characters are to be overlaid on normal animations, there is an area or areas in which some images to be displayed exist, and an area or areas in which the background of the superposed images is to be displayed because of the absence of any image. Therefore, in the former area or areas, a process of overlaying or of superposing the images according to a predetermined rate of mixing is carried out, and, in the latter area or areas, a process of rendering the superposing images transparent in order to display the superposed images is carried out. Also, sometimes, predetermined translucent images, such as images obtained through a ground glass screen, are superposed. In this case, a step of processing is performed in which portions of the superposed images corresponding to areas of the translucent images are attenuated in a predetermined ratio, and the superposing images are rendered transparent.
Reference will now be made to Fig. 5 of the accompanying drawings, which shows a block schematic circuit diagram of a previously proposed image processing apparatus for carrying out the superposition processing of images.
In Fig. 5, YA, CbA and CrA indicate the image data of superposing images (hereinafter called "higher priority image") which are superposed on other images and are displayed preferentially. YB, CbB and CrB indicate image data of superposed images (hereinafter called "lower priority image"), i.e., the ground images on which the higher priority image is superposed. In these two systems of image data, YA and YB correspond to luminance data; CbA and CbB correspond to color saturation or chroma data; and CrA and CrB correspond to hue data, respectively. The data for each pixel of the higher priority image and each pixel of the lower priority image is provided in sequence (sometimes, the color saturation and the hue together may be simply called "color" below).
A "flag" is a flag for specifying whether or not every pixel of the higher priority image is transparent; the flag becomes "1" when a pixel is transparent and "0" when a pixel is not transparent. An α blend rate indicates the mixing ratio of the pixel value of each image when overlaying the higher priority image and the lower priority image. BL10, BL20 and BL30 are blenders for performing α blend processing of luminance data YA and YB, color saturation data CbA and CbB, and hue data CrA and CrB, respectively, according to the flag and the α blend rate. The blenders BL10, BL20 and BL30 output the processing result as one system of image data Y (blended luminance), Cb (blended color saturation) and Cr (blended hue data), respectively.
In such a configuration, the two systems of image data YA, CbA, CrA and YB, CbB, CrB, the flag, and the α blend rate are supplied for every pixel in sequence, and, using them as input data, superposition processing is performed. If the flag input here is "1", the blenders BL10, BL20 and BL30 respectively render the higher priority images transparent (that is, YA, CbA and CrA are not used). For the superposition processing in this case, there are the process of outputting the lower priority images as they are (hereinafter referred to as "transparent processing") and the process of attenuating the lower priority images in a predetermined ratio to render them translucent (hereinafter called "translucent processing"), as described above. Thus, where each of the image data (YA, CbA and CrA) of the higher priority images is A, and each of the image data (YB, CbB and CrB) of the lower priority images is B, each blender performs the following operations and outputs its operation result as output data (Y, Cb and Cr).
For transparent processing: A × 0 + B × 1

For translucent processing: A × 0 + B × (1 - α)

On the other hand, if the input flag is "0", the process of overlaying or superposing the higher priority image and the lower priority image according to the α blend rate (hereinafter called "superposing processing" or "overlay processing") is performed. That is, each blender performs the following operation to use its operation result as output data.
For overlay processing: A × α + B × (1 - α)

This overlay processing corresponds to normal α blend processing, and the blenders BL10, BL20 and BL30 switch between this overlay processing and the transparent processing or the translucent processing according to the flag value. That is, in the processing when the above-mentioned flag is "1", if "α = 0", the above transparent processing is performed by substituting this value (α = 0) into the same operation expression as that of the overlay processing. If "α ≠ 0", the above translucent processing is performed by rendering the coefficient of A in the same operation expression "0" and keeping the coefficient of B "(1 - α)" as it is. Thus, for the translucent processing, the rate at which the lower priority image is transmitted is specified by "(1 - α)" (α itself specifies the above-mentioned predetermined ratio of attenuation).
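The switching between the three operations above can be sketched in Python (an illustrative model of the previously proposed blender, not part of the patent; the function and variable names are assumptions):

```python
def prior_art_blend(a, b, flag, alpha):
    """One blender of Fig. 5, applied alike to Y, Cb and Cr.

    flag == 1 renders the higher priority pixel A transparent: the
    coefficient of A is forced to 0, so alpha == 0 gives transparent
    processing (B passes through unchanged) and alpha != 0 gives
    translucent processing (B attenuated to a factor of (1 - alpha)).
    flag == 0 gives normal overlay (alpha blend) processing.
    """
    coeff_a = 0.0 if flag == 1 else alpha
    return a * coeff_a + b * (1.0 - alpha)
```

For example, `prior_art_blend(100, 50, 1, 0.0)` passes the lower priority value 50 through, while `prior_art_blend(100, 50, 1, 0.2)` attenuates it to 40.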
Other previously proposed techniques of image superposition processing include a technique which has been proposed in, for example, Japanese patent laid-open publication No. 6-335022. In this publication, a technique is disclosed wherein it is decided that the image is transparent if each bit of the RGBI (red, green, blue, luminance) data of one image is "0", and two images are overlaid by switching to the other image in its transparent portion. In this publication, only transparent processing is disclosed, and no description is included of blend processing and translucent processing.
In the above-mentioned previously proposed image processing apparatus of Fig. 5, a distinction between transparence and no transparence and an α blend rate of the higher priority image are specified for every pixel.
It is necessary to input flags and α blend rates, which vary with every supply clock of pixel data, separately from the pixel data. Because of this, there has been a problem in that a large amount of input data is required and the bandwidth of the input data becomes remarkably wide.
Also, when carrying out the translucent processing, since the transmittance of the lower priority image is specified by one α blend rate for every pixel in the previously proposed image processing apparatus, it is impossible to apply respectively different transmittances to the luminance (YB) and the color (CbB and CrB) of each pixel. In order to enable this, when performing the translucent processing, two α blend rates for specifying the transmittances for the luminance and the color, respectively, may be input.
However, this method necessitates a larger amount of input data, and the bandwidth of the input data also becomes wider.
Features of an image processing method and apparatus to be described below, by way of example in illustration of the invention, are that transparent processing and translucent processing, in addition to normal α blend processing, can be performed without requiring data for specifying transparence or no transparence other than image data as input data, that the amount of input data necessary for superposition processing of a plurality of images can be reduced to narrow the bandwidth of the input data, and that translucent processing for controlling the transmittance independently for parameters specifying each pixel, such as the luminance and the color, can be performed.
In one image processing method for superposing two images, to be described below by way of example in illustration of the invention, each pixel of the two images is indicated by a plurality of parameters; information is provided for specifying that each pixel is rendered transparent or translucent in one of the parameters indicating each pixel of one image, and information is provided for specifying the transmittance of a corresponding pixel in the other image in at least one of the other parameters indicating each pixel specified to be translucent by the one of the parameters. The method includes providing the pixel in the other image as a pixel after superposition when the parameter specifies transparence, providing the pixel in the other image which is rendered translucent using the transmittance specified by the other parameter or parameters as a pixel after superposition when the parameter specifies translucence, and providing the pixel which is obtained by superposing the pixel and the pixel in the other image as a pixel after superposition when the parameter specifies neither transparence nor translucence, for each pixel.
It is preferable that, when there are two or more of the other parameters, the transmittances specified respectively by their parameters be provided as transmittances of different parameters indicating the pixel in the other image, and when the pixel in the other image is rendered translucent, the different parameters, respectively, are rendered translucent using the respective transmittances specified by the other parameters.
It is also possible to superpose the pixel in the one image and the pixel in the other image at a specified rate of image mixing when the parameter specifies neither transparence nor translucence.
In this case, it is preferable that the specified rate of image mixing is a predetermined constant rate.
Another image processing method for superposing two images, each pixel of the two images being indicated by a plurality of parameters, includes providing information for specifying that each pixel is rendered transparent in one of the parameters indicating each pixel of one image, and providing the pixel in the other image as a pixel after superposition when the parameter specifies transparence for each pixel.
It is preferable that the one of the parameters indicating each pixel of one image includes information for specifying that each pixel is rendered transparent or translucent, the other parameter indicating a pixel specified to be translucent by the parameter includes information for specifying the transmittance of a pixel in the other image, and that each pixel in the other image which is rendered translucent using the transmittance specified by the other parameter is provided as a pixel after superposition when the parameter specifies translucence, for each pixel.
It is also preferable that the pixel which is obtained by superposing the pixel in the one image and the pixel in the other image is provided as a pixel after superposition when the parameter does not specify transparence for each pixel.
In yet another image processing method for superposing two images, to be described below by way of example in illustration of the present invention, in which each pixel of the two images is indicated by a plurality of parameters, information is provided for specifying that each pixel is rendered translucent in one of the parameters indicating each pixel of one image, and information is provided for specifying the transmittance of a corresponding pixel in the other image in at least one of the other parameters indicating said each pixel specified to be translucent by the one of the parameters, the method including providing the pixel in the other image which is rendered translucent using the transmittance specified by the other parameters as a pixel after superposition when the parameter specifies translucence, for each pixel.
In this case, it is preferable that the one of the parameters indicating each pixel of one image includes information for specifying that each pixel is rendered transparent or translucent; and that the pixel in the other image is provided as a pixel after superposition when the parameter specifies transparence for each pixel.
It is also preferable that the pixel, which is obtained by superposing the pixel in the one image and the pixel in the other image, is provided as a pixel after superposition when the parameter does not specify translucence for each pixel.
An image processing apparatus for superposing two images to be described below, by way of example in illustration of the present invention, in which each pixel of the two images is indicated by a plurality of parameters, includes data input means for inputting image data of one image in which information for specifying that each pixel is rendered transparent or translucent is included in one of the parameters indicating each pixel and information for specifying the transmittance of a corresponding pixel in the other image is included in at least one of other parameters indicating each pixel specified to be translucent by the one of the parameters, and image data of the other image; and processing means for outputting the pixel in the other image when the parameter specifies transparence, and for outputting the pixel in the other image which is rendered translucent using the transmittance specified by the other parameters when the parameter specifies translucence, and outputting the pixel which is obtained by superposing the pixel and the pixel in the other image when the parameter specifies neither transparence nor translucence, for each pixel.
In this case, it is preferable that the data input means provides the transmittances specified respectively by the other parameters as transmittances of different parameters indicating the pixel in the other image when there are two or more of the other parameters, and the processing means renders the different parameters, respectively, translucent using the respective transmittances specified by the other parameters when the pixel in the other image is rendered translucent.
It is also preferable that the processing means includes decision means for deciding whether each of the pixels is superposed in any one of the modes of transparence, translucence or overlay according to the parameter; selection means for selecting the other parameters when the superposition mode decided by the decision means is translucence and selecting a predetermined rate of image mixing when in other modes; and calculation means for outputting the pixel in the other image when the superposition mode is transparence, and outputting the pixel in the other image which is rendered translucent using the transmittance specified by the parameters selected by the selection means when the superposition mode is translucence, and outputting the pixel which is obtained by superposing the pixels and the pixels in the other image using the rate of image mixing selected by the selection means when the superposition mode is overlay.
It is also preferable that the rate of image mixing is a predetermined constant.
In still another arrangement to be described, by way of example, in illustration of the present invention, an image processing apparatus includes a plurality of stages of image processing apparatuses each having the above-mentioned structure, wherein, in the second or subsequent stage image processing apparatus, the other image data input by the respective data input means is provided as the output of the preceding stage of the image processing apparatus, and data of images to be further superposed on the image of that output is provided as the image data.
In a further arrangement for superposing two images, to be described below by way of example in illustration of the present invention, each pixel of the two images is indicated by a plurality of parameters, and the arrangement includes data input means for inputting image data of one image including information for specifying that each pixel is rendered transparent in one of the parameters indicating each pixel, and image data of the other image; and processing means for outputting the pixel in the other image when the parameter specifies transparence, for each pixel.
In a further image processing apparatus for superposing two images, to be described below by way of example in illustration of the present invention, each pixel of the two images is indicated by a plurality of parameters, and the apparatus includes data input means for inputting image data in which information for specifying that each pixel is rendered translucent is included in one of the parameters indicating each pixel and information for specifying the transmittance of a pixel in the other image is included in other parameters indicating each pixel specified to be translucent by the parameter, and image data of the other image; and processing means for outputting the pixel in the other image which is rendered translucent using the transmittance specified by the other parameters when the parameter specifies translucence, for each pixel.
Arrangements illustrative of the invention will now be described, by way of example, with reference to the accompanying drawings, in which like reference numerals designate identical or corresponding parts throughout the figures, and in which:
Fig. 1 is a block schematic circuit diagram of an image processing apparatus, Fig. 2 is a schematic diagram showing a configuration of blenders BL1, BL2 and BL3 usable in the image processing apparatus of Fig. 1, Fig. 3 is a schematic diagram showing another configuration of blenders BL1, BL2 and BL3 usable in the image processing apparatus of Fig. 1, and Fig. 4 is a flowchart for use in describing a superposition processing procedure for each pixel in the image processing apparatus of Fig. 1.

In Fig. 1, YA, CbA and CrA indicate image data of a higher priority image which is superposed on another image. YB, CbB and CrB indicate image data of a lower priority image as the grounding image on which the higher priority image is superposed. In these two systems of image data, YB, CbB and CrB correspond to the usual luminance data, chroma or color saturation data, and hue data, respectively, of the lower priority image. The image data of each pixel, i.e., pixel data, of the lower priority image is supplied in sequence.
On the other hand, depending on its value, YA of the higher priority image becomes data indicating the usual luminance or data indicating the distinction between transparence and translucence of the higher priority image. Also, CbA is data indicating the usual color saturation, or data indicating the degree of translucence of luminance data YB (attenuation factor of YB × the number of gradations of luminance). CrA is data indicating the usual hue, or data indicating the degree of translucence of color saturation data CbB and hue data CrB (attenuation factor of CbB and CrB × the number of gradations of color). That is, with respect to respective pixel data, data CbA and CrA are usual or normal color data when data YA has the value specifying normal luminance. Data CbA and CrA become the value indicating colorlessness, which means that no data specifying color is included, when data YA has the value specifying that the higher priority image is rendered transparent (hereinafter called "value specifying transparence").
Data CbA becomes the value indicating the degree of translucence of luminance data YB, and data CrA becomes the value indicating the degree of translucence of color data CbB and CrB, when data YA has the value specifying translucence (hereinafter called "value specifying translucence").
For example, when luminance and color respectively have 256 gradations, the values of data YA, CbA and CrA can be allocated as follows to implement the above-mentioned correspondence relationship of each data.
        Normal                    Transparence    Translucence
YA      2 to 255                  0               1
CbA     normal saturation         -               attenuation factor
        gradation value                           of YB × 256
CrA     normal hue                -               attenuation factor
        gradation value                           of CbB and CrB × 256

Here, data CbA and CrA in the case of translucence are the values specifying the degree of translucence of luminance data YB and color data CbB and CrB. This is because data CbA and CrA are data having values from 0 to 255, which also indicate normal color saturation and hue. The data CbA and CrA substantially correspond to the value indicating the attenuation factor of the lower priority image in translucent processing, and the transmittance is specified by "1 - attenuation factor". Incidentally, the above-mentioned allocation of data is merely one example. It is possible to allocate values other than 0 or 1 for the value specifying transparence and the value specifying translucence of data YA. In short, any two values from among 0 to 255 may be secured. Also, the degrees of translucence indicated by data CbA and CrA may be allocated in reverse. Further, this likewise applies to the case in which the number of gradations is other than 256.
As mentioned above, in the presently described image processing apparatus, two values of data YA, which inherently indicates the luminance of a predetermined number of gradations, are allocated to the values specifying transparence and translucence, so that the luminance of the higher priority image becomes coarser by two gradations. However, even if the number of gradations in luminance is decreased by two in normal display, there is little influence on a viewer. For the above example of allocation, the luminance at about two gradations from the lowest gradation among 256 gradations merely indicates very dark luminance which is unrecognizable. Therefore, no problem arises even if these gradations are compressed so as to be indicated by a common value of "YA = 2". Also, data CbA and CrA in the case in which translucent processing is performed are values which do not specify any information on displayed images. Thus, there arises no problem even if the degree of translucence in luminance and color is allocated to these data as described above.
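The example allocation above can be sketched as follows (an illustrative Python model assuming 256 gradations and the codes 0 and 1 from the table; the names are not from the patent):

```python
TRANSPARENT_CODE = 0   # value of YA specifying transparence
TRANSLUCENT_CODE = 1   # value of YA specifying translucence
GRADATIONS = 256       # number of luminance/color gradations assumed here


def decode_ya(ya):
    """Classify one YA value per the example allocation above.

    Returns 'transparent', 'translucent' or 'normal'; for a normal
    pixel, YA in the range 2..255 is ordinary luminance, the two
    darkest gradations having been compressed into YA = 2.
    """
    if ya == TRANSPARENT_CODE:
        return "transparent"
    if ya == TRANSLUCENT_CODE:
        return "translucent"
    return "normal"


def transmittance(degree):
    """In translucent mode CbA/CrA hold 'attenuation factor x 256';
    the transmittance of the lower priority image is 1 - attenuation."""
    return 1.0 - degree / GRADATIONS
```

For instance, a translucent pixel carrying CbA = 64 attenuates the lower priority luminance by a quarter, i.e. transmits it at a rate of 0.75.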
On the other hand, the constant α blend rate in Fig. 1 is the constant value indicating the mixing ratio (weight) of the respective pixel values of both images when superposing or overlaying the higher and lower priority images.
Concretely, according to the image to be superposed, the proportion of the higher priority image in overlay processing is specified in advance in the range from 0 to 1 (0 through 100 percent) as this constant α blend rate. The constant α blend rate and the image data of the above two systems are respectively generated by predetermined data generating means (not shown) and are supplied as input data.
DE is a decoder which receives data YA and, according to its value, outputs a transparent enable signal TPE and a translucent enable signal TLE.
An output terminal of the translucent enable signal TLE is connected to the selectors SE1 and SE2 and the blenders BL1, BL2 and BL3, and an output terminal of the transparent enable signal TPE is connected to the blenders BL1, BL2 and BL3. This decoder DE renders the translucent enable signal TLE disabled while rendering the transparent enable signal TPE active when data YA has the value specifying transparence. The decoder DE renders the translucent enable signal TLE active while rendering the transparent enable signal TPE disabled when data YA has the value specifying translucence. The decoder DE renders both the translucent enable signal TLE and the transparent enable signal TPE disabled when data YA has neither the value specifying transparence nor the value specifying translucence.
The selector SE1 receives data CbA and the constant α blend rate, selects either of them according to the translucent enable signal TLE from the decoder DE, and outputs the selected data to the blender BL1. Here, the selector SE1 selects data CbA to output it when the translucent enable signal TLE is active, and selects the constant α blend rate to output it when the translucent enable signal TLE is disabled. However, when data CbA is selected, it has the value indicating the degree of translucence of luminance data YB (described later), so the data CbA is converted into the luminance attenuation factor αY, obtained by dividing the data CbA by the number of gradations of luminance, and output.
The selector SE2 receives data CrA and the constant α blend rate, selects either of them according to the translucent enable signal TLE from the decoder DE, and outputs the selected data to the blenders BL2 and BL3.
Here, the selector SE2 selects data CrA to output it when the translucent enable signal TLE is active, and selects the constant α blend rate to output it when the translucent enable signal TLE is disabled. However, when data CrA is selected, it has the value indicating the degree of translucence of color data CbB and CrB (described later), so the data CrA is converted into the color attenuation factor αC, obtained by dividing the data CrA by the number of gradations of color, and output.
The blenders BL1, BL2 and BL3, respectively, receive data YA and luminance data YB, data CbA and color saturation data CbB, and data CrA and hue data CrB, mix them according to the signals from the decoder DE and the selector SE1 or SE2, and output the mixed data as blended output luminance data Y, output color saturation data Cb, and output hue data Cr. These blenders BL1, BL2 and BL3 have the same internal structure although their inputs and outputs are mutually different. The internal structure of each of the blenders BL1, BL2 and BL3 is shown in Fig. 2 and Fig. 3 and will be described in detail below.
Fig. 2 shows one example of a structure of each of the blenders BL1, BL2 and BL3. In Fig. 2, input A is image data of the higher priority image, and corresponds to data YA in the blender BL1, CbA in the blender BL2, or CrA in the blender BL3. Input B is image data of the lower priority image, and corresponds to data YB in the blender BL1, CbB in the blender BL2, or CrB in the blender BL3. α' is an output from each of the selectors, and corresponds to the constant α blend rate or the luminance attenuation factor αY output from the selector SE1 in the blender BL1. Also, α' corresponds to the constant α blend rate or the color attenuation factor αC output from the selector SE2 in the blenders BL2 and BL3.
A reference symbol bl1 denotes a selector in which input A is received at one input terminal and the input to the other input terminal is held at "0". The selector bl1 receives the translucent enable signal TLE from the decoder DE as a selection signal, and selects the input of "0" to output it when the selection signal is active, and selects the input A to output it when the selection signal is disabled. A reference symbol bl2 denotes a selector in which the above-mentioned selector output α' is received at one input terminal and the input to the other input terminal is held at "0". The selector bl2 receives the transparent enable signal TPE as a selection signal, and selects the input of "0" to output it when the selection signal is active, and selects the above-mentioned selector output α' to output it when the selection signal is disabled.
A reference symbol bl3 denotes a calculator which receives the outputs of the selectors bl1 and bl2 and the input B and performs predetermined operation processing. More specifically, where the input from the selector bl1 is A' and the input from the selector bl2 is α'', this calculator bl3 performs the following operation and outputs its result:

A' × α'' + B × (1 - α'')

Fig. 3 shows another example of a structure of each of the blenders BL1, BL2 and BL3. In the blender of Fig. 3, the selector bl2 of Fig. 2 is omitted and the α' output from each of the selectors SE1 and SE2 is directly supplied to the calculator bl3. Also, a selector bl4 is provided. The selector bl4 receives the input B at one input terminal and the output of the calculator bl3 at the other input terminal, and selects either of them to provide the blender output. The selector bl4 receives the transparent enable signal TPE as a selection signal, and selects the input B to output it when the selection signal is active, and selects the output of the calculator bl3 to output it when the selection signal is disabled. Other portions of Fig. 3 are the same as those of Fig. 2 and a detailed explanation thereof is omitted.
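The blender structure of Fig. 2 can be sketched in Python as follows (an illustrative model, not the patent's circuit; the function name and boolean signal arguments are assumptions):

```python
def blender_fig2(a, b, alpha_sel, tle, tpe):
    """One blender of Fig. 2.

    a, b      : higher / lower priority parameter (Y, Cb or Cr)
    alpha_sel : output of selector SE1 or SE2 (the constant alpha
                blend rate, or attenuation factor aY / aC when the
                pixel is translucent)
    tle, tpe  : translucent / transparent enable signals from decoder DE
    """
    a_prime = 0.0 if tle else a               # selector bl1
    alpha2 = 0.0 if tpe else alpha_sel        # selector bl2
    return a_prime * alpha2 + b * (1.0 - alpha2)  # calculator bl3
```

With TPE active the output is exactly B (transparent processing); with TLE active the output is B attenuated to (1 - αY) or (1 - αC) (translucent processing); with both disabled the output is the normal overlay A × α + B × (1 - α).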
The blender output by each of the above-mentioned structures of Fig.2 and Fig.3 corresponds to output luminance data Y in the case of the blender BL1, output color saturation data Cb in the case of the blender BL2, and output hue data Cr in case of the blender BL3.
The blenders BL1, BL2 and BL3 of Fig.1 each have the above-mentioned configuration shown in Fig.2 or Fig.3, and the present image processing apparatus is configured such that the above-mentioned input data are generated by predetermined data generating means and are supplied to an image processing circuit which includes these blenders, the above-mentioned decoder DE, and the selectors SE1 and SE2.
Next, an operation of the above configuration will be explained with reference to Fig.4. In this explanation, it is assumed that each of the blenders BL1, BL2 and BL3 has the structure shown in Fig.2. In the present image processing apparatus, two systems of image data, namely data YA, CbA and CrA of the higher priority image, and luminance data YB, color saturation data CbB and hue data CrB of the lower priority image, as mentioned above, are sequentially supplied for each respective pixel, and superposition processing is sequentially performed for each pixel using the pixel data as input data. The procedure of a superposition-processing cycle for each pixel is shown in Fig.4 and will be described below with reference to Fig.4. In the following description, the above-mentioned example of data allocation in which luminance and the like have 256 gradations is assumed.
When new image data of the higher priority image and the lower priority image is supplied and the pixel data is updated (step S0), the decoder DE decides whether the data YA of the updated pixel data is the value specifying transparence, i.e., "0", or not (step S1). If this pixel data lies within an area in which the lower priority image is to be displayed and the higher priority image is to be transparent, and the data YA is "0", this decision result is "YES", and the operation proceeds to the processing of step S2.
In step S2, the decoder DE renders the transparent enable signal TPE active while rendering the translucent enable signal TLE disabled. By this, the selectors SE1 and SE2 select the constant α blend rate and output it to the blender BL1 and to the blenders BL2 and BL3, respectively.
Then, each blender receives the signals from the above-mentioned decoder DE and the selector SE1 or SE2, and performs calculation processing based on these signals and on the updated data YA and luminance data YB, data CbA and color saturation data CbB, or data CrA and hue data CrB (step S3).
At this time, in each blender, the input A is output from the selector bl1 and a signal of "0" is output from the selector bl2, and an operation is thereby performed in the calculator bl3. In this case, the operation result in the calculator bl3 is A × 0 + B × (1 − 0), resulting in B. By this, the luminance data YB, color saturation data CbB and hue data CrB of the lower priority image are output as they are, as output luminance data Y, output color saturation data Cb and output hue data Cr, and pixel data after transparent processing can be obtained in which the higher priority image is rendered transparent and is superposed.
On the other hand, if the data YA is not "0", the decision result in the above step S1 is "NO" and the operation proceeds to step S4. The decoder DE then decides whether the data YA is the value specifying translucence, i.e., "1", or not. Here, if the updated pixel data is pixel data within the translucent higher priority image area and the data YA is "1", the decision result is "YES", and the operation proceeds to the processing of step S5.
In step S5, the decoder DE renders the transparent enable signal TPE disabled while rendering the translucent enable signal TLE active. By this, the selectors SE1 and SE2 select the data CbA and CrA, respectively. In this case, the data YA is "1", so that the data CbA and CrA are the values indicating the degree of translucence of the luminance data YB and the color data CbB and CrB, respectively. Because of this, the selector SE1 divides the selected data CbA by the number of luminance gradations "256" and outputs the result of the division to the blender BL1 as the luminance attenuation factor αY. The selector SE2 divides the selected data CrA by the number of color gradations "256" and outputs the result of the division to the blenders BL2 and BL3 as the color attenuation factor αC.
Then, the operation proceeds to step S3, and each blender performs calculation processing based on the signals from the above-mentioned decoder DE and the selector SE1 or SE2 and the updated pixel data (YA and YB, CbA and CbB, or CrA and CrB).
At this time, in each blender, a signal of "0" is output from the selector bl1 and the selector output α′ is output from the selector bl2. Thus, in the calculator bl3, the following operation is performed and the operation result is output:

A′ × α″ + B × (1 − α″) = 0 × α′ + B × (1 − α′) = B × (1 − α′)

That is, "luminance data YB × (1 − luminance attenuation factor αY)" is output from the blender BL1 as the output luminance data Y. Also, "color saturation data CbB × (1 − color attenuation factor αC)" is output from the blender BL2 as the output color saturation data Cb, and "hue data CrB × (1 − color attenuation factor αC)" is output from the blender BL3 as the output hue data Cr. Thereby, pixel data after translucent processing in which the translucent higher priority image is superposed can be obtained. Here, "(1 − luminance attenuation factor αY)" corresponds to the transmittance in luminance and "(1 − color attenuation factor αC)" corresponds to the transmittance in color, and these are controlled by the values of the data CbA and CrA, respectively, in the present image processing apparatus. By this, translucent processing capable of controlling the transmittances independently for luminance and color, respectively, is realized.
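Purely as an illustration of the translucent path described above, and not as part of the disclosure, this step may be sketched as follows; the function name and the use of normalized floating-point factors are assumptions introduced here.

```python
def translucent_pixel(cb_a, cr_a, y_b, cb_b, cr_b, gradations=256):
    """Translucent processing when YA == 1: CbA and CrA carry the degrees of
    translucence, and each lower-image parameter is scaled by its transmittance."""
    alpha_y = cb_a / gradations        # luminance attenuation factor (SE1: CbA / 256)
    alpha_c = cr_a / gradations        # color attenuation factor (SE2: CrA / 256)
    y = y_b * (1 - alpha_y)            # (1 - aY) is the transmittance in luminance
    cb = cb_b * (1 - alpha_c)          # (1 - aC) is the transmittance in color
    cr = cr_b * (1 - alpha_c)
    return y, cb, cr
```

Because αY and αC come from separate parameters, the luminance and color transmittances are controlled independently for each pixel.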
Also, in the present image processing apparatus, the transmittances are controlled by the data CbA and CrA in the manner mentioned above, so that different transmittances can be applied for every pixel even though the α blend rate is constant. This means that transparent processing, or translucent processing in which different transmittances are applied for every pixel, can be performed without providing data specifying transparence/no transparence or the α blend rate for each respective pixel.
That is, according to the present image processing apparatus, transparent processing or translucent processing can be performed for every pixel using a small amount of input data.
On the other hand, if the data YA is neither "0" nor "1", namely, if the updated pixel data is pixel data within an area in which the higher priority image and the lower priority image are to be superposed for display and the data YA is any of "2" to "255", both of the decision results in the above steps S1 and S4 by the decoder DE are "NO" and the operation proceeds to the processing of step S6.
In step S6, the decoder DE renders both the transparent enable signal TPE and the translucent enable signal TLE disabled. By this, the selectors SE1 and SE2 select the constant α blend rate and output it to the blender BL1 and to the blenders BL2 and BL3, respectively. Then, the operation proceeds to step S3 and each blender performs calculation processing based on the signals from the decoder DE and the selector SE1 or SE2 and the updated pixel data.
At this time, in each blender, the input A is output from the selector bl1, and the selector output α′ is output from the selector bl2. Thus, in the calculator bl3, the following operation is performed, and the operation result is output.
A′ × α″ + B × (1 − α″) = A × α′ + B × (1 − α′) = A × α + B × (1 − α)

This corresponds to a processing step for superposing the higher priority image and the lower priority image according to the constant α blend rate, namely, normal α blend processing. In this way, the normal superposing processing is performed when the data YA is not a specific value such as the value specifying transparence or translucence. Thereby, output luminance data Y, output color saturation data Cb and output hue data Cr indicating the pixels after the superposing processing, in which pixels of the higher priority image and pixels of the lower priority image are mixed, can be obtained.
As described above, when pixel data is updated, depending on the value of its data YA, any of transparent processing by steps S1, S2 and S3, translucent processing by steps S1, S4, S5 and S3, or overlay (superposing) processing by steps S1, S4, S6 and S3 is executed. This processing operation is repeated every time the pixel data is updated, and pixels of the higher priority image and pixels of the lower priority image are superposed in sequence. By this, superposition processing of the higher priority image and the lower priority image, including transparent processing and translucent processing in addition to normal α blend processing, is performed, and one system of image data (Y, Cb, Cr) indicating the image in which these two images are superposed can be obtained. Incidentally, the image data obtained by performing this superposition processing is utilized for subsequent processing in which the superposed image is displayed by supplying it to a predetermined display means, in which the image data is used for further image processing, and the like.
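The whole per-pixel procedure of Fig.4 may be summarized, again purely by way of illustration and not as part of the disclosure, by the following sketch; the constant blend rate of 0.5 and all names are assumptions introduced here.

```python
ALPHA = 0.5  # the constant alpha blend rate (value assumed for illustration)

def superpose_pixel(y_a, cb_a, cr_a, y_b, cb_b, cr_b):
    """One superposition cycle: YA == 0 selects transparent processing,
    YA == 1 translucent processing, any other value normal alpha blending."""
    if y_a == 0:                                    # steps S1, S2, S3: transparent
        return y_b, cb_b, cr_b                      # lower image passes through unchanged
    if y_a == 1:                                    # steps S1, S4, S5, S3: translucent
        a_y, a_c = cb_a / 256, cr_a / 256           # CbA, CrA reinterpreted as factors
        return y_b * (1 - a_y), cb_b * (1 - a_c), cr_b * (1 - a_c)
    mix = lambda a, b: a * ALPHA + b * (1 - ALPHA)  # steps S1, S4, S6, S3: overlay
    return mix(y_a, y_b), mix(cb_a, cb_b), mix(cr_a, cr_b)
```

Note that no per-pixel transparence flag or blend rate is supplied beyond the image data itself, which is the source of the data reduction described above.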
Although particular arrangements illustrative of the present invention have been described above, an image processing method and an image processing apparatus for which protection is sought by the appended claims are not limited to the arrangements described above. For example, although the arrangements described above relate to the case in which image data are indicated by luminance data, color saturation data and hue data, the protection sought is not limited to such a case. The arrangements described may likewise be applied to the case in which image data is represented by the RGB method (pixel value data of red, green and blue). In such a case, it is possible to allocate the above values specifying transparence and translucence to color data having less influence on images when the number of gradations is decreased, and to perform transmittance control and the like in translucent processing by the data of another color. Further, it is possible to replace the set of parameters including the luminance data, the color saturation data and the hue data used in the above-mentioned arrangement by a set of parameters including pixel value data of, for example, luminance, blue color and red color.
Also, the arrangement may be applied to image data for specifying pixels by two parameters, as well as image data including three parameters for specifying pixels by luminance, color saturation and hue or by RGB. In this case, superposition and a distinction between transparence and translucence may be specified by allocating the values specifying transparence and translucence to one of the two parameters, and transmittance control and the like in translucent processing may be performed by the other parameter. That is, as long as images are indicated by two or more parameters, the present arrangement may be applied to image data of any form.
Further, although two images have been superposed in the above arrangement, three or more images may be superposed. For example, image processing apparatuses each having the configuration of Fig.1 can be connected in a multi-stage configuration, and the respective image processing apparatuses can use the output luminance data Y, output chroma or color saturation data Cb and output hue data Cr from the fore-stage image processing apparatus as the input luminance data YB, input chroma or color saturation data CbB and input hue data CrB of the lower priority image. Also, data YA, CbA and CrA of the higher priority image to be superposed on the lower priority image are supplied to the respective image processing apparatuses, and superposition processing is executed in the same manner as above. By this, three or more images may be superposed in sequence.
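The multi-stage connection described here can be pictured, under the same caveat that all names and the blend rate are assumptions introduced for illustration, as a fold over the image stack:

```python
def superpose_stage(higher, lower, alpha=0.5):
    """A minimal stand-in for one Fig.1 apparatus operating on (Y, Cb, Cr) tuples."""
    y_a, cb_a, cr_a = higher
    y_b, cb_b, cr_b = lower
    if y_a == 0:                                  # transparent: pass the lower image through
        return lower
    if y_a == 1:                                  # translucent: attenuate the lower image
        return (y_b * (1 - cb_a / 256),
                cb_b * (1 - cr_a / 256),
                cr_b * (1 - cr_a / 256))
    return tuple(a * alpha + b * (1 - alpha)      # overlay at the constant rate
                 for a, b in zip(higher, lower))

def superpose_stack(images):
    """Superpose N images, lowest priority first, through N - 1 cascaded stages;
    each fore-stage output becomes the lower priority input of the next stage."""
    result = images[0]
    for higher in images[1:]:
        result = superpose_stage(higher, result)
    return result
```

Each stage consumes the (Y, Cb, Cr) output of the stage before it, so N images are superposed by N − 1 chained apparatuses.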
As described above, information for specifying transparence or translucence may be included in one of the parameters indicating each pixel of one image, and the other parameters of a pixel specified to be translucent by this parameter are utilized as information for specifying the transmittance of the corresponding pixel in the other image. Also, according to the particular one parameter of each pixel, any of the pixel in the other image, the pixel which is obtained by rendering the pixel in the other image translucent using the transmittance specified by the other parameters, or the pixel which is obtained by superposing the pixels of both images, is output. Therefore, normal α blend processing, transparent processing and translucent processing can be performed upon every pixel without inputting data for specifying transparence or no transparence in addition to the image data. Thereby, an effect is obtained in which the amount of input data necessary for the superposition processing of a plurality of images can be reduced and the bandwidth of the input data can also be limited.
Also, it will be understood that, when there are two or more of the other parameters specifying transmittances, transmittances of different parameters of the pixel in the other image are respectively specified by their parameters, and translucent processing to the respective parameters is performed by the transmittances. Therefore, the translucent processing for independently controlling the transmittances of parameters specifying each pixel such as luminance, color and the like can be performed.
Further, as will be understood from the above description, processing for superposing each pixel in any one of the modes of transparence, translucence or overlay can be realized by using a decision means for carrying out a decision based on the parameter, a selection means for selecting the data to be used based on the decision, and a calculation means for carrying out the operations of transparent, translucent or overlay processing based on the decision and selection results.
It will also be appreciated that the rate of image mixing in an overlay is a predetermined constant, so that the amount of input data necessary for the superposition processing of a plurality of images can be further reduced and the bandwidth of the input data narrowed still more.
On the other hand, it will be seen that a plurality of stages of image processing apparatuses, each having the above-mentioned structure, may be provided, and the inputs of each of the second or subsequent stage image processing apparatuses may be provided by an output of the fore-stage image processing apparatus together with data of images to be further superposed. Therefore, an image in which three images are superposed is output from the second stage of the image processing apparatus, an image in which four images are superposed is output from the third stage, and so on, and ultimately an image in which a number of images equal to the number of stages plus one are superposed is output from the last stage of the image processing apparatus. Thus, an effect is obtained in which three or more images can be superposed by providing the necessary number of the image processing apparatuses described above.
Moreover, it will be appreciated that the transparent processing of the above superposition processing can be implemented with a small amount of input data. Also, the translucent processing of the above superposition processing can be implemented with a small amount of input data.
Although particular arrangements have been described, by way of example, in illustration of the present invention, it will be understood that variations and modifications, as well as other arrangements, may be conceived within the scope of the appended claims.

Claims (19)

1. An image processing method for superposing two images, each pixel of the two images being indicated by a plurality of parameters, in which information for specifying that each pixel is rendered transparent or translucent is included in one of the parameters indicating each pixel of one image, and information for specifying the transmittance of a corresponding pixel in the other image is included in at least one of the other parameters indicating each specified pixel to be translucent by the one of the parameters, and including the steps of providing the pixel in the other image as a pixel after superposition when the parameter specifies transparence, and providing the pixel in the other image which is rendered translucent using the transmittance specified by one of the other parameters as a pixel after superposition when the parameter specifies translucence, and providing the pixel which is obtained by superposing the pixel in the one image and the pixel in the other image as a pixel after superposition when the parameter specifies neither transparence nor translucence for one pixel.
2. An image processing method as claimed in claim 1, wherein when there are two or more of the other parameters, the transmittances specified respectively by the parameters are provided as transmittances of different parameters indicating the pixel in the other image, and when the pixel in the other image is rendered translucent, the different parameters, respectively, are rendered translucent using the respective transmittances.
3. An image processing method as claimed in claim 1, wherein the pixel in the one image and the pixel in the other image are superposed at a specified rate of image mixing when the parameter specifies neither transparence nor translucence.
4. An image processing method as claimed in claim 3, wherein the specified rate of image mixing is a predetermined constant rate.
5. An image processing method for superposing two images, each pixel of the two images being indicated by a plurality of parameters, in which information for specifying that each pixel is rendered transparent is included in one of the parameters indicating each pixel of one image, and the pixel in the other image is provided as a pixel after superposition when the parameter specifies transparence for a pixel.
6. An image processing method as set forth in claim 5, wherein the one of the parameters indicating a pixel of one image includes information for specifying that the pixel is rendered transparent or translucent, and the other parameter indicating a pixel specified to be translucent by the parameter includes information for specifying the transmittance of a pixel in the other image, and wherein each pixel in the other image which is rendered translucent using the transmittance specified by the other parameter is provided as a pixel after superposition when the parameter specifies translucence, for a pixel.
7. An image processing method as claimed in claim 5, wherein the pixel which is obtained by superposing the pixel in the one image and the pixel in the other image are provided as a pixel after superposition when the parameter does not specify transparence.
8. An image processing method for superposing two images, each pixel of the two images being indicated by a plurality of parameters, in which information for specifying that each pixel is rendered translucent is included in one of the parameters indicating the pixel of one image, and information for specifying the transmittance of a corresponding pixel is included in the other image in at least one of other parameters indicating a specified pixel to be translucent by one of the parameters, and the pixel in the other image which is rendered translucent using the transmittance specified by one of the other parameters is provided as a pixel after superposition when the parameter specifies translucence.
9. An image processing method as claimed in claim 8, wherein the one of the parameters indicating a pixel of one image includes information for specifying that each pixel is rendered transparent or translucent, and wherein the pixel in the other image is provided as a pixel after superposition when the parameter specifies transparence.
10. An image processing method as claimed in claim 8, wherein the pixels which are obtained by superposing the pixels in the one image and the pixels in the other image are provided as pixels after superposition when the parameter does not specify translucence.
11. An image processing apparatus for superposing two images, each pixel of the two images being indicated by a plurality of parameters, in which the apparatus includes data input means for inputting image data of one image in which information for specifying that each pixel is rendered transparent or translucent is included in one of the parameters indicating the pixel and information for specifying the transmittance of a corresponding pixel in the other image is included in at least one of the other parameters indicating the specified pixel specified to be translucent by the one of the parameters, and image data of the other image, and processing means for outputting the pixel in the other image when the parameter specifies transparence, and outputting the pixel in the other image which is rendered translucent using the transmittance specified by the other parameters when the parameter specifies translucence, and outputting the pixel which is obtained by superposing the pixel in the one image and the pixel in the other image when the parameter specifies neither transparence nor translucence.
12. An image processing apparatus as claimed in claim 11, wherein the data input means provides the transmittances specified respectively by the other parameters as transmittances of different parameters indicating the pixel in the other image when there are two or more of the other parameters, and the processing means renders the different parameters, respectively, translucent using the respective transmittances specified by the other parameters when the pixel in the other image is rendered translucent.
13. An image processing apparatus as claimed in claim 11 or claim 12, wherein the processing means includes decision means for deciding whether each pixel is superposed in any one of the modes of transparence, translucence or overlay according to the parameter, selection means for selecting the other parameters when the superposition mode decided by the decision means is translucence and selecting a predetermined rate of image mixing when in other modes, and calculation means for outputting the pixel in the other image when the superposition mode is transparence, outputting the pixel in the other image which is rendered translucent using the transmittance specified by the parameters selected by the selection means when the superposition mode is translucence, and outputting the pixel which is obtained by superposing the pixel and the pixel in the other image using the rate of image mixing selected by the selection means when the superposition mode is overlay.
14. An image processing apparatus as claimed in claim 13, wherein the rate of image mixing is a predetermined constant.
15. An image processing apparatus including a plurality of stages of image processing apparatuses each claimed in any one of claims 11 to 14, wherein, in the second or subsequent stage image processing apparatus, the other image data input by the respective data input means is provided as an output of the fore-stage of the image processing apparatus and data of images to be further superposed on the image of the output is provided as the image data.
16. An image processing apparatus for superposing two images, each pixel of two images being indicated by a plurality of parameters, including data input means for inputting image data of one image including information for specifying that a pixel is rendered transparent in one of the parameters indicating a pixel and image data of the other image, and processing means for outputting a pixel in the other image when the parameter specifies transparence.
17. An image processing apparatus for superposing two images, each pixel of the two images being indicated by a plurality of parameters, including data input means for inputting image data in which information for specifying that a pixel is rendered translucent is included in one of the parameters indicating a pixel, and wherein information for specifying the transmittance of a pixel in the other image is included in other parameters indicating a pixel specified to be translucent by parameter, and image data of the other image, and processing means for outputting the pixel in the other image which is rendered translucent using the transmittance specified by the other parameters when the parameter specifies translucence.
17. An image processing method, as claimed in any one of claims 1, 5 or 8, substantially as described herein with reference to any one of Figs. 1 to 4 of the accompanying drawings.
18. An image processing apparatus, as claimed in any one of claims 11, 16, or 17, substantially as described herein with reference to any one of Figs. 1 to 4 of the accompanying drawings.
Amendments to the claims have been filed as follows

1. An image processing method for superposing two images, each pixel of the two images being indicated by a plurality of parameters, in which information for specifying that each pixel is rendered transparent or translucent is included in one of the parameters indicating each pixel of one image, and information for specifying the transmittance of a corresponding pixel in the other image is included in at least one of the other parameters indicating each specified pixel to be translucent by the one of the parameters, and including the steps of providing the pixel in the other image as a pixel after superposition when the parameter specifies transparence, and providing the pixel in the other image which is rendered translucent using the transmittance specified by one of the other parameters as a pixel after superposition when the parameter specifies translucence, and providing the pixel which is obtained by superposing the pixel in the one image and the pixel in the other image as a pixel after superposition when the parameter specifies neither transparence nor translucence for one pixel.
2. An image processing method as claimed in claim 1, wherein when there are two or more of the other parameters, the transmittances specified respectively by the parameters are provided as transmittances of different parameters indicating the pixel in the other image, and when the pixel in the other image is rendered translucent, the different parameters, respectively, are rendered translucent using the respective transmittances.

3. An image processing method as claimed in claim 1, wherein the pixel in the one image and the pixel in the other image are superposed at a specified rate of image mixing when the parameter specifies neither transparence nor translucence.
4. An image processing method as claimed in claim 3, wherein the specified rate of image mixing is a predetermined constant rate.
5. An image processing method for superposing two images, each pixel of the two images being indicated by a plurality of parameters, in which information for specifying that each pixel is rendered transparent is included in one of the parameters indicating each pixel of one image, and the pixel in the other image is provided as a pixel after superposition when the parameter specifies transparence for a pixel.
6. An image processing method as set forth in claim 5, wherein the one of the parameters indicating a pixel of one image includes information for specifying that the pixel is rendered transparent or translucent, and the other parameter indicating a pixel specified to be translucent by the parameter includes information for specifying the transmittance of a pixel in the other image, and wherein each pixel in the other image which is rendered translucent using the transmittance specified by the other parameter is provided as a pixel after superposition when the parameter specifies translucence, for a pixel.
7. An image processing method as claimed in claim 5, wherein the pixel which is obtained by superposing the pixel in the one image and the pixel in the other image are provided as a pixel after superposition when the parameter does not specify transparence.
8. An image processing method for superposing two images, each pixel of the two images being indicated by a plurality of parameters, in which information for specifying that each pixel is rendered translucent is included in one of the parameters indicating the pixel of one image, and information for specifying the transmittance of a corresponding pixel is included in the other image in at least one of other parameters indicating a specified pixel to be translucent by one of the parameters, and the pixel in the other image which is rendered translucent using the transmittance specified by one of the other parameters is provided as a pixel after superposition when the parameter specifies translucence.
9. An image processing method as claimed in claim 8, wherein the one of the parameters indicating a pixel of one image includes information for specifying that each pixel is rendered transparent or translucent, and wherein the pixel in the other image is provided as a pixel after superposition when the parameter specifies transparence.
10. An image processing method as claimed in claim 8, wherein the pixels which are obtained by superposing the pixels in the one image and the pixels in the other image are provided as pixels after superposition when the parameter does not specify translucence.
11. An image processing apparatus for superposing two images, each pixel of the two images being indicated by a plurality of parameters, in which the apparatus includes data input means for inputting image data of one image in which information for specifying that each pixel is rendered transparent or translucent is included in one of the parameters indicating the pixel and information for specifying the transmittance of a corresponding pixel in the other image is included in at least one of the other parameters indicating the specified pixel specified to be translucent by the one of the parameters, and image data of the other image, and processing means for outputting the pixel in the other image when the parameter specifies transparence, and outputting the pixel in the other image which is rendered translucent using the transmittance specified by the other parameters when the parameter specifies translucence, and outputting the pixel which is obtained by superposing the pixel in the one image and the pixel in the other image when the parameter specifies neither transparence nor translucence.
12. An image processing apparatus as claimed in claim 11, wherein the data input means provides the transmittances specified respectively by the other parameters as transmittances of different parameters indicating the pixel in the other image when there are two or more of the other parameters, and the processing means renders the different parameters, respectively, translucent using the respective transmittances specified by the other parameters when the pixel in the other image is rendered translucent.
13. An image processing apparatus as claimed in claim 11 or claim 12, wherein the processing means includes decision means for deciding whether each pixel is superposed in any one of the modes of transparence, translucence or overlay according to the parameter, selection means for selecting the other parameters when the superposition mode decided by the decision means is translucence and selecting a predetermined rate of image mixing when in other modes, and calculation means for outputting the pixel in the other image when the superposition mode is transparence, outputting the pixel in the other image which is rendered translucent using the transmittance specified by the parameters selected by the selection means when the superposition mode is translucence, and outputting the pixel which is obtained by superposing the pixel and the pixel in the other image using the rate of image mixing selected by the selection means when the superposition mode is overlay.
14. An image processing apparatus as claimed in claim 13, wherein the rate of image mixing is a predetermined constant.
15. An image processing apparatus including a plurality of stages of image processing apparatuses, each as claimed in any one of claims 11-14, wherein, in the second or subsequent stage image processing apparatus, the other image data input by the respective data input means is provided as an output of the fore-stage image processing apparatus and data of images to be further superposed on the image of the output is provided as the image data.
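The multi-stage arrangement of claim 15 chains identical blending stages so that each stage's output becomes the lower-priority input of the next, allowing N+1 images to be superposed with N stages. The sketch below uses a simplified constant-rate blend as a stand-in for the full transparent/translucent/overlay logic; the function names and rate are illustrative assumptions.

```python
# Sketch of the cascade in claim 15: the fore-stage output feeds each
# subsequent stage as its lower-priority image. A single simplified
# blend rule stands in for the full per-pixel mode selection.

def stage(upper, lower, rate=0.5):
    """One blending stage: mix an upper pixel over the current composite."""
    return tuple(round(u * rate + l * (1 - rate)) for u, l in zip(upper, lower))

def cascade(base, overlays):
    """Superpose each overlay pixel in turn on the fore-stage output."""
    out = base
    for upper in overlays:
        out = stage(upper, out)
    return out
```

In hardware terms, each `stage` call corresponds to one instance of the claimed apparatus, wired output-to-input.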
16. An image processing apparatus for superposing two images, each pixel of the two images being indicated by a plurality of parameters, including data input means for inputting image data of one image including information for specifying that a pixel is rendered transparent in one of the parameters indicating a pixel, and image data of the other image, and processing means for outputting a pixel in the other image when the parameter specifies transparence.
17. An image processing apparatus for superposing two images, each pixel of the two images being indicated by a plurality of parameters, including data input means for inputting image data in which information for specifying that a pixel is rendered translucent is included in one of the parameters indicating a pixel, and wherein information for specifying the transmittance of a pixel in the other image is included in other parameters indicating a pixel specified to be translucent by the parameter, and image data of the other image, and processing means for outputting the pixel in the other image which is rendered translucent using the transmittance specified by the other parameters when the parameter specifies translucence.
18. An image processing method, as claimed in any one of claims 1, 5 or 8, substantially as described herein with reference to any one of Figs. 1 to 4 of the accompanying drawings.
19. An image processing apparatus, as claimed in any one of claims 11, 16, or 17, substantially as described herein with reference to any one of Figs. 1 to 4 of the accompanying drawings.
GB9930710A 1998-12-28 1999-12-24 Image processing method and apparatus Expired - Fee Related GB2352950B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP10374509A JP3080087B2 (en) 1998-12-28 1998-12-28 Image processing method and apparatus

Publications (3)

Publication Number Publication Date
GB9930710D0 GB9930710D0 (en) 2000-02-16
GB2352950A true GB2352950A (en) 2001-02-07
GB2352950B GB2352950B (en) 2001-07-25

Family

ID=18503974

Family Applications (1)

Application Number Title Priority Date Filing Date
GB9930710A Expired - Fee Related GB2352950B (en) 1998-12-28 1999-12-24 Image processing method and apparatus

Country Status (2)

Country Link
JP (1) JP3080087B2 (en)
GB (1) GB2352950B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003225427A (en) * 2002-02-05 2003-08-12 Shinnichi Electronics Kk Picture display device for pachinko machine, and picture displaying method and picture displaying program for the picture display device
JP2008262102A (en) * 2007-04-13 2008-10-30 Mitsubishi Electric Corp Overlay image synthesizer
CN107820089B (en) * 2012-07-10 2020-05-19 索尼公司 Image decoding device, image encoding device, and image encoding method
CN111147770B (en) * 2019-12-18 2023-07-07 广东保伦电子股份有限公司 Multi-channel video window superposition display method, electronic equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0615223A1 (en) * 1993-03-10 1994-09-14 AT&T Corp. Method and apparatus for the coding and display of overlapping windows with transparency
US5500684A (en) * 1993-12-10 1996-03-19 Matsushita Electric Industrial Co., Ltd. Chroma-key live-video compositing circuit

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3904243B2 (en) * 1992-10-06 2007-04-11 セイコーエプソン株式会社 Image processing apparatus and method
JPH0728986A (en) * 1993-07-09 1995-01-31 Fuji Xerox Co Ltd Picture synthesis processor
JPH08272999A (en) * 1995-03-30 1996-10-18 Casio Comput Co Ltd Image controller


Also Published As

Publication number Publication date
JP2000194837A (en) 2000-07-14
JP3080087B2 (en) 2000-08-21
GB2352950B (en) 2001-07-25
GB9930710D0 (en) 2000-02-16


Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20091224