GB2369023A - Image data processing devices and methods for generating animations - Google Patents

Image data processing devices and methods for generating animations

Info

Publication number
GB2369023A
GB2369023A (application GB0129737A)
Authority
GB
United Kingdom
Prior art keywords
picture
color
image data
trace line
data processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB0129737A
Other versions
GB2369023B (en)
GB0129737D0 (en)
Inventor
Yushi Ihara
Yuji Matsumoto
Eiichi Ishii
Chizuru Makita
Kohei Nojiri
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp
Priority claimed from GB9907364A (published as GB2333678B)
Publication of GB0129737D0
Publication of GB2369023A
Application granted
Publication of GB2369023B
Anticipated expiration
Legal status: Expired - Fee Related

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00: Animation
    • G06T13/80: 2D [Two Dimensional] animation, e.g. using sprites
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/20: Drawing from basic elements, e.g. lines or circles
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/40: Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A trace line picture which is manually drawn with a pen is captured by a camera or a scanner and is supplied to a personal computer as an original line drawing. The personal computer carries out thinning processing on the supplied line drawing, and paints all pixels in a closed area surrounded by the thinned line with a color designated by an operator. Then, the computer thickens the original trace line and colors pixels on the thickened trace line with a predetermined color, in accordance with a command from the operator. The computer synthesizes the picture of the closed area surrounded by the thinned line and the thickened picture in accordance with a predetermined synthesis formula. Finally, the computer checks whether or not the pictures synthesized over several frames are displayed as an appropriate dynamic image, and colors them in the same way as the original frame.

Description

IMAGE DATA PROCESSING DEVICES AND METHODS
This invention relates to image data processing. A preferred embodiment of the invention described hereinbelow relates to an image data processing device and method which enable quick generation of image data of animation by using a computer.
Fig. 36 shows the flow of processing in producing a conventional animation.
First, at step S1, planning of production of an animation is carried out. On the basis of this planning, a scenario is prepared at step S2. Next, at step S3, original pictures are prepared. The original pictures are drawn on paper with a pencil or a pen.
At step S4, a dynamic image is prepared on the basis of the original pictures prepared at step S3. The dynamic image is prepared by interpolating between one original picture and the next original picture prepared at step S3, so that the sequence appears as a dynamic image.
At step S5, the dynamic image thus prepared is copied on a transparent film, thus preparing cell pictures. At this point, an image of one frame is constituted by images of a plurality of layers.
For example, a dynamic image made of three frames, showing a person and a dog walking together to a house, is constituted by three layers for each frame, as shown in Fig. 37. Specifically, in this example, a cell picture C in which the house is drawn is prepared as a background layer. The person and the dog are drawn in a layer A and a layer B, respectively. In the layer A, the person in a cell picture A2 of a frame F2 is drawn at a position slightly shifted from the position in a cell picture A1 of a frame F1. Similarly, the person in a cell picture A3 of a frame F3 is drawn at a position slightly shifted from the position in the cell picture A2 of the frame F2.
Also, in the layer B, the position of the dog in a cell picture B2 of the frame F2 is slightly shifted from the position in a cell picture B1 of the frame F1. The position of the dog in a cell picture B3 of the frame F3 is slightly shifted from the position in the cell picture B2 of the frame F2.
Thus, as the cell pictures of the plural layers are prepared, painting processing (coloring processing) on these cell pictures is carried out at step S6. That is, coloring processing on each cell picture is carried out by using paints. Next, at step S7, a plurality of colored cell pictures are superposed on one another and filmed, thus forming one frame.
For example, as shown in Fig. 38, one frame of image is formed by superposing the layer A and the layer B on the layer C. Such processing is sequentially carried out on a plurality of frames so as to produce dynamic image data of a plurality of frames.
At step S8, the dynamic image thus prepared is reproduced and checked. After that, at step S9, sound signals corresponding to the dynamic image are recorded. The ultimately produced film is telecast at step S10.
In this manner, in the case where cell pictures are colored by manual operation, it is troublesome and time-consuming to produce animation images. Thus, it has been considered to computerize the foregoing operation. In this case, an already colored picture and an uncolored picture can be confirmed by preparing alpha key data and displaying a picture corresponding to the alpha key data.
In such conventional animation producing technique, preparation of a dynamic image, preparation of cell pictures, painting processing, filming processing and the like are carried out manually. Therefore, it is troublesome and time-consuming to produce one animation film.
Also, in the conventional animation producing technique, when the manual operation for superposing a plurality of cell pictures for each layer and filming the superposed pictures is repeated to produce a dynamic image made of a plurality of frames, it is difficult to change frame pictures.
Moreover, in the conventional animation producing technique, when cell pictures are manually colored and filmed, it is troublesome and time-consuming to produce one animation film. As a result, it is difficult to change an already prepared picture to a picture full of variety.
On the other hand, it has been proposed to computerize coloring processing, unlike the foregoing manual animation producing technique. However, when coloring processing is carried out by using a computer, it is difficult for the writer to express fine touches as in manual drawing with a pencil or a pen. Particularly, in the case of animation pictures, fine touches expressed by manual drawing are desired but it is difficult to express such fine touches by a computer.
A preferred embodiment of the present invention described hereinbelow seeks to provide a device which enables quick and easy production of animation pictures. The preferred embodiment of the present invention seeks also to provide a device which enables generation of animation image data as intended by the writer. The preferred embodiment of the present invention seeks further to improve operability in preparing animation pictures.
According to one aspect of the present invention there is provided an image data processing device for generating a frame picture by synthesizing an arbitrary number of captured cell pictures for an arbitrary number of layers, the device comprising: colored picture generation means for coloring, with a predetermined color, uncolored pixels of an area surrounded by a trace line of the captured cell picture, thus generating a colored picture; key data generation means for generating key data for identifying a colored area and an uncolored area of the colored picture in accordance with coloring processing by the colored picture generation means; parameter setting means for setting a parameter prescribing the priority in synthesizing the colored pictures of a plurality of layers; and synthesis means for synthesizing the colored pictures of a plurality of layers in accordance with the parameter set by the parameter setting means and the key data generated by the key data generation means.
According to another aspect of the present invention there is provided an image data processing method for generating a frame picture by synthesizing an arbitrary number of captured cell pictures for an arbitrary number of layers, the method comprising: a colored picture generation step of coloring, with a predetermined color, uncolored pixels of an area surrounded by a trace line of the captured cell picture, thus generating a colored picture; a key data generation step of generating key data for identifying a colored area and an uncolored area of the colored picture in accordance with coloring processing at the colored picture generation step; a parameter setting step of setting a parameter prescribing the priority in synthesizing the colored pictures of a plurality of layers; and a synthesis step of synthesizing the colored pictures of a plurality of layers in accordance with the parameter set at the parameter setting step and the key data generated at the key data generation step.
In the image data processing device and method of the invention, colored pictures of a plurality of layers are synthesized in accordance with a parameter and key data.
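As a rough illustration only, this synthesis can be pictured as priority-ordered compositing gated by the key data: layers are drawn from lowest to highest priority, and a layer's pixel replaces the frame pixel wherever the key marks it as colored. The Python sketch below is an assumed interpretation, not the patent's actual synthesis formula (that formula is described later, with reference to Fig. 30), and all names in it are hypothetical.

    # A minimal sketch, assuming binary key data (0 = uncolored, 255 = colored)
    # and priority values where a larger number means "drawn on top".
    def synthesize_layers(layers):
        """layers: list of (priority, color, key); color holds (R, G, B) tuples
        and key holds 0 or 255, both as 2-D lists of identical size."""
        ordered = sorted(layers, key=lambda layer: layer[0])  # low priority first
        height, width = len(ordered[0][1]), len(ordered[0][1][0])
        frame = [[(255, 255, 255)] * width for _ in range(height)]  # white base
        for _, color, key in ordered:
            for row in range(height):
                for col in range(width):
                    if key[row][col] == 255:   # colored pixel of this layer wins
                        frame[row][col] = color[row][col]
        return frame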
In an image data processing device and image data processing method according to the preferred embodiment of the present invention, a line drawing is generated from a picture taken therein, and a colored picture and a line drawing picture are synthesized.
The image data processing device may include: cell number designation means for designating a cell number of a cell picture to be taken therein; layer number designation means for designating a layer number of the cell picture to be taken therein; and display control means for displaying the cell number and the layer number of a cell picture which is already taken therein.
It is preferred that such an image data processing device include cell picture display control means for displaying the cell picture taken therein.
In such image data processing device, and in an image data processing method which realizes the same features, the cell number and the layer number of a cell picture to be taken therein are designated, and the cell number and the layer number of a cell picture which is already taken therein are displayed.
The image data processing device may include: time sheet display control means for displaying a time sheet prescribed by frame numbers corresponding to time series of frames of a dynamic image and layer numbers; and cell number input means for inputting the cell number of a cell picture taken therein, at a predetermined position on the time sheet.
The image data processing device may include registration means for registering, for each layer, the cell picture taken therein onto an entry list, and the cell number input means may input the cell number registered on the entry list.
Also, it is preferred that the image data processing device include registered picture display control means for displaying the cell picture registered on the entry list, for each layer, in the order of cell number.
In addition, it is preferred that the image data processing device include special effect setting means for setting a special effect for each layer, cell or frame.
It is also preferred that the image data processing device include synthesis means for synthesizing, for each frame, the cell picture of each layer number inputted on the time sheet.
It is also preferred that the image data processing device include dynamic image display control means for tentatively displaying the picture synthesized by the synthesis means, as a dynamic image.
In such an image data processing device, and in an image data processing method which realizes the same features, a time sheet prescribed by the frame numbers and the layer numbers is displayed, and the cell number of the cell picture taken therein is inputted at a predetermined position on the time sheet. Thus, it is possible to constitute and change each frame of picture easily and securely.
The image data processing device may include: detection means for detecting the density of a trace line; and determination means for determining the color in the vicinity of the trace line in accordance with the density of the trace line.
It is preferred, in this case, that the image data processing device include line coloring means for coloring the trace line, and that the determination means gradates the color of a boundary portion between the trace line and the area surrounded by the trace line by using the color of the trace line and the color of the area surrounded by the trace line in accordance with the density of the trace line.
It is also preferred that such an image data processing device include identification means for identifying the original color of the trace line, and that the determination means determine the color in accordance with the result of identification of the identification means.
The image data processing device may include thickening means for thickening the trace line by predetermined pixels.
The image data processing device may include: identification means for identifying the original color of a trace line; and determination means for determining the color of a boundary portion between the trace line and the area surrounded by the trace line or the color of a boundary portion between an area on one side and an area on the other side of the trace line in the case where the trace line is omitted, in accordance with the result of identification by the identification means.
In an image data processing device having such features, and in an image data processing method having the same features, the color of a boundary portion between the trace line and the area surrounded by the trace line or the color of a boundary portion between an area on one side and an area on the other side of the trace line in the case where the trace line is omitted, is determined in accordance with the result of identification of the original color of the trace line.
The image data processing device may include: binary expression means for expressing respective pixels of a trace line in a binary form of colored pixels and colorless pixels; and conversion means for converting the trace line to a line consisting of colored pixels having a width of one pixel.
It is preferred that such an image data processing device include coloring means for coloring a colorless pixel of an area surrounded by the line of the width of one pixel with a predetermined color.
In an image data processing device having such features, and in an image data processing method having the same features, a trace line is expressed in a binary form and then converted to a line consisting of colored pixels having a width of one pixel.
The image data processing device may include: binary expression means for expressing respective pixels of a trace line in a binary form of colored pixels and colorless pixels; and confirmation means for confirming whether or not a trace line consisting of colored pixels expressed in the binary form by the binary expression means is closed.
It is preferred that such an image data processing device include conversion means for converting the trace line consisting of colored pixels to a trace line having a width of one pixel, and that the confirmation means confirm that the trace line having the width of one pixel is closed.
It is also preferred that such an image data processing device include correction means for correcting an open part so as to form a closed area, when the confirmation means has confirmed that a part of the trace line is open.
In an image data processing device having such features, and in an image data processing method having the same features, pixels of a trace line are expressed in a binary form of colored pixels and colorless pixels, and it is confirmed whether a trace line consisting of colored pixels expressed in the binary form is closed or not.
The image data processing device may include: colored picture generation means for generating a colored picture obtained by coloring a predetermined area of a line drawing; identification picture generation means for generating an identification picture for identifying a colored area and an uncolored area of the colored picture; and display control means for displaying the colored picture on the identification picture.
It is preferred that such an image data processing device include extraction means for extracting the color of a predetermined area of the colored picture displayed on the identification picture, and coloring means for coloring the uncolored area of the colored picture with the color extracted by the extraction means.
In an image data processing device having such features, and in an image data processing method having the same features, a colored picture is displayed on an identification picture for identifying a colored area and an uncolored area of the colored picture.
The image data processing device may include: discrimination means for discriminating the corresponding relation between a first picture of a first frame and a second picture of a second frame; detection means for detecting the color of the first picture; and coloring means for coloring the second picture with the color detected by the detection means in accordance with the result of discrimination by the discrimination means.
It is preferred that such an image data processing device include selection means for selecting coloring of only a designated area or coloring of all of plural corresponding areas.
In an image data processing device having such features, and in an image data processing method having the same features, the corresponding relation between a first picture of a first frame and a second picture of a second frame is discriminated, and the color of the first picture is detected. Then, in accordance with the result of discrimination, the second picture is colored with the detected color.
The invention will now be further described, by way of illustrative example, with reference to the accompanying drawings, in which:
Fig. 1 is a block diagram showing an exemplary structure of a personal computer to which the present invention is applied.
Fig. 2 shows the flow of animation production according to an embodiment of the present invention.
Fig. 3 is a flowchart for explaining detailed processing of steps S11 to S13 of Fig. 2.
Fig. 4 shows the structure of a file.
Fig. 5 is a flowchart for explaining generation of data.
Fig. 6 shows an exemplary GUI (graphical user interface) for scan processing.
Fig. 7 is a flowchart for explaining scan processing.
Fig. 8 is a flowchart for explaining filtering processing.
Fig. 9 shows an exemplary GUI for original data processing.
Fig. 10 shows a trace line which is taken in the device.
Fig. 11 is a flowchart showing thinning processing.
Fig. 12 is a view for explaining processing of Fig. 11.
Fig. 13 shows an exemplary fixed palette for thinned data.
Fig. 14 shows an exemplary GUI for line processing.
Fig. 15 is a view for explaining an area surrounded by a line.
Fig. 16 is a view for explaining an area surrounded by a line.
Fig. 17 is a flowchart for explaining line drawing check processing.
Fig. 18 shows an exemplary GUI for painting processing.
Fig. 19 is a flowchart for explaining automatic coloring processing.
Fig. 20 is a view for explaining operation for detecting a corresponding closed area.
Fig. 21 is a view for explaining operation for detecting a corresponding closed area.
Fig. 22 is a view for explaining operation for detecting a corresponding closed area.
Fig. 23 shows an exemplary lookup table for paint data.
Fig. 24 shows an exemplary fixed palette of alpha key data.
Fig. 25 shows an exemplary GUI for alpha key processing.
Fig. 26 is a flowchart for explaining alpha key data processing.
Fig. 27 is a flowchart for explaining thickening processing.
Fig. 28 shows an exemplary GUI for trace line processing.
Fig. 29 shows an exemplary GUI for final processing.
Fig. 30 is a flowchart for explaining synthesis processing.
Figs. 31A to 31F are views for explaining synthesis processing.
Fig. 32 shows an exemplary GUI for time sheet processing.
Fig. 33 is a flowchart for explaining time sheet processing.
Fig. 34 is a flowchart for explaining time sheet processing.
Fig. 35 is a flowchart for explaining synthesis processing of a plurality of layers.
Fig. 36 shows the flow of conventional animation production.
Fig. 37 is a view for explaining the structure of a frame.
Fig. 38 is a view for explaining synthesis of layers.
Preferred embodiments of the present invention will now be described with reference to the drawings.
Fig. 1 shows an exemplary structure of a personal computer to which the present invention is applied. In the example, a personal computer 17 has a CPU 1 for executing various processing. In a ROM 2, a program necessary for the CPU 1 to execute various processing is stored. In a RAM 3, data and a program necessary for the CPU 1 to execute various processing are suitably stored.
A CRT controller 4 is controlled by the CPU 1 and causes a CRT 5 to display a predetermined picture. When the user operates a joy stick 7, a joy stick controller 6 outputs its operation signal to the CPU 1. When the user operates a mouse 9, a mouse controller 8 outputs its operation signal to the CPU 1.
A floppy disk (FD) controller 10 carries out processing for recording or reproducing data to or from a floppy disk 11. Similarly, a hard disk (HD) controller 12 carries out processing for recording or reproducing data to or from a hard disk 13.
A keyboard controller 15 outputs an input from a keyboard 16 to the CPU 1.
To this personal computer 17, a video camera 21, a scanner 22, a video tape recorder 23 and a disk recorder 24 are connected through an input/output interface 14.
In the personal computer 17 shown in Fig. 1, the CPU 1, ROM 2, RAM 3 and various controllers 4, 6, 8, 10 and 12 are connected to a system bus 30.
Fig. 2 shows the flow of animation production according to an embodiment of the present invention; processing corresponding to processing of Fig. 36 is denoted by the same numeral. Specifically, in Fig. 2, too, processing from planning at step S1 to preparation of a dynamic image at step S4, as well as sound recording at step S9 and telecast at step S10, is carried out similarly to the case of Fig. 36.
In this embodiment of the present invention, processing from steps S11 to S13, which is carried out after preparation of a dynamic image at step S4 and before sound recording at step S9, differs from the processing of the conventional case. This entire processing is carried out on the computer.
At step S11, the dynamic image prepared at step S4 is taken in by the video camera 21 or the scanner 22. Then, at step S12, painting processing is carried out on the image taken in at step S11. This painting processing is carried out on the computer, not by manual operation using paints. The image on which painting has been completed is reproduced along a time sheet and checked at step S13. After the check of the dynamic image at step S13, sound recording is carried out at step S9 and the ultimately produced film is telecast at step S10.
Fig. 3 shows detailed processing of steps S11 to S13 of Fig. 2. First, at step S21, the prepared dynamic image is taken in by using the scanner 22 or the video camera 21. This intake processing is carried out for each layer or cut of the dynamic image, and each layer or cut is registered on the file.
At step S22, line drawing processing of the image taken in at step S21 is carried out. By this line drawing processing, the image taken in is expressed in a binary form for painting. At this point, as will be later described in detail, the line drawing is thinned so that its trace line is expressed with a width of one pixel. Then, it is checked whether or not a closed area is formed by a line having the width of one pixel.
Painting processing at step S24, as will be later described, is carried out with this closed area as a unit.
Next, the processing goes to step S23, where a necessary color for painting is prepared on a palette. At step S24, the line drawing is colored by using the palette.
When one area closed by the line having the width of one pixel is designated and then a color for coloring the area is designated, pixels within the area are automatically colored with the designated color.
At step S25, it is checked whether or not an uncolored area exists among the respective areas. If an uncolored area exists (i.e., if a non-painted area exists), processing for coloring that area is carried out.
At step S26, a predetermined color is put on a trace line of an original color. The color of the original trace line forms a kind of command, as will be later described, and is not necessarily coincident with the color to be displayed as an animation picture. In this example, the trace line is colored with the color to be displayed as an animation picture.
At step S27, the picture obtained as a result of painting processing at step S24 and the picture of the trace line obtained by coloring processing at step S26 are synthesized. The picture thus synthesized is saved as a file, with each cell picture as a unit, at step S28.
In this manner, when a plurality of cell pictures are saved, time sheet processing is carried out at steps S29 to S31. In this time sheet processing, first, the saved cell pictures are registered on an entry list. After registration on the entry list, the cell pictures are arranged on the time sheet by using the cell numbers on the entry list. Thus, each frame of picture can be prescribed by pictures of a desired number of layers. Also, in this time sheet processing, special effects can be set if necessary.
The frames thus prescribed on the time sheet can be previewed as a dynamic image (i.e., tentatively displayed as a dynamic image) and confirmed.
The structure of the file generated by carrying out the foregoing processing will now be described with reference to Fig. 4. Animation data is constituted by data of a plurality of cuts. A cut mentioned here means a group of continuous pictures.
Each cut has a management file, expressed by appending an extension of ".TTT" to the cut name and cut number; a bit map file, expressed by appending an extension of ".BMP" to the layer and cell numbers; and a work file, suitably prepared in the process and expressed by appending an extension of ".WIG" to the layer and cell numbers. In addition, each cut has a TARGA file, expressed by appending an extension of ".TGA" to the layer and cell numbers; a time sheet file, expressed by appending an extension of ".TSH" to the cut name and cut number; and an animation file, expressed by appending an extension of ".AVI" to the cut name and cut number.
The management file having the extension of ".TTT" includes number-of-list data, file name list data, and color palette data. The file name list data has an extension of ".BMP", and if the number thereof is n, the number of list data is n.
The color palette file is expressed by appending an extension of ".PAL" to the layer and cell numbers. In this color palette, the color components and names of up to 256 colors are saved.
The work file having the extension of ".WIG" has picture width and height information, information representing the contents of a special effect, 24-bit original information as a picture obtained by correcting a scanner image, an 8-bit line drawing picture as a picture identified as a line drawing, an 8-bit painted picture as a colored picture, an 8-bit alpha picture for discriminating a colored portion, an 8-bit trace picture as a picture obtained by coloring a trace line, and a 24-bit final picture as a synthesized picture.
The TARGA (RTM) file expressed by the extension of ".TGA" includes 32-bit data constituted by 24-bit final image data and 8-bit final key data. The time sheet file expressed by the extension of ".TSH" includes the number of registered pictures, the number of frames, the maximum number of layers, the resolution of rendering, the name of the registered picture list, and time sheet data. The time sheet data indicates which layer should be superposed at which time. The background picture is treated as a picture of one layer.
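As an illustration of the naming convention listed above, the following Python sketch builds the per-cell and per-cut file names; the helper names are hypothetical and the code simply restates the extensions described in the text.

    # Hypothetical helpers restating the file-naming scheme, e.g. the first
    # cell of layer A becomes "A0001.BMP" (see Fig. 4).
    def cell_file_names(layer: str, cell: int) -> dict:
        stem = f"{layer}{cell:04d}"
        return {
            "bitmap":  f"{stem}.BMP",  # scanned cell picture
            "work":    f"{stem}.WIG",  # intermediate work file
            "targa":   f"{stem}.TGA",  # 24-bit final image plus 8-bit final key
            "palette": f"{stem}.PAL",  # color palette of up to 256 colors
        }

    def cut_file_names(cut_name: str, cut_number: int) -> dict:
        stem = f"{cut_name}{cut_number}"
        return {
            "management": f"{stem}.TTT",
            "time_sheet": f"{stem}.TSH",
            "animation":  f"{stem}.AVI",
        }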
Fig. 5 shows the flow of data which is generated when various processing is carried out. Each processing step will now be described with reference to this flowchart.
At step S41, a dynamic image drawn on paper by the writer using a pencil or a pen is taken in by the scanner 22. For example, when a command of scan processing is inputted by operating the mouse 9, the CPU 1 causes the CRT 5 to display an image as shown in Fig. 6 as a GUI (graphical user interface) for scan processing. Then, in accordance with this GUI, the user carries out scan processing as shown in the flowchart of Fig. 7.
First, at step S71, the user inputs the name of cut to a display section 701 of the GUI of Fig. 6. This input of the name is carried out by operating the keyboard 16. Next, the processing goes to step S72, where it is discriminated whether or not the user designates the layer number and the cell number by a cursor. If the user does not designate the layer number and the cell number by a cursor, the processing goes to step S73, where the layer number is inputted and displayed on a display section 702 of the GUI of Fig. 6 by operating the keyboard 16. At step S74, the cell number is inputted and displayed on a display section 703 by operating the keyboard 16.
If it is discriminated at step S72 that the user designates the layer number and the cell number by a cursor, the processing goes to step S75, where the user operates a cursor key of the keyboard 16 or the mouse 9 to shift a cursor 709 displayed in a read confirmation window 708 of the GUI of Fig. 6 to a predetermined position. Specifically, in the read confirmation window 708, cell numbers expressed by numerals are shown on the lateral axis and layer numbers expressed by alphabetic characters are shown on the longitudinal axis. Thus, the layer number and the cell number can be designated by shifting the cursor 709 to a predetermined position prescribed by the layer number and the cell number. In this example, though alphabetic characters are used as the layer numbers, such characters are described as numbers as a matter of convenience. Of course, the layer number may be expressed by numerals instead of alphabetic characters.
After the name of cut, the layer number and the cell number are designated in the foregoing manner, when the user presses a button 704 at step S76, the CPU 1 outputs a control signal to the scanner 22 through the input/output interface 14, thus executing scan processing. Then, when the scanner 22 scans one cell picture to receive the image taken in, the CPU 1 outputs the cell picture to a picture display window 707 and causes the picture display window 707 to display the cell picture. At step S77, the CPU 1 causes a mark 710 to be displayed at the position prescribed by the layer number and the cell number in the read confirmation window 708. This enables the user to quickly recognize the layer number and the cell number of the image already taken in.
Scanning may be carried out before the name of cut, the layer number and the cell number are inputted.
Next, the processing goes to step S78, where it is discriminated whether or not all the necessary dynamic images are taken in. If there is any dynamic image left to be taken in, the processing returns to step S71 and the subsequent processing is repeated. On completion of intake of all the dynamic images, the processing ends.
As shown in the GUI of Fig. 6, the image taken in can be rotated by +90 degrees or -90 degrees in accordance with the setting in a display section 705. Also, contrast correction or gamma correction can be carried out by suitably setting correction items associated with a filter displayed in a display section 706.
By thus carrying out scan processing, 24-bit original image data of full colors is obtained at step S42 of Fig. 5. As described above, this image data is filed with an extension of ".BMP" appended thereto. This file is recorded on the hard disk 13. As shown in Fig. 4, in the case where the file is scanned as a first cell picture of a layer A, it is recorded on the hard disk 13 with a file name of A0001.BMP.
At step S43, lightness data, chromaticity data and chroma data are calculated from 24-bit full-color bit map data of RGB. At step S44, filtering processing is carried out. Detailed filtering processing is shown in Fig. 8.
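The patent does not give its formulas for lightness, chromaticity and chroma. As a rough stand-in, the Python sketch below derives analogous per-pixel values from RGB using a standard HSV conversion scaled to 0..255; treat this as an assumption, not the device's actual arithmetic.

    import colorsys

    def lightness_chromaticity_chroma(r, g, b):
        """r, g, b in 0..255; returns (lightness, chromaticity, chroma) in
        0..255, mapping HSV value to lightness, hue to chromaticity, and
        saturation to chroma."""
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        return v * 255.0, h * 255.0, s * 255.0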
First, at step S91, it is discriminated whether a noted pixel is white or not. If the value of lightness data of the noted pixel is greater than a predetermined threshold value and the values of chromaticity data and chroma data are smaller than predetermined threshold values, it is discriminated that the noted pixel is white. In this case, the processing goes to step S95, where the color of white is expressed by setting predetermined bits of the 8-bit data.
If it is discriminated at step S91 that the noted pixel is not white, the processing goes to step S92, where it is discriminated whether the noted pixel is black or not. If the values of lightness data, chromaticity data and chroma data are smaller than predetermined threshold values, respectively, it is discriminated that the noted pixel is black. In this case, the processing goes to step S96, where the color of black is set by setting predetermined bits of the 8-bit data.
If it is discriminated at step S92 that the noted pixel is not black, the processing goes to step S93, where it is discriminated whether the noted pixel is red or not.
Whether the noted pixel is red or not is discriminated in accordance with whether the values of lightness data, chromaticity data and chroma data of the noted pixel are within the range of threshold values of red. If the noted pixel is red, the color of red is set by setting predetermined bits of the 8-bit data at step S97.
If it is discriminated at step S93 that the noted pixel is not red, the processing goes to step S94, where it is discriminated whether the noted pixel is green or not. In this discrimination, too, if the values of lightness data and chromaticity data of the noted pixel are within the range of predetermined threshold values of green, it is discriminated that the noted pixel is green. In this case, the processing goes to step S98, where the color of green is set by setting predetermined bits of the 8-bit data.
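Steps S91 to S98 amount to a per-pixel threshold classifier. A minimal sketch follows, reusing the 0..255 values of the previous sketch; every numeric threshold is a hypothetical placeholder, since the patent states no values, and the palette numbers follow the fixed palette of Fig. 13.

    # Palette numbers from the fixed palette (Fig. 13).
    BLACK, RED, GREEN, WHITE = 0, 2, 4, 7

    def classify_pixel(lightness, chromaticity, chroma):
        if lightness > 200 and chromaticity < 20 and chroma < 20:
            return WHITE                 # bright and nearly achromatic
        if lightness < 60 and chroma < 20:
            return BLACK                 # dark and nearly achromatic
        if chroma > 80 and (chromaticity < 20 or chromaticity > 235):
            return RED                   # hue within the (assumed) red band
        if chroma > 80 and 70 < chromaticity < 120:
            return GREEN                 # hue within the (assumed) green band
        return WHITE                     # anything else treated as background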
On completion of filter processing in this manner, the processing goes to step S45 of Fig. 5 and thinning processing is carried out. In this case, the CPU 1 causes the CRT 5 to display a GUI as shown in Fig. 9. In this GUI, the pictures taken in for designated layers are sequentially displayed in a layer picture display window 902. By operating a button 903, the displayed cell pictures can be scrolled to the right or left. In a picture display window 901, a designated picture (e.g., the leftmost cell picture in Fig. 9) is displayed from among the cell pictures displayed in the layer picture display window 902.
In a window 904, the cell picture immediately before the cell picture displayed in the picture display window 901 is displayed. In a window 905 below the window 904, the same cell picture as displayed in the picture display window 901 is displayed in a small size. By designating a predetermined position on the picture displayed in the window 905, the designated portion can be displayed in an enlarged manner in the picture display window 901.
Buttons 906 are operated for setting any one of original (Org), line (Line), paint (Paint), alpha (Alpha), trace (Trace), and final (Final) modes. In the example of Fig. 9, the original button is operated.
A button 907 is operated for reading the original image data which is taken in by the scanner 22 and recorded on the hard disk 13. A button 908 is operated for reading with an emphasis on the line drawing. In a window 909, the brush shape is displayed, and size of the brush can be changed by operating a button 910. Using the brush of the designated size, noise components of the line drawing displayed in the picture display window 901 can be removed.
A button 911 is operated for preparing a filter for carrying out arithmetic operation to convert 24-bit full-color image data to 8-bit color image data. A button 912 is operated for finding thinned data from the original image data displayed in the picture display window 901.
In a color selection window 914, a color palette is displayed which is used for coloring the picture displayed in the picture display window 901. When a predetermined color is designated from among colors on the color palette displayed in the color selection window 914, that designated color is displayed in a color setting window 913. By suitably setting and adjusting buttons of the color setting window 913, the color on one palette of the color selection window 914 can be set to a desired value (color).
When the user operates the button 912 of Fig. 9, thinning processing is carried out at step S45 of Fig. 5. This thinning processing is the processing for converting a trace line having a predetermined thickness to a line having a width of one pixel.
Specifically, since a trace line constituting a line drawing as an original picture is manually drawn with a pen or a pencil, it has an unstable thickness (width). When this line drawing is taken in by the scanner 22 to form bit map image data, the trace line is expressed by a plurality of dots as shown in Fig. 10. In the example shown in Fig. 10, thinning processing is conversion to a line consisting of a total of 13 pixels at the center, that is, the pixels (H8, V1), (H8, V2), (H8, V3), (H7, V4), (H7, V5), (H6, V6), (H6, V7), (H5, V8), (H5, V9), (H4, V10), (H4, V11), (H3, V12) and (H2, V13).
This thinning processing is the processing for changing a black pixel in contact with a white pixel, to a white pixel by each thinning processing. In other words, the trace line is thinned by deleting a black pixel in contact with a white pixel. The trace line is thinned every time thinning processing is carried out. By repeating thinning processing for plural times, a line having a width of one pixel can be ultimately obtained.
In the case where one picture is constituted by 640x480 (= 307200) pixels, these pixels are expressed in a binary form, that is, white pixels and pixels of the other colors (black, red or green pixels), by filtering processing at the above-described step S44. In this case, the white pixels are referred to as colorless pixels, and pixels of the other colors are referred to as colored pixels.
Referring to Fig. 11, thinning processing will now be described in detail. First, at step S111, it is discriminated whether a noted pixel is a colored pixel or not. If the noted pixel is a colorless pixel, the processing goes to step S116, where the noted pixel is changed to the next pixel. Then, the processing returns to step S111 and it is discriminated whether that noted pixel is a colored pixel or not.
If it is discriminated at step S111 that the noted pixel is a colored pixel, the processing goes to step S112 and it is discriminated whether or not the noted pixel is a pixel continuous to other colored pixels. Specifically, as shown in Fig. 12, it is checked whether the eight pixels in contact with the noted pixel at the center are colorless pixels or colored pixels. If two or more of the eight pixels are colored pixels, it is discriminated that the noted pixel is a pixel continuous to colored pixels. On the other hand, if the noted pixel is expressed as a colored pixel in the binary form because of noise or the like, none of the eight pixels in contact with the noted pixel is a colored pixel (i.e., it is an isolated pixel), and therefore it is discriminated that this noted pixel is a pixel which is not continuous to other colored pixels.
If it is discriminated at step S112 that the noted pixel is not continuous to other colored pixels, the processing goes to step S115 and the noted pixel is changed to a colorless pixel. That is, the noise is removed. Then, the noted pixel is changed to the next pixel at step S116 and the processing returns to step S111.
If it is discriminated at step S112 that the noted pixel is continuous to other colored pixels, the processing goes to step S113 and it is discriminated whether or not the noted pixel is in contact with a colorless pixel. Specifically, if all of the two pixels in the vertical direction and the two pixels in the horizontal direction which are adjacent to the noted pixel at the center, that is, four pixels in total, are colored pixels, it can be discriminated that the noted pixel is a pixel within a trace line which is not in contact with a colorless pixel. Thus, in this case, the processing goes to step S116, where the noted pixel is changed to the next pixel. The processing then returns to step S111.
If it is discriminated at step S113 that any one of the four pixels in contact with the noted pixel in the horizontal and vertical directions is a colorless pixel, it is discriminated that the noted pixel is a pixel in contact with a colorless pixel, and the processing goes to step S114.
In the example shown in Fig. 12, of the pixels P2 and P6 in contact with the noted pixel PX in the vertical direction and the pixels P0 and P4 in contact in the horizontal direction, the pixels P4 and P2 are colorless pixels. Therefore, the processing goes to step S114.
At step S114, it is discriminated whether or not the noted pixel is a thinned pixel. Specifically, first, it is checked whether or not all the pixels in a first pixel set consisting of pixels P0, P1 and P2 located on the upper right side of the noted pixel PX, a second pixel set consisting of pixels P2, P3 and P4 located on the upper left side of the noted pixel, a third pixel set consisting of pixels P4, P5 and P6 located on the lower left side of the noted pixel, or a fourth pixel set consisting of pixels P6, P7 and P0 located on the lower right side of the noted pixel PX are colorless pixels. That is, it is checked whether or not all the pixels in any one of the first to fourth pixel sets are colorless pixels.
Then, if all the pixels in any one pixel set are colorless pixels, it is checked whether or not a colorless pixel is included in the pixel set located on the opposite side.
If a colorless pixel is not included in the pixel set on the opposite side, it can be discriminated that this noted pixel is a pixel located at the edge of a trace line. If a colorless pixel is included in the pixel set on the opposite side, it can be discriminated that the noted pixel is a thinned pixel (i. e. , a pixel of a line having a width of one pixel).
In the example of Fig. 12, all the pixels of the second set (pixels P2, P3 and P4) are colorless pixels. Also, no colorless pixel is included in the fourth set of pixels (pixels P6, P7 and P0) on the side opposite to the second set of pixels. Thus, in such a case, it is discriminated at step S114 that the noted pixel is a pixel located at the edge of a trace line. That is, it is discriminated that this noted pixel has not yet been thinned.
If it is discriminated at step S114 that the noted pixel is not a thinned pixel (i.e., if it is discriminated that the noted pixel is a pixel located at the edge of a trace line), the processing goes to step S115, where the noted pixel is changed to a colorless pixel.
Thus, one colored pixel located at the edge of the trace line is changed to a colorless pixel. As a result, the trace line is thinned by one pixel. Next, the noted pixel is changed to the next pixel at step S116 and the processing returns to step S111.
If it is discriminated at step S114 that the noted pixel is a thinned pixel (i.e., if it is discriminated that the noted pixel is a pixel of a line having a width of one pixel), the processing goes to step S117, where it is discriminated whether or not all the pixels are thinned pixels. If not all the pixels are thinned pixels, the processing returns to step S111 and the subsequent processing is repeated. As this thinning processing is repeated, a line having a width of one pixel can be obtained.
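The neighbour tests of Figs. 11 and 12 can be condensed into one pass of an iterative loop. The Python sketch below is a simplified, non-authoritative reading: it treats the image as a boolean grid (True = colored), skips the one-pixel border for brevity, and writes each pass to a copy rather than updating pixels in place.

    def thinning_pass(img):
        """One thinning pass; repeat until the second return value is False."""
        height, width = len(img), len(img[0])
        # Offsets for P0..P7: right, upper-right, up, upper-left, left,
        # lower-left, down, lower-right (as laid out in Fig. 12).
        offs = [(0, 1), (-1, 1), (-1, 0), (-1, -1),
                (0, -1), (1, -1), (1, 0), (1, 1)]
        quads = [(0, 1, 2), (2, 3, 4), (4, 5, 6), (6, 7, 0)]  # UR, UL, LL, LR
        out = [row[:] for row in img]
        changed = False
        for r in range(1, height - 1):
            for c in range(1, width - 1):
                if not img[r][c]:
                    continue                      # colorless pixel: skip (S111)
                p = [img[r + dr][c + dc] for dr, dc in offs]
                if sum(p) < 2:                    # isolated pixel: remove noise
                    out[r][c] = False; changed = True
                    continue
                if p[0] and p[2] and p[4] and p[6]:
                    continue                      # interior pixel, not an edge
                for i, quad in enumerate(quads):
                    opposite = quads[(i + 2) % 4]
                    # One side entirely colorless with the opposite side fully
                    # colored marks an edge pixel, deleted this pass (S115).
                    if not any(p[j] for j in quad) and all(p[j] for j in opposite):
                        out[r][c] = False; changed = True
                        break
        return out, changed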
By thus thinning the trace line before painting processing, generation of a non-painted portion can be prevented.
In the embodiment of the present invention, thinned data is expressed by eight bits. That is, the 24-bit pixel data obtained at step S42 is processed by the filter prepared by operating the button 911 of Fig. 9, and 8-bit thinned data is obtained at step S46.
In this embodiment, a lookup table (LUT) as shown in Fig. 13 is used to express the color of each pixel of the thinned data. This lookup table is a color palette for expressing the original color of each pixel of the thinned data, and internal data for the system is fixed so that the user cannot freely change the original color. That is, the lookup table is a fixed palette.
In this embodiment of the invention, black, green or red is used as the color of the original trace line. (The meaning thereof will be later described with reference to Fig. 30.) Thus, if the trace line is black, the line is expressed by 8-bit data having a palette number 0. The red trace line is expressed by 8-bit data having a palette number 2. The green trace line is expressed by 8-bit data having a palette number 4. A white pixel having no trace line is expressed by 8-bit data having a palette number 7. As is clear from the fixed palette shown in Fig. 13, blue, yellow or the like may be used as the color of the trace line, other than black, green, and red.
On completion of processing of the original data as described above, as the line button of the buttons 906 is operated, a GUI for line processing is displayed on the CRT 5 as shown in Fig. 14. In this example, a picture constituted by a line having a width of one pixel thinned by thinning processing of step S45 is displayed in the picture display window 901. A button 141 is a button operated for checking whether or not an area is closed by the line having the width of one pixel. A button 142 is a button operated for correcting a gap in the case where it is found as a result of the checking that the line having the width of one pixel is not closed and forms a gap.
Specifically, in this embodiment of the invention, in painting processing, when an area closed by the line having the width of one pixel is designated, all the pixels within the area can be automatically colored with a designated color, as will be later described. For example, in the state where patterns of a triangle and a rectangle closed by the line having the width of one pixel are displayed, if the triangle is designated and red is designated as the painting color, the inside of the triangle is colored with red, as shown in Fig. 15. Also, if green is designated as the painting color for the rectangle, the inside of the rectangle is colored with green.
However, in the case where a gap H is generated at a part of the line of the triangle and where the triangle is not closed, as shown in Fig. 16, if the triangle is designated as the area to be colored and red is designated as the painting color, not only the inside of the triangle but also the outside of the triangle is colored with red.
As a result, it is impossible to color only a desired area with a desired color.
Thus, by operating the check button 141 shown in Fig. 14, whether the line having the width of one pixel is closed or not can be checked. That is, in this case, processing shown in the flowchart of Fig. 17 is carried out.
First, at step S131, the user operates the mouse 9 to designate a predetermined area of the line drawing drawn by the line having the width of one pixel displayed in the picture display window 901. At this point, the CPU 1 at step S132 colors all the pixels in the area designated at step S131 with one color, and causes the colored area to be displayed. This processing is carried out by changing the display color to a predetermined color, pixel by pixel, in the horizontal direction from the upper left part of the designated area. Similar processing is sequentially carried out on the adjacent pixels. As a result, if there is an unclosed portion (gap H) on the line as shown in Fig. 16, pixels continuous therefrom to the outside are also colored.
The user can discriminate whether or not only the designated area is displayed in the predetermined color, from the displayed picture. If the outside of the designated area is displayed in the same color as the inside of the area, it is understood that there is a gap somewhere in the area. Thus, at step S133, it is discriminated whether or not there is a gap in the designated area. If it is discriminated that there is a gap, the processing goes to step S134 and processing for filling the gap is carried out.
Specifically, the button 142 of Fig. 14 is operated to cause the discontinuous line to be continuous, thus closing the area with the line having the width of one pixel. If there is no gap, the processing of step S134 is skipped.
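The check of Fig. 17 is essentially a flood fill. A minimal Python sketch follows, assuming the fill starts from a pixel inside the designated area; reaching the image border is used here as a programmatic stand-in for the visual "color leaked outside the area" check described in the text.

    from collections import deque

    def fill_and_check(img, start, marker):
        """img: 2-D list of palette numbers; colorless pixels share one value.
        Fills from `start` and returns True if the fill escaped to the image
        border, i.e. the surrounding trace line has a gap."""
        height, width = len(img), len(img[0])
        background = img[start[0]][start[1]]
        queue, leaked = deque([start]), False
        while queue:
            r, c = queue.popleft()
            if not (0 <= r < height and 0 <= c < width):
                leaked = True             # fill ran off the picture: not closed
                continue
            if img[r][c] != background:
                continue                  # trace-line or already-filled pixel
            img[r][c] = marker
            queue.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
        return leaked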
Next, the processing goes to step S47 of Fig. 5, where painting processing is carried out. At this point, the user presses the paint button of the buttons 906, as shown in Fig. 18. On the GUI, as shown in Fig. 18, a button 191 is displayed which is operated for painting, with a predetermined color, the area surrounded by the line having the width of one pixel. A button 192 is a button operated for painting a predetermined area with a predetermined pattern. A button 194 is a button operated for setting/selecting a pattern for painting. A button 193 is a button operated for painting a corresponding area with the same color as the color of the corresponding area of the previous frame (already colored frame).
Referring to the flowchart of Fig. 19, processing for painting the corresponding area of the processing target frame with the same color as the color of the corresponding area of the previous frame by operating the button 193 of Fig. 18 will now be described. First, at step S151, the user discriminates whether or not all the areas are to be colored with the same colors as in the previous frame. If all the areas are not employed as the processing target, the processing goes to step S152, and the user operates the mouse 9 to designate the area to be painted with the same color as the corresponding area of the previous frame. When the predetermined area is designated, the CPU 1 at step S153 retrieves the corresponding closed area of the previous frame.
For example, it is assumed that a cell picture 201 shown in Fig. 20 is the picture of the previous frame and that a cell picture 202 shown in Fig. 21 is the picture of the next frame. Areas a1 to a4 in the cell picture 201 correspond to areas a11 to a14 in the cell picture 202, respectively. Normally, since it is a dynamic image, the position of the frame picture shown in Fig. 20 and the position of the frame picture shown in Fig. 21 are slightly shifted from each other. However, this shift is not so large because these pictures are of adjacent frames. On the assumption that 30 frames exist in one second, the temporal difference between these frame pictures is 1/30 second.
Thus, on the assumption that the area a11 is designated, the CPU 1 virtually draws the area a11 of the cell picture 202 of Fig. 21 on the screen of the cell picture 201, as shown in Fig. 22. In this case, the range of the area a11 overlaps one or more of the areas a1 to a4 of the cell picture 201. The CPU 1 determines the area having the largest overlapping range with the area a11, from among the areas a1 to a4, as the area corresponding to the area a11. In the example of Fig. 22, the area a1, from among the areas a1 to a4, is the area having the largest overlapping range with the area a11. Thus, the area a1 is determined as the area corresponding to the area a11. At this point, the color of the area a1 is detected.
Next, the processing goes to step S154 of Fig. 19, and the CPU 1 colors the area (area a11 of Fig. 21) in the processing target frame with the same color as the corresponding area (area a1 of Fig. 20) in the previous frame.
If it is discriminated at step S151 that not only the designated area but all the areas are to be automatically colored, the processing goes to step S155 and all the corresponding areas in the previous frame are retrieved. In the example of Fig. 21, not only the area corresponding to the area a11 but also the areas corresponding to the areas a12 to a14 are retrieved on the cell picture of Fig. 20 of the previous frame.
Then, the processing goes to step S156 and processing for coloring all the retrieved areas with the same colors is carried out. Specifically, in the example of Fig. 21, not only is the area a11 colored with the same color as the area a1, but the area a12 is also colored with the same color as the area a2. The area a13 is colored with the same color as the area a3, and the area a14 is colored with the same color as the area a4.
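The retrieval at steps S153 and S155 reduces to picking, for each area of the new frame, the previous-frame area with the largest overlap. A minimal Python sketch, assuming each closed area is represented as a frozenset of (row, column) pixel coordinates; the function names are hypothetical.

    def find_corresponding_area(new_area, previous_areas):
        """Return the previous-frame area with the largest overlap."""
        return max(previous_areas, key=lambda prev: len(prev & new_area))

    def color_like_previous_frame(new_areas, previous_areas, previous_colors):
        """previous_colors maps each previous-frame area to a palette number."""
        colors = {}
        for area in new_areas:
            match = find_corresponding_area(area, previous_areas)
            colors[area] = previous_colors[match]  # reuse the matched color
        return colors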
On the other hand, each area can be colored with a color designated on the color palette, by operating the button 191. In this case, the user designates a predetermined color from the color palette of the color selection window 914, and designates a predetermined area of the line drawing displayed in the picture display window 901.
At this point, the CPU 1 colors the designated area with the designated color.
When the user paints each area by painting processing, color data is appended to white dots of the thinned data. This color data is hereinafter referred to as paint data. As described above, the thinned data is constituted by the black, green, or red trace line data and the other white data. Therefore, the paint data generated by painting processing is constituted by the black, green, or red thinned trace line data and the color data applied to the white area.
Each pixel of this paint data is constituted by 8-bit data having a palette number for expressing the painted color, similarly to the 8-bit data of the thinned data. Fig. 23 shows an exemplary lookup table used in this paint data. As described above (or as shown in Fig. 13), since the palette numbers 0 to 7 are used by the system, the user can use colors of the palette numbers 8 to 207. Various patterns are allocated to the palette numbers 208 to 255.
By carrying out the above-described painting processing, first paint data consisting of eight bits is obtained at step S48 of Fig. 5.
The 8-bit first paint data obtained at step S48 is processed by line removal processing at step S49, and becomes 8-bit second paint data at step S50.
The second paint data is generated from the thinned data generated by thinning processing and the paint data (first paint data) generated by painting processing. This second paint data is data obtained by removing the thinned trace line data from the first paint data. Specifically, as the pixels expressing the trace line of the thinned data are caused to be colorless pixels, the second paint data becomes data expressing only the pixels colored by painting processing. That is, in the second paint data, 8-bit data indicating colored pixels is allocated to the colored pixels, and 8-bit data indicating colorless pixels is allocated to the pixels expressing the trace line thinned by thinning processing and to the pixels which have not yet been colored by painting processing. The color palette referred to by the second paint data is the same color palette as that referred to by the first paint data.
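A minimal Python sketch of this line-removal step, assuming both pictures are 2-D lists of 8-bit palette numbers and assuming the fixed palette of Fig. 13 (0, 2 and 4 for trace-line colors, 7 for a colorless white pixel).

    COLORLESS = 7              # white palette number (assumed, per Fig. 13)
    TRACE_COLORS = (0, 2, 4)   # black, red and green trace-line entries

    def remove_thinned_line(first_paint, thinned):
        """Second paint data: first paint data with thinned trace pixels cleared."""
        return [
            [COLORLESS if thinned[r][c] in TRACE_COLORS else first_paint[r][c]
             for c in range(len(first_paint[0]))]
            for r in range(len(first_paint))
        ]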
Meanwhile, when the first paint data is obtained at step S48, the CPU 1 at step S51 prepares 8-bit thinned alpha key data. The alpha key is data indicating whether each pixel is colored or not. Each pixel of this alpha key data is expressed by eight bits. For example, in the embodiment of the present invention, "255" indicating colored pixels is allocated to pixels colored by painting processing, and "0" indicating colorless pixels is allocated to uncolored pixels. Since the thinned pixels are colored, "255" indicating colored pixels is allocated thereto.
Although this alpha key data is generated from the paint data, it may be generated from the thinned data and the paint data.
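Under the same illustrative conventions (0 as the colorless palette number), the thinned alpha key data of step S51 can be sketched directly from the first paint data, since every painted pixel and every thinned trace line pixel counts as colored:

```python
# Thinned alpha key (step S51): 255 for colored pixels, 0 for colorless ones.
import numpy as np

def make_thinned_alpha(first_paint):
    return np.where(first_paint != 0, 255, 0).astype(np.uint8)
```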
With respect to this thinned alpha key data, a fixed palette as shown in Fig. 24 is allocated. Thus, when this thinned alpha key data is displayed on the CRT 5, the colored pixels colored by painting processing and the thinned pixels are displayed in white, and the uncolored pixels are displayed in blue.
Fig. 25 shows the state where the alpha button of the buttons 906 is operated to display the alpha key data in the picture display window 901. A button 251 displayed at this point is a button operated for cutting a transparent area which should not be colored so that this transparent area will not be displayed. Also, in this example, a view window 252 is displayed at a part of the area displaying the alpha key picture.
In this view window 252, the picture processed by painting processing (i.e., the picture shown in Fig. 18) is displayed at a corresponding position. That is, by shifting the view window 252 to a predetermined position, the user can confirm the alpha key data picture and the paint data picture at the corresponding position without switching the display mode. If a non-painted area which is left to be painted is found, the user can immediately paint the area with a predetermined color without switching the display mode.
The processing in this case will now be described in detail with reference to the flowchart of Fig. 26. Specifically, at step S161, processing for displaying the alpha key data in the picture display window 901 as shown in Fig. 25 is carried out. At this point, the user at step S162 discriminates whether or not the view window 252 needs to be displayed for viewing the painted picture. If the view window 252 need not be displayed, the processing goes to step S163 and it is discriminated whether alpha key processing is to be ended or not. If not, the processing returns to step S161.
If it is discriminated at step S162 that the view window 252 needs to be displayed, the processing goes to step S164 and the user clicks the right button of the mouse 9 to display the view window 252. The view window 252 is displayed at a position where the cursor is located at that time. Next, the processing goes to step S165, and the user discriminates whether or not a color should be extracted so as to carry out processing for coloring a predetermined area, in the state where the alpha key data picture is displayed. If it is discriminated that a color need not be extracted, the processing goes to step S166 and it is discriminated whether or not the display of the view window 252 is to be continued. If the display of the view window 252 is to be continued, the processing returns to step S164.
The view window 252 is displayed throughout the time period during which the right button of the mouse is held down. By shifting the mouse 9 with its right button held down, the view window 252 is shifted and the painted picture at a predetermined position can be confirmed. When it is determined that the view window 252 no longer needs to be displayed, the user releases the right button of the mouse 9. At this point, the processing goes from step S166 to step S167 and the display of the view window 252 is erased. After that, the processing returns to step S161.
If it is determined at step S165 that a color needs to be extracted, the processing goes to step S168 and the user carries out processing for extracting a color. Although not shown, a syringe pattern is displayed substantially at the center of the view window 252. By clicking the left button of the mouse 9, the user extracts (stores) the color of the painted picture displayed near the inlet of the syringe pattern. In this state, the user proceeds to step S169, operates the mouse 9 to shift the view window 252 (syringe) onto an uncolored area, and then releases the left button of the mouse. At this point, the CPU 1 colors that area (the area of the painted picture) with the stored (extracted) color.
Thus, coloring with a desired color can be quickly carried out without switching the display mode.
Referring to Fig. 5 again, at step S52, the CPU 1 carries out thickening processing on the pixel data obtained as a result of filtering processing at step S44.
This thickening processing is processing for thickening the trace line, and is the reverse of the thinning processing of step S45. This processing will be described with reference to the flowchart of Fig. 27.
First, at step S181, it is discriminated whether a noted pixel is a colored pixel or not. As described above, each pixel is identified as a colorless, black, red, or green pixel by filtering processing. From the result of identification, it can be discriminated whether the noted pixel is a colored pixel or not. If the noted pixel is a colored pixel, the processing goes to step S182, where processing for changing three pixels in the horizontal direction and three pixels in the vertical direction of the noted pixel to the same color as the noted pixel is carried out. This processing is carried out whether the noted pixel is a pixel located inside the trace line or a pixel located at the edge of the trace line. The processing of step S182 is carried out preferentially for a red or green pixel rather than a black pixel. That is, if there are a black pixel and a green pixel in contact with the noted pixel, or if there are a black pixel and a red pixel in contact with the noted pixel, the peripheral three pixels are changed not to black but to green or red.
If the noted pixel is a colorless pixel, the processing of step S182 is skipped.
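A sketch of this thickening, assuming pixel codes 0 (colorless), 1 (black), 2 (green), and 3 (red) from filtering processing; the 3 x 3 neighbourhood and the precedence of red and green over black follow the description, while the code values themselves are illustrative:

```python
# Thickening (steps S181/S182): every colored pixel paints its 3 x 3
# neighbourhood with its own color; red and green outrank black.
import numpy as np

PRIO = np.array([0, 1, 2, 2])   # priority per code; green/red outrank black

def thicken(src):
    dst = src.copy()
    h, w = src.shape
    for y in range(h):
        for x in range(w):
            c = src[y, x]
            if c == 0:
                continue          # colorless noted pixel: step S182 skipped
            ys = slice(max(y - 1, 0), min(y + 2, h))
            xs = slice(max(x - 1, 0), min(x + 2, w))
            win = dst[ys, xs]
            win[PRIO[win] < PRIO[c]] = c   # overwrite lower-priority pixels
    return dst
```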
Fig. 28 shows an exemplary GUI in the case where the trace button of the buttons 906 is operated to set a trace line processing mode. This GUI is used for carrying out trace line painting processing at step S53 after thickening processing is carried out at step S52 of Fig. 5. In this GUI, the line drawing of the original picture taken in, after being processed by thickening processing at step S52, is displayed in the picture display window 901. Although thickening processing of step S52 is automatically carried out, the width of the line can be further increased by operating a button 281.
Trace line painting processing of step S53 is processing for changing the color of the thickened trace line. In this processing, only the color of the pixels of the thickened data generated by thickening processing is changed, and the color of the other pixels is not changed. This designation of the color is carried out basically in a manner similar to painting processing of step S47.
Each pixel of the trace line, on which the color is designated, is constituted by 8-bit data similarly to the paint data. By this 8-bit data, the palette number is designated. The color palette designated by the palette number is caused to be the same color palette as the color palette of the corresponding paint data.
In the state where the picture of this trace line painting processing is displayed, a view window 282 can be displayed similarly to the case described with reference to Fig. 25. In the view window 282, the paint data picture at the corresponding position is displayed and the trace line is displayed in the color of the original trace line (i.e., the color identified by filtering processing of step S44). That is, though each trace line processed by thickening processing can be colored with an arbitrary color by this trace line painting processing, the color of the original trace line constitutes a kind of command, as will be described later with reference to Fig. 30. Thus, the command can be confirmed in the view window 282, which displays the original color instead of the applied color.
As trace line painting processing is carried out in the foregoing manner, 8-bit trace line color data can be obtained at step S54.
At step S55 of Fig. 5, density data is extracted from the lightness data, chromaticity data and chroma data generated at step S43. Specifically, the line drawing on the dynamic image has a density, and the density is taken in at this point.
This density data is expressed by eight bits. Therefore, a density of 256 gradations can be expressed. This density data is found regardless of whether the color of the line is black, red, or green.
Fig. 29 shows an exemplary GUI in the case where the final button of the buttons 906 is operated to set a final processing mode. A button 291 displayed at this point is operated for removing a boundary line. Specifically, the first paint data is obtained at step S48 as a result of painting processing at step S47 of Fig. 5. Then, by operating this button 291, line removal processing is carried out at step S49 and the second paint data is obtained at step S50. A button 292 is operated for synthesizing the painted trace line generated by trace line painting processing of step S53 with the painted picture generated by painting processing of step S47. A button 293 is operated for saving the synthesized picture. A button 294 is operated for synthesizing the picture with a background picture.
At step S57 of Fig. 5, synthesis processing is carried out by using the GUI of Fig. 29. In this synthesis processing, the second paint data obtained at step S50, the trace line color data obtained at step S54 and the density data obtained at step S56 are used. This synthesis processing will now be described in detail with reference to the flowchart of Fig. 30.
First, at step S201, it is discriminated whether or not a pixel of the trace line exists as a pixel for synthesis. If a pixel of the trace line exists, it is discriminated at steps S202 to S204 whether the color of the original trace line is black, red, or green.
If it is discriminated at step S202 that the color of the original trace line is black, the processing goes to step S205, where both ends (or both edges) of the trace line are gradated with the color of the area in contact with both ends.
For example, it is assumed that an area having the color of yellow and an area having the color of blue as the second paint data are in contact with one side and the other side of the trace line, respectively, as shown in Fig. 31A. The data of the thinned trace line of Fig. 31A is removed. Also, it is assumed that the thickened trace line is painted in brown, as shown in Fig. 31B. By the thickening of Fig. 31B, the trace line is thickened by three pixels. In addition, it is assumed that the density data of the trace line has the maximum value of 255 at the center of the trace line and has such a distribution that its value is reduced to 0 away from the center, as shown in Fig. 31C.
In this state, if the color of the original trace line is black, synthesis processing as shown in Fig. 31D is carried out. Specifically, the color of brown of the trace line and the color of yellow of the one area are gradated in a section P1 in accordance with the density distribution. Similarly, the color of brown of the trace line and the color of blue of the other area are gradated in a section P2 in accordance with the density of the trace line.
If the data obtained as a result of synthesis (gradation) is represented by M, the second paint data by C1, the trace line color data by C2, and the density data by L, the synthesized data M is expressed by the following equation:

M = {(C1 x L) + (C2 x (255 - L))}/255

Each of these data is 8-bit data.
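The gradation formula can be transcribed directly. The sketch below assumes that the palette indices have already been resolved to plain 8-bit value arrays for C1, C2, and L (for a color picture the formula would be applied per RGB channel):

```python
# M = {(C1 x L) + (C2 x (255 - L))}/255, computed with enough headroom
# to avoid 8-bit overflow in the intermediate products.
import numpy as np

def gradate(c1, c2, l):
    c1 = c1.astype(np.uint32)
    c2 = c2.astype(np.uint32)
    l = l.astype(np.uint32)
    return ((c1 * l + c2 * (255 - l)) // 255).astype(np.uint8)
```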
On the other hand, if it is discriminated at step S203 that the color of the original trace line is red, the processing goes to step S206, where the colors of two areas in contact with each other via the trace line are gradated in the section on one end of the trace line.
That is, the yellow area and the blue area in contact with each other via the trace line are gradated in the section P2 on the one end of the trace line (in this case, the right end), as shown in Fig. 31E. At this point, the trace line is omitted.
Moreover, if it is discriminated at step S204 that the color of the original trace line is green, the processing goes to step S207, where the colors of two areas in contact with each other via the trace line are gradated in a section P3 of the thickened trace line.
That is, the yellow area and the blue area are gradated in the section P3, as shown in Fig. 31F. In this case, too, the trace line is omitted.
If it is discriminated at steps S202 to S204 that the color of the original trace line is none of black, red, and green, other synthesis processing is carried out at step S208.
Meanwhile, if it is discriminated at step S201 that a pixel of the trace line does not exist as a noted pixel for synthesis, the processing goes to step S209. Since there is no particular picture to be synthesized, only the painting-processed picture is used.
Thus, desired synthesis can be carried out by drawing the original trace line in black, red, or green. Therefore, in drawing a dynamic image, a desired synthesized image can be obtained by suitably using black, red, or green.
By the foregoing processing, 8-bit synthesized image data is obtained at step S58. This 8-bit synthesized image data is converted to 24-bit final image data of full colors at step S59. That is, each pixel is converted to the 24-bit color registered on the color palette.
At step S60, the thinned alpha key data generated at step S51 and the density data generated at step S56 are synthesized to generate final alpha key data Fa.
If the density data is represented by L and the thinned alpha key data is represented by a, the final alpha key data Fa is expressed by the following equation:

Fa = Max(L, a)

In this equation, Max(A, B) means that the greater of A and B is selected.
That is, the greater of the density data L and the thinned alpha key data a is selected as the final alpha key data.
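As a one-line sketch of this selection:

```python
# Final alpha key (step S60): per-pixel maximum of the density data L and
# the thinned alpha key data a, i.e. Fa = Max(L, a).
import numpy as np

def final_alpha(density, thinned_alpha):
    return np.maximum(density, thinned_alpha)
```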
This final alpha key data is used as a key signal in synthesizing images of other layers. Processing in this case will be later described in detail with reference to Fig. 35.
As described above, as synthesis processing of step S60 is carried out, the final alpha key data is obtained at step S61.
Moreover, at step S62, Targa (RTM) data is generated from the 24-bit final image data obtained at step S59 and the 8-bit final alpha key data obtained at step S61. This data has an extension of ".TGA" appended thereto and is saved as a Targa (RTM) file.
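A minimal sketch of writing such a file, based on the public TGA layout (an 18-byte header for an uncompressed 32-bit true-color image with 8 alpha bits, followed by BGRA pixels); this illustrates the general file format only and is not the patent's own writer:

```python
import struct
import numpy as np

def save_tga(path, rgb, alpha):
    # rgb: (h, w, 3) uint8 final image data; alpha: (h, w) uint8 final alpha key.
    h, w, _ = rgb.shape
    header = struct.pack(
        "<BBBHHBHHHHBB",
        0, 0, 2,        # no image ID, no color map, type 2 = uncompressed RGB
        0, 0, 0,        # color-map specification (unused)
        0, 0, w, h,     # origin and image size
        32, 0x28,       # 32 bits/pixel; 8 alpha bits, top-left pixel origin
    )
    bgra = np.dstack([rgb[..., 2], rgb[..., 1], rgb[..., 0], alpha])
    with open(path, "wb") as f:
        f.write(header + bgra.astype(np.uint8).tobytes())
```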
Fig. 32 shows an exemplary GUI in the case where time sheet processing described with reference to Fig. 3 is carried out. In this example, a layer is inputted and displayed in a display section 321. A button 322 is operated for registering a cell picture to the entry list. In a special effect setting window 323, special effects for shifting or expanding a predetermined picture in the X-direction or the Y-direction, or rotating it, can be set, together with the quantity of shift, expansion, or rotation. In addition, transmission, focusing, photoeffect, and lens effect can also be set as special effects. In a display section 324, a layer on which special effects are to be set is designated and displayed.
In a time sheet display window 325, a time sheet is displayed. In this time sheet, the frame numbers are displayed in the longitudinal direction and the layer numbers are displayed in the lateral direction. Also, in this time sheet, a special effect setting display section is provided so that when a special effect is set, a mark 326 is displayed in the corresponding frame.
A button 327 is operated for carrying out preview operation. A button 328 is operated for preparing an AVI file (animation file). A button 329 is operated for carrying out review operation.
Time sheet processing will now be described with reference to the flowcharts of Figs. 33 and 34. First, at step S220, the time sheet is displayed in the time sheet display window 325. Next, at step S221, the user designates and displays a layer to be registered on the entry list, in the display section 321. At step S222, the user operates the button 322 to display a menu necessary for registration of cell pictures on the entry list. At step S223, cell pictures of designated numbers from among the cell pictures taken in by the scanner 22 are sequentially registered on the entry list. At step S224, the registered cell pictures are sequentially displayed with their respective cell numbers appended thereto in the layer picture display window 902. At step S225, it is discriminated whether registration of all the necessary layers has been completed or not. If it is discriminated that there is a layer which has not been registered, the processing returns to step S221. Then, the layer is changed and similar processing is repeated.
If it is discriminated at step S225 that registration of all the layers has been completed, the processing goes to step S226, and the user inputs the cell numbers registered on the entry list into predetermined frames of the time sheet. In the example of Fig. 32, a cell picture of number 1 as the background is inputted in the frame 1. A cell picture of cell number 1 is inputted in the layer A, and the cell picture of number 1 is inputted in the layer B. When the cell number is inputted to a necessary number of layers in one frame, the frame picture is displayed in the picture display window 901 at step S227. From this picture, the user can confirm the frame picture obtained by synthesizing pictures of plural layers.
Next, the processing goes to step S228, where it is discriminated whether the frame picture is a desired picture or not. If it is discriminated that the frame picture needs to be changed, the processing returns to step S226 and the cell number on the time sheet is changed.
If it is discriminated at step S228 that a desired frame picture is obtained, the processing goes to step S229 and it is discriminated whether special effects need to be added or not. If special effects need to be added, the processing goes to step S230. Then, a desired cell number is inputted and displayed in the display section 324 of the special effect setting window 323, and values for special effects such as shift, expansion, and transmission are set. As this setting is carried out, the mark 326 indicating that special effects have been set is displayed on the time sheet at step S231.
If it is discriminated at step S229 that special effects need not be added, processing of steps S230 and S231 is skipped. Then, at step S232, it is discriminated whether input of all the frames to the time sheet has been completed or not. If it is discriminated that input of all the frames to the time sheet has not been completed, the processing returns to step S226 and the subsequent processing is repeated.
If it is discriminated at step S232 that input of all the frames to the time sheet has been completed, the processing goes to step S233 and the preview button 327 is pressed. At this point, at step S234, a dynamic image in conformity to the setting on the time sheet is tentatively displayed in the picture display window 901. Watching this preview image, the user at step S235 discriminates whether any change is needed or not. If it is discriminated that a change is needed, the processing returns to step S226 and the subsequent processing is repeated.
If it is discriminated at step S235 that no change is needed, the processing goes to step S236 and the user presses the render button 328. At this point, at step S237, the pictures of the plural frames prepared on the time sheet are prepared as an animation file and registered on the hard disk 13. When the review button 329 is operated, the data of the animation file thus produced is outputted to and displayed on an external monitor through the input/output interface 14.
Synthesis processing of painted pictures of plural layers will now be described.
In the following example, pictures of two layers are synthesized for simplification. As shown in the flowchart of Fig. 35, at step S251, Targa data of a first layer (layer A) is obtained. This Targa data is constituted by final alpha key data FaA and final image data FIA, as described above.
Next, at step S252, Targa data of a second layer (layer B) is obtained. This Targa data, too, is constituted by final alpha key data FaB and final image data FIB.
At step S253, synthesis processing is carried out. Specifically, on the assumption that the video signal outputted for the layer A by using the final alpha key data FaA is represented by VAOUT, VAOUT is expressed as follows by an equation synthesizing the final image data FIB with the final image data FIA:

VAOUT = FaA x FIA + (1 - FaA) x FaB x FIB ... (1)
Similarly, on the assumption that the video signal outputted for the layer B by using the final alpha key data FaB is represented by VBOUT, VBOUT is expressed as follows by an equation synthesizing the final image data FIA with the final image data FIB:

VBOUT = FaB x FIB + (1 - FaB) x FaA x FIA ... (2)
Next, it is assumed that priority data indicating the degree of priority put on the final image data FIB with respect to the final image data FIA is represented by P (0 ≤ P ≤ 1). When the layer B is located before the layer A, P = 1 holds.
On the assumption that the video signal outputted as the image of the layer A in consideration of the priority data P is represented by V'AOUT, the following equation can be obtained from equation (1):

V'AOUT = VAOUT x (1 - P) = {FaA x FIA + (1 - FaA) x FaB x FIB} x (1 - P) ... (3)
Similarly, on the assumption that the video signal outputted as the image of the layer B in consideration of the priority data P is represented by V'BOUT, the following equation can be obtained from equation (2):

V'BOUT = VBOUT x P = {FaB x FIB + (1 - FaB) x FaA x FIA} x P ... (4)
On the assumption that a synthesized video signal obtained by synthesizing the layer A and the layer B is represented by VOUT and that its synthesis key signal is represented by aOUT, the following equation holds:

VOUT x aOUT = V'AOUT + V'BOUT

Thus, the following equation is obtained:

VOUT = (V'AOUT + V'BOUT)/aOUT ... (5)
To find aOUT, note that in the area other than the areas where the final image data FIA and the final image data FIB are displayed, that is, in the area where neither the video signal VA nor the video signal VB is displayed, the alpha key data is defined as the product of (1 - FaA) and (1 - FaB). Therefore, the alpha key data of the area where the final image data FIA or the final image data FIB is displayed is defined by the following formula:

1 - (1 - FaA) x (1 - FaB)

Thus, the following equation is obtained:

aOUT = 1 - (1 - FaA) x (1 - FaB) ... (6)
From the above-described equations (3) to (6), the following equation can be obtained.
VOUT = [{FaA x FIA + (1 - FaA) x FaB x FIB} x (1 - P) + {FaB x FIB + (1 - FaB) x FaA x FIA} x P] / {1 - (1 - FaA) x (1 - FaB)}
The priority data P is normally P = 1. That is, the layer B is synthesized onto the layer A. By setting the value of P at a predetermined value from 0 to 1, the layer A can be seen perspectively through the layer B. The value of P can be set as a parameter in the "transmission" section of the special effect setting window 323 of Fig. 32.
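Putting equations (3) to (6) together, the two-layer synthesis of Fig. 35 can be sketched as follows, with the alpha keys normalized to the range 0 to 1 and illustrative array names:

```python
# Two-layer synthesis: fia/fib are the final image data of layers A and B,
# faa/fab their final alpha keys (0..1), and p the priority parameter.
import numpy as np

def composite(fia, faa, fib, fab, p=1.0):
    va = (faa * fia + (1 - faa) * fab * fib) * (1 - p)   # eq. (3)
    vb = (fab * fib + (1 - fab) * faa * fia) * p         # eq. (4)
    a_out = 1 - (1 - faa) * (1 - fab)                    # eq. (6)
    with np.errstate(divide="ignore", invalid="ignore"):
        v_out = np.where(a_out > 0, (va + vb) / a_out, 0.0)  # eq. (5)
    return v_out, a_out
```

With p = 1 this reduces to synthesizing the layer B onto the layer A; intermediate values of p give the perspective (transmission) effect described above.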
As transmission media for transmitting a program for carrying out the foregoing processing to the user, communication media such as a network and a satellite as well as recording media such as a magnetic disk, a CD-ROM and a solid state memory can be used.
As described above, according to the image data processing device and data processing method of the embodiment of the present invention, a picture constituted by a line is generated from a picture taken in, and a picture obtained by coloring pixels of an area surrounded by the line and the picture of the line are synthesized. Thus, a color picture can be quickly generated.
Also, according to the image data processing device and the image data processing method of the embodiment of the present invention, the cell number and the layer number of a cell picture to be taken in are designated, and the cell number and the layer number of a cell picture which is already taken in are displayed. Thus, cell pictures drawn on the draft by the writer with a pencil or a pen can be taken in by the scanner or the like and converted to image data quickly and securely.
Also, according to the image data processing device and the image data processing method of the embodiment of the present invention, a time sheet prescribed by the frame numbers and the layer numbers is displayed, and the cell number of the cell picture taken in is inputted at a predetermined position on the time sheet. Thus, each frame can be constituted or changed with an arbitrary cell number easily and securely.
Also, according to the image data processing device and the image data processing method of the embodiment of the present invention, the density of a trace line is detected, and the color in the vicinity of the trace line of an area surrounded by the trace line is determined in accordance with the density. Thus, the intention of the writer can be accurately expressed.
Also, according to the image data processing device and the image data processing method of the embodiment of the present invention, the color of a boundary portion between the trace line and the area surrounded by the trace line or the color of a boundary portion between an area on one side and an area on the other side of the trace line in the case where the trace line is omitted, is determined in accordance with the result of identification of the original color of the trace line. Thus, by setting the original color of the trace line to a predetermined color, the color of the boundary portion of the picture taken in can be determined to a desired color.
Also, according to the image data processing device and the image data processing method of the embodiment of the present invention, a trace line taken in is expressed in a binary form and then converted to a line having a width of one pixel. Thus, generation of a non-painted portion left to be painted near the trace line can be prevented, and a color animation picture can be produced quickly and securely.
Also, according to the image data processing device and the image data processing method of the embodiment of the present invention, it is confirmed whether a trace line is closed or not. Thus, a plurality of areas can be prevented from being colored with the same color, and each picture area can be colored with a desired color quickly and securely.
Also, according to the image data processing device and the image data processing method of the embodiment of the present invention, a colored picture is displayed on an identification picture for identifying a colored area and an uncolored area of the colored picture. Thus, the uncolored picture can be quickly and securely confirmed so as to improve operability, and generation of a non-painted portion left to be painted can be prevented.
Also, according to the image data processing device and the image data processing method of the embodiment of the present invention, the corresponding relation between a first picture of a first frame and a second picture of a second frame is discriminated, and the second picture is colored with the color of the first picture in accordance with the result of discrimination. Thus, coloring processing can be carried out quickly and securely.
In addition, according to the image data processing device and the image data processing method of the embodiment of the present invention, colored pictures of a plurality of layers are synthesized in accordance with a parameter and key data. Thus, pictures full of variety can be generated easily and securely.

Claims (46)

1. An image data processing device for generating a frame picture by synthesizing an arbitrary number of captured cell pictures for an arbitrary number of layers, the device comprising: colored picture generation means for coloring, with a predetermined color, uncolored pixels of an area surrounded by a trace line of the captured cell picture, thus generating a colored picture; key data generation means for generating key data for identifying a colored area and an uncolored area of the colored picture in accordance with coloring processing by the colored picture generation means; parameter setting means for setting a parameter prescribing the priority in synthesizing the colored pictures of a plurality of layers; and synthesis means for synthesizing the colored pictures of a plurality of layers in accordance with the parameter set by the parameter setting means and the key data generated by the key data generation means.
2. An image data processing device as claimed in claim 1, wherein the parameter is set as a special effect.
3. An image data processing device as claimed in claim 1, the device comprising: cell number designation means for designating a cell number of a cell picture to be captured; layer number designation means for designating a layer number of the cell picture to be captured; and display control means for displaying the cell number and the layer number of a cell picture which is already captured.
4. An image data processing device as claimed in claim 3, wherein the display control means displays the cell number and the layer number of the cell picture which is already captured at a position on a matrix.
5. An image data processing device as claimed in claim 3, comprising cell picture display control means for displaying the captured cell picture.
6. An image data processing device as claimed in claim 1, for generating a dynamic image consisting of a plurality of frame pictures, the device comprising: time sheet display control means for displaying a time sheet prescribed by frame numbers corresponding to the time series of frames of a dynamic image and by layer numbers; and cell number input means for inputting the cell number of a captured cell picture at a predetermined position on the time sheet.
7. An image data processing device as claimed in claim 6, comprising registration means for registering, for each layer, the captured cell picture onto an entry list, the cell number input means inputting the cell number registered on the entry list.
8. An image data processing device as claimed in claim 7, comprising registered picture display control means for displaying the cell picture registered on the entry list, for each layer, in the order of cell number.
9. An image data processing device as claimed in claim 6, comprising special effect setting means for setting a special effect for each layer, cell or frame.
10. An image data processing device as claimed in claim 9, wherein the time sheet display control means displays on the time sheet that a special effect is set, when a special effect is set by the special effect setting means.
11. An image data processing device as claimed in claim 9, wherein the special effect includes at least rotation, expansion, shift and transmission.
12. An image data processing device as claimed in claim 6, comprising synthesis means for synthesizing, for each frame, the cell picture of each layer number inputted on the time sheet.
13. An image data processing device as claimed in claim 12, comprising dynamic image display control means for tentatively displaying the picture synthesized by the synthesis means, as a dynamic image.
14. An image data processing device as claimed in claim 1, comprising: detection means for detecting the density of a trace line; and determination means for determining the color in the vicinity of the trace line in accordance with the density of the trace line.
15. An image data processing device as claimed in claim 14, comprising line coloring means for coloring the trace line, the determination means gradating the color of a boundary portion between the trace line and the area surrounded by the trace line by using the color of the trace line and the color of the area surrounded by the trace line in accordance with the density of the trace line.
16. An image data processing device as claimed in claim 14, wherein the determination means gradates the color of a boundary portion between two areas surrounded by the trace line on one side and the other side of the trace line by using the colors of the two areas surrounded by the trace line in accordance with the density of the trace line.
17. An image data processing device as claimed in claim 14, comprising identification means for identifying the original color of the trace line, the determination means determining the color in accordance with the result of identification of the identification means.
18. An image data processing device as claimed in claim 17, wherein, if the color identified by the identification means is a first color, the determination means gradates the color of a boundary portion between the trace line and the area surrounded by the trace line by using the color of the trace line and the color of the area surrounded by the trace line in accordance with the density of the trace line, and wherein, if the color identified by the identification means is a second color, the determination means gradates the color of a boundary portion between two areas surrounded by the trace line on one side and the other side of the trace line by using the colors of the two areas surrounded by the trace line in accordance with the density of the trace line.
19. An image data processing device as claimed in claim 14, comprising thickening means for thickening the trace line by a predetermined number of pixels.
20. An image data processing device as claimed in claim 1, comprising: identification means for identifying the original color of a trace line; and determination means for determining the color of a boundary portion between the trace line and the area surrounded by the trace line, or the color of a boundary portion between an area on one side and an area on the other side of the trace line in the case where the trace line is omitted, in accordance with the result of identification by the identification means.
21. An image data processing device as claimed in claim 20, wherein, if the color identified by the identification means is a first color, the determination means determines the color of a boundary portion between the trace line and the area surrounded by the trace line, and wherein, if the color identified by the identification means is a second color, the determination means determines the color of a boundary portion between an area on one side and an area on the other side of the trace line in the case where the trace line is omitted.
22. An image data processing device as claimed in claim 20, wherein, if the color identified by the identification means is a first color, the determination means determines the color of a boundary portion between an area on one side and an area on the other side of the trace line in the case where the trace line is omitted, over a first width, and wherein, if the color identified by the identification means is a second color, the determination means determines the color of a boundary portion between an area on one side and an area on the other side of the trace line in the case where the trace line is omitted, over a second width.
23. An image data processing device as claimed in claim 1, comprising: binary expression means for expressing respective pixels of a trace line in a binary form of colored pixels and colorless pixels; and conversion means for converting the trace line to a line consisting of colored pixels having a width of one pixel.
24. An image data processing device as claimed in claim 23, comprising coloring means for coloring a colorless pixel of an area surrounded by the line of the width of one pixel with a predetermined color.
25. An image data processing device as claimed in claim 23, wherein the binary expression means identifies the colors of the pixels in the order of white, black, red and green, and expresses the respective pixels in the binary form, with the white pixels as colorless pixels and the pixels of colors other than white as colored pixels.
26. An image data processing device as claimed in claim 23, wherein the conversion means repeats processing for converting the colored pixel in contact with the colorless pixel into a colorless pixel, until the width of the trace line becomes the width of one pixel.
27. An image data processing device as claimed in claim 23, wherein data of each pixel includes a flag indicating its color.
28. An image data processing device as claimed in claim 1, comprising: binary expression means for expressing respective pixels of a trace line in a binary form of colored pixels and colorless pixels; and confirmation means for confirming whether or not a trace line consisting of colored pixels expressed in the binary form by the binary expression means is closed.
29. An image data processing device as claimed in claim 28, wherein the confirmation means causes a predetermined colorless pixel in the area surrounded by the trace line consisting of colored pixels, and a colorless pixel in contact with the predetermined colorless pixel, to be sequentially colored and displayed.
30. An image data processing device as claimed in claim 28, comprising conversion means for converting the trace line consisting of colored pixels to a trace line having a width of one pixel, the confirmation means confirming that the trace line having the width of one pixel is closed.
31. An image data processing device as claimed in claim 28, comprising correction means for correcting an open part so as to form a closed area, when the confirmation means has confirmed that a part of the trace line is open.
32. An image data processing device as claimed in claim 1, comprising: colored picture generation means for generating a colored picture obtained by coloring a predetermined area of a line drawing; identification picture generation means for generating an identification picture for identifying a colored area and an uncolored area of the colored picture; and display control means for displaying the colored picture on the identification picture.
33. An image data processing device as claimed in claim 32, wherein the display control means displays a window at an arbitrary position on the identification picture, and displays the colored picture at a corresponding position in the window.
34. An image data processing device as claimed in claim 32, comprising: extraction means for extracting the color of a predetermined area of the colored picture displayed on the identification picture; and coloring means for coloring the uncolored area of the colored picture with the color extracted by the extraction means.
35. An image data processing device as claimed in claim 32, wherein the identification picture is an alpha key data picture.
36. An image data processing device as claimed in claim 32, wherein the colored picture is a picture obtained by coloring an area closed by the line drawing, or a picture obtained by coloring the line of the line drawing.
37. An image data processing device as claimed in claim 32, wherein, if the colored picture is a picture obtained by coloring the line of the line drawing, the display control means displays the colored picture in the color before coloring the line.
38. An image data processing device as claimed in claim 37, wherein the color before coloring the line forms a command.
39. An image data processing device as claimed in claim 1, comprising: discrimination means for discriminating the corresponding relation between a first picture of a first frame and a second picture of a second frame; detection means for detecting the color of the first picture; and coloring means for coloring the second picture with the color detected by the detection means in accordance with the result of discrimination by the discrimination means.
40. An image data processing device as claimed in claim 39, wherein the discrimination means assumes temporally adjacent frames as the first frame and the second frame.
41. An image data processing device as claimed in claim 39, wherein the discrimination means discriminates areas closed by lines constituting the line drawing as the first picture and the second picture.
42. An image data processing device as claimed in claim 41, wherein the discrimination means discriminates an area including the largest range of a target area of the second frame, from among a plurality of areas of the first frame, as an area corresponding to the target area.
43. An image data processing device as claimed in claim 41, wherein the coloring means colors only a designated area.
44. An image data processing device as claimed in claim 41, wherein the coloring means colors all of plural corresponding areas.
45. An image data processing device as claimed in claim 41, comprising selection means for selecting coloring of only a designated area or coloring of all plural corresponding areas.
46. An image data processing method for generating a frame picture by synthesizing an arbitrary number of captured cell pictures from an arbitrary number of layers, the method comprising: a colored picture generation step of coloring, with a predetermined color, uncolored pixels of an area surrounded by a trace line of the captured cell picture, thus generating a colored picture; a key data generation step of generating key data for identifying a colored area and an uncolored area of the colored picture in accordance with coloring processing at the colored picture generation step; a parameter setting step of setting a parameter prescribing the priority in synthesizing the colored pictures of a plurality of layers; and a synthesis step of synthesizing the colored pictures of a plurality of layers in accordance with the parameter set at the parameter setting step and the key data generated at the key data generation step.
GB0129737A 1997-08-04 1998-08-04 Image data processing devices and methods Expired - Fee Related GB2369023B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP22316397 1997-08-04
GB9907364A GB2333678B (en) 1997-08-04 1998-08-04 Image data processing devices and methods

Publications (3)

Publication Number Publication Date
GB0129737D0 GB0129737D0 (en) 2002-01-30
GB2369023A true GB2369023A (en) 2002-05-15
GB2369023B GB2369023B (en) 2002-06-19

Family

ID=26315364

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0129737A Expired - Fee Related GB2369023B (en) 1997-08-04 1998-08-04 Image data processing devices and methods

Country Status (1)

Country Link
GB (1) GB2369023B (en)

Also Published As

Publication number Publication date
GB2369023B (en) 2002-06-19
GB0129737D0 (en) 2002-01-30

Similar Documents

Publication Publication Date Title
US6522329B1 (en) Image processing device and method for producing animated image data
US6404901B1 (en) Image information processing apparatus and its method
AU2002305387B2 (en) Image sequence enhancement system and method
JP4120724B2 (en) Image data management apparatus, image data management method, and medium on which image data management program is recorded
US8396328B2 (en) Minimal artifact image sequence depth enhancement system and method
US8073247B1 (en) Minimal artifact image sequence depth enhancement system and method
CN101213576A (en) Album creating apparatus, album creating method and program therefor
US7747074B2 (en) Selection of decorative picture suitable for input picture
JPH06503695A (en) A compositing interface for arranging the components of special effects jobs for film production.
US4956704A (en) Image processing apparatus and methods for making a creative image from an original image
WO2006006666A1 (en) Image processing method and image processor by tracking digital image contour
JPH11110577A (en) Device and method for processing image data, and transmission medium thereof
GB2369023A (en) Image data processing devices and methods for generating animations
CN107992256A (en) Window control method, apparatus and system
JP2000030039A (en) Picture processor and picture processing method
JPH08235344A (en) Scenario editor
JP2005159971A (en) Apparatus and method for processing image
JP2844573B2 (en) Image processing method
JPH0816758A (en) Line drawing processing method for animation image
JPH0830780A (en) Animation line drawing processing method
JPH0690496B2 (en) Solid mesh film making device
JPS63131793A (en) Image synthesizing device
JPH1145325A (en) Method and device for processing image and computer readable recording medium recording image processing program
JPH0690497B2 (en) Solid mesh film making device
JP2001519117A (en) Computer system and computer implemented process for editing video fields

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20040804