US7308155B2 - Image processing apparatus, image processing method, image processing program, and storage medium - Google Patents


Info

Publication number
US7308155B2
US7308155B2 (application US10/272,946 / US27294602A)
Authority
US
United States
Prior art keywords
images
image
information
unit
attribute information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US10/272,946
Other languages
English (en)
Other versions
US20030098983A1 (en)
Inventor
Yoshihiro Terada
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Business Innovation Corp
Original Assignee
Fuji Xerox Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Xerox Co Ltd filed Critical Fuji Xerox Co Ltd
Assigned to FUJI XEROX CO., LTD. reassignment FUJI XEROX CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: TERADA, YOSHIHIRO
Publication of US20030098983A1 publication Critical patent/US20030098983A1/en
Application granted granted Critical
Publication of US7308155B2 publication Critical patent/US7308155B2/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/60 - Editing figures and text; Combining figures or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T9/00 - Image coding
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 - Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/387 - Composing, repositioning or otherwise geometrically modifying originals
    • H04N1/3872 - Repositioning or masking
    • H04N1/3873 - Repositioning or masking defined only by a limited number of coordinate points or parameters, e.g. corners, centre; for trimming
    • H04N1/3875 - Repositioning or masking defined only by a limited number of coordinate points or parameters, e.g. corners, centre; for trimming combined with enlarging or reducing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 - Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/40 - Picture signal circuits

Definitions

  • The present invention relates to an image processing operation for placing a plurality of images side by side to form a single synthesized image. More specifically, the present invention is directed to an image processing apparatus and an image processing method which are suitably applied to a system in which the apparatus into which an image is inputted is physically different from the apparatus which outputs the synthesized image. The present invention is also directed to an image processing program capable of causing a computer to execute such an image processing method, and to a storage medium which stores this image processing program.
  • A copy of an original made by connecting a scanner to a printer via a digital signal transmission path such as a network will be referred to as a "network copy," whereas a conventional copy of an original will be referred to as a "direct copy."
  • In a direct copy, a scanner is directly connected to a printer, or is directly connected to a controller such as a computer.
  • FIG. 27 is an explanatory diagram for explaining an example of such a function that a plurality of original images are placed side by side with each other on a single sheet of paper, and then are recorded.
  • As shown in FIG. 27(1), two sheets of original images may be placed side by side with each other as represented in FIG. 27(2), and then recorded.
  • Such an editing process operation will be referred to as a “2-up” process operation in the below-mentioned description.
  • As shown in FIG. 27(3), four sheets of original images may be placed side by side with each other as represented in FIG. 27(4), and then recorded.
  • Such an editing process operation will be referred to as a “4-up” process operation in the below-mentioned description.
  • Such editing process operations, which place a plurality of original images side by side on a single sheet of paper and record them, will be commonly referred to as an "N-up" process operation in the following.
  • Such an "N-up" function saves paper and makes information easier to survey as a whole. As a result, this "N-up" function constitutes one of the very important editing functions when documents are formed and/or distributed.
  • FIG. 28 is an explanatory diagram for explaining one structural example capable of realizing the “N-up” function when the direct copy operation is carried out.
  • reference numeral 41 shows a scanner unit
  • reference numeral 42 represents a page memory
  • reference numeral 43 indicates an image processing unit
  • reference numeral 44 denotes a printer unit
  • reference numeral 45 shows a control unit.
  • FIG. 28 indicates the structural example capable of realizing the “2-up” function.
  • the scanner unit 41 reads an image formed on an original.
  • the image data read by the scanner unit 41 is stored in the page memory 42 .
  • the image processing unit 43 reads out the image data stored in the page memory 42 , and then, performs various sorts of image processing operations so as to reproduce/duplicate images.
  • the printer unit 44 records the images which have been processed by the image processing unit 43 to output the recorded images.
  • the control unit 45 controls operations of these respective units.
  • When both an image of a "character type" and an image of a "photograph type" are read by the scanner unit 41 and these two images are placed side by side to be synthesized with each other, both a "character type" portion and a "photograph type" portion are present within the single synthesized image.
  • the control unit 45 instructs the image processing unit 43 to execute such an image processing operation which is optimized to character images with respect to the “character type” portion, and also instructs the image processing unit 43 to execute such an image processing operation which is optimized to photograph images with respect to the “photograph type” portion.
  • the synthesized image data to which the image processing operations have been executed by the image processing unit 43 is transferred to the printer unit 44 , so that a synthesized image is recorded on paper.
  • this synthesized image may be recorded in accordance with the image forming methods selected for the respective type portions.
  • In the case that the conventional network copy is produced, for example, when a network scanner is utilized a scanner unit is controlled, whereas when a network printer is used a printer unit is controlled.
  • This image processing unit may be arranged integrally with a network scanner and a network printer. Alternatively, separate image processing units may be arranged for the network scanner and the network printer, and these image processing units may be controlled independently.
  • The present invention has been made to solve the above-described problems, and therefore has an object to provide an image processing apparatus and an image processing method capable of outputting images with high image quality even when the so-called "N-up" function is utilized in a network copy, and also to provide a program used to execute such an image processing operation and a storage medium which stores this program.
  • A plurality of entered images are respectively reduced, the plurality of reduced images are placed side by side and synthesized with each other so as to produce a single synthesized image, attribute information corresponding to the synthesized image is produced, and both the synthesized image and the attribute information are transmitted to an image processing apparatus provided on the reception side.
  • The image processing apparatus provided on the reception side receives information which contains a synthesized image, in which a plurality of images are placed side by side to form a single synthesized image, and attribute information corresponding to the synthesized image; extracts both the synthesized image and the attribute information from the received information; produces, based upon the extracted attribute information, attribute information for each area, even for an area different from the area to which the received attribute information corresponds; and corrects the synthesized image based upon the produced attribute information.
  • the image processing apparatus provided on the reception side can execute the correcting process operation in response to the respective areas, and thus, can obtain the “N-up” image output having the high image quality.
  • The attribute information transmitted together with the synthesized image may be transmitted for each area with the reduced image before synthesis as the minimum unit, may be transmitted as single attribute information for the whole synthesized image, or may be transmitted in units of pixels of the synthesized image.
  • An image processing apparatus provided on the transmission side reduces a plurality of entered images respectively; places the reduced images side by side and synthesizes them for each image attribute, producing a plurality of synthesized images; and transmits the plurality of synthesized images. Since an image processing apparatus provided on the reception side can execute the process operation in response to the attribute of each synthesized image, the image which has been synthesized by the "N-up" function can be obtained with a high image quality. It should be noted that when a plurality of synthesized images are transmitted, these synthesized images are compressed by compression systems corresponding to the respective synthesized images and then transmitted. As a result, the total amount of data to be transferred can be reduced while lowering of the image quality of the respective synthesized images is suppressed.
  • An image processing apparatus provided on the transmission side converts a plurality of entered images, attribute information of the respective images, and synthesize-instructing information for placing these plural images side by side to synthesize them into a single image, into information having a predetermined format, and then transmits the converted information.
  • An image processing apparatus provided on the reception side receives the information having the predetermined format, which contains the plural images, the attribute information of the respective images, and the synthesize-instructing information for placing these plural images side by side to synthesize them into a single image.
  • This image processing apparatus extracts the plurality of images and the attribute information of the respective images from the received information having the predetermined format; reduces the plural images in accordance with the synthesize-instructing information; places the plurality of reduced images side by side and synthesizes them so as to produce a single synthesized image in accordance with the synthesize-instructing information; and then outputs the single synthesized image.
  • Based upon the attribute information, the image processing apparatus may correct the respective images before being reduced, the respective images after being reduced, or the synthesized image.
  • An image processing apparatus provided on the transmission side reduces a plurality of entered images respectively; converts the plurality of reduced images, together with information indicating that these plural images are to be placed side by side and synthesized into a single image, into information having a predetermined format; and then transmits the converted information.
  • this image processing apparatus may also convert attribute information with respect to the respective images into information having a predetermined format, and then may transmit the converted information.
  • An image processing apparatus provided on the reception side receives the information having the predetermined format, which contains the plural images, attribute information of the respective images, and the synthesize-instructing information for placing these plural images side by side to synthesize them into a single image.
  • This image processing apparatus places a plurality of images side by side to be synthesized with each other so as to produce a single synthesized image in accordance with the above-described synthesize-instructing information; and then outputs the synthesized image.
  • the image processing apparatus provided on the reception side may correct the respective images before being synthesized with each other, or the synthesized image by employing this transmitted attribute information.
  • the image processing apparatus provided on the reception side may produce the attribute information, and may correct the respective images before being synthesized with each other, or the synthesized image based upon this produced attribute information.
  • this image processing apparatus receives these reduced images as independent images. Accordingly, the optimum process operations may be carried out with respect to the respective images in the process operation when the image synthesizing operation is carried out, and after the image synthesizing operation has been executed. As a consequence, the “N-up” function can be realized in a high image quality similar to that obtained when the “direct copy” is produced.
  • FIG. 1 is a block diagram for indicating the configuration of a transmission side according to a first embodiment of the present invention.
  • FIG. 2 is a flow chart for explaining operations of the transmission side according to the first embodiment of the invention.
  • FIG. 3 is an explanatory diagram for explaining one example of a gradation correcting process operation.
  • FIG. 4 is an explanatory diagram for explaining process pixels in one example of a precision correcting process operation.
  • FIG. 5 is an explanatory diagram for explaining multiplication coefficients in one example of the precision correcting process operation.
  • FIG. 6 is an explanatory diagram for explaining one example of a format converting process operation.
  • FIG. 7 is an explanatory diagram for explaining one example of a storage processing operation of an image in the case that the normal image transmission not by the “N-up” function is carried out.
  • FIG. 8 is an explanatory diagram for explaining an example of storage processing operation of image data in the case of “2-up” processing operation.
  • FIG. 9 is an explanatory diagram for explaining an example of storage processing operation of image data in the case of “4-up” processing operation.
  • FIG. 10 is an explanatory diagram for explaining an example of reading process operation with respect to a synthesized image when the “2-up” synthesizing operation is performed in which a color original is mixed with a black/white original.
  • FIG. 11 is an explanatory diagram for explaining an example of an arranging relationship among respective images when the “N-up” process operation is carried out.
  • FIG. 12 is a block diagram for indicating an arrangement of an image processing apparatus provided on the reception side according to the first embodiment of the present invention.
  • FIG. 13 is a flow chart for describing operations of the image processing apparatus provided on the reception side according to the first embodiment of the present invention.
  • FIG. 14 is an explanatory diagram for explaining an example of Tag information.
  • FIG. 15 is an explanatory diagram for explaining a relationship between image data and the Tag information.
  • FIG. 16 is an explanatory diagram for explaining an example of a color correcting process operation.
  • FIG. 17 is a flow chart for explaining an example of operations of a transmission side according to a second embodiment of the invention.
  • FIG. 18 is a flow chart for describing an example of operations of a reception side according to the second embodiment of the invention.
  • FIG. 19 is an explanatory diagram for schematically explaining the MRC compression system.
  • FIG. 20 is an explanatory diagram for explaining an example in which the MRC compression system is applied to the “N-up” function in a third embodiment of the present invention.
  • FIG. 21 is a flow chart for explaining an example of operations of a transmission side according to a fourth embodiment of the invention.
  • FIG. 22 is a block diagram for indicating a configuration of a reception side according to the fourth embodiment of the invention.
  • FIG. 23 is a flow chart for describing an example of operations of the reception side according to the fourth embodiment of the present invention.
  • FIG. 24 is a flow chart for explaining an example of operations of the transmission side according to a fifth embodiment of the invention.
  • FIG. 25 is a flow chart for describing an example of operations of the reception side according to the fifth embodiment of the invention.
  • FIG. 26 is an explanatory diagram for explaining an example of both a computer program and a storage medium which stores thereinto this computer program in the case that either a function of an image processing apparatus or an image processing method, according to the invention, are realized by the computer program.
  • FIG. 27 is an explanatory diagram for explaining one example of such a function that a plurality of original images are placed side by side on single paper to be recorded.
  • FIG. 28 is an explanatory diagram for explaining one structural example capable of realizing the “N-up” function when the “direct copy” is produced.
  • FIG. 1 is a schematic block diagram for indicating an arrangement of an image processing apparatus provided on the transmission side, according to a first embodiment of the invention.
  • reference numeral 11 shows an image input unit
  • reference numeral 12 indicates an image processing unit
  • reference numeral 13 represents a correcting unit
  • reference numeral 14 denotes an editing unit.
  • reference numeral 15 shows a storage unit
  • reference numeral 16 indicates a control unit
  • reference numeral 17 represents a U/I
  • reference numeral 18 shows an I/O
  • reference numeral 19 indicates a system bus.
  • the image input unit 11 is constituted by, for example, a scanner and the like, and reads out a color image formed on an original.
  • the image processing unit 12 performs a necessary process operation with respect to a digital image signal read by the image input unit 11 , and thereafter, transmits the processed image signal in a predetermined format.
  • the image processing unit 12 is constituted by the correcting unit 13 , the editing unit 14 , the storage unit 15 , the control unit 16 , the U/I 17 , the I/O 18 , the system bus 19 , and the like.
  • the correcting unit 13 is to perform such a process operation that a color characteristic, a gradation characteristic, and a spatial characteristic of the image input unit 11 are corrected so as to improve an image quality.
  • the editing unit 14 executes various sorts of image processing operations, for instance, an enlargement of an image, and a reduction of an image.
  • the storage unit 15 corresponds to a storage section capable of storing thereinto an amount of at least one page data of images.
  • the control unit 16 is constructed of a CPU, a ROM, a RAM, and the like.
  • the control unit 16 executes a portion of the image processing operations, and also controls the image processing unit 12 .
  • the U/I 17 is a user interface which is used by a user who instructs operations executed in both the image input unit 11 and the image processing unit 12 .
  • the I/O 18 is disposed to connect to a network, via which a communication is carried out with respect to an external apparatus.
  • the system bus 19 corresponds to a local bus used to connect the above-described processing units to each other.
  • FIG. 2 is a flow chart for describing an example of operations of the image processing apparatus provided on the transmission side according to the first embodiment of the present invention.
  • a normal operation is firstly explained, and thereafter, an operation executed in case of the “N-up” function is explained.
  • Various process setting operations are carried out in the U/I 17 , which are related to a sort of original, a process mode, and information on the external device to which data is to be transmitted.
  • the set original is read in the image input unit 11 .
  • The reading operation of this original is realized in such a manner that the original is optically read by a main scanning operation using a one-dimensionally arrayed CCD sensor along the sensor array direction, while the original and the CCD sensor are relatively moved by a sub-scanning operation along the direction perpendicular to the sensor array direction. Since this original reading operation is carried out by the image input unit 11 , a color image signal is produced from the original, with a resolution of, for example, 400 dpi (400 dots per 25.4 mm) and 8-bit gradation for each of the R (red), G (green), and B (blue) colors per pixel.
  • either an enlarging process operation or a reducing process operation is carried out
  • Either the enlarging process operation or the reducing process operation along the sub-scanning direction may be realized by controlling the moving speed in the sub-scanning direction in this image input unit 11 .
  • In the following description, it is assumed that either the enlarging process operation or the reducing process operation is carried out by employing this method.
  • FIG. 3 is an explanatory diagram for explaining an example of the gradation correcting process operation. As indicated in FIG. 3 , the gradation correcting process operation corresponds to such a process operation for correcting both a gray balance and a gradation characteristic of each of R, G, B colors.
  • symbols “Rin”, “Gin”, and “Bin” show pixel values before the gradation correcting process operations are executed, and symbols “Rout”, “Gout”, “Bout” represent pixel values after the gradation correcting process operations are carried out.
  • a correspondence relationship between input (before gradation correction) pixel values and output (after gradation correction) pixel values is previously obtained by employing such converting curves as shown in FIG. 3 to form a one-dimensional lookup table (will be referred to as an “LUT” hereinafter)
  • An output (after gradation correction) pixel value may be obtained based upon an input (before gradation correction) pixel value with reference to the LUT.
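  • As an illustration only (not part of the patent text), the LUT-based gradation correction described above can be sketched in Python as follows; the gamma-like correction curve used here is an arbitrary placeholder, not the converting curves of FIG. 3, which depend on the characteristic of the image input unit 11.

    # Hedged sketch of the one-dimensional LUT gradation correction (step S103).
    # The curve below is a placeholder gamma curve; the patent's actual converting
    # curves (FIG. 3) are device-dependent and are not reproduced here.

    def build_gradation_lut(gamma=2.2):
        # Precompute the output value for every possible 8-bit input value.
        return [round(255 * ((v / 255.0) ** (1.0 / gamma))) for v in range(256)]

    def apply_gradation_correction(pixels_rgb, lut_r, lut_g, lut_b):
        # pixels_rgb: iterable of (R, G, B) tuples, 8 bits per channel.
        return [(lut_r[r], lut_g[g], lut_b[b]) for (r, g, b) in pixels_rgb]

    if __name__ == "__main__":
        lut = build_gradation_lut()
        print(apply_gradation_correction([(0, 128, 255)], lut, lut, lut))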
  • the color correcting process operation is to correct the color characteristic of the image input unit 11 , and may be realized in such a manner that the below-mentioned matrix calculation is carried out by employing a preset coefficient:
  • [Rout, Gout, Bout]ᵀ = A · [Rin, Gin, Bin, Gin·Bin, Rin·Bin, Rin·Gin, Rin², Gin², Bin²]ᵀ + [b1, b2, b3]ᵀ, where A is the 3×9 matrix of multiplication coefficients a11 to a39.
  • symbols “Rin”, “Gin”, “Bin” show pixel values before the color correcting process operation is executed
  • symbols “Rout”, “Gout”, “Bout” denote pixel values after the color correcting process operation is carried out
  • symbols "a11" to "a39" represent multiplication coefficients
  • symbols "b1" to "b3" indicate addition coefficients.
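  • As an illustrative sketch (not the patent's own coefficients), the matrix calculation above can be expressed as follows; the identity-like values in A and the zero offsets in B are placeholders for the preset, device-dependent coefficients.

    # Hedged sketch of the color correcting matrix calculation (step S104).
    # Rows of A correspond to Rout, Gout, Bout; columns correspond to the terms
    # Rin, Gin, Bin, Gin*Bin, Rin*Bin, Rin*Gin, Rin^2, Gin^2, Bin^2.
    A = [
        [1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
        [0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
        [0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
    ]
    B = [0.0, 0.0, 0.0]

    def color_correct(r_in, g_in, b_in):
        terms = [r_in, g_in, b_in, g_in * b_in, r_in * b_in, r_in * g_in,
                 r_in ** 2, g_in ** 2, b_in ** 2]
        out = [sum(a * t for a, t in zip(row, terms)) + b for row, b in zip(A, B)]
        # Clip to the 8-bit range used elsewhere in the description.
        return tuple(max(0, min(255, round(v))) for v in out)

    print(color_correct(10, 200, 30))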
  • FIG. 4 and FIG. 5 are explanatory diagrams for explaining an example of the precision correcting process operation.
  • This precision correcting process operation is to correct the spatial characteristic of the image input unit 11 , and may be realized by executing a convolution calculation of 7 ⁇ 7 pixels while a pixel of interest is located as a center, as indicated in FIG. 4 and FIG. 5 .
  • FIG. 4 shows 7 ⁇ 7 pixels in a color image
  • FIG. 5 indicates multiplication coefficients in the respective pixel positions.
  • the convolution calculation sums, over the 7×7 pixels, the products of each pixel value and the corresponding multiplication coefficient; a sketch of such a calculation is given below.
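  • The following is a hedged Python sketch of the 7×7 convolution used for the precision correction (step S105). The kernel of uniform coefficients is only a neutral placeholder for the multiplication coefficients of FIG. 5, and border handling by coordinate clamping is an assumption.

    def convolve7x7(channel, width, height, kernel):
        # channel: flat list of length width*height, one colour component per pixel.
        out = [0] * (width * height)
        for y in range(height):
            for x in range(width):
                acc = 0.0
                for ky in range(-3, 4):
                    for kx in range(-3, 4):
                        sx = min(max(x + kx, 0), width - 1)   # clamp at the borders
                        sy = min(max(y + ky, 0), height - 1)
                        acc += channel[sy * width + sx] * kernel[ky + 3][kx + 3]
                out[y * width + x] = max(0, min(255, round(acc)))
        return out

    kernel = [[1.0 / 49.0] * 7 for _ in range(7)]  # placeholder coefficients
    print(convolve7x7([128] * 49, 7, 7, kernel)[24])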
  • the RGB-color digital image signals to which the above-described correcting process operations have been executed are outputted to the editing unit 14 .
  • Various sorts of correcting process operations, namely the gradation correcting process operation of the step S103, the color correcting process operation of the step S104, and the precision correcting process operation of the step S105, have been carried out with respect to the RGB-color digital image signals, so that the characteristic of the image input unit 11 has been corrected.
  • a magnification changing process operation of an image defined in a step S 106 is executed, and an adjusting process operation of an image quality defined in a step S 107 is carried out.
  • both the enlarging process operation and the reducing process operation along the sub-scanning direction are realized by controlling the original scanning speed by the image input unit 11 .
  • a one-dimensional enlarging process operation and a one-dimensional reducing process operation are carried out by way of the known interpolation calculating process operation with respect to the main scanning direction of the image, and both a two-dimensional enlarging process operation and a two-dimensional reducing process operation may be realized in combination with the magnification changing process operation along the sub-scanning direction by the image input unit 11 .
  • an image quality adjusting process operation such as contrast, brightness, color balance, sharpness, color hue, and saturation is also carried out.
  • The image quality adjusting process operation as to the contrast, the brightness, and the color balance may be realized by executing, for example, a gamma adjusting process operation with reference to the known one-dimensional LUT employing a preset correction curve.
  • the sharpness adjustment may be realized by executing a convolution calculation with employment of preset multiplication coefficients with respect to, for example, 5 ⁇ 5 pixels while an interest pixel is located at a center.
  • Both the color hue adjustment and the saturation adjustment may be realized by executing, for instance, a matrix calculation on the color signals; one possible form of such a calculation is sketched below.
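  • The patent's concrete hue/saturation matrix is not reproduced in this excerpt; the Python sketch below shows one common way such a matrix can be built, by rotating and scaling the chroma components of a simple YCbCr-like space. The BT.601-style transform coefficients and the example parameters are assumptions, not the patent's values.

    import math

    def hue_saturation_matrix(hue_deg, sat_gain):
        # RGB -> (Y, Cb, Cr)-like space, rotate/scale the chroma plane, transform back.
        to_ycc = [[0.299, 0.587, 0.114],
                  [-0.169, -0.331, 0.5],
                  [0.5, -0.419, -0.081]]
        from_ycc = [[1.0, 0.0, 1.402],
                    [1.0, -0.344, -0.714],
                    [1.0, 1.772, 0.0]]
        c, s = math.cos(math.radians(hue_deg)), math.sin(math.radians(hue_deg))
        adjust = [[1.0, 0.0, 0.0],
                  [0.0, sat_gain * c, -sat_gain * s],
                  [0.0, sat_gain * s, sat_gain * c]]

        def matmul(a, b):
            return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

        return matmul(from_ycc, matmul(adjust, to_ycc))

    def apply_matrix(m, rgb):
        return tuple(max(0, min(255, round(sum(m[i][j] * rgb[j] for j in range(3))))) for i in range(3))

    m = hue_saturation_matrix(10.0, 1.2)  # example: +10 degree hue shift, 20% more saturation
    print(apply_matrix(m, (120, 80, 200)))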
  • In a step S108, the storage unit 15 stores the image data entered from the editing unit 14 , for which both the correcting process operation and the editing process operation have been completed.
  • In a step S109, the control unit 16 judges whether or not the reading operations for all pages of images are accomplished. In the case that there is an original whose image has not yet been read, the process operation returns to the previous step S102, so that the process operations from the reading of images by the image input unit 11 onward are carried out in a similar manner. Thus, all pages of the image data are stored into the storage unit 15 .
  • a format of image data stored in the storage unit 15 is converted into a predetermined format used to be transmitted by the control unit 16 in a step S 110 .
  • the image data having this predetermined format, which is produced by the control unit 16 is transmitted via the I/O 18 to the external device set by the U/I 17 , so that a series of image processing operations is completed.
  • FIG. 6 is an explanatory diagram for explaining an example of a format converting process operation.
  • both resolution and a compression system of an image to be transmitted are controlled in response to a mode which indicates as to whether a chromatic color is present, and also indicates a sort of original, so that both a data amount and an image quality of an image to be transmitted are optimized.
  • a converting process operation of a color original into a black/white original is carried out as a color correcting process operation by the above-described correcting unit 13
  • both a resolution converting operation and a compressing process operation are carried out as a format converting process operation by the control unit 16 .
  • the transmission condition set from the U/I 17 in the step S 101 may be utilized.
  • In the step S101, when various transmission conditions are set by the U/I 17 , the "N-up" process operation is designated. Then, in the step S102, the reading operation of an image formed on an original is carried out with a changed magnification in response to the transmission condition set in the step S101. Subsequently, correcting process operations similar to those explained above, such as the gradation correcting operation (step S103), the color correcting operation (step S104), and the precision correcting operation (step S105), are carried out.
  • both a magnification changing process operation (step S 106 ) of the image along the main scanning direction, and an adjusting process operation (step S 107 ) of an image quality are carried out in response to the designated “N-up” function.
  • the storage unit 15 stores the image data to which the above-described process operations have been accomplished into a page memory.
  • FIG. 7 is an explanatory diagram for explaining one example of a storage process operation of an image in the case that the normal image transmission (not “N-up” process operation) is carried out.
  • FIG. 8 and FIG. 9 are explanatory diagrams for explaining one example of storage process operation of image data in such a case that the “N-up” process operation is carried out.
  • FIG. 7 ( 1 ) indicates such an image which has been read by the image input unit 11 , and then, has been processed by both the correcting unit 13 and the editing unit 14 .
  • FIG. 7 ( 2 ) represents that an image to which both the correcting process operation and the editing process operation have been executed is stored into the page memory employed in the storage unit 15 .
  • an arrow schematically shows the sequential order in which pixel values of the above-described image are stored into the page memory.
  • FIG. 8 shows such a case that image data to which the “two-up” process operation has been executed is stored into the storage unit 15 .
  • FIG. 8 ( 1 ) indicates an original to which the “2-up” editing operation is carried out.
  • symbols “A” and “B” represent a first original image and a second original image, respectively.
  • FIG. 8 ( 2 ) conceptually indicates a digital color image read by the image input unit 11 .
  • an arrow-line of a horizontal direction shows a main scanning direction
  • an arrow-line of a vertical direction indicates a sub-scanning direction
  • a point “O” located at an upper left position denotes an origin (0, 0) of this digital color image similar to those of FIG. 7 .
  • FIG. 8 ( 3 ) shows such an image that after the correcting process operation has been performed based upon an attribute of an original by the correcting unit 13 employed in the image processing unit 12 , the magnification changing process operation with respect to the main scanning direction is carried out by the editing unit 14 .
  • FIG. 8 ( 4 ) and FIG. 8 ( 5 ) conceptually indicate the storage conditions under which the images shown in FIG. 8 ( 3 ) are stored into the page memory employed in the storage unit 15 .
  • Both the original image "A" and the original image "B" are placed side by side with each other in such a manner that the writing start coordinate value and the writing direction of each image are made different from those of the normal process operation shown in FIG. 7 . It should be understood that when the original image "B" is written, its writing start coordinate value may be made different from that of the original image "A."
  • FIG. 9 shows such a case that image data to which the “four-up” process operation has been executed is stored into the storage unit 15 .
  • FIG. 9 ( 1 ) indicates an original to which the “4-up” editing operation is carried out.
  • symbols “A”, “B”, “C”, and “D” represent a first original image, a second original image, a third original image, and a fourth original image, respectively.
  • FIG. 9 ( 2 ) conceptually indicates a digital color image read by the image input unit 11 .
  • an arrow-line of a horizontal direction shows a main scanning direction
  • an arrow-line of a vertical direction indicates a sub-scanning direction
  • a point “O” located at an upper left position denotes an origin (0, 0) of this digital color image similar to those of FIG. 7 .
  • An image outputted from the image input unit 11 constitutes an image to which the reducing process operation is applied only along the sub-scanning direction.
  • FIG. 9 ( 4 ) conceptually indicates the storage conditions under which the images shown in FIG. 9 ( 3 ) are stored into the page memory employed in the storage unit 15 .
  • The original images "A", "B", "C", and "D" are placed side by side with each other in a designated layout in such a manner that the writing start coordinate values of these images are made different from each other.
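  • As a hedged illustration (not the patent's implementation), the "N-up" placement into the page memory described for FIG. 8 and FIG. 9 can be sketched as follows: each reduced image is written starting from a different coordinate of a single page buffer. The image sizes and the 2×2 layout are illustrative only.

    def blank_page(width, height, value=255):
        return [[value] * width for _ in range(height)]

    def write_image(page, image, start_x, start_y):
        # Copy one reduced original into the page memory at the given start coordinates.
        for y, row in enumerate(image):
            for x, pixel in enumerate(row):
                page[start_y + y][start_x + x] = pixel

    def n_up(page_w, page_h, images, cols, rows):
        page = blank_page(page_w, page_h)
        cell_w, cell_h = page_w // cols, page_h // rows
        for index, image in enumerate(images):
            col, row = index % cols, index // cols   # "forward lateral" ordering (FIG. 11(2))
            write_image(page, image, col * cell_w, row * cell_h)
        return page

    # Example: four 2x2 "originals" A..D placed as a 4-up page of 4x4 pixels.
    originals = [[[v] * 2 for _ in range(2)] for v in (10, 20, 30, 40)]
    for line in n_up(4, 4, originals, cols=2, rows=2):
        print(line)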
  • the format of the synthesized image stored in the storage unit 15 in the above-described manner is converted into a predetermined format required for a preselected transmission by the control unit 16 in a similar manner to that of the normal processing operation.
  • an attribute of the synthesized image is determined based upon attributes of original images to be synthesized, and both predetermined resolution and a predetermined compressing system are set.
  • an attribute expressed in this specification indicates both a mode and a sort of such an original shown in FIG. 6 .
  • An attribute of a synthesized image may be determined by, for example, selecting such an attribute to which the highest image quality is required from attributes of the respective original images.
  • Since a color original requires a higher image quality as compared with a black/white original, a full color mode is selected.
  • A character/photograph-mixed original requires a higher image quality than a photograph original or a character original, in which either gradation or resolution alone is given top priority.
  • For a map original, which furthermore requires both higher resolution and superior gradation, a reproducing process operation with a high image quality is required.
  • For example, an attribute of a synthesized image which is formed from a black/white original and a photograph original by the "2-up" processing operation may be determined to be the black/white character/photograph image, and then MMR compression at a resolution of 400 dpi is selected from FIG. 6 .
  • Similarly, an attribute of a synthesized image which is formed from a color character original and a black/white map original by the "2-up" processing operation may be determined based upon the color map original, and JPEG compression at a resolution of 400 dpi is selected from FIG. 6 .
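  • A hedged Python sketch of this attribute determination is given below: the attribute requiring the highest image quality among the originals is selected. The priority ordering and the resolution/compression table are assumptions modeled loosely after the two examples above; FIG. 6 itself is not reproduced in this text.

    def combine_mode(modes):
        return "full color" if "full color" in modes else "black/white"

    def combine_sort(sorts):
        s = set(sorts)
        if "map" in s:
            return "map"                       # map needs both resolution and gradation
        if {"character", "photograph"} <= s or "character/photograph" in s:
            return "character/photograph"
        return next(iter(s))

    FORMAT_TABLE = {  # (mode, sort) -> (resolution dpi, compression); illustrative values
        ("black/white", "character/photograph"): (400, "MMR"),
        ("full color", "map"): (400, "JPEG"),
    }

    def synthesized_attribute(originals):
        mode = combine_mode(o["mode"] for o in originals)
        sort = combine_sort(o["sort"] for o in originals)
        return mode, sort

    attr = synthesized_attribute([{"mode": "black/white", "sort": "character"},
                                  {"mode": "black/white", "sort": "photograph"}])
    print(attr, FORMAT_TABLE[attr])   # ('black/white', 'character/photograph') (400, 'MMR')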
  • the synthesized image held in the storage unit 15 is transferred to the control unit 16 , and both the resolution converting operation and the reducing process operation are carried out in accordance with the determined attribute.
  • FIG. 10 is an explanatory diagram for explaining an example of a reading process operation with respect to a synthesized image produced based upon the “2-up” function in which a color original and a black/white original are mixed.
  • In this case, the synthesized image may desirably be processed as the color image, as indicated in FIG. 10 .
  • the “N-up” image data which becomes the reduced image having the resolution selected in the above-explained manner is temporarily stored in a RAM employed in the control unit 16 . Subsequently, the “N-up” image data is converted into transmission data having a predetermined format.
  • image data format for transmission purposes, various known formats may be utilized.
  • In this embodiment, the TIFF (Tagged Image File Format) is used, which is widely known as an image format having superior extensibility.
  • Information required for realizing the "N-up" function is defined as a private data field, and this private data field is additionally provided in the transmission data to be transferred to an external device. For instance, five pieces of information, including the arrangement information (d) and the layout information (e) described below, may preferably be added to a TIFF file as private data for realizing the "N-up" function.
  • FIG. 11 is an explanatory diagram for explaining one example of an arranging relationship of respective images when the “N-up” process operation is carried out.
  • the above-described additional information (d) and (e) will be explained.
  • FIG. 11 indicates such an example that a so-called “4-up” image is produced by using four sheets of original images shown in FIG. 11 ( 1 ).
  • In the "4-up" image, two sheets of these original images are arranged along the main scanning direction, and two sheets are arranged along the sub-scanning direction.
  • Information indicating that, among the four original images, two original images are arranged along the main scanning direction and two original images are arranged along the sub-scanning direction corresponds to the above-described additional information (d).
  • this information indicates that four sheets of originals are arranged based upon any of layouts, namely a “forward lateral arrangement” shown in FIG. 11 ( 2 ); a “forward longitudinal arrangement” indicated in FIG. 11 ( 3 ); a “backward lateral arrangement” represented in FIG. 11 ( 4 ); and a “backward longitudinal arrangement” denoted in FIG. 11 ( 5 ).
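  • The following is a hedged Python sketch of carrying this "N-up" additional information as private data in the transmission format. Only items (d) and (e) described above are modeled; the remaining items of the five-piece private data set are not reproduced in this excerpt, and the field names and private tag numbers (values at or above 32768 are the customary private range in TIFF) are hypothetical, not the patent's actual assignments.

    NUP_IMAGES_MAIN_SCAN = 0x8765      # (d) originals arranged along the main scanning direction (hypothetical tag)
    NUP_IMAGES_SUB_SCAN  = 0x8766      # (d) originals arranged along the sub-scanning direction (hypothetical tag)
    NUP_LAYOUT_ORDER     = 0x8767      # (e) layout order (hypothetical tag)

    LAYOUT_ORDERS = {0: "forward lateral", 1: "forward longitudinal",
                     2: "backward lateral", 3: "backward longitudinal"}   # FIG. 11(2)-(5)

    def build_nup_private_fields(cols, rows, layout_code):
        if layout_code not in LAYOUT_ORDERS:
            raise ValueError("unknown layout code")
        return {NUP_IMAGES_MAIN_SCAN: cols,
                NUP_IMAGES_SUB_SCAN: rows,
                NUP_LAYOUT_ORDER: layout_code}

    fields = build_nup_private_fields(cols=2, rows=2, layout_code=0)  # the 4-up example of FIG. 11
    print(fields)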
  • the data for transmission purposes which has been produced by the control unit 16 in the above-described manner, is transmitted via the system bus 19 and the I/O 18 to the network.
  • FIG. 12 is a schematic block diagram for indicating an arrangement of an image processing apparatus provided on a reception side, according to the first embodiment of the present invention.
  • reference numeral 21 shows an image output unit
  • reference numeral 22 indicates an image processing unit
  • reference numeral 23 represents a correcting unit
  • reference numeral 24 denotes an editing unit.
  • reference numeral 25 shows a storage unit
  • reference numeral 26 indicates a control unit
  • reference numeral 27 represents a U/I
  • reference numeral 28 shows an I/O
  • reference numeral 29 indicates a system bus.
  • The image output unit 21 has, for example, an electrophotographic image forming section, and records/outputs a digital color image.
  • the image output unit 21 forms an image by using four colors constructed of a Y (yellow) color, an M (magenta) color, a C (cyan) color, and a K (black) color.
  • the present invention is not limited to this example.
  • the image processing unit 22 produces/processes a digital image signal which is outputted to the image output unit 21 .
  • the image processing unit 22 is constituted by the correcting unit 23 , the editing unit 24 , the storage unit 25 , the control unit 26 , the U/I 27 , the I/O 28 , the system bus 29 , and the like.
  • the correcting unit 23 is to perform such a process operation that a color characteristic, a gradation characteristic, and a spatial characteristic of the image output unit 21 are corrected.
  • the image processing unit 22 produces, for example, a 4-color image made of Y, M, C, K colors which are used when an image is formed in the image output unit 21 .
  • the editing unit 24 executes an enlarging process operation and a reducing process operation of an image.
  • the editing unit 24 owns a function capable of making resolution of received image data identical to resolution of the image output unit 21 .
  • the storage unit 25 corresponds to a storage section capable of storing thereinto an amount of at least one page of images.
  • the control unit 26 is constructed of, for example, a CPU, a ROM, a RAM, and the like. The control unit 26 executes a portion of the image processing operations, and also controls the image processing unit 22 .
  • the U/I 27 is a user interface which is used by a user who instructs operations executed in both the image output unit 21 and the image processing unit 22 , and which displays a state of the system, and an error.
  • the I/O 28 is an input/output section used to connect to a network.
  • the system bus 29 corresponds to a local bus used to connect the above-described respective processing units.
  • FIG. 13 is a flow chart for describing an example of operations of the image processing apparatus provided on the reception side according to the first embodiment of the present invention.
  • this image processing apparatus provided on the reception side receives such an image data which is transmitted via the I/O 28 from the image processing apparatus provided on the transmission side shown in FIG. 1 , and temporarily stores the received image data into the RAM employed in the control unit 26 .
  • The control unit 26 firstly extracts, from the received image data having a predetermined format, document information indicative of a total page number, a total print quantity, a designation of paper, double-face printing, designation/non-designation of an "N-up" original, and so on.
  • The control unit 26 then interprets the additional information in units of pages so as to extract the resolution and the color mode of an output image and also the sort of the original, and sets process conditions in the editing unit 24 and the correcting unit 23 (as will be explained later). Subsequently, in a step S123, the control unit 26 executes an expanding process operation on the reduced image data, and transfers the expanded image data via the system bus 29 to the storage unit 25 . In a step S125, the control unit 26 stores the transferred image data into the storage unit 25 .
  • In a step S124, the control unit 26 produces information (hereinafter referred to as "Tag information") which contains the same number of pixels as the output image and indicates, for each pixel, both a color mode and a sort of original. The control unit 26 then stores this Tag information into the storage unit 25 in the same manner as the image data.
  • FIG. 14 is an explanatory diagram for explaining an example of the Tag information.
  • FIG. 15 is an explanatory diagram for explaining a relationship between image data and the Tag information.
  • the Tag information contains such information indicative of a mode and a sort.
  • the Tag information may be constituted by 4 bits, namely, 2 bits indicative of a mode, and 2 bits representative of a sort.
  • As to the 2 bits indicative of the mode, the mode expressed by the bit string is indicated in FIG. 14 ( 1 )
  • the sort of original is shown in FIG. 14 ( 3 ).
  • The meanings expressed by the respective bit strings are not limited to this example; the number of bits and the information contained in the Tag information may be arbitrarily selected.
  • both the expanded image information and the Tag information having the same size as that of this expanded image information are stored in the storage unit 25 .
  • the Tag information becomes redundant.
  • the Tag information is unified to 4 bits/pixel for the sake of simple control characteristics of the entire system.
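  • A hedged Python sketch of this per-pixel Tag information follows: 2 bits for the mode and 2 bits for the sort, packed into 4 bits per pixel. The concrete bit assignments of FIG. 14 are not reproduced here, so the encodings below are placeholders.

    MODES = {"black/white": 0b00, "full color": 0b01}            # placeholder bit codes
    SORTS = {"character": 0b00, "photograph": 0b01, "map": 0b10} # placeholder bit codes

    def pack_tag(mode, sort):
        return (MODES[mode] << 2) | SORTS[sort]

    def unpack_tag(tag):
        mode_bits, sort_bits = (tag >> 2) & 0b11, tag & 0b11
        mode = next(k for k, v in MODES.items() if v == mode_bits)
        sort = next(k for k, v in SORTS.items() if v == sort_bits)
        return mode, sort

    def tag_plane(width, height, mode, sort):
        # One 4-bit tag per output pixel, i.e. the same pixel count as the output image.
        return [[pack_tag(mode, sort)] * width for _ in range(height)]

    plane = tag_plane(4, 2, "full color", "photograph")
    print(plane[0][0], unpack_tag(plane[0][0]))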
  • the editing unit 24 executes the linear interpolation calculations along both the main scanning direction and the sub-scanning direction, so that a two-dimensional resolution converting process operation is carried out.
  • This resolution converting process operation corresponds to such a process operation that resolution of the received image is made coincident with the resolution of the image output unit 21 . In such a case that the resolution of the received image is equal to the resolution of the image output unit 21 , this resolution converting process operation is not executed.
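  • As an illustration only (not the patent's implementation), the two-dimensional resolution conversion by linear interpolation along the main scanning and sub-scanning directions can be sketched as follows; the example magnification is arbitrary.

    def resample_bilinear(src, src_w, src_h, dst_w, dst_h):
        # src: flat list of length src_w*src_h (one channel); returns a flat dst list.
        dst = [0] * (dst_w * dst_h)
        for y in range(dst_h):
            fy = y * (src_h - 1) / max(dst_h - 1, 1)
            y0 = int(fy); y1 = min(y0 + 1, src_h - 1); wy = fy - y0
            for x in range(dst_w):
                fx = x * (src_w - 1) / max(dst_w - 1, 1)
                x0 = int(fx); x1 = min(x0 + 1, src_w - 1); wx = fx - x0
                top = src[y0 * src_w + x0] * (1 - wx) + src[y0 * src_w + x1] * wx
                bot = src[y1 * src_w + x0] * (1 - wx) + src[y1 * src_w + x1] * wx
                dst[y * dst_w + x] = round(top * (1 - wy) + bot * wy)
        return dst

    # Example: convert a 2x2 image (e.g. received at a lower resolution) to 4x4.
    print(resample_bilinear([0, 100, 100, 200], 2, 2, 4, 4))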
  • process operations executed after the editing unit 24 correspond to such process operations with respect to a multi-value image. In the case that image data stored in the storage unit 25 corresponds to a binary image, while black pixels thereof are allocated to 255 and white pixels thereof are allocated to 0, image processing operations of the respective pixels are performed.
  • the correcting unit 23 produces a YMCK image formed in the image output unit 21 , and also, executes an image correcting process operation in correspondence with the characteristic of the image output unit 21 .
  • a color correcting process operation for converting the entered RGB image into a YMCK image is carried out.
  • the color correcting process operation for example, approximate positions within a three-dimensional RGB-color space are determined from 4-bit information of upper grades of the respective entered R, G, B signals so as to select a plurality of Y, M, C, K representative values, and also representative values are interpolated by employing 4-bit data of lower grades of the respective entered R, G, B signals.
  • final Y, M, C, and K values may be obtained.
  • FIG. 16 is an explanatory diagram for explaining one example of the color correcting process operation.
  • reference numeral 31 indicates a three-dimensional table
  • reference numeral 32 shows an interpolation calculator.
  • FIG. 16 ( 2 ) shows an RGB-color space having 4,096 pieces of cross points, which are formed by the upper 4-bit signals of the R, G, B color signals. In other words, combinations of 17 points which are produced by subdividing each of the R, G, B axes constitute the respective cross points shown in FIG. 16 ( 2 ). Since YMCK values are defined in correspondence with these cross points (RGB values), the RGB values may be converted into the YMCK values.
  • The three-dimensional table 31 shown in FIG. 16 ( 1 ) stores 4,096 pieces of YMCK values corresponding to, for instance, the cross points shown in FIG. 16 ( 2 ).
  • Y, M, C, K data of the eight cross points corresponding to the 12-bit (in total) data, namely the upper 4 bits of each of the entered R, G, B data, are outputted to the interpolation calculator 32 .
  • FIG. 16 ( 3 ) is an explanatory diagram for explaining process operations executed in the interpolation calculator.
  • symbols ( ⁇ r, ⁇ g, ⁇ b) indicate lower 4-bit values of inputted RGB data.
  • the interpolation calculator 32 executes the interpolation calculation by employing these lower 4 bit values ( ⁇ r, ⁇ g, ⁇ b) and the YMCK values corresponding to eight cross points of (r 1 , g 1 , b 1 ), (r 1 , g 1 , b 2 ), - - - , (r 2 , g 2 , b 2 ) which are determined by the upper 4 bits, so that output Y, M, C, K values (y out, m out, c out, k out) can be determined.
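  • A hedged Python sketch of this table-plus-interpolation color correction follows: the upper 4 bits of each of R, G, B select the neighbouring cross points of a three-dimensional table, and the lower 4 bits interpolate between them. The identity-like RGB-to-YMCK values used to fill the table are placeholders, not the device-dependent table the patent refers to.

    def build_table():
        # 17 grid points per axis (0, 16, ..., 255) -> placeholder (Y, M, C, K) values.
        grid = [min(i * 16, 255) for i in range(17)]
        table = {}
        for r in grid:
            for g in grid:
                for b in grid:
                    table[(r, g, b)] = (255 - b, 255 - g, 255 - r, 0)  # placeholder YMCK
        return grid, table

    def lookup_ymck(r, g, b, grid, table):
        def corners(v):
            i = min(v >> 4, 15)                              # upper 4 bits select the cell
            return grid[i], grid[i + 1], (v & 0x0F) / 16.0   # lower 4 bits give the fraction
        (r0, r1, fr), (g0, g1, fg), (b0, b1, fb) = corners(r), corners(g), corners(b)
        out = [0.0] * 4
        for cr, wr in ((r0, 1 - fr), (r1, fr)):
            for cg, wg in ((g0, 1 - fg), (g1, fg)):
                for cb, wb in ((b0, 1 - fb), (b1, fb)):
                    w = wr * wg * wb
                    for k in range(4):
                        out[k] += w * table[(cr, cg, cb)][k]
        return tuple(round(v) for v in out)

    grid, table = build_table()
    print(lookup_ymck(10, 130, 250, grid, table))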
  • The precision correcting operation is realized by executing, for example, the known convolution calculation shown in FIG. 4 and FIG. 5 with respect to the YMCK image produced by the color correcting process operation.
  • The gradation correcting process operation is performed in a known manner, for instance by referring to a one-dimensional LUT as shown in FIG. 3 . While optimum processing parameters are prepared in advance in response to the modes and sorts for the above-explained color correcting, precision correcting, and gradation correcting process operations, these optimum processing parameters may be properly switched based upon the mode and the sort of an image to be outputted.
  • the YMCK image signal which has been processed in this manner is transmitted to the image output unit 21 , and then, an image is formed in the image output unit 21 in a step S 130 .
  • The control unit 26 judges whether or not image forming of all pages is accomplished. If there is an image which has not yet been processed, the process operation returns to the previous step S122.
  • the above-explained process operations are repeatedly carried out plural times equal to total number of pages of received images. As previously explained, the image forming process operations of the received images may be realized.
  • Since the image processing apparatus provided on the reception side can produce the Tag information and execute the correcting process operation and the image forming operation by employing the information as to the modes and sorts of the respective originals to be synthesized, which is transmitted from the image processing apparatus provided on the transmission side, this image processing apparatus can produce a "network copy" having an image quality similar to that of the "direct copy" even when the "N-up" function is used.
  • This alternative system may have such a merit that the system can be made very simple as to a point that the additional information used to output only the “N-up” synthesized image is not especially required, and also as to another point that the editing/correcting process operations for referring to the Tag information, and the forming operation of the Tag information (step S 124 of FIG. 13 ) are no longer required in the image processing apparatus provided on the output side.
  • both one optimum mode and one optimum sort may be determined based upon modes of plural originals to be added and plural sorts thereof, and then, the correcting process operation may be carried out by the correcting unit 23 based upon these determined optimum mode and sort.
  • this system may own such a merit that the image processing apparatus provided on the output side may be made simple.
  • FIG. 17 is a flow chart for describing an example of operations of the image processing apparatus provided on the transmission side according to the second embodiment of the present invention.
  • A step in which a processing operation similar to that shown in FIG. 2 is performed is allotted the same reference numeral as in FIG. 2 .
  • The process operations from the step S101, in which the transmission condition is set by the U/I 17 , up to the step S108, in which the images are synthesized and the synthesized image is stored in the storage unit 15 , and also the step S109 for judging whether or not the process operations for all pages of the image are accomplished, are similar to those of the above-described first embodiment.
  • After the image synthesizing operation has been completed in the storage unit 15 , the control unit 16 produces Tag information in units of pixels in a step S141. Concretely speaking, while the control unit 16 refers to both the modes and the sorts of the respective originals employed in the "N-up" synthesizing operation, a 4-bit Tag information signal per pixel, as indicated in FIG. 14 and FIG. 15 , may be produced in the RAM employed in the control unit 16 .
  • In a step S142, the control unit 16 converts the "N-up" image data stored in the storage unit 15 into a preselected format in correspondence with the Tag information produced in the step S141, and then transmits the format-converted "N-up" image data via the I/O 18 to an external device provided on a network.
  • FIG. 18 is a flow chart for describing an example of operations of the image processing apparatus provided on the reception side according to the second embodiment of the present invention.
  • this image processing apparatus provided on the reception side receives such an image data which is transmitted via the I/O 28 and temporarily stores the received image data into the RAM employed in the control unit 26 .
  • The control unit 26 firstly extracts, from the received image data having a predetermined format, document information indicative of a total page number, a total print quantity, a designation of paper, double-face printing, designation/non-designation of an "N-up" original, and so on.
  • the control unit 26 interprets additional information in the unit of each page so as to extract resolution and a color mode of an output image, and also extract a sort of an original, and then, this control unit 26 sets process conditions in the editing unit 24 and the correcting unit 23 (will be explained later).
  • In a step S152, the control unit 26 expands the reduced image data and also extracts the Tag information which has been added to the transmission data in correspondence with the image data. Both the extracted image data and the extracted Tag information are transferred via the system bus 29 to the storage unit 25 and are stored into the storage unit 25 in a step S153. Since the process operations after the step S153 are similar to those of the above-described first embodiment, descriptions thereof are omitted.
  • Since the attribute information in units of pixels corresponding to the "N-up" image data is transmitted, a "network copy" having an image quality similar to that of the "direct copy" can be produced.
  • Since the attribute information in units of pixels is transmitted in the above-described image processing apparatus of the second embodiment, the structural scale of the image processing unit provided on the reception side can be decreased. As a consequence, for example, the productivity of broadcast communications for transmitting image data to a plurality of output apparatuses can be improved.
  • the Tag information is produced after the image data has been read out from the storage unit 15 .
  • the present invention is not limited to this example.
  • the Tag information may be produced before the image data is stored into the storage unit 15 , and the Tag information may be stored into the storage unit 15 in combination with the image data. For instance, in such a case that a mode and a sort of an original image are investigated when such a process operation as an image quality adjustment is carried out in the editing unit 14 and the like, the Tag information may be produced at this time.
  • the Tag information may be produced during this judging operation, and the produced Tag information may be transmitted to the reception side in combination with the image data. It should also be understood that in a case that the Tag information is produced before the image data is stored in the storage unit 15 , when the “N-up” process operation is carried out, a plurality of Tag information may be placed side by side in a similar manner to that of the “N-up” process operation for the image data, and then, these plural sets of Tag information may be synthesized.
  • a compressing process operation of image data is carried out by employing the MRC (Mixed Raster Content) system, which is one of the international standard compression systems designed for color facsimile machines.
  • MRC (Mixed Raster Content)
  • both an arrangement of the image processing apparatus provided on the transmission side and an arrangement thereof provided on the reception side are similar to those of the first embodiment shown in FIG. 1 and FIG. 12 .
  • Since process flow operations other than the format converting operations are similar to the above-described operations of the image processing apparatus provided on both the transmission side and the reception side in the first embodiment shown in FIG. 2 and FIG. 13 , explanations thereof are omitted.
  • the MRC system is such a system that an image is separated into a plurality of layers and compressing methods different from each other are applied to the respective layers; it is a superior compression system capable of compressing, at high quality, an image which is constructed of a plurality of objects having different characteristics such as characters and pictorial patterns.
  • FIG. 19 is an explanatory diagram for illustratively showing the MRC compression system.
  • FIG. 19 ( 1 ) illustratively indicates an original image to be compression-processed.
  • the original to be compression-processed is constructed of objects having different characteristics such as a character, a pictorial pattern, and a graph.
  • In the MRC compression system, such an original image is separated into three layers, namely, a foreground image indicated in FIG. 19 ( 2 ), a mask shown in FIG. 19 ( 3 ), and a background image represented in FIG. 19 ( 4 ).
  • both the foreground image and the background image correspond to color layers having 24 bits per pixel
  • the mask corresponds to a binary layer having 1 bit per pixel.
  • the background image is such a color layer which is constituted by an object whose gradation characteristic is important, such as a pictorial pattern and a photograph.
  • the mask is such a binary layer which is constituted by an object whose resolution information is important, such as a text and a line segment.
  • the foreground image is such a layer which indicates color information of a binary image represented by the mask.
  • As indicated in FIG. 19 ( 5 ), by using the spatial information of the mask, one image may be reproduced from the three layers in such a manner that, as to pixels in which bits of the mask are “OFF”, the pixel value of the background image is left, and pixels in which bits of the mask are “ON” are replaced by the pixel values of the foreground image, as in the sketch below.
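  • As an illustration only (not part of the patent text), a minimal NumPy sketch of this mask-based reproduction, assuming 24-bit color arrays for the foreground and background and a boolean mask, could be:

    import numpy as np

    def mrc_reconstruct(foreground, mask, background):
        # Where the mask bit is OFF the background pixel is kept; where it is
        # ON the foreground color (the color of the text or line art) is used.
        return np.where(mask[..., None], foreground, background)

    # page = mrc_reconstruct(fg_rgb, mask_bool, bg_rgb)   # HxWx3, HxW, HxWx3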
  • the original having a plurality of objects is separated into the background image whose gradation has a top priority, the binary mask whose resolution has a top priority, and also the foreground image containing only the color information.
  • the JPEG compression operation is carried out with respect to the background image
  • the MMR compression operation is carried out with respect to the mask
  • the JBIG compression operation is carried out with respect to the foreground image; namely, since the compressing process operations suitable for the respective characteristics are carried out, image data having a small amount of data can be obtained at high quality.
  • FIG. 20 is an explanatory diagram for explaining an example in which the MRC system is applied to the “N-up” function in this third embodiment of the present invention.
  • FIG. 20 ( 1 ) shows an “N-up” original which is to be processed, namely, which is stored in the storage unit 15 .
  • a “4-up” image is produced from 4 sheets of original images, namely, a “black/white character”, a “color map”, a “color character/photograph”, and a “black/white photograph”.
  • FIG. 20 ( 2 ) represents both resolution and a compression system, which are optimum to these four original images.
  • Both the “color map” original and the “color character/photograph” original are allocated to the background image; both the “black/white character” original and the “black/white photograph” original are converted into binary data which is allocated to the mask; and the color of the “black/white character” original and the “black/white photograph” original, namely the “black” color, is allocated to the foreground image. In this manner, the “N-up” synthesized image shown in FIG. 20 ( 1 ) is subdivided into three layers.
  • the JBIG compression method is carried out with respect to the foreground image
  • the MMR compression method is carried out with respect to the mask
  • the JPEG compression method is carried out with respect to the background image
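  • A rough sketch of storing each layer with a compression suited to its characteristics is given below, assuming the three layers are already available as Pillow images. Pillow provides JPEG for the background and CCITT Group 4 (an MMR-style coder) for the binary mask; it has no JBIG encoder, so a lossless PNG is used here purely as a stand-in for the foreground.

    from PIL import Image

    def save_mrc_layers(background, mask, foreground, prefix):
        # Gradation layer: lossy JPEG is acceptable.
        background.convert("RGB").save(prefix + "_bg.jpg", quality=85)
        # Binary resolution layer: CCITT Group 4 (MMR-style) compression.
        mask.convert("1").save(prefix + "_mask.tif", compression="group4")
        # Color-of-text layer: PNG used only as a stand-in for JBIG.
        foreground.convert("RGB").save(prefix + "_fg.png")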
  • such Tag information as indicated in FIG. 14 and FIG. 15 is produced based upon the features of the respective layers as represented in FIG. 20 ( 5 ), FIG. 20 ( 6 ), and FIG. 20 ( 7 ). Then, either the editing process operation or the correcting process operation may be executed. Both the editing process operation and the correcting process operation may be carried out while these original images are separated into the respective layers. Alternatively, after either the foreground image or the background image is selected in accordance with the mask and these background/foreground images are synthesized with each other, either the editing process operation or the correcting process operation may be carried out for every region in accordance with the Tag information.
  • the image synthesized by way of the “N-up” function using the MRC system is separated into the plural layers in response to the attributes of the images to be synthesized with each other, and then, the separated layers are transferred.
  • the process operations can be carried out in response to the attributes of the original images even in the image processing apparatus provided on the reception side.
  • the “N-up” function equivalent to that of the “direct copy” can be realized.
  • This fourth embodiment represents such a case that an image synthesizing process operation of “N-up” function is carried out by an image processing apparatus provided on the reception side. It should be noted that since an arrangement of an image processing apparatus provided on the transmission side is similar to that of the first embodiment shown in FIG. 1 , this arrangement is not shown and explanations thereof are omitted.
  • FIG. 21 is a flow chart for describing an example of operations of the image processing apparatus provided on the transmission side according to the fourth embodiment of the present invention.
  • In a step S 161 , various setting operations of process operations related to the sort and the mode of an original, and also various setting operations of information of an external device to which image data is transmitted, are carried out by the U/I 17 .
  • Also, a content of an “N-up” process operation is set. Concretely speaking, how many sheets of originals are employed for the “N-up” synthesizing operation is set by the U/I 17 . Also, such a condition that as indicated in FIG.
  • the image input unit 11 executes a reading operation of images formed on the set originals in a step S 162 .
  • various sorts of correcting process operations by the known manners such as the gradation correcting operation, the color correcting operation, and the precision correcting operation are carried out from a step S 163 to a step S 165 by the correcting unit 13 employed in the image processing unit 12 .
  • the processed color image data is stored into the storage unit 15 in a step S 167 .
  • the image data stored in the storage unit 15 is reduced every one page thereof by the control unit 16 in a step S 168 , and then, the reduced image data is temporarily stored into a RAM provided in the control unit 16 in combination with the mode and the sort of the original set from the U/I 17 in a step S 169 .
  • the “N-up” synthesizing process operation is executed in the image processing apparatus provided on the reception side.
  • both the reducing process operations of the original images and the arranging process operation based on a desirable layout are carried out in this image processing apparatus provided on the reception side. In general, it is preferable to execute a magnification changing operation with respect to a multi-value image.
  • the reducing process operation is preferably performed based upon a parameter by which a higher reducing ratio is obtained in response to the value of “N”, in view of productivity, as in the sketch below.
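  • For illustration only, one simple way to derive such a reducing ratio from the value of “N” (assuming a roughly square grid, so that a 4-up layout halves each page in both directions and a 9-up layout shrinks it to one third) is sketched here with Pillow:

    import math
    from PIL import Image

    def reduce_for_nup(page, n):
        # Shrink one page so that N reduced pages fit on one output sheet;
        # a larger N therefore yields a higher reducing ratio.
        cells = math.ceil(math.sqrt(n))          # e.g. N = 4 -> 2 x 2 grid
        return page.resize((page.width // cells, page.height // cells),
                           Image.LANCZOS)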
  • the storage medium is initialized and thus, the storage unit 15 is prepared for storing thereinto image data of a next original.
  • the image data reduced in the control unit 16 may be again stored into the storage unit 15 .
  • In a step S 170 , the control unit 16 judges as to whether or not the processing operation for all of the pages is accomplished. When an original which has not yet been read is still left, the process operation is returned to the previous step S 162 , in which the above-described process operations are repeatedly carried out.
  • all of the image data is converted into image data having a preselected format for transmission purposes in a step S 171 by the control unit 16 .
  • the image data format for transmission purposes may be arbitrarily determined.
  • the TIFF format is employed in a similar manner to that of the first embodiment.
  • five pieces of the below-mentioned information are added to a TIFF file as private data capable of realizing the “N-up” function:
  • the data for the transmission purposes which is produced in the control unit 16 is transmitted via the system bus 19 and the I/O 18 to the network in a step S 172 .
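  • Purely as an illustration of the mechanism of attaching private data to a TIFF file (the concrete five items and their tag numbers are not reproduced here), Pillow allows private tags to be written roughly as follows; the tag IDs 37001/37002 and their values are hypothetical examples, not the ones used by the apparatus.

    from PIL import Image, TiffImagePlugin, TiffTags

    ifd = TiffImagePlugin.ImageFileDirectory_v2()
    ifd.tagtype[37001] = TiffTags.LONG    # hypothetical tag: value of "N"
    ifd.tagtype[37002] = TiffTags.ASCII   # hypothetical tag: layout designation
    ifd[37001] = 4
    ifd[37002] = "Z-order"

    page = Image.open("page1.png").convert("RGB")   # placeholder input image
    page.save("transmit.tif", tiffinfo=ifd)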
  • FIG. 22 is a block diagram for schematically showing an arrangement of an image processing apparatus provided on the reception side according to the fourth embodiment of the present invention. It should be noted that the same reference numerals shown in FIG. 12 will be employed as those for denoting the same, or similar structural units of FIG. 22 . In this fourth embodiment, since the “N-up” synthesizing operation is carried out on the reception side, a storage unit 25 is arranged at a post stage of an editing unit 24 . Other structural units of this fourth embodiment are similar to those of the above-described first embodiment.
  • FIG. 23 is a flow chart for describing an example of operations of the image processing apparatus provided on the reception side according to the fourth embodiment of the present invention.
  • this image processing apparatus provided on the reception side receives the image data transmitted via the I/O 28 from the image processing apparatus provided on the transmission side shown in FIG. 1 , and temporarily stores the received image data in the RAM employed in the control unit 26 .
  • the control unit 26 firstly extracts from the received image data having a predetermined format, document information indicative of a total page number, a total print quantity, a designation of paper, a double-face print, designation/non-designation of an “N-up” original, and so on.
  • In a step S 182 , the control unit 26 interprets additional information such as an attribute in the unit of each page. Subsequently, in a step S 183 , the control unit 26 executes an expanding process operation as to the reduced image data, and transfers the expanded image data via the system bus 29 to the editing unit 24 .
  • a resolution converting process operation is carried out in a step S 185 .
  • the editing unit 24 executes a magnification changing process operation and a reducing process operation.
  • the magnification changing process operation makes resolution of the received image coincident with the resolution of the image output unit 21 .
  • the reducing process operation is to execute the “N-up” synthesizing operation.
  • the image to which the magnification changing operation has been executed is entered into the storage unit 25 , and then, is stored into this storage unit 25 at a preselected coordinate value/direction based upon a layout designated by the additional information in a step S 186 .
  • the control unit 26 produces, from the information related to the mode and the sort in the unit of the page which are extracted in the step S 182 , Tag information having a size corresponding to the image data whose resolution has been converted.
  • the control unit 26 stores the produced Tag information at a coordinate value corresponding to the storage position of the image data stored in the storage unit 25 .
  • the Tag information is similar to that of the above-described respective embodiments.
  • Alternatively, the control unit 26 may produce Tag information having a size equal to that of the image data in the step S 184 .
  • the editing unit 24 may execute the resolution converting process operation with respect to the Tag information in combination with the image data, and the control unit 26 may store the resolution-converted Tag information at a coordinate value corresponding to the storage position of the image data of the storage unit 25 .
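  • A compact, non-authoritative sketch of this reception-side step is shown below; it assumes a square cols-by-cols layout, a Pillow canvas for the synthesized image, and a NumPy array of the same size for the Tag plane, none of which are prescribed by the source.

    import numpy as np
    from PIL import Image

    def place_page(canvas, tag_canvas, page, tag_value, slot, cols,
                   in_dpi, out_dpi):
        # Match the page to the printer resolution, shrink it for the
        # "N-up" layout, and paste image and Tag at the slot coordinates.
        factor = (out_dpi / in_dpi) / cols
        w, h = int(page.width * factor), int(page.height * factor)
        page = page.resize((w, h), Image.LANCZOS)

        row, col = divmod(slot, cols)
        x, y = col * w, row * h
        canvas.paste(page, (x, y))                  # synthesized image
        tag_canvas[y:y + h, x:x + w] = tag_value    # Tag at the same coordinates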
  • In a step S 187 , the control unit 26 judges as to whether or not the “N-up” synthesizing operation is accomplished. In other words, the control unit 26 judges as to whether or not the above-described process operations have been carried out with respect to “N” sheets of images. If an unprocessed image is left among these “N” sheets of images, then the process operation is returned to the previous step S 182 , in which similar process operations are carried out with respect to this unprocessed image.
  • In this manner, both the “N-up” synthesized image and the Tag information corresponding to this “N-up” synthesized image are produced in the storage unit 25 . It should be noted that, when the received images run out before “N” sheets of images have been processed, the “N-up” synthesizing operation is regarded as accomplished at that time.
  • the Tag information similarly produced in correspondence with this “N-up” synthesized image is also present. While referring to this image and this Tag information, the correcting unit 23 executes a color correcting process operation in a step S 188 , a precision correcting process operation in a step S 189 , and a gradation correcting process operation in a step S 190 , as in the sketch below.
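  • As a non-authoritative sketch of such Tag-guided, region-by-region correction (the attribute codes and the two look-up tables below are hypothetical), one might write:

    import numpy as np

    TAG_TEXT, TAG_PHOTO = 1, 2    # hypothetical per-pixel attribute codes

    def correct_by_tag(image, tags, text_lut, photo_lut):
        # image: HxWx3 uint8, tags: HxW uint8, each LUT: 256-entry uint8 array.
        out = image.copy()
        text, photo = tags == TAG_TEXT, tags == TAG_PHOTO
        out[text] = text_lut[image[text]]       # e.g. contrast-enhancing curve
        out[photo] = photo_lut[image[photo]]    # e.g. smooth gradation curve
        return out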
  • the image output unit 21 produces an image-formable YMCK image signal. It should be noted that contents of the respective process operations are similar to those of the above-described first embodiment.
  • the YMCK image signal which has been processed in this manner is supplied to the image output unit 21 . In a step S 191 , an image is formed in the image output unit 21 .
  • In a step S 192 , the control unit 26 judges as to whether or not images of all of the designated page numbers have been processed. In the case that the total processed page number has not yet reached the designated page number, the process operation is returned to the step S 182 , in which the above-described process operations are repeatedly carried out. Since the above-explained process operations are repeatedly carried out a number of times equal to the designated page number of the received image, the forming process operation of the received image may be realized.
  • the process operations can be carried out with respect to the “N-up”-synthesized image in response to the attribute thereof.
  • the “network copy” having a similar image quality to that of the “direct copy” may be produced.
  • Since the magnification changing process operation (resolution converting process operation) is carried out only once, on the reception side, an image having a higher image quality can be formed, as compared with the image quality obtained in the case that the magnification changing process operations are carried out on both the transmission side and the reception side as explained in the first embodiment.
  • the editing unit 24 executes the resolution converting operation and the synthesizing operation based upon the “N-up” function, and thereafter, the correcting unit 23 executes various sorts of correcting process operations with employment of the Tag information.
  • the present invention is not limited to the above-explained operations, but may be applied to another process operation. That is, for instance, after the editing unit 24 has performed the resolution converting operation, the correcting unit 23 may execute various sorts of correcting process operations with employment of the Tag information, and thereafter, the editing unit 24 may execute the synthesizing operation based upon the “N-up” function.
  • the editing unit 24 may execute the resolution converting operation and the synthesizing operation based upon the “N-up” function.
  • high-speed process operations may be carried out if the correcting unit 23 executes the correcting process operation in advance.
  • various sorts of these correcting process operations by the correcting unit 23 may be separately carried out before/after the resolution converting operation executed by the editing unit 24 , and before/after the image synthesizing operation based upon the “N-up” function.
  • This fifth embodiment represents such a case that an image reducing process operation for the “N-up” function is carried out by an image processing apparatus provided on the transmission side, and also an image synthesizing process operation is performed by an image processing apparatus provided on the reception side. It should be noted that since an arrangement provided on the transmission side is similar to that of the first embodiment shown in FIG. 1 , and an arrangement provided on the reception side is similar to that of the fourth embodiment indicated in FIG. 22 , these arrangements are not indicated and explanations thereof are omitted.
  • FIG. 24 is a flow chart for describing an example of operations of the image processing apparatus provided on the transmission side according to the fifth embodiment of the present invention.
  • In a step S 201 , various setting operations of process operations related to the sort and the mode of an original, and also information of an external device to which image data is transmitted, are carried out by the U/I 17 .
  • Also, a content of an “N-up” process operation is set. Concretely speaking, how many sheets of originals are employed for the “N-up” synthesizing operation is set by the U/I 17 . Also, how the originals are to be placed side by side with each other is set by the U/I 17 .
  • the image input unit 11 executes a reading operation of images formed on the set originals in a step S 202 .
  • the magnification changing process operation can be carried out with respect to the sub-scanning direction.
  • various sorts of correcting process operations by the known manners such as the gradation correcting operation, the color correcting operation, and the precision correcting operation are carried out from a step S 203 to a step S 205 by the correcting unit 13 employed in the image processing unit 12 .
  • the resultant RGB color digital image signal is stored into the storage unit 15 in a step S 208 .
  • In a case that the magnification changing process operation for the sub-scanning direction is not carried out when the image is read in the step S 202 , the magnification changing process operation along the sub-scanning direction may also be carried out in the step S 206 .
  • the image data stored in the storage unit 15 is reduced every one page thereof by the control unit 16 in a step S 209 , and then the reduced image data is temporarily stored into a RAM provided in the control unit 16 in combination with the mode and the sort of the original set from the U/I 17 in a step S 210 .
  • the resolution converting process operation is not executed in the image processing unit 22 provided on the reception side unless the resolution of the image input unit 11 is different from the resolution of the image output unit 21 .
  • the image data can be reduced based upon the optimum compression systems with respect to the respective originals, depending upon an output electronic appliance of a transmission destination, so that productivity can be improved, as in the sketch below.
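  • One way such a per-original choice of compression might look is sketched below with Pillow; the mapping from the sort of the original to a compression scheme is a hypothetical example, not the rule used by the apparatus.

    from PIL import Image

    def compress_page(page, sort, path):
        # Pick a compression suited to the sort of the original.
        if sort == "black/white character":
            page.convert("1").save(path + ".tif", compression="group4")  # MMR-style
        elif sort == "black/white photograph":
            page.convert("L").save(path + ".jpg", quality=90)            # grayscale JPEG
        else:
            page.convert("RGB").save(path + ".jpg", quality=85)          # color JPEG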
  • the storage medium is initialized and thus, the storage unit 15 is prepared for storing thereinto image data of a next original.
  • the image data reduced in the control unit 16 may be again stored into the storage unit 15 .
  • In a step S 211 , the control unit 16 judges as to whether or not the processing operation for all of the pages is accomplished.
  • the process operation is returned to the previous step S 202 in which the above-described process operations are repeatedly carried out.
  • the above-explained process operations are repeatedly carried out until all of the originals which are used to execute the “N-up” synthesizing operation are accomplished.
  • the “N-up” image data is converted into image data having a preselected format for transmission purposes in a step S 212 by the control unit 16 .
  • the image data format for transmission purposes may be arbitrarily determined.
  • the TIFF format is employed in a similar manner to that of other embodiments.
  • five pieces of the below-mentioned information are added to a TIFF file as private data capable of realizing the “N-up” function:
  • the data for the transmission purposes which is converted in the control unit 16 is transmitted via the system bus 19 and the I/O 18 to the network in a step S 213 .
  • FIG. 26 is a flow chart for describing an example of operations of the image processing apparatus provided on the reception side according to the fifth embodiment of the present invention.
  • this image processing apparatus provided on the reception side receives the image data transmitted via the I/O 28 from the image processing apparatus provided on the transmission side shown in FIG. 1 , and temporarily stores the received image data in the RAM employed in the control unit 26 .
  • the control unit 26 firstly extracts from the received image data having a predetermined format, document information indicative of a total page number, a total print quantity, a designation of paper, a double-face print, designation/non-designation of an “N-up” original, and so on.
  • In a step S 222 , the control unit 26 interprets additional information in the unit of each page. Subsequently, in a step S 223 , the control unit 26 executes an expanding process operation as to the reduced image data, and transfers the expanded image data via the system bus 29 to the storage unit 25 .
  • the expanded image data is stored into this storage unit 25 at a preselected coordinate value/direction based upon a layout designated by the additional information.
  • the control unit 26 produces Tag information having a size corresponding to the image data from the information related to the mode and the sort in the unit of the page, which are extracted in the step S 222 .
  • the control unit 26 stores the produced Tag information at such a coordinate value corresponding to the storage position of the image data stored in the storage unit 25 .
  • the Tag information is similar to that of the above-described respective embodiments.
  • In a step S 226 , the control unit 26 judges as to whether or not the “N-up” synthesizing operation is accomplished. In other words, the control unit 26 judges as to whether or not the above-described process operations have been carried out with respect to “N” sheets of images. If an unprocessed image is left among these “N” sheets of images, then the process operation is returned to the previous step S 222 , in which similar process operations are carried out with respect to this unprocessed image.
  • In this manner, both the “N-up” synthesized image and the Tag information corresponding to this “N-up” synthesized image are produced in the storage unit 25 . It should be noted that, when the received images run out before “N” sheets of images have been processed, the “N-up” synthesizing operation is regarded as accomplished at that time.
  • the produced Tag information is defined in correspondence with the “N-up” synthesized image.
  • Both the “N-up” synthesized image and the Tag information are entered into the editing unit 24 , and then, the resolution converting process operation is carried out in a step S 227 in order that resolution of the transmitted image data is made coincident with the resolution of the image output unit 21 .
  • If the resolution of the transmitted image data already coincides with the resolution of the image output unit 21 , the above-explained resolution converting process operation of the step S 227 is no longer required.
  • the correcting unit 23 furthermore executes a color correcting process operation in a step S 228 , a precision correcting process operation in a step S 229 , and a gradation correcting process operation in a step S 230 .
  • the image output unit 21 produces an image-formable YMCK image signal. These process operations are carried out in such a manner that optimum process operations may be executed to the respective regions with reference to the Tag information signal corresponding to the synthesized image.
  • the YMCK image signal which has been processed in this manner is supplied to the image output unit 21 .
  • In a step S 231 , an image is formed in the image output unit 21 .
  • In a step S 232 , the control unit 26 judges as to whether or not images of all of the designated page numbers have been processed. In the case that the total processed page number has not yet reached the designated page number, the process operation is returned to the step S 222 , in which the above-described process operations are repeatedly carried out. Since the above-explained process operations are repeatedly carried out a number of times equal to the designated page number of the received image, the forming process operation of the received image may be realized.
  • the process operations may be carried out with respect to the “N-up” synthesized image in response to the attribute information thereof.
  • Since the magnification changing process operation is carried out on the transmission side, the transfer efficiency when the image is transmitted from the transmission side to the reception side may be improved, as compared with the above-explained fourth embodiment.
  • In a case that the image processing apparatus provided on the reception side owns a mechanism capable of judging an attribute of an image, the attribute information of the respective images need not be transmitted from the image processing apparatus provided on the transmission side.
  • Since the image processing apparatus provided on the reception side can judge the attribute of every image and can utilize the judged attribute in the synthesized image, the “N-up” function can be realized at a higher image quality.
  • the present invention is not limited to the above-explained arrangement/operations, but may be applied to such a case that, for instance, after various sorts of correcting process operations have been carried out in advance with employment of the Tag information by the correcting unit 23 , the image synthesizing operation may be carried out based upon the “N-up” function.
  • the resolution converting process operation may be carried out by the editing unit 24 before the image synthesizing operation is executed based upon the “N-up” function.
  • the black/white binary value/gray scale/RGB have been employed as the color space when the image is transmitted/received;
  • the MH/MMR/JPEG systems and the like have been used as the compression system;
  • the TIFF has been employed as the image format in the first, second, fourth, and fifth embodiments.
  • the present invention is not especially limited to these items, but may employ an arbitrary color space, an arbitrary compression system, and an arbitrary image format.
  • the description has been made in which both the color mode and the sort of the original have been employed as the attribute of the original.
  • Other attributes may be used; namely, information as to whether an original corresponds to a printed photograph or a printed matter may be used as the attribute of the original, and information related to the background of an original and also to a backing copy may be used as the attribute of the original.
  • these attributes are entered from either the U/I 17 or the U/I 27 .
  • this judging result may be combined with attribute information to be added.
  • As to the magnification changing process operations, when the magnification changing process operation is carried out in the image processing apparatus provided on the transmission side, the magnification changing process operations may be separately carried out as to the sub-scanning direction and the main scanning direction by the image input unit 11 and the editing unit 14 , respectively.
  • the magnification changing process operations may be carried out by the editing unit 14 as to both the main scanning direction and the sub-scanning direction.
  • the image processing apparatus provided on the transmission side and the image processing apparatus provided on the reception side are represented, and the communication is established between these image processing apparatuses.
  • the present invention is not limited to this case, but may be applied to another case in which, for example, the image input unit 11 may be realized by a network scanner connected to a network, and this network scanner may be provided separately from the image processing unit 12 .
  • the image output unit 21 may be realized by a network printer connected to a network, and this network printer may be separately provided with the image processing unit 22 .
  • such a combination between the network scanner and the network printer may be employed.
  • FIG. 26 is an explanatory diagram for explaining one example of a computer program and a storage medium for storing thereinto this computer program in such a case that either a function of an image processing apparatus or an image processing method of the present invention is realized by such a computer program.
  • reference numerals 51 and 53 are programs, and reference numerals 52 and 54 show recording media. It should be understood that both the function of the image processing unit 12 of the image processing apparatus provided on the transmission side and the function of the image processing unit 22 of the image processing apparatus provided on the reception side may also be realized by way of a computer program. For example, in the image processing apparatus provided on the transmission side shown in FIG.
  • both the functions of the correcting unit 13 and the editing unit 14 , and also the function executed in the control unit 16 are realized as the program 51 , and then, this program 51 may be executed by the CPU of the control unit 16 .
  • both the functions of the correcting unit 23 and the editing unit 24 , and also the function executed in the control unit 26 are realized as the program 53 , and then, this program 53 may be executed by the CPU of the control unit 26 .
  • this program 51 , the program 53 , and data used by these programs 51 / 53 may be stored in computer-readable storage media 52 and 54 .
  • the storage media 52 and 54 correspond to such media in which changed states of energy such as magnetic, optical, and electric energy are induced in response to the description contents of a program, so that the description contents of this program can be transferred, in a signal format corresponding to the changed states, to a reading apparatus provided in a hardware resource of a computer.
  • As the storage media, there are optical disks (CD-ROM etc.), magneto-optical disks (MO etc.), magnetic disks, magnetic cards, and memories (including IC cards and memory cards).
  • these storage media are not limited to portable type storage media.
  • While the programs 51 and 53 are stored in these storage media 52 and 54 , since either the storage medium 52 or the storage medium 54 is mounted on a computer capable of activating, for instance, the functions of the image processing units 12 and 22 , this computer can execute the various functions which are described in the respective embodiments of both the image processing apparatus and the image processing method according to the present invention.
  • either the program 51 or the program 53 is transferred via, e.g., a network to the computer, and then, either the program 51 or the program 53 may be stored into either the storage medium 52 or the storage medium 54 so as to be executed.
US10/272,946 2001-11-26 2002-10-18 Image processing apparatus, image processing method, image processing program, and storage medium Expired - Fee Related US7308155B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JPP2001-359948 2001-11-26
JP2001359948A JP2003163801A (ja) 2001-11-26 2001-11-26 画像処理装置および画像処理方法、画像処理プログラム、記憶媒体

Publications (2)

Publication Number Publication Date
US20030098983A1 US20030098983A1 (en) 2003-05-29
US7308155B2 true US7308155B2 (en) 2007-12-11

Family

ID=19170853

Family Applications (1)

Application Number Title Priority Date Filing Date
US10/272,946 Expired - Fee Related US7308155B2 (en) 2001-11-26 2002-10-18 Image processing apparatus, image processing method, image processing program, and storage medium

Country Status (2)

Country Link
US (1) US7308155B2 (ja)
JP (1) JP2003163801A (ja)


Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4125208B2 (ja) * 2003-09-29 2008-07-30 Canon Inc. Image processing apparatus and image processing method
US7430318B2 (en) * 2004-07-13 2008-09-30 Toshiba Corporation System and method for color correction for electronic printing
JP4613712B2 (ja) * 2005-06-27 2011-01-19 Fuji Xerox Co., Ltd. Image data synthesizing apparatus and image data synthesizing program
JP4604888B2 (ja) * 2005-07-12 2011-01-05 Fuji Xerox Co., Ltd. Position information management apparatus, image forming apparatus, position information management method, and program
JP2007088767A (ja) * 2005-09-21 2007-04-05 Fuji Xerox Co Ltd Image reading apparatus, image processing apparatus, image reading method, and image processing method
JP4845700B2 (ja) * 2006-12-13 2011-12-28 Canon Inc. Image forming apparatus and control method therefor
JP4906627B2 (ja) * 2007-07-31 2012-03-28 Canon Inc. Image processing apparatus, image processing method, computer program, and storage medium
KR102037716B1 (ko) * 2012-11-23 2019-10-30 Samsung Display Co., Ltd. Method of storing gamma data of a display device, display device, and method of driving the display device
JP6369071B2 (ja) * 2014-03-18 2018-08-08 Fuji Xerox Co., Ltd. Image processing apparatus and program
JP6350069B2 (ja) * 2014-07-22 2018-07-04 Fuji Xerox Co., Ltd. Information processing system, information processing apparatus, and program
US10810775B2 (en) * 2019-02-20 2020-10-20 Adobe Inc. Automatically selecting and superimposing images for aesthetically pleasing photo creations

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US464275A (en) * 1891-12-01 Street letter-box
US4827419A (en) * 1986-09-22 1989-05-02 Lasertrak Corporation Portable navigational planning device
US4831538A (en) * 1986-12-08 1989-05-16 Aviation Supplies And Academics Hand-held navigation and flight performance computer
US6760778B1 (en) * 1998-09-09 2004-07-06 At&T Wireless Services, Inc. System and method for communication between airborne and ground-based entities
US20020045974A1 (en) * 2000-05-12 2002-04-18 Stephen Heppe Dual-band radio communications system for aeronautical data communications

Patent Citations (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5638464A (en) * 1987-09-02 1997-06-10 Canon Kabushiki Kaisha Image processing apparatus
US5721624A (en) * 1989-10-15 1998-02-24 Minolta Co., Ltd. Image reading apparatus improving the joining state of a plurality of image data obtained by dividing and reading out an original image
US5262867A (en) * 1990-06-20 1993-11-16 Sony Corporation Electronic camera and device for panoramic imaging and object searching
US6522789B2 (en) * 1991-04-08 2003-02-18 Canon Kabushiki Kaisha Apparatus and method for processing images
US5995145A (en) * 1991-04-30 1999-11-30 Sony Broadcast & Communications Limited Image capture apparatus for selectively causing a stepped reduction in video image signal values
JPH05344328A (ja) 1992-06-12 1993-12-24 Canon Inc 印刷装置
US5444552A (en) * 1992-09-28 1995-08-22 Xerox Corporation Method for compressing, processing, and storing grayscale bitmaps
US5644411A (en) * 1992-11-19 1997-07-01 Sharp Kabushiki Kaisha Joint-portion processing device for image data for use in an image processing apparatus
US5734915A (en) * 1992-11-25 1998-03-31 Eastman Kodak Company Method and apparatus for composing digital medical imagery
US5673209A (en) * 1995-03-29 1997-09-30 International Business Machines Corporation Apparatus and associated method for compressing and decompressing digital data
US5880778A (en) * 1995-05-17 1999-03-09 Sharp Kabushiki Kaisha Still-image taking camera
JPH0918683A (ja) 1995-06-28 1997-01-17 Fuji Xerox Co Ltd 出力装置
US6549681B1 (en) * 1995-09-26 2003-04-15 Canon Kabushiki Kaisha Image synthesization method
US6201571B1 (en) * 1996-06-13 2001-03-13 Nec Corporation Digital camera recording a reduced image synthesized with a character image of the image picking-up information
US6023556A (en) * 1997-01-29 2000-02-08 Gammagrapnx, Inc. Processing print job image data
US6233066B1 (en) * 1997-08-06 2001-05-15 Matsushita Electric Industrial Co., Ltd. Image processing apparatus, method for image processing, and image reader apparatus
US6424752B1 (en) * 1997-10-06 2002-07-23 Canon Kabushiki Kaisha Image synthesis apparatus and image synthesis method
US6484149B1 (en) * 1997-10-10 2002-11-19 Microsoft Corporation Systems and methods for viewing product information, and methods for generating web pages
US6049390A (en) * 1997-11-05 2000-04-11 Barco Graphics Nv Compressed merging of raster images for high speed digital printing
US6744471B1 (en) * 1997-12-05 2004-06-01 Olympus Optical Co., Ltd Electronic camera that synthesizes two images taken under different exposures
US6219454B1 (en) * 1997-12-08 2001-04-17 Fuji Xerox Co., Ltd. Image processing apparatus
US6968115B2 (en) * 1998-01-26 2005-11-22 Canon Kabushiki Kaisha Editing-function-integrated reproducing apparatus
JPH11275299A (ja) 1998-03-19 1999-10-08 Canon Inc 画像読取装置及び画像読取装置における動作指示通知方法
JP2000101823A (ja) 1998-07-24 2000-04-07 Fuji Xerox Co Ltd 画像処理装置
US6628419B1 (en) * 1999-03-12 2003-09-30 Fuji Xerox Co., Ltd. Image forming device
JP2001127989A (ja) 1999-11-01 2001-05-11 Fuji Xerox Co Ltd 画像形成装置
JP2001136322A (ja) 1999-11-10 2001-05-18 Konica Corp 画像形成装置、情報処理装置、画像形成システムおよび画像ファイルを記憶する記憶媒体
JP2001211316A (ja) 2000-01-27 2001-08-03 Canon Inc 画像処理装置及び画像処理方法、記憶媒体及び画像処理システム
US20020051230A1 (en) 2000-01-27 2002-05-02 Kenichi Ohta Image processing apparatus, image processing method, and storage medium
US7013289B2 (en) * 2001-02-21 2006-03-14 Michel Horn Global electronic commerce system
US7027054B1 (en) * 2002-08-14 2006-04-11 Avaworks, Incorporated Do-it-yourself photo realistic talking head creation system and method

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030160991A1 (en) * 2002-02-27 2003-08-28 Masatoshi Kadota Spool file modifying device
US8098385B2 (en) * 2002-02-27 2012-01-17 Brother Kogyo Kabushiki Kaisha Spool file modifying device
US20040150854A1 (en) * 2003-01-31 2004-08-05 Sprague Mary Ann Systems and methods for creating a single electronic scanned job from multiple scanned documents
US7443548B2 (en) * 2003-01-31 2008-10-28 Xerox Corporation Systems and methods for creating a single electronic scanned job from multiple scanned documents
US20050024666A1 (en) * 2003-05-08 2005-02-03 Maki Ohyama Copying apparatus, a program, and a storage medium
US20050012963A1 (en) * 2003-07-16 2005-01-20 Maiko Yamads Image processing apparatus, image processing method, and computer product
US20050174590A1 (en) * 2004-02-10 2005-08-11 Fuji Photo Film Co., Ltd. Image correction method, image correction apparatus, and image correction program
US7623259B2 (en) 2004-09-14 2009-11-24 Canon Kabushiki Kaisha Image processing apparatus and image processing method to store image data for subsequent retrieval
US20060056660A1 (en) * 2004-09-14 2006-03-16 Canon Kabushiki Kaisha Image processing apparatus and image processing method
US20060209348A1 (en) * 2005-03-16 2006-09-21 Kabushiki Kaisha Toshiba Image processing apparatus
US20060221411A1 (en) * 2005-03-31 2006-10-05 Canon Kabushiki Kaisha Image reading apparatus and control method of image reading apparatus
US8482785B2 (en) 2005-03-31 2013-07-09 Canon Kabushiki Kaisha Image reading apparatus and control method of image reading apparatus of automatic sheet discriminate cropping
US20070097434A1 (en) * 2005-11-01 2007-05-03 Kyocera Mita Corporation Image-forming system and image-forming program
US8023143B2 (en) * 2005-11-01 2011-09-20 Kyocera Mita Corporation Image-forming system and image-forming program
US20070250532A1 (en) * 2006-04-21 2007-10-25 Eastman Kodak Company Method for automatically generating a dynamic digital metadata record from digitized hardcopy media
US20100202711A1 (en) * 2007-07-19 2010-08-12 Sony Corporation Image processing apparatus, image processing method, and program
US8094343B2 (en) 2007-08-31 2012-01-10 Brother Kogyo Kabushiki Kaisha Image processor
US8159716B2 (en) * 2007-08-31 2012-04-17 Brother Kogyo Kabushiki Kaisha Image processing device performing image correction by using a plurality of sample images
US20090059251A1 (en) * 2007-08-31 2009-03-05 Brother Kogyo Kabushiki Kaisha Image processing device performing image correction by using a plurality of sample images
US20090060364A1 (en) * 2007-08-31 2009-03-05 Brother Kogyo Kabushiki Kaisha Image processor for converting image by using image retrieved based on keyword
US8390905B2 (en) 2007-08-31 2013-03-05 Brother Kogyo Kabushiki Kaisha Image processing device extracting desired region to be used as model for image correction
US20090059263A1 (en) * 2007-08-31 2009-03-05 Brother Kogyo Kabushiki Kaisha Image processor
US8311323B2 (en) 2007-08-31 2012-11-13 Brother Kogyo Kabushiki Kaisha Image processor for converting image by using image retrieved based on keyword
US20090059256A1 (en) * 2007-08-31 2009-03-05 Brother Kogyo Kabushiki Kaisha Image processing device outputting image for selecting sample image for image correction
US20090059257A1 (en) * 2007-08-31 2009-03-05 Brother Kogyo Kabushiki Kaisha Image processing device capable of preventing needless printing
US20090244564A1 (en) * 2007-08-31 2009-10-01 Brother Kogyo Kabushiki Kaisha Image processing device extracting desired region to be used as model for image correction
US8174731B2 (en) 2007-08-31 2012-05-08 Brother Kogyo Kabushiki Kaisha Image processing device outputting image for selecting sample image for image correction
US8284417B2 (en) 2007-08-31 2012-10-09 Brother Kogyo Kabushiki Kaisha Image processing device capable of preventing needless printing
US8050587B2 (en) * 2007-11-16 2011-11-01 Konica Minolta Business Technologies, Inc. Image forming apparatus with N-up print mode grouping and alignment
US20090129805A1 (en) * 2007-11-16 2009-05-21 Konica Minolta Business Technologies, Inc. Image forming apparatus
US8269795B2 (en) 2008-06-27 2012-09-18 Canon Kabushiki Kaisha Image output apparatus, control method, and computer-readable storage medium
US20090322792A1 (en) * 2008-06-27 2009-12-31 Canon Kabushiki Kaisha Image output apparatus, control method, and computer-readable storage medium
US10217257B1 (en) * 2015-03-17 2019-02-26 Amazon Technologies, Inc. Process for contextualizing continuous images

Also Published As

Publication number Publication date
JP2003163801A (ja) 2003-06-06
US20030098983A1 (en) 2003-05-29

Similar Documents

Publication Publication Date Title
US7308155B2 (en) Image processing apparatus, image processing method, image processing program, and storage medium
JP4688515B2 (ja) Image processing method and image processing apparatus
JP4261739B2 (ja) Image processing apparatus, image processing method, storage medium, and image processing system
US7710599B2 (en) Image processing apparatus, image processing method, and image processing program
US7940434B2 (en) Image processing apparatus, image forming apparatus, method of image processing, and a computer-readable storage medium storing an image processing program
US7551796B2 (en) Image processing apparatus, image data generation and transmission method and image data generation and transmission program
EP1404104B1 (en) Method of and apparatus for processing image data, and computer product
US20070025627A1 (en) Image processing device, image compression method, image compression program, and recording medium
US6219454B1 (en) Image processing apparatus
CN101662565B (zh) Image processing apparatus, image forming apparatus, and image processing method
CN101242469B (zh) Image processing apparatus
CN1984219A (zh) Image processing device and image processing method
US20050036694A1 (en) Compression of mixed raster content (MRC) image data
US20060215205A1 (en) Image processing apparatus, image processing method and image processing program
US7809199B2 (en) Image processing apparatus
US7139442B2 (en) Template matching applied to selector planes for multiple raster content (MRC) representation of documents
JP2004350240A (ja) Image processing apparatus and image processing method
JP3346051B2 (ja) Image processing apparatus
JP3815214B2 (ja) Image processing apparatus and storage medium storing a screen processing program
JP3738807B2 (ja) Image processing apparatus and image processing method
JP2001309183A (ja) Image processing apparatus and method
JP2007189275A (ja) Image processing apparatus
JP3092194B2 (ja) Image synthesizing apparatus
JP2001078049A (ja) Image processing apparatus and image processing method
JP4474001B2 (ja) Image processing apparatus and method

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJI XEROX CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TERADA, YOSHIHIRO;REEL/FRAME:013407/0020

Effective date: 20021007

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20151211