US5475509A - Method and apparatus for setting image processing conditions - Google Patents
Method and apparatus for setting image processing conditions
- Publication number
- US5475509A (application US08/189,746)
- Authority
- US
- United States
- Prior art keywords
- data
- image
- image data
- image processing
- processing conditions
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/62—Retouching, i.e. modification of isolated colours only or in isolated picture areas only
- H04N1/622—Retouching, i.e. modification of isolated colours only or in isolated picture areas only with simulation on a subsidiary picture reproducer
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/52—Circuits or arrangements for halftone screening
Definitions
- This invention relates to a method and an apparatus for setting processing conditions of an image so that an optimum image satisfying desired finishing requirements is obtained.
- An image processing system, such as an image reading, recording and reproducing system, has been widely used in the printing and platemaking industries, for example, to process image information recorded on an original to be read and produce film original plates, with the intention of simplifying the entire process and improving the quality of printed images.
- Image processing conditions are set to effect desired processing on image data obtained by scanning an original.
- Scanning conditions, such as magnification, trimming ranges, the number of output lines and halftone angles, are set according to pre-scanned image data obtained by roughly scanning the original. Further, finishing requirements, such as adjustment of gray, adjustment of a human skin color and adjustment of brightness, are set.
- The original is thereafter scanned in detail to obtain full-scanned image data (see Japanese Laid-Open Patent Publication No. 4-111575).
- An object of the present invention is to provide a method and an apparatus for setting image processing conditions under which an image is adjusted in accordance with specified optimum finishing requirements.
- According to one aspect of the invention, a method of setting image processing conditions comprises: a first step of specifying at least one point to be corrected in image data and the finishing requirements to be met at each specified point; a second step of processing the image data at the specified points under predetermined image processing conditions to obtain processed image data; a third step of comparing the processed image data with target processed data that meet the finishing requirements and revising the image processing conditions based on the result of the comparison; and a fourth step of repeating the second and third steps under the revised image processing conditions, if necessary.
- According to another aspect, an apparatus for setting image processing conditions comprises: image displaying means for displaying an image composed of image data; specifying means for specifying at least one point to be corrected in the image displayed on the image displaying means and the finishing requirements to be met at each specified point; storing means for storing target processed data that satisfy the respective finishing requirements; comparing means for comparing processed image data, obtained by processing the image data at the specified points under predetermined image processing conditions, with the target processed data; and correcting means for correcting the image processing conditions based on the result of the comparison made by the comparing means.
- Finishing requirements to be met at each of the specified points of the image data are set.
- The image data at each specified point is first processed under predetermined image processing conditions. It is then determined whether the resultant processed image data are near the respective target processed data that satisfy the finishing requirements. If not, the image processing conditions are corrected and the image processing is repeated to obtain processed image data again. In this way, image processing conditions that give a desired image can finally be set.
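The loop described above can be sketched as a simple feedback iteration. The following is a minimal, hypothetical one-dimensional sketch: the `process` function and the single `gain` parameter stand in for the real image processing conditions and are not taken from the patent.

```python
# Sketch of the condition-setting loop: a single "gain" condition is revised
# until the processed value at a specified point is close enough to the target.

def process(value, gain):
    # Stand-in for the image processing performed under the current conditions.
    return value * gain

def set_conditions(value, target, gain=1.0, tol=0.01, max_iter=100):
    for _ in range(max_iter):
        processed = process(value, gain)   # process under current conditions
        error = target - processed         # compare with target processed data
        if abs(error) <= tol:
            break                          # conditions give the desired result
        gain += error / value * 0.5        # revise the condition and repeat
    return gain

gain = set_conditions(value=40.0, target=55.0)
```

Each pass halves the remaining error, so the loop terminates well within the iteration cap; the returned `gain` maps the specified point's value to the target within the tolerance.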
- FIG. 1 is a block diagram showing the structure of an image reading, recording and reproducing system to which a method and an apparatus for setting image processing conditions, according to the present invention, is applied;
- FIG. 2 is a view schematically showing the structure of a reflective scanner shown in FIG. 1;
- FIG. 3 is a flowchart for describing an entire operation of image processing executed by the image reading, recording and reproducing system shown in FIG. 1;
- FIG. 4 is a general flowchart for describing the image processing executed by the image reading, recording and reproducing system shown in FIG. 1;
- FIG. 5 is a view for explaining a picture displayed on a console screen employed in the image reading, recording and reproducing system shown in FIG. 1;
- FIG. 6 is a detailed flowchart for describing the setting of image processing conditions according to the general flowchart shown in FIG. 4;
- FIG. 7 is a view for explaining the relationship of ideal halftone % data to a finishing requirement corresponding to "fine skin";
- FIG. 8 is a view for describing the relationship of ideal halftone % data to a finishing requirement corresponding to "uniform gray";
- FIG. 9 is a view for describing the relationship of ideal halftone % data to a finishing requirement corresponding to "beautiful sky";
- FIG. 10 is a view for describing the relationship of ideal halftone % data to a finishing requirement corresponding to "beautiful green".
- FIG. 1 is a block diagram showing the structure of an image reading, recording and reproducing system to which the method and apparatus for setting image processing conditions according to the present invention is applied.
- The image reading, recording and reproducing system basically comprises: a reflective scanner 10 for reading image information recorded on a reflective original; a transmissive scanner 12 for reading image information recorded on a transmissive original while also controlling the reflective scanner 10 and effecting desired image processing on image data supplied therefrom; a console 14 for operating the transmissive scanner 12 and displaying an image supplied therefrom; and an output device 16 for outputting the image data subjected to the desired image processing as a film original plate.
- The reflective scanner 10 is constructed as shown in FIG. 2 and provided with an original placement table 20, made of light-transmissive glass, attached to an upper portion of a casing 18.
- A reflective original S, pressed by an original pressing plate 22, is placed on the original placement table 20.
- A shading reference plate 25 for effecting a shading correction on the reading optical system is provided alongside the original placement table 20.
- Disposed within the casing 18 are a light source 24 for irradiating the reflective original S with light and receiving the light reflected thereby, a moving mirror 26 for changing the optical path of the reflected light, an ND filter 28 for adjusting the densities of highlights and shadows of the reflected light, a focusing lens 30, and a light-receiving device 32.
- The light source 24 has two lamps 34a and 34b, elongated in a main scanning direction perpendicular to the sheet of FIG. 2, and a reflecting mirror 36.
- The moving mirror 26 has two reflecting mirrors 38a and 38b.
- The light-receiving device 32 is constituted by three prisms 40a through 40c and CCDs 42a through 42c fixed to the corresponding prisms 40a through 40c.
- The prisms 40a through 40c separate the light reflected from the reflective original S into the three primary colors R, G and B and introduce them into the corresponding CCDs 42a through 42c.
- The light source 24 and the moving mirror 26 are movable in an auxiliary or sub-scanning direction, indicated by an arrow, by a conveying motor 44.
- The light source 24 moves at twice the speed of the moving mirror 26, so that the length of the optical path to the focusing lens 30 remains constant during scanning.
- The reflective scanner 10 also has a control circuit 46, which controls the entire operation and controls the transfer of the image information obtained from the respective CCDs 42a through 42c to the transmissive scanner 12.
- The transmissive scanner 12 comprises: a CPU 48 (serving as a comparing means) for carrying out the overall control; an image reading or scanning device 49 for reading image information from an unillustrated transmissive original; an input/output control circuit 50 for effecting the transfer of signals between the reflective scanner 10 and the transmissive scanner 12; a pre-processing circuit 52 for effecting pre-processing prior to the image processing of image data; an image processing conditions correcting circuit 56 (correcting means) for carrying out the correction of image processing conditions; and an image processing circuit 58 for performing the desired image processing based on the image processing conditions.
- An image buffer 54 for temporarily storing image data therein and an END (Equivalent Neutral Density) LUT (look-up table) storage device 62 for storing therein an ENDLUT for carrying out the conversion of an END into another are connected to the pre-processing circuit 52.
- A parameter storage device 60 for storing the respective set-up parameters corresponding to the image processing conditions, and a gradation LUT storage device 64 (storing means) for storing a reference gradation curve used as a look-up table, are connected to the image processing conditions correcting circuit 56.
- The parameter storage device 60 and the gradation LUT storage device 64 are connected to each other via a bus 66.
- The console 14 is also connected to the bus 66.
- The console 14 has a video buffer 68 and a controller 70.
- The controller 70 controls the supply of output to a CRT display unit 72 connected to the console 14 and processes data inputted via a keyboard 74 and a mouse 76.
- The output device 16 is connected to the image processing circuit 58 and records an image on a film based on image data supplied from the image processing circuit 58.
- The image reading, recording and reproducing system employed in the present embodiment is basically constructed as described above. The operation and effects of the system will next be described.
- When the reflective scanner 10 is used, the reflective original S is set on the original placement table 20, where its image reading surface is rendered flat by the original pressing plate 22.
- The reflective scanner 10 reads the image under the control of the transmissive scanner 12, in accordance with instructions given via the keyboard 74 of the console 14. The conveying motor 44 is accordingly driven to cause the light source 24 and the moving mirror 26 to start moving.
- The light source 24 first irradiates the shading reference plate 25 with light rays emitted from each of the lamps 34a and 34b, and the light reflected from the shading reference plate 25 is introduced into each of the CCDs 42a through 42c via the reflecting mirrors 36, 38a and 38b, the ND filter 28, the focusing lens 30 and the prisms 40a through 40c.
- The CCDs 42a through 42c photoelectrically convert the reflected light into R, G and B signals and supply them to the control circuit 46, where a shading correction is made. That is, based on the R, G and B signals obtained by scanning the shading reference plate 25, the control circuit 46 compensates for shading caused by the characteristics of the focusing lens 30 and the CCDs 42a through 42c.
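A shading (flat-field) correction of this kind can be sketched as follows; the signal values and white level below are illustrative assumptions, not values from the patent.

```python
# Hypothetical shading correction: each pixel of a scan line is scaled by the
# response the same pixel showed for the uniform reference plate, so lens
# falloff and per-pixel CCD sensitivity differences cancel out.

def shading_correct(line, reference, white_level=255.0):
    # reference[i]: signal read from the shading reference plate at pixel i
    return [pixel * white_level / ref for pixel, ref in zip(line, reference)]

reference = [250.0, 200.0, 240.0]   # uneven response to the uniform plate
line      = [125.0, 100.0, 120.0]   # raw scan line of the original
corrected = shading_correct(line, reference)   # every pixel becomes 127.5
```

All three raw pixels are exactly half of their reference response, so after correction they agree, which is the point of the reference-plate scan.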
- Next, when the reflective original S is scanned, the CCDs 42a through 42c photoelectrically convert the image information into R, G and B signals and supply them to the control circuit 46 (Step S1).
- The control circuit 46 transmits the R, G and B signals to the transmissive scanner 12.
- The R, G and B signals, received via the input/output control circuit 50, are subjected to processing such as logarithmic conversion in the pre-processing circuit 52 under the control of the CPU 48, whereby R, G and B density data are produced (Step S2).
- The produced R, G and B density data are temporarily stored in the image buffer 54.
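The logarithmic conversion from a linear sensor signal to optical density can be sketched as below; the full-scale value and clipping floor are assumptions for illustration.

```python
import math

# Hypothetical log conversion of a linear CCD signal into optical density:
# density = -log10(signal / full_scale), clipped to avoid log(0) at black.

def to_density(signal, full_scale=255.0, floor=1e-6):
    ratio = max(signal / full_scale, floor)
    return -math.log10(ratio)

d = to_density(25.5)   # one decade below full scale -> density 1.0
```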
- The pre-processing circuit 52 effects END conversion on the R, G and B density data based on the ENDLUT, which is prepared corresponding to the amounts of dyes in the reflective original S and stored in the ENDLUT storage device 62, whereby Y, M and C dye amount data are produced (Step S3).
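A table-driven conversion of this sort can be sketched as a look-up with interpolation. The node values below are invented for illustration, and linear interpolation between nodes is an assumption; the patent does not specify the ENDLUT contents.

```python
# Hypothetical END (Equivalent Neutral Density) conversion: a density value is
# mapped through a look-up table of (density, dye amount) nodes, with linear
# interpolation between adjacent nodes.

def lut_lookup(lut, x):
    # lut: sorted list of (input_density, dye_amount) nodes
    for (x0, y0), (x1, y1) in zip(lut, lut[1:]):
        if x0 <= x <= x1:
            t = (x - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
    raise ValueError("density outside LUT range")

end_lut_y = [(0.0, 0.0), (1.0, 50.0), (2.0, 100.0)]  # assumed node values
y_dye = lut_lookup(end_lut_y, 0.5)                   # -> 25.0
```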
- The image processing circuit 58 effects picture adjustment for adjusting the densities of highlights and shadows of the Y, M and C dye amount data to respective preset reference values (Step S4). Thereafter, image processes such as gradation (Step S5), color correction (Step S6), UCR (Under Color Removal) and sharpness enhancement (Step S7) are performed by the image processing circuit 58, based on the image processing conditions corrected by the image processing conditions correcting circuit 56 as described later, and halftone % data are produced and outputted to the output device 16.
- When a transmissive original is used, R, G and B signals read by the image scanning device 49 are transmitted to the pre-processing circuit 52 via the input/output control circuit 50, where they are processed in a manner similar to those of the reflective original S.
- The output device 16 converts the halftone % data outputted from the image processing circuit 58 into halftone data for on/off-controlling a laser beam (Step S8), and a desired image is recorded on a film according to the halftone data (Step S9).
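One common way to turn halftone % data into on/off exposure decisions is a threshold matrix (ordered-dither screen); the 4x4 matrix below is an illustrative assumption, not the screening method the patent specifies.

```python
# Hypothetical halftone screening: within a screen cell, a spot is exposed
# (laser on) when the halftone percentage exceeds the threshold at that
# position, so the fraction of exposed spots tracks the halftone %.

THRESHOLDS = [[ 6.25, 56.25, 18.75, 68.75],
              [81.25, 31.25, 93.75, 43.75],
              [25.00, 75.00, 12.50, 62.50],
              [100.0, 50.00, 87.50, 37.50]]   # 4x4 screen, percent units

def screen_cell(halftone_percent):
    return [[halftone_percent > t for t in row] for row in THRESHOLDS]

cell = screen_cell(50.0)
on = sum(v for row in cell for v in row)   # 7 of 16 spots fire at 50%
```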
- The image reading, recording and reproducing system is operated as described above to produce the film original plate. A description will next be made of how the image processing conditions are set in the system.
- The reflective original S (or the transmissive original) is set on the reflective scanner 10 (or the transmissive scanner 12), as shown in the flowchart of FIG. 4 (Step S20). Thereafter, pre-scanning for roughly reading the image information recorded on the reflective original S (or the transmissive original) is performed (Step S21).
- The transmissive scanner 12 automatically sets part of the image processing conditions based on the pre-scanned image data obtained by the pre-scanning.
- The image processing conditions are used to control the respective processes such as picture adjustment, gradation, color correction, UCR and sharpness enhancement.
- The transmissive scanner 12 can automatically set this part of the image processing conditions according to fuzzy inference rules. That is, the pre-scanned image data are stored in the image buffer 54 and displayed on the CRT display unit 72 connected to the console 14, and characteristic values of the pre-scanned image data are calculated.
- Examples of such characteristic values include a cumulative histogram of density (cumulative percentage versus density) in each region of the screen divided into a predetermined number of sectors (e.g., 1/4 or 1/8), an average density, a maximum peak density, and cumulative density histograms of Y, M and C in each of the regions.
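A cumulative density histogram of one region can be sketched as follows; the density samples and bin edges are illustrative assumptions.

```python
# Cumulative density histogram (cumulative percentage vs. density) for one
# region: for each bin edge, count the fraction of pixels at or below it.

def cumulative_histogram(densities, bins):
    n = len(densities)
    counts = [sum(1 for d in densities if d <= edge) for edge in bins]
    return [100.0 * c / n for c in counts]

region = [0.1, 0.4, 0.4, 0.9, 1.5, 2.2]   # assumed density samples
bins   = [0.5, 1.0, 1.5, 2.0, 2.5]        # assumed bin edges
cum = cumulative_histogram(region, bins)   # starts at 50.0, ends at 100.0
```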
- Original-classifying information, as part of the image processing conditions, is determined in accordance with fuzzy inference rules describing the relationship between these characteristic values and the original characteristics to be sought. For example:
- The entire range of the average density is classified into seven categories (a) through (g), under one of which every set of pre-scanned image data will fall.
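Such a seven-way classification of the average density can be sketched with ordered boundaries; the boundary values below are assumptions, since the patent does not give them.

```python
import bisect

# Hypothetical classification of an average density into the seven categories
# (a)-(g); the six boundary densities are assumed, not taken from the patent.

BOUNDARIES = [0.3, 0.6, 0.9, 1.2, 1.5, 1.8]
CATEGORIES = ["a", "b", "c", "d", "e", "f", "g"]

def classify(avg_density):
    # bisect_right finds how many boundaries lie at or below the density,
    # which is exactly the category index.
    return CATEGORIES[bisect.bisect_right(BOUNDARIES, avg_density)]

c = classify(0.75)   # falls between 0.6 and 0.9 -> "c"
```

In a real set-up the crisp boundaries would be replaced by overlapping fuzzy membership functions, but the mapping from density to category is the same idea.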
- Other examples of the original-classifying information include whether the original is high-key or low-key, whether a highlight point exists, whether a particular picture pattern exists, and whether a skin color or a color fog is present.
- Fuzzy inference rules similar to the above can be applied to derive these other kinds of original-classifying information.
- Fuzzy rules based on the original-classifying information referred to above establish set-up parameters as another part of the image processing conditions.
- The set-up parameters can also be set manually in relation to the finishing requirements (Step S22).
- The reflective scanner 10 then full-scans the reflective original S (or the transmissive original) to read the image information recorded thereon in detail (Step S23).
- The image processing circuit 58 effects the desired image processing on the full-scanned image data under the image processing conditions set in Step S22. Thereafter, the image processing circuit 58 transmits the processed image data to the output device 16 so as to reproduce and output an image (Step S24).
- To set the image processing conditions, the picture shown in FIG. 5 is displayed on the screen of the CRT display unit 72 connected to the console 14. That is, the image obtained by the pre-scanning in Step S21 is displayed on the screen, together with the finishing requirements selectable by an operator.
- The finishing requirements are classified into brightness instructions and finishing instructions, for example. Any one of the brightness instructions, such as "brighten", "brighten slightly", "darken", "darken slightly" and "as carried by original", can be selected with the mouse 76. Any one of the finishing instructions, such as "fine skin", "make uniform gray", "beautiful sky" and "beautiful green", can likewise be selected.
- An initial set-up parameter, corresponding to the amount of correction from the standard gradation curve stored in the gradation LUT storage device 64, is first set using the aforementioned fuzzy inference rules and stored in the parameter storage device 60 (Step S30). The operator then selects the term "fine skin" on the screen shown in FIG. 5 using the mouse 76 (Step S31) and specifies, on the pre-scanned image, one or more points to be finished as "fine skin" (Step S32).
- Ideal halftone % data corresponding to the term "fine skin" specified by the operator are read from the gradation LUT storage device 64, and the target halftone % data closest to the halftone % data at the specified points are calculated (Step S34).
- The ideal halftone % data corresponding to the term "fine skin" are defined as the data on a surface α surrounded by points a1, a2 and a3 in FIG. 7.
- The target halftone % data are defined as the ideal halftone % data y_j, m_j and c_j for which the index r_i, defined by the following equation (1), is minimized.
- The halftone % data and the target halftone % data obtained by carrying out the above calculation at the respective specified points are averaged for each of the Y, M and C dyes.
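Minimizing the index r_i of equation (1) amounts to finding the nearest point in (Y, M, C) space. The sketch below uses a small list of candidate ideal points as a placeholder for samples of the "fine skin" surface; the numbers are invented for illustration.

```python
# Target halftone % data as the candidate ideal point minimizing r_i^2 of
# equation (1), i.e. the squared Euclidean distance in (Y, M, C) space.

def nearest_target(point, ideal_points):
    def r_squared(p, q):
        # r^2 = (y - y_j)^2 + (m - m_j)^2 + (c - c_j)^2, as in equation (1)
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(ideal_points, key=lambda q: r_squared(point, q))

ideal = [(10.0, 30.0, 25.0), (15.0, 35.0, 30.0), (20.0, 40.0, 35.0)]
target = nearest_target((14.0, 33.0, 29.0), ideal)   # -> (15.0, 35.0, 30.0)
```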
- In Step S36, the halftone % data at the respective specified points are re-calculated, based on the gradation curve obtained using the corrected set-up parameters, in a manner similar to Step S33 (Step S36).
- In Step S37, the halftone % data obtained after correcting the set-up parameters are evaluated. That is, let an evaluation function f_y represent how far the halftone % data for Y at the specified points, determined before the set-up parameter for the middle separation (MS) is changed, deviate from the target halftone % data as a whole. Another evaluation function f_y' is defined in the same way for the data obtained after the set-up parameter for the middle separation (MS) is corrected.
- The respective evaluation functions are represented by the following equations (2) and (3):

  f_y = (1/n) Σ_{i=1..n} w_i (y_i - y_i0)^2 . . . (2)

  f_y' = (1/n) Σ_{i=1..n} w_i (y_i' - y_i0)^2 . . . (3)

  where
  n: number of specified points;
  y_i: halftone % data for the hue of Y at the i-th specified point, determined before the MS correction;
  y_i0: target halftone % data for the hue of Y at the i-th specified point;
  y_i': halftone % data for the hue of Y at the i-th specified point, determined after the MS correction;
  r_i': index of equation (1) for the hue of Y, recomputed after the MS correction, at the i-th specified point.
- The weighting factors w_i are used to correct for differences between the hues and are set automatically so as to eliminate those differences.
- When the evaluation improves, the halftone % data for Y at the specified points are judged to be near the target halftone % data as a whole, and the corrected value of the set-up parameter for the middle separation (MS) is therefore judged to be proper. The corrected value is then stored in the parameter storage device 60.
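The accept/reject decision on a corrected parameter can be sketched as below. A weighted mean of squared deviations is one plausible form of the evaluation function; the exact form used in the patent may differ, and all numeric values here are invented.

```python
# Evaluation before (f) and after (f_prime) a set-up parameter correction:
# the correction is judged proper when the evaluation improves.

def evaluate(values, targets, weights):
    # Weighted mean squared deviation from the target halftone % data.
    n = len(values)
    return sum(w * (y - y0) ** 2
               for y, y0, w in zip(values, targets, weights)) / n

targets = [50.0, 60.0, 70.0]
weights = [1.0, 1.0, 1.0]
f       = evaluate([40.0, 55.0, 80.0], targets, weights)   # before correction
f_prime = evaluate([48.0, 59.0, 72.0], targets, weights)   # after correction
accept  = f_prime < f   # corrected parameter judged proper
```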
- Steps S35 through S37 are executed repeatedly to correct the set-up parameters for the middle separation (MS) step by step.
- The set-up parameters for the middle separation (MS) are then formally set in the parameter storage device 60.
- When the term "fine skin" is next selected as the finishing instruction (Step S38), a fine adjustment is effected on each of the hues of Y, M and C. Since the skin color consists principally of the hues of Y and R, only the pre-scanned image data judged to have a hue of R within a predetermined extent are first selected. Color correction is then performed on the Y, M and C dye amount data of the extracted pre-scanned image data (Step S39). Further, processes similar to those in Steps S36 and S37 are executed (Steps S40 and S41) to correct the set-up parameters with respect to the dye amounts referred to above. The dye amount data are similarly corrected, and set-up parameters similarly set, with respect to the pre-scanned image data having the hue of Y.
- The ideal halftone % data corresponding to the finishing requirement "make uniform gray" are defined as the data on a curved line segment β between points b1 and b2 in FIG. 8.
- Set-up parameters can then be established in a manner similar to the case where the term "fine skin" is selected.
- The ideal halftone % data corresponding to the finishing requirement "beautiful sky" are defined by two sets of data: the data on a line segment γ1 formed between points c1 and c2 in FIG. 9, and the data on a surface γ2 surrounded by points c2, c3 and c4 in FIG. 9.
- When the term "beautiful green" is selected, the target halftone % data are calculated in Step S34 of FIG. 6 in the following manner. The ideal halftone % data corresponding to the term "beautiful green" are defined as the data existing within a solid δ surrounded by points d1, d2, d3, d4, d5 and d6 in FIG. 10. Accordingly, the ideal halftone % data on the surface of the solid δ closest to the halftone % data at a specified point instructed by the operator are determined in accordance with equation (1). The data so determined are set as temporary target halftone % data.
- In this case, the evaluation function of equation (2) is replaced with a new evaluation function f, where n is the number of specified points and y_ij0 is the target halftone % data for the hue of Y at the i-th specified point. The evaluation function of equation (3) is likewise replaced with a new evaluation function f'. All the finishing requirements can then be set and satisfied by judging the corrected set-up parameters with the new evaluation functions f and f'.
- The reflective original S is then full-scanned (Step S23).
- Gradation and color correction are effected on the resultant full-scanned image data based on the image processing conditions so set, making it possible to obtain high-accuracy image data in which the finishing requirements are met.
Abstract
Description
r_i^2 = (y_i - y_j)^2 + (m_i - m_j)^2 + (c_i - c_j)^2 . . . (1)
Claims (3)
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP1554693 | 1993-02-02 | ||
JP5-015546 | 1993-02-02 | ||
JP32258093A JP3335240B2 (en) | 1993-02-02 | 1993-12-21 | Image processing condition setting method and apparatus |
JP5-322580 | 1993-12-21 |
Publications (1)
Publication Number | Publication Date |
---|---|
US5475509A true US5475509A (en) | 1995-12-12 |
Family
ID=26351716
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/189,746 Expired - Lifetime US5475509A (en) | 1993-02-02 | 1994-02-01 | Method and apparatus for setting image processing conditions |
Country Status (2)
Country | Link |
---|---|
US (1) | US5475509A (en) |
JP (1) | JP3335240B2 (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5694484A (en) * | 1995-05-15 | 1997-12-02 | Polaroid Corporation | System and method for automatically processing image data to provide images of optimal perceptual quality |
US5734801A (en) * | 1994-12-28 | 1998-03-31 | Fuji Photo Film Co., Ltd. | Method of and apparatus for producing color proof |
US5745262A (en) * | 1995-10-16 | 1998-04-28 | Fuji Photo Film Co., Ltd. | Image read-out and processing apparatus having a parameter correction circuit |
US6072588A (en) * | 1996-04-02 | 2000-06-06 | Fuji Photo Film Co., Ltd. | Method of generating proof data and method of generating proof |
US20030053160A1 (en) * | 2001-09-20 | 2003-03-20 | Umax Data Systems Inc. | Scanning method and scanning system free of identifying original's attribute |
US20040133531A1 (en) * | 2003-01-06 | 2004-07-08 | Dingding Chen | Neural network training data selection using memory reduced cluster analysis for field model development |
US6954213B1 (en) | 1995-10-02 | 2005-10-11 | Canon Kabushiki Kaisha | Objective automated color matching between input and output devices |
EP1619875A1 (en) * | 1997-06-17 | 2006-01-25 | Seiko Epson Corporation | Image processing apparatus, image processing method, color adjustment method, and color adjusment system |
US7038713B1 (en) * | 1997-09-09 | 2006-05-02 | Fuji Photo Film Co., Ltd. | Image processing apparatus |
US7119923B1 (en) * | 1999-07-23 | 2006-10-10 | Fuji Photo Film Co., Ltd. | Apparatus and method for image processing |
US20070011114A1 (en) * | 2005-06-24 | 2007-01-11 | Halliburton Energy Services, Inc. | Ensembles of neural networks with different input sets |
US20070011115A1 (en) * | 2005-06-24 | 2007-01-11 | Halliburton Energy Services, Inc. | Well logging with reduced usage of radioisotopic sources |
US20070165286A1 (en) * | 2006-01-18 | 2007-07-19 | Mitsubishi Denki Kabushiki Kaisha | Image reading apparatus |
US20070258115A1 (en) * | 2001-08-08 | 2007-11-08 | Transpacific Ip, Ltd. | Method of shortening multiple-image scanning duration |
US20080228680A1 (en) * | 2007-03-14 | 2008-09-18 | Halliburton Energy Services Inc. | Neural-Network Based Surrogate Model Construction Methods and Applications Thereof |
US20100013940A1 (en) * | 2005-06-21 | 2010-01-21 | Nittoh Kogaku K.K. | Image processing apparatus |
US20100040281A1 (en) * | 2008-08-12 | 2010-02-18 | Halliburton Energy Services, Inc. | Systems and Methods Employing Cooperative Optimization-Based Dimensionality Reduction |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6618170B1 (en) * | 1999-05-14 | 2003-09-09 | Xerox Corporation | User interface comprising hue shift control for color printing |
JP2001251505A (en) * | 1999-12-28 | 2001-09-14 | Fuji Photo Film Co Ltd | Image processing condition setting device and storage medium for image processing condition setting program |
US7212311B2 (en) | 1999-12-28 | 2007-05-01 | Fujifilm Corporation | Image processing condition setting apparatus and image processing condition setting program storage medium |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04111575A (en) * | 1990-08-30 | 1992-04-13 | Fuji Photo Film Co Ltd | Picture processor |
US5155588A (en) * | 1990-02-06 | 1992-10-13 | Levien Raphael L | Color correction and apparatus for photographic reproduction |
- 1993
  - 1993-12-21: JP JP32258093A patent/JP3335240B2/en not_active Expired - Fee Related
- 1994
  - 1994-02-01: US US08/189,746 patent/US5475509A/en not_active Expired - Lifetime
Cited By (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5734801A (en) * | 1994-12-28 | 1998-03-31 | Fuji Photo Film Co., Ltd. | Method of and apparatus for producing color proof |
US5694484A (en) * | 1995-05-15 | 1997-12-02 | Polaroid Corporation | System and method for automatically processing image data to provide images of optimal perceptual quality |
US6954213B1 (en) | 1995-10-02 | 2005-10-11 | Canon Kabushiki Kaisha | Objective automated color matching between input and output devices |
US5745262A (en) * | 1995-10-16 | 1998-04-28 | Fuji Photo Film Co., Ltd. | Image read-out and processing apparatus having a parameter correction circuit |
US6072588A (en) * | 1996-04-02 | 2000-06-06 | Fuji Photo Film Co., Ltd. | Method of generating proof data and method of generating proof |
US7286265B2 (en) | 1997-06-17 | 2007-10-23 | Seiko Epson Corporation | Image processing apparatus, image processing method, image processing program recording medium, color adjustment method, color adjustment device, and color adjustment control program recording medium |
US7292371B2 (en) | 1997-06-17 | 2007-11-06 | Seiko Epson Corporation | Image processing apparatus, image processing method, image processing program recording medium, color adjustment method, color adjustment device, and color adjustment control program recording medium |
EP1619875A1 (en) * | 1997-06-17 | 2006-01-25 | Seiko Epson Corporation | Image processing apparatus, image processing method, color adjustment method, and color adjustment system |
US7072074B2 (en) | 1997-06-17 | 2006-07-04 | Seiko Epson Corporation | Image processing apparatus, image processing method, image processing program recording medium, color adjustment method, color adjustment device, and color adjustment control program recording medium |
US20060203298A1 (en) * | 1997-06-17 | 2006-09-14 | Seiko Epson Corporation | Image processing apparatus, image processing method, image processing program recording medium, color adjustment method, color adjustment device, and color adjustment control program recording medium |
US20060203297A1 (en) * | 1997-06-17 | 2006-09-14 | Seiko Epson Corporation | Image processing apparatus, image processing method, image processing program recording medium, color adjustment method, color adjustment device, and color adjustment control program recording medium |
US7038713B1 (en) * | 1997-09-09 | 2006-05-02 | Fuji Photo Film Co., Ltd. | Image processing apparatus |
US7119923B1 (en) * | 1999-07-23 | 2006-10-10 | Fuji Photo Film Co., Ltd. | Apparatus and method for image processing |
US7733541B2 (en) * | 2001-08-08 | 2010-06-08 | Rong-Ji Liu | Method of shortening multiple-image scanning duration |
US20070258115A1 (en) * | 2001-08-08 | 2007-11-08 | Transpacific Ip, Ltd. | Method of shortening multiple-image scanning duration |
US7440146B2 (en) * | 2001-09-20 | 2008-10-21 | Transpacific Ip, Llp | Scanning method and scanning system free of identifying original's attribute |
US20030053160A1 (en) * | 2001-09-20 | 2003-03-20 | Umax Data Systems Inc. | Scanning method and scanning system free of identifying original's attribute |
US8374974B2 (en) * | 2003-01-06 | 2013-02-12 | Halliburton Energy Services, Inc. | Neural network training data selection using memory reduced cluster analysis for field model development |
US20040133531A1 (en) * | 2003-01-06 | 2004-07-08 | Dingding Chen | Neural network training data selection using memory reduced cluster analysis for field model development |
US20100013940A1 (en) * | 2005-06-21 | 2010-01-21 | Nittoh Kogaku K.K. | Image processing apparatus |
US20070011115A1 (en) * | 2005-06-24 | 2007-01-11 | Halliburton Energy Services, Inc. | Well logging with reduced usage of radioisotopic sources |
US7587373B2 (en) | 2005-06-24 | 2009-09-08 | Halliburton Energy Services, Inc. | Neural network based well log synthesis with reduced usage of radioisotopic sources |
US7613665B2 (en) | 2005-06-24 | 2009-11-03 | Halliburton Energy Services, Inc. | Ensembles of neural networks with different input sets |
US20070011114A1 (en) * | 2005-06-24 | 2007-01-11 | Halliburton Energy Services, Inc. | Ensembles of neural networks with different input sets |
US20070165286A1 (en) * | 2006-01-18 | 2007-07-19 | Mitsubishi Denki Kabushiki Kaisha | Image reading apparatus |
US7859726B2 (en) * | 2006-01-18 | 2010-12-28 | Mitsubishi Denki Kabushiki Kaisha | Image reading apparatus |
US20080228680A1 (en) * | 2007-03-14 | 2008-09-18 | Halliburton Energy Services Inc. | Neural-Network Based Surrogate Model Construction Methods and Applications Thereof |
US8065244B2 (en) | 2007-03-14 | 2011-11-22 | Halliburton Energy Services, Inc. | Neural-network based surrogate model construction methods and applications thereof |
US20100040281A1 (en) * | 2008-08-12 | 2010-02-18 | Halliburton Energy Services, Inc. | Systems and Methods Employing Cooperative Optimization-Based Dimensionality Reduction |
US9514388B2 (en) | 2008-08-12 | 2016-12-06 | Halliburton Energy Services, Inc. | Systems and methods employing cooperative optimization-based dimensionality reduction |
Also Published As
Publication number | Publication date |
---|---|
JPH06291998A (en) | 1994-10-18 |
JP3335240B2 (en) | 2002-10-15 |
Similar Documents
Publication | Title |
---|---|
US5475509A (en) | Method and apparatus for setting image processing conditions |
US6674544B2 (en) | Image processing method and apparatus |
US5053888A (en) | Method of and apparatus for establishing highlight and shadow densities |
US7127108B2 (en) | Image processing method |
JP3256982B2 (en) | Image processing device |
US5652663A (en) | Preview buffer for electronic scanner |
US5515172A (en) | Apparatus and method for enhanced color to color conversion |
US6975437B2 (en) | Method, apparatus and recording medium for color correction |
US5359437A (en) | Method for under-color removal in color image forming apparatus |
US6122076A (en) | Image reproducing method and apparatus including compressing a dynamic range of a read-out image signal |
US5973802A (en) | Image reproducing apparatus for forming either of a color image or a monochromatic image |
JPH0686045A (en) | Texture picture processing system for picture processor |
JPH09182093A (en) | Image reproducing method and device |
JPH07101917B2 (en) | Area controller |
JPH11266358A (en) | Image processing method |
US5710840A (en) | Image processing method and apparatus for adjusting the tone density of pixels based upon differences between the tone density of a center pixel and tone densities of peripheral pixels |
EP0940773B1 (en) | Image processing method and image processing apparatus |
US5850297A (en) | Image reading apparatus for reading first and second images |
JP4083823B2 (en) | Image processing device |
US20020012126A1 (en) | Image processing method and apparatus |
JP4132586B2 (en) | Image processing condition setting method |
JPH1013680A (en) | Image processing method and image processor |
JPH0738757A (en) | Image processor for correcting density by areas |
JPH06233076A (en) | Method and device for extracting picture outline |
JPH1013683A (en) | Image input method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: FUJI PHOTO FILM CO., LTD., JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: OKAMOTO, TAKAHIRO; REEL/FRAME: 006864/0031. Effective date: 19940119 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| FPAY | Fee payment | Year of fee payment: 4 |
| FEPP | Fee payment procedure | Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| FPAY | Fee payment | Year of fee payment: 8 |
| AS | Assignment | Owner name: FUJIFILM CORPORATION, JAPAN. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: FUJIFILM HOLDINGS CORPORATION (FORMERLY FUJI PHOTO FILM CO., LTD.); REEL/FRAME: 018904/0001. Effective date: 20070130 |
| FPAY | Fee payment | Year of fee payment: 12 |