US20050041103A1 - Image processing method, image processing apparatus and image processing program - Google Patents

Image processing method, image processing apparatus and image processing program

Info

Publication number: US20050041103A1
Authority: US
Grant status: Application
Legal status: Abandoned (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: US10919314
Inventor: Naoto Kinjo
Current and original assignee: Fujifilm Corp (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Prior art keywords: image, objects, image processing, images, data

Classifications

    • H: ELECTRICITY; H04: ELECTRIC COMMUNICATION TECHNIQUE; H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00204 Connection or combination of a still picture apparatus with a digital computer or a digital computer system, e.g. an internet server
    • H04N1/00244 Connection or combination of a still picture apparatus with a server, e.g. an internet server
    • H04N1/0044 Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
    • H04N1/00326 Connection or combination of a still picture apparatus with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus
    • H04N1/00342 Connection or combination of a still picture apparatus with a radio frequency tag transmitter or receiver
    • H04N1/32138 Display, printing, storage or transmission of additional information attached to the image data in an electronic device attached to the sheet, e.g. in an RFID tag
    • H04N2101/00 Still video cameras
    • H04N2201/001 Sharing resources, e.g. processing power or memory, with a connected apparatus or enhancing the capability of the still picture apparatus
    • H04N2201/3225 Display, printing, storage or transmission of additional information of data relating to an image, a page or a document
    • H04N2201/3253 Position information, e.g. geographical position at time of capture, GPS data
    • H04N2201/3274 Storage or retrieval of prestored additional information

Abstract

Using an image processing program, a personal computer identifies the objects photographed in individual images on the basis of additional data appended to the image data of those images. The image data and the additional data are acquired and written to a memory card by a digital camera, and then transferred to the personal computer. When the user selects a part of an image to correct, plural levels of categories are displayed on a monitor in accordance with the object corresponding to the selected part. The user selects a category from among the displayed categories and designates the contents and parameters of the image correction. Any parts of the image that correspond to objects included in the selected category are then automatically corrected in the designated manner. If necessary, the corresponding parts of other related images are also automatically corrected in the designated manner.

Description

    FIELD OF THE INVENTION
  • The present invention relates to an image processing method for correcting color and other factors to improve the quality of images photographed by digital cameras or the like, and to an image processing apparatus and an image processing program for that method.
  • BACKGROUND ART
  • As digital cameras come into wide use, an increasing number of users process their photographed images on personal computers in order to correct the color or quality of the images.
  • To let such users process images with ease, Japanese Laid-open Patent Application No. Hei 11-275351 suggests an image processing method for processing a series of mutually related image frames. A particular part of the first frame of the series is designated for correction, and the image characteristic values of that part are memorized before the correction. After the part is corrected, the contents of the correction are memorized as well. Parts having image characteristic values similar to the memorized values are then extracted from the other frames of the series, as parts similar to the particular part, and the same image correction is automatically applied to them. According to this prior art, image data of similar parts across a number of related image frames are processed easily and efficiently.
  • Japanese Laid-open Patent Application No. 2001-238177 discloses an image processing method that processes each image frame in accordance with the kind of photographic scene into which the frame is classified. In this prior art, camera data such as the photographic position are obtained or entered when each subject is photographed and, if necessary, data about the kind of photographic scene are entered as well. The kind of photographic scene is then estimated with reference to at least one of the camera data and the related data, alone or in combination with the image data. How to process the image is predetermined for each kind of photographic scene. Because the image processing is optimized for the photographic scene of the image being corrected, high-quality images can be obtained.
  • According to the former prior art, however, because the similar parts are extracted by reference to image characteristic values alone, some of the extracted parts can be unrelated to the designated part of the first image frame. Moreover, if the following image frames contain no similar parts, the user must carry out different image processing from one image frame to another, in which case the processing efficiency would be lower than that of conventional methods.
  • Since the latter prior art defines the parameters for the image processing automatically, the processed images do not always agree with the user's memory or impression at the time of photography, or with the user's taste or intention.
  • SUMMARY OF THE INVENTION
  • In view of the foregoing, a primary object of the present invention is to provide an image processing method that reliably designates the image portions to be corrected and enables images to be corrected with high efficiency, together with an image processing apparatus and an image processing program for that method.
  • Another object of the present invention is to provide an image processing method that can reproduce images in a way that reflects the user's intention, together with an image processing apparatus and an image processing program for that method.
  • To achieve the above and other objects, an image processing method of the present invention comprises the steps of identifying objects corresponding to parts of a photographed image; determining plural levels of categories in accordance with one of the objects which corresponds to a selected part to correct in the image; selecting a category from among the plural levels of categories; and carrying out the same image correction on the selected part and on those parts of the image, and of images related to the image, which correspond to objects included in the selected category.
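The claimed steps can be sketched in outline as follows. This is an illustrative reading, not the patented implementation; `categories_for` and `apply_correction` are hypothetical stand-ins for the category database lookup and the user-designated correction.

```python
# Sketch of the claimed method: apply one designated correction to every image
# part whose identified object falls within the user-selected category.
# `categories_for` and `apply_correction` are hypothetical stand-ins.

def correct_by_category(image, related_images, chosen_category,
                        categories_for, apply_correction):
    """Return the (image name, part) pairs that received the correction."""
    corrected = []
    for img in [image] + related_images:
        for part, obj in img["parts"].items():          # part -> identified object
            if chosen_category in categories_for(obj):  # object in chosen category?
                apply_correction(img, part)
                corrected.append((img["name"], part))
    return corrected
```

A correction chosen for one object in one frame thus propagates to every related frame containing an object of the chosen category.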
  • It is preferable to allow the user to choose whether or not to carry out the image correction on the related images.
  • According to a preferred embodiment, IC tags storing identification data of the objects are attached to the objects, and the objects are identified by reading out the identification data from the IC tags.
  • According to another preferred embodiment, the objects are identified by retrieving identification data of the objects from a data base on the basis of photographic data appended to the image.
  • An image processing apparatus of the present invention comprises an identifying device for identifying objects corresponding to parts of photographed images; a determining device for determining plural levels of categories in accordance with one of the objects which corresponds to a selected part to correct in one of the images; a selecting device for selecting a category from among the plural levels of categories; and a correction device for carrying out the same image correction on the selected part and on those parts of the one image, and of images related to the one image, which correspond to objects included in the selected category.
  • According to a preferred embodiment, the image processing apparatus further comprises a device for allowing the user to choose whether or not to carry out the image correction on the related images.
  • In a preferred embodiment, the identifying device identifies the photographed objects by reading out identification data of the objects from IC tags attached to the objects.
  • In another preferred embodiment, the identifying device identifies the photographed objects by retrieving identification data of the objects from a data base on the basis of photographic data appended to the image.
  • An image processing program of the present invention comprises the steps of identifying objects photographed in an image; determining plural levels of categories in accordance with the object that corresponds to a part selected to correct in the image; extracting from the image those parts which correspond to objects included in a category selected from among the plural levels of categories; correcting the extracted parts in accordance with designated parameters; identifying, with respect to images related to the image, those objects which are included in the selected category; and correcting the parts of the related images corresponding to the identified objects in the same way as the extracted parts of the image.
  • According to the image processing method, apparatus and program of the present invention, the image parts to correct are automatically extracted with high accuracy, so that the efficiency of image correction is improved. Because the category of objects to correct and the parameters of the image correction are selected or designated by the user, the image correction reflects the user's intentions and preferences.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other objects and advantages will be more apparent from the following detailed description of the preferred embodiments when read in connection with the accompanying drawings, wherein like reference numerals designate like or corresponding parts throughout the several views, and wherein:
  • FIG. 1 is an explanatory diagram illustrating an image processing apparatus embodying the method of the present invention;
  • FIG. 2 is a block diagram illustrating the electric structure of a digital camera as a component of the image processing apparatus of the invention;
  • FIG. 3 is a block diagram illustrating the electric structure of a personal computer as a component of the image processing apparatus of the invention;
  • FIG. 4 is an explanatory diagram illustrating an image window displayed initially at the activation of an image processing program of the present invention;
  • FIG. 5 is an explanatory diagram illustrating a window for designating a category of objects to correct;
  • FIG. 6 is an explanatory diagram illustrating a window for designating contents of image correction; and
  • FIG. 7 is a flowchart illustrating the overall sequence of the image processing program.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • In FIG. 1, a personal computer 2 has an image processing program 47 (see FIG. 3) of the present invention installed therein, and a digital camera 10 is connected to the personal computer 2. The personal computer 2 is also connected to a server 12 through the Internet 11.
  • In FIG. 2, showing the electric structure of the digital camera 10, a CPU 20 supervises and controls respective parts of the digital camera 10. An imaging section 21 is constituted of a not-shown photographic lens and a not-shown CCD, wherein an optical image of a subject formed through the photographic lens is picked up as an image signal through the CCD. A signal processing circuit 22 amplifies the image signal up to a predetermined level, and then converts the image signal into digital image data. The digital image data is subjected to various kinds of image processing, such as white balance adjustment and gamma-correction.
  • The digital image data obtained through the signal processing circuit 22 is used to drive an LCD driver so as to display live images on a liquid crystal display (LCD) 24. A random access memory (RAM) 25 stores the image data after processing through the signal processing circuit 22.
  • The CPU 20 outputs control signals to the respective parts in response to some operations on a release button 26 and a zooming section 27. When the release button 26 is operated, the image data presently stored in the RAM 25 is compressed through the signal processing circuit 22, and the compressed data is written on a memory card 28. An external interface 29 controls signal communication between the digital camera 10 and external apparatuses such as the personal computer 2.
  • A clock circuit 30 keeps the date and time in the digital camera 10, and outputs the date-and-time data to the CPU 20 at the time of photography. A location-and-orientation detector circuit 31 detects the present location and orientation of the digital camera 10 on the basis of signals from a GPS antenna 32 that receives electromagnetic waves from satellites. The location is detected as coordinates. An IC tag sensor 33 reads out ID data of an object photographed in a part of the image, for example, a person, furniture, clothes or accessories, from an IC tag attached to the object.
  • When the image data is written on the memory card 28, additional data are written in association with the image data. The additional data include the zoom ratio data (an amount of operation on the zooming section 27), the date-of-photograph data provided by the clock circuit 30, the photo location and orientation data detected by the location-and-orientation detector circuit 31, and the ID data read by the IC tag sensor 33.
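The per-frame additional data described above might be shaped like the following record. The field names and values are illustrative assumptions, not taken from the patent or from any camera file format.

```python
# Illustrative shape of the additional data stored with each frame; the field
# names ("zoom_ratio", "ic_tag_ids", etc.) are invented for this sketch.
additional_data = {
    "zoom_ratio": 2.5,                       # from the zooming section
    "timestamp": "2004-08-17T10:31:00",      # from the clock circuit
    "location": (35.3606, 138.7274),         # GPS coordinates
    "orientation_deg": 210.0,                # camera bearing
    "ic_tag_ids": ["TAG-0012", "TAG-0047"],  # IDs read from nearby IC tags
}
```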
  • As shown in FIG. 3, the personal computer 2 is constituted of a CPU 40, a monitor 41, a keyboard 42, a mouse 43, an external interface 44, a ROM 45 and a RAM 46. The image processing program 47 is loaded in the RAM 46.
  • The server 12 stores a 3D (three-dimensional) map data base used for identifying the object on the basis of the zoom ratio data and the photo location and orientation data, a data base showing a relationship between ID data read out from the IC tags and a variety of objects of photography, a data base recording plural levels of categories classifying the variety of objects, an image data base storing a number of images, and other data bases.
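One of these server-side data bases, the mapping from IC-tag ID data to photographic objects and their categories, might look like the following minimal stand-in; the tag IDs and entries are invented for illustration.

```python
# Invented stand-in for the server's ID database: each IC-tag ID resolves to an
# object and the category levels under which that object is classified.
TAG_DB = {
    "TAG-0012": {"object": "shirt", "categories": ["article", "clothes", "hue"]},
    "TAG-0047": {"object": "chair", "categories": ["article", "furniture", "material"]},
}

def identify_by_tag(tag_id):
    """Resolve an IC-tag ID to an object record, or None if the ID is unknown."""
    return TAG_DB.get(tag_id)
```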
  • The personal computer 2 accesses the server 12 through the Internet 11, to carry out image processing with reference to the data bases stored in the server 12 in the following manner. Alternatively, the data bases may be contained in the image processing program 47. In that case, it is unnecessary for the personal computer 2 to access the server 12.
  • On the basis of the additional data written in association with the image data on the memory card 28, the image processing program 47 identifies objects that correspond to respective parts of the individual image, i.e. the objects photographed in the individual image, for example, in a method as disclosed in the above mentioned Japanese Laid-open Patent Application No. 2001-238177.
  • For example, if the photographed subject is a mountain, the mountain is identified by using the zoom ratio data and the photo location and orientation data detected by the location-and-orientation detector circuit 31, with reference to the 3D map data base. Specifically, the photographed subject is compared with those objects which exist inside a given angle of view on a map, the angle of view being defined by the zoom ratio data and the photo location and orientation data. That is, on the basis of the zoom ratio data and the photo location and orientation data, a 3D computer graphics (CG) image is produced from the 3D map data base by a conventional CG rendering method, and the CG image is compared with the actually photographed image by pattern matching.
  • In this example, pattern matching is carried out between the ridge line of the mountain in the photographed image, determined by edge extraction based on color differences between pixels, and the mountain ridges of the CG image produced from the 3D map data base. While shifting the pixels of the CG image two-dimensionally, the point where the ridge line of the photographed mountain best coincides with that of the CG image is determined. Based on the zoom ratio and the photo location and orientation at that point, the name and location of the photographed mountain are retrieved from the 3D map data base.
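A toy version of this matching step can make the idea concrete: both ridge lines are reduced to sets of (x, y) pixels, and the CG ridge is shifted two-dimensionally to find the offset with the largest overlap. Real edge extraction and 3D-CG rendering are outside this sketch, and the function name is an invention.

```python
# Toy ridge-line matcher: exhaustively shift the CG ridge over a small window
# and keep the (dx, dy) offset where it overlaps the photographed ridge most.

def best_shift(photo_ridge, cg_ridge, max_shift=5):
    """Return ((dx, dy), score): the best shift of cg_ridge and its overlap count."""
    photo = set(photo_ridge)
    best, best_score = (0, 0), -1
    for dx in range(-max_shift, max_shift + 1):
        for dy in range(-max_shift, max_shift + 1):
            # count CG ridge pixels that land on photographed ridge pixels
            score = sum((x + dx, y + dy) in photo for (x, y) in cg_ridge)
            if score > best_score:
                best, best_score = (dx, dy), score
    return best, best_score
```

In the patent's flow, the winning offset (together with the zoom ratio, location and orientation) keys the retrieval of the mountain's name from the 3D map data base.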
  • If the photograph is taken in a town, individual structures can be identified in the same way as above. If an object cannot be determined to be anything other than a person, conventional face extraction is carried out on the image part corresponding to that object; if a facial image is extracted, the image part is determined to be a person.
  • The mouse 43 is operated for selecting an object to correct from among plural objects photographed in an image, wherein these objects are identified by the image processing program 47. The mouse 43 is also operated for choosing a category as a subject for the image correction from several levels of categories defined by the selected object, and for designating contents of the image correction.
  • The image processing program 47 processes the image data of those image parts which correspond not only to the selected object but also to the other objects included in the selected category, in order to carry out the designated image correction. The designated image correction is applied not only to the image in which the object to correct is selected, but also to other images related to that image, such as images stored in the same folder, images photographed on the same day, or a series of movie frames including the selected image.
  • Examples of the contents of the image correction include density, color tinge, color saturation, gradation, sharpness, smoothness and size, as shown in FIG. 6. If the selected object is a person, the contents of the image correction may also include the complexion of the face, the color of the hair, soft focusing and red-eye correction.
  • When the image processing program 47 is activated, the monitor 41 displays an image window 50, as shown for example in FIG. 4. In the image window 50, an image 51 read from the digital camera 10 and a cursor 52 moving in cooperation with the mouse 43 are displayed. The image 51 displayed initially may be the first of a number of images stored in a folder, the first frame of a movie, a representative image, or an image designated by the user.
  • The image 51 shown in FIG. 4 is composed of a person 53, grass 54, a first mountain 55 and a second mountain 56. These objects are identified in the way described above. One of these objects is selected by clicking the mouse 43 while putting the cursor 52 on the object. It is alternatively possible to select an object by inputting the proper name or noun of that object as character data through the keyboard 42, or as voice data through a not-shown microphone.
  • When the first mountain 55 is selected as the object to correct, a category designation window 60 and a content-of-image-correction designation window 61 are displayed on the monitor 41, as shown for example in FIGS. 5 and 6. The category designation window 60 displays four categories for the first mountain 55: 1. the proper name of the first mountain, 2. mountain, 3. plants, 4. hue. To designate the category and the content of the image correction, the mouse 43 is clicked while putting the cursor 52 sequentially on the respective checkboxes 62 of the items to select.
  • In the present embodiment, the category numbers listed in the category designation window 60 increase toward the upper levels, an upper-level category covering a wider range of objects. For example, when the person 53 is selected as the object to correct, person, sex, age and people are displayed as categories in this order in the category designation window 60. When a piece of furniture is selected as the object to correct, individual articles and materials are displayed as categories. As categories for clothes and accessories, individual articles, type, density, hue and so on are displayed. In addition, sunlight or shade, or the season, may be included among the available categories.
  • If the category “1. FIRST MOUNTAIN” is selected in the category designation window 60, the image correction is carried out on the part of the image 51 which corresponds to the first mountain 55; if there are images related to the image 51, the parts of those images which correspond to the first mountain are also corrected. If the category “2. MOUNTAIN” is selected, the image correction is carried out on those parts of the image 51 which correspond to the first and second mountains 55 and 56, and the image parts corresponding to mountains are corrected in the related images. If the category “3. PLANTS” is selected, the image correction is carried out on those parts of the image 51 which correspond to the first and second mountains 55 and 56 and the grass 54, and the image parts corresponding to mountains and grass are corrected in the related images. If the category “4. SAME HUE” is selected, the image correction is carried out on those parts of the image 51 and the related images which correspond to objects having the same hue as the first mountain. For example, if the person 53 wears a green shirt, the image of the green shirt is corrected in the same way as the mountains 55 and 56 and the grass 54.
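The widening effect of the four category levels can be sketched with the FIG. 4 scene. The object attributes below (kinds and hues) are invented for illustration, as is the mapping of "plants" onto mountain and grass objects.

```python
# Invented attributes for the FIG. 4 scene: person, grass, two mountains
# (plus the person's shirt), all assumed green for the "same hue" example.
SCENE = {
    "first mountain":  {"kind": "mountain", "hue": "green"},
    "second mountain": {"kind": "mountain", "hue": "green"},
    "grass":           {"kind": "grass",    "hue": "green"},
    "person's shirt":  {"kind": "clothes",  "hue": "green"},
}
PLANT_KINDS = {"mountain", "grass"}  # kinds covered by the "plants" level here

def targets(category, selected="first mountain"):
    """Objects affected when `category` is chosen for the selected object."""
    if category == "proper name":                 # level 1: just this object
        return {selected}
    if category == "mountain":                    # level 2: all mountains
        return {o for o, a in SCENE.items() if a["kind"] == "mountain"}
    if category == "plants":                      # level 3: wider kind group
        return {o for o, a in SCENE.items() if a["kind"] in PLANT_KINDS}
    if category == "same hue":                    # level 4: widest, by hue
        hue = SCENE[selected]["hue"]
        return {o for o, a in SCENE.items() if a["hue"] == hue}
    raise ValueError(category)
```

Each successive level strictly widens the set of corrected objects, mirroring the four cases described above.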
  • The contents of the image correction are determined by parameters for the correction that are selected by the user through the keyboard 42 and the mouse 43.
  • Now the operation of the present embodiment will be described with reference to the flowchart shown in FIG. 7.
  • Images are photographed by the digital camera 10, so that image data of the photographed images are written on the memory card 28 along with additional data including the zoom ratio data, the date-of-photograph data, the location and orientation of the photograph, and identification data of photographed objects.
  • After the photography, the digital camera 10 is connected to the personal computer 2, to output the image data and the additional data to the personal computer 2. When the image processing program 47 is activated, the image window 50 is displayed on the monitor 41, and the image processing program 47 identifies the objects of the displayed image 51 on the basis of the additional data appended to the image data of the image 51.
  • When one of the objects of the displayed image 51 is selected by the user through the mouse 43, the monitor 41 displays the category designation window 60 and the content of image correction designation window 61. The user selects the category for the image correction from among several options displayed in the category designation window 60, and also designates the contents of the image correction in the content of image correction designation window 61. Then, the image data of those parts of the image 51 which correspond to the objects included in the selected category are processed for the image correction determined by the image processing program 47. At that time, the user selects parameters for the image correction through the keyboard 42 and the mouse 43, so that the image is corrected in accordance with the image correction parameters selected to reflect the user's intention and liking.
  • If there are any images related to the image 51 on which the image correction has been carried out, the same correction is applied, with the same parameters, to those parts of the related images which correspond to the objects included in the designated category. In this way, the image processing program 47 continues the image correction until all of the related images are processed, that is, until all of the image parts corresponding to the objects included in the designated category are processed.
  • According to the above described configuration, the user has only to designate the category of objects to correct, and the image parts corresponding to the objects included in the designated category are automatically processed for the image correction. In addition, the users can preset the contents of the image correction in accordance with their own memories, impressions and taste.
  • It is to be noted that the category may be designated after the selection of the contents of the image correction. It is also possible to designate the category without choosing any object. It is preferable to make those image parts which correspond to the objects determined to be the object of the correction distinctive on the image window 50, for example, by making those parts blink.
  • It is possible to provide the server 12 with map data that indicate locations of public constructions, such as electric wires, utility poles and pylons, so that public constructions in the photographed images may be identified with reference to the map data. Then, when the user merely selects an electric wire or a utility pole and erases or blurs it, the electric wires or utility poles in all relating images are erased or blurred simultaneously. In the same way, it is possible to erase or blur a particular object such as road signs or buildings. This erasing treatment may be applied to those objects which damage the beauty of photographed images, such as trash cans or wastepaper on the street. Because the position of the trash can or wastepaper within the image is not predetermined, a three-dimensional position of that object is estimated from the image, and the estimated position is compared with the data base stored in the server 12 so as to identify that object.
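A minimal sketch of the map-data lookup, under the assumption that the object's position has already been estimated from the camera's location, orientation and zoom ratio (the estimation itself is not shown). The map entries and tolerance are hypothetical:

```python
import math

# Hypothetical map data base of fixed public constructions; in the
# embodiment this would be stored in the server 12.
MAP_DATA = [
    {"id": "pole-17", "kind": "utility pole", "pos": (35.681, 139.767)},
    {"id": "pylon-3", "kind": "pylon", "pos": (35.690, 139.700)},
]

def identify_construction(estimated_pos, tolerance=0.01):
    """Return the nearest known construction within tolerance, else None.

    estimated_pos is the position estimated for the object in the image;
    matching it against the map data identifies what the object is.
    """
    best, best_dist = None, tolerance
    for entry in MAP_DATA:
        dist = math.dist(estimated_pos, entry["pos"])
        if dist < best_dist:
            best, best_dist = entry, dist
    return best

hit = identify_construction((35.682, 139.766))
```

Once identified, the same construction can be located (and erased or blurred) in every relating image without re-estimating it from scratch.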
  • It is desirable to merge an appropriate background image into each part from which the original object, e.g. a trash can, is erased. The appropriate background image may be retrieved from the image data base stored in the server 12. Alternatively, the erased part may be treated with pixel interpolation using pixels in the periphery of the erased part. If there is no appropriate background image in the data base, it is preferable to produce one using a conventional CG technique and merge the CG image into the erased part.
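The peripheral-pixel interpolation mentioned above can be illustrated with a deliberately simple scheme: each erased pixel is replaced by the mean of its non-erased neighbours, working from the periphery inwards. Real inpainting (or retrieving a background from the data base) would be far more elaborate; this sketch only conveys the idea.

```python
def fill_erased(image, erased, passes=4):
    """image: 2D list of grey values; erased: set of (row, col) to fill.

    Each pass fills the erased pixels that touch at least one known
    pixel, so the hole shrinks from its periphery inwards.
    """
    h, w = len(image), len(image[0])
    remaining = set(erased)
    for _ in range(passes):
        filled = set()
        for (r, c) in remaining:
            neighbours = [image[r + dr][c + dc]
                          for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
                          if 0 <= r + dr < h and 0 <= c + dc < w
                          and (r + dr, c + dc) not in remaining]
            if neighbours:
                image[r][c] = sum(neighbours) // len(neighbours)
                filled.add((r, c))
        remaining -= filled
        if not remaining:
            break
    return image

img = [[10, 10, 10],
       [10, 99, 10],   # 99 marks the erased trash can pixel
       [10, 10, 10]]
fill_erased(img, {(1, 1)})
```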
  • In order to reflect the user's taste, it is possible to collect data relating to the user's taste by displaying plural image samples on the monitor 41, which are substantially identical but corrected with gradually varied parameters. From among these samples, the user is required to select the most preferable one, and the results of the user's choices are accumulated for a certain period. The data relating to the user's taste may be derived from the accumulated results and stored as a data base in the server 12, so that the data may be read out when the image processing program 47 is activated. This configuration enables automatic retrieval of image correction parameters congenial to the user's taste, saving the user the time and labor of adjusting image correction parameters.
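A hedged sketch of the accumulation step: the patent only says the choices are "accumulated for a certain period" and the derived taste data are stored in the server, so the class below, its names, and the most-frequent-choice heuristic are all assumptions for illustration.

```python
from collections import Counter

class TasteProfile:
    """Accumulates which sample (i.e. which parameter set) the user
    picked, and derives a default parameter set from the history."""

    def __init__(self):
        self.choices = Counter()

    def record_choice(self, parameters):
        # Record the parameter set of the sample the user selected.
        self.choices[parameters] += 1

    def preferred(self):
        # Most frequently chosen parameter set, applied automatically
        # the next time the image processing program is activated.
        return self.choices.most_common(1)[0][0] if self.choices else None

profile = TasteProfile()
for picked in [("sharpness", 3), ("sharpness", 5), ("sharpness", 5)]:
    profile.record_choice(picked)
```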
  • On processing a flash-photographed image containing any person, it is possible to enlarge the face of a person on the monitor 41 in order to check whether the person suffers from the red-eye phenomenon. If so, the red-eye check is extended to other persons in the same image and to those persons in other images who exist at positions corresponding to that of the person determined to suffer the red-eye. Then the red-eye compensation is carried out on those persons who are determined to suffer the red-eye. Because the red-eye compensation is limited to the persons whose positions are determined, the image processing is further sped up.
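A simplistic illustration of the position-limited check: only eye regions at known person positions are examined, and a region counts as red-eye when red strongly dominates green and blue. The RGB test and its threshold are assumptions made for this sketch, not details from the patent.

```python
def has_red_eye(region, threshold=1.8):
    """region: list of (r, g, b) pixels sampled from an eye area."""
    red = sum(p[0] for p in region) / len(region)
    other = sum(p[1] + p[2] for p in region) / (2 * len(region))
    return red > threshold * max(other, 1)

def compensate(region):
    """Tone the red channel down to the mean of green and blue."""
    return [((g + b) // 2, g, b) for (_, g, b) in region]

# Check one flagged position; the same test would then be repeated only
# at the corresponding positions in the other images.
eye = [(200, 40, 40), (210, 50, 45)]
if has_red_eye(eye):
    eye = compensate(eye)
```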
  • Although the above embodiment automatically corrects the parts of the relating images which correspond to the corrected objects of the initially displayed image, it is possible to let the user choose whether or not to correct the corresponding objects of the relating images. Thereby, if the user knows that the relating images do not contain the same object as the object to correct, the process of extracting that object from the relating images, which is unnecessary in this case, is skipped to improve the efficiency.
  • The present invention is effective not only for correcting images photographed by the digital camera 10 but also for correcting images downloaded through the Internet 11, so far as their image data are accompanied by additional data for identifying individual objects.
  • Although the above embodiment refers to the personal computer 2 as the image processing apparatus, the present invention is also applicable to image scanners or printer-processors installed in photo shops.
  • Thus, the present invention is not to be limited to the above embodiment but, on the contrary, various modifications will be possible within the scope and spirit of the present invention as specified in the appended claims.

Claims (17)

  1. An image processing method comprising steps of:
    identifying objects corresponding to parts of a photographed image;
    determining plural levels of categories in accordance with one of said objects which corresponds to a selected part to correct in said image;
    selecting a category from among said plural levels of categories; and
    carrying out same image correction on said selected part and those parts of said image and relating images to said image, which correspond to objects included in said selected category.
  2. An image processing method as claimed in claim 1, wherein it is possible to choose whether to carry out said image correction on said relating images or not.
  3. An image processing method as claimed in claim 1, wherein IC tags storing identification data of said objects are appended to said objects, and said objects are identified by reading out said identification data from said IC tags.
  4. An image processing method as claimed in claim 1, wherein said objects are identified by retrieving identification data of said objects from a data base on the basis of photographic data appended to said image.
  5. An image processing method comprising steps of:
    A. identifying objects photographed in digital photographic images;
    B. selecting a part to correct in an image;
    C. selecting a category from among plural levels of categories determined in accordance with an object corresponding to said selected part to correct;
    D. extracting those parts from said image, which correspond to objects included in said selected category;
    E. correcting said extracted parts in accordance with designated parameters;
    F. extracting from relating images to said image, corresponding parts to said objects included in said selected category; and
    G. correcting said corresponding parts of said relating images in the same way as said extracted parts of said image.
  6. An image processing method as claimed in claim 5, further comprising, before the step F, a step of choosing between correcting said relating images or not, wherein the steps F and G are omitted if it is not chosen to correct said relating images.
  7. An image processing method as claimed in claim 5, wherein the step A comprises steps of:
    appending IC tags to some objects, said IC tags being written with identification data of respective objects;
    reading out said identification data from at least one of said IC tags which exists in a photographic field at each photography, to store said identification data in association with image data obtained at each photography; and
    identifying photographed objects with reference to said stored identification data.
  8. An image processing method as claimed in claim 5, wherein the step A comprises steps of retrieving identification data of photographed objects from a data base on the basis of photographic data appended to image data of individual images, and identifying said photographed objects with reference to said retrieved identification data.
  9. An image processing method as claimed in claim 8, wherein said photographic data include at least one of location and orientation of photography, date and time of photography, and a zoom ratio.
  10. An image processing apparatus comprising:
    an identifying device for identifying objects corresponding to parts of photographed images;
    a determining device for determining plural levels of categories in accordance with one of said objects which corresponds to a selected part to correct in one of said images;
    a selecting device for selecting a category from among said plural levels of categories; and
    a correction device for carrying out same image correction on said selected part and those parts of said one image and images relating to said one image, which correspond to objects included in said selected category.
  11. An image processing apparatus as claimed in claim 10, further comprising a device of allowing to choose whether to carry out said image correction on said relating images or not.
  12. An image processing apparatus as claimed in claim 10, wherein said identifying device identifies photographed objects by reading out identification data of said objects from IC tags appended to said objects.
  13. An image processing apparatus as claimed in claim 10, wherein said identifying device identifies said photographed objects by retrieving identification data of said objects from a data base on the basis of photographic data appended to said image.
  14. An image processing program comprising steps of:
    A. identifying objects photographed in an image;
    B. determining plural levels of categories in accordance with an object that corresponds to a part selected to correct in said image;
    C. extracting those parts from said image, which correspond to objects included in a category selected from among said plurality of levels of categories;
    D. correcting said extracted parts in accordance with designated parameters;
    E. identifying those objects which are included in said selected category with respect to relating images to said image; and
    F. correcting corresponding parts of said relating images to said identified objects in the same way as said extracted parts of said image.
  15. An image processing program as claimed in claim 14, further comprising a step of choosing between correcting said relating images or not, wherein if it is not chosen to correct said relating images, said program does not proceed to the steps E and F.
  16. An image processing program as claimed in claim 14, wherein photographed objects are identified with reference to identification data of said objects that are read out from IC tags appended to said objects.
  17. An image processing program as claimed in claim 14, wherein photographed objects are identified with reference to identification data of said objects, which are retrieved from a data base on the basis of photographic data appended to respective image.
US10919314 2003-08-18 2004-08-17 Image processing method, image processing apparatus and image processing program Abandoned US20050041103A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2003-294597 2003-08-18
JP2003294597A JP4279083B2 (en) 2003-08-18 2003-08-18 Image processing method and apparatus, and image processing program

Publications (1)

Publication Number Publication Date
US20050041103A1 (en) 2005-02-24

Family

ID=34191050

Family Applications (1)

Application Number Title Priority Date Filing Date
US10919314 Abandoned US20050041103A1 (en) 2003-08-18 2004-08-17 Image processing method, image processing apparatus and image processing program

Country Status (2)

Country Link
US (1) US20050041103A1 (en)
JP (1) JP4279083B2 (en)


Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020045988A1 (en) * 2000-09-25 2002-04-18 International Business Machines Corporation Spatial information using system, system for obtaining information, and server system
US6389169B1 (en) * 1998-06-08 2002-05-14 Lawrence W. Stark Intelligent systems and methods for processing image data based upon anticipated regions of visual interest
US20020059581A1 (en) * 1994-09-14 2002-05-16 Time Warner Entertainment Company, L.P. Video-on-demand service with an interactive interface for facilitating viewer selection of video programs
US20020085001A1 (en) * 2000-10-06 2002-07-04 Taylor Richard Ian Image processing apparatus
US20020167538A1 (en) * 2001-05-11 2002-11-14 Bhetanabhotla Murthy N. Flexible organization of information using multiple hierarchical categories
US6690828B2 (en) * 2001-04-09 2004-02-10 Gary Elliott Meyers Method for representing and comparing digital images
US6748097B1 (en) * 2000-06-23 2004-06-08 Eastman Kodak Company Method for varying the number, size, and magnification of photographic prints based on image emphasis and appeal
US6751363B1 (en) * 1999-08-10 2004-06-15 Lucent Technologies Inc. Methods of imaging based on wavelet retrieval of scenes
US20040126038A1 (en) * 2002-12-31 2004-07-01 France Telecom Research And Development Llc Method and system for automated annotation and retrieval of remote digital content
US6798921B2 (en) * 1998-03-19 2004-09-28 Fuji Photo Film Co., Ltd. Method for image designating and modifying process
US20050229799A1 (en) * 2004-03-23 2005-10-20 Fuji Photo Film Co., Ltd. Memory element mounting method and image forming apparatus
US7027655B2 (en) * 2001-03-29 2006-04-11 Electronics For Imaging, Inc. Digital image compression with spatially varying quality levels determined by identifying areas of interest
US20060086794A1 (en) * 1999-06-07 2006-04-27 Metrologic Instruments, Inc.. X-radiation scanning system having an automatic object identification and attribute information acquisition and linking mechanism integrated therein
US7050630B2 (en) * 2002-05-29 2006-05-23 Hewlett-Packard Development Company, L.P. System and method of locating a non-textual region of an electronic document or image that matches a user-defined description of the region
US7197493B2 (en) * 2001-12-21 2007-03-27 Lifestory Productions, Inc. Collection management database of arbitrary schema


Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070255456A1 (en) * 2004-09-07 2007-11-01 Chisato Funayama Image Processing System and Method, and Terminal and Server Used for the Same
US7929796B2 (en) * 2004-09-07 2011-04-19 Nec Corporation Image processing system and method, and terminal and server used for the same
US20060268112A1 (en) * 2005-05-26 2006-11-30 Sony Corporation Imaging device and method, computer program product on computer-readable medium, and imaging system
US8179442B2 (en) * 2005-05-26 2012-05-15 Sony Corporation Imaging device and method for performing surveillance by infrared radiation measurement
US20080151059A1 (en) * 2006-12-26 2008-06-26 Canon Kabushiki Kaisha Image pickup apparatus, method for controlling image pickup apparatus, and storage medium
US8102439B2 (en) * 2006-12-26 2012-01-24 Canon Kabushiki Kaisha Image pickup apparatus comprising a generating unit configured to generate an identifier, method for controlling image pickup apparatus, and storage medium
EP2613515A1 (en) * 2008-04-22 2013-07-10 Sony Corporation Offloading processing of images from a portable digital camera
US8542285B2 (en) 2008-04-22 2013-09-24 Sony Corporation Offloading processing of images from a portable digital camera
US8643727B2 (en) * 2008-06-18 2014-02-04 Sanyo Electric Co., Ltd. Electronic device related to automatic time setting
US20090316004A1 (en) * 2008-06-18 2009-12-24 Sanyo Electric Co., Ltd. Electronic Device
US8830347B2 (en) 2008-08-29 2014-09-09 Adobe Systems Incorporated Metadata based alignment of distorted images
US8391640B1 (en) 2008-08-29 2013-03-05 Adobe Systems Incorporated Method and apparatus for aligning and unwarping distorted images
US8368773B1 (en) 2008-08-29 2013-02-05 Adobe Systems Incorporated Metadata-driven method and apparatus for automatically aligning distorted images
US20130124471A1 (en) * 2008-08-29 2013-05-16 Simon Chen Metadata-Driven Method and Apparatus for Multi-Image Processing
US8340453B1 (en) 2008-08-29 2012-12-25 Adobe Systems Incorporated Metadata-driven method and apparatus for constraining solution space in image processing techniques
US8842190B2 (en) 2008-08-29 2014-09-23 Adobe Systems Incorporated Method and apparatus for determining sensor format factors from image metadata
US8724007B2 (en) * 2008-08-29 2014-05-13 Adobe Systems Incorporated Metadata-driven method and apparatus for multi-image processing
US10068317B2 (en) 2008-08-29 2018-09-04 Adobe Systems Incorporated Metadata-driven method and apparatus for constraining solution space in image processing techniques
US8675988B2 (en) 2008-08-29 2014-03-18 Adobe Systems Incorporated Metadata-driven method and apparatus for constraining solution space in image processing techniques
US20120086828A1 (en) * 2010-10-09 2012-04-12 Yan Li White balance method and white balance device
US8400523B2 (en) * 2010-10-09 2013-03-19 Ricoh Company, Ltd. White balance method and white balance device
US8547449B2 (en) * 2011-03-18 2013-10-01 Casio Computer Co., Ltd. Image processing apparatus with function for specifying image quality, and method and storage medium
US8760534B2 (en) 2011-03-18 2014-06-24 Casio Computer Co., Ltd. Image processing apparatus with function for specifying image quality, and method and storage medium
US20120236162A1 (en) * 2011-03-18 2012-09-20 Casio Computer Co., Ltd. Image processing apparatus with function for specifying image quality, and method and storage medium

Also Published As

Publication number Publication date Type
JP2005065048A (en) 2005-03-10 application
JP4279083B2 (en) 2009-06-17 grant

Similar Documents

Publication Publication Date Title
US7916897B2 (en) Face tracking for controlling imaging parameters
US8064710B2 (en) Image processing apparatus, method of controlling thereof, and program
US20060078224A1 (en) Image combination device, image combination method, image combination program, and recording medium containing the image combination program
US20060268150A1 (en) Photography apparatus, photography method, and photography program
US7206022B2 (en) Camera system with eye monitoring
US20020093670A1 (en) Doubleprint photofinishing service with the second print having subject content-based modifications
US7403643B2 (en) Real-time face tracking in a digital image acquisition device
US20050025387A1 (en) Method and computer program product for producing an image of a desired aspect ratio
US7035462B2 (en) Apparatus and method for processing digital images having eye color defects
US20090135269A1 (en) Electronic Camera and Image Processing Device
US6711291B1 (en) Method for automatic text placement in digital images
US20070255456A1 (en) Image Processing System and Method, and Terminal and Server Used for the Same
US20040236791A1 (en) Image searching method and image processing method
US7620218B2 (en) Real-time face tracking with reference images
US20080218603A1 (en) Imaging apparatus and control method thereof
US20040120606A1 (en) Imaging method and system for determining an area of importance in an archival image
US20050088542A1 (en) System and method for displaying an image composition template
US20040258304A1 (en) Apparatus and program for selecting photographic images
US7133571B2 (en) Automated cropping of electronic images
US20070077025A1 (en) Apparatus, method and program for image search
US8081227B1 (en) Image quality visual indicator
US20080220750A1 (en) Face Categorization and Annotation of a Mobile Phone Contact List
US20070071316A1 (en) Image correcting method and image correcting system
US20040247175A1 (en) Image processing method, image capturing apparatus, image processing apparatus and image recording apparatus
US20060280380A1 (en) Apparatus, method, and program for image processing

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJI PHOTO FILM CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KINJO, NAOTO;REEL/FRAME:015702/0309

Effective date: 20040611

AS Assignment

Owner name: FUJIFILM CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:FUJIFILM HOLDINGS CORPORATION (FORMERLY FUJI PHOTO FILM CO., LTD.);REEL/FRAME:018904/0001

Effective date: 20070130
