JP2012079337A - Image arrangement device, method and program - Google Patents

Image arrangement device, method and program

Info

Publication number
JP2012079337A
JP2012079337A
Authority
JP
Japan
Prior art keywords
organizing
image
condition
images
arrangement
Prior art date
Legal status
Granted
Application number
JP2011277615A
Other languages
Japanese (ja)
Other versions
JP5485254B2 (en)
Inventor
Akira Yoda
Original Assignee
Fujifilm Corp
Priority date
Filing date
Publication date
Application filed by Fujifilm Corp
Priority to JP2011277615A
Publication of JP2012079337A
Application granted
Publication of JP5485254B2
Legal status: Active
Anticipated expiration

Abstract

An image organizing apparatus that makes it easy to group a plurality of input images by each person appearing in those images.
The apparatus comprises: display means for displaying, for each person, a face image of a person included in the plurality of input images; organizing condition setting means for accepting a user's selection of a desired face image from the displayed face images, and thereby accepting the setting of an organizing condition for grouping the input images by the person of the selected face image; and image organizing means for grouping the input images, based on the accepted organizing condition, by each person whose face image is included in the condition.
[Selected drawing] FIG. 7

Description

  The present invention relates to an image organizing apparatus and method for classifying and organizing some or all of a plurality of images into one or more groups, and to a program for causing a computer to execute control that realizes the image organizing method.

  Compared with a silver halide (film) camera, a digital camera tends to be used to shoot far more images, since shooting incurs no film cost. The growing capacity of the recording media that can be attached to and detached from digital cameras reinforces this trend. As a result, a very large number of images accumulate on the recording media of digital cameras and on storage media, such as the hard disks and CD-Rs of personal computers, to which the image data read from those recording media is saved. In many cases necessary images (good shots) and unnecessary images (failed shots, duplicates, and so on) are mixed together, and the collection remains unorganized.

  Organizing such a large number of images is a very troublesome task. Devices have therefore been proposed that support searching for and extracting necessary images from such an image stock, extracting and deleting unnecessary images, and classifying and organizing the images in the stock from viewpoints such as event, date/time, and location.

  For example, it has been proposed to classify a plurality of images automatically from the viewpoint of time and event (image similarity based on image analysis; for example, Patent Document 1), from the viewpoint of shooting location and shooting date (for example, Patent Document 2), and from the viewpoint of secondary information (weather, country name, etc.) derived from accompanying information such as the shooting date/time and GPS information (for example, Patent Document 3).

  It has also been proposed that, for each user, a desired classification pattern be defined as a classification condition and stored in the digital camera; when the user takes a picture with the camera, the classification condition is written into the tag of the photographed image, so that images photographed by the user are classified according to his or her own classification condition (for example, Patent Document 4).

Patent Document 1: JP 2000-112997 A
Patent Document 2: JP 2005-037992 A
Patent Document 3: JP 2003-271617 A
Patent Document 4: JP 2004-118573 A

  In the automatic classification methods described in Patent Documents 1 to 3, the classification conditions are fixed regardless of the user, so organization is uniform and does not necessarily match the viewpoint from which the user wants the images classified. The technique described in Patent Document 4 allows classification from the user's desired viewpoint, but associating an image with a classification condition every time a picture is taken is troublesome.

  The present invention has been made in view of the above circumstances, and its object is to provide an apparatus, a method, and a program that realize automatic organization of images matched to the organization the user desires, without requiring troublesome operations by the user.

  The image organizing method of the present invention stores, for each user, an organizing condition used for organizing a plurality of images based on the content of each input image and/or the incidental attributes of each image. When a plurality of new images are to be organized, the stored organizing condition of the user performing the organization is acquired, and the new images are organized based on the acquired organizing condition.

  The image organizing apparatus of the present invention realizes the above image organizing method. In its first aspect, an image organizing apparatus having image organizing means for organizing a plurality of images based on the content of each input image and/or the incidental attributes of each image further comprises: organizing condition storage means for storing the organizing condition used when the image organizing means performs organization, in association with user identification information identifying a user of the apparatus; user identification information receiving means for receiving input of user identification information; and organizing condition acquisition means for acquiring, from the organizing condition storage means, the organizing condition associated with the input user identification information. The image organizing means organizes new images based on the acquired organizing condition.

  In a second aspect of the image organizing apparatus of the present invention, an apparatus having image organizing means for organizing a plurality of images based on the content of each input image and/or the incidental attributes of each image further comprises: organizing condition output means for storing the organizing condition used when the image organizing means performs organization in a storage medium that is held by the user of the apparatus and is readable and writable by the apparatus; and organizing condition acquisition means for acquiring the organizing condition from the storage medium. The image organizing means organizes new images based on the acquired organizing condition.

  The image organizing program of the present invention causes a computer to execute the image organizing method, in other words, causes the computer to function as the image organizing apparatus. That is, the first form of the image organizing program causes a computer to function as image organizing means for organizing a plurality of images based on the content of each input image and/or the incidental attributes of each image, and further causes the computer to function as: organizing condition storage means for storing in a storage device, in association with user identification information identifying the user, the organizing condition used when the image organizing means performs organization; user identification information receiving means for receiving input of user identification information; and organizing condition acquisition means for acquiring, from the storage device, the organizing condition associated with the input user identification information. The image organizing means is made to organize new images based on the acquired organizing condition.

  The second form of the image organizing program causes a computer to function as image organizing means for organizing a plurality of images based on the content of each input image and/or the incidental attributes of each image, and further causes the computer to function as: organizing condition output means for storing the organizing condition used when the image organizing means performs organization in a computer-readable storage medium held by the user of the computer; and organizing condition acquisition means for acquiring the organizing condition from the storage medium. The image organizing means is made to organize new images based on the acquired organizing condition.

  Next, details of the image organizing apparatus, method, and program of the present invention will be described.

  Image “organization” may include at least one of a process of extracting some or all images from a plurality of images, a process of classifying the plurality of images into one or more groups, and a process of rearranging the plurality of images.

  Accordingly, the “organizing condition” may include at least one of an extraction condition for image extraction, a classification condition for image classification, and a rearrangement condition for image rearrangement. The extraction, classification, and rearrangement conditions are defined using one or more kinds of content and/or incidental attributes (described later) of the input images.

  Here, the “content” of an image is information obtained by image analysis. In addition to information such as the presence or absence of a subject such as a human face, the shooting conditions, and the image quality, it can include information obtained by collation with a reference dictionary, such as the name of a person in the image or the presence or absence of a specific subject.

  Specific examples of the “incidental attributes” of an image include: information recorded at shooting time, such as the shooting date/time, shooting location, imaging device model, and shooting conditions (shooting mode, shutter speed, aperture, strobe settings, etc.); information recorded in association with the image mainly by manual input after shooting, such as the image title, keywords, and photographer name; and information updated in association with the image, such as the number of times the image has been referenced and the reference time, i.e., information indicating the access frequency of the image. Such information can be recorded in association with the image data, for example as tags based on the Exif standard. Shooting location information can be acquired, for example, by means provided in the imaging device for acquiring GPS information (latitude, longitude) or position information in mobile communication (base station information, etc.).
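As a concrete illustration of reading such Exif-based incidental attributes, here is a minimal sketch assuming the Pillow library is available; the file name is hypothetical.

```python
# A minimal sketch of reading the "incidental attributes" described above
# from Exif tags, assuming the Pillow library.
from PIL import Image
from PIL.ExifTags import TAGS

def read_exif_attributes(path):
    """Return the Exif tags of an image file as a {name: value} dict."""
    exif = Image.open(path).getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

# Example: a shooting date/time, used throughout this patent as a key
# attribute, is typically recorded under "DateTime" (tag 306).
attrs = read_exif_attributes("photo.jpg")  # hypothetical file name
print(attrs.get("DateTime"))
```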

  Specific examples of “user identification information” include a string of characters, symbols, or numbers such as a user ID; attribute information such as the user's name, date of birth, and telephone number; and biometric information such as the user's fingerprint, iris, face, and signature.

  The “input of user identification information” may be performed, for example, manually by the user with an input device such as a keyboard; if the user identification information is stored on a barcode, magnetic stripe, IC, or the like of a recording medium such as a card, it may be read with a reading device such as a card reader; or it may be input and read with a biometric information reading device.

  As specific examples of “a storage medium held by the user and readable and writable by the apparatus” in the second aspect of the present invention, an IC card that the image organizing apparatus can read and write is conceivable, as is a memory card, CD-ROM, or the like on which the images input to the image organizing apparatus as organization targets are recorded.

  An aspect of the present invention may allow a plurality of organizing conditions to be stored for each user; that is, for each user, the organizing condition of each of a plurality of organization patterns may be stored. For example, in the first form of the image organizing apparatus, a plurality of organizing conditions are stored in association with the user identification information of one user, the plurality of organizing conditions associated with the input user identification information are acquired, and organization is performed in a plurality of patterns based on each of the acquired conditions. In the second form, the organizing conditions of a plurality of organization patterns are stored in the storage medium; when new images are organized, the plurality of organizing conditions are acquired and organization is performed in a plurality of patterns based on each of them.

  In the above aspect of organizing in a plurality of patterns, it is preferable to evaluate the preferability of each organization result. Here, the “preferability of an organization result” can be evaluated, for example, based on at least one of the number of extracted images, the number of classified groups, the number of images per group, and the degree of bias in the number of images between groups.

  Furthermore, it is preferable to display organization results evaluated as more preferable with priority. Here, “priority display” may mean, for example, displaying the organization results in descending order of evaluated preferability, displaying each result together with an evaluation value indicating its preferability, displaying the result evaluated as more preferable in a manner distinguishable from the other results, or displaying only the result evaluated as most preferable.

  According to the present invention, the organizing condition used when a user organized images in the past is stored for each user; when new images are organized, the stored organizing condition of the user performing the organization is acquired, and the new images are organized based on it. Compared with the conventional approach in which images are organized under a uniform condition regardless of the user, this not only enables automatic organization that matches the classification viewpoint the user desires, but also eliminates the need to associate an organizing condition with each image at shooting time, enabling automatic organization without user effort.

  In addition, if a plurality of organizing conditions can be stored for each user and images are organized in a plurality of patterns based on each acquired condition, the diverse image-organization demands of users can be met more flexibly.

  Furthermore, when the preferability of the results of organizing in a plurality of patterns is evaluated, the user can more easily judge which of the results is preferable. This effect becomes even more pronounced if the results evaluated as more preferable are displayed with priority.

FIG. 1 is an external perspective view of an order receiving apparatus to which an image organizing apparatus according to a first embodiment of the present invention is applied.
FIG. 2 is a schematic block diagram showing the configuration of the order receiving apparatus according to the first embodiment.
FIG. 3 is a diagram showing the initial screen.
FIG. 4 is a diagram schematically showing the main data flows and functional blocks in the image organization process according to the first embodiment.
FIG. 5 is a flowchart showing the flow of the image organization process according to the first embodiment.
FIG. 6 is a flowchart showing the flow of the image organization process according to the first embodiment (continued).
FIG. 7 is a diagram showing an example of the organizing condition setting screen.
FIG. 8 is a diagram showing an example of the organizing condition table.
FIG. 9 is a diagram showing the nature of the comprehensive feature amount.
FIG. 10 is a diagram showing an example of the organization result display screen.
FIG. 11 is a diagram schematically showing the main data flows and functional blocks in the image organization process according to a second embodiment of the present invention.

  Hereinafter, embodiments of the present invention will be described with reference to the drawings.

  FIG. 1 is an external perspective view of a photo print order receiving apparatus provided with an image organizing apparatus according to the first embodiment of the present invention. As shown in FIG. 1, the order receiving apparatus 1 is installed in a photo shop and receives print orders from users. The apparatus includes a plurality of card slots 4 for reading images from, and recording images to, various types of memory card 2 on which images are recorded; a card reader 5 for reading a magnetic card 3 on which a customer ID is recorded; and a display unit 6 for performing the various displays needed for print ordering. The order receiving apparatus 1 is connected via a network to a printer 8, which performs photo printing based on customer orders, and to a digital image controller (DIC) 10, which performs image processing and manages print orders. The display unit 6 includes a touch-panel input unit 18, and the user can make the inputs needed for print ordering and image organization by touching the display unit 6 in accordance with what is displayed on it.

  FIG. 2 is a schematic block diagram showing the configuration of the order receiving apparatus 1 according to this embodiment. As shown in FIG. 2, the order receiving apparatus 1 includes a CPU 12 that controls each part of the apparatus 1 and performs various controls such as recording control and display control of image data; a system memory 14 composed of a ROM, in which the basic program for operating the apparatus 1 and various coefficients are recorded, and a RAM, which serves as a work area when the CPU 12 executes processing; the touch-panel input unit 18 for giving various instructions to the apparatus 1; and the display unit 6 described above.

  The order receiving apparatus 1 further includes the card slot 4 described above; the card reader 5; a hard disk 24 that stores the images read from the memory card 2 in the card slot 4, the various programs executed by the CPU 12 for ordering and organizing images, and reference data such as organizing conditions; a controller that controls the system memory 14, the card slot 4, the card reader 5, and the hard disk 24; a display control unit 26 that controls display on the display unit 6; an input control unit 22 that controls input through the input unit 18; and a network interface 30 through which the apparatus 1 communicates with the printer 8 and the DIC 10 via the network.

  Although a plurality of card slots 4 are provided according to the type of memory card 2, only one card slot 4 is shown in FIG. 2.

  The processing described later performed in the order receiving apparatus 1 is realized by executing various programs stored on the hard disk 24. These programs are installed onto the hard disk 24 from a computer-readable recording medium, such as a CD-ROM, on which they are recorded. They include a main program that controls the overall processing of the order receiving apparatus 1 and subprograms, called from the main program as needed, that perform the order processing and the image organization processing.

A print order on the order receiving apparatus 1 proceeds as follows. FIG. 3 shows the initial screen displayed on the display unit 6; this screen is displayed under the control of the main program. As shown in FIG. 3, the initial screen 40 displays a print order button 40A for ordering prints and an image organization button 40B for organizing images as described later. When the user selects the print order button 40A, the CPU 12 calls the print order subprogram from the main program and executes it. When the user loads a memory card 2 on which a plurality of images are recorded into the card slot 4 in accordance with the on-screen instructions, the apparatus 1 reads the images from the memory card 2, temporarily stores them on the hard disk 24, and displays a list of them on the display unit 6. The user selects the images to print from the list, sets the order quantity and print size, and enters them through the touch-panel input unit 18. When the user issues a print execution instruction through the input unit 18, the selected images and the order information indicating the quantity and size are transmitted to the DIC 10, which applies the image processing necessary to improve image quality; the ordered images are then printed out from the printer 8 in the quantity and size specified by the order information.

  Next, the image organization process in this embodiment will be described. This process is realized when the user selects the image organization button 40B on the initial screen of FIG. 3, which calls the image organization subprogram from the main program and executes it.

  FIG. 4 is a block diagram schematically showing the main data flows and functions in the image organization process according to the first embodiment. As shown in the figure, the process is realized by: a customer ID input unit 51 that acquires the customer ID; an image input unit 52 that receives image input; an organizing condition setting mode selection unit 61 that receives the selection of the organizing condition setting mode; an organizing condition acquisition unit 53 that, when “automatic” is selected as the setting mode, acquires from the organizing condition table T1 the organizing condition associated with the customer ID acquired by the customer ID input unit 51; an organizing condition input unit 62 that, when “manual” is selected, receives the input of an organizing condition; an image organizing unit 54 that organizes the input images based on the acquired or input organizing condition, obtaining person feature amounts from the person information table T2 as necessary; an organization result evaluation unit 55 that evaluates the preferability of each organization result; an organization result display unit 56 that displays the organization results on the display unit 6 based on the evaluation; an organization result correction unit 63 that accepts corrections to a displayed organization result; an organization result selection unit 57 that accepts the selection of an organization result; an organization result recording unit 58 that records the selected organization result on the memory card 2; and an organizing condition registration unit 59 that registers the organizing condition corresponding to the selected organization result in the organizing condition table T1.

  Here, the customer ID input unit 51 displays on the display unit 6 a message prompting insertion of the magnetic card 3 into the card reader 5, such as “Please insert the magnetic card”. When the magnetic card 3 is inserted into the card reader 5, it reads the inserted magnetic card 3 and obtains the customer ID.

  The image input unit 52 displays on the display unit 6 a message prompting insertion of the memory card 2 into the card slot 4, such as “Please insert a memory card”. When the memory card 2 is inserted into the card slot 4, it reads the image files from the inserted memory card 2 and temporarily stores them on the hard disk 24 of the apparatus 1.

  The organizing condition setting mode selection unit 61 displays the organizing condition setting mode selection screen on the display unit 6. This screen includes a message such as “Please select a setting mode for the image organizing conditions” and buttons for selecting “automatic” or “manual”, and the unit accepts the selection of either mode.

  In the organizing condition table T1, as shown in FIG. 8, image organizing conditions each consisting of a customer ID, an organization title, an image extraction condition, an image grouping condition, and an image sorting condition are registered in association with one another. One image organizing condition (organization pattern) is identified by the combination of a customer ID and a serial number; that is, a plurality of image organizing conditions can be associated with one customer ID by varying the serial number. Note that the customer ID “ZZZ9999” in the figure is not associated with an actual customer but with the default organizing condition of the image organizing apparatus. Specific examples of image organization processing according to conditions set in the organizing condition table T1 are described later.
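To make the structure of the organizing condition table T1 concrete, here is a minimal sketch assuming, hypothetically, an in-memory mapping keyed by (customer ID, serial number); all field names and values are illustrative, not taken from the patent's table layout.

```python
# A minimal sketch of the organizing condition table T1 described above.
from dataclasses import dataclass, field

@dataclass
class OrganizingCondition:
    title: str
    extraction: dict = field(default_factory=dict)  # image extraction condition
    grouping: list = field(default_factory=list)    # grouping keys, in priority order
    sorting: list = field(default_factory=list)     # sorting keys, in priority order

# One customer ID may carry several conditions, distinguished by serial number.
T1 = {
    ("ABC0001", 1): OrganizingCondition(
        title="NAME's growth record (YY/MM-1 - YY/MM-2)",  # variables, per the text
        extraction={"person_id": "ABC0001_01"},
        grouping=["shooting_year", "shooting_month"],
        sorting=["shooting_year", "shooting_month", "shooting_day", "shooting_time"],
    ),
    # "ZZZ9999" holds the apparatus-wide default condition, per the text.
    ("ZZZ9999", 1): OrganizingCondition(title="Default", grouping=["shooting_date"]),
}
```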

  When “automatic” is selected as the organizing condition setting mode in the organizing condition setting mode selection unit 61, the organizing condition acquisition unit 53 randomly accesses the organizing condition table T1 using the customer ID acquired by the customer ID input unit 51 as a search key, and acquires one organizing condition associated with that customer. At this time the organizing condition acquisition unit 53 stores the access position (the address of the record in the organizing condition table T1); when a plurality of organizing conditions are registered for the customer ID, the next condition associated with that ID can then be read by sequential access from the current read position rather than by another random access.

  When “manual” is selected as the organizing condition setting mode in the organizing condition setting mode selection unit 61, the organizing condition input unit 62 displays a screen for setting the organizing condition on the display unit 6 and accepts input of the organizing condition by the user. FIG. 7 shows an example of the organizing condition setting screen. As shown in the figure, the organizing condition setting screen 41 includes an extraction condition setting area 41A, a grouping condition setting area 41E, and a sort condition setting area 41K.

  In the extraction condition setting area 41A, the items 41C that can be set as extraction conditions are listed, and the user selects whether to use an item as an extraction condition by touching its check box 41B. Touching the set/change button 41D displays a screen for setting the item's detailed conditions. For example, when the set/change button 41D for shooting date/time is touched, the organizing condition input unit 62 refers to the incidental information of the image files input by the image input unit 52, identifies the oldest and newest shooting dates/times, and displays on the display unit 6 a detailed setting screen showing, in calendar form, the shooting dates within the shooting period of the input images. The user touches the start and end of the shooting period of the images to be extracted, whereby the extraction condition for shooting date/time is set, and the set shooting period is displayed at the position of the shooting date/time item 41C in the extraction condition setting area 41A of the organizing condition setting screen 41.

  When the set/change button 41D for a person is touched, the organizing condition input unit 62 reads the image files input by the image input unit 52 one by one, detects the face regions in the images, determines the similarity of the detected faces, groups similar face images, and determines a representative face image for each group; it then displays on the display unit 6 a detailed setting screen listing thumbnail images of the representative faces. When the user touches a desired thumbnail image, the selection of a face to be included in the images to be extracted is accepted, and the thumbnail of the selected face is displayed at the position of the person item 41C in the extraction condition setting area 41A of the organizing condition setting screen 41. Here, known techniques can be used: for face region detection, a method using a discriminator obtained by machine learning such as AdaBoost (for example, Japanese Patent Laid-Open No. 2005-108195); for face similarity determination, a method of computing feature vectors (feature amounts) representing the position, shape, size, etc. of feature points such as the eyes, nose, and mouth extracted from the face region, and grouping the face images based on the statistical distance between the feature vectors (for example, Japanese Patent Laid-Open No. 09-251534); and for representative face determination, a method of selecting the image with the most preferable image quality based on the edge and skin-color components of the face images in the group (for example, Japanese Patent Laid-Open No. 2005-49968 and No. 2005-122721). The feature amounts of the representative face images of the groups are registered in the person information table T2 in association with the customer ID acquired by the customer ID input unit 51. Since a plurality of facial feature amounts are thus associated with one customer ID, each facial feature amount can be identified by a person ID combining the customer ID and a serial number.

If facial feature amounts are already registered for the customer ID, the similarity between each feature amount to be registered and the already registered feature amounts is determined; a feature amount judged similar to an existing one is not newly registered, while one judged dissimilar is newly registered.
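As an illustration of the face-grouping logic just described, here is a minimal sketch. It assumes the feature vectors (positions, shapes, sizes of eyes, nose, mouth, etc.) have already been extracted as NumPy arrays; the greedy nearest-centroid merging and the threshold are illustrative stand-ins for the statistical-distance methods cited above.

```python
# A minimal sketch of grouping faces by the distance between feature vectors.
import numpy as np

SIMILARITY_THRESHOLD = 0.6  # hypothetical distance threshold

def group_faces(feature_vectors):
    """Group face feature vectors; returns a list of lists of indices."""
    groups = []      # each group: list of indices into feature_vectors
    centroids = []   # running mean feature vector per group
    for i, vec in enumerate(feature_vectors):
        distances = [np.linalg.norm(vec - c) for c in centroids]
        if distances and min(distances) <= SIMILARITY_THRESHOLD:
            g = int(np.argmin(distances))          # join the closest group
            groups[g].append(i)
            members = np.array([feature_vectors[j] for j in groups[g]])
            centroids[g] = members.mean(axis=0)    # update the group centroid
        else:                                      # no similar group: start a new one
            groups.append([i])
            centroids.append(vec.astype(float))
    return groups
```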

  The grouping condition setting area 41E displays a list of the items 41F that can be used as grouping keys and accepts the selection of grouping keys when the user touches the check box 41G of each item. When the user touches the “↑” button 41H or “↓” button 41J of an item, the setting of the grouping priority is accepted.

  Like the grouping condition setting area 41E, the sorting condition setting area 41K displays a list of the items 41L that can be used as sorting keys, and accepts: the selection of sorting key items when the user touches a check box 41M; the setting of the priority of the sorting key items by touching the “↑” button 41N or “↓” button 41P; and the selection of ascending or descending order by touching the ascending button 41Q or descending button 41R. The ascending button 41Q and descending button 41R are displayed so that the selected one is highlighted. The organizing conditions manually input through the organizing condition input unit 62 are stored in a predetermined storage area of the system memory 14 and used in the organization processing of the image organizing unit 54 described later.

  Based on the acquired or input organizing condition, that is, the extraction condition, the grouping condition, and the sorting condition, the image organizing unit 54 uses the incidental information of the input images and the feature amounts obtained by analyzing them as key information for extraction, grouping, and sorting, obtaining person feature amounts from the person information table T2 as necessary. It performs: an extraction process that extracts, from the input images, the images satisfying the extraction condition; a grouping process that classifies the input/extracted images into one or more groups so as to satisfy the grouping condition; and a sorting process that rearranges the input/extracted images, or each group produced by the grouping process, so as to satisfy the sorting condition. Each group produced by the grouping process is associated with a logical storage location of the image files, that is, a folder (directory path).

  Specifically, in the image extraction process, when the extraction condition is set to extract images shot during a predetermined period, for example, the image organizing unit 54 obtains the shooting date/time recorded in the Exif tag of each input image file and extracts only the image files whose shooting date/time falls within the set period.
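A minimal sketch of this shooting-period filter, assuming the Exif date/time strings have already been parsed into datetime values (as with the earlier Exif-reading sketch); the names and sample data are illustrative.

```python
# A minimal sketch of extraction by shooting period.
from datetime import datetime

def extract_by_period(images, start, end):
    """images: list of (path, shot_at) pairs; keep those inside [start, end]."""
    return [(path, shot_at) for path, shot_at in images
            if shot_at is not None and start <= shot_at <= end]

images = [("p1.jpg", datetime(2005, 3, 1, 10, 0)),
          ("p2.jpg", datetime(2006, 1, 5, 9, 30))]
kept = extract_by_period(images, datetime(2005, 1, 1), datetime(2005, 12, 31))
print([p for p, _ in kept])  # -> ['p1.jpg']
```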

  When “manual” is selected in the organizing condition setting mode selection unit 61 and the extraction condition has been set, through operation of the organizing condition setting screen 41 in FIG. 7, to extract images containing the face of a specific person, the image organizing unit 54 reuses the results of the face detection, grouping, and representative face determination already performed by the organizing condition input unit 62 when the set/change button 41D for a person was touched, and extracts, as matching the extraction condition, the input images belonging to the group whose representative image is the face to be extracted.

  On the other hand, when “automatic” is selected in the organizing condition setting mode selection unit 61 and the extraction condition acquired from the organizing condition table T1 by the organizing condition acquisition unit 53 is, for example, person ID “ABC0001_01”, the person information table T2 is accessed to obtain the feature amount of that person's face. The person information table T2 associates a person ID with attribute information such as the person's name and with the facial feature amount obtained by analyzing the person's face image. The person ID is the combination of a customer ID and a serial number, so that, for example, the customer with a given customer ID can be given serial number “00” and the customer's child serial number “01”. The image organizing unit 54 looks up the person information table T2 with the person ID “ABC0001_01” acquired as the extraction condition as a search key and acquires the facial feature amount associated with it; it then reads the image files input by the image input unit 52 one by one, detects the face regions in each image, and computes from each detected face region a feature amount representing the position, shape, size, etc. of feature points such as the eyes, nose, and mouth. It computes the statistical distance between the feature amount acquired from the person information table T2 for person ID “ABC0001_01” and the feature amount of each face detected in the input images, and extracts the input images for which this distance is equal to or below a predetermined threshold.

  When the extraction condition is set to extract images shot at a predetermined location specified by GPS information, the image organizing unit 54 obtains the GPS information representing the shooting location recorded in the Exif tag of each input image file, computes from it the distance between the predetermined location in the extraction condition and the shooting location of each input image, and extracts the input images for which the computed distance is equal to or below a predetermined threshold.
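The distance test can be illustrated with the haversine great-circle formula; the patent does not specify how the distance is computed, and the 2 km threshold below is illustrative.

```python
# A minimal sketch of GPS-based extraction using the haversine formula.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def extract_by_location(images, target, threshold_km=2.0):
    """images: list of (path, (lat, lon)); keep those near the target point."""
    return [(path, pos) for path, pos in images
            if pos is not None and haversine_km(*pos, *target) <= threshold_km]
```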

  When an extraction condition for excluding failed photographs is set, the following kinds of processing can be used. For inappropriate brightness, a main subject region such as a face is detected in the input image, luminance histograms of the main subject region and of the remaining background region are generated, their profiles are compared with pre-registered profiles of appropriate brightness, and an image whose difference exceeds a predetermined threshold is excluded. For defocus, the main subject is detected, and an image in which the integral of the high-frequency components in the detected region is smaller than a predetermined threshold is excluded as out of focus. For camera shake, edges in a plurality of directions are detected in the input image, a histogram of edge widths is created for each edge direction, and the correlation value of the edge-width histograms of each pair of orthogonal directions is obtained, along with the average edge width in each direction; in a direction pair whose correlation value is smaller than a predetermined threshold, the direction with the larger average edge width is determined to be the blur direction, the width of the edges perpendicular to that direction is taken as the blur width (see Japanese Patent Laid-Open No. 2005-122721), and an image whose blur width exceeds a predetermined threshold is excluded as a camera-shake image.
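As a simplified stand-in for the defocus test (the integral of high-frequency components in the subject region), here is a sketch that measures high-frequency energy as the variance of a Laplacian response over a grayscale crop; the threshold is illustrative, and this is not the patent's exact method.

```python
# A minimal, simplified stand-in for the defocus check described above.
import numpy as np

def laplacian_energy(gray):
    """gray: 2-D float array. Return a crude high-frequency energy measure."""
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(np.var(lap))

def looks_defocused(subject_region, threshold=25.0):
    """True if the detected subject region has too little high-frequency energy."""
    return laplacian_energy(subject_region.astype(float)) < threshold
```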

  Note that images that are not extracted, or that are excluded, by this extraction process are classified into an “unorganized” folder created separately from the folders produced by the organization process described later.

  Next, in the image grouping process, when the grouping condition is set to classify by shooting date (year/month), for example, the image organizing unit 54 obtains the shooting date/time recorded in the Exif tag of each input image file and classifies the input image files into groups by the year/month value of the shooting date. Classification based on shooting date/time may also be performed by a method that arranges the shooting times of the images on a time axis and divides the images into a plurality of groups according to the shooting intervals between temporally adjacent images (Japanese Patent Laid-Open No. 2000-112997), or by a method that places a group boundary between adjacent images whose shooting interval exceeds a predetermined threshold (Japanese Patent Laid-Open No. 2001-228582).

  When a plurality of items, such as “shooting year, shooting month, shooting location”, are set as the grouping condition, the image organizing unit 54 first obtains the shooting date/time recorded in the Exif tag of each input image file and classifies the input files into groups by shooting year. Next, the image files in each shooting-year group are classified into groups by shooting month. Finally, the GPS information (latitude/longitude) representing the shooting location recorded in the Exif tags of the files in each year/month group is obtained; the latitude/longitude of each file is plotted on a coordinate plane with latitude on the vertical axis and longitude on the horizontal axis, the plot is treated as an image (hereinafter a “plot image”), dilation is applied to the plot image to form connected regions, and the connected regions are grouped by labeling (see, for example, Japanese Patent Laid-Open No. 2005-49968). Each group obtained from the GPS information may be given a name such as “shooting location 1” or “shooting location 2”, or a reference table associating GPS information with place names may be prepared in advance and the place name corresponding to each group's GPS information used as the group name. Thus, when a plurality of grouping items are set, the image organizing unit 54 performs hierarchical grouping: it first classifies the input image files by shooting year, then classifies the files in each shooting-year group by shooting month, and then classifies the files in each year/month group by shooting location.
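A minimal sketch of the year/month stage of this hierarchical grouping (the GPS clustering stage is omitted), reusing the (path, shooting datetime) pairs from the earlier sketches:

```python
# A minimal sketch of hierarchical grouping by shooting year, then month.
from collections import defaultdict
from datetime import datetime

def group_by_year_month(images):
    """Return {year: {month: [paths]}} from (path, shot_at) pairs."""
    tree = defaultdict(lambda: defaultdict(list))
    for path, shot_at in images:
        tree[shot_at.year][shot_at.month].append(path)
    return tree

tree = group_by_year_month([("a.jpg", datetime(2005, 1, 3)),
                            ("b.jpg", datetime(2005, 1, 9)),
                            ("c.jpg", datetime(2005, 4, 2))])
# Folder paths such as "2005/01" follow directly from the nesting.
```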

  When the grouping condition is set to classify by person, the image organizing unit 54 performs the same processing as when the set/change button 41D for a person is touched in the organizing condition input unit 62: it reads the image files input by the image input unit 52 one by one, detects the face regions in each image, determines the similarity of the detected faces, and groups similar face images. The feature amount of each group's representative face image at the time of this grouping is registered in the person information table T2 in association with the customer ID acquired by the customer ID input unit 51; since a plurality of facial feature amounts are associated with one customer ID, each is identified by the customer ID plus a serial number. If facial feature amounts are already registered for the customer ID, the similarity between each feature amount to be registered and the already registered feature amounts is determined; a feature amount judged similar to an existing one is not newly registered, while one judged dissimilar is newly registered. When equivalent face grouping has already been performed by touching the set/change button 41D for a person on the organizing condition setting screen 41 in the organizing condition input unit 62, its results can be reused.

  When the grouping condition is set to classify by the event depicted in the images, the image organizing unit 54 proceeds, for example, as follows: from among a plurality of comprehensive feature amounts, each obtained by combining a plurality of feature amounts computed by several types of image analysis on the input images, it selects as the event feature amount the one whose change between adjacent images correlates most positively with the shooting interval across the input images, that is, the one that best expresses the events, and then classifies the images according to the distribution of the selected event feature amount.

Specifically, the image organizing unit 54 first performs a plurality of types of image analysis on each input image file and calculates a plurality of feature amounts g_i (i = 1, 2, ..., n, ..., N), for example the image's color, brightness, texture, depth, and the edges in the image. Next, as shown in expression (1) below, M kinds of new comprehensive feature amounts Ev[j] (j = 1, 2, ..., m, ..., M) are obtained from the feature amounts g_i and a plurality of coefficients a_i[j], b_i[j], c_i[j]:

  Ev[j] = (a_1[j]·g_1 + a_2[j]·g_2 + ... + a_N[j]·g_N)
        + (b_1[j]·g_1·g_2 + b_2[j]·g_2·g_3 + ... + b_(N-1)[j]·g_(N-1)·g_N)
        + (c_1[j]·g_1^2 + c_2[j]·g_2^2 + ... + c_N[j]·g_N^2)    (1)

As a result, for one input image, M comprehensive feature amounts Ev[j] are obtained, each with a different combination of coefficient values a_i[j], b_i[j], c_i[j].

  Further, the correlation coefficient R[j] between the shooting interval Δt of adjacent images when the input images are arranged in shooting order and the difference ΔEv[j] of the comprehensive feature amount between those adjacent images is computed for j = 1, 2, ..., m, ..., M, and the comprehensive feature amount Ev[J] at the value J of j that maximizes the correlation coefficient R[j] is determined to be the event feature amount.

  The input images are then rearranged in order of the value of the event feature amount Ev[J] of each image, and wherever the difference of Ev[J] between adjacent images exceeds a predetermined threshold, a group boundary is placed.
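The selection of the event feature amount and the subsequent grouping can be pictured with a short sketch. It is a minimal illustration under simplifying assumptions (images already in shooting order, shooting times given as POSIX seconds, an illustrative threshold), not the patent's exact procedure.

```python
# A minimal sketch of event-feature selection and grouping: build candidate
# comprehensive features Ev[j] per expression (1), pick the j whose
# adjacent-image change correlates best with the shooting interval, then
# split where the change of Ev[J] exceeds a threshold.
import numpy as np

def comprehensive_features(g, a, b, c):
    """g: (n_images, N) feature matrix; a: (M, N), b: (M, N-1), c: (M, N)
    coefficient sets. Returns an (n_images, M) matrix of Ev[j]."""
    linear = g @ a.T                          # sum_i a_i[j] * g_i
    cross = (g[:, :-1] * g[:, 1:]) @ b.T      # sum_i b_i[j] * g_i * g_(i+1)
    quad = (g ** 2) @ c.T                     # sum_i c_i[j] * g_i^2
    return linear + cross + quad

def split_by_event(ev, shot_times, threshold):
    """ev: (n_images, M); shot_times: sorted POSIX seconds, shooting order."""
    dt = np.diff(shot_times)                  # shooting intervals
    dev = np.abs(np.diff(ev, axis=0))         # |delta Ev[j]| between neighbours
    corr = np.nan_to_num(
        [np.corrcoef(dt, dev[:, j])[0, 1] for j in range(ev.shape[1])])
    J = int(np.argmax(corr))                  # best event feature amount
    order = np.argsort(ev[:, J])              # rearrange by Ev[J] value
    gaps = np.abs(np.diff(ev[order, J])) > threshold
    labels = np.concatenate([[0], np.cumsum(gaps)])  # group id per sorted image
    return J, order, labels
```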

For example, suppose that for a certain comprehensive feature amount Ev[J1] the time-series distribution of Ev[J1] over the input images is expressed with the shooting order on the horizontal axis and the comprehensive feature amount on the vertical axis. If, as shown in FIG. 9A, groups (A to E) form over runs of consecutive images, this comprehensive feature amount Ev[J1] is considered suitable for classifying the input images. On the other hand, if for a comprehensive feature amount Ev[J2] the corresponding time-series distribution is random, as shown in FIG. 9B, then Ev[J2] is considered unsuitable for classifying the input images. To distinguish these distribution patterns, the classification method above assumes that “if there is a long interval between images, the images are likely to belong to different events”, and therefore selects, as the event feature amount best suited to classifying the input images by event, the comprehensive feature amount showing the highest positive correlation between the shooting interval of adjacent images arranged in time series and the difference of the comprehensive feature amount between them; the input images are then classified according to the distribution of the selected event feature amount. In the case of FIG. 9A, groups B and D fall into the same group under classification by the event feature amount Ev[J1]: for example, if group B contains images of last year's athletic meet and group D contains images of this year's, the two are merged into a single “athletic meet” group.

  In the image sorting process, when the sort condition is set to arrange images in ascending order of shooting date/time, for example, the image organizing unit 54 obtains the shooting date/time recorded in the Exif tag of each input image file and, for each group produced by the grouping process, rearranges the images belonging to that group in ascending order of shooting date/time. When grouping has been performed by shooting date/time, the order of the groups themselves is likewise rearranged by shooting date/time.

  The result of the organization performed by the image organizing unit 54 as described above can take the form of a list in which the file name of each extracted image is associated with the group name (folder name) to which it belongs, rearranged according to the sorting condition.

  The organization result evaluation unit 55 calculates an evaluation value Sc[k] (k = 1, 2, ..., K) indicating the preferability of each of the K organization results produced by the image organizing unit 54. This evaluation value can be obtained by determining, for each evaluation item, such as the number of groups produced by the organization, the number of images per group, and the bias (variance) in the number of images between groups, an evaluation score from 0 to 5 based on predetermined evaluation criteria, and summing the scores over the items. As the predetermined criteria, for example: since too many or too few groups is undesirable, the score for the number of groups can be 5 when it falls within a predetermined range and decrease the further it falls outside that range; likewise, since too many or too few images per group is undesirable, the score for the number of images can be 5 when the average number of images per group falls within a predetermined range and decrease with distance from that range; and since less bias between groups is considered better, the variance of the number of images per group can score 5 when it is smaller than a predetermined value, with the score decreasing as the variance grows.
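A minimal sketch of this scoring scheme follows; every range and falloff rate in it is illustrative, since the patent leaves the concrete criteria to the implementation.

```python
# A minimal sketch of the preferability evaluation Sc[k]: per-item scores in
# 0..5 that peak inside a target range and fall off outside it, then summed.
import statistics

def range_score(value, lo, hi, falloff):
    """5 inside [lo, hi]; decreases by `falloff` per unit outside the range."""
    gap = max(lo - value, value - hi, 0)
    return max(0.0, 5.0 - falloff * gap)

def evaluate_result(groups):
    """groups: list of image counts per group. Returns Sc for one result."""
    n_groups = len(groups)
    avg_size = sum(groups) / n_groups
    variance = statistics.pvariance(groups)
    return (range_score(n_groups, 3, 12, 0.5)      # group-count item
            + range_score(avg_size, 5, 30, 0.2)    # images-per-group item
            + range_score(variance, 0, 50, 0.02))  # bias (variance) item

results = {"by_month": [12, 9, 15], "by_event": [2, 40, 1, 1]}
ranked = sorted(results, key=lambda k: evaluate_result(results[k]), reverse=True)
print(ranked)  # display priority: higher Sc first
```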

  The organization result display unit 56 assigns display priorities in descending order of the evaluation values Sc[k] calculated by the organization result evaluation unit 55, and displays a predetermined number of organization results on the display unit 6 in order of priority. FIG. 10 shows an example of the organization result display screen. As shown in the figure, the organization result display screen 42 consists of: a title area 42A displaying the display priority of the organization result and the organization title; a folder area 42B displaying the folders produced by the image organizing unit 54; a thumbnail image area 42C in which the image files in the folder selected by the user in the folder area 42B are read and displayed as thumbnail images; a “previous organization result” button 42D for displaying the previous, i.e., higher-priority, organization result; a “next organization result” button 42E for displaying the next, i.e., lower-priority, organization result; an “organizing condition confirmation” button 42F for displaying, on another screen, the organizing condition corresponding to the currently displayed result; an “edit title” button 42G for displaying a screen on which the title of the currently displayed result is edited; an “exclude folder” button 42H for removing the folder selected by the user from the classification; an “exclude file” button 42J for removing the image selected by the user from the folder to which it belongs; an “exclude failed photos” button 42K for removing failed photos, such as camera-shake and out-of-focus images, from the images in each folder; a “decide on this organization result” button 42L for confirming the currently displayed result as the way the input images are organized; and an “end” button 42M for ending work in the image organization menu.

  Here, “organization result n” in the title area 42A indicates that the priority is n-th, that is, that the result has the n-th highest evaluation value Sc[k]. The displayed title (“Taro Fuji's growth record (January 2005 - December 2005)”) is obtained from the title item of the organizing condition table T1, or is entered on the title editing screen displayed by touching the “edit title” button 42G. The person name used in the title (Taro Fuji) need not be registered in the organizing condition table T1 as a fixed value; it may be registered as a variable “NAME”, and the name of the person corresponding to the person ID specified in the extraction condition may be acquired from the person information table T2. Similarly, the year and month may be registered as variables “YY/MM-1” and “YY/MM-2”, and the minimum and maximum shooting dates of the images after the extraction process may be acquired from the Exif tags of the image files.

  In the folder area 42B, the hierarchy of the folder classification is visualized based on the folder names to which the files in the organization result list belong.

  The organization result correction unit 63 corrects the displayed organization result in response to the user touching the “edit title” button 42G, the “exclude folder” button 42H, the “exclude file” button 42J, or the “exclude failed photos” button 42K.

  Specifically, when a touch on the “edit title” button 42G is detected, the title editing screen is displayed on the display unit 6. This screen provides a user interface for selecting from standard titles and a user interface for freely entering a title using character buttons such as kana and alphabet; the title selected or entered here is displayed in the title area 42A of the organization result display screen 42. The title editing screen also provides a user interface for entering the name and other attributes of the person whose face image was set as the extraction condition and, when the grouping condition classifies by person, of the person corresponding to each group. The person attributes entered here are registered in the person information table T2 in association with the person ID, together with the person's facial feature amount.

  When it is detected that the user has touched a folder to be excluded in the folder area 42B of the organization result display screen 42 and then touched the “exclude folder” button 42H, the rows of all files associated with the selected folder are deleted from the organization result list, whereby all image files belonging to that folder are moved to the unorganized folder. Since this exclusion is equivalent to extracting only the image files belonging to the non-excluded folders, it is reflected in the extraction condition. For example, if the folder for person ID “DEF0001_05” is excluded from folders classified by person (person IDs “DEF0001_00” to “DEF0001_05”), images containing the persons “DEF0001_00” to “DEF0001_04” are extracted, so an extraction condition of person ID “DEF0001_00” to “DEF0001_04” is added.

  When it is detected that the user has touched an image to be excluded in the thumbnail image area 42C of the organization result display screen 42 and then touched the “exclude file” button 42J, the row of the selected file is deleted from the organization result list, whereby the selected image file is moved to the unorganized folder.

  When a touch on the “exclude failed photos” button 42K is detected, the same processing as the failed-photo exclusion of the image organizing unit 54 is performed, and image files determined to be failed photos are moved to the unorganized folder.

  The organization result selection unit 57 detects a touch on the “decide on this organization result” button 42L of the organization result display screen 42 and identifies the organization result displayed at that moment and the organizing condition corresponding to it.

  The organization result recording unit 58 records the organization result list corresponding to the organization result specified by the organization result selecting unit 57 in the memory card 2 loaded in the card slot 4.

  The organizing condition registration unit 59 registers the organizing condition corresponding to the organization result identified by the organization result selection unit 57 in the organizing condition table T1 in association with the customer ID acquired by the customer ID input unit 51. At this time, if organizing conditions are already registered for that customer ID, it is checked whether an identical condition is among them; if so, no registration is performed. If no identical condition is registered, the current organizing condition is registered under a new serial number obtained by adding 1 to the maximum serial number for that customer ID.
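
  A minimal sketch of this duplicate check and serial-number assignment, assuming the organizing condition table T1 is held as a mapping from (customer ID, serial number) to a condition; the function name and table layout are illustrative only.

def register_condition(table, customer_id, condition):
    # Conditions already registered for this customer ID.
    existing = {sn: c for (cid, sn), c in table.items() if cid == customer_id}
    if condition in existing.values():
        return None  # identical condition already registered: do nothing
    # New serial number = maximum existing serial number + 1.
    next_no = max((int(sn) for sn in existing), default=0) + 1
    serial_no = f"{next_no:02d}"
    table[(customer_id, serial_no)] = condition
    return serial_no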

  Next, an outline of the flow of the image organizing process according to the embodiment of the present invention will be described with reference to the flowcharts of FIGS. 5 and 6 and the registration example of the organizing condition table T1 of FIG. 8.

  First, as a first case, a description will be given of a situation in which a user with the customer ID “ABC0001” manually sets organizing conditions in order to organize images of his or her child as a growth record, in a state where no organizing condition associated with the customer ID “ABC0001” is registered in the organizing condition table T1 of FIG. 8.

  When the user selects the image organization button 40B on the initial screen of FIG. 3, the customer ID input unit 51 displays the message “Please insert a magnetic card” on the display unit 6. When the user inserts the magnetic card 3 into the card reader 5, the customer ID input unit 51 reads the inserted magnetic card 3 and acquires the customer ID “ABC0001” (step S1).

  Next, the image input unit 52 displays the message “Please insert a memory card” on the display unit 6. When the user loads the memory card 2 into the card slot 4, the image input unit 52 reads a plurality of image files from the loaded memory card 2 and temporarily stores them on the hard disk 24 (step S2).

  When the input of the customer ID and the images is completed, the organizing condition setting mode selection unit 61 displays an image organizing condition setting mode selection screen on the display unit 6. Here, when the user touches the “manual” button on this screen, manual organizing condition setting is selected (step S3).

  Upon receiving this selection of “manual” organizing condition setting (step S4; No), the organizing condition input unit 62 displays the organizing condition setting screen 41 on the display unit 6. Here, when the user touches the setting/change button 41D for “person” in the extraction condition setting area 41A, the organizing condition input unit 62 performs face detection on each image input by the image input unit 52, groups similar faces, determines a representative image for each group, and displays on the display unit 6 a detailed setting screen listing thumbnail images of the faces included in the input images. The organizing condition input unit 62 also registers the feature amount of the representative image of each group in the person information table T2 in association with the customer ID acquired by the customer ID input unit 51. When the user selects, from among the displayed thumbnail images, the thumbnail image of the face of his or her child to be organized, the organizing condition input unit 62 displays the selected thumbnail image in the area for displaying the “person” extraction condition in the extraction condition setting area 41A and places a check mark in the check box for “person”. Further, in the grouping condition setting area 41E, the user selects the classification key items by touching the check boxes 41G for shooting year and shooting month, and touches the arrow buttons 41H and 41J as appropriate so that the classification hierarchy is ordered by shooting year, then shooting month. In the sorting condition setting area 41K, the user selects the sorting key items by touching the check boxes 41M for shooting year, shooting month, shooting date, and shooting time, touches the arrow buttons 41H and 41J as appropriate so that the sorting keys are ordered by shooting year, shooting month, shooting date, and shooting time, and selects ascending order by touching the ascending button 41Q. The image organizing conditions manually input by the above user operations are stored in a predetermined storage area of the system memory 14 by the organizing condition input unit 62 (step S22).

  Based on the organizing conditions input via the organizing condition input unit 62, the image organizing unit 54 extracts from the input images the images containing the face of the designated child, determines which shooting-year/shooting-month folder each extracted image belongs to so that the images are classified hierarchically by shooting year and shooting month, and determines the arrangement order of the image files in the organization result list so that the folders are arranged in ascending order of shooting year and shooting month and the files within each folder are arranged in ascending order of shooting date and time (step S23).
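
  The hierarchical classification and sorting described here can be sketched as follows, assuming each image is given as a (file name, shooting date) pair; in the apparatus the shooting date would be taken from the file's Exif tag, and the face-based extraction is assumed to have been done beforehand.

from collections import defaultdict

def classify_and_sort(images):
    # Group the extracted images into shooting-year/shooting-month folders.
    folders = defaultdict(list)
    for name, shot_at in images:
        folders[(shot_at.year, shot_at.month)].append((name, shot_at))
    # Folders in ascending order of year and month; files within each
    # folder in ascending order of shooting date and time.
    ordered = []
    for year, month in sorted(folders):
        files = sorted(folders[(year, month)], key=lambda f: f[1])
        ordered.append((f"{year:04d}/{month:02d}", [n for n, _ in files]))
    return ordered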

  The organization result display unit 56 causes the display unit 6 to display the organization result display screen 42 (see FIG. 10) representing the organization result of the image organization unit 54 (step S14).

  Here, the user confirms the displayed organization result and touches the “edit title” button 42G (step S15; present). In response, the organization result correction unit 63 displays the title editing screen on the display unit 6. The user selects “(NAME)-chan's growth record ((YY/MM-1) to (YY/MM-2))” from the standard titles and enters the person name (Taro Fuji) corresponding to the face image of his or her child selected as the extraction condition (step S21). As a result, as shown in FIG. 10, the title “Taro Fuji-chan's growth record (January 2005 to February 2006)” is displayed in the title area 42A of the organization result display screen 42 (step S14). The person name “Taro Fuji” entered at this time is registered in the person information table T2 in association with the same person ID as the feature amount of the face image of the child selected as the extraction condition.

  When the editing of the title and the correction of the organization result are completed (step S15; none) and the user touches the “decide on this organization result” button 42L on the organization result display screen 42 (step S16; present), the organization result selection unit 57 identifies the organization result displayed on the organization result display screen 42 and the organizing condition corresponding to it.

  The organization result recording unit 58 records the identified organization result list on the memory card 2 loaded in the card slot 4 (step S17).

  The organizing condition registration unit 59 acquires from the organizing condition table T1 the already registered organizing conditions corresponding to the customer ID “ABC0001” input by the customer ID input unit 51 (step S18) and determines whether the current organizing condition duplicates any of them (step S19). Here, since no organizing condition associated with the customer ID “ABC0001” is registered in the organizing condition table T1, it is determined that there is no duplication (step S19; none), and the current organizing condition is registered in the organizing condition table T1 in association with the customer ID “ABC0001” and the serial number “01” (step S20). The organizing title, extraction condition, grouping condition, and sort condition associated with the customer ID “ABC0001” and the serial number “01” in the organizing condition table T1 illustrated in FIG. 8 were registered in this manner.

  When the user touches the “END” button 42M on the organization result display screen 42 (step S16; none), the display returns to the initial screen 40 without recording the organization result or registering the organizing condition.

  Next, as a second case, a description will be given of a situation in which, after the organizing condition associated with the customer ID “ABC0001” and the serial number “01” has been registered in the organizing condition table T1 by the above processing, the user with the customer ID “ABC0001” organizes newly acquired images as a growth record of his or her child, as in the previous case.

  In the same manner as described above, after the selection of the image organization button 40B, the insertion of the magnetic card 3 and the acquisition of the customer ID (step S1), and the loading of the memory card 2 and the input of the images (step S2), the organizing condition setting mode selection unit 61 displays the image organizing condition setting mode selection screen on the display unit 6. When the user touches the “auto” button on this screen, automatic organizing condition setting is selected (step S3).

  Upon receiving this selection of “automatic” organizing condition setting (step S4; Yes), the organizing condition acquisition unit 53 refers to the organizing condition table T1 using the customer ID input by the customer ID input unit 51 and the serial number “01” as search keys (step S5), and acquires from the table the first organizing condition (serial number “01”) corresponding to the customer ID “ABC0001” (step S6; present).

  Based on the acquired extraction condition “person ID = ‘ABC0001_01’”, the image organizing unit 54 detects a face region in each image input by the image input unit 52 and obtains its facial feature amount, acquires the feature amount corresponding to the person ID “ABC0001_01” from the person information table T2, calculates the statistical distance between the two feature amounts, and extracts the input images for which the calculated statistical distance is equal to or less than a predetermined threshold. It then classifies the extracted images hierarchically by shooting year and shooting month based on the grouping condition “shooting date and time (YY), shooting date and time (MM)” and the shooting dates attached to the images, and rearranges the images in order of shooting date and time based on the sorting condition “shooting date and time, ascending order” (step S7).
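
  The extraction step can be sketched as follows; a Euclidean distance is used here purely as a stand-in for the “statistical distance” in the text, and the feature vectors and threshold are assumed to be given.

import math

def extract_by_face(images, reference_features, threshold):
    # Keep only the input images whose facial feature vector lies within
    # the threshold distance of the registered person's feature vector.
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return [img for img in images
            if distance(img["face_features"], reference_features) <= threshold]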

  When the image organizing process based on the first organizing condition is completed in this way, the organizing condition acquisition unit 53 increments the serial number to “02” and attempts to acquire a second organizing condition corresponding to the customer ID “ABC0001” (step S8). Here, since only one organizing condition is registered for the customer ID “ABC0001”, there is no second condition (step S9; none), and a user interface for selecting whether or not to also organize the images according to the apparatus default organizing conditions is displayed on the display unit 6 (step S10).

  Here, the user chooses not to organize the images according to the default organizing conditions, so the image organizing process based on the default conditions is skipped (step S10; no), and it is determined whether only one pattern of organizing was performed (step S12). Since only the organizing based on the condition of customer ID “ABC0001” and serial number “01” was performed (step S12; Yes), the organization result display unit 56 displays the organization result display screen 42 (see FIG. 10) representing that result on the display unit 6 (step S14). The name “Taro Fuji” in the title area 42A of the organization result display screen 42 is acquired from the person information table T2, where it is registered in association with the person ID “ABC0001_01” specified in the extraction condition.

  Thereafter, as described above, the organization result is corrected as necessary (steps S15 and S21). When the user touches the “decide on this organization result” button 42L on the organization result display screen 42 (step S16; present), the organization result selection unit 57 identifies the organization result displayed on the screen and the organizing condition corresponding to it, and the organization result recording unit 58 records the identified organization result list on the memory card 2 (step S17).

  The organizing condition registration unit 59 acquires from the organizing condition table T1 the already registered organizing conditions corresponding to the customer ID “ABC0001” input by the customer ID input unit 51 (step S18) and determines whether the current organizing condition duplicates any of them (step S19). Here, the organizing condition of customer ID “ABC0001” and serial number “01” already registered in the organizing condition table T1 is acquired and compared with the current organizing condition. If the user has made no correction to the organization result that affects the organizing condition, it is determined that the organizing condition is a duplicate (step S19; present), and the current condition is not registered. If a correction to the organization result has removed the duplication (step S19; none), the current organizing condition is registered in the organizing condition table T1 in association with the customer ID “ABC0001” and the serial number “02” (step S20).

  Further, as a third case, a description will be given of a situation in which, with only the organizing condition associated with the customer ID “ABC0001” and the serial number “01” registered in the organizing condition table T1 by the processing of the first case, the user with the customer ID “ABC0001” has forgotten what was photographed, and when, in the images stored on another memory card 2, and has those images organized based on automatically set organizing conditions.

  In the same manner as described above, after the selection of the image organization button 40B, the insertion of the magnetic card 3 and the acquisition of the customer ID (step S1), and the loading of the memory card 2 and the input of the images (step S2), the organizing condition setting mode selection unit 61 displays the image organizing condition setting mode selection screen on the display unit 6. When the user touches the “auto” button on this screen, automatic organizing condition setting is selected (step S3).

  Upon receiving this selection of “automatic” organizing condition setting (step S4; Yes), the organizing condition acquisition unit 53 refers to the organizing condition table T1 using the customer ID input by the customer ID input unit 51 and the serial number “01” as search keys (step S5), and acquires from the table the first organizing condition (serial number “01”) corresponding to the customer ID “ABC0001” (step S6; present).

  The image organizing unit 54 performs the image organizing process based on the acquired organizing condition (step S7). Here, the result of the nth image organizing process is defined as the nth organization result; the result of organizing based on the condition corresponding to the customer ID “ABC0001” and the serial number “01” is therefore the first organization result.

  When the image organizing process based on the first organizing condition is completed, the organizing condition acquisition unit 53 increments the serial number to “02” and attempts to acquire a second organizing condition corresponding to the customer ID “ABC0001” (step S8). Since only one organizing condition is registered for this customer ID, there is no second condition (step S9; none), and a user interface for selecting whether or not to organize the images according to the apparatus default organizing conditions is displayed on the display unit 6 (step S10).

  Here, the user chooses to have the images organized based on the default organizing conditions (step S10; Yes), and the organizing condition acquisition unit 53 and the image organizing unit 54 perform the image organizing process based on those conditions (step S11). Specifically, the organizing condition acquisition unit 53 sets the customer ID to the reserved value “ZZZ9999” for default organizing conditions and the serial number to “01”, refers to the organizing condition table T1, and acquires the first default organizing condition; the image organizing unit 54 then performs the image organizing process based on it. Thereafter, the organizing condition acquisition unit 53 acquires the next default organizing condition while incrementing the serial number one by one, and the image organizing unit 54 repeats the image organizing process for each acquired condition.
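
  This loop over the default organizing conditions can be sketched as follows; the table layout and the organize callback (standing in for the image organizing unit 54) are hypothetical, while the reserved customer ID “ZZZ9999” and the serial numbering come from the text.

DEFAULT_CUSTOMER_ID = "ZZZ9999"  # reserved value for default conditions

def organize_by_defaults(table, organize):
    # Fetch default conditions with serial numbers "01", "02", ... until
    # none is left, running the image organizing process once per condition.
    results, n = [], 1
    while (DEFAULT_CUSTOMER_ID, f"{n:02d}") in table:
        results.append(organize(table[(DEFAULT_CUSTOMER_ID, f"{n:02d}")]))
        n += 1
    return results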

  As a result of the image organizing process described above, in the case of the organizing condition table T1 of FIG. 8 (where the organizing condition of customer ID “ABC0001” and serial number “02” is not yet registered at this point), four patterns of organization results are obtained: the first organization result based on the condition of customer ID “ABC0001” and serial number “01”; the second organization result, classified hierarchically by person ID, shooting year, and shooting month and sorted in ascending order of shooting date and time, based on the condition of customer ID “ZZZ9999” and serial number “01”; the third organization result, classified by shooting year, shooting month, and shooting location and sorted in ascending order of shooting date and time, based on the condition of customer ID “ZZZ9999” and serial number “02”; and the fourth organization result, classified according to the distribution of event feature values and sorted in ascending order of shooting date and time, based on the condition of customer ID “ZZZ9999” and serial number “03”. The person IDs in the second organization result indicate that the face images detected in the input images have been grouped by similarity as described above; when this grouping is performed, the feature amount of the representative face image of each group is registered in the person information table T2 in association with its person ID.

  Since a plurality of organizing processes have been performed in this case (step S12; No), the organization result evaluation unit 55 calculates, for each organization result, an evaluation value Sc[1] to Sc[4] representing the preferability of the result by scoring factors such as the number of groups, the number of images in each group, and the deviation (variance) of the number of images between groups (step S13).
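
  The text states which factors are scored but not how they are weighted, so the following sketch uses an invented weighting purely for illustration; group_sizes is the list of image counts per group of one organization result.

from statistics import pvariance

def evaluation_score(group_sizes):
    # Score one organization result from the number of groups, the number
    # of images per group, and the spread of images between groups.
    n_groups = len(group_sizes)
    spread = pvariance(group_sizes) if n_groups > 1 else 0.0
    return n_groups * 10 + sum(group_sizes) - spread

# Rank the four results by descending score to obtain the display priority.
results = {1: [12, 10, 11], 2: [30, 2, 1], 3: [15, 14], 4: [8, 9, 10, 9]}
priority = sorted(results, key=lambda k: evaluation_score(results[k]),
                  reverse=True)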

  The organization result display unit 56 sorts the evaluation values Sc[1] to Sc[4] in descending order to set the display priority, and displays on the display unit 6 the organization result display screen 42 with the organization result having the highest evaluation value (highest priority) as organization result 1 (step S14). When the user touches the “next organization result” button 42E on the organization result display screen 42, the organization result display unit 56 displays the organization result with the next highest evaluation value as organization result 2; organization results 3 and 4 can be displayed in the same way. When the user touches the “previous organization result” button 42D, the organization result display unit 56 displays the organization result one priority higher than the one currently displayed.

  Thereafter, in the same manner as described above, the organization result is corrected as necessary (steps S15 and S21). With the organization result the user judges most preferable (here, the fourth organization result) displayed on the organization result display screen 42, the user touches the “decide on this organization result” button 42L (step S16; present); the organization result selection unit 57 then identifies the displayed organization result and the organizing condition corresponding to it, and the organization result recording unit 58 records the identified organization result list on the memory card 2 (step S17).

  The organizing condition registration unit 59 acquires from the organizing condition table T1 the already registered organizing conditions corresponding to the customer ID “ABC0001” input by the customer ID input unit 51 (step S18) and determines whether the current organizing condition duplicates any of them (step S19). Here, the organizing condition of customer ID “ABC0001” and serial number “01” already registered in the organizing condition table T1 differs from the condition selected by the user, that is, the default condition of customer ID “ZZZ9999” and serial number “03”, so it is determined that there is no duplication (step S19; none), and the selected organizing condition is registered in the organizing condition table T1 in association with the customer ID “ABC0001” and the serial number “02” (step S20). The registration example associated with the customer ID “ABC0001” and the serial number “02” in FIG. 8 reflects that the “edit title” button 42G was touched on the organization result display screen 42 and the title “Athletic Meet” was entered (steps S15 and S21; the organization result correction unit 63).

  Similarly, as a fourth case, when the user with the customer ID “ABC0001” has new images organized automatically according to automatically set organizing conditions, the processing of steps S5 to S9 by the organizing condition acquisition unit 53 and the image organizing unit 54 yields two organization results based on the two organizing conditions associated with the customer ID “ABC0001” already registered in the organizing condition table T1. When the user chooses not to organize under the default organizing conditions (step S10; no), the organization result evaluation unit 55 calculates the evaluation values Sc[1] and Sc[2] for these two organization results (step S13), and the organization result display unit 56 displays them on the organization result display screen 42 as organization results 1 and 2 in descending order of evaluation value (step S14). In this way, even when a plurality of organizing conditions associated with a customer ID are registered in the organizing condition table T1, the organization results based on those conditions can be evaluated and displayed as in the third case.

  Here, the process of reflecting corrections to an organization result in the organizing condition will be described supplementarily. Suppose, for example, that the user with the customer ID “DEF0001” has new images organized under automatically set organizing conditions in a state where no organizing condition associated with that customer ID is registered in the organizing condition table T1. The image organizing unit 54 performs the image organizing process based on each of the three default organizing conditions of the customer ID “ZZZ9999” that the organizing condition acquisition unit 53 sequentially acquires from the organizing condition table T1, obtaining three organization results (step S11); the organization result evaluation unit 55 calculates the evaluation values Sc[1] to Sc[3] for these results (step S13), and the organization result display unit 56 displays them on the display unit 6 as organization results 1, 2, and 3 in descending order of evaluation value (step S14). Now suppose that, as a result of the image organizing process based on the condition of customer ID “ZZZ9999” and serial number “01”, the images are classified into six folders with the person IDs “DEF0001_00” to “DEF0001_05”. When the user touches the folder with the person ID “DEF0001_05” in the folder area 42B to select it and then touches the “exclude folder” button 42H, that folder is moved to the unorganized folder and deleted from the organization result display screen 42. Furthermore, when the “exclude failed photo” button 42K is touched, the images determined to be failed photos by the processing described above are moved from their folders to the unorganized folder and erased from the organization result display screen 42. When the user, satisfied with the organization result, touches the “decide on this organization result” button 42L (step S16; present), the organization result selection unit 57 identifies the displayed organization result and the organizing condition corresponding to it, the organization result recording unit 58 records the identified organization result list on the memory card 2 (step S17), and the organizing condition registration unit 59 checks for duplication with the already registered organizing conditions (steps S18 and S19) and then registers the organizing condition in the organizing condition table T1 (step S20). Comparing the organizing condition registered in the organizing condition table T1 of FIG. 8 in association with the customer ID “DEF0001” and the serial number “01” with the default condition of customer ID “ZZZ9999” and serial number “01” on which its organization result was based, it can be seen that an extraction condition of person ID = “DEF0001_00” to “DEF0001_04” and an extraction (exclusion) condition of “failed photo exclusion” have been added. This shows that the corrections made to the organization result by user operations on the organization result display screen 42 were identified by the organization result selection unit 57 and reflected in the organizing condition.

  As described above, in the image organizing process according to the first embodiment of the present invention, the organizing conditions used in past image organizing are stored in the organizing condition table T1 for each customer ID. When the user selects automatic setting of organizing conditions in the organizing condition setting mode selection unit 61, the organizing condition acquisition unit 53 acquires from the organizing condition table T1 the organizing conditions associated with the customer ID input by the customer ID input unit 51, and the image organizing unit 54 organizes the new images input from the image input unit 52 based on the acquired conditions. Compared with organizing images without such stored conditions, it is therefore possible not only to organize images automatically in a manner matching the user's desired organizing viewpoint, but also to do so with less time and effort on the user's part, without having to associate organizing conditions with the images each time they are taken.

  In addition, since individual organizing conditions in the organizing condition table T1 are identified by the combination of a customer ID and a serial number, a plurality of organizing conditions can be stored for each user (each customer ID), and since the image organizing unit 54 organizes the images in a plurality of patterns based on each of the plurality of organizing conditions sequentially acquired by the organizing condition acquisition unit 53, the apparatus can respond more flexibly to the user's varied requests regarding image organizing conditions.

  Furthermore, the organization result evaluation unit 55 calculates an evaluation value Sc[k] representing the preferability of each of the organization results of the plurality of patterns, and the organization result display unit 56 preferentially displays the results evaluated as more preferable, so the user can more easily determine which of the plural organization results is preferable.

  Next, the image organizing process according to the second embodiment of the present invention will be described. The second embodiment differs from the first embodiment in that each user's organizing conditions and person information are stored not in a storage device such as the hard disk of the order receiving apparatus 1 but in an IC card possessed by the user. The hardware configuration is substantially the same as that of the first embodiment shown in FIGS. 1 and 2, except that the magnetic card 3 is replaced with an IC card 3' having a larger storage capacity and the card reader 5 is replaced with a card reader/writer 5' that can read from and write to the memory of the IC card.

  FIG. 11 is a schematic block diagram showing the configuration of the order receiving apparatus 1 according to the second embodiment. As shown in FIG. 11, the second embodiment differs from the first embodiment in that, instead of the customer ID input unit 51, it includes an organizing condition reading unit 71 for reading organizing conditions and a person information reading unit 72 for reading person information comprising attributes and feature amounts for each person ID.

  The organizing condition reading unit 71 and the person information reading unit 72 display on the display unit 6 a message prompting the insertion of the IC card 3' into the card reader/writer 5', such as “Please insert the IC card”, and when the IC card 3' is inserted into the card reader/writer 5', they acquire the organizing conditions and the person information from the memory of the inserted IC card 3'.

  These organizing conditions are the conditions used in image organizing performed in the past by the user who possesses the IC card 3'. The organizing conditions read by the organizing condition reading unit 71 are temporarily stored in a predetermined storage area (user organizing condition memory) of the system memory 14 of the order receiving apparatus 1. These user-specific organizing conditions and the apparatus default organizing conditions are referred to by the organizing condition acquisition unit 53 as in the first embodiment; however, since only the user's own conditions are stored in the user organizing condition memory, no customer ID is necessary, and the organizing condition acquisition unit 53 acquires the conditions by reading them in order from the first entry of the user organizing condition memory. The apparatus default organizing conditions, on the other hand, are read from the hard disk 24 and stored in a predetermined storage area (default organizing condition memory) of the system memory 14 when the subprogram for the image organizing process is started. Since the default conditions are also stored separately from the user's own conditions, they need not be identified by a special customer ID or the like, and the organizing condition acquisition unit 53 acquires them by reading them in order from the first entry of the default organizing condition memory.
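
  The customer-ID-free, in-order reading from the user organizing condition memory can be sketched as follows; the class and method names are hypothetical.

class UserConditionMemory:
    # Holds the organizing conditions read from the IC card in order;
    # since only this user's conditions are stored, no customer ID is needed.
    def __init__(self, conditions_from_card):
        self._conditions = list(conditions_from_card)
        self._next = 0

    def acquire_next(self):
        # Read from the first stored condition onward; return None when
        # no further condition remains.
        if self._next >= len(self._conditions):
            return None
        condition = self._conditions[self._next]
        self._next += 1
        return condition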

  When the organizing condition registration unit 59 registers a new organizing condition, the data is written to the memory of the IC card 3' via the card reader/writer 5'.

  The person information was obtained in image organizing performed in the past by the user who possesses the IC card 3'. The person information read by the person information reading unit 72 is temporarily stored in a predetermined storage area (person information memory) of the system memory 14 of the order receiving apparatus 1 and is referred to as necessary by the image organizing unit 54 as in the first embodiment. When new person information is to be written, it is first written to the person information memory, and when the organizing condition registration unit 59 writes to the IC card 3' via the card reader/writer 5', the contents of the person information memory are written at the same time.

  The processing flow in the second embodiment is the same as that of the flowcharts of the first embodiment in FIGS. 5 and 6, except that the customer ID input (step S1) is not performed and that, in the acquisition of the organizing conditions in step S5, the user's own organizing conditions and person information are read and acquired from the memory of the IC card 3'.

  As described above, in the image organizing process according to the second embodiment of the present invention, the user's own organizing conditions and person information are managed not on the order receiving apparatus 1 side but on the IC card 3' possessed by the user, which simplifies management on the order receiving apparatus 1 side and enhances the protection of personal information. The effects obtained in the first embodiment are also obtained in this embodiment.

DESCRIPTION OF SYMBOLS
1 Order receiving apparatus
2 Memory card
3 Magnetic card
3' IC card
4 Card slot
5 Card reader
5' Card reader/writer
6 Display unit
8 Printer
10 DIC
12 CPU
14 System memory
16 Memory control unit
18 Input unit
22 Input control unit
24 Hard disk
26 Display control unit
30 Network interface
40 Initial screen
41 Organizing condition setting screen
42 Organization result display screen
51 Customer ID input unit
52 Image input unit
53 Organizing condition acquisition unit
54 Image organizing unit
55 Organization result evaluation unit
56 Organization result display unit
57 Organization result selection unit
58 Organization result recording unit
59 Organizing condition registration unit
61 Organizing condition setting mode selection unit
62 Organizing condition input unit
63 Organization result correction unit
71 Organizing condition reading unit
72 Person information reading unit

Claims (12)

  1. An image organizing apparatus comprising:
    image organizing means for organizing a plurality of input images based on the faces of persons included in the plurality of images;
    display means for displaying, for each person, the face images of the persons included in the plurality of input images; and
    organizing condition setting means for accepting a user's selection of a desired face image from among the face images displayed by the display means and the setting of an organizing condition for grouping the plurality of input images for each person of the selected face image,
    wherein the image organizing means organizes the plurality of input images into groups, one for each person of the face images included in the organizing condition, based on the organizing condition accepted by the organizing condition setting means.
  2.   The image organizing apparatus according to claim 1, wherein the display means extracts, by performing image analysis on the plurality of input images, a representative face image of each person's face included in the plurality of images and displays the extracted representative face images as the face images.
  3. The image organizing apparatus according to claim 1 or 2, further comprising:
    organizing condition storage means for storing the organizing condition accepted by the organizing condition setting means in association with identification information of the user;
    user identification information receiving means for receiving user identification information; and
    stored-condition organizing instruction receiving means for receiving a stored-condition organizing instruction to organize based on the organizing conditions stored in the organizing condition storage means,
    wherein, when the stored-condition organizing instruction receiving means receives the stored-condition organizing instruction, the image organizing means acquires from the organizing condition storage means the organizing condition corresponding to the user identification information received by the user identification information receiving means, and organizes the plurality of input images into groups based on the acquired organizing condition.
  4. The image organizing apparatus according to claim 1 or 2, further comprising:
    organizing condition output means for storing the organizing condition accepted by the organizing condition setting means in a readable/writable storage medium possessed by the user;
    organizing condition acquisition means for acquiring organizing conditions from the storage medium; and
    stored-condition organizing instruction receiving means for receiving a stored-condition organizing instruction to organize based on the organizing conditions stored in the storage medium,
    wherein, when the stored-condition organizing instruction receiving means receives the stored-condition organizing instruction, the image organizing means organizes the plurality of input images into groups based on the organizing condition acquired from the storage medium possessed by the user.
  5. An image organizing method executed by an image organizing apparatus that organizes a plurality of input images based on the faces of persons included in the plurality of images, the method comprising:
    displaying, for each person, the face images of the persons included in the plurality of input images;
    accepting a user's selection of a desired face image from among the displayed face images and the setting of an organizing condition for grouping the plurality of input images for each person of the selected face image; and
    organizing the plurality of input images into groups, one for each person of the face images included in the organizing condition, based on the accepted organizing condition.
  6.   The image organizing method according to claim 5, wherein, for the display of the face images, a representative face image of each person's face included in the plurality of images is extracted by performing image analysis on the plurality of input images, and the extracted representative face images are displayed as the face images.
  7. The image organizing method according to claim 5 or 6, wherein the accepted organizing condition is stored in organizing condition storage means in association with identification information of the user, and,
    when user identification information and a stored-condition organizing instruction to organize based on the stored organizing conditions are received,
    the organizing condition corresponding to the received user identification information is acquired from the organizing condition storage means, and the plurality of input images are organized into groups based on the acquired organizing condition.
  8. The image organizing method according to claim 5 or 6, wherein the accepted organizing condition is stored in a readable/writable storage medium possessed by the user, and,
    when a stored-condition organizing instruction to organize based on the organizing conditions stored in the storage medium is received,
    the plurality of input images are organized into groups based on the organizing condition acquired from the storage medium possessed by the user.
  9. An image organizing program for causing a computer to function as:
    image organizing means for organizing a plurality of input images based on the faces of persons included in the plurality of images;
    display means for displaying, for each person, the face images of the persons included in the plurality of input images; and
    organizing condition setting means for accepting a user's selection of a desired face image from among the face images displayed by the display means and the setting of an organizing condition for grouping the plurality of input images for each person of the selected face image,
    wherein the image organizing means organizes the plurality of input images into groups, one for each person of the face images included in the organizing condition, based on the organizing condition accepted by the organizing condition setting means.
  10.   The image organizing program according to claim 9, wherein the display means extracts, by performing image analysis on the plurality of input images, a representative face image of each person's face included in the plurality of images and displays the extracted representative face images as the face images.
  11. The image organizing program according to claim 9 or 10, further causing the computer to function as:
    organizing condition storage means for storing the organizing condition accepted by the organizing condition setting means in association with identification information of the user;
    user identification information receiving means for receiving user identification information; and
    stored-condition organizing instruction receiving means for receiving a stored-condition organizing instruction to organize based on the organizing conditions stored in the organizing condition storage means,
    wherein, when the stored-condition organizing instruction receiving means receives the stored-condition organizing instruction, the image organizing means acquires from the organizing condition storage means the organizing condition corresponding to the user identification information received by the user identification information receiving means, and organizes the plurality of input images into groups based on the acquired organizing condition.
  12. The image organizing program according to claim 9 or 10, further causing the computer to function as:
    organizing condition output means for storing the organizing condition accepted by the organizing condition setting means in a readable/writable storage medium possessed by the user;
    organizing condition acquisition means for acquiring organizing conditions from the storage medium; and
    stored-condition organizing instruction receiving means for receiving a stored-condition organizing instruction to organize based on the organizing conditions stored in the storage medium,
    wherein, when the stored-condition organizing instruction receiving means receives the stored-condition organizing instruction, the image organizing means organizes the plurality of input images into groups based on the organizing condition acquired from the storage medium possessed by the user.
JP2011277615A 2011-12-19 2011-12-19 Image organizing apparatus and method, and program Active JP5485254B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011277615A JP5485254B2 (en) 2011-12-19 2011-12-19 Image organizing apparatus and method, and program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2006099210 Division 2006-03-31

Publications (2)

Publication Number Publication Date
JP2012079337A true JP2012079337A (en) 2012-04-19
JP5485254B2 JP5485254B2 (en) 2014-05-07

Family

ID=46239412

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011277615A Active JP5485254B2 (en) 2011-12-19 2011-12-19 Image organizing apparatus and method, and program

Country Status (1)

Country Link
JP (1) JP5485254B2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014016786A (en) * 2012-07-09 2014-01-30 Canon Inc Image processor, image processing method, and program
JP2014016784A (en) * 2012-07-09 2014-01-30 Canon Inc Image processor, image processing method, and program
JP2014142783A (en) * 2013-01-23 2014-08-07 Fuji Xerox Co Ltd Information processing apparatus and program
JP2014229178A (en) * 2013-05-24 2014-12-08 株式会社東芝 Electronic apparatus, display control method, and program
JP2016014995A (en) * 2014-07-01 2016-01-28 富士フイルム株式会社 Image processing apparatus, image processing method, image processing program, and print order receiving apparatus

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6318102B2 (en) 2015-02-04 2018-04-25 富士フイルム株式会社 Image display control device, image display control method, image display control program, and recording medium storing the program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003150932A (en) * 2001-11-12 2003-05-23 Olympus Optical Co Ltd Image processing unit and program
JP2005107885A (en) * 2003-09-30 2005-04-21 Casio Comput Co Ltd Image classifying device and program
JP2005157764A (en) * 2003-11-26 2005-06-16 Canon Inc Image retrieval device, image retrieval method, program, and storage medium

Also Published As

Publication number Publication date
JP5485254B2 (en) 2014-05-07

Similar Documents

Publication Publication Date Title
US20180046855A1 (en) Face detection and recognition
US10346677B2 (en) Classification and organization of consumer digital images using workflow, and face detection and recognition
US9639740B2 (en) Face detection and recognition
US9721148B2 (en) Face detection and recognition
US9881229B2 (en) Apparatus, method and program for image search
US10754836B1 (en) Facial based image organization and retrieval method
US20150213837A1 (en) Image processing device, print production system, photograph album production system, image processing method, and program
US7574054B2 (en) Using photographer identity to classify images
US20150016691A1 (en) Image Tagging User Interface
EP1450307B1 (en) Apparatus and program for selecting photographic images
US9507802B2 (en) Information processing apparatus and method, and program
US6636648B2 (en) Albuming method with automatic page layout
US7978936B1 (en) Indicating a correspondence between an image and an object
JP4412342B2 (en) Content management device, image display device, imaging device, processing method in them, and program for causing computer to execute the method
US8406481B2 (en) Automated indexing for distributing event photography
JP5403838B2 (en) Digital image organization by correlating faces
EP1770638B1 (en) Displaying images according to an image evaluation value
JP3984029B2 (en) Image processing apparatus and program
JP4902270B2 (en) How to assemble a collection of digital images
US20130195375A1 (en) Tagging images with labels
US8594440B2 (en) Automatic creation of a scalable relevance ordered representation of an image collection
KR101618735B1 (en) Method and apparatus to incorporate automatic face recognition in digital image collections
KR101246737B1 (en) Content managing device and content managing method
US7286255B2 (en) Method, system, and program for storing images
US6396963B2 (en) Photocollage generation and modification

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20120116

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130827

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20131025

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20140212

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20140219

R150 Certificate of patent or registration of utility model

Ref document number: 5485254

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
