WO2017169963A1 - Image processing system, image processing method, program, and recording medium - Google Patents

Image processing system, image processing method, program, and recording medium

Info

Publication number
WO2017169963A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, unit, composite image, user, images
Application number: PCT/JP2017/011146
Other languages: English (en), Japanese (ja)
Inventor: 小泉 功
Original Assignee: 富士フイルム株式会社
Priority claimed from JP2017048410A (JP6518280B2)
Application filed by 富士フイルム株式会社
Publication of WO2017169963A1
Priority to US16/121,739 (US10783355B2)

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/387 Composing, repositioning or otherwise geometrically modifying originals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20 Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25 Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/266 Channel or content management, e.g. generation and management of keys and entitlement messages in a conditional access system, merging a VOD unicast channel into a multicast channel
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/76 Television signal recording
    • H04N5/91 Television signal processing therefor

Definitions

  • the present invention relates to an image processing system, an image processing method, a program, and a recording medium that create a composite image such as a photobook using a plurality of images acquired from a plurality of user terminal devices via a network.
  • A photobook is created, for example, by considering the continuity and relevance of the images, classifying multiple images into multiple groups based on shooting time and the like, and automatically placing the images contained in each group on their corresponding pages (automatic layout).
  • a photo book having the bride and groom as the main character is created and sent to the bride and groom.
  • The bride and groom, who are the recipients of the photo book, may wish to use images showing each person who attended the wedding reception, selected from among the multiple images they own, to create and send a composite image to each attendee.
  • When the bride and groom request a photographer to take pictures, they are considered to have received from the photographer a large number of high-quality images in which each attendee is captured. In this case, it is not easy with the prior art to create and send a composite image for each person using the many received images.
  • An object of the present invention is to provide an image processing system, an image processing method, a program, and a recording medium that can, based on a composite image, create another composite image using images of a person shown in that composite image.
  • The present invention provides an image processing system comprising: a first composite image acquisition unit that acquires a first composite image owned by a first user; an image analysis unit that analyzes the content of the first composite image; a person specifying unit that specifies a plurality of persons shown in the first composite image based on the analysis result of the first composite image; a designated person receiving unit that accepts designation of one or more persons as designated persons from among the plurality of persons shown in the first composite image; a first image group holding unit that holds a first image group owned by the first user; an image specifying unit that specifies images in which a designated person is captured from the first image group; and a composite image creation unit that creates a second composite image using the images in which a designated person is captured.
  • the first composite image has identification information for identifying the first composite image from other images,
  • An identification information acquisition unit that acquires identification information included in the first composite image;
  • a face image holding unit that acquires a plurality of face images including face images of a plurality of users from a terminal device of a plurality of users via a network, and holds the acquired plurality of face images
  • a face image specifying unit that specifies face images of a plurality of users from among the plurality of face images held in the face image holding unit based on the identification information acquired by the identification information acquiring unit;
  • It is preferable that the person specifying unit specifies each of a plurality of users corresponding to each of the plurality of persons shown in the first composite image based on the face images of the plurality of users specified by the face image specifying unit, and that the designated person receiving unit identifies a user corresponding to one or more designated persons from among the plurality of users identified by the person specifying unit.
  • the present invention also provides an identification information acquisition unit that acquires identification information for identifying the first composite image owned by the first user from other images;
  • a face image holding unit that acquires a plurality of face images including face images of a plurality of users from a terminal device of a plurality of users via a network, and holds the acquired plurality of face images
  • a face image specifying unit that specifies face images of a plurality of users from a plurality of face images held in the face image holding unit based on the identification information acquired by the identification information acquisition unit;
  • a first image group holding unit for holding a first image group owned by the first user;
  • a designated person receiving unit that receives designation of one or more persons as a designated person from among a plurality of users identified by the face image identifying unit;
  • An image specifying unit for specifying an image in which the designated person is captured from the first image group;
  • An image processing system is provided that includes a composite image creation unit that creates a second composite image using an image showing a designated person.
  • It is preferable that an importance level information acquisition unit acquires importance level information of the designated persons set by the first user from the terminal device of the first user via the network, and that the composite image creation unit creates the second composite image based on the importance level information by preferentially using images in which a designated person with a higher importance level is captured over images in which a designated person with a lower importance level is captured.
  • a comment obtaining unit for obtaining a first user's comment on the designated person from the terminal device of the first user;
  • the composite image creating unit creates a second composite image using a comment in addition to a plurality of images in which the designated person is shown.
  • a moving image acquisition unit that acquires each of one or more moving images associated with each of the plurality of images used in the second composite image from the terminal device of the first user;
  • the composite image creating unit creates a second composite image using the first image group including one or more images associated with each of the one or more moving images and including the designated person,
  • When a first image, which is associated with a first moving image among the one or more moving images in the first image group used in the second composite image, is photographed by the image photographing unit of the designated person's terminal device and the photographed first image is displayed on the image display unit of the designated person's terminal device, the first moving image is preferably reproduced on the image display unit of the designated person's terminal device.
  • the first moving image is reproduced within the display area of the first image displayed on the image display unit of the designated person's terminal device.
  • The present invention also provides an image processing method comprising: a step of a first composite image acquisition unit acquiring a first composite image owned by a first user; a step of an image analysis unit analyzing the content of the first composite image; a step of a person specifying unit specifying a plurality of persons shown in the first composite image based on the analysis result of the first composite image; a step of a designated person receiving unit accepting designation of one or more persons as designated persons from among the plurality of persons shown in the first composite image; a step of a first image group holding unit holding a first image group owned by the first user; a step of an image specifying unit specifying images in which a designated person is captured from the first image group; and a step of a composite image creation unit creating a second composite image using the images in which a designated person is captured.
  • the first composite image has identification information for identifying the first composite image from other images
  • An identification information acquisition unit acquiring identification information included in the first composite image
  • the face image holding unit acquiring a plurality of face images, including the face images of the plurality of users, from the terminal devices of the plurality of users via the network, and holding the acquired plurality of face images
  • a face image specifying unit specifying a plurality of user face images from the plurality of face images held in the face image holding unit based on the identification information acquired by the identification information acquiring unit
  • the step of specifying a person specifies each of a plurality of users corresponding to each of a plurality of persons shown in the first composite image based on the face images of the plurality of users specified by the face image specifying unit
  • the step of accepting the designation of the person preferably specifies a user corresponding to one or more designated persons from among a plurality of users specified by the step of specifying the person.
  • the identification information acquisition unit acquires identification information for identifying the first composite image owned by the first user from other images;
  • the face image holding unit acquiring a plurality of face images, including the face images of a plurality of users involved in creating the first composite image, from the terminal devices of the plurality of users via the network, and holding the acquired plurality of face images;
  • a first image group holding unit holding a first image group owned by the first user;
  • a designated person receiving unit receiving a designation of one or more persons as a designated person from among a plurality of users identified by the face image identifying unit;
  • An image specifying unit specifying an image in which the designated person is captured from the first image group;
  • the present invention also provides a program for causing a computer to execute each step of the image processing method described above.
  • the present invention also provides a computer-readable recording medium on which a program for causing a computer to execute each step of the image processing method described above is recorded.
  • According to the present invention, a plurality of persons shown in the first composite image are identified, one or more of the identified persons are designated as designated persons, images in which a designated person is captured are specified from the first image group owned by the first user, and a second composite image different from the first composite image is created using the specified images. The first user can therefore easily create the second composite image.
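  • As an illustration only, the overall flow described above (first composite image acquisition, person specifying, designated person receiving, image specifying, and composite image creation) could be sketched roughly as follows in Python; the data structures, helper names, and the four-images-per-page rule are assumptions made for this sketch, not the implementation of the present invention.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Person:
    person_id: str                      # identity assigned by the person specifying step
    face_vector: List[float] = field(default_factory=list)  # face feature from image analysis

@dataclass
class SourceImage:
    path: str
    persons: List[Person] = field(default_factory=list)     # persons detected in the image

def create_second_composite(first_composite: List[SourceImage],
                            first_image_group: List[SourceImage],
                            designated_ids: List[str],
                            images_per_page: int = 4) -> List[List[SourceImage]]:
    """Sketch of the claimed flow: specify persons in the first composite image,
    accept designated persons, specify images showing them, and build pages."""
    # Person specifying: persons shown in the first composite image.
    persons_in_composite = {p.person_id for img in first_composite for p in img.persons}
    # Designated person receiving: keep only designations that actually appear.
    designated = persons_in_composite.intersection(designated_ids)
    # Image specifying: images from the first image group showing a designated person.
    specific_images = [img for img in first_image_group
                       if any(p.person_id in designated for p in img.persons)]
    # Composite image creation: naively split the specified images into pages.
    return [specific_images[i:i + images_per_page]
            for i in range(0, len(specific_images), images_per_page)]
```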
  • FIG. 8 is a block diagram of an embodiment illustrating a configuration of a group creation unit illustrated in FIG. 7.
  • FIG. 10 is a block diagram of an embodiment showing the configuration of the user's terminal device shown in FIG. 1.
  • FIG. 11 is a flowchart of an embodiment illustrating an operation of the image processing system illustrated in FIG. 1.
  • FIG. 12 is a flowchart of an embodiment illustrating the operation of the image processing system following FIG. 11.
  • FIG. 13 is a flowchart of an embodiment illustrating the operation of the image processing system following FIG. 12.
  • FIG. 14 is a flowchart of an embodiment illustrating the operation of the image processing system following FIG. 13.
  • It is a conceptual diagram of an example showing a screen for setting the budget of a composite image.
  • It is a conceptual diagram of an example showing a screen informing that a postcard is created by automatically selecting images from the first image group.
  • It is a conceptual diagram of an example showing a screen for confirming the second composite image.
  • It is a conceptual diagram of an example showing a screen for inputting the address of the person to whom the second composite image is sent.
  • FIG. 1 is a block diagram of an embodiment showing a configuration of an image processing system 10 according to the present invention.
  • The image processing system 10 shown in the figure uses a plurality of images, acquired via a network 16 from the terminal devices 14 of a plurality of users involved in the creation of a composite image, to create a composite image such as a photo book including a message page.
  • the image processing system 10 includes a server 12 and a plurality of user terminal devices (clients) 14 connected to the server 12 via a network 16.
  • The server 12 performs various kinds of data processing for creating a composite image based on instructions from the terminal devices 14, and is configured by, for example, a desktop PC (Personal Computer) or a workstation.
  • the terminal device 14 gives various instructions to the server 12 to perform various data processing, and is configured by, for example, a smartphone, a tablet PC, or a notebook PC.
  • the network 16 is, for example, a telephone line or an Internet line, and connects the server 12 and the terminal device 14 to each other by wire or wireless to enable bidirectional communication.
  • FIG. 2 is a block diagram of an embodiment showing the configuration of the server 12 shown in FIG.
  • The server 12 shown in the figure includes an information setting unit 18 that sets various types of information related to the composite image, an information management unit 20 that manages various types of information related to creation of the composite image, a data acquisition unit 22 that acquires various types of data used in the composite image, a data analysis unit 24 that analyzes the acquired data, and a composite image creation unit 26 that creates the composite image.
  • In the following description, the one user who organizes the creation of a composite image is referred to as the secretary user, and the two or more users who take part, including the secretary user, are referred to as participating users. This is the case, for example, when the secretary is one of a group of friends. However, in another embodiment, such as when a photo shop undertakes the role of secretary on a customer's behalf, the secretary user is not counted among the participating users if the secretary does not provide any photos or messages.
  • FIG. 3 is a block diagram of an embodiment showing the configuration of the information setting unit 18 shown in FIG.
  • The information setting unit 18 shown in the figure includes a budget setting unit 28, a product acquisition unit 30, a cover design setting unit 32, a message page design setting unit 34, a schedule setting unit 36, and an importance level information acquisition unit 72.
  • the budget setting unit 28 acquires the budget information of the composite image set by the secretary user from the terminal device 14 of the secretary user via the network 16.
  • The product acquisition unit 30 acquires, from the terminal device 14 of the secretary user via the network 16, the one image product (image product information) set by the secretary user from among one or more image products having a size and number of pages corresponding to the budget information acquired by the budget setting unit 28.
  • An image product is merchandise created using images, and includes, for example, a photo album such as a photo book, a shuffle print, a calendar with images, and the like.
  • the images are mainly photographs.
  • the image merchandise includes a plurality of types of image merchandise each having a different size and / or number of pages.
  • The number of pages of an image product is the number of pages including the main pages and the message pages.
  • Image merchandise includes image merchandise for paper media and image merchandise for electronic data.
  • A page in the present invention refers to a unit for arranging images and written messages. In the present embodiment, it refers to a two-page spread, but in other embodiments it may be a single-sided page. Further, within the image processing system 10 of the present embodiment, the number of pages of an image product may be expressed in units of spread pages or in units of single-sided pages when displayed to the user.
  • a photo book is, for example, a composite image in which a plurality of images selected by a user are arranged on a plurality of pages in a layout desired by the user.
  • The photo book may also be a composite image in which images automatically selected from among images held by the user for a desired period (for example, one year) are arranged on a plurality of pages by automatic layout (for example, the Year Album of FUJIFILM Corporation).
  • the shuffle print is a composite image in which a plurality of images are shuffled and arranged on one print.
  • the calendar with image is, for example, a composite image in which images corresponding to the calendars of each month are arranged.
  • the photobook may be a paper medium or electronic data.
  • In the case of the present embodiment, the composite image is a photo book that includes a message page at the end.
  • The message page is a page on which the written messages of two or more participating users, acquired from the terminal devices 14 of the participating users, are arranged.
  • A written message is a message from a participating user that is used on the message page.
  • the profile image may be arranged on the message page.
  • the profile image is, for example, a face image of a participating user, but may be other images.
  • The cover design setting unit 32 acquires, from the terminal device 14 of the secretary user via the network 16, information on the design of the one cover page set by the secretary user from among one or more cover page designs.
  • The cover page design information includes, for example, design information such as the pattern of the cover page and illustrations drawn on the cover page, information on the title of the composite image written on the cover page, and information on the color of the cover page.
  • The message page design setting unit 34 acquires, from the terminal device 14 of the secretary user via the network 16, information on the design of the one message page set by the secretary user from among one or more message page designs.
  • The message page design information includes, for example, information on a template in which the positions and sizes of the participating users' profile images and written messages on the message page are set in advance.
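  • As a purely illustrative sketch, such a template could be represented as a simple record in which each slot's position and size on the message page are fixed in advance; the field names and millimetre units below are assumptions, not part of the invention.

```python
# Hypothetical message page template: slot positions and sizes are predetermined.
MESSAGE_PAGE_TEMPLATE = {
    "design_id": "design-03",          # assumed identifier of the chosen design
    "slots": [
        # one profile-image slot and one written-message slot per participant
        {"kind": "profile_image",   "x_mm": 10, "y_mm": 12, "width_mm": 20, "height_mm": 20},
        {"kind": "written_message", "x_mm": 34, "y_mm": 12, "width_mm": 60, "height_mm": 20},
    ],
}
```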
  • The schedule setting unit 36 acquires, from the terminal device 14 of the secretary user via the network 16, schedule information set by the secretary user, including the submission deadline for images and written messages, the creation period of the composite image, and the delivery date of the image product.
  • The submission deadline for images and written messages represents the deadline by which the participating users can upload (post) images and messages, that is, the deadline by which the image processing system 10 can acquire images and messages from the terminal devices 14 of the participating users.
  • the composite image creation period represents a period for the secretary user to create a composite image using a plurality of images acquired from the terminal devices 14 of a plurality of users, in other words, a time limit for ordering an image product.
  • the delivery date of the image product represents the date on which the image product is delivered.
  • The importance level information acquisition unit 72 acquires, from the terminal device 14 of the recipient of the composite image (the first user of the present invention) via the network 16, information on the importance levels of the designated persons set by the recipient.
  • the designated person is one or more persons designated by the recipient from among a plurality of people shown in the first composite image owned by the recipient.
  • the first composite image is a photo book owned by the recipient.
  • a person other than the recipient may designate one or more persons from a plurality of persons shown in the first composite image.
  • A second composite image different from the first composite image is created using images in the first image group owned by the recipient that show a designated person, and the created second composite image is sent, for example, to the designated person.
  • Recipient can set the importance of at least one designated person among one or more designated persons.
  • The importance level information acquisition unit 72 sets the importance levels of all designated persons to the same lowest level as an initial setting, but the recipient can change this initial setting and set the importance levels.
  • In this case, the importance level of a designated person for whom the recipient has made a setting becomes the level set by the recipient, and the importance level of a designated person for whom the recipient has made no setting remains at the lowest level, which is the default setting.
  • The importance level information may be set in advance, before the recipient receives the first composite image, or may be set after the recipient receives the first composite image, for example based on the analysis results of the first composite image.
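  • A minimal sketch of how such importance levels could be used is given below, assuming each candidate image carries the list of persons it shows (as in the earlier sketch) and that a larger number means a higher importance level; the scoring rule is an assumption for illustration, not the method of the invention.

```python
def prioritize_by_importance(candidate_images, importance_by_person):
    """Order candidate images so that images showing higher-importance designated
    persons come first; unknown persons fall back to importance 0."""
    def score(img):
        return max((importance_by_person.get(p.person_id, 0) for p in img.persons),
                   default=0)
    return sorted(candidate_images, key=score, reverse=True)

# Example: the recipient rates the designated person "hanako" above "taro".
# prioritize_by_importance(images, {"hanako": 3, "taro": 1})
```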
  • FIG. 4 is a block diagram of an embodiment showing the configuration of the information management unit 20 shown in FIG.
  • the information management unit 20 shown in the figure includes a setting storage unit 38, an account information storage unit 40, a management screen information transmission unit 42, and an upload prompting unit 44.
  • the setting storage unit 38 stores the image product acquired by the product acquisition unit 30 and the schedule information acquired by the schedule setting unit 36.
  • the account information storage unit 40 acquires the account information of the secretary user set by the secretary user from the terminal device 14 of the secretary user via the network 16. Further, the account information storage unit 40 acquires and stores the participating user's account information set by the participating user from the participating user's terminal device 14 via the network 16.
  • the account information of the secretary user is the email address and secretary password of the secretary user.
  • the account information of the participating user is the name and individual password of the participating user (displayed as “secret word” as shown in FIG. 30 for the participating user).
  • the name of the participating user is used for the managing user to manage the participating user, and the individual password is used for the image processing system 10 to identify the participating user. In the case of this embodiment, it is assumed that the secret word cannot be changed or reissued later.
  • The management screen information sending unit 42 sends to the secretary user, via the network 16, a message including a URL (Uniform Resource Locator) for accessing the management screen on which the secretary user manages the image product and the schedule.
  • On the management screen, the secretary user can, for example, change the secretary user or add a secretary user (for example, add a deputy secretary described later).
  • the management screen information sending unit 42 sends a message including a URL for accessing the management screen by e-mail, for example, to the email address of the secretary user acquired by the account information storage unit 40.
  • the upload prompting unit 44 sends the invitation created by the secretary user to the terminal device 14 of the participating user via the network 16.
  • The invitation is prompting information that invites the participating users to take part in creating a composite image and prompts them to upload images to be used in the composite image, evaluation information for each image, a profile image, a written message, and the like.
  • the upload prompting unit 44 sends the invitation to the terminal device 14 of the participating user by, for example, an SNS (Social Networking Service) message or an e-mail.
  • FIG. 5 is a block diagram of an embodiment showing the configuration of the data acquisition unit 22 shown in FIG.
  • the data acquisition unit 22 shown in the figure includes an image acquisition unit 46, an evaluation information acquisition unit 48, a message acquisition unit 50, a comment acquisition unit 74, a moving image acquisition unit 76, and a first composite image acquisition unit 98.
  • The image acquisition unit 46 acquires a plurality of images (an image group) transmitted from the terminal devices 14 of the participating users via the network 16 during the period from the time the invitation is sent until the submission deadline for images and written messages set by the schedule setting unit 36.
  • the image acquisition part 46 acquires the participating user's profile image set by each participating user.
  • After acquiring a plurality of images (an image group) transmitted from a participating user's terminal device 14 via the network 16, the image acquisition unit 46 saves the images in association with the participating user who transmitted them. In addition, the image acquisition unit 46 saves each profile image transmitted from a participating user's terminal device 14 via the network 16 in association with the participating user who transmitted it.
  • The evaluation information acquisition unit 48 acquires, from the terminal devices 14 of the two or more participating users via the network 16, evaluation information representing the evaluation given to each image by the two or more participating users during the same period.
  • the image evaluation information is information representing the evaluation of each participating user for each image, for example, high evaluation or low evaluation.
  • The message acquisition unit 50 acquires the written messages uploaded by each participating user during the same period from the terminal device 14 of each participating user via the network 16.
  • The message acquisition unit 50 saves each written message transmitted from a participating user's terminal device 14 via the network 16 in association with the participating user who transmitted it.
  • the comment acquisition unit 74 acquires the recipient's comment on the designated person from the recipient's terminal device 14 via the network 16.
  • the comment is placed in the image placement area in combination with the image used in the second composite image, for example.
  • the moving image acquisition unit 76 acquires one or more moving images associated with each of the plurality of images used in the second composite image from the recipient terminal device 14 via the network 16.
  • the content of the moving image is not particularly limited.
  • a moving image in which a designated person is shown, a moving image related to the designated person, or the like can be used.
  • the first composite image acquisition unit 98 acquires the first composite image owned by the recipient from the recipient's terminal device 14 via the network 16.
  • When the first composite image exists as electronic data, the first composite image acquisition unit 98 acquires that electronic data. When the first composite image is carried on an object such as paper, as in a physical photo book, the first composite image acquisition unit 98 acquires electronic data generated by scanning the first composite image carried on the object with a scanner or the like.
  • the first composite image is, for example, a composite image created by the image processing system 10 and sent to the recipient, and has identification information that identifies the first composite image from other images.
  • the identification information is not particularly limited as long as each composite image can be uniquely identified. For example, a barcode, an identification number, and the like described on the composite image can be exemplified.
  • That a user such as the recipient owns an image means that the user can take the image into a recording medium, analyze it, and use it for image composition.
  • the identification information acquisition unit 78 acquires the identification information included in the first composite image acquired by the first composite image acquisition unit 98.
  • The face image holding unit 100 acquires a plurality of face images, including the face images of the plurality of participating users, from the terminal devices 14 of the plurality of participating users via the network 16, and holds the acquired face images.
  • Based on the identification information acquired by the identification information acquisition unit 78, the face image specifying unit 80 identifies, from among the plurality of face images held in the face image holding unit 100, the face images of the plurality of participating users who were involved in creating the first composite image.
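  • The chain from identification information to face images could be sketched as a simple lookup, assuming a registry keyed by the identifier (for example, a barcode value) that was attached to the composite image when it was created; the registry structure and names below are assumptions for illustration only.

```python
# Hypothetical registry: identification information of a composite image ->
# face images of the participating users involved in creating it.
FACE_IMAGE_REGISTRY = {
    "BOOK-2017-000123": {"Taro": "faces/taro.jpg", "Hanako": "faces/hanako.jpg"},
}

def specify_face_images(identification_info: str) -> dict:
    """Face image specifying step: return the registered face images, if any,
    for the composite image identified by the acquired identification information."""
    return FACE_IMAGE_REGISTRY.get(identification_info, {})
```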
  • the first image group holding unit 102 holds the first image group owned by the recipient, transmitted from the recipient terminal device 14 via the network 16.
  • FIG. 6 is a block diagram of an embodiment showing a configuration of the data analysis unit 24 shown in FIG.
  • The data analysis unit 24 shown in the figure includes an image number calculation unit 52, an evaluation number calculation unit 54, a message number calculation unit 56, an image analysis unit 58, an evaluation value calculation unit 60, a person specifying unit 82, a designated person receiving unit 84, and an image specifying unit 86.
  • The image number calculation unit 52 calculates the number of images acquired by the image acquisition unit 46 after a certain period has passed since the invitation was sent by the upload prompting unit 44, that is, after the submission deadline for images and written messages has passed.
  • Similarly, the evaluation number calculation unit 54 calculates the number of pieces of evaluation information representing high and low evaluations acquired by the evaluation information acquisition unit 48 after the submission deadline for images and written messages has passed.
  • The message number calculation unit 56 calculates the number of written messages acquired by the message acquisition unit 50 after the submission deadline for images and written messages has passed.
  • the image analysis unit 58 analyzes the contents of each image acquired by the image acquisition unit 46.
  • the image analysis unit 58 performs image analysis every time an image is acquired by the image acquisition unit 46.
  • the image analysis unit 58 analyzes the content of the first composite image acquired by the first composite image acquisition unit 98, that is, the content of each image used in the first composite image.
  • The image analysis unit 58 analyzes, for example, the brightness and color of the image and the degree of blur and camera shake, and, when the image includes a human face, the size, position, and orientation of the face, the skin color of the face, facial expressions such as smiling, the state of the eyes, the number of persons included in the image, and the positional relationship among the persons.
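  • A toy version of such an analysis is sketched below using Pillow for the brightness and colour statistics only; the face-related attributes listed above would require a face detector, which is left as a stub here, and none of this is meant as the analysis method of the present invention.

```python
from PIL import Image, ImageStat  # Pillow, used only for simple pixel statistics

def analyze_image(path: str) -> dict:
    """Return a few of the attributes mentioned above for one image."""
    img = Image.open(path).convert("RGB")
    r, g, b = ImageStat.Stat(img).mean            # mean value of each colour channel
    return {
        "brightness": (r + g + b) / 3.0,           # rough overall brightness
        "mean_color": (round(r), round(g), round(b)),
        "faces": [],                               # stub: a detector would report size,
                                                   # position, orientation, expression, etc.
    }
```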
  • The evaluation value calculation unit 60 calculates an analysis evaluation value for each image based on the analysis result of the image by the image analysis unit 58. Further, the evaluation value calculation unit 60 calculates a comprehensive evaluation value for each image by adding to, subtracting from, or weighting the analysis evaluation value of the image based on the evaluation information representing high and low evaluations for the image acquired by the evaluation information acquisition unit 48.
  • For example, the evaluation value calculation unit 60 can calculate the comprehensive evaluation value of an image by adding a value to the analysis evaluation value of the image based on the number of pieces of evaluation information representing high evaluations calculated by the evaluation number calculation unit 54, and subtracting a value from the analysis evaluation value of the image based on the number of pieces of evaluation information representing low evaluations.
  • Since the analysis evaluation value of an image is calculated based on the image analysis result, it serves as a reference for determining whether the image is good or bad. It can be said that the higher the analysis evaluation value, the better the image quality.
  • The comprehensive evaluation value of an image is calculated based on the evaluation information representing the high and low evaluations given by the participating users, in addition to the analysis result of the image. It can therefore be said that the higher the comprehensive evaluation value, the better the image quality and/or the more the image matches the participating users' preferences.
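  • One scoring rule that is consistent with this description (though not necessarily the one actually used) is a simple linear combination: start from the analysis evaluation value, add a weighted count of high evaluations, and subtract a weighted count of low evaluations, as in the sketch below.

```python
def comprehensive_evaluation_value(analysis_value: float,
                                   num_high: int,
                                   num_low: int,
                                   weight: float = 1.0) -> float:
    """Assumed linear rule: analysis value plus likes minus dislikes."""
    return analysis_value + weight * num_high - weight * num_low

# Example: an image with analysis value 72.5, three high evaluations and one low evaluation.
# comprehensive_evaluation_value(72.5, 3, 1)  # -> 74.5
```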
  • The person specifying unit 82 specifies a plurality of persons (specific persons) shown in the first composite image based on the analysis result of the first composite image by the image analysis unit 58. In addition, the person specifying unit 82 identifies each of a plurality of users corresponding to each of the plurality of persons shown in the first composite image based on the face images of the plurality of participating users identified by the face image specifying unit 80.
  • The designated person receiving unit 84 accepts the designation of one or more persons as designated persons, made for example by the recipient of the first composite image, from among the plurality of persons shown in the first composite image identified by the person specifying unit 82.
  • In other words, the designated person receiving unit 84 accepts the designation of one or more persons as designated persons from among the plurality of participating users involved in the creation of the first composite image whose face images have been identified by the face image specifying unit 80.
  • The image specifying unit 86 specifies, from the first image group held in the first image group holding unit 102, images (specific images) in which a designated person accepted by the designated person receiving unit 84 is captured.
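  • As an illustrative sketch only, and reusing the image/person structure of the earlier sketch, the image specifying step could compare the face features of the persons detected in each image of the first image group against the designated person's face feature; the Euclidean-distance comparison and the threshold below are assumptions, not the method of the invention.

```python
def specify_images_of_designated_person(first_image_group,
                                        designated_face_vector,
                                        threshold: float = 0.6):
    """Keep images whose detected faces are close enough to the designated person."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return [img for img in first_image_group
            if any(distance(p.face_vector, designated_face_vector) < threshold
                   for p in img.persons)]
```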
  • FIG. 7 is a block diagram of an embodiment showing the configuration of the composite image creation unit 26 shown in FIG.
  • The composite image creation unit 26 shown in the figure includes a cover creation unit 62, a main part creation unit 64, and a message page creation unit 66.
  • the cover creation unit 62 creates a cover page with a design corresponding to the product information stored in the setting storage unit 38 and the cover page design information acquired by the cover design setting unit 32.
  • The main part creation unit 64 uses the plurality of images acquired by the image acquisition unit 46 to create main pages (pages other than the cover page and the message page) with the number of pages corresponding to the product information stored in the setting storage unit 38.
  • the main part creation unit 64 creates a composite image corresponding to the product information stored in the setting storage unit 38, in the case of this embodiment, a main page of a photo book.
  • creating a composite image means creating one image by arranging two or more images.
  • the image used in creating the composite image includes a part image such as a background image, a character image, and a stamp in addition to a photograph.
  • the main part creation unit 64 includes an image dividing unit 88, an image extracting unit 90, an image arranging unit 92, and the like.
  • the image dividing unit 88 divides the plurality of images acquired by the image acquiring unit 46 into a number of groups corresponding to the number of main pages.
  • the image extraction unit 90 extracts, for each group of images, a plurality of compositing target images used in the main page from images included in the group based on the comprehensive evaluation value of the images.
  • For each group of images, the image placement unit 92 determines the size of each compositing target image extracted by the image extraction unit 90 and its placement position in the main page based on the comprehensive evaluation values of the images, and arranges the compositing target images on the main page corresponding to the image group (automatic layout).
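  • A rough sketch of the dividing, extracting, and arranging steps is shown below; the contiguous (shooting-order) grouping, the per-page image count, and the use of (image, comprehensive evaluation value) pairs are all assumptions made for illustration rather than the automatic layout of the invention.

```python
import math

def divide_extract_arrange(scored_images, num_main_pages: int, per_page: int = 4):
    """scored_images: list of (image, comprehensive_evaluation_value) pairs,
    assumed to be ordered by shooting time. Returns one list of images per page."""
    if not scored_images or num_main_pages <= 0:
        return []
    # Image dividing: split into as many contiguous groups as there are main pages.
    chunk = math.ceil(len(scored_images) / num_main_pages)
    groups = [scored_images[i:i + chunk] for i in range(0, len(scored_images), chunk)]
    pages = []
    for group in groups:
        # Image extracting: keep the highest-scoring images of the group.
        best = sorted(group, key=lambda pair: pair[1], reverse=True)[:per_page]
        # Image arranging: here simply listed; a real layout would set size and position.
        pages.append([image for image, _score in best])
    return pages
```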
  • The message page creation unit 66 creates the message page using the participating users' profile images and the written messages acquired by the message acquisition unit 50.
  • That is, the message page creation unit 66 creates a message page with a design corresponding to the product information stored in the setting storage unit 38 and the message page design information acquired by the message page design setting unit 34.
  • The message page creation unit 66 includes a message dividing unit 94, a message arrangement unit 96, and the like.
  • The message dividing unit 94 divides the written messages acquired by the message acquisition unit 50 into a number of groups corresponding to the number of message pages.
  • For each group of written messages, the message arrangement unit 96 arranges the written messages included in the group on the message page corresponding to that group.
  • The number of message pages is set according to the number of participating users, the number of written messages, and the like. Further, the number of main pages is set according to the number of pages of the composite image, the number of message pages, and the like. In the case of the present embodiment, the number of participating users is 2 to 36, 2 to 12 written messages are arranged on one message page, and a 16-page photo book is created.
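  • As a worked example only, if a message page holds at most 12 written messages and the remaining pages of the 16-page photo book are used as main pages (an assumption about the split, not a statement of the invention), the page counts could be derived as follows.

```python
import math

total_pages = 16           # pages of the photo book in this embodiment
participants = 30          # number of participating users (2 to 36)
messages_per_page = 12     # at most 12 written messages on one message page

message_pages = math.ceil(participants / messages_per_page)   # ceil(30 / 12) = 3
main_pages = total_pages - message_pages                       # 16 - 3 = 13
print(message_pages, main_pages)                               # -> 3 13
```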
  • Each part constituting the information setting unit 18, the information management unit 20, the data acquisition unit 22, the data analysis unit 24, and the composite image creation unit 26 is realized, for example, by a control device such as a CPU (Central Processing Unit) executing a program loaded into memory.
  • the data stored by each part is stored in a storage device such as an HDD (Hard Disk Drive), SSD (Solid State Drive), or SD (Secure Digital) memory.
  • FIG. 10 is a block diagram of an embodiment showing the configuration of the user terminal device 14 shown in FIG. 1.
  • the user terminal device 14 shown in the figure includes an image display unit 68 and an instruction input unit 70.
  • the image display unit 68 displays various setting screens, selection screens, confirmation screens, input screens, creation screens, and the like, and is configured by a display device such as a liquid crystal display.
  • the instruction input unit 70 acquires various setting instructions, selection instructions, confirmation instructions, input instructions, creation instructions, and the like input by the user, and includes, for example, an input device such as a keyboard and a mouse.
  • the image display unit 68 and the instruction input unit 70 are configured by a device in which a display device and an input device are integrated, such as a touch panel.
  • The terminal device 14 does not need to correspond one-to-one to each user involved in the creation of the composite image; as long as each user account in the image processing system 10 can be identified, a plurality of terminal devices may correspond to one user.
  • The server 12 and the terminal device 14 each include, for example, a transmission/reception unit, which is a communication device that transmits and receives various data between the server 12 and the terminal device 14, and a control unit such as a CPU that controls the operation of each part.
  • When creating a composite image, the secretary user first accesses the website for creating a composite image provided by the image processing system 10 via the instruction input unit 70 of the secretary user's terminal device 14.
  • The budget setting unit 28 displays a screen for setting the budget of the composite image on the image display unit 68 of the secretary user's terminal device 14.
  • the secretary user sets the budget for the composite image created by the secretary user via the instruction input unit 70 on the screen for setting the budget for the composite image.
  • a list of composite image budgets is registered in advance by a pull-down menu.
  • the secretary user selects and sets one budget, for example, 3000 to 4000 yen, from the list of composite image budgets registered in the pull-down menu.
  • the budget setting unit 28 acquires information on the budget for the composite image set by the secretary user from the terminal device 14 of the secretary user (step S1).
  • the product acquisition unit 30 presents one or more image products corresponding to the budget information.
  • the merchandise acquisition unit 30 presents five photo books having different sizes and the number of pages as image merchandise.
  • Subsequently, the image display unit 68 of the secretary user's terminal device 14 displays a screen for setting one image product from among the one or more image products presented by the product acquisition unit 30.
  • the secretary user selects and sets one image product from the one or more presented image products via the instruction input unit 70 on the screen for setting the image product.
  • a 16-page photo book of A4 size is set.
  • After the photo book is set as the image product and the "Next" button is pressed, for example tapped or clicked, the product acquisition unit 30 acquires the one image product set by the secretary user (step S2).
  • Subsequently, the cover design setting unit 32 displays a screen for setting the design of the cover page of the photo book on the image display unit 68 of the secretary user's terminal device 14.
  • the secretary user selects and sets one or more cover page designs via the instruction input unit 70 on the cover page design setting screen.
  • the secretary user selects and sets one cover page design from among the three cover page designs.
  • the secretary user can set, for example, the title of the photobook up to 20 characters written on the cover page and the color of the cover page as information on the design of the cover page of the photo book.
  • The cover design setting unit 32 acquires information on the cover page design set by the secretary user from the terminal device 14 of the secretary user (step S3).
  • Subsequently, the message page design setting unit 34 displays a screen for setting the design of the message page of the photo book on the image display unit 68 of the secretary user's terminal device 14.
  • On the screen for setting the message page design, the secretary user selects and sets one design from among one or more message page designs via the instruction input unit 70.
  • one of the nine designs is selected and set.
  • The message page design setting unit 34 acquires information on the message page design set by the secretary user from the terminal device 14 of the secretary user (step S4).
  • Subsequently, the schedule setting unit 36 displays a screen for setting the submission deadline for images and written messages on the image display unit 68 of the secretary user's terminal device 14.
  • On this screen, the secretary user sets the submission deadline for images and written messages via the instruction input unit 70.
  • a list of dates within a certain period from the current date is registered in advance by a pull-down menu.
  • the secretary user selects and sets one date, for example, December 2, as the application deadline date from the list of dates registered in the pull-down menu.
  • The schedule setting unit 36 acquires information on the submission deadline for images and written messages set by the secretary user from the terminal device 14 of the secretary user (step S5).
  • the schedule setting unit 36 displays a screen for setting a composite image creation period on the image display unit 68 of the secretary user's terminal device 14.
  • the secretary user sets the composite image creation period via the instruction input unit 70 on the screen for setting the composite image creation period.
  • a list of dates within 30 days from the deadline for the period for recruiting images and written messages is registered in advance by a pull-down menu.
  • The secretary user selects one date, for example December 4, from the list of dates registered in the pull-down menu, and thereby sets December 2 to 4 as the creation period.
  • The schedule setting unit 36 acquires information on the composite image creation period set by the secretary user from the terminal device 14 of the secretary user (step S5).
  • Subsequently, the schedule setting unit 36 displays a screen for setting the delivery date of the image product on the image display unit 68 of the secretary user's terminal device 14.
  • the delivery date of the image product is automatically set by the schedule setting unit 36 to a date after a certain period from the deadline date of the composite image creation period.
  • the setting is automatically set on December 20, 16 days after December 4, which is the last day of the composite image creation period.
  • the secretary user can change the delivery date of the image product via the instruction input unit 70 on the screen for setting the delivery date of the image product.
  • the delivery date of the image merchandise can be set to a date before December 20, for example, by paying an additional fee and specifying express finishing.
  • the schedule setting unit 36 acquires the delivery date information of the image product (step S5).
  • Subsequently, the setting storage unit 38 displays a screen for confirming the image product and schedule information on the image display unit 68 of the secretary user's terminal device 14.
  • On this screen, the secretary user confirms the image product and schedule information; if the settings are acceptable, the secretary user presses the "Save/Login" button to proceed to the next screen, and to change a setting, presses the back button to return to the previous screen.
  • the setting storage unit 38 displays a screen for inputting account information for the secretary user to log in to the image processing system 10.
  • the secretary user inputs the email address and secretary password of the secretary user via the instruction input unit 70 as account information for logging in to the image processing system 10. If the secretary user has already registered as a member in the image processing system 10, the “login” button is pressed. If the secretary password is forgotten after member registration, a process for recovering the secretary password is performed by pressing the “forget password” button.
  • In the case of an already registered member, the account information storage unit 40 compares the account information already stored with the account information input by the secretary user. If the two match, the secretary user can log in to the image processing system 10.
  • If the secretary user has not yet registered as a member, a process for new member registration is performed in the image processing system 10 by pressing the "new member registration" button.
  • the account information storage unit 40 acquires the e-mail address and secretary password input by the secretary user, and stores them as account information of the secretary user.
  • the secretary user's terminal device 14 can log in to the image processing system 10 after completing the membership registration.
  • the setting storage unit 38 stores the image product acquired by the product acquisition unit 30 and the schedule information acquired by the schedule setting unit 36.
  • Subsequently, the setting storage unit 38 displays, on the image display unit 68 of the secretary user's terminal device 14, a screen indicating that the login of the secretary user and the storage of the image product information and the schedule information have been completed.
  • the management screen information sending unit 42 sends a message including a URL for accessing the management screen.
  • This message also includes a notification indicating that the image merchandise and schedule information have been saved.
  • By inputting, via the instruction input unit 70, the e-mail addresses of participating users other than the secretary user in addition to that of the secretary user, the management screen information sending unit 42 can also send a message including the URL for accessing the management screen to a participating user other than the secretary user, for example to a deputy secretary user acting on behalf of the secretary user.
  • Subsequently, the upload prompting unit 44 displays a screen for creating the invitation to be sent to the terminal devices 14 of the participating users on the image display unit 68 of the secretary user's terminal device 14.
  • On the screen for creating the invitation, the secretary user inputs a message to be included in the invitation via the instruction input unit 70, within a certain number of characters (within 150 characters in the example shown in the figure).
  • an initial message “Let's give a photo book with everyone's photos and messages!” Is automatically input.
  • the secretary user may use the initial message as it is, or may input another message.
  • the upload prompting unit 44 acquires message information to be included in the invitation and creates an invitation (step S6).
  • Subsequently, the upload prompting unit 44 displays a screen for confirming the contents of the invitation on the image display unit 68 of the secretary user's terminal device 14.
  • On this screen, the secretary user confirms the contents of the invitation; if the contents are acceptable, the secretary user presses the button for the invitation sending screen to proceed to the next screen, and to change the contents, presses the back button to return to the previous screen.
  • Subsequently, the upload prompting unit 44 displays a screen for sending the invitation to the terminal devices 14 of the participating users on the image display unit 68 of the secretary user's terminal device 14.
  • the secretary user selects a means for sending the invitation via the instruction input unit 70 on the screen for sending the invitation.
  • the secretary user selects an SNS message or e-mail as means for sending an invitation.
  • The invitation is sent to the participating user's SNS account as an SNS message when the "SNS" button is pressed, and is sent to the participating user's e-mail address by e-mail when the "mail" button is pressed.
  • the secretary user may send invitations to all participating users involved in the creation of the composite image, or may send invitations only to some participating users.
  • The SNS message or e-mail includes the invitation, that is, a URL for accessing the screens for uploading images to be used in the composite image, image evaluation information, written messages, and the like, together with a common password; in the example shown in the figure, the common password "5865" is included.
  • the upload prompting unit 44 sends an invitation to the terminal device 14 of the participating user (step S7).
  • the participating user receives the invitation at the terminal device 14 of the participating user and accesses the website indicated by the invitation URL via the instruction input unit 70.
  • The upload prompting unit 44 displays a screen representing the invitation received by the participating user on the image display unit 68 of the participating user's terminal device 14 (step S8).
  • The participating users browse the screen showing the invitation and understand that the request from the secretary user is to upload images to be used in the photo book and written messages, and that the submission deadline is December 2.
  • Subsequently, the account information storage unit 40 displays, on the image display unit 68 of the participating user's terminal device 14, a screen for inputting the common password for accessing the screen on which the participating user uploads images to be used in the composite image, image evaluation information, written messages, and the like.
  • the participating user inputs the common password included in the received invitation, “5865” in the example of FIG. 28, via the instruction input unit 70 on the screen for inputting the common password.
  • Subsequently, the account information storage unit 40 displays a screen for registering a new participating user on the image display unit 68 of the participating user's terminal device 14.
  • The names of users who have already been registered as participating users are displayed on the screen for registering a new participating user. Displaying the names of registered participating users is not essential; however, by displaying them, a user who newly registers as a participating user can register with confidence if a name he or she knows appears among the registered participating users.
  • the participating user presses the “new registration” button if he / she has not yet registered as a participating user in the image processing system 10, and presses the “content modification” button if he / she has already been registered as a participating user.
  • When the "new registration" button is pressed, the account information storage unit 40 displays, on the image display unit 68 of the participating user's terminal device 14, a screen for registering the name and individual password (displayed as "secret word" in FIG. 30) to be registered as a participating user.
  • the participating user inputs a real name or a nickname as a name via the instruction input unit 70, and inputs an arbitrary character string as a secret word.
  • The name of the participating user and the secret word constitute the account information of the participating user. As shown in FIG. 29, by pressing the "modify content" button on the screen for registering a new participating user and inputting the registered name and secret word, the participating user can correct the images, image evaluation information, written messages, and the like that have already been uploaded.
  • the account information storage unit 40 acquires the name and secret password of the participating user and stores them as the account information of the participating user.
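The description above amounts to a small gated-registration flow. The following is a minimal sketch of that flow, not the patented implementation; the class and method names are hypothetical, and storing the secret password in plain text is only for illustration.

```python
# Hypothetical sketch of the participating-user account flow described above:
# the common invitation password gates access, and each participant registers a
# name plus an individual "secret password" used later to modify uploads.
class AccountStore:
    def __init__(self, common_password: str):
        self.common_password = common_password   # e.g. "5865" from the invitation
        self.accounts = {}                        # participant name -> secret password

    def check_invitation(self, entered: str) -> bool:
        return entered == self.common_password

    def register(self, name: str, secret: str) -> None:
        self.accounts[name] = secret              # "new registration"

    def authenticate(self, name: str, secret: str) -> bool:
        return self.accounts.get(name) == secret  # gate for "modify content"

store = AccountStore("5865")
if store.check_invitation("5865"):
    store.register("Hanako", "sakura01")
print(store.authenticate("Hanako", "sakura01"))   # True
```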
  • the image acquisition unit 46 displays a screen for selecting an image to be uploaded by the participating user on the image display unit 68 of the terminal device 14 of the participating user.
  • Participating users can select an image to upload by pressing the “add image” button on the screen for selecting an image to upload.
  • the number of images selected by the participating users is displayed on the screen for selecting an image to be uploaded.
  • the image acquisition unit 46 acquires images uploaded from the participating users, that is, images posted from the terminal devices 14 of the participating users (step S9). As described above, the image acquisition unit 46 acquires a plurality of images transmitted from the terminal devices 14 of two or more participating users. Each time an image is acquired by the image acquisition unit 46, the image analysis unit 58 analyzes the image, and the evaluation value calculation unit 60 calculates an analysis evaluation value of the image based on the analysis result (step S10).
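The patent does not specify how the analysis evaluation value is computed; the sketch below assumes simple quality cues (sharpness, exposure, number of detected faces) scored with OpenCV. The weights and the scoring formula are assumptions, not the actual logic of the image analysis unit 58 or the evaluation value calculation unit 60.

```python
# Hypothetical analysis evaluation value based on sharpness, exposure and face count.
import cv2

_face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def analysis_evaluation_value(path: str) -> float:
    img = cv2.imread(path)
    if img is None:
        return 0.0
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()          # focus measure
    exposure = 1.0 - abs(float(gray.mean()) - 128.0) / 128.0    # 1.0 = mid-tone exposure
    faces = len(_face_cascade.detectMultiScale(gray, 1.1, 5))
    return min(sharpness / 100.0, 5.0) + 3.0 * exposure + 2.0 * min(faces, 3)
```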
  • the evaluation information acquisition unit 48 displays a screen for evaluating each of the plurality of images transmitted from the terminal devices 14 of the two or more participating users on the image display unit 68 of the terminal device 14 of the participating user.
  • Participating users can give evaluation information indicating high or low evaluation to each image via the instruction input unit 70 on the screen for evaluating each image. For example, a participating user browses each image, gives evaluation information indicating high evaluation by pressing the "like" button for an image he or she likes, and gives evaluation information indicating low evaluation by pressing the "Imaichi" ("not so good") button for an image he or she does not like.
  • the evaluation information acquisition unit 48 acquires, from the terminal devices 14 of the participating users, the evaluation information representing the high and low evaluations given to each image (step S11).
  • the message acquisition unit 50 displays, on the image display unit 68 of the terminal device 14 of the participating user, a screen notifying the participating user that a written-message page will be placed at the end of the photobook.
  • Participating users browse and confirm the screen informing them that the written-message page will be at the end of the photobook.
  • the message acquisition unit 50 displays a screen for setting the participating user's profile image to be used on the written-message page on the image display unit 68 of the terminal device 14 of the participating user.
  • the participating user can select and set an image to be used as the profile image, via the instruction input unit 70, from the images held in the terminal device 14 of the participating user.
  • the participating user can delete the already set profile image by pressing the “delete” button and set the profile image again.
  • the image acquisition unit 46 acquires the profile image set by the participating user from the terminal device 14 of the participating user (step S12).
  • the message acquisition unit 50 displays a screen for inputting a written message on the image display unit 68 of the terminal device 14 of the participating user.
  • Participating users input a message of up to 150 characters to be included on the written-message page via the instruction input unit 70 on the screen for inputting the written message.
  • an initial message “Congratulations on marriage ... Please be happy” is automatically input.
  • the participating user may use the initial message as it is or may input another message.
  • the participating user can view the messages that have already been uploaded by other participating users by pressing the "view other people's message" button on the screen for inputting the written message.
  • the message acquisition unit 50 displays a screen for confirming the written message input by the participating user on the image display unit 68 of the terminal device 14 of the participating user.
  • Participating users browse and confirm the message on the confirmation screen; if the message is acceptable, they press the "Post" button to proceed to the next screen, and if they want to change the message, they press the button for returning to the previous screen.
  • the message acquisition unit 50 acquires the message posted by the participating user, that is, the message uploaded from the terminal device 14 of the participating user (step S13).
  • in this way, each participating user can convey his or her feelings to the recipient of the photobook as a message, not only through the main pages of the photobook but also through the written-message page.
  • a screen indicating that the posting of the message has been completed is displayed on the image display unit 68 of the terminal device 14 of the participating user.
  • the image number calculation unit 52 calculates the number of images acquired by the image acquisition unit 46 (step S14). Further, the evaluation number calculation unit 54 calculates the number of pieces of evaluation information indicating high and low evaluations acquired by the evaluation information acquisition unit 48 (step S15), and the message number calculation unit 56 calculates the number of written messages acquired by the message acquisition unit 50 (step S16).
  • the evaluation value calculation unit 60 calculates the overall evaluation value of each image by adding a value to, or subtracting a value from, the analysis evaluation value of the image based on the evaluation information indicating high and low evaluations, for example based on the number of pieces of evaluation information (step S17); a minimal sketch follows.
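As a rough sketch of step S17 (the exact weights are not given in the description, so the +1 per high evaluation and -1 per low evaluation below are assumptions):

```python
# Overall evaluation value = analysis evaluation value adjusted by vote counts.
def overall_evaluation_value(analysis_value: float,
                             num_likes: int,
                             num_dislikes: int,
                             like_weight: float = 1.0,
                             dislike_weight: float = 1.0) -> float:
    return analysis_value + like_weight * num_likes - dislike_weight * num_dislikes

print(overall_evaluation_value(6.5, num_likes=4, num_dislikes=1))  # 9.5
```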
  • it is not essential to analyze each image and calculate its analysis evaluation value every time an image is acquired; the images may instead be analyzed, and their analysis evaluation values calculated, after all the images have been acquired.
  • however, by analyzing each image and calculating its analysis evaluation value as it is acquired, the overall evaluation values of the images can be calculated in a short time once all the images are in, and as a result the time required for creating the composite image can be shortened.
  • the secretary user instructs the composite image creation unit 26 to create a composite image via the instruction input unit 70 in the terminal device 14 of the secretary user.
  • the cover creation unit 62 creates a cover page that has a design corresponding to the cover-page design information, carries the title set by the secretary user, and uses the color set by the secretary user (step S18).
  • the main part creation unit 64 creates a main part page of the photobook using at least one of the plurality of images acquired by the image acquisition unit 46 (step S19).
  • the image dividing unit 88 of the main part creation unit 64 divides the plurality of images acquired by the image acquisition unit 46 into a plurality of groups corresponding to the number of main pages.
  • for each group of images, the image extraction unit 90 extracts, based on the overall evaluation values of the images, a plurality of compositing target images to be used in the main page from the images included in the group, for example in descending order of overall evaluation value.
  • for each group of images, the image placement unit 92 determines the size and placement position of each compositing target image in the main page based on the overall evaluation values of the images, and places the compositing target images on the corresponding main page. For example, among the compositing target images arranged on a main page, the image with the highest overall evaluation value is placed at the center of the page at a larger size than the other compositing target images; a sketch of this grouping and layout follows.
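The following sketch illustrates the grouping, extraction and placement just described. The chronological grouping key, the number of slots per page and the labels for the placement positions are assumptions made for the example; only the general idea (one group per page, best-scored images first, top image in the large centre slot) follows the description.

```python
# Sketch of main-page layout: divide images into one group per page, extract the
# best-scored images of each group, and give the top image the large centre slot.
from typing import Dict, List

def split_into_pages(images: List[Dict], num_pages: int) -> List[List[Dict]]:
    images = sorted(images, key=lambda im: im["shot_at"])   # keep chronology
    size = -(-len(images) // num_pages)                      # ceiling division
    return [images[i:i + size] for i in range(0, len(images), size)]

def layout_page(group: List[Dict], slots: int = 4) -> List[Dict]:
    chosen = sorted(group, key=lambda im: im["score"], reverse=True)[:slots]
    page = []
    for rank, im in enumerate(chosen):
        page.append({"file": im["file"],
                     "slot": "centre-large" if rank == 0 else f"small-{rank}",
                     "score": im["score"]})
    return page

images = [{"file": f"img{i}.jpg", "shot_at": i, "score": s}
          for i, s in enumerate([3.1, 7.4, 5.0, 9.2, 4.4, 6.8, 2.5, 8.1])]
pages = [layout_page(g) for g in split_into_pages(images, num_pages=2)]
print(pages)
```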
  • the written-message page creation unit 66 creates a written-message page with a design corresponding to the written-message page design information, using the profile images and written messages of the participating users (step S20).
  • the message division unit 94 of the written-message page creation unit 66 divides the written messages acquired by the message acquisition unit 50 into a number of groups corresponding to the number of written-message pages.
  • for each group of written messages, the message placement unit 96 creates a written-message page by placing the messages of the group, together with the profile images of the corresponding participating users, on the page corresponding to that group. On each page, the message placement unit 96 places the written messages, for example, in the order in which they were uploaded; a small pagination sketch follows.
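A small sketch of the pagination of written messages (the per-page capacity of six messages is an assumption; the description only fixes the upload order):

```python
# Spread the written messages over the written-message pages in upload order.
def paginate_messages(messages, per_page=6):
    messages = sorted(messages, key=lambda m: m["uploaded_at"])
    return [messages[i:i + per_page] for i in range(0, len(messages), per_page)]

msgs = [{"name": f"guest{i}", "text": "Congratulations!", "uploaded_at": i}
        for i in range(14)]
print([len(page) for page in paginate_messages(msgs)])  # [6, 6, 2]
```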
  • the cover creation unit 62 automatically creates a cover page corresponding to the cover page design information.
  • the main part creation unit 64 automatically creates a main part page using a plurality of images acquired from the terminal devices 14 of two or more participating users.
  • the written-message page creation unit 66 automatically creates a written-message page with a design corresponding to the written-message page design information, using the profile images and written messages of the participating users.
  • in this way, the composite image creation unit 26 automatically creates a photobook including a cover page corresponding to the cover-page design information, main pages, and a written-message page corresponding to the written-message page design information.
  • thus, the image processing system 10 can create a composite image, such as a photobook, that includes not only main pages but also a written-message page.
  • moreover, since the size and placement position of each image on the main page are determined based on the overall evaluation value of the image, it is possible to create a composite image that reflects not only the quality of the images but also the preferences of the plurality of participating users.
  • the secretary user browses each page of the automatically created photobook, including the cover page, the main pages, and the written-message page, on the terminal device 14 of the secretary user.
  • the secretary user may adopt the automatically created photobook as it is, or may edit the contents of each page, for example the images used on each page, their sizes, and the positions of the images and written messages.
  • the secretary user can add a comment, add a stamp image, change the background type and color of each page, and the like.
  • the secretary user completes the editing of the photobook within the composite image creation period set by the schedule setting unit 36, in this embodiment by December 4, and orders an image product with the contents of the created photobook (step S21).
  • the image product ordered by the secretary user includes at least one of a paper photobook and an electronic-data photobook.
  • a photobook of the ordered image product is created and sent to the delivery destination by the delivery date set by the schedule setting unit 36, in this embodiment by December 20 (step S22).
  • in the case of a paper photobook, for example, the paper photobook is sent to the delivery destination.
  • in the case of an electronic-data photobook, for example, the electronic data itself, or a URL for downloading the electronic data, is sent to the mail address of the delivery destination.
  • in this manner, the image processing system 10 can create a composite image, such as a photobook including a written-message page, by using a plurality of images and written messages from the plurality of participating users.
  • when a photobook has been created using compositing target images extracted from a plurality of images owned by a plurality of participating users and sent to a delivery destination, the recipient of the photobook can use the image processing system 10 to create another composite image, different from the received photobook, using images from the first image group owned by the recipient in which persons appearing in the photobook are shown.
  • the recipient creates a second composite image such as a postcard with an image based on the received first composite image such as a photo book. Further, even after the recipient receives the first composite image, it is assumed that the image processing system 10 holds various types of information regarding the first composite image.
  • the first composite image acquisition unit 98 displays a screen informing the recipient that a postcard with an image will be created as the second composite image on the image display unit 68 of the terminal device 14 of the recipient.
  • the recipient presses the “Next” button and designates the photobook received by the recipient as the first composite image via the instruction input unit 70.
  • the first composite image acquisition unit 98 subsequently acquires the first composite image owned by the recipient from the terminal device 14 of the recipient (step S23).
  • the image analysis unit 58 analyzes the content of the first composite image, that is, the content of each image used in the first composite image (step S24).
  • the person specifying unit 82 specifies a plurality of persons shown in the first composite image based on the analysis result of the first composite image (step S25).
  • the designated person receiving unit 84 displays a screen for designating one or more persons as designated persons from among the plurality of persons shown in the first composite image on the image display unit 68 of the terminal device 14 of the recipient.
  • on the screen for designating persons, the recipient designates, via the instruction input unit 70, the persons he or she wants to use in the second composite image from among the plurality of persons shown in the first composite image, that is, one or more persons (designated persons) appearing in the images to be used for creating the second composite image.
  • the recipient checks the face image of the person he wants to use in the second composite image from the plurality of persons shown in the first composite image via the instruction input unit 70 on the screen for designating the person.
  • in this example, the recipient designates two persons, Mr. A and Mr. B, as designated persons from among a plurality of persons including A, B, C, D, and so on.
  • the recipient can also specify the face image of a person who is not currently displayed, for example by swiping the area where the face images are displayed up and down. Further, by pressing the “specify all” button, all persons appearing in the first composite image can be designated, and by pressing the “release all” button, the designation of all persons can be cancelled so that the selection can be started over.
  • the designated person accepting unit 84 accepts designation of one or more persons designated by the recipient (step S26).
  • when the designation of the persons has been accepted, the first image group holding unit 102 displays a screen for designating the first image group owned by the recipient on the image display unit 68 of the terminal device 14 of the recipient.
  • on this screen, a field for designating the folder in which the first image group is stored, a field for designating the shooting date of the first image group, and a field for designating the shooting location of the first image group are displayed.
  • the recipient designates one or more fields to be used for designating the first image group from among the above three fields by setting the check box provided in each field to a check state.
  • the number of fields specifying the first image group may be one or more, and there is no upper limit.
  • when the recipient checks two or more check boxes, an AND (logical product) search is performed on the image group owned by the recipient for images that match all of the specified conditions.
  • an OR (logical sum) search may be performed. Whether the image is AND-searched or OR-searched may be set in advance or may be specified by the recipient.
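A sketch of this search over the recipient's image group; how the folder, shooting date and location are actually stored as metadata is an assumption, and only the AND/OR combination of the checked fields follows the description.

```python
# Filter the recipient's images by the checked fields with AND or OR semantics.
def matches(image, conditions):
    checks = []
    if "folder" in conditions:
        checks.append(image["folder"] == conditions["folder"])
    if "shot_date" in conditions:
        checks.append(image["shot_date"] == conditions["shot_date"])
    if "location" in conditions:
        checks.append(image["location"] == conditions["location"])
    return checks

def search_first_image_group(images, conditions, mode="AND"):
    combine = all if mode == "AND" else any
    return [im for im in images if combine(matches(im, conditions))]

library = [
    {"file": "a.jpg", "folder": "wedding", "shot_date": "2016-11-20", "location": "Tokyo"},
    {"file": "b.jpg", "folder": "travel",  "shot_date": "2016-11-20", "location": "Kyoto"},
]
print(search_first_image_group(library, {"folder": "wedding", "shot_date": "2016-11-20"}))
```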
  • the first image group holding unit 102 holds the first image group designated by the recipient.
  • a screen informing the recipient that the image specifying unit 86 will automatically select images from the first image group designated by the recipient and create a postcard is displayed on the image display unit 68 of the recipient's terminal device 14.
  • the image analysis unit 58 analyzes the contents of each image included in the first image group. Subsequently, the person specifying unit 82 specifies the persons shown in each image included in the first image group based on the analysis results. Subsequently, the image specifying unit 86 specifies, from the first image group held in the first image group holding unit 102, the images in which the designated persons are shown (step S27).
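One plausible way to realise step S27 is a face-matching pass over the first image group, sketched below with the open-source face_recognition package; the patent does not name a specific matching method, so the package, the reference-photo input and the 0.6 tolerance are all assumptions.

```python
# Keep the images from the first image group whose faces match a designated person.
import face_recognition

def find_images_with_designated_person(reference_photo: str,
                                       candidate_paths: list,
                                       tolerance: float = 0.6) -> list:
    ref = face_recognition.load_image_file(reference_photo)
    ref_encodings = face_recognition.face_encodings(ref)
    if not ref_encodings:
        return []
    ref_encoding = ref_encodings[0]
    hits = []
    for path in candidate_paths:
        image = face_recognition.load_image_file(path)
        for enc in face_recognition.face_encodings(image):
            if face_recognition.compare_faces([ref_encoding], enc,
                                              tolerance=tolerance)[0]:
                hits.append(path)
                break
    return hits
```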
  • the composite image creation unit 26 creates a second composite image using an image in which the designated person is shown (step S28).
  • the composite image creation unit 26 displays a screen for confirming the created second composite image on the image display unit 68 of the recipient's terminal device 14.
  • candidates for the second composite image to be sent to Mr. A are displayed.
  • the person on the left in the candidate for the second composite image to be sent to Mr. A is Mr. A, and the two persons on the right are recipients of the photo book.
  • the recipient can display another second composite image candidate to be sent to Mr. A by pressing the “change image” button, and can select another second composite image to be sent to Mr. A.
  • a text message can also be combined into the second composite image.
  • the text message may be a fixed sentence selected by the recipient from a plurality of fixed sentences, or the recipient may be allowed to create a text message freely.
  • the second composite image to be sent to the next person can then be displayed.
  • the composite image creation unit 26 displays a screen for inputting the address of the person to whom the second composite image is sent on the image display unit 68 of the recipient's terminal device 14.
  • the composite image creation unit 26 displays fields for inputting the postal codes and addresses of Mr. A and Mr. B on the screen for inputting addresses.
  • the recipient inputs the postal codes and addresses of Mr. A and Mr. B via the instruction input unit 70.
  • on the screen for entering addresses, the recipient can display the entry fields for another person who is not currently shown, for example by swiping the address entry area up and down, and can then enter that person's postal code and address.
  • by pressing the “Use existing address book (specify file)” button, specifying an address book file owned by the recipient, and then specifying a person stored in the specified address book, the address of that person can be entered automatically. For example, if the recipients of the photobook are a bride and groom, an address book created for sending wedding invitations can be used.
  • the image processing system 10 sends a postcard to both A and B.
  • as described above, a plurality of persons appearing in a first composite image created using a plurality of images owned by a plurality of users are identified, the designation of one or more of the identified persons as designated persons is accepted, images showing the designated persons are identified from the first image group owned by the recipient, and a second composite image different from the first composite image can be created using the identified images.
  • the plurality of persons attending the wedding reception include a plurality of participating users involved in the creation of the wedding reception photobook.
  • therefore, the bride and groom who are the recipients of the photobook can use the image processing system 10 to create, as second composite images, thank-you postcards with images showing each participating user who attended the wedding reception, selected from among the plurality of images owned by the bride and groom, and can send the created postcards or the like to the respective participating users.
  • a plurality of participating users involved in the creation of the first composite image can be specified as the designated person.
  • in this case, it is assumed that the face image holding unit 100 holds a plurality of face images, including the face images of the plurality of participating users acquired from their terminal devices 14 when the first composite image was created.
  • the recipient captures the barcode included in the first composite image by, for example, an image capturing unit of a mobile terminal such as a smartphone owned by the recipient.
  • the recipient's terminal device 14 does not have to be a portable terminal.
  • the barcode included in the first composite image can also be captured from a terminal device 14 such as a desktop PC or a notebook PC by an image capturing unit such as a USB-connected camera.
  • the identification information acquisition unit 78 acquires the identification information included in the first composite image from the photographed barcode image.
  • the recipient may input the identification number described in the first composite image instead of photographing the barcode.
  • the identification information acquisition unit 78 acquires the identification number input by the recipient as identification information.
  • based on the identification information acquired by the identification information acquisition unit 78, the face image specifying unit 80 identifies, from the plurality of face images held in the face image holding unit 100, the face images of the plurality of participating users involved in creating the first composite image.
  • the person specifying unit 82 identifies each of the plurality of participating users corresponding to each of the plurality of persons shown in the first composite image.
  • the designated person receiving unit 84 accepts the designation of one or more participating users as designated persons from among the plurality of participating users involved in the creation of the first composite image. Thereafter, the operation is as described above, with the specified participating users as the designated persons.
  • in this way, the person specifying unit 82 can specify participating users not only from the analysis result of the first composite image but also from the face images of the plurality of participating users involved in its creation, and a second composite image can then be created; a lookup sketch follows.
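The lookup from the identification information to the stored face images can be pictured as below; the store layout, the composite-image ID format and the function names are hypothetical.

```python
# Map the identification information read from the barcode to the face images of
# the participating users stored when the first composite image was created.
FACE_IMAGE_STORE = {
    # composite-image ID -> {participant name: face-image path}
    "photobook-0017": {"A": "faces/a.jpg", "B": "faces/b.jpg", "C": "faces/c.jpg"},
}

def participating_user_faces(identification_info: str) -> dict:
    return FACE_IMAGE_STORE.get(identification_info, {})

def faces_of_designated_persons(identification_info: str, designated: list) -> dict:
    faces = participating_user_faces(identification_info)
    return {name: faces[name] for name in designated if name in faces}

print(faces_of_designated_persons("photobook-0017", ["A", "B"]))
```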
  • the recipient can also set the importance of each designated person.
  • the recipient sets the importance of each designated person via the instruction input unit 70.
  • the importance level information acquisition unit 72 acquires the importance level information of the designated person set by the recipient from the recipient's terminal device 14.
  • based on the importance level information, the composite image creation unit 26 creates the second composite image by preferentially using images in which designated persons of higher importance are shown over images in which designated persons of lower importance are shown.
  • for example, the image extraction unit 90 extracts compositing target images from the plurality of images in which the designated persons are shown, in order starting from the images in which the designated person of highest importance is shown. The other operations are as described above.
  • in this way, the second composite image can be created by preferentially using images in which designated persons of high importance are shown; a small ordering sketch follows.
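The ordering sketched below illustrates this: candidate images are sorted by the importance of the designated persons they contain, so images of more important persons are extracted first. Scoring each image by the maximum importance among its persons is an assumption.

```python
# Order candidate images so that images of the most important designated persons
# are used first when extracting compositing target images.
def order_by_importance(candidates, importance):
    # candidates: [{"file": ..., "persons": ["A", "B"]}], importance: {"A": 3, "B": 1}
    def key(image):
        return max((importance.get(p, 0) for p in image["persons"]), default=0)
    return sorted(candidates, key=key, reverse=True)

candidates = [{"file": "x.jpg", "persons": ["B"]},
              {"file": "y.jpg", "persons": ["A", "B"]},
              {"file": "z.jpg", "persons": ["A"]}]
print([c["file"] for c in order_by_importance(candidates, {"A": 3, "B": 1})])
# ['y.jpg', 'z.jpg', 'x.jpg']
```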
  • the recipient can include a comment for the designated person in the second composite image.
  • the comment acquisition unit 74 displays a screen for inputting a comment for the designated person on the image display unit 68 of the recipient's terminal device 14.
  • the recipient inputs a comment for each designated person via the instruction input unit 70 on the screen for inputting a comment.
  • the comment acquisition unit 74 acquires a comment uploaded from the recipient's terminal device 14, that is, a recipient's comment on the designated person.
  • the composite image creation unit 26 creates a second composite image using the comments acquired by the comment acquisition unit 74 in addition to the plurality of images in which the designated person is shown.
  • the image placement unit 92 places a comment for the designated person around a plurality of images in which the designated person is shown in the image placement area of the main page. Other operations are as described above.
  • the recipient can use a combination of images (still images) and moving images.
  • the moving image acquisition unit 76 displays a screen for designating one or more moving images on the image display unit 68 of the recipient's terminal device 14, for example.
  • the recipient designates each of one or more moving images associated with each of the plurality of images used in the second composite image via the instruction input unit 70 on the screen for designating the moving image.
  • the moving image acquisition unit 76 acquires the moving images uploaded from the recipient's terminal device 14, that is, each of the one or more moving images associated with the plurality of images used in the second composite image.
  • the composite image creation unit 26 creates the second composite image using images from the first image group that are associated with the one or more moving images and in which the designated persons are shown.
  • the designated person, who is the recipient of the second composite image, captures, with the image photographing unit of his or her portable terminal, the first image in the second composite image that is associated with a first moving image among the one or more moving images. The captured first image is then displayed on the image display unit 68 of the terminal device 14 of the designated person, and by using AR (Augmented Reality) technology, the first moving image is reproduced on the image display unit 68 of the portable terminal of the designated person.
  • the first moving image may be reproduced on the entire screen of the image display unit 68 of the designated person's portable terminal, or may be reproduced within the display region of the first image displayed on the image display unit 68.
  • the recipient of the second composite image can reproduce and view the moving image associated with the image used in the second composite image.
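One way to resolve which moving image to play when the printed still is captured is to match the captured photograph against the stills used in the second composite image, for example by perceptual hash. The sketch below uses the Pillow and imagehash packages; both the matching method and the association table are assumptions, since the description only states that each still is associated with a moving image.

```python
# Resolve the moving image associated with a captured still by perceptual-hash match.
from PIL import Image
import imagehash

ASSOCIATIONS = {"stills/beach.jpg": "movies/beach.mp4",
                "stills/toast.jpg": "movies/toast.mp4"}   # hypothetical paths

def video_for_capture(captured_path: str, max_distance: int = 8):
    captured_hash = imagehash.phash(Image.open(captured_path))
    best = None
    for still, movie in ASSOCIATIONS.items():
        distance = captured_hash - imagehash.phash(Image.open(still))  # Hamming distance
        if distance <= max_distance and (best is None or distance < best[0]):
            best = (distance, movie)
    return best[1] if best else None
```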
  • the second composite image may also be created by combining these variations, such as identifying participating users based on face images, setting the importance of designated persons, using comments for designated persons, and using combinations of still images and moving images.
  • a still image extracted from a moving image may be used as an image associated with the moving image.
  • the image analysis unit 58 analyzes the contents of each image included in the plurality of still images constituting the moving image.
  • the person specifying unit 82 specifies a plurality of persons shown in the plurality of still images constituting the moving image based on the analysis result of the plurality of still images constituting the moving image.
  • the image specifying unit 86 specifies an image in which the designated person is captured from a plurality of still images constituting the moving image, and sets it as an image associated with the moving image.
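A sketch of extracting such a still: frames are sampled from the moving image with OpenCV and the first sampled frame containing a face that matches the designated person is kept. The one-frame-per-second sampling, the face_recognition package and the 0.6 tolerance are assumptions.

```python
# Extract a still image showing the designated person from a moving image.
import cv2
import face_recognition

def still_with_person(video_path, reference_encoding, tolerance=0.6):
    cap = cv2.VideoCapture(video_path)
    step = max(int(cap.get(cv2.CAP_PROP_FPS)), 1)   # roughly one frame per second
    frame_index, kept = 0, None
    while kept is None:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_index % step == 0:
            rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
            for enc in face_recognition.face_encodings(rgb):
                if face_recognition.compare_faces([reference_encoding], enc,
                                                  tolerance=tolerance)[0]:
                    kept = frame
                    break
        frame_index += 1
    cap.release()
    return kept  # BGR numpy array, or None if the designated person never appears
```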
  • a plurality of users involved in the creation of the composite image may create the composite image jointly, or at least one user among the plurality of users may create the composite image.
  • in the embodiment described above, the server 12 includes the information setting unit 18, the information management unit 20, the data acquisition unit 22, the data analysis unit 24, and the composite image creation unit 26; however, at least one of these units may be provided in the terminal device 14.
  • each component included in the image processing system 10 may be configured by dedicated hardware, or each component may be configured by a programmed computer.
  • the method of the present invention can be implemented, for example, by a program for causing a computer to execute each step. It is also possible to provide a computer-readable recording medium in which this program is recorded.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to an image processing system, an image processing method, a program, and a recording medium which, on the basis of a composite image created using a plurality of images owned by a plurality of users, make it possible to create another composite image using images of a plurality of persons appearing in the composite image. In the image processing system, image processing method, program, and recording medium of the invention, a person identification unit identifies a plurality of persons appearing in a first composite image owned by a first user. A designated person acceptance unit accepts the designation of at least one person, as a designated person, from among the plurality of persons appearing in the first composite image. An image identification unit identifies at least one image in which the at least one designated person appears from among a first image group owned by the first user. A composite image creation unit creates a second composite image using the at least one image in which the at least one designated person appears.
PCT/JP2017/011146 2016-03-29 2017-03-21 Système de traitement d'image, procédé de traitement d'image, programme, et support d'entregistrement WO2017169963A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/121,739 US10783355B2 (en) 2016-03-29 2018-09-05 Image processing system, image processing method, program, and recording medium

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016-065863 2016-03-29
JP2016065863 2016-03-29
JP2017-048410 2017-03-14
JP2017048410A JP6518280B2 (ja) 2016-03-29 2017-03-14 画像処理システム、画像処理方法、プログラムおよび記録媒体

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/121,739 Continuation US10783355B2 (en) 2016-03-29 2018-09-05 Image processing system, image processing method, program, and recording medium

Publications (1)

Publication Number Publication Date
WO2017169963A1 true WO2017169963A1 (fr) 2017-10-05

Family

ID=59965253

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/011146 WO2017169963A1 (fr) 2016-03-29 2017-03-21 Système de traitement d'image, procédé de traitement d'image, programme, et support d'entregistrement

Country Status (1)

Country Link
WO (1) WO2017169963A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005184790A (ja) * 2003-11-27 2005-07-07 Fuji Photo Film Co Ltd 画像編集装置および方法並びにプログラム
JP2007316939A (ja) * 2006-05-25 2007-12-06 Fujifilm Corp 電子アルバム提供装置、および画像ネットワークシステム
JP2008233957A (ja) * 2007-03-16 2008-10-02 Fujifilm Corp 画像選択装置、画像選択方法、撮像装置及びプログラム


Legal Events

  • NENP: Non-entry into the national phase (Ref country code: DE)
  • 121: Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17774504; Country of ref document: EP; Kind code of ref document: A1)
  • 122: Ep: pct application non-entry in european phase (Ref document number: 17774504; Country of ref document: EP; Kind code of ref document: A1)