JP5447276B2 - Image editing apparatus and method, and program


Info

Publication number: JP5447276B2
Authority: JP (Japan)
Prior art keywords: image, user, editing, course, unit
Legal status: Active
Application number: JP2010178151A
Other languages: Japanese (ja)
Other versions: JP2012037735A
Inventors: 萌子 小西, 亜希 今平, 正人 塚田, 幸代 金澤, 真吾 浜口, 孝 中嶋
Original Assignee: フリュー株式会社
Application filed by フリュー株式会社
Priority to JP2010178151A
Publication of JP2012037735A
Application granted
Publication of JP5447276B2
Application status: Active
Anticipated expiration

Description

  The present invention relates to an image editing apparatus and method, and a program, and more particularly to an image editing apparatus and method, and a program, capable of improving both the photo sticker creation work and the user's satisfaction with the created photo sticker.

  Conventionally, there are photo sticker machines that photograph a user (subject), perform a graffiti editing process in which prepared images are combined with the obtained photographed image or characters and line drawings are superimposed on it, print the result on sticker paper, and provide it to the user as a photo sticker.

  Such a photo sticker machine is installed, for example, in an entertainment (game) facility such as a game arcade and provides its service, mainly to female high school and junior high school students, for a fee. Users not only enjoy creating photo stickers as a game, but also cut out the photo stickers, on which the edited images are printed in various sizes, and paste them on their belongings.

  The photo sticker creation game proceeds by photographing the subject, performing graffiti editing on the photographed image, and outputting the edited image as a photo sticker. The user plays this series of steps within a time limit and receives the created photo sticker as the product. Therefore, in order to improve user satisfaction, it is important to improve not only the quality of the photo sticker but also the enjoyment of the photo sticker creation work itself.

  In a photo sticker machine that provides such a photo sticker creation game, an image of the user as the subject is displayed in real time as a moving image (live view image) at the time of shooting, so that shooting can be performed while checking the live view image.

  Furthermore, there is a photo sticker machine in which, when the user selects a portion of the live view image to be corrected, a live view image with the selected portion corrected is displayed (see Patent Document 1).

  As a technique for performing correction suited to the subject, there has been proposed a technique that detects the subject's face image, determines the subject's sex and age from feature amounts of the face image, and performs correction processing on the subject's face image based on the determination result (see Patent Document 2).

Patent Document 1: JP 2005-277772 A
Patent Document 2: Japanese Patent No. 4277534

  However, with the photo sticker machine of Patent Document 1, the user needs to repeatedly confirm the live view image and select portions to be corrected until a satisfactory result is obtained, which is troublesome for the user.

  Moreover, with the method of Patent Document 2, correction processing unintended by the user may be performed depending on the determination result. For example, when the sex of a subject who is actually a woman is erroneously determined to be male, correction processing for men may be performed on the face image, or no correction processing may be performed at all. Conversely, when the sex of a subject who is actually a man is erroneously determined to be female, correction processing for women may be performed on the face image. In such cases, the user may not be satisfied with the result of the correction processing.

  The present invention has been made in view of such circumstances, and makes it possible to improve both the photo sticker creation work and the user's satisfaction with the created photo sticker.

An image editing apparatus according to one aspect of the present invention is an image editing apparatus that photographs a user as a subject and outputs the obtained image, and includes: course selection means for selecting a course that determines the type of image processing to be applied to the image photographed with the user as the subject; photographing means for photographing the user as the subject to obtain a photographed image; image processing control means for controlling so that image processing of the type corresponding to the course selected by the course selection means is applied to the photographed image; editing means for editing, based on graffiti editing input by the user, the photographed image to which the image processing has been applied; and screen display control means for controlling the display of an editing screen on which the user performs the graffiti editing input on the photographed image edited by the editing means. Among the courses, the course selection means can select a first course for applying image processing according to the sex of the user to the image photographed with the user as the subject; when the first course is selected by the course selection means, the image processing control means controls so that image processing according to the sex of the user is applied to the photographed image; and when the first course is selected by the course selection means, the screen display control means controls the display of the editing screen according to the sex of the user.

In this image editing apparatus, the course selection means is configured by, for example, a course selection control unit; the photographing means by, for example, a photographed image acquisition unit; the image processing control means by, for example, an image processing control unit; the editing means by, for example, an editing processing unit; and the screen display control means by, for example, an editing screen display control unit.

In the image editing apparatus of the present invention, a course that determines the type of image processing to be applied to the image photographed with the user as the subject is selected; the user is photographed as the subject to obtain a photographed image; control is performed so that image processing of the type corresponding to the selected course is applied to the photographed image; the photographed image subjected to the image processing is edited based on graffiti editing input by the user; and the display of the editing screen on which the user performs the graffiti editing input on the edited photographed image is controlled.

  As a result, the image processing intended by the user can be applied to the photographed image with a simple operation, so that both the photo sticker creation work and the user's satisfaction with the created photo sticker can be improved.
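  To make the claimed control flow concrete, the following is a minimal sketch, not the patented implementation, of how a selected course could dispatch the image processing applied to a photographed image. The course names, the smooth stand-in filter, and the strength values are all hypothetical; the patent only specifies that the selected course determines the type of processing and that the first course keys it to the users' sex.

```python
from enum import Enum, auto

class Course(Enum):
    GENDER = auto()  # hypothetical "first course": processing follows each user's sex
    MANUAL = auto()  # hypothetical second course: processing follows the user's instruction

def smooth(image, strength):
    # Stand-in for a retouching filter; a real machine would run, e.g.,
    # a bilateral filter over detected face regions.
    return {"pixels": image["pixels"], "smoothing": strength}

def process_captured_image(image, course, sexes=(), instruction=0.0):
    # Mirror of the image processing control means: the selected course
    # decides which processing is applied to the photographed image.
    if course is Course.GENDER:
        # Illustrative strengths only; the patent says the processing
        # differs by sex, not what the processing is.
        strength = 0.8 if "female" in sexes else 0.2
        return smooth(image, strength)
    return smooth(image, instruction)  # MANUAL: apply what the user asked for

photo = {"pixels": [[0] * 4 for _ in range(4)]}  # toy 4x4 image
print(process_captured_image(photo, Course.GENDER, sexes=("female", "male")))
```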

  The course selection means can select a second course, different from the first course, for performing image processing according to an instruction from the user on the photographed image, and the image processing control means can control so that image processing according to the user's instruction is applied to the photographed image when the second course is selected by the course selection means.

When the first course is selected, the course selection means can further select a sex for each user, and the image processing control means can control so that each user in the photographed image is subjected to image processing according to the sex selected for that user by the course selection means.

  Thereby, the image processing intended by the user can be applied to the photographed image with a simple operation, and the user's satisfaction with the photographing work in the photo sticker creation work can be improved.

When the first course is selected by the course selection means, the screen display control means can control the display of the editing screen according to the relative position of each user in the photographed image and the sex of each user.

  Thereby, for example, a male user can be provided with an editing screen that allows easy editing work, and the user's satisfaction with the editing work in the photo sticker creation work can be improved.
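  As an illustration of this position-and-sex-dependent screen control, here is a minimal sketch under assumed details: it sorts users by their horizontal position in the photographed image and assigns each side of the editing screen a tool palette for that user's sex. The palette contents, screen width, and data layout are hypothetical, not from the patent.

```python
def layout_editing_screen(users, screen_width=1280):
    # `users`: one dict per user with the face's horizontal position in the
    # photographed image ("x") and the sex selected for that user.
    # Hypothetical tool sets: e.g. a simpler palette for male users, who may
    # be less familiar with the game, as the description suggests.
    palettes = {"female": ["pen", "stamp", "lashes", "blush"],
                "male": ["pen", "stamp"]}
    width = screen_width // len(users)
    layout = []
    # The user standing on the left edits on the left side of the screen.
    for i, user in enumerate(sorted(users, key=lambda u: u["x"])):
        layout.append({"region_x": i * width, "region_width": width,
                       "tools": palettes[user["sex"]]})
    return layout

print(layout_editing_screen([{"x": 900, "sex": "male"},
                             {"x": 200, "sex": "female"}]))
```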

  The image editing apparatus may further include composition image selection means for selecting an image for composition to be combined with the photographed image in accordance with the selection by the course selection means.

  The composition image selection means of this image editing apparatus is constituted by, for example, a composition image selection control unit.

  Thereby, it is possible to simplify the operation at the time of shooting for a male user who is not so familiar with the photo sticker creation game.

  The image editing apparatus may further include moving image display control means for controlling so that the image captured by the photographing means is displayed on a display unit in real time as a moving image. The image processing control means can control so that image processing according to the selection by the course selection means is applied to the moving image, and the moving image display control means can control so that the moving image subjected to the image processing according to the selection by the course selection means is displayed on the display unit.

  The moving image display control means of this image editing apparatus is constituted by, for example, a live view display control unit.
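  The live view behavior described above amounts to applying the course's processing to every frame before display. The sketch below shows one way this could look, assuming OpenCV is available; the camera index, window handling, and the bilateral-filter stand-in retouch are illustrative, not the patent's method.

```python
import cv2  # OpenCV, assumed available; not specified by the patent

def run_live_view(course_filter, camera_index=0):
    # Apply the course-dependent processing to each captured frame and
    # display it immediately, as the moving image display control means does.
    cap = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            frame = course_filter(frame)      # course-specific image processing
            cv2.imshow("live view", frame)    # real-time (immediate) display
            if cv2.waitKey(1) & 0xFF == 27:   # Esc ends the preview
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

# Example course filter: mild bilateral smoothing as a stand-in retouch.
run_live_view(lambda f: cv2.bilateralFilter(f, 9, 50, 50))
```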

  The image editing apparatus may further include printing means for printing the editing result on sticker paper, and the printing means can print the editing result on sticker paper of a design or size according to the selection by the course selection means.

  The printing means of this image editing apparatus is constituted by, for example, a print processing unit.

  This makes it possible to create a photo sticker on which a photographed image that has been subjected to image processing intended by the user is printed, so that the user's satisfaction with the created photo sticker can be improved.

  The course selection means can select the gender for each user based on the name input by the user.

  The course selection means can also select the course according to a face image recognized from the state of the user's face during photographing.
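  The two selection aids above, sex from an entered name and course from a recognized face, could look like the following minimal sketch. The name table, the injected classify_sex function, and the course labels are all hypothetical; the face-based sex determination itself would be a feature-amount classifier of the kind cited from Patent Document 2.

```python
# Hypothetical given-name table; a production machine would need a far
# larger dictionary (or a statistical model) of Japanese given names.
NAME_TO_SEX = {"moeko": "female", "aki": "female",
               "masato": "male", "takashi": "male"}

def sex_from_name(name):
    # Guess the user's sex from the entered name; "unknown" means the
    # machine should fall back to asking the user directly.
    return NAME_TO_SEX.get(name.strip().lower(), "unknown")

def course_from_faces(face_images, classify_sex):
    # Pick the sex-based first course automatically when every detected
    # face can be classified; otherwise fall back to the manual course.
    sexes = [classify_sex(face) for face in face_images]
    if "unknown" in sexes:
        return "manual_course", None
    return "gender_course", sexes

print(sex_from_name("Masato"))  # -> "male"
```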

An image editing method according to one aspect of the present invention is an image editing method of an image editing apparatus that photographs a user as a subject and outputs the obtained image, and includes: a course selection step of selecting a course that determines the type of image processing to be applied to the image photographed with the user as the subject; a photographing step of photographing the user as the subject to obtain a photographed image; an image processing control step of controlling so that image processing of the type corresponding to the course selected in the course selection step is applied to the photographed image; an editing step of editing, based on graffiti editing input by the user, the photographed image to which the image processing has been applied; and a screen display control step of controlling the display of an editing screen on which the user performs the graffiti editing input on the photographed image edited in the editing step. In the course selection step, a first course for applying image processing according to the sex of the user to the image photographed with the user as the subject can be selected among the courses; in the image processing control step, when the first course is selected in the course selection step, control is performed so that image processing according to the sex of the user is applied to the photographed image; and in the screen display control step, when the first course is selected in the course selection step, the display of the editing screen is controlled according to the sex of the user.

In the image editing method of the present invention, a course that determines the type of image processing to be applied to the image photographed with the user as the subject is selected; the user is photographed as the subject to obtain a photographed image; control is performed so that image processing of the type corresponding to the selected course is applied to the photographed image; the photographed image subjected to the image processing is edited based on graffiti editing input by the user; and the display of the editing screen on which the user performs the graffiti editing input on the edited photographed image is controlled.

A program according to one aspect of the present invention is a program that causes a computer to execute processing of photographing a user as a subject and outputting the obtained image, the processing including: a course selection step of selecting a course that determines the type of image processing to be applied to the image photographed with the user as the subject; a photographing control step of controlling the photographing of the user as the subject to obtain a photographed image; an image processing control step of controlling so that image processing of the type corresponding to the course selected in the course selection step is applied to the photographed image; an editing control step of controlling the editing, based on graffiti editing input by the user, of the photographed image to which the image processing has been applied; and a screen display control step of controlling the display of an editing screen on which the user performs the graffiti editing input on the photographed image edited in the editing control step. In the course selection step, a first course for applying image processing according to the sex of the user to the image photographed with the user as the subject is selected among the courses; in the image processing control step, when the first course is selected in the course selection step, control is performed so that image processing according to the sex of the user is applied to the photographed image; and in the screen display control step, when the first course is selected in the course selection step, the display of the editing screen is controlled according to the sex of the user.

  This program is recorded on a predetermined recording medium, read by, for example, a drive, and installed in the photo sticker creating apparatus. Such recording media include removable media such as magnetic disks (including floppy (registered trademark) disks), optical disks (such as CD-ROMs (Compact Disc Read Only Memory) and DVD-ROMs (Digital Versatile Disc)), magneto-optical disks (such as MDs (Mini Disc) (registered trademark)), and semiconductor memories.

  According to the present invention, it becomes possible to improve both the photo sticker creation work and the user's satisfaction with the created photo sticker.

FIG. 1 is a perspective view showing an example of an embodiment of a photo sticker creating apparatus to which the present invention is applied.
FIG. 2 is a perspective view of the photo sticker creating apparatus of FIG. 1 from another angle.
FIG. 3 is a perspective view showing an installation example of the photo sticker creating apparatus of FIG. 1.
FIG. 4 is a diagram explaining the users' movement between spaces.
FIG. 5 is a diagram showing a configuration example of the front of the front unit.
FIG. 6 is a diagram showing a configuration example of a side surface of the editing unit.
FIG. 7 is a diagram showing a configuration example of the other side of the editing unit.
FIG. 8 is a block diagram showing an example of the internal configuration of the photo sticker creating apparatus.
FIG. 9 is a functional block diagram showing a detailed configuration example of the control unit.
FIG. 10 is a functional block diagram showing detailed configuration examples of the photographing processing unit and the editing processing unit.
FIG. 11 is a flowchart explaining an example of the flow of the photo sticker creation game processing.
FIG. 12 is a flowchart explaining an example of the flow of the photographing processing.
FIG. 13 is a flowchart explaining an example of the flow of the couple course photographing processing.
FIG. 14 is a flowchart explaining an example of the flow of the couple course photographing processing.
FIG. 15 is a diagram showing a display example of the touch panel monitor.
FIG. 16 is a diagram showing another display example of the touch panel monitor.
FIG. 17 is a flowchart explaining another example of the flow of the couple course photographing processing.
FIG. 18 is a flowchart explaining another example of the flow of the couple course photographing processing.
FIG. 19 is a diagram showing a display example of the touch panel monitor.
FIG. 20 is a diagram showing another display example of the touch panel monitor.
FIG. 21 is a flowchart explaining an example of the flow of the friend course photographing processing.
FIG. 22 is a flowchart explaining an example of the flow of the friend course photographing processing.
FIG. 23 is a diagram showing a display example of the touch panel monitor.
FIG. 24 is a diagram showing another display example of the touch panel monitor.
FIG. 25 is a flowchart explaining an example of the flow of the editing processing.
FIG. 26 is a flowchart explaining an example of the flow of the editing screen display processing.
FIG. 27 is a diagram explaining photographing information.
FIG. 28 is a diagram showing an example of the editing screen.
FIG. 29 is a functional block diagram showing a detailed configuration example of the live view display control unit.
FIG. 30 is a flowchart explaining an example of the flow of the live view display processing.
FIG. 31 is a diagram showing a display example of the touch panel monitor.
FIG. 32 is a diagram explaining composition of a completed frame image.
FIG. 33 is a diagram showing another display example of the touch panel monitor.

  FIG. 1 is a diagram showing an example of an embodiment of a photo sticker creating apparatus to which the present invention is applied.

  The photo sticker creating apparatus 1 in FIG. 1 is a game device (play service providing device) that enhances the amusement of work such as shooting and editing so that the user can perform that work as a game (play the service), provides the shot and edited images to the user as a photo sticker or as data, and receives payment from the user in exchange.

  In the photo sticker creation game executed by the photo sticker creating apparatus 1, the user pays the price of the game, photographs himself or herself using the camera provided in the photo sticker creating apparatus 1, combines a background image with the photographed image, and performs editing (graffiti editing) such as adding images for composition using editing functions such as pen input and stamp input, turning the photographed image into a richly designed image. After the game ends, the user receives a photo sticker or the like on which the edited image is printed, as the product.

  Such a photo sticker creating apparatus 1 is normally installed in a game arcade, a store in a sightseeing spot, or the like. That is, the photo sticker creating apparatus 1 is sold to the managers of those stores by, for example, manufacturers and sales companies. The manager who has purchased the photo sticker creating apparatus 1 installs it in a store (for example, a game arcade) and operates it, letting general users, typified by female high school and junior high school students, play the photo sticker creation game for a fee and earning a profit from those fees.

  Therefore, a high profit rate of the photo sticker creating apparatus 1 is important for the store manager; that is, it is desirable that the photo sticker creating apparatus 1 have a high rate of return per unit time. Moreover, in a normal store, the place where the photo sticker creating apparatus 1 can be installed is limited by restrictions such as floor area and structure, so it is also desirable that it have a high profit rate per unit floor area. Furthermore, since stores usually differ in size and shape, and the other game machines installed together with the photo sticker creating apparatus 1 also differ, a higher degree of freedom of installation is desirable.

  As a method of improving the profit rate, for example, there is a method of improving the rotation rate of the photo sticker creation game. That is, it is a method of increasing the rate of return per unit time by increasing the number of games played within a predetermined time. However, if there is no user who plays the photo sticker creation game, the profit rate is not improved. Therefore, it is also important to improve the user's expectation and satisfaction with respect to the photo sticker creating game of the photo sticker creating apparatus 1.

  For example, the photo sticker creating apparatus 1 can increase the number of users by providing functions so appealing that users who have never played want to play. In addition, by providing functions so satisfying that users who have played once think "it was fun and I would like to play again", the photo sticker creating apparatus 1 can increase the number of users by increasing so-called repeat users. As a result, the profit of the photo sticker creating apparatus 1 improves.

  For the user, the entertainment offered by the photo sticker creating apparatus 1 is roughly divided into two kinds: playing the photo sticker creation game (the photo sticker creation work) and obtaining the resulting photo sticker image and photo sticker (the finished photo sticker itself).

  Methods for improving the amusement of the former, the photo sticker creation work, include, for example, adding more shooting functions and editing functions such as graffiti editing, improving operability, and adjusting play time and waiting time. Methods for improving the amusement of the latter, the completed photo sticker itself, include, for example, hardware improvements such as higher-quality sticker paper, higher-image-quality printing, and special color printing, which improve the user's satisfaction with the finished product.

  In addition, the female high school and junior high school students who are the main users not only paste the obtained photo stickers on their belongings, but also exchange photo stickers with each other and talk about them. Therefore, for example, when the image content of a photo sticker is flashy, interesting, rare, or novel, not only may the user like the photo sticker, but conversation about it may also become lively.

  In this way, improving the amusement of photo stickers not only improves the satisfaction of their users, but also raises the expectations of others; an increase in new users can be expected from customers attracted by the spread of reputation, that is, so-called word of mouth. The photo sticker itself can thus also serve as an advertisement for the photo sticker creating apparatus 1.

  In other words, in the photo sticker creation game, by providing a function that can create images that users will like, not only the user's satisfaction with the photo sticker creation game but also the satisfaction with the photo sticker is improved. Furthermore, the topicality of the photo sticker and the advertising effect can be improved at the same time.

  The photo sticker creating apparatus 1 has such features. As shown in FIG. 1, its casing 10 is roughly divided into four units: a photographing unit 12, an editing unit 13, a ceiling strobe unit 14, and a background curtain unit 15.

  The photographing unit 12 has a function of photographing a user or the like as a subject, and the subject is photographed in an internal photographing space described later. The photographing unit 12 is roughly divided into two units, a front unit 12A and a rear unit 12B.

  The front unit 12A is the unit on the front side of the shooting space; it performs processing related to the shooting of the user in the shooting space and accepts operations related to the user's photo sticker creation game. The rear unit 12B is the unit on the rear side of the shooting space and serves as the background of the user when the front unit 12A takes a picture. The interior of the rear unit 12B is a cavity (space) and forms part of the shooting space used when the user performs shooting. A stage or the like for the user to pose on may be provided in the rear unit 12B. An external monitor 16, a speaker 17, and a touch button 18 are provided on the side surface of the front unit 12A; details will be described later.

  The editing unit 13 is provided with a configuration (for example, a monitor, a touch panel, and touch pens) for the user to perform graffiti editing and the like on the photographed images. As will be described later, the editing unit 13, which consists of a single housing, is provided with such a configuration on each of two surfaces so that two sets of users can perform editing work simultaneously. As will also be described later, the photo sticker creating apparatus 1 can execute a plurality of photo sticker creation games in parallel; in other words, a plurality of sets of users can play the photo sticker creation game on the photo sticker creating apparatus 1 at the same time. The editing unit 13 is provided with two sets of the configuration for graffiti editing and the like so that a plurality of sets of users can perform graffiti editing simultaneously, in order to improve the rotation rate of the photo sticker creation game.

  The ceiling strobe unit 14 provided above the photographing space incorporates a strobe that emits flash light toward the photographing space in accordance with the photographing timing. The ceiling strobe unit 14 is also provided with a fluorescent lamp and functions as illumination that constantly lights the shooting space (the user in the shooting space). As will be described later, the photographing space is configured simply as a closed space; light does not easily enter from the outside, and since the space tends to be dark, the user's operability could deteriorate. The ceiling strobe unit 14 therefore illuminates the shooting space to avoid such concerns and improve operability. Such illumination is also provided in the front unit 12A. A configuration example of the front unit 12A will be described later with reference to FIG. 5.

  The background curtain unit 15, which is provided with movable curtains (background curtains) used as backgrounds, is provided, for example, at the upper part of the rear unit 12B. In the background curtain unit 15, a plurality of roll-up curtains of different colors and patterns are suspended. A plurality of colors and patterns popular with the young people who are the main users are provided, such as bright single colors, stylish and cute patterns like those in foreign picture books, and colorful pop patterns.

  The background curtain unit 15 operates in conjunction with the photographing functions such as the camera and strobe provided in the front unit 12A; at the time of photographing, for example, the curtain of the color selected by the user is lowered and the other curtains are wound up. By operating in this manner, the background curtain unit 15 can easily provide a background of the design desired by the user, a design according to the shooting mode, and the like. Thereby, the photo sticker creating apparatus 1 can improve the user's satisfaction with the image (photo sticker).

  The number of curtains (background curtains) provided in the background curtain unit 15 is arbitrary. For example, only one background curtain may be suspended from the background curtain unit 15. In this case, the photo sticker creating apparatus 1 may prepare a plurality of types of background images for composition, and may synthesize the background image selected by the user from the photographed images. By doing in this way, even if there is only one physical background curtain provided in the photo sticker creating apparatus 1, the photo sticker creating apparatus 1 can cope with various user preferences.
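  One way to realize this single-curtain variant is to photograph the user against the one physical curtain and chroma-key the user-selected background image into the photographed image. The sketch below assumes a uniformly green curtain and uses OpenCV; the HSV thresholds are illustrative values, and the patent does not specify the compositing technique.

```python
import cv2  # OpenCV, assumed available

def composite_background(photo_bgr, background_bgr):
    # Replace a uniform green curtain with the background image the user
    # selected. The HSV bounds are illustrative, not from the patent.
    hsv = cv2.cvtColor(photo_bgr, cv2.COLOR_BGR2HSV)
    curtain_mask = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))  # green range
    background = cv2.resize(background_bgr,
                            (photo_bgr.shape[1], photo_bgr.shape[0]))
    out = photo_bgr.copy()
    out[curtain_mask > 0] = background[curtain_mask > 0]  # keep the user, swap the curtain
    return out
```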

  While a user is shooting in the shooting space of the shooting unit 12, the external monitor 16 outputs (displays), for users waiting to use the shooting space next and for people near the photo sticker creating apparatus 1, a video showing the progress of the game of the user who is shooting, content (a demonstration) that guides (explains) the flow of the photo sticker creation game and how to play it, advertisements for the store or area where the photo sticker creating apparatus 1 is installed, and the like. If an error has occurred in the photo sticker creating apparatus 1, the content of the error is also output.

  The speaker 17 outputs sound that matches the video, content, advertisement, etc. output from the external monitor 16.

  The touch button 18 has no mechanically operating mechanism; when operated (touched) by the user, it senses the pressure or static electricity caused by the touch and thereby accepts the user's operation. When the touch button 18 receives a user operation, the video, content, advertisements, and the like output from the external monitor 16 are switched as appropriate. When the external monitor 16 is configured with a touch panel superimposed on it, the touch button 18 may be displayed on a part of the display area of the external monitor 16. Alternatively, it may be configured as a push button having a mechanically operated mechanism.

  Further, the external monitor 16, the speaker 17, and the touch button 18 can be provided on either one of the side surfaces of the front unit 12A. That is, when they are not provided on the side surface shown in FIG. 1, they can be installed on the opposite side surface of the front unit 12A (not shown in FIG. 1).

  FIG. 2 is a perspective view of the photo sticker creating apparatus 1 of FIG. 1 as viewed from the editing unit 13 side.

  As shown in FIG. 2, the editing unit 13 is installed adjacent to the surface of the front unit 12A opposite to the surface on the rear unit 12B side (that is, the surface on the shooting space side). A first editing interface 13A and a second editing interface 13B are provided on the two side surfaces of the editing unit 13 that are orthogonal to the front unit 12A. That is, the editing unit 13 is provided with two editing interfaces so that a plurality of sets of users playing the photo sticker creation game can perform editing work facing each other across the editing unit 13 in the direction orthogonal to the direction connecting the front unit 12A and the rear unit 12B.

  Also, on the surface of the editing unit 13 on the side opposite from the photographing unit 12, a photo sticker discharge unit 13C is provided, which discharges the photo stickers created by playing the photo sticker creation game, providing them to the users.

  FIG. 3 is an external view showing an installation example of the photo sticker creating apparatus 1.

  When the photo sticker creating apparatus 1 is installed in a game arcade or the like, a side curtain 21 is provided so as to surround the photographing unit 12 and the editing unit 13, for example as shown in FIG. 3. The side curtain 21 simply encloses the shooting space in which shooting work is performed, the editing spaces in which editing work is performed, and so on, improving the closedness of each space. For example, by stabilizing the environment in each space, the side curtain 21 contributes to high-quality shooting and to the visibility of the editing screen, and by blocking the view from outside each space, it lets users perform work such as shooting and editing without being conscious of the gaze of people outside.

  The outer surface of the side curtain 21 is usually used as advertising space, and a flashy design representing the model name and image of the photo sticker creating apparatus 1 is printed on it so as to attract attention at the installation location.

  The photo sticker creating apparatus 1 as described above causes each set of users to play a photo sticker creating game. The photo sticker creating apparatus 1 can simultaneously execute a plurality of photo sticker creating games. That is, the photo sticker creating apparatus 1 can cause a plurality of sets to play a photo sticker creating game at the same time. At this time, each group of users can perform each work of the photo sticker creation game while moving through a plurality of spaces provided in the photo sticker creation device 1 to advance the game.

  Next, the flow of the photo sticker creation game and the accompanying movement between spaces will be described with reference to FIG. 4.

  FIG. 4 is a plan view of the photo sticker creating apparatus 1 seen from above during a user's game. As shown in FIG. 4, the user enters from the outside (user A or user A') into the photographing space 32, indicated by the dotted line in the figure between the front unit 12A and the rear unit 12B, where photographing work is performed, and starts the photo sticker creation game by inserting the price (user B). If the previous group of users is using the shooting space 32 (if shooting is in progress), the user waits outside the unit (user A or user A'). While waiting, the user (user A) can watch the video showing the progress of the game of the user who is shooting, and the content guiding the flow of the photo sticker creation game and how to play it, which are output from the external monitor 16 and the speaker 17.

  When the price has been inserted and the photo sticker creation game starts, the user (user B) performs the shooting work, photographing himself or herself using the camera and monitor provided on the front of the front unit 12A.

  When the shooting work is completed, the user selects which of the obtained photographed images to save as images to be edited, and then, following the displayed movement guidance, moves from the photographing space 32 to an editing space, shown by a dotted line in the figure, where editing work is performed.

  A first editing space 33A is provided on the left side of the editing unit 13 as seen from the front unit 12A (in front of the first editing interface 13A), and a second editing space 33B is provided on the right side (in front of the second editing interface 13B). That is, in this photo sticker creating apparatus 1, a total of two sets of users, a user (user C) in the first editing space 33A and a user (user C') in the second editing space 33B, can perform editing work simultaneously. A user (user B) in the shooting space 32 is guided to whichever of the two editing spaces is free. Having moved to the first editing space 33A or the second editing space 33B, the user performs the graffiti editing work. Hereinafter, when there is no need to distinguish the first editing space 33A and the second editing space 33B, they are referred to as the editing spaces 33.

  Thereafter, when the graffiti editing is completed, the user (user C or user C') performs print setting work, setting the number of divisions and the like. When printing starts, the user performs image transmission, questionnaire input, mini-games, and the like during the printing. Then, when printing is completed, the user moves to the printed matter receiving area 34 in front of the photo sticker discharge unit 13C (user D), receives the discharged photo sticker paper, and ends the photo sticker creation game.

  In the photo sticker creation game described above, when user B in the shooting space 32 moves to an editing space 33, the shooting space 32 becomes free, so the photo sticker creating apparatus 1 can let a new set of users use the photographing space 32. Likewise, when user C in the first editing space 33A or user C' in the second editing space 33B moves to the printed matter receiving area 34, the first editing space 33A or the second editing space 33B becomes free, so the photo sticker creating apparatus 1 can let a new set of users (user B) use it. Note that when both the first editing space 33A and the second editing space 33B are free, the photo sticker creating apparatus 1 guides the user (user B) to use one of them, and if the user heads toward a space other than the guided editing space, the user can be redirected to the guided editing space.

  In other words, in order to improve the rotation rate per unit time and per unit floor area of the photo sticker creation game, the photo sticker creating apparatus 1 uses the shooting space 32, the first editing space 33A, the second editing space 33B, and the printed matter receiving area 34 so that a plurality of sets of users can play the photo sticker creation game at the same time (so that multiple sets of customers can be served).

  The structure of the casing 10 is arbitrary and may differ from that described above. Similarly, the method of serving multiple sets of customers is also arbitrary.

  Next, the configuration of each unit will be described.

  FIG. 5 is a diagram illustrating a configuration example of the front unit 12A. A front surface 12A-1 in FIG. 5 is a side surface facing the imaging space 32, and is a front surface for the user who performs the imaging operation. That is, the user basically performs a photographing operation in the photographing space 32 toward the front surface 12A-1 (facing the front unit 12A side).

  Near the center of the front surface 12A-1, a camera 51 for photographing subjects in the photographing space 32 is provided, and a touch panel monitor 52 is provided below the camera 51. The touch panel monitor 52 functions, for example, as a live view monitor that displays in real time (immediately) a composite image (moving image) obtained by combining images such as backgrounds and foregrounds with the captured image photoelectrically converted and taken in by the camera 51, and also functions to display various GUI (Graphical User Interface) images and receive the user's instructions through the touch panel superimposed on its screen. In addition, on the front surface 12A-1, lighting devices are installed behind the milky white acrylic plates 53-1 to 53-6, which are formed of curved or flat surfaces, to illuminate the inside of the photographing space 32 and to emit flash light in accordance with the photographing timing of the camera 51; luggage storage areas 54-1 and 54-2 for placing the user's baggage and the like during shooting are provided; and a coin insertion/return port 55 for paying the fee for the photo sticker creation game is provided.

  Further, on the front surface 12A-1, behind the milky white acrylic plates 53-1 to 53-3 near the ceiling of the shooting space 32, a speaker 124 (FIG. 8; not shown in FIG. 5) is provided, which outputs sound such as guidance voice, BGM (Back Ground Music), and sound effects for the shooting work of the photo sticker creation game. The number and installation positions of the speakers 124 are arbitrary.

  The camera 51 is configured by an imaging device using a CCD (Charge Coupled Device) or the like, and captures a moving image of the user in the photographing space 32 as the captured image during the shooting work of the photo sticker creation game. The captured image taken by the camera 51 is displayed immediately on the touch panel monitor 52, with a part of it cut out or with images such as backgrounds combined. In addition, at a predetermined timing, the frame of the captured image at that timing is extracted and stored as a photographed image (still image).

  For example, a camera using a CMOS sensor may be used as the camera 51 instead of the CCD. Of course, any camera other than these may be used as long as the user can be photographed and digital image data can be obtained. Further, the number of cameras 51 and the installation position are arbitrary.
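  In other words, the still photographed image is just the frame of the moving capture that coincides with the shutter timing. The following is a minimal sketch of that extraction, assuming OpenCV; the countdown length, window handling, and output filename are illustrative details, not from the patent.

```python
import cv2  # OpenCV, assumed available

def capture_still(cap, countdown_frames=90):
    # Keep showing the moving image, then keep the frame at the
    # (simulated) shutter timing as the photographed still image.
    # `cap` is an opened cv2.VideoCapture.
    still = None
    for _ in range(countdown_frames):
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("live view", frame)
        cv2.waitKey(1)
        still = frame                              # frame at the shutter timing
    if still is not None:
        cv2.imwrite("captured_still.png", still)   # persist, like the storage unit
    return still

capture_still(cv2.VideoCapture(0))
```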

  The touch panel monitor 52 is configured by, for example, a live view monitor 123 (FIG. 8) such as a CRT (Cathode Ray Tube) or an LCD (Liquid Crystal Display) with a colorless, transparent touch panel laminated on it. During the shooting work, the captured image taken by the camera 51 is displayed on it immediately (in real time) so that the user can decide on a standing position and pose while checking what the shot will look like. At this time, for example, the captured image displayed on the touch panel monitor 52 is combined with images designated by the user, such as a background or foreground, or part of the captured image is cut out (trimmed) according to the image size. The touch panel monitor 52 also displays images related to the shooting work, such as guide images explaining the shooting procedure, and its touch panel (for example, a pressure-sensitive or electromagnetic-induction type) accepts tap operations by the user's finger or the like.

  Behind the milky white acrylic plates 53-1 to 53-6, a plurality of lighting devices capable of fluorescent lighting and strobe light emission are installed. They brighten the shooting space 32 to ensure operability and safety, constantly illuminate the shooting space 32, and, to improve the image quality of the shot images, emit flashes at timings matching the shooting timing, irradiating the user being photographed with light from various angles. Hereinafter, the milky white acrylic plates 53-1 to 53-6 are referred to as the milky white acrylic plates 53 when there is no need to distinguish them. Further, LEDs (Light Emitting Diodes) (not shown) may be provided on the surface of given milky white acrylic plates 53 in order to adjust the brightness of the photographing space 32 at flash emission.

  The luggage storage areas 54-1 and 54-2 are shelves used for storing the user's baggage during shooting. They are positioned so as not to interfere with the shooting, that is, so as not to get in the way of the user's shooting poses and not to block the illumination from the lighting devices described above. By placing baggage in the luggage storage areas 54-1 and 54-2, the user can perform the shooting work more comfortably.

  The coin insertion/return port 55 is a user interface through which the user inserts coins as the price of the photo sticker creation game, and through which coins that failed to be inserted and change are returned.

  Of course, a configuration other than that described above may be provided on the front surface 12A-1 of the front unit 12A.

  Next, the configuration of the editing unit 13 will be described.

  FIG. 6 is a diagram illustrating a configuration example of the left side of the editing unit 13 as viewed from the photographing unit 12 side (configuration example on the first editing space 33A side).

  In FIG. 6, an illumination device 61 is provided above the first editing interface 13A to illuminate the first editing space 33A, which becomes dark because it is partitioned off by the side curtain 21 and the like. With the illumination device 61, the photo sticker creating apparatus 1 can suppress reduction of the user's editing work efficiency and improve the safety of the first editing space 33A. In the example of FIG. 6, the upper portion of the editing unit 13 where the illumination device 61 is provided extends through to the second editing space 33B side, so the illumination device 61 is shared by the first editing space 33A and the second editing space 33B. That is, the illumination device 61 illuminates not only the first editing space 33A but also the second editing space 33B, improving safety. The installation position and number of the illumination devices 61 are arbitrary.

  The first editing interface 13A is provided with a tablet built-in monitor 62, two touch pens 63-1 and 63-2, a speaker 64, and an additional coin insertion/return port 65.

  The tablet built-in monitor 62 is configured by superimposing a colorless, transparent tablet, on which position information can be input with the touch pen 63-1 or 63-2, on the display screen of a display device capable of displaying images, such as a CRT display or an LCD. Accordingly, the tablet built-in monitor 62 not only displays GUI images and the like on the display device but also accepts input operations from the user via the tablet. On the tablet built-in monitor 62, for example, an editing screen is displayed that includes GUI images for the editing work of editing the photographed images obtained by the shooting work in the shooting space.

  The touch pens 63-1 and 63-2 are detachably fixed to the editing unit 13 by predetermined attachments, one on each of the left and right sides of the tablet built-in monitor 62. By using these touch pens, two users can perform the editing work of the photo sticker creation game at the same time. In the following description, the touch pens 63-1 and 63-2 are simply referred to as the touch pens 63 when there is no need to distinguish them.

  The touch pen 63 is electrically connected by wire (or wirelessly) to the system installed inside the editing unit 13, and operates in cooperation with the tablet of the tablet built-in monitor 62. When the user operates the touch pen 63 and taps the tablet (brings the touch pen 63 into contact with (or close proximity to) the display screen of the tablet built-in monitor 62), the tablet operates as an input unit that receives input from the user. The number of touch pens 63 installed, their installation positions, their shapes, and so on are arbitrary.

  The speaker 64 outputs sound related to the editing work of the photo sticker creation game, such as guidance voice, sound effects, and BGM. The speaker 64 is placed on the right side of the tablet built-in monitor 62. Since a right-handed user performs editing work while holding the touch pen 63 in the right hand, even when leaning on the editing unit 13 during editing, the user is unlikely to lean on the right side of the tablet built-in monitor 62. Therefore, by placing the speaker 64 on the right side of the tablet built-in monitor 62, even when a right-handed user leans on the editing unit 13 during editing work, the user can be prevented from covering the speaker 64 and making the sound inaudible. The number, design, shape, and so on of the speakers 64 are arbitrary.

  The additional coin insertion/return port 65 is provided so that the user can insert additional coins, and receive change, when the user wants to extend the editable time or to output additional photo stickers. Payment of this additional price is not limited to coins; naturally, it can also be made payable with banknotes. Furthermore, instead of actually circulating cash (coins and banknotes), payment may also be accepted in prepaid cards, tokens, electronic money, or the like used at the amusement facility where the photo sticker creating apparatus 1 is installed.

  However, the editable time can be extended only when no shooting work is currently being performed in the shooting space 32 (when there is no next user) or when the other editing interface (in this case, the second editing interface 13B) is unused; it cannot be extended while shooting work is being performed in the shooting space 32 and the other editing interface is in use.
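  This extension rule is a simple condition on the machine's state; the sketch below encodes it directly, with hypothetical flag names.

```python
def can_extend_editing(shooting_in_progress, other_interface_in_use):
    # Extension is allowed when the shooting space is idle (no next user)
    # or the other editing interface is free; it is refused only when
    # both are occupied, matching the rule in the description.
    return (not shooting_in_progress) or (not other_interface_in_use)

print(can_extend_editing(True, True))    # -> False: both busy, no extension
print(can_extend_editing(False, True))   # -> True: no next user is waiting
```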

  Note that the configuration of the editing unit 13 on the second editing space 33B side is basically the same as the configuration of the first editing space 33A side shown in FIG. In the following, the configuration example shown in FIG. 6 is also applied to the case where the configuration of the editing unit 13 on the second editing space 33B side is described.

  FIG. 7 is a diagram illustrating a configuration example of the photo sticker discharge unit 13C of the editing unit 13.

  As shown in FIG. 7, the photo sticker discharge unit 13C is provided with a photo sticker discharge port 71, from which are discharged both the photo stickers created by users who performed editing work in the first editing space 33A (that is, photo stickers created by operating the first editing interface 13A) and the photo stickers created by users who performed editing work in the second editing space 33B (that is, photo stickers created by operating the second editing interface 13B).

  A printer is installed inside the editing unit 13, and a printing image created by operating the first editing interface 13A or the second editing interface 13B is printed on sticker paper by the printer. The photo sticker output from the printer is discharged from the photo sticker discharge port 71.

  Further, a speaker (not shown) may be provided in the photo sticker discharge unit 13C. The speaker (not shown) outputs sound relating to printing and discharging of the photo sticker such as guidance sound, sound effect, or background music.

  Next, an internal configuration example of the photo sticker creating apparatus 1 will be described.

  FIG. 8 is a block diagram illustrating a functional configuration example of the photo sticker creating apparatus 1. The same parts as those described above are denoted by the same reference numerals.

  In FIG. 8, the photo sticker creating apparatus 1 includes a control unit 101, a processing unit that controls the entire photo sticker creating apparatus 1. To the control unit 101, a storage unit 102, a communication unit 103, a drive 104, a ROM (Read Only Memory) 106, a RAM (Random Access Memory) 107, a photographing unit 112, an editing unit 113, a printer 114, the external monitor 16, the speaker 17, and the touch button 18 are each connected via predetermined buses.

  The storage unit 102 has a nonvolatile storage medium such as a hard disk or flash memory; it stores various setting information supplied from the control unit 101 in the storage medium, and reads setting information recorded in the storage medium and supplies it to the control unit 101. This storage medium may be anything as long as it can store information.

  The communication unit 103 is connected to other communication devices (not shown) via an external network (not shown) such as the Internet or the public telephone network, or simply via a communication cable (not shown), and is controlled by the control unit 101 to communicate with other communication devices such as a user's mobile terminal (for example, a mobile phone), a user's personal computer, or a central management server. For example, the communication unit 103 transmits a transmission signal supplied as a communication signal from the control unit 101 to another communication device, and supplies a reception signal received from another communication device to the control unit 101 as a communication signal.

  Into the drive 104, removable media such as magnetic disks (including flexible disks), optical disks (including CD-ROMs (Compact Disc-Read Only Memory) and DVDs (Digital Versatile Disc)), magneto-optical disks, and semiconductor memories are loaded as necessary, and the computer programs and data read from them are supplied to the control unit 101 and stored in the storage unit 102 or installed as necessary.

  The ROM 106 stores in advance programs and data to be executed by the control unit 101 and supplies them to the control unit 101 based on instructions from the control unit 101. The RAM 107 temporarily holds the data and programs processed by the control unit 101.

  The photographing unit 112 is a block related to the photographing unit 12 and includes a coin processing unit 121, the background curtain unit 15, an illumination control unit 122, the camera 51, the touch panel monitor 52, and the speaker 124.

  The coin processing unit 121 is activated in response to an activation signal input from the control unit 101. When a coin is inserted into the coin insertion/return port 55, the coin processing unit 121 performs processing related to the user's payment to the photo sticker creating apparatus 1 (participation in the game) and supplies an activation signal to the control unit 101. In response to the activation signal, the control unit 101 causes the touch panel monitor 52 and the speaker 124 to perform charging processing, guidance, and the like.

  Based on the background control signal supplied from the control unit 101, the background curtain unit 15 selects a background to be provided, that is, raises and lowers (expands and stores) the movable background curtain. The background may be selected manually by the user.

  Based on the illumination control signal supplied from the control unit 101, the illumination control unit 122 turns on or off the illumination device installed behind the milky acrylic plate 53.

  The camera 51 is controlled by the control unit 101 to photograph the user and the like in the shooting space 32 as subjects during the shooting processing of the photo sticker creation game. For example, the camera 51 captures a moving image under the control of the control unit 101 and supplies the captured moving image data to the control unit 101. In addition, the camera 51 acquires a photographed image (still image) by operating its mechanical shutter based on a shutter control signal supplied from the control unit 101, and causes the acquired still image data to be stored in the storage unit 102.

  The touch panel monitor 52 includes the live view monitor 123. The live view monitor 123 of the touch panel monitor 52 displays images based on RGB signals supplied from the control unit 101 (images captured in real time by the camera 51, composite images created using captured images, GUI images, and the like). Further, when the user taps (touches or comes close to) the touch panel monitor 52 with a finger or the like, the touch panel monitor 52 supplies the position information to the control unit 101. The control unit 101 analyzes the position information and identifies and accepts the information and instructions the user input with respect to the displayed image.
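  Resolving the reported tap position into the GUI element the user meant is a plain hit test. Here is a minimal sketch of that analysis step; the Button type, coordinates, and labels are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Button:
    label: str
    x: int       # top-left corner of the button on screen
    y: int
    w: int       # width and height of the touch target
    h: int

def handle_tap(buttons, tap_x, tap_y):
    # Map the position reported by the touch panel to the GUI element it
    # falls on, as the control unit does when analyzing position information.
    for b in buttons:
        if b.x <= tap_x < b.x + b.w and b.y <= tap_y < b.y + b.h:
            return b.label
    return None  # tap landed outside every control

screen = [Button("start", 100, 600, 200, 80), Button("cancel", 400, 600, 200, 80)]
print(handle_tap(screen, 150, 630))  # -> "start"
```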

  The speaker 124 is provided at an arbitrary position of the photographing unit 12 and outputs, for example, sound of a sound signal supplied from the control unit 101 (such as shooting work guidance sound, sound effect, or background music).

  The editing unit 113 is a block related to the editing unit 13, and includes a first editing unit 113A corresponding to the first editing interface 13A and a second editing unit 113B corresponding to the second editing interface 13B. The first editing unit 113A and the second editing unit 113B have the same configuration, and the following description is given with respect to the first editing unit 113A, but is also applicable to the second editing unit 113B.

  As described with reference to FIG. 6, the first editing unit 113A includes a coin processing unit 130 in addition to the tablet built-in monitor 62, the touch pen 63-1 and the touch pen 63-2, and the speaker 64.

  As described above, the tablet built-in monitor 62 includes the editing monitor 141 made of a display device such as a CRT display or LCD, and the colorless and transparent tablet 142 superimposed on the display screen of the editing monitor 141. The tablet built-in monitor 62 displays on the editing monitor 141 an image relating to editing, such as an editing input GUI and a photographed image, corresponding to the RGB signal supplied from the control unit 101. When the user operates the touch pen 63-1 or the touch pen 63-2 and taps the tablet 142, the tablet built-in monitor 62 supplies a signal representing the position information to the control unit 101 as an input signal. The control unit 101 analyzes the position information represented by the input signal, specifies and accepts information and instructions input by the user with respect to the display image. That is, editing input for the captured image is performed using these configurations.

  The speaker 64 outputs the sound of the sound signal supplied from the control unit 101 (editing work guide sound, sound effect, or sound related to the editing work process of the photo sticker creation game, such as BGM).

  The coin processing unit 130 is activated in response to an activation signal input from the control unit 101. When a coin is inserted into the additional coin insertion / return port 65, the coin processing unit 130 performs processing related to the user's payment for the photo sticker creation apparatus 1 (additional print acceptance) and supplies an activation signal to the control unit 101. The control unit 101 causes the printer 114 to perform additional printing processing in response to the activation signal.

  The printer 114 prints the result of the editing work performed by the editing unit 113 on sticker paper. When the printer 114 acquires the editing work result of the first editing unit 113A from the control unit 101, the printer 114 takes the sticker paper 162 from the sticker paper unit 161 and prints the editing work result on the sticker paper 162. When printing is finished, the printer 114 discharges the printed sticker paper 162 as a photo sticker from the photo sticker discharge port 71 and provides it to the user.

  When the external monitor 16 obtains, from the control unit 101, information indicating that a user is shooting in the shooting space 32 of the shooting unit 12, it outputs, for users waiting to use the shooting space next, a video showing the progress of that user's game. While the external monitor 16 does not acquire such information from the control unit 101, it outputs content (a demonstration) that guides (explains) the flow of the photo sticker creation game, how to play, and so on. Under the control of the control unit 101, the speaker 17 outputs audio that matches the video and content output from the external monitor 16.

  Further, when the external monitor 16 acquires, from the control unit 101, information indicating that the touch button 18 has been operated by the user, the external monitor 16 switches the video, content, and the like that it outputs.

  When the touch button 18 is operated (touched) by a user, the touch button 18 supplies an input signal representing the operation content to the control unit 101. The control unit 101 supplies information indicating that the operation represented by the input signal has been accepted to the external monitor 16, and controls the external monitor 16 to switch its display content.

  Next, the control unit 101 will be described. FIG. 9 shows an example of a functional block configuration realized by the control unit 101 executing a program stored in the ROM 106 or the like.

  The control unit 101 includes a shooting processing unit 202 that performs processing related to the price (coins) inserted when starting the photo sticker creation game and processing related to the shooting work process of the game, such as shooting the user; an editing processing unit 203 that performs processing related to the editing work process of the game, such as graffiti editing of the photographed images; a print processing unit 204 that performs processing related to printing; and a post-service processing unit 205 that performs processing related to the post-service process of the game, which serves users who have finished the editing work.

  That is, the control unit 101 controls processing relating to each process of the photo sticker creation game.

  FIG. 10 is a functional block diagram illustrating a more detailed configuration example of the imaging processing unit 202 and the editing processing unit 203 of FIG.

  In the example of FIG. 10, the photographing processing unit 202 is a processing unit that performs processing in the photographing work process of the photo sticker creation game. For example, the photographing processing unit 202 performs processing for accepting payment, photographing the user, and displaying various GUIs.

  In the example of FIG. 10, the shooting processing unit 202 includes a progress control unit 211, a shooting preparation processing unit 212, a course selection control unit 213, a composition image selection control unit 214, a live view display control unit 215, a shot image acquisition unit 216, an image processing control unit 217, and a composite image processing unit 218.

  The progress control unit 211 controls the progress of the photographing work process.

  The shooting preparation processing unit 212 performs processing related to shooting preparation, such as adjustment of reflection and adjustment of brightness.

  The course selection control unit 213 performs processing related to selection of a shooting course in shooting work.

  The composition image selection control unit 214 displays a composition image including a background, a foreground, and the like on the touch panel monitor 52, and controls processing related to the composition image selection performed by the user.

  The live view display control unit 215 performs processing related to display on the touch panel monitor 52 (live view display).

  The captured image acquisition unit 216 performs processing related to imaging, particularly acquisition of captured images.

  The image processing control unit 217 performs image processing on the captured image according to the shooting course.

  The composite image processing unit 218 performs processing related to generation of a composite image such as composite of captured images.

  In the example of FIG. 10, the editing processing unit 203 performs a process related to graffiti editing on the captured image obtained in the shooting work process by the user in the editing work process of the photo sticker creation game.

  For example, the editing processing unit 203 displays various GUIs, accepts graffiti editing input by the user, creates a composition image corresponding to the graffiti editing input, and synthesizes it with a captured image. That is, the editing processing unit 203 provides the user with a graffiti editing function for the captured image. Any function may be provided as the graffiti editing function; examples include a graffiti pen function that draws freehand lines on the photographed image, and a stamp function that composites a predetermined image at an arbitrary position on the photographed image.
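  As an illustrative sketch of the two graffiti editing functions named above, the following Python code draws a freehand pen stroke and composites a stamp image, assuming the Pillow imaging library; the file names and coordinates are placeholders, not values from the patent.

```python
# Sketch of a "graffiti pen" stroke and a "stamp" composite using Pillow.
# File names and positions are hypothetical stand-ins.
from PIL import Image, ImageDraw

photo = Image.open("captured.png").convert("RGBA")

# Graffiti pen: draw a freehand line along a list of tap coordinates.
draw = ImageDraw.Draw(photo)
stroke = [(40, 50), (60, 80), (95, 90), (130, 70)]  # pen positions from tablet
draw.line(stroke, fill=(255, 0, 128, 255), width=4)

# Stamp: composite a prepared image at the tapped position.
stamp = Image.open("heart_stamp.png").convert("RGBA")
photo.alpha_composite(stamp, dest=(200, 120))

photo.save("edited.png")
```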

  The editing processing unit 203 includes a progress control unit 231, an editing preparation processing unit 232, an input reception control unit 233, an editing screen display control unit 234, and a graffiti editing processing unit 235.

  The progress control unit 231 controls the progress of the editing work process.

  The editing preparation processing unit 232 performs processing related to editing preparation, such as selection of an image to be edited.

  The input reception control unit 233 controls the tablet 142 and performs a process of receiving operation input performed by the user with respect to various GUIs displayed on the editing monitor 141.

  The editing screen display control unit 234 performs processing related to display of the editing screen displayed on the editing monitor 141 and various GUI displays on the editing screen.

  The graffiti editing processing unit 235 performs processing related to graffiti editing on the edited image in response to an input operation using the touch pen 63 by the user.

  Next, the flow of the photo sticker creating game performed in the photo sticker creating apparatus 1 will be described with reference to the flowchart of FIG. 11.

  In step S1, that is, while the photo sticker creation game is not being played in the shooting space 32, the shooting processing unit 202 of the control unit 101 performs a process of displaying a demonstration screen on the touch panel monitor 52.

  A user who plays the photo sticker creation game first inserts coins into the coin insertion / return slot 55 of the front unit 12A and pays for the game. The coin processing unit 121 accepts the coin insertion and supplies an activation signal to the control unit 101. In step S2, the imaging processing unit 202 determines, based on the activation signal, whether or not a price has been inserted, and proceeds to step S3 when determining that a valid price has been inserted. Until the user inserts a coin, the imaging processing unit 202 continues the process of displaying the demonstration screen on the touch panel monitor 52.

  In step S3, the shooting processing unit 202 controls each unit of the shooting unit 112 and executes the shooting process, which is the processing related to the shooting work of the photo sticker creation game performed in the shooting space 32.

  In the shooting process, the shooting processing unit 202 has the user select a composition image, that is, an image to be displayed as the background or foreground of the photographed subject image, and supplies the editing processing unit 203 with a captured image obtained by combining the photographed subject image with the composition image selected by the user. The shooting processing unit 202 also has the user select a shooting course, and performs image processing on the photographed image according to the selected course. Details of the shooting process will be described later with reference to FIG. 12 and subsequent figures; a plurality of (for example, six) photographed images are obtained by the shooting process.

  When the shooting work is completed, in step S4, the shooting processing unit 202 guides the user in the shooting space 32 who has finished the shooting work to move to the editing space 33 (the first editing space 33A or the second editing space 33B) by means of images and sound.

  In step S5, the editing processing unit 203 performs an editing process in which the user who has moved to the editing space 33 performs editing input on the plurality of (for example, six) photographed images obtained by the shooting process. Details of the editing process will be described later with reference to FIG. 25.

  When the editing work is completed, in step S6, the print processing unit 204 controls the printer 114 to perform a printing process that prints the edited images obtained by the editing work on the sticker sheet 162, in the number corresponding to the standard price.

  In step S7, the print processing unit 204 determines whether additional printing has been selected by the user. In the photo sticker creating apparatus 1, by paying the price for additional printing, it is possible to print a predetermined number of sheets beyond the normal number. In step S7, therefore, it is determined whether the user has selected additional printing, that is, whether the price for additional printing has been paid. The user can select additional printing by inserting the price for additional printing into the additional coin insertion / return port 65 during the editing process.

  If it is determined in step S7 that additional printing has not been selected, the print processing unit 204 skips step S8 and advances the process to step S9. On the other hand, if it is determined in step S7 that additional printing has been selected, the print processing unit 204 performs additional printing processing for printing the number of additional printings in step S8.

  In step S9, the post-service processing unit 205 performs post-service processing that provides, to the user waiting for printing to finish, a service such as a mini game or transfer of the edited images to a mobile terminal. When the post-service processing is completed, in step S10, the post-service processing unit 205 guides the user in the editing space 33 to move to the printed material receiving area 34. When the printing process is completed, in step S11, the print processing unit 204 controls the printer 114 to discharge the printed sticker sheet 162 from the photo sticker discharge port 71, and the photo sticker creation game process ends.

  The photo sticker creation game is performed according to the above flow.
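  For orientation, the flow of steps S1 to S11 can be summarized as a single control sequence. The following Python sketch is a hypothetical condensation; the unit objects and method names are illustrative stand-ins for the processing units described above, not an actual implementation.

```python
# Hypothetical condensation of the game flow in FIG. 11 (steps S1-S11).
def photo_sticker_game(shooting, editing, printing, post_service):
    shooting.show_demo_screen()                # S1: demo until coins arrive
    shooting.wait_for_payment()                # S2: valid price inserted?
    photos = shooting.run_shooting()           # S3: shooting work process
    shooting.guide_to_editing_space()          # S4: move user to editing space
    edited = editing.run_editing(photos)       # S5: graffiti editing input
    printing.print_stickers(edited)            # S6: print standard number
    if printing.additional_print_selected():   # S7: extra price paid?
        printing.print_additional(edited)      # S8: additional printing
    post_service.run(edited)                   # S9: mini game / image transfer
    post_service.guide_to_receiving_area()     # S10: move to receiving area
    printing.discharge_stickers()              # S11: eject printed sticker
```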

  Next, the detailed flow of the photographing process executed in step S3 of FIG. 11 will be described with reference to the flowchart of FIG. In the photographing process, as described above, for example, six photographed images are finally determined as editing processing targets.

  In step S21, the course selection control unit 213 of the shooting processing unit 202 displays a shooting course selection screen on the touch panel monitor 52 and has the user select a shooting course. For example, the user selects one of the GUI buttons provided on the shooting course selection screen (for example, a "couple course" button and a "friend course" button) by tapping it, thereby inputting a shooting course selection instruction. As will be described later, the shooting courses include, for example, a couple course and a friend course as courses relating to user attributes (for example, the users' relationship). The couple course is a shooting course in which shooting processing is performed especially for a male-female pair of users, and the friend course is a shooting course in which shooting processing is performed for a same-sex pair or a group of three or more users.

  When the shooting course selection screen is displayed, the course selection control unit 213 determines whether or not a couple course has been selected by the user in step S22. If it is determined that the couple course has been selected, the process proceeds to step S23. In step S23, the photographing processing unit 202 performs a couple course photographing process. Details of the couple course shooting process will be described later.

  If it is determined in step S22 that the couple course has not been selected, that is, if the friend course has been selected by the user, the course selection control unit 213 advances the process to step S24, and the photographing processing unit 202 performs the friend course photographing process. Details of the friend course shooting process will be described later.

  When the process of step S23 or step S24 ends, the imaging process ends, the process returns to step S3 in FIG. 11, and the processes after step S3 are executed.

  Next, a detailed flow of the couple course photographing process executed in step S23 of FIG. 12 will be described with reference to the flowcharts of FIGS. 13 and 14.

  When the couple course shooting process is started, in step S31, the shooting preparation processing unit 212 determines whether or not a brightness has been selected. At this time, for example, a setting screen (GUI) for setting and inputting the brightness of the photographed image is displayed on the touch panel monitor 52, and the shooting preparation processing unit 212 accepts an image brightness instruction input by the user operating the setting screen.

  If it is determined in step S31 that a brightness has been selected, in step S33, the shooting preparation processing unit 212 adjusts the brightness of the photographed image (captured image) according to the brightness determined to have been selected in step S31. When this adjustment ends, the process proceeds to step S34.

  On the other hand, if it is determined in step S31 that no brightness has been selected, the shooting preparation processing unit 212 sets a default brightness in step S32 and, in step S33, adjusts the brightness of the photographed image (captured image) according to the default brightness.
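  The brightness logic of steps S31 to S33 amounts to a selected-value-or-default rule. A minimal Python sketch follows, in which the numeric gain values and function names are assumptions for illustration only.

```python
# Sketch of steps S31-S33: adopt the user's brightness choice, else a default.
DEFAULT_BRIGHTNESS = 1.0   # hypothetical neutral gain

def prepare_brightness(selected_gain=None):
    """S31/S32: use the user's selection if present, else the default."""
    return selected_gain if selected_gain is not None else DEFAULT_BRIGHTNESS

def adjust_pixel(value, gain):
    """S33: apply the chosen brightness gain to one 8-bit pixel value."""
    return min(255, int(value * gain))

print(adjust_pixel(180, prepare_brightness()))     # no selection -> default
print(adjust_pixel(180, prepare_brightness(1.2)))  # user chose "brighter"
```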

  In step S34, the course selection control unit 213 displays, on the touch panel monitor 52, a selection screen (GUI) for selecting the type of image processing to be performed on the captured image.

  As a selection screen for selecting the type of image processing, for example, a gender selection screen shown in FIG. 15 is displayed.

  As shown in FIG. 15, the gender selection screen 301 is, except for some parts, provided with two main sets of components on the left and right. This basically allows the two users of the same group (couple) to select and input their genders at the same time.

  A live view image display unit 311 is provided in the center of the gender selection screen 301. The live view image display unit 311 displays the captured images of subject A and subject B from the camera 51 (hereinafter also referred to as the live view image). In the live view image display unit 311, the positions of the faces of subject A and subject B are recognized, and frames 321A and 321B indicating the recognized face ranges (areas) of subject A and subject B are superimposed on the live view image.
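  As a sketch of the face recognition and frame overlay described for the live view image display unit 311, the following Python code, assuming OpenCV is available, detects faces in a frame and superimposes a rectangle on each recognized face range; the file names are placeholders, and the patent does not specify the detection method.

```python
# Sketch: detect faces in a live view frame and draw the face-range frames.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("live_view_frame.png")
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# One rectangle per recognized face range (area), e.g. frames 321A and 321B.
for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1,
                                             minNeighbors=5):
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("live_view_with_frames.png", frame)
```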

  On the gender selection screen 301, a gender selection button 312-1 and a sample image display unit 313-1 are provided on the left side of the live view image display unit 311, and a gender selection button 312-2 and a sample image display unit 313-2 are provided on the right side.

  In the following, when there is no need to distinguish between the gender selection buttons 312-1 and 312-2, or between the sample image display units 313-1 and 313-2, they are referred to simply as the gender selection button 312 and the sample image display unit 313.

  The gender selection button 312 includes a "boy" button and a "girl" button. When the user selects one of them, that is, a gender, the type of image processing to be performed on the face area of the subject corresponding to that user in the photographed image obtained as a result of shooting is determined.

  When a gender is selected with the gender selection button 312, the sample image display unit 313 extracts the face area of the subject corresponding to the user who made the selection from a still image taken from the live view image at that time, and displays, as a sample image, the face image with the image processing corresponding to the selected gender applied.

  In addition, a remaining time display unit 314 that displays the remaining time of the time limit allocated in advance to the shooting work is provided in the upper right part of the gender selection screen 301. Further, at the top of the gender selection screen 301, a message "Select each gender!" prompting each of the two users to select a gender with the gender selection button 312 is displayed.

  That is, in the couple course shooting process, the image processing performed on the photographed image is the processing corresponding to the selected gender (male image processing or female image processing). Here, the male image processing is, for example, processing that gives the subject's face image a tanned brown skin color or a sharper facial outline, and the female image processing is, for example, processing that whitens the skin color or enlarges the eyes in the subject's face image. Note that when male is selected as the gender, the male image processing may be performed, or no image processing may be performed at all. In the latter case, when the group includes a male, a more realistic photographed image of the male subject's face is obtained.
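  The following Python sketch illustrates the per-face dispatch this implies, using simple whole-face pixel operations as stand-ins for the actual retouching (the patent does not disclose the algorithms); it assumes NumPy, and the face boxes and scaling constants are invented for illustration.

```python
# Sketch: apply male or female processing to each subject's face region.
import numpy as np

def process_face(image, box, gender):
    """Apply a stand-in for gender-specific processing to one face box."""
    x, y, w, h = box
    face = image[y:y + h, x:x + w].astype(np.float32)
    if gender == "male":
        # darken toward a tanned tone (stand-in for tan + sharper outline)
        face *= np.array([0.95, 0.85, 0.75])
    elif gender == "female":
        # lighten toward white (stand-in for whitening + larger eyes)
        face = face * 0.85 + 255 * 0.15
    image[y:y + h, x:x + w] = np.clip(face, 0, 255).astype(np.uint8)

photo = np.zeros((480, 640, 3), dtype=np.uint8) + 128  # dummy captured image
process_face(photo, (100, 80, 120, 140), "male")       # subject A's face box
process_face(photo, (360, 90, 120, 140), "female")     # subject B's face box
```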

  Returning to the flowchart of FIG. 13, in step S35, the course selection control unit 213 accepts the user's selection of gender-specific image processing, that is, gender selection, on the gender selection screen 301 displayed in step S34.

  In step S36, the course selection control unit 213 displays a sample image to which the image processing corresponding to the user's selection accepted in step S35 has been applied.

  That is, on the gender selection screen 301 shown in FIG. 15A, when the user corresponding to subject A selects the "boy" button of the gender selection button 312-1 and the user corresponding to subject B selects the "girl" button of the gender selection button 312-2, the "boy" button of the gender selection button 312-1 and the "girl" button of the gender selection button 312-2 are highlighted, as shown in FIG. 15B. Further, the sample image display unit 313-1 displays a face image (subject A′) obtained by applying the male image processing to the face image extracted from the face area of subject A in a still image taken from the live view image, and the sample image display unit 313-2 displays a face image (subject B′) obtained by applying the female image processing to the face image extracted from the face area of subject B in that still image.

  This allows the user to visualize what the photographed image will look like after the image processing is applied.

  On the gender selection screen 301 of FIG. 15, the sample image display unit 313 displays, as a sample image, a face image to which the image processing corresponding to the selected gender has been applied. Alternatively, the sample image display unit 313 may be omitted from the gender selection screen 301, so that no face image (sample image) with the gender-specific image processing applied is displayed. In this case, the process of step S36 in the flowchart of FIG. 13 is omitted, and the user can perform the shooting work while looking forward to seeing the photographed image after the image processing.

  Returning to the flowchart of FIG. 13, in step S37, the compositing image selection control unit 214 displays, on the touch panel monitor 52, a shooting compositing image selection screen (GUI) for selecting a shooting compositing image, that is, a foreground or background image to be combined with the captured image. Here, shooting compositing images corresponding to the couple course may be displayed on the shooting compositing image selection screen and made selectable.

  In step S38, the compositing image selection control unit 214 accepts the user's selection of a predetermined number (for example, six) of shooting compositing images from among those displayed in the process of step S37.

  Here, the manner of selecting shooting compositing images may be changed according to the gender selected on the gender selection screen 301 described with reference to FIG. 15. Specifically, for example, when a male is included in the selected genders, instead of accepting the selection of compositing images one at a time, the selection of a plurality (for example, two) of compositing images at once may be accepted. This simplifies the operation at the time of shooting for male users who are not very familiar with the photo sticker creation game. In addition, when the selected genders are male and female, the selection of a couple-exclusive set of shooting compositing images, in which a plurality (for example, six) of images are bundled, may be accepted. In that case, the shooting compositing images need only be selected once, which also simplifies the operation at the time of shooting.

  Furthermore, the BGM (Back Ground Music) played in the shooting space during the shooting process, the background color of the live view image displayed on the touch panel monitor 52, and the like may be changed according to the selected gender.

  In step S39, the live view display control unit 215 combines the shooting compositing image with the captured image from the camera 51 and displays the result on the touch panel monitor 52 (live view display). Here, when a shooting compositing image corresponding to the couple course has been selected in step S37, a captured image combined with that couple course compositing image is displayed on the touch panel monitor 52 as the live view. Details of the live view display process will be described later.

  In step S40, the captured image acquisition unit 216 displays a countdown indicator on the touch panel monitor 52 or the like, controls the camera 51 to perform imaging, and obtains a captured image.

  In step S41, the image processing control unit 217 performs the image processing selected in step S35 on each subject in the captured image obtained in step S40. That is, in the photographed image, the male image process is performed on the face area of the subject A, and the female image process is performed on the face area of the subject B.

  In step S42, the composite image processing unit 218 combines the photographic composite image with the captured image that has been subjected to the image processing, and generates a composite image. Further, in step S43, the composite image processing unit 218 displays the generated composite image on the touch panel monitor 52 or the like, and allows the user to check the composite image (photographing result).

  In step S44, the progress control unit 211 determines whether or not shooting has been performed a predetermined number of times. If the number of shots has not reached the predetermined number, the process returns to step S39 and the subsequent processing is repeated. For example, when shooting has been performed to obtain a shot image to be combined with each of the compositing images selected on the shooting compositing image selection screen, for a total of six shots, it is determined in step S44 that shooting has been performed the predetermined number of times.

  If it is determined in step S44 that shooting has been performed the predetermined number of times, the process proceeds to step S45 in FIG. 14.

  In step S45 of FIG. 14, the progress control unit 211 determines whether or not the shootable time remains, based on the elapsed time or the remaining time. If it is determined in step S45 that shooting is still possible, the process proceeds to step S46, where additional shooting, so-called "bonus shooting", is performed.

  In step S46, the progress control unit 211 displays a bonus shooting guide screen for guiding the bonus shooting on the touch panel monitor 52 or the like, and allows the user to check the outline of the bonus shooting.

  In step S47, the compositing image selection control unit 214 displays the shooting compositing image selection screen on the touch panel monitor 52 in the same manner as in step S37 of FIG. 13, and in step S48, accepts the user's selection of shooting compositing images in the same manner as in step S38.

  In step S49, as in step S39, the live view display control unit 215 synthesizes the synthesis image and the captured image of the camera 51, and displays them on the touch panel monitor 52 (live view display).

  In step S50, the captured image acquisition unit 216 performs bonus shooting, which is additional shooting. This bonus shooting is also performed in the same manner as the normal shooting described above. That is, the captured image acquisition unit 216 displays a countdown indicator on the touch panel monitor 52 or the like, controls the camera 51, and acquires a captured image, similarly to step S40.

  In step S51, the image processing control unit 217 performs the image processing selected in step S35 on each subject in the photographed image obtained in step S50, similarly to step S41.

  In step S52, similarly to step S42, the composite image processing unit 218 generates a composite image by synthesizing the shooting composite image with the shot image obtained by shooting.

  In step S53, the composite image processing unit 218 displays the composite image on the touch panel monitor 52 in the same manner as in step S43, and allows the user to check the photographing result.

  In step S54, the progress control unit 211 determines whether or not a predetermined time has elapsed. If the photographing is still possible, the process returns to step S47, the subsequent processing is repeated, and bonus photographing is performed again. .

  On the other hand, if it is determined in step S54 that the predetermined time has elapsed, the process proceeds to step S55. If it is determined in step S45 described above that no shootable time remains, bonus shooting is not performed, so steps S46 to S54 are skipped and the process likewise proceeds to step S55.

  In step S55, the composite image processing unit 218 displays the composite images generated so far on the touch panel monitor 52, and stores a plurality of (for example, six) photographed images in the storage unit 102 as editing target images, together with shooting information indicating that the couple course was selected. When bonus shooting has been performed, the composite image processing unit 218 has the user select a plurality of (for example, six) editing target images from among the normal photographed images and the images obtained by bonus shooting, and stores the selected photographed images in the storage unit 102 together with the shooting information indicating that the couple course was selected.

  When the process of step S55 ends, the couple course shooting process ends, and the process returns to step S23 of FIG. 12.

  In this way, when the couple course is selected, image processing according to gender is selected by the user as the type of image processing for the subject (user) in the captured image, and the selected image processing is performed on the subject in the captured image. Since the image processing intended by the user can thus be performed on the captured image with a simple operation, the user's satisfaction with the shooting work in the photo sticker creation work can be improved.

  In the above example, the two users forming a couple select different genders, but they may of course select the same gender, or a gender opposite to their actual gender. This makes it possible to apply playful, entertaining image processing to the photographed image, further improving the user's satisfaction with the shooting work in the photo sticker creation work.

  In the above example, image processing according to gender is selected, but image processing according to other user attributes, such as the parent / child (adult / child) distinction or age, may be selected instead. Specifically, when image processing according to the parent / child distinction is selected, a parent / child selection screen provided with parent / child selection buttons consisting of a "parent" button and a "child" button is displayed instead of the gender selection screen 301 shown in FIG. 15. When the "parent" button is selected, image processing that sharpens the shape (outline) of the face so that it looks dignified may be performed, and when the "child" button is selected, image processing that rounds the shape of the face so that it looks lovely may be performed. When image processing according to age is selected, an age selection screen provided with age selection buttons for each age bracket, such as a "20s" button and a "40s" button, is displayed, and image processing that changes the shape of the face or adjusts the texture of the skin, such as removing spots or wrinkles, may be performed according to the selected button. Furthermore, appearance (physical) characteristics such as height and hair length may be used as user attributes, with image processing performed according to the stage of height or hair length.

  In the above, the sample image is a face image obtained by applying the image processing for the selected gender to the face area of the subject in a still image extracted from the live view image (captured image). Alternatively, a face image may be obtained by shooting the user's face, and the face image with the image processing for the selected gender applied may be displayed as the sample image.

  Here, with reference to the flowcharts of FIGS. 17 and 18, a detailed flow of a couple course shooting process in which a face image is obtained by shooting the user's face and is displayed as a sample image with the gender-specific image processing applied will be described.

  The processes in steps S71 to S75 and S80 to S96 in the flowcharts of FIGS. 17 and 18 are basically the same as the processes in steps S31 to S33 and S39 to S55 in the flowcharts of FIGS. 13 and 14, and their description is omitted.

  That is, in step S75 of the flowchart of FIG. 17, after the selection of shooting compositing images is accepted, in step S76 the course selection control unit 213 displays a trial shooting execution screen on the touch panel monitor 52 and has the user perform trial shooting.

  For this purpose, for example, a trial shooting execution screen 400 shown in FIG. 19A is displayed on the touch panel monitor 52.

  In the vicinity of the center of the trial shooting execution screen 400, face image display units 411-1 and 411-2 are provided. In face image display sections 411-1 and 411-2, face areas in the live view images of subject A and subject B from camera 51 are displayed, respectively.

  In addition, a shooting countdown indicator 412 is displayed at the bottom of the trial shooting execution screen 400 to indicate the remaining number of seconds until the shutter timing of shooting, for example, “3, 2, 1, 0” or the like.

  A remaining time display unit 413 that displays the remaining time of the time limit allocated in advance to the shooting work is provided in the upper right part of the trial shooting execution screen 400. Further, at the top of the trial shooting execution screen 400, a message "Enter the frame and take a trial shot!" prompting each of the two users to prepare for trial shooting is displayed.

  Each user moves so that his or her face (subject) in the live view image is displayed in the corresponding one of the face image display units 411-1 and 411-2, and when the number displayed on the shooting countdown indicator 412 reaches "0", trial shooting is performed.

  Returning to the flowchart of FIG. 17, in step S77, the course selection control unit 213 displays a selection screen (GUI) for selecting the type of image processing to be performed on the captured image on the touch panel monitor 52.

  When trial shooting is performed in step S76, trial shooting execution screen 400 shown in FIG. 19A displayed on touch panel monitor 52 is switched to gender selection screen 420 shown in FIG. 19B.

  As shown in FIG. 19B, face image display sections 411-1 and 411-2 are provided near the center of the sex selection screen 420, as in the trial shooting execution screen 400. However, here, the face areas (face images) in the still images of the subject A and the subject B obtained as a result of trial shooting are displayed on the face image display units 411-1 and 411-2, respectively.

  Gender selection buttons 421-1 and 421-2 are provided below the face image display units 411-1 and 411-2. The gender selection buttons 421-1 and 421-2 have the same functions as the gender selection buttons 312-1 and 312-2 described with reference to FIG.

  Further, the remaining time display unit 413 is provided in the upper right part of the gender selection screen 420, as in the trial shooting execution screen 400. Furthermore, at the top of the gender selection screen 420, a message "Select your gender!" prompting each of the two users to select a gender with the gender selection buttons 421-1 and 421-2 is displayed.

  Returning to the flowchart of FIG. 17, in step S78, the course selection control unit 213 accepts the user's selection of gender-specific image processing, that is, gender selection, on the gender selection screen 420 displayed in step S77.

  In step S79, the course selection control unit 213 displays a sample image to which the image processing corresponding to the user's selection accepted in step S78 has been applied.

  That is, on the gender selection screen 420 shown in FIG. 19B, when the user corresponding to subject A selects the "boy" button of the gender selection button 421-1 and the user corresponding to subject B selects the "girl" button of the gender selection button 421-2, the "boy" button of the gender selection button 421-1 and the "girl" button of the gender selection button 421-2 are highlighted, as shown in FIG. 19C. Further, the face image display unit 411-1 displays, as a sample image, a face image (subject A′) obtained by applying the male image processing to the face image of subject A in the trial-shot still image, and the face image display unit 411-2 displays, as a sample image, a face image (subject B′) obtained by applying the female image processing to the face image of subject B in the trial-shot still image.

  As a result, the displayed face image (sample image) is one in which the image processing has been applied to a face image actually obtained by shooting, under conditions close to those of the real shooting, so the user can visualize the photographed image after image processing more reliably.

  In the above description, the user selects the type of image processing to be performed on the face image of the user (subject) obtained by trial shooting. Alternatively, the type of image processing to be performed on the user's face image may be determined according to the user's standing position when trial shooting is performed.

  That is, in step S76 of the flowchart of FIG. 17, the course selection control unit 213 causes the touch panel monitor 52 to display a trial shooting execution screen 430 as shown in FIG. 20A.

  Near the center of the trial shooting execution screen 430, face image display units 431-1 and 431-2 are provided in the same manner as in FIG. 19A. A message "Boy is here" is displayed at the top of the face image display unit 431-1, and a message "Girl is here" is displayed at the top of the face image display unit 431-2.

  In addition, at the top of the trial shooting execution screen 430, a message prompting the two users to pay attention to their standing positions in trial shooting, "Enter the frame and take a trial shot! Do not make a mistake, because the processing changes!", is displayed.

  That is, the male image processing is applied to the trial-shot face image displayed on the face image display unit 431-1, and the female image processing is applied to the trial-shot face image displayed on the face image display unit 431-2.

  In this case, steps S77 and S78 in the flowchart of FIG. 17 are not executed, and the process proceeds to step S79.

  In step S79, the course selection control unit 213 displays sample images to which the gender-specific image processing according to each user's standing position has been applied.

  That is, when trial shooting is performed at the predetermined timing in the state of the trial shooting execution screen 430 shown in FIG. 20A, the face image display unit 431-1 displays, as shown in FIG. 20B, a face image (subject A′) obtained by applying the male image processing to the face image of subject A in the trial-shot still image, and the face image display unit 431-2 displays a face image (subject B′) obtained by applying the female image processing to the face image of subject B in the trial-shot still image.

  In this way, image processing according to the gender corresponding to the standing position at the time of trial shooting is performed on the subjects in the captured image, so the user only has to decide on a standing position, without performing a troublesome selection operation, and the image processing intended by the user can still be performed. This improves the user's satisfaction with the shooting work in the photo sticker creation work.
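  One simple way the standing-position rule could be realized, offered here only as an assumption since the patent does not specify the method, is to threshold the detected face-box center against the middle of the frame:

```python
# Sketch: assign gender-specific processing by standing position. The face
# detected in the left half of the trial shot gets the male processing; the
# face in the right half gets the female processing.
def gender_for_position(face_box, frame_width):
    x, y, w, h = face_box
    center_x = x + w / 2
    return "male" if center_x < frame_width / 2 else "female"

print(gender_for_position((100, 80, 120, 140), 640))  # left  -> "male"
print(gender_for_position((380, 90, 120, 140), 640))  # right -> "female"
```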

  Next, a detailed flow of the friend course photographing process executed in step S24 of FIG. 12 will be described with reference to the flowcharts of FIGS. 21 and 22.

  Note that the processes in steps S111 to S113, S116 to S120, S122 to S130, and S132 to S135 in the flowcharts of FIGS. 21 and 22 are basically the same as the processes in steps S31 to S33, S36 to S40, S42 to S50, and S52 to S55 in the flowcharts of FIGS. 13 and 14, and their description is omitted.

  That is, in step S114, the course selection control unit 213 displays a selection screen (GUI) on the touch panel monitor 52 for selecting the type of image processing performed on the captured image.

  As a selection screen for selecting the type of image processing, for example, an eye size selection screen shown in FIG. 23 is displayed.

  As shown in FIG. 23, the eye size selection screen 501 is, except for some parts, provided with two main sets of components on the left and right. This basically allows at least two users of the same group (friend group) to select and input eye sizes at the same time. In the friend course shooting process, the shooting work is normally performed by at least two users, but it may also be performed by a single user; in that case, one set of the main components of the eye size selection screen 501 may be provided.

  A live view image display unit 511 is provided at the center of the eye size selection screen 501. The live view image display unit 511 displays the live view images of subject A and subject B from the camera 51. In the live view image display unit 511, the positions of the faces of subjects A and B are recognized, and frames 521A and 521B indicating the recognized face ranges (areas) of subjects A and B are superimposed on the live view image.

  On the eye size selection screen 501, an eye size selection button 512-1 is provided on the left side of the live view image display unit 511, and an eye size selection button 512-2 is provided on the right side.

  In the following, the eye size selection button 512-1 and the eye size selection button 512-2 are referred to as the eye size selection button 512 when there is no need to distinguish between them.

  The eye size selection button 512 includes a "large" button, a "medium" button, and a "small" button. When the user selects one of them, the type of image processing to be performed on the face area of the subject corresponding to that user is determined.

  In the upper right part of the eye size selection screen 501, a remaining time display unit 513 that displays the remaining time of the time limit allocated in advance to the shooting work is provided. Further, at the top of the eye size selection screen 501, a message "Select the size of each recognized eye!" prompting the users to select an eye size with the eye size selection button 512 is displayed.

  That is, in the friend course photographing process, as the image process performed on the photographed image, an image process (eye size changing process) for changing the eye size according to the selected eye size is performed.
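  As a rough sketch of such an eye size changing process, assuming OpenCV is available, the following code enlarges a detected eye region by a factor tied to the selected button and pastes it back centered on the original region; the scale factors for the "large" / "medium" / "small" buttons and the eye box are invented for illustration.

```python
# Sketch: enlarge an eye region by the factor implied by the selected button.
import cv2

EYE_SCALE = {"large": 1.25, "medium": 1.10, "small": 1.0}  # hypothetical

def resize_eye(image, eye_box, choice):
    x, y, w, h = eye_box
    s = EYE_SCALE[choice]
    nw, nh = int(w * s), int(h * s)
    enlarged = cv2.resize(image[y:y + h, x:x + w], (nw, nh))
    # paste back centered on the original eye region (clipped to the frame)
    x0 = max(0, x - (nw - w) // 2)
    y0 = max(0, y - (nh - h) // 2)
    image[y0:y0 + nh, x0:x0 + nw] = enlarged[:image.shape[0] - y0,
                                             :image.shape[1] - x0]
    return image

frame = cv2.imread("captured.png")                     # placeholder file
frame = resize_eye(frame, (300, 200, 60, 30), "large")  # hypothetical eye box
```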

  Returning to the flowchart of FIG. 21, in step S115, the course selection control unit 213 accepts each user's selection of image processing, that is, eye size, on the eye size selection screen 501 displayed in step S114.

  The sample image displayed in step S116 is displayed on a sample image display unit (not shown) on the eye size selection screen 501 of FIG. 23. If no sample image display unit is provided on the eye size selection screen 501, the process of step S116 is omitted.

  In step S121, the image processing control unit 217 performs the image processing (eye size changing process) selected in step S115 on each subject in the captured image obtained in step S120.

  Further, in step S131, the image processing control unit 217 performs the image processing selected in step S115 on each subject in the captured image obtained in step S130, as in step S121.

  In step S135, the generated composite images (captured images) are stored in the storage unit 102 as editing target images, together with shooting information indicating that the friend course was selected in the shooting process. When bonus shooting has been performed, the editing target images (captured images) selected by the user are stored in the storage unit 102 together with the shooting information indicating that the friend course was selected.

  In this way, when the friend course is selected, image processing that changes the size of the eyes is selected by the user as the type of image processing for the subjects (users) in the captured image, and the selected image processing is performed on the subjects in the captured image. Since the image processing intended by the user can thus be performed on the captured image with a simple operation, the user's satisfaction with the shooting work in the photo sticker creation work can be improved.

  In the description with reference to FIG. 23, it is assumed that there are two users (subjects), but in the friend course shooting process, there may be three or more users.

  FIG. 24 shows an example of an eye size selection screen 501 displayed on the touch panel monitor 52 when shooting is performed by three people in the friend course shooting process.

  The eye size selection screen 501 shown in FIG. 24A has, in addition to the configuration shown in FIG. 23, OK buttons 514-1 and 514-2 for confirming the eye size selections made with the eye size selection buttons 512-1 and 512-2, provided below those buttons, respectively. The live view image display unit 511 displays the live view images of subject A, subject B, and subject C from the camera 51. In the live view image display unit 511, the positions of the faces of subjects A, B, and C are recognized, and frames 521A, 521B, and 521C indicating the recognized face ranges (areas) of subjects A, B, and C are superimposed on the live view image.

  Hereinafter, the OK buttons 514-1 and 514-2 are referred to as OK buttons 514 when it is not necessary to distinguish them from each other.

  On the eye size selection screen 501 shown in FIG. 24A, subject A, subject B, and subject C are recognized as the first, second, and third subjects, respectively. The eye size selection button 512-1 and the OK button 514-1 correspond to the first subject, and the eye size selection button 512-2 and the OK button 514-2 correspond to the second subject. That is, the user corresponding to subject A can select an eye size with the eye size selection button 512-1 and confirm the selection with the OK button 514-1, and the user corresponding to subject B can select an eye size with the eye size selection button 512-2 and confirm the selection with the OK button 514-2. At this point, the user corresponding to subject C cannot yet select or confirm an eye size.

  Then, when the user corresponding to subject B selects the "medium" button of the eye size selection button 512-2 and confirms the selection with the OK button 514-2, the eye size selection screen 501 changes as shown in FIG. 24B. That is, once the eye size selection button 512-2 and the OK button 514-2 have been operated by the user corresponding to subject B, the second subject, and the eye size has been selected and confirmed, the eye size selection button 512-2 and the OK button 514-2 are reassigned to subject C, the third subject. The user corresponding to subject C can then select and confirm an eye size.

  Thus, even when there are three users (subjects), the eye size selection button 512 and OK button 514 that belonged to whichever of the first two subjects finished selecting and confirming earlier are reassigned to the third subject, so each of the three subjects can select and confirm an eye size. The same applies when there are four or more subjects. When the touch panel monitor 52 is sufficiently large, as many eye size selection buttons 512 and OK buttons 514 as there are users (subjects) may be displayed. In that case, the users can select and confirm their eye sizes all at once without waiting their turn, and the shooting work proceeds smoothly.
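  The turn-taking just described can be modeled as two selection slots served from a queue of recognized subjects. The following Python sketch is an assumption about one possible implementation, not the patent's method:

```python
# Sketch: two on-screen selection slots serve any number of subjects;
# whichever slot is confirmed first passes to the next waiting subject.
from collections import deque

def assign_slots(subjects, confirmations):
    """subjects: recognition order, e.g. ["A", "B", "C"].
    confirmations: slot indices (0 or 1) in the order users press OK.
    Returns {subject: slot used}."""
    waiting = deque(subjects)
    slots = [waiting.popleft(), waiting.popleft()]  # first two subjects
    assigned = {}
    for slot in confirmations:
        assigned[slots[slot]] = slot
        slots[slot] = waiting.popleft() if waiting else None
    return assigned

# Subject B (slot 1) confirms first, so slot 1 passes to subject C,
# matching the screen transition of FIG. 24.
print(assign_slots(["A", "B", "C"], [1, 0, 1]))
```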

  In this way, the photographing process is performed.

  According to the above processing, a shooting course is selected by the user, and the type of image processing corresponding to the selected shooting course is performed on the subject of the shot image. Thereby, since the image processing intended by the user can be performed on the captured image with a simple operation, the satisfaction of the user with respect to the imaging operation in the photo sticker creation operation can be improved.

  In the above, the sample image is displayed so that the user can visualize the image processing to be performed on the captured image. However, by applying the image processing to the captured image (live view image) in real time, the user may instead be allowed to see the processed photographed image in real time.

  In the above description, the image processing selected in the friend course shooting process is an eye size changing process that changes the size of the subjects' eyes, but it may instead be, for example, a skin color changing process that changes the subjects' skin color (degree of whiteness).

  Further, in the friend course shooting process, as in the couple course shooting process described with reference to the flowchart of FIG. 17, image processing such as the eye size changing process may be selected after trial shooting is performed.

  Furthermore, in the above description, the image processing performed on the photographed image is applied to the subject's face regardless of the shooting course, but it may instead be applied to the entire subject.

  Next, an example of a detailed flow of the editing process executed in step S5 of FIG. 11 will be described with reference to the flowchart of FIG. 25. Note that this editing process is actually performed, in the first editing space 33A or the second editing space 33B, by the user who has finished the shooting process of the photo sticker creation game corresponding to the flowchart of FIG. 12. The same processing is executed for either editing space, so in the following, for simplicity, the user is simply described as having moved to the editing space 33, without specifying whether the destination is the first editing space 33A or the second editing space 33B.

  When the editing process is started, in step S131, the editing preparation processing unit 232 acquires from the storage unit 102 the editing target images, that is, the captured images obtained by the imaging unit 112 and selected as graffiti editing targets. In step S132, the editing preparation processing unit 232 displays a predetermined GUI screen on the editing monitor 141 in the editing space 33, and has the user who has moved there select the content of the post-service processing. As described above, the post-service processing unit 205 provides services such as mini games and image transfer to the user waiting for printing to finish; the editing preparation processing unit 232 has the user choose at this point, from options prepared in advance, what to do then. When the user has selected the content of the post-service processing, the editing preparation processing unit 232 next displays a GUI image of an editing start button on the editing monitor 141. The user operates the touch pen 63 and presses the editing start button. When the editing start button is pressed, an editing screen, which is the GUI for editing work, is displayed on the editing monitor 141, and the user can start the graffiti editing work.

  In step S133, the editing preparation processing unit 232 determines, based on the information supplied from the tablet built-in monitor 62, whether or not the editing start button has been operated by the user, and waits until it is determined that the button has been operated. If it is determined that the editing start button has been operated, the editing preparation processing unit 232 advances the process to step S134, and the editing screen display control unit 234 causes the editing monitor 141 to display the editing screen, a GUI screen for graffiti editing input on the captured images. Details of the editing screen display process will be described later.

  In step S135, the input reception control unit 233 controls the tablet 142 and the touch pen 63 of the editing unit 113, and starts receiving graffiti editing input on the editing screen by the user.

  In step S136, the input reception control unit 233 determines, based on the input signal supplied from the tablet built-in monitor 62 (that is, the position information received via the touch pen 63 and the tablet 142), whether an instruction (GUI operation) has been input by the user. When the touch pen 63 is operated by the user and it is determined that position information has been input to the tablet 142 with respect to the editing screen displayed on the editing monitor 141, the input reception control unit 233 advances the process to step S137, and the graffiti editing processing unit 235 interprets the instruction content from the input coordinates and performs the corresponding processing.

  When the process corresponding to such an input is completed, the graffiti edit processing unit 235 advances the process to step S138. If it is determined in step S136 that an instruction has not been input, the graffiti editing processing unit 235 skips step S137 and proceeds to step S138.

  In step S138, the progress control unit 231 determines whether a predetermined time (time limit) has elapsed or the end button has been operated. When it is determined that the time limit has not been reached and the end button has not been operated, the progress control unit 231 returns the process to step S136 and repeats the subsequent processing. That is, the progress control unit 231 repeats the processing of steps S136 to S138, handling graffiti editing input, until the predetermined time elapses or the end button is operated. If it is determined in step S138 that the predetermined time has elapsed or the end button has been operated, the progress control unit 231 advances the process to step S139.
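  The loop of steps S136 to S138 is, in effect, a timed input-polling loop. A hypothetical Python sketch follows, in which poll_input(), apply_graffiti(), and end_pressed() stand in for the tablet and GUI handling described above.

```python
# Sketch of the editing input loop (steps S136-S138): accept graffiti input
# until the time limit elapses or the end button is pressed.
import time

def editing_loop(time_limit_sec, poll_input, apply_graffiti, end_pressed):
    deadline = time.monotonic() + time_limit_sec
    while time.monotonic() < deadline and not end_pressed():
        event = poll_input()        # S136: position info from the tablet
        if event is not None:
            apply_graffiti(event)   # S137: process the instruction
    # S139 onward: save the edited image and close the editing screen
```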

  In step S139, the progress control unit 231 saves the edited image (the graffiti editing result) as a print target image in order to create a print image. Note that "save" here simply means that the edited image, which is the editing result, is provided to the printing process; whether or not the data is actually saved in the RAM 107 or the storage unit 102 is arbitrary.

  In step S140, the progress control unit 231 ends the display of the editing screen and ends the acceptance of editing input. When the process of step S140 ends, the progress control unit 231 ends the editing process, the process returns to step S5 of FIG. 11, and the processes from step S6 onward are executed.

  Editing operations are performed by the editing process as described above.

  Next, the detailed flow of the edit screen display process executed in step S134 of FIG. 25 will be described with reference to the flowchart of FIG.

  In step S171, the edit screen display control unit 234 reads the shooting information from the storage unit 102, and determines whether or not a couple course is selected in the shooting process.

  Here, with reference to FIG. 27, details of the shooting information stored in the storage unit 102 will be described.

  As shown in FIG. 27, the shooting information 590 includes shooting course information 591 and subject information 592.

  The shooting course information 591 is information indicating whether a couple course or a friend course is selected in the selection of a shooting course in the shooting process.

  When the shooting course information 591 indicates that the couple course has been selected, the subject information 592 includes information indicating the gender selected for each of the two subjects A and B (for example, subject A is male and subject B is female) and information indicating their standing positions (such as the man on the left side and the woman on the right side). When the shooting course information 591 indicates that the friend course has been selected, the subject information 592 is, for example, empty information indicating nothing.

  That is, in step S171, the editing screen display control unit 234 determines whether the couple course has been selected based on the shooting course information 591 of the shooting information 590.
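  For illustration, the shooting information 590 described above could be modeled as follows; the Python types and field names are assumptions made for this sketch, not structures defined by the patent.

from dataclasses import dataclass
from typing import Optional

@dataclass
class SubjectInfo:                 # subject information 592 (per subject)
    gender: str                    # e.g. "male" / "female"
    standing_position: str         # e.g. "left" / "right"

@dataclass
class ShootingInfo:                # shooting information 590
    course: str                    # shooting course information 591: "couple" or "friend"
    subject_a: Optional[SubjectInfo] = None  # empty (None) for the friend course
    subject_b: Optional[SubjectInfo] = None

# Example: couple course with the man on the left and the woman on the right.
info = ShootingInfo(
    course="couple",
    subject_a=SubjectInfo(gender="male", standing_position="left"),
    subject_b=SubjectInfo(gender="female", standing_position="right"),
)
is_couple_course = info.course == "couple"  # the check made in step S171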

  If it is determined in step S171 that the couple course has been selected, the process proceeds to step S172, and the editing screen display control unit 234 causes the editing monitor 141 to display an editing screen, which is a GUI screen allowing the user to input graffiti edits to the captured image, configured according to the standing positions and genders indicated by the subject information 592 of the shooting information 590.

  FIG. 28 is a diagram illustrating a configuration example of an editing screen in a case where the subject information 592 includes information indicating a standing position in which a male is on the left side and a female is on the right side.

  As shown in FIG. 28, the editing screen 601 is, apart from some shared elements, divided into two main left and right configurations. This basically allows the two users of the same set to perform graffiti editing input at the same time. On the editing screen 601 of FIG. 28, the display is arranged, based on the standing positions indicated by the subject information 592, so that the man on the left side and the woman on the right side can each perform the editing work facing the editing monitor 141.

  In the upper center of the editing screen 601, GUI buttons and the like for the user to select an image or a function are provided. The editing target selection unit 611 displayed in the upper center is a GUI for selecting, from among the editing target images that the user can graffiti-edit on the editing screen 601, the edited image on which the user will perform graffiti.

  The editing target images are the captured images generated by the shooting work and selected by the user as targets of the editing work. The editing target selection unit 611 displays a list of thumbnail images of the editing target images as a GUI.

  When the user selects, with the touch pen 63, the thumbnail image of a desired editing target image from among the thumbnail images displayed in the editing target selection unit 611, the selected image is set as the edited image and is displayed enlarged on the editing screen 601.

  The edited image display unit 612-1 is a work area that displays the edited image on which the user located on the left side of the editing monitor 141 performs graffiti editing. For example, the image to be edited (selected image) selected by operating the touch pen 63-1 provided on the left side of the first editing interface 13A or the second editing interface 13B is displayed there as the edited image.

  Similarly, the edited image display unit 612-2 is a work area that displays the edited image on which the user located on the right side of the editing monitor 141 performs graffiti editing; for example, the image to be edited (selected image) selected by operating the touch pen 63-2 provided on the right side of the first editing interface 13A or the second editing interface 13B is displayed there as the edited image.

  In the following description, the edited image display unit 612-1 and the edited image display unit 612-2 are referred to as an edited image display unit 612 when it is not necessary to distinguish between them.

  The remaining time display unit 613 displayed in the upper center of the editing screen 601 displays the remaining time of the time limit allocated in advance to the editing work. This display allows the users to pace their graffiti editing work.

  The control instruction accepting unit 614-1 displayed on the left side of the editing target selection unit 611 is a GUI for accepting general control instructions related to graffiti editing of the edited image displayed on the edited image display unit 612-1. The control instruction accepting unit 614-1 displays, for example, a GUI for the eraser function that erases graffiti performed so far (an "Eraser" button), a GUI for instructing that the editing work be redone from scratch in units of graffiti processing (a "From the beginning" button), a GUI for instructing that the previous editing operation be undone (a "Back" button), and a GUI for instructing the end of the editing work (an "End" button).

  The control instruction accepting unit 614-2 displayed on the right side of the editing target selection unit 611 is a GUI for accepting general control instructions related to graffiti editing of the edited image displayed on the edited image display unit 612-2. The content of its GUI is the same as that of the control instruction accepting unit 614-1.

  Hereinafter, the control instruction accepting unit 614-1 and the control instruction accepting unit 614-2 are referred to simply as the control instruction accepting unit 614 when they need not be distinguished from each other.

  The editing tool selection unit 615 displayed at the lower center of the editing screen 601 is a GUI for selecting the editing tool used for editing the edited image displayed on the edited image display unit 612.

  The editing tool selection unit 615 is provided with: an easy button for selecting the easy tool, a basic editing tool that lets even a beginner perform graffiti editing easily; a special button for selecting the special tool, an editing tool with special graffiti editing functions; a pen button for selecting the graffiti pen tool, an editing tool for drawing freehand lines and the like on the edited image; a makeup button for selecting the makeup tool, an editing tool that applies makeup edits such as false eyelashes, colored contact lenses, and cheek color to the face portion of the subject in the edited image; a stamp button for selecting the stamp tool, an editing tool that composites a predetermined pattern or character at an arbitrary position in the edited image; a message button for selecting the message tool, an editing tool that composites a character string in a decorated special font at an arbitrary position in the edited image; and a roller button for selecting the roller tool, an editing tool that arranges a predetermined pattern or character in a line at arbitrary positions on the edited image.

  When the user operates the touch pen 63 and presses one of these buttons, a selection anchor indicating that the editing tool corresponding to that button is selected is displayed on the pressed button, on the side of the touch pen 63 that was used. Note that the display color of a button pressed with the touch pen 63 may be inverted. In addition, when the easy button is pressed in the editing tool selection unit 615, the buttons other than the easy button may become unpressable (may be put in an inactive state).

  In particular, as in the example of FIG. 28, when the user performing the editing work on the left side of the editing screen 601 (editing input to the edited image A) is male, the easy button is selected (pressed) by default, regardless of any selection by the user. When the user performing the editing work on the right side of the editing screen 601 (editing input to the edited image B) is female, all editing tools are selectable; in the example shown in FIG. 28, the pen button is selected.

  The editing tool selected in this way is displayed on an editing palette (content palette). In the example of FIG. 28, a GUI for operating the easy tool is displayed in the content palette 616-1 displayed on the left side of the editing tool selection unit 615. The easy tool is displayed on the "STEP 1" to "STEP 4" tabs of the content palette 616-1. With the easy tool displayed on these tabs, even a user unfamiliar with graffiti editing can easily perform the editing work simply by selecting the "STEP 1" to "STEP 4" tabs in order and using the editing tool displayed on each tab.

  A GUI for operating the pen tool is displayed in the content palette 616-2 displayed on the right side of the editing tool selection unit 615. In the example of FIG. 28, the pen tool is displayed with a "pen 1" tab, a "pen 2" tab, a "pen 3" tab, a "stamp 1" tab, a "stamp 2" tab, a "rolling roller" tab, and a "chibi roller" tab.

  As described above, a user who selects the couple course at the time of shooting can perform the editing work at the same standing position as at the time of shooting. Moreover, for the user who selected the "boy" button at the time of shooting, the easy tool, an editing tool that makes editing simple, is provided by default; there is no need to perform complicated selection work such as first selecting the pen button, then the "Pen 1" tab, and then a pen tool (pen icon) such as "Aurora", "Gradation neon", or "Neon". Even a male user who is unaccustomed to editing work can thus edit easily.
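  A minimal sketch of this left/right assignment, reusing the hypothetical ShootingInfo/SubjectInfo structures from the earlier sketch; the rule that a male user defaults to the easy tool follows the text above, while the function and value names are illustrative.

def default_tool_for(subject):
    # Per the text: a male user gets the easy tool preselected (pressed) by
    # default; for other users every tool stays selectable, and FIG. 28 shows
    # the pen button selected, so "pen" is used as the default here.
    return "easy" if subject.gender == "male" else "pen"

def layout_editing_screen(info):
    # Assign each user's work area so they edit at the same left/right
    # standing positions as at shooting time (couple course only).
    if info.course != "couple":
        return {"left": "any", "right": "any"}  # normal screen: no defaults
    return {s.standing_position: default_tool_for(s)
            for s in (info.subject_a, info.subject_b)}

# With the couple-course example above: {"left": "easy", "right": "pen"}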

  Hereinafter, the content palette 616-1 and the content palette 616-2 are referred to as a content palette 616 when it is not necessary to distinguish them from each other.

  Also, on the upper side of the editing tool selection unit 615, a convenient tool button 617 is provided for displaying, on the user's own content palette 616, the content (editing tool) displayed on the content palette 616 being used by the adjacent user.

  Further, a sample image display unit 618-1 and a sample image display unit 618-2 are provided on the left side of the edited image display unit 612-1 and on the right side of the edited image display unit 612-2, respectively. The sample image display unit 618-1 and the sample image display unit 618-2 each display samples (sample images) of edited images obtained with the "one-shot graffiti" editing function, which enables rich editing in a short time.

  Note that the GUI display examples described above are merely examples, and configurations other than those described above may of course be used. Some of the editing tools described above may be omitted, or editing tools other than those described above may be further provided. Furthermore, operation methods other than those described above may be applied to the editing screen 601.

  Returning to the flowchart of FIG. 26, if it is determined in step S171 that the couple course has not been selected, that is, if the shooting course information 591 of the shooting information 590 indicates that the friend course has been selected, the process proceeds to step S173, and the editing screen display control unit 234 causes the editing monitor 141 to display a normal editing screen that does not depend on the standing positions and genders of the subjects.

  In terms of the example of the editing screen 601 shown in FIG. 28, the normal editing screen is an editing screen in a state where all the editing tools can be selected by both the left and right users, that is, a state where the editing tool selection unit 615 does not have the easy button selected by default.

  When the process of step S172 or step S173 ends, the editing screen display process ends, the process returns to step S134 of FIG. 25, and the processes from step S135 onward are performed.

  According to the above processing, the display of the editing screen is controlled according to the shooting course selected at the time of shooting. In particular, when the couple course is selected, the display of the editing screen is controlled according to the genders selected by the users (subjects) at the time of shooting and their standing positions at the time of shooting. As a result, the male user is provided with an editing screen on which the editing work can be performed easily, and the users' satisfaction with the editing work in the photo sticker creation work can be improved.

  In this way, the editing process is performed. When the processing from step S134 onward in FIG. 25 has been executed and the editing process is completed, the process returns to step S5 of FIG. 11, and the printing process is performed in step S6.

  In this printing process, the edited image obtained by the editing may be printed on a sticker sheet 162 whose edge color or design (pattern) is changed according to the shooting course selected at the time of shooting or, when the couple course is selected, according to the genders.

  Further, the edited image may be printed on a sticker sheet 162 of a different size according to the shooting course selected at the time of shooting or, when the couple course is selected, according to the genders. For example, depending on the genders when the couple course is selected, the edited images may be printed with different size ratios on the sticker sheet 162 (for example, the image edited by the male occupying 1/3 of the whole and the image edited by the female occupying 2/3 of the whole).

  As described above, since a photo sticker on which the captured image subjected to the image processing intended by the user is printed can be created, it is possible to improve the user's satisfaction with the created photo sticker.

  Also as described above, the image processing intended by the user can be applied to the captured image with a simple operation, so it becomes possible to improve both the photo sticker creation work and the user's satisfaction with the created photo sticker.

  Furthermore, in the above, the user selects a shooting course in the shooting process, and when the couple course is selected, the user is further made to select a gender. Instead, the user may be made to enter a name at the time of shooting, with the gender selected on the basis of the entered name; or the user may be made to enter a name at the beginning of (before the start of) the editing process, with the gender selected on the basis of the entered name. When the user inputs a name at the start of the editing process, image processing according to gender and the like is not performed in the shooting process. In the editing process, the subject (face) in any one of the captured images not yet subjected to image processing is recognized, and the recognized subject is associated with the input name, whereby gender-based image processing based on the name can be performed on that subject in the other captured images as well.

  Further, the name input by the user as described above may be printed on the edge or margin of the sticker sheet 162 in the printing process.

  In the above, the shooting course and the gender, that is, the type of image processing applied to the captured image, are selected by the user before shooting; however, they may instead be determined after shooting. After the images to be edited are selected, the feature amounts of the face and eyes of the subject in each image to be edited may be matched against face and eye images prepared in advance, the face in the image to be edited with the highest degree of matching may be recognized, and the image processing to be performed on the captured image (image to be edited) may be selected according to the recognition result (that is, the gender). Of course, this processing may be performed not on the images to be edited but on the sample image obtained in the above-described shooting process or on a captured image obtained by trial shooting. Furthermore, the left/right arrangement of the editing screen may be controlled by detecting the standing position of each user based on the recognition result (gender) obtained by recognizing the face of the subject in a captured image (image to be edited) obtained by the shooting process and not subjected to image processing.

  By the way, in the shooting process described above, a real-time captured image is displayed (live view display) so that the users can check themselves. Details of the live view display in the shooting process will be described below.

  FIG. 29 is a functional block diagram illustrating a more detailed configuration example of the live view display control unit 215 of FIG.

  The live view display control unit 215 includes a compositing processing unit 731, a face recognition unit 732, a distance determination unit 733, a completed frame image acquisition unit 734, and a display control unit 735.

  The compositing processing unit 731 composites the shooting composition image selected by the user with the real-time captured image (moving image) displayed on the touch panel monitor 52 (live view display; this captured image is hereinafter also referred to as the live view image).

  The face recognition unit 732 recognizes the position (region) of the face of the user (subject) by a method such as detecting eyes in the live view image.

  The distance determination unit 733 determines, based on the positions of the subjects' faces recognized by the face recognition unit 732, whether or not the faces of the subjects in the live view image have a predetermined positional relationship; more specifically, it determines whether or not the positions of the subjects' faces in the live view image are within a predetermined distance range.

  The completed frame image acquisition unit 734 acquires a completed frame image, which is one of the shooting composition images, based on the determination result of the distance determination unit 733.

  The display control unit 735 controls the touch panel monitor 52 to display the display of the live view image in which the shooting composition image is combined.
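  As a rough illustration of how these units could cooperate on each frame, here is a minimal Python sketch; all objects and method names are hypothetical stand-ins for the compositing processing unit 731, face recognition unit 732, distance determination unit 733, completed frame image acquisition unit 734, and display control unit 735 described above, not an API defined by the patent.

def process_live_view_frame(camera, units, merging_frames):
    # units is assumed to bundle the five functional blocks of FIG. 29.
    frame = camera.capture()                        # one live view image (frame)
    faces = units.face_recognition.detect(frame)    # face regions of the subjects
    if units.distance_determination.within_range(faces):
        # Faces satisfy the positional relationship: use the completed frame image.
        overlay = units.completed_frame_acquisition.fetch(merging_frames, faces)
    else:
        # Otherwise keep one merging frame image per recognized face.
        overlay = [m.fit_to(f) for m, f in zip(merging_frames, faces)]
    composited = units.compositing.blend(frame, overlay)
    units.display_control.show(composited)          # update the touch panel monitor 52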

  Next, a detailed flow of the live view display process executed in step S39 of FIG. 13, for example, will be described with reference to the flowchart of FIG.

  In step S171, the camera 51 acquires a live view image (captured image), and in step S172, the face recognition unit 732 recognizes the position of the face of the user (subject) in the live view image acquired by the camera 51.

  In step S173, the compositing processing unit 731 determines, for example, whether or not the shooting composition image selected in step S38 of FIG. 13 is a merging frame image.

  A merging frame image is a frame image that is composited onto a part of the subject (specifically, the face of the subject) in the live view image, and is an image that forms a part of a predetermined figure. For example, a merging frame image is an image representing one of the two lines obtained by vertically dividing in two a line that draws a heart shape, a star shape, a circle, or a rectangle. The merging frame image is not limited to an image representing one of the lines obtained by dividing a predetermined figure vertically in two; it may be an image representing one of the lines obtained by dividing a predetermined figure horizontally in two, or an image representing a line that draws an independent predetermined figure (such as a heart shape or a star shape).

  If it is determined in step S173 that the shooting composition image is a merging frame image, the process proceeds to step S174, and the compositing processing unit 731 reads the corresponding merging frame image from the storage unit 102 and composites the merging frame image with the live view image.

  In step S175, the display control unit 735 causes the touch panel monitor 52 to display a live view image display screen showing the live view image with which the merging frame image has been composited by the compositing processing unit 731. On the live view image display screen, the live view image is displayed such that the composited merging frame image follows the motion of the subject in the live view image, which is a moving image.

  FIG. 31 shows an example of a live view image display screen displayed on the touch panel monitor 52.

  As shown in FIG. 31A, a live view image display unit 811 is provided in the center of the live view image display screen 801. In the live view image display unit 811, a live view image of the subject A (left side) and the subject B (right side) is displayed. In the live view image display unit 811, the positions of the faces of the subject A and the subject B are recognized, and rectangular frames 821A and 821B indicating the recognized face positions (regions) of the subject A and the subject B are superimposed on the live view image so as to follow the movement of the faces of the subject A and the subject B. Further, in the live view image display unit 811, the merging frame images 822A and 822B, which are images of a heart-shaped line divided vertically in two and sized to correspond to the frames 821A and 821B, are superimposed on the live view image so as to follow the movement of the faces of the subject A and the subject B. Note that an icon representing the attribute (gender) of each subject, set by the user's selection in advance, is displayed above each of the frames 821A and 821B. In the example of FIG. 31A, an icon representing a man is displayed above the frame 821A, and an icon representing a woman is displayed above the frame 821B. Furthermore, when the gender of the subject A is male, the color of the frame 821A and the merging frame image 822A is blue, and when the gender of the subject B is female, the color of the frame 821B and the merging frame image 822B is pink; in this way, the display colors of the frames 821A and 821B and the merging frame images 822A and 822B may be changed according to the attribute of the subject.

  A sample image display unit 812 is provided on the left side of the live view image display unit 811. The sample image display unit 812 displays a model sample image in which the completed frame image, obtained when the positions of the face of the subject A and the face of the subject B come within the predetermined distance range in the live view image displayed on the live view image display unit 811, is composited with a still image of subjects (models) prepared in advance. The sample image display unit 812 may also display, as model sample images, a plurality of still images of the models' faces representing, for example, the state in which the face positions are not within the predetermined distance range and the state in which they are within the predetermined distance range and the composited merging frame images have changed into the completed frame image, shown one after another like a flip book, or it may display the above states as a moving image. As a result, the users who are the subjects can easily confirm at a glance how the merging frame images in the live view image change into the completed frame image depending on how they move within the live view image.

  The completed frame image is a frame image prepared in correspondence with the merging frame images, and is an image of the figure formed by joining the merging frame images together. That is, in the example of FIG. 31, the completed frame image composited into the model sample image displayed on the sample image display unit 812 is a figure representing a heart-shaped line. The completed frame image is stored in the storage unit 102. When the merging frame images are images each representing a line that draws an independent predetermined figure such as a heart shape or a star shape, the completed frame image may be, for example, a star-shaped frame image prepared in correspondence with two star-shaped merging frame images, or a heart-shaped frame image prepared in correspondence with a circle and a star-shaped merging frame image.

  A captured image display unit 813 is provided at the bottom of the live view image display screen 801. Before shooting, the captured image display unit 813 displays a plurality of (six, in the example of FIG. 31) model sample images in which the selected shooting composition image (the completed frame image corresponding to the merging frame images) is composited with model images; after shooting, it displays the plurality of (six) captured images obtained by shooting (including captured images with which frame images such as the completed frame image have been composited).

  In the upper right part of the live view image display screen 801, a remaining time display unit 814 is provided for displaying the remaining time of the time limit allocated in advance to the shooting work. Further, at the top of the live view image display screen 801, a message prompting the users to bring their faces close together for the shot is displayed.

  Returning to the flowchart of FIG. 30, in step S176, the distance determination unit 733 determines whether the position of the face of the subject in the live view image is within a predetermined distance range.

  Specifically, as indicated by the double-headed arrow in FIG. 32A, it is determined whether the smallest of the distances between the vertices of the frame 821A representing the face region of the subject A and the vertices of the frame 821B representing the face region of the subject B is smaller than a predetermined value.
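  This vertex-to-vertex test can be written compactly. The following Python sketch assumes rectangles given as (left, top, right, bottom) tuples and an illustrative pixel threshold; neither is specified by the patent.

import math

def corners(rect):
    l, t, r, b = rect
    return [(l, t), (r, t), (l, b), (r, b)]

def faces_close(rect_a, rect_b, threshold=60.0):
    # Smallest distance between any vertex of frame 821A and any vertex of
    # frame 821B, compared with the predetermined value (threshold is assumed).
    d = min(math.dist(p, q) for p in corners(rect_a) for q in corners(rect_b))
    return d < threshold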

  If it is determined in step S176 that the positions of the subjects' faces in the live view image are not within the predetermined distance range, the process proceeds to step S177, and the distance determination unit 733 controls the speaker 124 (FIG. 8) provided in the photographing unit 112 to guide the users (subjects) by voice, such as "Get closer!", to bring their faces closer to each other. Here, the distance determination unit 733 may also control the display of the touch panel monitor 52 so as to display, on the live view image display unit 811 illustrated in FIG. 31A, arrows or the like prompting the subjects to bring their faces closer.

  For example, when the standing positions change relatively in the live view image displayed in the live view image display unit 811 of FIG. 31A, such as when the subject A and the subject B switch left and right, the merging frame images 822A and 822B displayed superimposed on the faces of the subject A and the subject B may be flipped horizontally about the vertical center line of each face region.

  In step S178, the distance determination unit 733 determines whether or not a predetermined time has elapsed since the live view display process was started.

  If it is determined in step S178 that the predetermined time has not elapsed, the process returns to step S176.

  If it is determined in step S176 that the positions of the subjects' faces in the live view image are within the predetermined distance range before the predetermined time has elapsed since the start of the live view display process, the process proceeds to step S179, and the completed frame image acquisition unit 734 acquires the completed frame image corresponding to the merging frame images from the storage unit 102.

  For example, when it is determined that the distance between the vertex of the frame 821A and the vertex of the frame 821B indicated by the double-headed arrow in FIG. 32A is smaller than the predetermined value, the completed frame image acquisition unit 734 reads from the storage unit 102 a heart-shaped completed frame image 831, as illustrated in FIG. 32B, corresponding to the merging frame images 822A and 822B. At this time, the completed frame image acquisition unit 734 sets a frame 821C that contains the frames 821A and 821B, and adjusts the size (shape) of the completed frame image 831 so that, for example, the frame 821C is inscribed in the completed frame image 831. The completed frame image 831 may instead simply be adjusted to a size close to that of the frame 821C, or to a size contained inside the frame 821C.

  For example, when the frames 821A and 821B are arranged vertically, unlike the example of FIG. 32B, the shape of the frame 821C is a vertically long rectangle; in this case, the completed frame image acquisition unit 734 adjusts the shape of the completed frame image 831 so that the vertically long rectangle is inscribed in a vertically elongated heart shape. When the frames 821A and 821B are arranged side by side, the shape of the frame 821C is a horizontally long rectangle; in this case, the completed frame image acquisition unit 734 adjusts the shape of the completed frame image 831 so that the horizontally long rectangle is inscribed in a horizontally elongated heart shape.

  Further, the sizes of the frames 821A and 821B vary depending on the standing positions, in the depth direction of the camera 51 in the shooting space, of the user corresponding to the subject A and the user corresponding to the subject B, so a difference also arises in the sizes of the merging frame images 822A and 822B composited with the live view image in step S174. Even in such a case, the completed frame image acquisition unit 734 can set a frame 821C containing the frames 821A and 821B and adjust the size (shape) of the completed frame image 831 so that the frame 821C is inscribed in the completed frame image 831.
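  A sketch of the sizing step just described: compute the enclosing frame 821C and scale the completed frame image 831 so that 821C is (approximately) inscribed in it. The margin factor is an assumption made for illustration; the patent only states that the enclosing frame is inscribed in, or close in size to, the completed frame image.

def enclosing_frame(rect_a, rect_b):
    # Frame 821C: the smallest rectangle containing frames 821A and 821B.
    la, ta, ra, ba = rect_a
    lb, tb, rb, bb = rect_b
    return (min(la, lb), min(ta, tb), max(ra, rb), max(ba, bb))

def size_completed_frame(rect_a, rect_b, margin=1.15):
    # Stretch the completed frame image 831 with the enclosing frame, so a
    # tall 821C yields a vertically elongated heart and a wide one a
    # horizontally elongated heart; margin is an illustrative assumption.
    l, t, r, b = enclosing_frame(rect_a, rect_b)
    w, h = (r - l) * margin, (b - t) * margin
    cx, cy = (l + r) / 2, (t + b) / 2
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)  # bounds of image 831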

  Note that if a difference arises in size between the merging frame images 822A and 822B depending on the users' standing positions in the depth direction of the camera 51, the distance determination unit 733 may, by processing similar to that of step S177, guide the users so as to align their standing positions in the depth direction of the camera 51.

  Returning to the flowchart of FIG. 30, in step S180, the compositing processing unit 731 composites the completed frame image acquired (read) by the completed frame image acquisition unit 734 with the live view image acquired by the camera 51.

  In step S181, the display control unit 735 displays the live view image obtained by the process of step S180 on the live view image display screen displayed on the touch panel monitor 52.

  In this way, as shown in FIG. 31B, a live view image is displayed in which the heart-shaped completed frame image 831 is composited (superimposed) around the whole of the faces of the subjects (the subject A and the subject B) displayed on the live view image display unit 811.

  In FIG. 31, an example of compositing a completed frame image when there are two users (subjects) has been described. With reference to FIG. 33, an example of compositing a completed frame image when there are three subjects will now be described.

  FIG. 33 shows an example of a live view image display screen displayed on the touch panel monitor 52. The configuration of the live view image display screen 801 shown in FIG. 33 is the same as that of the live view image display screen 801 described with reference to FIG.

  In the live view image display unit 811 of FIG. 33A, a live view image of the subject A, the subject B, and the subject C is displayed. In the live view image display unit 811 of FIG. 33A, the positions of the faces of the subjects A, B, and C are recognized, and rectangular frames 921A, 921B, and 921C indicating the recognized face positions (regions) of the subjects A, B, and C are superimposed on the live view image. Further, in the live view image display unit 811 of FIG. 33A, the merging frame images 922A, 922B, and 922C, which are images of circle-drawing lines sized to correspond to the frames 921A, 921B, and 921C, are superimposed on the live view image.

  From the state of FIG. 33A, when, for example, the distance between the midpoint of the line connecting the center points of the frames 921A and 921C and the center point of the frame 921B becomes smaller than a predetermined value, and the distance between the frames 921A and 921B and the distance between the frames 921B and 921C each become smaller than their respective predetermined values, a completed frame image 931 representing dumplings on a skewer, corresponding to the merging frame images 922A, 922B, and 922C as shown in FIG. 33B, is read out. In this way, a live view image is displayed in which the completed frame image 931, an image of skewered dumplings, is composited (superimposed) on the faces of the subjects (the subject A, the subject B, and the subject C) displayed in the live view image display unit 811.
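  The three-subject condition can be sketched in the same style; the thresholds are illustrative assumptions, and rectangles are again (left, top, right, bottom) tuples as in the earlier sketch.

import math

def center(rect):
    l, t, r, b = rect
    return ((l + r) / 2, (t + b) / 2)

def aligned_three(rect_a, rect_b, rect_c, mid_thresh=40.0, pair_thresh=120.0):
    # Midpoint of the segment joining the centers of frames 921A and 921C
    # must be near the center of frame 921B, and neighboring frames must
    # each be mutually close (threshold values are assumptions).
    ca, cb, cc = center(rect_a), center(rect_b), center(rect_c)
    mid_ac = ((ca[0] + cc[0]) / 2, (ca[1] + cc[1]) / 2)
    return (math.dist(mid_ac, cb) < mid_thresh
            and math.dist(ca, cb) < pair_thresh
            and math.dist(cb, cc) < pair_thresh)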

  Now, returning to the flowchart of FIG. 30, if it is determined in step S178 that the predetermined time has elapsed since the start of the live view display process, the process proceeds to step S182, and the display control unit 735 ends the display of the merging frame images composited with the live view image displayed on the live view image display unit 811. That is, when the positions of the subjects' faces do not come within the predetermined distance range within the predetermined time, only the live view image acquired by the camera 51 is displayed. In this case, neither the merging frame images nor the completed frame image is composited with the captured image.

  By the way, if it is determined in step S173 that the shooting composition image is not a merging frame image, that is, if the shooting composition image selected in step S38 of FIG. 13 is, for example, a frame image other than a merging frame image, a normal image, or an aura image, the process proceeds to step S183, and the compositing processing unit 731 reads the selected shooting composition image from the storage unit 102 and composites it with the live view image.

  In step S184, the display control unit 735 causes the touch panel monitor 52 to display a live view image display screen on which the live view image synthesized by the synthesis processing unit 731 is displayed.

  When the process of step S181, step S182, or step S184 is completed, the live view display process is terminated, the process returns to step S39 in FIG. 13, and the subsequent processes are executed.

  The completed frame image composited with the live view image in this way is reflected in the captured image (image to be edited) obtained as a result of the shooting process and in the edited image obtained by the editing process.

  According to the above processing, merging frame images each forming a part of a predetermined figure are selected as the shooting composition image, and when the faces of the subjects with which the selected merging frame images are composited in the live view image reach the predetermined positional relationship, a completed frame image forming the whole of the predetermined figure is composited around the faces of the subjects. Thus, by bringing their faces close together, the users can turn the frame images composited with each subject's face in the live view image into a single completed frame image, and can enjoy the shooting work while doing so. In addition, since the completed frame image is reflected in the captured image (image to be edited) and the edited image, it is naturally also reflected in the final photo sticker and in the image transmitted to a mobile terminal. Therefore, it is possible to improve the photo sticker creation work and the user's satisfaction with the created photo sticker.

  Note that the merging frame images that can be selected by the user are changed according to the selected shooting course. More specifically, when the couple course is selected as the shooting course, the user can, when selecting a shooting composition image, select merging frame images from which the heart-shaped completed frame image described with reference to FIG. 31 is obtained; when the friend course is selected as the shooting course, the user can select, for example, merging frame images from which the skewered-dumpling completed frame image described with reference to FIG. 33 is obtained, or merging frame images from which a star-shaped completed frame image is obtained.

  In the above description, in the process of step S176, the distance between the vertices of the rectangular frames representing the face regions of the subjects is used as the distance between the subjects' faces; however, the distance between the sides of the frames, the distance between a vertex and a side of the frames, or the distance between the center points of the frames may be used instead.

  Furthermore, although the completed frame image is acquired when the positions of the subjects' faces are within the predetermined distance range, the completed frame image may instead be acquired when the positions of the subjects' faces are within the predetermined distance range and the sizes of the face regions (for example, the areas of the rectangular frames representing the face regions) are larger than a predetermined size.

  Also, when the positions of the subjects' faces come within the predetermined distance range, the acquired completed frame image may be composited with the live view image and displayed while a predetermined aura image or frame image is displayed as an animation, and the corresponding aura image or frame image may further be reflected in the captured image (image to be edited) or the edited image.

  Furthermore, the background color of the live view image may be adjusted according to the positional relationship (distance) between the subjects' faces. Specifically, the background color of the live view image may be blue when the subjects' faces are far apart and pink when the subjects' faces are close together. The brightness and color of the lighting devices and LEDs in the photographing space 32 may also be adjusted according to the positional relationship between the subjects' faces.
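  As one way to realize the blue-to-pink adjustment just described, the background color could be interpolated from the face distance. This is a minimal sketch under assumed distance units and RGB colors; none of these values appear in the patent.

def background_color(face_distance, far=400.0):
    # t = 0 when the faces are far apart, t = 1 when they are close
    # (the normalization constant "far" is an assumption).
    t = max(0.0, min(1.0, 1.0 - face_distance / far))
    blue, pink = (80, 120, 255), (255, 150, 200)  # illustrative endpoint colors
    return tuple(round(b + (p - b) * t) for b, p in zip(blue, pink))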

  Further, the completed frame image obtained in the live view display process may be used as an editing composition image to be composited, by the user's editing input, with the captured image to be edited in the editing process. Also, during the editing process, the user may be allowed to change the color of the completed frame image composited with the captured image in the shooting process, to change the completed frame image into a related image, or to delete the composited completed frame image.

  In the above description, the merging frame images and the completed frame image are composited with the face portions of the subjects in the live view image, but they may instead be composited with the whole body of each subject or with a part of the body other than the face. That is, for example, a merging frame image may be composited for each subject, and when the positions of the subjects come within a predetermined distance range, the completed frame image may be composited over all the subjects.

  The completed frame image is stored in the storage unit 102 in correspondence with the merging frame images; however, the completed frame image acquisition unit 734 may instead generate the completed frame image based on the merging frame images.

  Further, in the above description, an example has been described in which the completed frame image is composited when the positions of the subjects, who are the users, are within the predetermined distance range in the live view image. However, when a single user performs the shooting work, a merging frame image may be composited, in the live view image, with each of the subject who is the user and a model image of a model, which is another subject registered in advance and displayed in the live view image, and the completed frame image may be composited when the positions of the user's subject and the model image come within a predetermined distance range.

  Incidentally, the above-described series of processes can be executed by hardware or by software. When the series of processes is executed by software, a program constituting the software is installed from a network or a recording medium.

  For example, as shown in FIG. 8, the recording medium is constituted not only by removable media 105 distributed separately from the apparatus main body to deliver the program to the administrator of the photo sticker creating apparatus 1, such as a magnetic disk (including a flexible disk), an optical disc (including a CD-ROM and a DVD), a magneto-optical disc (including an MD), or a semiconductor memory on which the program is recorded, but also by the ROM 106 in which the program is recorded and the hard disk included in the storage unit 102, which are delivered to the administrator in a state pre-installed in the apparatus main body.

  In the present specification, the steps describing the program recorded on the recording medium include not only processes performed in chronological order according to the described order, but also processes that are executed in parallel or individually without necessarily being processed in chronological order.

  In the above, the printing medium is not limited to sticker paper or photographic paper; for example, printing may be performed on paper or film of a predetermined size, poster paper, a card such as a telephone card, or cloth such as a T-shirt.

  Further, in this specification, a system represents the entire apparatus constituted by a plurality of devices. In the above, a configuration described as one device may be divided and configured as a plurality of devices. Conversely, configurations described above as a plurality of devices may be combined and configured as a single device. Of course, configurations other than those described above may be added to the configuration of each device. Furthermore, as long as the configuration and operation of the system as a whole are substantially the same, a part of the configuration of one device may be included in the configuration of another device.

DESCRIPTION OF SYMBOLS
1 Photo sticker creating apparatus
12 Shooting unit
13 Editing unit
101 Control unit
112 Shooting unit
113 Editing unit
202 Shooting processing unit
203 Editing processing unit
204 Print processing unit
205 Post-service processing unit
213 Course selection control unit
215 Live view display control unit
216 Captured image acquisition unit
217 Image processing control unit
234 Editing screen display control unit

Claims (11)

  1. An image editing apparatus that photographs a user as a subject and outputs the obtained image, comprising:
    course selection means for selecting a course that determines the type of image processing to be performed on the image of the user photographed as a subject;
    photographing means for photographing the user as a subject and obtaining a captured image;
    image processing control means for controlling so that image processing of a type corresponding to the course selected by the course selection means is performed on the captured image;
    editing means for editing the captured image subjected to the image processing, based on graffiti editing input by the user; and
    screen display control means for controlling display of an editing screen for the user to perform the graffiti editing input on the captured image edited by the editing means, wherein
    the course selection means selects, from among the courses, a first course for performing, on the captured image of the user as a subject, image processing according to the gender of the user,
    the image processing control means, when the first course is selected by the course selection means, controls so that image processing according to the gender of the user is performed on the captured image, and
    the screen display control means, when the first course is selected by the course selection means, controls the display of the editing screen according to the gender of the user.
  2. The image editing apparatus according to claim 1, wherein the course selection means selects, from among the courses, a second course, different from the first course, for performing image processing according to the user's instruction on the captured image, and
    the image processing control means, when the second course is selected by the course selection means, controls so that image processing according to the instruction of the user is performed on the captured image.
  3. The image editing apparatus according to claim 1, wherein, when the first course is selected, the course selection means further selects a gender for each user, and
    the image processing control means controls so that, for each user in the captured image, image processing according to the gender of that user selected by the course selection means is performed.
  4. The image editing apparatus according to claim 1, wherein, when the first course is selected by the course selection means, the screen display control means controls the display of the editing screen according to the relative positions of the users in the captured image and the genders of the users.
  5. The image editing apparatus according to any one of claims 1 to 3, further comprising composition image selection means for selecting a composition image to be composited with the captured image, in accordance with the selection of the course selection means.
  6. The image editing apparatus according to any one of claims 1 to 3, further comprising moving image display control means for controlling so that the captured image captured by the photographing means is displayed as a moving image on a display unit in real time, wherein
    the image processing control means controls so that image processing according to the selection of the course selection means is performed on the moving image, and
    the moving image display control means controls so that the moving image subjected, by the image processing control means, to the image processing according to the selection of the course selection means is displayed on the display unit.
  7. The image editing apparatus according to any one of claims 1 to 3, further comprising printing means for printing the edited result on a sticker sheet, wherein
    the printing means prints the edited result on a sticker sheet whose design or size corresponds to the selection of the course selection means.
  8. The image editing apparatus according to claim 3 , wherein the course selection unit selects a gender for each user based on a name input by the user.
  9. The image editing apparatus according to claim 1, wherein the course selection means selects the course according to a face image recognized based on the state of the user's face at the time of shooting.
  10. An image editing method of an image editing apparatus that photographs a user as a subject and outputs the obtained image, the method comprising:
    a course selection step of selecting a course that determines the type of image processing to be performed on the image of the user photographed as a subject;
    a photographing step of photographing the user as a subject and obtaining a captured image;
    an image processing control step of controlling so that image processing of a type corresponding to the course selected in the course selection step is performed on the captured image;
    an editing step of editing the captured image subjected to the image processing, based on graffiti editing input by the user; and
    a screen display control step of controlling display of an editing screen for the user to perform the graffiti editing input on the captured image edited by the processing of the editing step, wherein
    in the course selection step, a first course for performing, on the captured image of the user as a subject, image processing according to the gender of the user is selected,
    the processing of the image processing control step controls so that, when the first course is selected by the processing of the course selection step, image processing according to the gender of the user is performed on the captured image, and
    the processing of the screen display control step controls the display of the editing screen according to the gender of the user when the first course is selected.
  11. A program for causing a computer to execute processing of photographing a user as a subject and outputting the obtained image, the processing comprising:
    a course selection step of selecting a course that determines the type of image processing to be performed on the image of the user photographed as a subject;
    a shooting control step of controlling the photographing of the user as a subject and the obtaining of a captured image;
    an image processing control step of controlling so that image processing of a type corresponding to the course selected in the course selection step is performed on the captured image;
    an editing control step of controlling editing of the captured image subjected to the image processing, based on graffiti editing input by the user; and
    a screen display control step of controlling display of an editing screen for the user to perform the graffiti editing input on the captured image edited by the processing of the editing control step, wherein
    in the course selection step, a first course for performing, on the captured image of the user as a subject, image processing according to the gender of the user is selected,
    the processing of the image processing control step controls so that, when the first course is selected by the processing of the course selection step, image processing according to the gender of the user is performed on the captured image, and
    the processing of the screen display control step controls the display of the editing screen according to the gender of the user when the first course is selected.
JP2010178151A 2010-08-06 2010-08-06 Image editing apparatus and method, and program Active JP5447276B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010178151A JP5447276B2 (en) 2010-08-06 2010-08-06 Image editing apparatus and method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2010178151A JP5447276B2 (en) 2010-08-06 2010-08-06 Image editing apparatus and method, and program

Publications (2)

Publication Number Publication Date
JP2012037735A JP2012037735A (en) 2012-02-23
JP5447276B2 true JP5447276B2 (en) 2014-03-19

Family

ID=45849782

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010178151A Active JP5447276B2 (en) 2010-08-06 2010-08-06 Image editing apparatus and method, and program

Country Status (1)

Country Link
JP (1) JP5447276B2 (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6217013B2 (en) * 2012-05-15 2017-10-25 株式会社メイクソフトウェア Photography game machine, photography game method and control program
JP2013243466A (en) * 2012-05-18 2013-12-05 Tatsumi Denshi Kogyo Kk Imaging apparatus for game, imaging method for game, and program
JP6184077B2 (en) * 2012-10-30 2017-08-23 キヤノン株式会社 Imaging device and control method thereof.
JP5545507B2 (en) * 2012-11-07 2014-07-09 フリュー株式会社 Image processing apparatus, image processing method, and program
JP5630675B2 (en) * 2013-02-25 2014-11-26 フリュー株式会社 Photo sealing machine, photo sealing machine processing method and program
JP6191318B2 (en) * 2013-08-02 2017-09-06 辰巳電子工業株式会社 Game shooting device, image processing method, and computer program
JP6112356B2 (en) 2013-08-09 2017-04-12 フリュー株式会社 Image editing apparatus, image editing method, and program
JP6146268B2 (en) * 2013-11-15 2017-06-14 フリュー株式会社 Photo sticker creation apparatus, image processing method, and program
JP5916143B2 (en) * 2013-12-02 2016-05-11 株式会社メイクソフトウェア Photo game machine
JP5632104B2 (en) * 2014-02-19 2014-11-26 辰巳電子工業株式会社 Game shooting device, processing method and program for game shooting device
JP5870431B2 (en) * 2014-04-28 2016-03-01 株式会社メイクソフトウェア Photography game machine and its control program
JP5725245B2 (en) * 2014-09-01 2015-05-27 フリュー株式会社 Image printing apparatus and image printing method
JP2015213336A (en) * 2015-06-15 2015-11-26 辰巳電子工業株式会社 Imaging apparatus for game, imaging method for game, and program
JP6090369B2 (en) * 2015-06-15 2017-03-08 辰巳電子工業株式会社 Game shooting device, game shooting method and program
JP6355107B2 (en) * 2015-08-13 2018-07-11 株式会社メイクソフトウェア Photography game machine and its control program
JP5967283B2 (en) * 2015-10-28 2016-08-10 辰巳電子工業株式会社 Photo shooting game machine, its control method and control program
JP6057005B2 (en) * 2016-05-02 2017-01-11 フリュー株式会社 Photo sealing machine, photo sealing machine processing method and program
JP6156559B2 (en) * 2016-08-19 2017-07-05 辰巳電子工業株式会社 Game shooting device, game shooting method, and program
JP6241689B2 (en) * 2016-12-19 2017-12-06 フリュー株式会社 Image processing apparatus, image processing method, and program
JP2017173848A (en) * 2017-06-07 2017-09-28 辰巳電子工業株式会社 Imaging apparatus for game, imaging method for game, and program

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH1175152A (en) * 1997-08-27 1999-03-16 Daishin Home Kk Image processing unit
JP2004062839A (en) * 2002-06-03 2004-02-26 Omron Corp Photographic printer, printing paper unit, method, program for controlling photographic printer and computer readable recording medium in which the same is recorded
JP4247574B2 (en) * 2003-12-26 2009-04-02 フリュー株式会社 Photo print device, photo print device control method, photo print device control program, recording medium containing photo print device control program
JP2006072045A (en) * 2004-09-02 2006-03-16 Make Softwear:Kk Automatic photographing device
JP2007264858A (en) * 2006-03-28 2007-10-11 Fujitsu Ltd Personal name sexuality determination program, machine translation program, personal name sexuality determination device, machine translation device, personal name sexuality determination processing method and machine translation processing method
JP5309979B2 (en) * 2008-12-26 2013-10-09 フリュー株式会社 Photo sticker creating apparatus and method, and program

Also Published As

Publication number Publication date
JP2012037735A (en) 2012-02-23

Similar Documents

Publication Publication Date Title
JP2005277772A (en) Photographic seal vending machine, and method and program of creating photographic seal
JP5003852B1 (en) Image providing apparatus and method, and program
JP5637346B2 (en) Photo sticker creation apparatus, photo sticker creation method, and program
JP4770662B2 (en) Photo sticker creation apparatus and method, and program
JP2004112302A (en) Photographic-sticker vending machine and method therefor, photographic-sticker and photographic-sticker unit
JP4919031B2 (en) Photo sticker creation apparatus and method, and program
JP5304233B2 (en) Photo sticker creation apparatus, photo sticker creation method, and program
JP4523839B2 (en) Image input apparatus and method, and program
JP4919131B1 (en) Image providing apparatus and method, and program
JP5418708B2 (en) Photo sealing machine, photo sealing machine processing method and program
JP5447276B2 (en) Image editing apparatus and method, and program
JP5007774B1 (en) Photo seal machine, photo seal machine processing method, and program thereof
JP5196060B1 (en) Photo sticker creation apparatus, photo sticker creation method, and program
JP2010154453A (en) Photographic sticker creating device, photographic sticker creating method, and program
JP2012074778A (en) Photograph seal generation device, photograph seal generation method, and program
KR20040014918A (en) Photo sticker vending Machine and method, sticker sheet and sticker sheet unit
JP5240969B2 (en) Photo sticker creation apparatus and method, and program
JP4911077B2 (en) Photo sticker creation apparatus, photo sticker creation method, and program
JP4919133B1 (en) Photo sticker creation apparatus and method, and program
JP4730599B2 (en) Photo sticker creation apparatus and method, and program
JP4608016B1 (en) Photography game machine, photography game method and control program
JP2004357018A (en) Photograph vending machine and photograph providing method in photograph vending machine
JP5456984B2 (en) Photo sticker creation apparatus, photo sticker creation method, and program
JP2004260650A (en) Photographic print provision device and method, and print paper unit
JP5057151B2 (en) Photo sticker creating apparatus and method, and program

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20120405

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20130821

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130910

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20131111

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20131203

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20131216

R150 Certificate of patent or registration of utility model

Ref document number: 5447276

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250