JP2004046591A - Picture evaluation device - Google Patents

Picture evaluation device

Info

Publication number
JP2004046591A
JP2004046591A
Authority
JP
Japan
Prior art keywords
image
evaluation
photographed
scene
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP2002204099A
Other languages
Japanese (ja)
Inventor
Tomoaki Tamura
田村 知章
Original Assignee
Konica Minolta Holdings Inc
コニカミノルタホールディングス株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Holdings Inc, コニカミノルタホールディングス株式会社 filed Critical Konica Minolta Holdings Inc
Priority to JP2002204099A
Publication of JP2004046591A
Application status: Withdrawn

Abstract

PROBLEM TO BE SOLVED: To enable a user to easily enjoy printing a photographed image in which the subjects' expressions look good.

SOLUTION: In a digital camera 1, a CPU 11a extracts a face image of each subject from every frame of a captured moving image, performs smile evaluation and neat-face evaluation for each face image and, for each frame image, calculates an overall smile evaluation value and an overall neat-face evaluation value over the subjects in the image. The CPU 11a also analyzes one of the frame images to determine where the image was photographed. When it determines that a casual scene was photographed, the smile evaluation value is set as the priority evaluation value; when it determines that a formal scene was photographed, the neat-face evaluation value is set as the priority evaluation value. The frame images are then ranked in descending order of priority evaluation value and displayed on a monitor 23 according to that ranking.

COPYRIGHT: (C)2004,JPO

Description

[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to an image evaluation device that evaluates a captured image.
[0002]
[Prior art]
Heretofore, in order to obtain a satisfactory photograph of a subject, a plurality of shots have been taken and the one considered best among them selected. In a group photograph, however, while it is possible to select a photograph that satisfies several of the photographed persons, it is difficult to select one that satisfies all of them.
[0003]
As a solution to the above problem, Japanese Patent Application Laid-Open No. 2001-45355 discloses a photographing apparatus that continuously photographs a group and displays the shots on a touch panel; each subject switches among the photographed images on the touch panel, selects a desired image, and touches with a finger the part in which he or she appears, and the apparatus extracts the partial image at the touched location and combines the partial images chosen by each subject into one photographic image. For group photographs, the publication also discloses a technique in which each subject remote-controls the shooting timing and the resulting images are combined, and a technique in which images each subject takes with a digital camera are combined.
[0004]
[Problems to be solved by the invention]
The above-described technique, in which each subject selects his or her own favorite image, is effective for obtaining a photograph in which one's own expression is good. A good photograph, however, is one in which not only oneself but also everyone appearing together looks good, and with the disclosed technique, for all the photographed persons to obtain such an image, every one of them must select an image with a good expression of himself or herself; when there are many subjects, this takes time and is cumbersome. Moreover, the techniques of remote-controlling the shooting timing per subject and of combining images taken by each subject with a digital camera require a remote control, digital camera, or the like for every subject, and such apparatus is not suited to casual shooting.
[0005]
In recent years, digital cameras have become widespread alongside silver halide cameras. A digital camera need only record captured images in a memory and, unlike a silver halide camera, consumes no film, so if a large number of images are taken of one shooting scene, each photographed person has more candidates from which to obtain a print with a good expression. However, having every photographed person select, from all the captured images, the ones with good expressions to print is a cumbersome task for the user, and this constrains printing and enjoying the photographed images.
[0006]
An object of the present invention is to enable a user to easily print and enjoy a photographed image in which the expressions of the photographed persons are good.
[0007]
[Means for Solving the Problems]
In order to solve the above problems, the invention according to claim 1 is:
It is characterized by including evaluation means for evaluating and scoring the expression of a photographed person in an input photographed image.
[0008]
According to the first aspect of the present invention, the facial expression of the subject in the captured image can be evaluated and scored. Therefore, it is possible to evaluate the quality of the expression of the subject.
[0009]
The invention according to claim 2 is the invention according to claim 1,
The evaluation means evaluates and scores at least one of the degree of smile and the degree of neatness of the facial expression of the subject.
[0010]
According to the second aspect of the present invention, at least one of the degree of smile and the degree of neatness of the expression of the subject in the captured image can be evaluated and scored. Therefore, it is possible to evaluate whether or not the expression of the subject is an expression corresponding to the shooting scene.
[0011]
The invention according to claim 3 is the invention according to claim 1 or 2,
The evaluation means is characterized in that, when one photographed image includes a plurality of photographed persons, it averages the evaluations scored for the plurality of photographed persons and scores the result as an overall evaluation value of the one photographed image.
[0012]
According to the third aspect of the present invention, when a plurality of photographed persons are included in one photographed image, the evaluations scored for the plurality of photographed persons are averaged, and the result can be scored as an overall evaluation value of the photographed image. Therefore, the photographed image as a whole can be evaluated.
[0013]
The invention according to claim 4 is the invention according to claim 3,
The evaluation means is characterized in that, when scoring the overall evaluation value, it scores with the evaluation of a specific person weighted.
[0014]
According to the fourth aspect of the invention, when scoring the overall evaluation value of the photographed persons in one captured image, the scoring can be performed with the evaluation of a specific person weighted. Therefore, a captured image can be evaluated with emphasis on the evaluation of a person one wishes to emphasize.
[0015]
The invention according to claim 5 is the invention according to any one of claims 1 to 4,
Display means for displaying the captured image is provided, and
control means is provided which, when a plurality of photographed persons are included in one photographed image, causes the display means to display the evaluation scored by the evaluation means for each photographed person.
[0016]
According to the fifth aspect of the present invention, when a plurality of subjects are included in one photographed image, the graded evaluation can be displayed on the display means for each subject. Therefore, the user can know the evaluation of each subject.
[0017]
The invention according to claim 6 is the invention according to claim 5,
The control means causes the display means to display the overall evaluation value.
[0018]
According to the sixth aspect of the present invention, the overall evaluation value of a captured image can be displayed on the display means. Therefore, the user can know the overall evaluation value of each captured image.
[0019]
The invention according to claim 7 is the invention according to any one of claims 1 to 6,
Photographing scene determination means is provided that analyzes the photographed image to determine whether the scene of the photographed image is a casual scene or a formal scene.
[0020]
The invention according to claim 8 is the invention according to claim 7,
The photographing scene determination means is characterized in that it analyzes whether the photographed image was photographed outdoors or indoors, and thereby determines whether the scene of the photographed image is a casual scene or a formal scene.
[0021]
The invention according to claim 9 is the invention according to claim 7 or 8,
The photographing scene determining means is configured to determine whether a scene of the photographed image is a casual scene or a formal scene based on clothes of a photographed person in the photographed image.
[0022]
According to a tenth aspect of the present invention, in the invention according to any one of the seventh to ninth aspects, the photographic scene determination unit determines a scene of the photographic image based on a hairstyle of a subject in the photographic image. It is characterized by determining whether the scene is a casual scene or a formal scene.
[0023]
According to the invention described in claims 7 to 10, it is possible to analyze the captured image and determine whether the scene of the captured image is a casual scene or a formal scene.
[0024]
The invention according to claim 11 is the invention according to any one of claims 7 to 10,
When the scene of the photographed image is determined by the photographing scene determination means to be a casual scene, the control means is characterized in that it displays the evaluation of the degree of smile scored by the evaluation means on the photographed image in priority over the evaluation of the degree of neatness.
[0025]
According to the invention described in claim 11, when the scene of the photographed image is determined to be a casual scene, the scored evaluation of the smile can be displayed on the photographed image in priority over the evaluation of the neatness. Therefore, an evaluation suited to the scene of the captured image can be displayed in an easily viewable manner.
[0026]
The invention according to claim 12 is the invention according to claim 11,
When the scene of the photographed image is determined by the photographing scene determination means to be a formal scene, the control means is characterized in that it displays the evaluation of the degree of neatness scored by the evaluation means on the photographed image in priority over the evaluation of the degree of smile.
[0027]
According to the twelfth aspect of the present invention, when the scene of the photographed image is determined to be a formal scene, the scored evaluation of the neatness can be displayed on the photographed image in priority over the evaluation of the degree of smile. Therefore, an evaluation suited to the scene of the captured image can be displayed in an easily viewable manner.
[0028]
The invention according to claim 13 is the invention according to any one of claims 5 to 12,
The evaluation means evaluates the facial expression of a photographed person in each of a plurality of captured images, and rank setting means is provided which ranks the plurality of captured images in descending order of the evaluation by the evaluation means,
and the control means causes the plurality of captured images to be displayed on the display means based on the rank set by the rank setting means.
[0029]
According to the thirteenth aspect of the present invention, the expressions of the photographed persons in a plurality of photographed images are each evaluated, the plurality of photographed images are ranked in descending order of evaluation, and the plurality of photographed images can be displayed on the display means based on the ranking. Therefore, since the photographed images can be displayed in order of how good the subjects' expressions are, the user can easily select a photographed image to be printed.
[0030]
The invention according to claim 14 is the invention according to claim 13,
The plurality of captured images are captured images obtained by cutting out each frame in a moving image as a still image.
[0031]
According to the fourteenth aspect of the present invention, the still images obtained by cutting out each frame of a moving image can be ranked in descending order of evaluation, and the plurality of captured images can be displayed based on the ranking. Therefore, since the frame images of the moving image can be displayed in order of how good the subjects' expressions are, the user can easily select a captured image to be printed.
[0032]
The invention according to claim 15 is the invention according to claim 13 or 14,
Output means is provided that outputs the rank set by the rank setting means to an external device in association with each of the captured images.
[0033]
According to the invention of claim 15, the set rank of the photographed images can be output to an external device in association with each photographed image. Therefore, the images can be output to an external device and displayed or printed there in descending order of how good the subjects' expressions are.
[0034]
The invention according to claim 16 is the invention according to any one of claims 13 to 15,
Face image synthesizing means is provided which, when a plurality of photographed persons appear in each of the plurality of photographed images, extracts, from among the face images of each photographed person in the plurality of photographed images, the face image of each photographed person that is most highly evaluated by the evaluation means, and synthesizes the extracted face images into one still image.
[0035]
According to the sixteenth aspect of the present invention, when a plurality of photographed persons are included in each of the plurality of photographed images, the most highly evaluated face image of each photographed person among the plurality of photographed images can be extracted and combined into one still image. Therefore, a photographed image in which all the photographed persons have a good expression can be obtained.
[0036]
The invention according to claim 17 is the invention according to claim 16, wherein
The rank setting means sets the still image synthesized by the face image synthesizing means to the highest rank.
[0037]
According to the seventeenth aspect, the still image in which the most highly evaluated face image of each photographed person has been extracted from the plurality of captured images and combined can be set to the highest rank. Therefore, when selecting an image to be printed, the user can easily select the photographed image in which all the subjects have a good expression.
[0038]
The invention according to claim 18 is the invention according to any one of claims 14 to 17,
It is characterized by comprising moving image creation means for creating a moving image from the plurality of still images in the rank order set by the rank setting means.
[0039]
According to the eighteenth aspect, a moving image can be created from the plurality of still images in the set rank order. Therefore, even when still images are cut out from a moving image and evaluated, the invention can be implemented without increasing the file size beyond that of the original moving image, by creating and recording a moving image whose frames are rearranged in descending order of evaluation.
[0040]
The invention according to claim 19 is the invention according to any one of claims 1 to 18,
Storage means is further provided that stores the evaluation scored by the evaluation means in association with the photographed image.
[0041]
According to the nineteenth aspect, the scored evaluation can be stored in association with the captured image. Therefore, since the evaluations of the photographed images are stored, a plurality of photographed images can at any time easily be displayed arranged in descending order of expression quality, or output to an external device.
[0042]
BEST MODE FOR CARRYING OUT THE INVENTION
[First Embodiment]
Hereinafter, a first embodiment in which the present invention is applied to a digital camera will be described in detail with reference to the drawings.
FIG. 1 is a block diagram illustrating a functional configuration of a digital camera 1 according to the present embodiment. As shown in FIG. 1, the digital camera 1 includes a control unit 11 having a CPU 11a, a ROM 11b and a RAM 11c, an optical system control unit 12, an imaging optical system 13, an imaging device 14, an image signal generation circuit 15, a TG 16, a CCD driver 17, It comprises an image signal processing unit 18, an AE / AF processing circuit 19, a built-in memory 20, a recording medium 21, a monitor drive circuit 22, a monitor 23, a flash unit 24, an operation unit 25, an EEPROM 26, a battery 27 and the like.
[0043]
A CPU (Central Processing Unit) 11a reads a control program stored in advance in the ROM 11b, expands it in the RAM 11c, and controls the entire digital camera 1 according to the expanded program. The CPU 11a also executes various processes, including the photographed image evaluation process described later, according to the expanded program. The CPU 11a functions as the evaluation means, control means, photographing scene determination means, rank setting means, face image synthesizing means, and moving image creation means described in the claims of the present invention.
[0044]
The ROM (Read Only Memory) 11b is configured by a nonvolatile memory such as a semiconductor. The ROM 11b stores in advance various programs such as a control program and a processing program corresponding to the digital camera 1, data referred to by the various programs, and the like.
[0045]
A RAM (Random Access Memory) 11c temporarily stores a program and various data read from the ROM 11b in various processes executed by the CPU 11a.
[0046]
The optical system control unit 12 includes a zoom control mechanism, a focus control mechanism, a shutter control mechanism, an aperture control mechanism, and the like, and controls the imaging optical system 13 by control signals from the control unit 11 while receiving feedback such as a zoom position detection signal, a focus position detection signal, and the output signal of the AE/AF processing circuit 19.
[0047]
The imaging optical system 13 includes a focusing lens, a zoom lens, a shutter, an aperture, and the like, and forms an image of framing subject information in an appropriate exposure and focus state on the imaging element 14 under the control of the optical system control unit 12.
[0048]
The imaging device 14 is configured by a CCD (Charge Coupled Device) or the like, and photoelectrically converts the subject light imaged on it through the imaging optical system 13.
The image signal generation circuit 15 receives the signal photoelectrically converted by the imaging element 14, performs AGC, clamp processing, and the like, and outputs the signal to the image signal processing unit 18.
[0049]
A TG (Timing Generator) 16 generates a predetermined timing signal and outputs it to the image signal generation circuit 15 and the CCD driver 17 to control their drive timing. The CCD driver 17 receives the timing signal and drives the image sensor 14 in synchronization with it.
[0050]
The image signal processing unit 18 performs A/D conversion on the analog image signal input from the image signal generation circuit 15 and outputs the result to the AE/AF processing circuit 19. It also performs processing such as white balance and γ correction on the A/D-converted digital image data and temporarily stores the processed image data in the built-in memory 20. It further reads out image data temporarily stored in the built-in memory 20 and compresses it into a data format suitable for recording on the recording medium 21. In addition, it decompresses image data temporarily stored in the built-in memory 20 or recorded on the recording medium 21 for reproduction and display on the monitor 23, and converts it into an image signal suitable for reproduction output.
[0051]
The AE/AF processing circuit 19 executes, based on the digital image data output from the image signal processing unit 18, an automatic focus adjustment (AF: Auto Focus) process for automatically focusing the lens of the imaging optical system 13 on the photographing target, and an automatic exposure adjustment (AE: Auto Exposure) process for automatically adjusting the exposure of the image sensor 14 in accordance with the brightness of the target.
[0052]
In the AF processing, a high-frequency component of the image data for one screen, or for a predetermined portion of the screen, is extracted by a high-pass filter, and an AF evaluation value is calculated by arithmetic processing such as cumulative addition and output to the CPU 11a.
In the AE processing, arithmetic processing such as cumulative addition is performed on the photometric luminance values of the image data for one screen, AE conditions such as the exposure amount (exposure value) required for photographing are calculated, and the result is output to the CPU 11a.
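The AF evaluation value described above can be sketched in a few lines. A minimal sketch, assuming a simple neighbour-difference filter as the high-pass stage and toy pixel values; the patent does not specify the actual filter:

```python
def af_evaluation_value(pixels):
    # Cumulative addition of absolute horizontal neighbour differences:
    # a crude high-pass measure that peaks when edges are sharpest,
    # i.e. when the lens is in focus.
    total = 0
    for row in pixels:
        for left, right in zip(row, row[1:]):
            total += abs(right - left)
    return total

# A sharply focused pattern yields a larger value than a blurred one
# (pixel values are illustrative, not from the patent).
sharp = [[0, 255, 0, 255], [255, 0, 255, 0]]
blurred = [[120, 130, 125, 130], [128, 126, 130, 127]]
assert af_evaluation_value(sharp) > af_evaluation_value(blurred)
```

The CPU would then move the focus lens to the position where this value is maximized, which is the usual contrast-detection AF design choice.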
[0053]
The built-in memory 20 includes a buffer memory for temporarily storing the digital image data output from the image signal processing unit 18, and a nonvolatile semiconductor memory for recording the image data compressed by the image signal processing unit 18.
[0054]
The recording medium 21 is detachably attached to a mounting unit (not shown), is composed of a nonvolatile semiconductor memory such as a flash memory, and records the image data compressed by the image signal processing unit 18. The recording medium 21 functions as the storage means described in the claims of the present invention, and functions as the output means when attached to an external device.
[0055]
The monitor drive circuit 22 drives the monitor 23 to display the digital image data output by the image signal processing unit 18. The monitor 23 is configured by an LCD (Liquid Crystal Display) or the like, and functions as the display means described in the claims of the present invention.
[0056]
The flash unit 24 includes a strobe circuit, a light emitting unit, and the like, and irradiates the subject with auxiliary light when the brightness of the subject is low.
[0057]
The operation unit 25 includes a power switch, a release switch, a reproduction switch, various mode conversion switches, a zoom lever, a left and right arrow key, a decision key, and the like, and outputs a user operation signal to the CPU 11a.
[0058]
An EEPROM (Electrically Erasable and Programmable ROM) 26 is constituted by a nonvolatile memory, and stores adjustment data and the like used for various operations in advance.
[0059]
In the present embodiment, the EEPROM 26 stores a smile evaluation criterion table 261 and a neat evaluation criterion table 262.
FIG. 2A is a diagram illustrating an example of the data stored in the smile evaluation criterion table 261. As shown in FIG. 2A, the smile evaluation criterion table 261 includes an element area 261a, a coefficient area 261b, and an evaluation point area 261c. The element area 261a stores data representing the constituent elements of the face (for example, "eyebrows", "eyes", and "lips") as "elements". The coefficient area 261b stores data representing the coefficient of the corresponding face element (for example, "2", "5", "4") as a "coefficient". The evaluation point area 261c stores reference images of the corresponding face element together with their evaluation points. For the eyebrows, for example, raised reference images are associated with lower evaluation points according to their angle, and reference images are stored in association with evaluation points such that the smaller the angle, the higher the evaluation point. For the eyes, reference images with the pupil narrowed are associated with lower evaluation points according to the degree to which the pupil is open, and reference images are stored in association with evaluation points such that the wider the pupil is open, the higher the evaluation point; a reference image with the eyes closed is associated with a minus point. For the lips, reference images with the corners lowered are associated with low points, and reference images are stored in association with evaluation points such that the more the corners are raised, the higher the evaluation point.
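The table structure above lends itself to a simple encoding. A hypothetical sketch, in which the element names and coefficients follow the example in FIG. 2A but the evaluation-point rules (functions of an assumed angle or openness measure, in arbitrary units) stand in for the reference-image lookup:

```python
# Hypothetical encoding of the smile evaluation criterion table 261:
# each element maps to (coefficient, evaluation-point rule).
SMILE_TABLE = {
    "eyebrows": (2, lambda angle: max(0, 10 - angle)),  # smaller angle, higher point
    "eyes":     (5, lambda openness: openness if openness > 0 else -5),  # closed eyes: minus point
    "lips":     (4, lambda corner_lift: max(0, corner_lift)),  # raised corners, higher point
}

def smile_value(features):
    # Weighted sum of per-element evaluation points, as in step S10.
    return sum(coeff * rule(features[name])
               for name, (coeff, rule) in SMILE_TABLE.items())

# e.g. gently angled brows, wide-open eyes, lifted lip corners:
# 2*8 + 5*8 + 4*6 = 80
assert smile_value({"eyebrows": 2, "eyes": 8, "lips": 6}) == 80
```

The neat evaluation criterion table 262 would be encoded the same way, with the rules inverted where appropriate (flat brows and level lip corners scoring highest).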
[0060]
FIG. 2B is a diagram illustrating an example of the data stored in the neat evaluation criterion table 262. As shown in FIG. 2B, the neat evaluation criterion table 262 includes an element area 262a, a coefficient area 262b, and an evaluation point area 262c. The element area 262a stores data representing the constituent elements of the face (for example, "face 1", "face 2", "eyebrows", ...) as "elements". The coefficient area 262b stores data representing the coefficients of the corresponding face elements (for example, "1", "2", "2", ...) as "coefficients". The evaluation point area 262c stores reference images of the corresponding face element together with their evaluation points. For the orientation of the face, for example, a face looking straight ahead is given a high evaluation point, and reference images in which the face is turned to the left or right, or the head is tilted, are stored in association with lower evaluation points the larger the degree of turn or tilt. For the eyebrows, reference images are stored in association with evaluation points such that a flat reference image receives a high evaluation point and reference images with a raised or lowered angle receive lower evaluation points the larger the angle. For the eyes, reference images are stored in association with evaluation points such that a reference image with the pupil open receives a high evaluation point, while reference images with the pupil narrowed or the eyes closed receive low evaluation points. For the lips, reference images are stored in association with evaluation points such that images in which the corners are raised or lowered receive lower evaluation points.
[0061]
The battery 27 in FIG. 1 is configured by a lithium ion secondary battery, an alkaline manganese battery, or the like, and supplies power to the digital camera 1.
[0062]
Next, the operation will be described.
First, the photographed image evaluation processing executed by the CPU 11a will be described with reference to the flowchart in FIG.
[0063]
The CPU 11a waits for the release switch to be pressed (step S1). When the release switch is pressed (step S1; YES), the CPU 11a determines whether or not the mode is the evaluation mode (step S2). If it determines that the mode is not the evaluation mode (step S2; NO), it executes a normal photographing process (step S3). That is, the CPU 11a performs photometry and distance measurement, has the AE/AF processing circuit 19 calculate an exposure value and an AF evaluation value from the measured values, photographs the subject based on these, performs image processing on the acquired image, and displays it on the monitor 23.
[0064]
On the other hand, if the CPU 11a determines in step S2 that the mode is the evaluation mode (step S2; YES), the CPU 11a captures a moving image and records it in the built-in memory 20 (step S4); after a predetermined time elapses (step S5; YES), moving image shooting ends and the routine proceeds to step S6. In moving image shooting, the CPU 11a records, for example, at 15 frames/second for a predetermined time, for example 3 seconds, and cuts out the frame images 101 to 145 of the recorded moving image as still images.
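The frame numbering works out as follows: recording at 15 frames/second for 3 seconds yields 45 still images, which the description labels with the reference numerals 101 through 145:

```python
FPS = 15          # frames recorded per second (example from the text)
DURATION_S = 3    # seconds of moving image (example from the text)

# 45 cut-out frame images, given reference numerals 101..145.
frame_ids = [101 + i for i in range(FPS * DURATION_S)]
assert len(frame_ids) == 45 and frame_ids[-1] == 145
```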
[0065]
Note that a plurality of still images may be obtained by performing continuous shooting instead of shooting with a moving image.
[0066]
When the moving image shooting ends, the CPU 11a displays the shot moving image on the monitor 23 (step S6). After the moving image is displayed, if an instruction to perform weighted evaluation of a person (that is, when calculating the smile evaluation values and neat evaluation values described later, to weight that person relative to the others in calculating the overall evaluation value of each frame image) is input by an operation on the operation unit 25 (step S7; YES), the CPU 11a displays a weight setting screen on the monitor 23, acquires the designation order and position information of the persons designated via the operation unit 25, and sets the weighted evaluation according to the designation order (step S8).
[0067]
Next, the CPU 11a waits for the start of evaluation to be instructed via the operation unit 25. When the start of evaluation is instructed (step S9; YES), the CPU 11a extracts, for each of the frame images 101 to 145 captured in step S4, the face image of each subject (person) together with its position information, and performs a smile evaluation (evaluation of the degree to which the expression is a smile) and a neat evaluation (evaluation of the degree to which the expression is neat) for each face image (step S10). For the smile evaluation, the CPU 11a refers to the smile evaluation criterion table 261 stored in the EEPROM 26, calculates evaluation points for the shapes of the eyebrows, pupils, and lips of the face image, weights each evaluation point by its coefficient, and sums them to obtain the smile evaluation value of that face image. For the eyebrows and eyes, the evaluation points may be calculated for both the left and right sides and halved, or only one side may be evaluated. For the neat evaluation, the CPU 11a refers to the neat evaluation criterion table 262 stored in the EEPROM 26, calculates evaluation points for the face orientation and for the shapes of the eyebrows, pupils, and lips, weights each evaluation point by its coefficient, and sums them to obtain the neat evaluation value of that face image. Here too, for the eyebrows and eyes, the evaluation points may be calculated for both the left and right sides and halved, or only one side may be evaluated.
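The option of scoring both sides of a paired element and halving the sum amounts to taking the mean of the two sides. A one-line sketch (the point values are illustrative, not from the patent):

```python
def paired_element_point(left_point, right_point):
    # Evaluate both the left and the right side of a paired element
    # (eyebrows, eyes) and halve the sum, i.e. take their mean.
    return (left_point + right_point) / 2

# A slightly angled left brow averaged with a flat right brow.
assert paired_element_point(6, 10) == 8.0
```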
[0068]
Next, for each of the frame images 101 to 145, the CPU 11a calculates the average of the smile evaluation values and the average of the neat evaluation values calculated for the face images in step S10, and sets them as the overall smile evaluation value and the overall neat evaluation value of that frame image (step S11). Here, when the weighting evaluation of persons has been set in step S8, the CPU 11a calculates the overall smile evaluation value and the overall neat evaluation value of each of the frame images 101 to 145 by a predetermined algorithm: the smile evaluation value and the neat evaluation value of the face image located at or near the position indicated by the position information acquired for each designated person are weighted according to the designated order.
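The averaging in step S11 can be sketched as follows. The specification says only that a "predetermined algorithm" is used when weighting is set; the scheme below (a per-face multiplier renormalized by the weight total) is an assumption chosen for illustration.

```python
def overall_value(face_values, weights=None):
    """Average the per-face evaluation values of one frame image.

    face_values: evaluation values of each face in the frame.
    weights: optional per-face multipliers set in step S8; when omitted,
             every face counts equally (a plain average).
    """
    if weights is None:
        weights = [1.0] * len(face_values)
    total = sum(w * v for w, v in zip(weights, face_values))
    return total / sum(weights)
```

With no weights, `overall_value([40, 60])` is a plain average; with weights, designated faces pull the overall value toward their own scores.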
[0069]
Next, the CPU 11a analyzes one of the frame images, for example, the frame image 101, and determines whether the captured image is of a casual scene or a formal scene (step S12). Specifically, the CPU 11a determines whether the scene is formal or casual based on, for example, whether the shooting took place indoors or outdoors, whether the clothes are formal, and whether the hairstyles are casual. When determining that a casual scene has been shot (step S12; 1), the CPU 11a sets the smile evaluation value as the priority evaluation value (step S13). On the other hand, when determining in step S12 that a formal scene has been shot (step S12; 2), the CPU 11a sets the neat evaluation value as the priority evaluation value (step S14).
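Steps S12 to S14 amount to a coarse scene classifier followed by a choice of which score ranks the frames. The sketch below assumes the cues named above (indoor/outdoor, formal clothes, hairstyle) are already available as booleans and combines them by a simple majority vote; the actual decision logic is not specified.

```python
def classify_scene(indoor, formal_clothes, tidy_hair):
    """Majority vote over three assumed boolean cues (step S12)."""
    formal_votes = sum([indoor, formal_clothes, tidy_hair])
    return "formal" if formal_votes >= 2 else "casual"

def priority_value(frame, scene):
    """Pick the priority evaluation value for a frame (steps S13/S14):
    the smile value for casual scenes, the neat value for formal ones."""
    return frame["neat"] if scene == "formal" else frame["smile"]
```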
[0070]
Next, for each person, the CPU 11a extracts the face image from the frame image in which that face has the highest priority evaluation value among the frame images 101 to 145 (step S15), and combines the extracted face images into a one-frame composite image 146 (step S16).
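The per-person selection in step S15 can be sketched as a best-frame lookup. The data layout (one dict of person code to face score per frame) is an illustrative assumption; the actual compositing of the selected faces into image 146 is omitted.

```python
def best_faces(frames):
    """For each person, find the frame whose face image scored highest.

    frames: list where frames[i] maps PERSON_NO -> that person's priority
            evaluation value in frame i (an assumed representation).
    Returns: PERSON_NO -> (frame_index, score) of that person's best frame.
    """
    best = {}
    for i, faces in enumerate(frames):
        for person_no, score in faces.items():
            if person_no not in best or score > best[person_no][1]:
                best[person_no] = (i, score)
    return best
```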
[0071]
Next, the CPU 11a ranks the composite image 146 created in step S16 highest, and ranks the frame images 101 to 146 in descending order of priority evaluation value (step S17). Then, the CPU 11a creates an evaluation value file 211, a text file in which evaluation value information such as the overall smile evaluation value and the overall neatness evaluation value corresponding to each of the frame images 101 to 146 is arranged according to the ranking (step S18). In addition, the CPU 11a creates one moving image file by arranging the frame images 101 to 146 according to the ranking, and stores it in the recording medium 21 in association with the evaluation value file 211 (step S19). Then, the CPU 11a displays the top-ranked frame image on the monitor 23 (step S20).
[0072]
FIG. 4 is a diagram showing a data storage example of the evaluation value file 211 created in step S18 of FIG. 3. As shown in FIG. 4, the evaluation value file 211 includes a NO area 211a, a NAME area 211b, a SMILE area 211c, a FORMAL area 211d, a PERSON area 211e, a PERSON_NO area 211f, an AREA area 211g, a SMILEP area 211h, and a FORMALP area 211i.
[0073]
The NO area 211a stores data indicating the rank by the priority evaluation value set in step S13 or S14 (for example, "1", "2", ...) as "NO". The NAME area 211b stores data indicating the name of the frame image (for example, "DSC00001.xxx", "DSC00002.xxx", ...) as "NAME". The SMILE area 211c stores data representing the overall smile evaluation value of the frame image (for example, "35", "25", ...) as "SMILE". The FORMAL area 211d stores data representing the overall neatness evaluation value of the frame image (for example, "32", "38", ...) as "FORMAL". The PERSON area 211e stores data indicating the number of persons photographed in the frame image (for example, "3", "3", ...) as "PERSON".
[0074]
The PERSON_NO area 211f stores codes uniquely assigned to identify the persons photographed in the frame image (for example, "001", "002", "003", ...) as "PERSON_NO". The AREA area 211g stores data indicating the position information, within the frame image, of the face image of the subject to which the corresponding code is assigned (for example, "(50,150)-(200,300)", "(100,200)-(600,700)", "(256,389)-(769,864)", ...) as "AREA". The SMILEP area 211h stores data indicating the smile evaluation value of the subject to which the corresponding PERSON_NO is assigned (for example, "40", "30", "35", ...) as "SMILEP". The FORMALP area 211i stores data indicating the neat evaluation value of the subject to which the corresponding PERSON_NO is assigned (for example, "32", "29", "35", ...) as "FORMALP".
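One record of the evaluation value file 211 can be sketched as plain text. The field names (NO, NAME, SMILE, FORMAL, PERSON, PERSON_NO, AREA, SMILEP, FORMALP) come from the description above, but the key=value, line-per-field layout is an assumption; the specification says only that the file is stored as a text file.

```python
def format_record(no, name, smile, formal, persons):
    """Serialize one frame image's entry of the evaluation value file 211.

    persons: list of dicts with keys "no" (PERSON_NO), "area" (x1, y1, x2, y2),
             "smilep", and "formalp"; the dict layout is illustrative.
    """
    lines = [f"NO={no}", f"NAME={name}", f"SMILE={smile}",
             f"FORMAL={formal}", f"PERSON={len(persons)}"]
    for p in persons:
        x1, y1, x2, y2 = p["area"]
        lines += [f"PERSON_NO={p['no']}",
                  f"AREA=({x1},{y1})-({x2},{y2})",
                  f"SMILEP={p['smilep']}", f"FORMALP={p['formalp']}"]
    return "\n".join(lines)
```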
[0075]
FIG. 5 is a diagram showing an example of the data structure of the moving image file stored in the recording medium 21 in step S19 of FIG. 3. As shown in FIG. 5, the moving image file consists of a comment area storing a comment on the moving image file, a frame image data area storing each piece of frame image data, and an evaluation value file area storing the evaluation value file corresponding to the moving image. Each piece of frame image data has a header area and an image information area, and the header area stores accessory information related to that frame image data. In a moving image format such as MPEG, in order to compress the amount of information, differences between still images are extracted, encoded, and recorded, and at the time of reproduction these codes are used to interpolate between still images. It is therefore desirable that the evaluation of each frame image in the above-described captured image evaluation process be performed on "I-pictures (Intra-coded Pictures)", which carry the largest amount of information as still images. The data structure of the moving image file is not limited to this; for example, the information in the evaluation value file may be stored in the comment area.
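The I-picture restriction above can be sketched as a simple filter over a decoded stream. The representation of the stream as a list of (picture_type, data) pairs is an illustrative assumption; a real MPEG demuxer would expose the picture type differently.

```python
def frames_to_evaluate(stream):
    """Return indices of frames that are complete still images (I-pictures),
    the only frames worth evaluating per the note above.

    stream: assumed list of (picture_type, frame_data) pairs, where
            picture_type is "I", "P", or "B".
    """
    return [i for i, (ptype, _data) in enumerate(stream) if ptype == "I"]
```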
[0076]
As described above, the evaluation value file 211 stores, in association with each frame image obtained by shooting a moving image, the rank by priority evaluation value, the overall smile evaluation value, the overall neatness evaluation value, and the number of photographed persons, together with the face position information, smile evaluation value, and neat evaluation value of each person. Then, as shown in FIG. 5, the evaluation value file 211 is stored in the recording medium 21 in association with the frame image data. Therefore, for example, by attaching the recording medium 21 to an external output device, printing can be performed on the external output device in order of priority evaluation value.
[0077]
Note that one or more of the still images cut out from the moving image with high evaluations may be newly recorded on the recording medium 21 as independent still images, although doing so reduces the free space of the recording medium 21. Therefore, if the cut-out still images are discarded after the evaluation, a moving image file is created again, and the evaluation value file is stored in association with it (or written into the comment area), the increase in the amount of stored data is slight, so it is desirable not to additionally record still images. A still image obtained by extracting the face image of each highly evaluated subject from the still images cut out from the moving image and combining them into one still image may also be recorded as an independent still image. In this case as well, a moving image file having the combined still image as its first frame can be created and recorded. Although the new moving image file has a larger file size than the original moving image file, the number of files does not increase, which makes management easier for the user.
[0078]
FIG. 6 is a diagram illustrating an example of the captured image display screen 231 displayed on the monitor 23 in step S20 of FIG. 3. FIG. 6A illustrates an example of the captured image display screen 231 displayed when a casual scene is determined in step S12 of FIG. 3. As shown in FIG. 6A, the captured image display screen 231 displays, together with the frame image, the overall smile evaluation value and the overall neatness evaluation value of the frame image, with the overall smile evaluation value on top. Alternatively, only the overall smile evaluation value, which is the priority evaluation value, may be displayed together with the frame image. When the display of the evaluation value of each subject is instructed by an operation from the operation unit 25, the smile evaluation value and the neat evaluation value of each subject are displayed as shown in FIG. 6B. Only the smile evaluation value of each subject may be displayed instead.
[0079]
FIG. 6C illustrates an example of the captured image display screen 231 displayed when it is determined in step S12 of FIG. 3 that the scene is a formal scene. As shown in FIG. 6C, the captured image display screen 231 displays the frame image together with the overall smile evaluation value and the overall neatness evaluation value of the frame image, with the overall neatness evaluation value on top. Alternatively, only the overall neatness evaluation value, which is the priority evaluation value, may be displayed together with the frame image. When the display of the evaluation value of each subject is instructed by an operation from the operation unit 25, the smile evaluation value and the neat evaluation value of each subject are displayed. Only the neat evaluation value of each subject may be displayed instead.
[0080]
FIG. 6E illustrates an example of the captured image display screen 231 displayed when weighting is set for the images of the bride and groom in step S8 of FIG. 3 and the display of the evaluation value of each subject is instructed. In FIG. 6E, the neat evaluation values of the subjects other than the groom and the bride are low (20 to 40 points), but the neat evaluation values of the groom and the bride are high (70 points and 80 points), so the overall neatness evaluation value is as high as 74 points. Conversely, in FIG. 6F, the neat evaluation values of the subjects other than the groom and the bride are high (80 to 90 points), but the neat evaluation values of the groom and the bride are low (40 points and 30 points), so the overall neatness evaluation value is as low as 50 points.
[0081]
After the top-ranked image is displayed on the captured image display screen 231, when the right arrow key of the operation unit 25 is operated to instruct image feed, the CPU 11a displays on the captured image display screen 231 the frame image with the next highest priority evaluation value. Each time the right arrow key is pressed, the CPU 11a switches the captured image display screen 231 from the currently displayed image to the one with the next lower priority evaluation value. Conversely, each time the left arrow key is pressed, the CPU 11a switches the captured image display screen 231 back to the image with the next higher priority evaluation value.
[0082]
When the setting of the number of prints is instructed by operating the operation unit 25 while the photographed image display screen 231 is displayed and the number of prints is input, the CPU 11a records the number of prints in the header area of the displayed frame image data on the recording medium 21. As a result, it is possible to decide which images to print and to specify the number of copies on the spot. When the recording medium 21 on which the image information and the numbers of prints are recorded is mounted in an external output device, the specified images are printed in the specified numbers. Since the photographed image display screen 231 displays images in descending order of priority evaluation value, a photograph in which all the photographed persons have good expressions is displayed first, making the selection of images to print easy.
[0083]
As described above, according to the digital camera 1, when the release switch is pressed, the CPU 11a determines whether or not the camera is in the evaluation mode. When it is, the CPU 11a extracts, in each of the frame images 101 to 145 of the recorded moving image, a face image of each subject (person), acquires its position information, performs a smile evaluation and a neat evaluation for each face image, and calculates, for each of the frame images 101 to 145, the overall smile evaluation value and the overall neatness evaluation value from the face images of the subjects photographed in the image.
[0084]
Next, the CPU 11a analyzes one of the frame images, for example, the frame image 101, and determines the scene of the photographed image. When determining that a casual scene was shot, the CPU 11a sets the smile evaluation value as the priority evaluation value; when determining that a formal scene was shot, it sets the neat evaluation value as the priority evaluation value. Then, for each person, the CPU 11a extracts the face image from the frame image in which that face has the highest priority evaluation value among the frame images 101 to 145, combines the extracted face images into a one-frame composite image 146, ranks the composite image 146 highest, and ranks the frame images 101 to 146 in descending order of priority evaluation value.
[0085]
Then, the CPU 11a creates an evaluation value file 211 that stores, as a text file, evaluation value information such as the overall smile evaluation value and the overall neatness evaluation value corresponding to each of the frame images 101 to 146 according to the ranking by priority evaluation value, stores the evaluation value file 211 and the frame image data in association with each other on the recording medium 21 as one moving image file, and displays the top-ranked frame image on the monitor 23 together with its overall smile evaluation value and overall neatness evaluation value. When switching to the next frame image is instructed by operating the arrow keys of the operation unit 25, the CPU 11a switches the display to the frame image with the next highest priority evaluation value.
[0086]
Accordingly, the facial expression of each subject (person) is evaluated according to the scene, and the captured images are displayed in descending order of evaluation, so the time-consuming and complicated work of selecting images in which the subjects have good expressions can be performed easily, and the user can easily print and enjoy the photographed images.
[0087]
Note that the description in the above embodiment is a preferred example of the digital camera 1 according to the present invention, and the present invention is not limited to it.
For example, in the above embodiment, both the smile evaluation value and the neat evaluation value of each subject are calculated, and the images are ranked by the priority evaluation value according to the scene. Instead, the scene may be determined first, and only the evaluation value corresponding to the scene may be calculated automatically, which shortens the processing. Alternatively, the user may be allowed to select either the smile evaluation value or the neat evaluation value, and only the selected evaluation value may be calculated and used for ranking.
[0088]
The coefficients (weights) of the elements used to calculate the smile evaluation value and the neat evaluation value may be set by the user. Alternatively, images may be ranked according to the priority evaluation value of a specific subject only. This allows the user to freely customize the evaluation criteria.
[0089]
In the above embodiment, the person to be weighted in the evaluation is designated on the screen by the user. Instead, for example, an image of the person to be weighted may be input to the digital camera 1 in advance, a face image that is identical (similar) to the input image may be identified among the images captured by the digital camera 1, and that person may be weighted. It is desirable that a person set for weighting can be checked on the finder screen of the monitor 23 at the time of shooting. FIG. 7A is a diagram illustrating an example of the finder screen 232. As indicated by 232a in FIG. 7A, the image of the person input in advance as the person to be weighted is displayed picture-in-picture on the finder screen 232, and it is desirable that this display can be turned off by operating the operation unit 25. Also, as indicated by 232b in FIG. 7A, it is desirable to highlight the person most similar to the input image on the finder screen 232, for example by encircling that person.
[0090]
Also, in a situation such as photographing a wedding hall, the persons for whom weights are to be set, such as the groom and the bride, are always arranged at the center. When the area in which the persons to be weighted will appear can be specified to some extent in this way, the area whose weight is to be increased (high-scoring area) and its weighting ratio may be set before shooting. It is desirable that the set area can be checked at the time of shooting, for example by being displayed on the finder screen of the monitor 23. FIG. 7B is a diagram illustrating an example of the finder screen 233. The area 233a of the finder screen 233 shown in FIG. 7B is a high-scoring area; the groom and the bride are arranged in this area and are evaluated with weighting in step S11 of FIG. 3. In this way, whether the persons to be weighted are located in the high-scoring area can be checked on the finder screen 233, so the persons to be weighted can be photographed so as to be reliably placed in the high-scoring area, and accurate evaluation becomes possible. The shape of the high-scoring area and the weighting ratio may also be changed after photographing and the images re-evaluated.
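The high-scoring area can be sketched as a rectangle test applied before weighting. The details below are assumptions made for illustration: rectangles are (x1, y1, x2, y2) tuples, the face's bounding-box centre decides membership, and the weighting is a simple multiplication by the area's ratio.

```python
def weighted_score(score, face_box, high_area, ratio):
    """Boost a face's evaluation when its centre lies in the high-scoring area.

    face_box, high_area: assumed (x1, y1, x2, y2) rectangles.
    ratio: assumed multiplicative weighting ratio set before shooting.
    """
    x1, y1, x2, y2 = face_box
    cx, cy = (x1 + x2) / 2, (y1 + y2) / 2
    ax1, ay1, ax2, ay2 = high_area
    if ax1 <= cx <= ax2 and ay1 <= cy <= ay2:
        return score * ratio
    return score
```

Because only the area shape and ratio enter the calculation, re-evaluating after changing them (as the text allows) means re-running this step over the stored per-face scores.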
[0091]
In addition, the detailed configuration and detailed operation of the digital camera 1 can be appropriately changed without departing from the spirit of the present invention.
[0092]
[Second embodiment]
Hereinafter, a second embodiment in which the present invention is applied to an image selection device will be described with reference to the drawings.
First, the configuration will be described.
FIG. 8 is a diagram conceptually showing the overall configuration of the present embodiment. As shown in FIG. 8, the image selection device 3 is connected to a printer 4 via a data communication cable 2 such as a USB (Universal Serial Bus) cable. The image selection device 3 includes a monitor 33a, a number display section 33b, a left arrow key 34a, an enter key 34b, a right arrow key 34c, ten keys 34d, and a memory card slot 35a, into which a memory card 36 (for example, the recording medium 21 of the first embodiment) on which captured images are recorded can be inserted.
[0093]
FIG. 9 is a block diagram illustrating the functional configuration of the image selection device 3. As shown in FIG. 9, the image selection device 3 includes a control unit 31, a storage unit 32, a display unit 33, an operation unit 34, a memory I/F 35, a memory card 36, a USB-I/F 37, and the like, and each part is connected by a bus 38.
[0094]
The control unit 31 is configured by a CPU or the like, reads a control program stored in advance in the storage unit 32, and controls the entire image selection device 3 according to the read program. In addition, the control unit 31 executes various processes such as an image selection process described later according to the read program.
[0095]
The storage unit 32 is configured by a nonvolatile semiconductor memory or the like. The storage unit 32 stores in advance various programs such as a control program and processing programs corresponding to the image selection device 3, as well as data referred to by these programs. It also stores a smile evaluation criterion table and a neat evaluation criterion table similar to those shown in FIG. 2.
[0096]
The display unit 33 includes the monitor 33a and the number display section 33b. The monitor 33a is composed of an LCD (Liquid Crystal Display) or an EL (Electro Luminescent) display, and displays various types of information and the image data recorded on the memory card 36 as images according to instruction signals from the control unit 31. The number display section 33b is also configured by an LCD or the like, and displays the numerical value input with the ten keys 34d as the number of prints.
[0097]
The operation unit 34 includes the left arrow key 34a, the enter key 34b, the right arrow key 34c, the ten keys 34d, and the like, and outputs user operation signals to the control unit 31.
[0098]
The memory I/F 35 is an interface that includes the memory card slot 35a and establishes an electrical connection with the memory card 36 mounted in the memory card slot 35a.
[0099]
The memory card 36 is detachable from the memory card slot 35a, is composed of a nonvolatile semiconductor memory such as a flash memory, and records image data photographed by an external photographing device.
[0100]
The USB-I/F 37 is an interface for connecting a data communication cable such as a USB cable to the image selection device 3.
[0101]
Next, the operation will be described.
FIG. 10 is a flowchart illustrating an image selection process performed by the control unit 31. Hereinafter, the image selection processing will be described with reference to FIG.
[0102]
The control unit 31 waits for the memory card 36 to be mounted (step S41), and upon detecting the mounting of the memory card 36 (step S41; YES), determines whether the image file stored at the top of the memory card 36 is a moving image file or a continuous shooting file (step S42). If it is determined that the file is neither a moving image file nor a continuous shooting file (step S42; NO), the image data of the image file is displayed on the monitor 33a (step S43). Then, when an input signal is input from the ten keys 34d, the control unit 31 displays the input numerical value on the number display section 33b (step S44; YES), and when printing is instructed by pressing the enter key 34b (step S45; YES), outputs the input numerical value and the image data of the image file to the printer 4 via the USB-I/F 37 (step S46). Next, when an arrow key is pressed (step S47; YES), the control unit 31 performs the processing from step S42 on the next image stored in the memory card 36. If no number has been input (step S44; NO), or if no print instruction has been input (step S45; NO), the control unit 31 proceeds to step S47 and waits for an arrow key operation.
[0103]
When the control unit 31 determines in step S42 that the image file stored at the top of the memory card 36 is a moving image file or a continuous shooting file (step S42; YES), it determines whether an evaluation value file is stored in association with the file (step S48). Like the evaluation value file 211 created in the first embodiment of the present invention, the evaluation value file stores the smile evaluation value and the neat evaluation value of each subject in each image of the image file, together with the overall evaluation values of each image.
[0104]
If it is determined in step S48 that no evaluation value file is stored (step S48; NO), the control unit 31 causes the monitor 33a to display "Do you want to perform evaluation scoring?", and when an instruction is input by the user via the enter key 34b (step S49; YES), performs an evaluation process on each image stored in the image file: an evaluation value file for the images is created and recorded by processing similar to that of step S7 and subsequent steps in FIG. 3. After the evaluation process, the control unit 31 proceeds to step S51.
[0105]
If it is determined in step S48 that an evaluation value file is stored (step S48; YES), or after the evaluation value file is created in step S50, the control unit 31 displays the data of the image having the highest evaluation value (priority evaluation value) in the moving image file or continuous shooting file (step S51). Then, when an input signal is input from the ten keys 34d, the control unit 31 displays the input numerical value on the number display section 33b (step S52; YES), and when printing is instructed by pressing the enter key 34b (step S53; YES), outputs the input numerical value and the image data to the printer 4 via the USB-I/F 37 (step S54). Next, when an arrow key is pressed (step S55; YES), the control unit 31 determines whether the image is the last image in the image file; if it determines that it is the last image (step S56; YES), it returns to step S42 and executes the processing for the next image file. On the other hand, if it determines that the image is not the last image (step S56; NO), the control unit 31 returns to step S51 and displays the next image.
[0106]
On the other hand, if no instruction for evaluation scoring is input in step S49 (step S49; NO), the control unit 31 causes the monitor 33a to display the images in the order in which they are stored in the memory card 36 (step S57). When an input signal is input from the ten keys 34d, the control unit 31 displays the input numerical value on the number display section 33b (step S58; YES), and when printing is instructed by pressing the enter key 34b (step S59; YES), outputs the input numerical value and the image data to the printer 4 via the USB-I/F 37 (step S60). Next, when an arrow key is pressed (step S61; YES), the control unit 31 determines whether the image is the last image in the image file; if it determines that it is the last image (step S62; YES), it returns to step S42 and executes the processing for the next image file. On the other hand, if it determines that the image is not the last image (step S62; NO), the control unit 31 returns to step S57 and displays the next image.
[0107]
As described above, according to the image selection device 3, when the memory card 36 is inserted, the control unit 31 displays on the monitor 33a, for an image file stored on the memory card 36 in association with an evaluation value file, the images in descending order of evaluation value, and when the number of prints is specified and printing is instructed, transfers the image data and the number of prints to the printer 4. For an image file with which no evaluation value file is associated, when creation of an evaluation value file is instructed from the operation unit 34, an evaluation value file is created and the images are displayed in descending order of evaluation value; when the number of copies is designated and printing is instructed, the image data and the number of copies are transferred to the printer 4.
[0108]
Accordingly, the facial expression of each subject (person) is evaluated according to the scene, and the captured images are displayed in descending order of evaluation, so the time-consuming and complicated work of selecting images in which the subjects have good expressions can be performed easily, and the photographed images can be easily printed and enjoyed. In addition, since printing on the printer 4 can be started immediately by designating the number of prints, the user can enjoy the prints at once, and the photographer can collect payment immediately.
[0109]
Note that the description in the above embodiment is a preferred example of the image selection device 3 according to the present invention, and the present invention is not limited to it.
For example, in the above embodiment, the image files are displayed in the order in which they are stored. However, the name of the image file to be displayed may be designated, and steps S42 to S62 of the image selection process may be executed for the designated image file. A desired image file can thereby be displayed easily, improving convenience. In addition, not limited to moving image files or continuous shooting files, when a plurality of still image files are designated, the evaluation values of the plurality of still images may be determined and the images rearranged and displayed in descending order of evaluation value.
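The re-ordering of designated still images described above is a straightforward descending sort on the evaluation values. A minimal sketch, assuming each image is represented as a (name, evaluation_value) pair:

```python
def rank_images(images):
    """Rearrange still images in descending order of evaluation value.

    images: assumed list of (name, evaluation_value) pairs.
    """
    return sorted(images, key=lambda item: item[1], reverse=True)
```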
[0110]
Further, in the above embodiment, the number-of-copies information and the image data are transferred when printing is instructed for an image whose number of copies has been specified. Instead, when recording is instructed, the number-of-copies information may be stored in the header area of the image data and printing performed later.
[0111]
When a moving image file or a continuous shooting file associated with an evaluation value file is designated, the image data of the designated file may be output to the printer 4 in order of priority evaluation value, and the printer 4 may print the images in that order.
[0112]
Further, the evaluation of images is not limited to being performed by the digital camera 1 of the first embodiment or the image selection device 3 of the second embodiment. For example, a printing device such as a printer may perform the evaluation and output the image with the highest evaluation. An image evaluation site may also be implemented on the Internet: images uploaded from a photographing device or personal computer are evaluated, and image numbers are returned in order of evaluation. Services may then be provided in which images to print are selected and printed based on the image numbers, or in which images to store are selected and recorded on a medium with good storage properties such as a CD or DVD. A service may also be provided in which an image combining the face images with high evaluation values is created and downloaded. As a result, even a user whose photographing apparatus has no evaluation function can easily select the best image to print, and can also obtain a composite image with the best expressions. Prints can be ordered easily without visiting a lab or the like.
[0113]
In addition, the detailed configuration and detailed operation of the image selection device 3 can be appropriately changed without departing from the spirit of the present invention.
[0114]
【The invention's effect】
According to the first aspect of the present invention, the facial expression of the subject in the captured image can be evaluated and scored. Therefore, it is possible to evaluate the quality of the expression of the subject.
[0115]
According to the second aspect of the present invention, at least one of the degree of smile and the degree of neatness of the expression of the subject in the captured image can be evaluated and scored. Therefore, it is possible to evaluate whether or not the expression of the subject is an expression corresponding to the shooting scene.
[0116]
According to the third aspect of the present invention, when a plurality of photographed persons are included in one photographed image, the evaluations scored for the plurality of photographed persons can be averaged to score an overall evaluation value of the photographed image. Therefore, the photographed image as a whole can be evaluated.
[0117]
According to the fourth aspect of the invention, when scoring the total evaluation value of the subject in one captured image, the evaluation can be performed by weighting the evaluation of a specific person. Therefore, it is possible to evaluate a captured image in consideration of the evaluation of a person who wants to be emphasized.
[0118]
According to the fifth aspect of the present invention, when a plurality of subjects are included in one photographed image, the scored evaluation of each subject can be displayed on the display means. Therefore, the user can know the evaluation of each subject.
[0119]
According to the sixth aspect of the present invention, it is possible to display the comprehensive evaluation value of the captured image on the display unit. Therefore, the user can know the comprehensive evaluation value for each captured image.
[0120]
According to the invention described in claims 7 to 10, it is possible to analyze the captured image and determine whether the scene of the captured image is a casual scene or a formal scene.
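Claims 8 to 10 name the cues used for this determination: outdoor versus indoor shooting, the clothing of the photographed persons, and their hairstyles. How the cues are combined is not specified in this section; the majority vote below is an assumed illustration, not the patent's method:

```python
def judge_scene(outdoor, formal_clothes, formal_hairstyle):
    """Decide whether a shot is a casual or a formal scene from three
    boolean image-analysis cues (outdoor/indoor, clothing, hairstyle).
    The majority-vote combination is an assumption for illustration."""
    formal_votes = sum([not outdoor, formal_clothes, formal_hairstyle])
    return "formal" if formal_votes >= 2 else "casual"

print(judge_scene(outdoor=True, formal_clothes=False, formal_hairstyle=False))   # casual
print(judge_scene(outdoor=False, formal_clothes=True, formal_hairstyle=False))   # formal
```

The result of this judgment then selects which score (smile or neatness) becomes the priority evaluation value, as described in claims 11 and 12.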
[0121]
According to the invention described in claim 11, when the scene of the photographed image is determined to be a casual scene, the scored evaluation of the degree of smile can be displayed on the photographed image with priority over the evaluation of the degree of neatness. Therefore, the evaluation appropriate to the scene of the captured image can be displayed in an easily viewable manner.
[0122]
According to the twelfth aspect of the present invention, when the scene of the photographed image is determined to be a formal scene, the scored evaluation of the degree of neatness can be displayed on the photographed image with priority over the evaluation of the degree of smile. Therefore, the evaluation appropriate to the scene of the captured image can be displayed in an easily viewable manner.
[0123]
According to the thirteenth aspect of the present invention, the expressions of the photographed persons in a plurality of photographed images are evaluated, the plurality of photographed images are ranked in descending order of evaluation, and the photographed images can be displayed on the display means based on the ranking. Therefore, since the photographed images are displayed in order of how good the subjects' expressions are, the user can easily select a photographed image to be printed.
[0124]
According to the fourteenth aspect of the present invention, each frame of a moving image can be cut out as a still image, the still images can be ranked in descending order of evaluation, and the plurality of captured images can be displayed based on the ranking. Therefore, since the frame images of the moving image are displayed in order of how good the subject's expression is, the user can easily select a captured image to be printed.
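The ranking in the thirteenth and fourteenth aspects reduces to sorting images by their priority evaluation value. A minimal sketch, with an assumed mapping from frame index to score:

```python
def rank_frames(frame_scores):
    """Return frame indices ordered from highest to lowest evaluation.
    `frame_scores` maps frame index -> priority evaluation value (the
    smile or neatness score chosen according to the scene)."""
    return sorted(frame_scores, key=frame_scores.get, reverse=True)

scores = {0: 55, 1: 90, 2: 72}
print(rank_frames(scores))  # [1, 2, 0]
```

Displaying frames on the monitor in this order is what lets the user pick the best print candidate first.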
[0125]
According to the invention of claim 15, the ranking set for the photographed images can be output to an external device in association with each photographed image. Therefore, the photographed images can be displayed and printed on the external device in descending order of how good the subject's expression is.
[0126]
According to the sixteenth aspect of the present invention, when a plurality of photographed persons are included in each of a plurality of photographed images, the most highly evaluated face image of each photographed person can be extracted from the plurality of photographed images and combined into one still image. Therefore, a photographed image in which every photographed person has a good expression can be obtained.
[0127]
According to the seventeenth aspect, the still image obtained by extracting and combining the most highly evaluated face image of each subject can be set to the highest rank. Therefore, when selecting an image to be printed, the user can easily select the composite image in which all subjects have good expressions.
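The selection step behind the sixteenth aspect is a per-person argmax over the candidate images; the data layout below (person name -> image id -> face score) is an assumption for illustration:

```python
def best_faces(evaluations):
    """For each photographed person, pick the image in which that
    person's face received the highest score. The composite built from
    these picks would then be placed first in the ranking, per the
    seventeenth aspect."""
    return {person: max(scores, key=scores.get)
            for person, scores in evaluations.items()}

evals = {"A": {"img1": 60, "img2": 85},
         "B": {"img1": 90, "img2": 40}}
print(best_faces(evals))  # {'A': 'img2', 'B': 'img1'}
```

Here person A's best face comes from one image and person B's from another, which is exactly the case where compositing beats choosing any single frame.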
[0128]
According to the eighteenth aspect, a moving image can be created from the plurality of still images in the set order. Therefore, even when still images are cut out of a moving image and evaluated, the invention can be implemented without increasing the file size beyond that of the original moving image, by creating and recording a new moving image whose frames are rearranged in descending order of evaluation.
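The rearrangement in this aspect amounts to rebuilding the frame sequence by rank; since the same frame data is reused, the total size does not grow. A minimal sketch (the list-of-frames representation is an assumption):

```python
def reorder_frames(frames, ranking):
    """Build the frame sequence of the new moving image by placing the
    original frames in the ranked order. No frame is duplicated, so the
    result is no larger than the original sequence."""
    return [frames[i] for i in ranking]

frames = ["f0", "f1", "f2"]
print(reorder_frames(frames, [1, 2, 0]))  # ['f1', 'f2', 'f0']
```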
[0129]
According to the nineteenth aspect, the scored evaluation can be stored in association with the captured image. Therefore, since the evaluations of the photographed images are retained, a plurality of photographed images can easily be displayed in descending order of evaluation at any time and output to an external device.
[Brief description of the drawings]
FIG. 1 is a block diagram showing a functional configuration of a digital camera 1 according to the present invention.
FIG. 2 is a diagram showing an example of data storage inside a smile evaluation criterion table 261 and a neat evaluation criterion table 262 of FIG.
FIG. 3 is a flowchart of a captured image evaluation process executed by a CPU 11a of FIG. 1;
FIG. 4 is a diagram showing an example of data storage of an evaluation value file 211 created in step S18 of FIG.
FIG. 5 is a diagram showing an example of a data structure of a moving image file stored in a recording medium 21 in step S19 of FIG.
FIG. 6 is a diagram showing an example of a captured image display screen 231 displayed on a monitor 23 in step S20 of FIG.
FIG. 7 is a diagram showing an example of finder screens 232 and 233 displayed on a monitor 23 during shooting.
FIG. 8 is a diagram conceptually showing an overall configuration of a second embodiment according to the present invention.
FIG. 9 is a block diagram illustrating a functional configuration of the image selection device 3 in FIG.
FIG. 10 is a flowchart illustrating an image selection process executed by a control unit 31 of FIG. 9;
[Explanation of symbols]
1 Digital camera
2 Data communication cable
3 Image selection device
4 Printer
11 Control unit
11a CPU
11b ROM
11c RAM
12 Optical system controller
13 Imaging optical system
14 Image sensor
15 Image signal generation circuit
16 TG
17 CCD driver
18 Image signal processing unit
19 AE / AF processing circuit
20 Built-in memory
21 Recording media
22 Monitor drive circuit
23 Monitor
24 Flash unit
25 Operation unit
26 EEPROM
27 Battery
31 Control unit
32 Storage unit
33 Display
34 Operation unit
35 Memory I/F
36 Memory card
37 USB I/F

Claims (19)

  1. An image evaluation apparatus comprising: an evaluation unit that evaluates and scores a facial expression of a subject in an input captured image.
  2. The image evaluation apparatus according to claim 1, wherein the evaluation unit evaluates and scores at least one of a degree of smile and a degree of neatness of the facial expression of the subject.
  3. The image evaluation apparatus according to claim 1, wherein, when a plurality of photographed persons are included in one photographed image, the evaluation unit averages the evaluations of the plurality of photographed persons and scores an overall evaluation value of the one photographed image.
  4. The image evaluation apparatus according to claim 3, wherein the evaluation unit weights the evaluation of a specific person when scoring the comprehensive evaluation value.
  5. The image evaluation apparatus according to claim 1, further comprising: a display unit for displaying the captured image; and a control unit that, when a plurality of photographed persons are included in one photographed image, causes the display unit to display, for each photographed person, the evaluation scored by the evaluation unit.
  6. The image evaluation device according to claim 5, wherein the control unit causes the display unit to display the comprehensive evaluation value.
  7. The image evaluation apparatus according to claim 1, further comprising a photographing scene determination unit that analyzes the photographed image and determines whether the scene of the photographed image is a casual scene or a formal scene.
  8. The image evaluation apparatus according to claim 7, wherein the photographing scene determination unit determines whether the scene of the photographed image is a casual scene or a formal scene by analyzing whether the photographed image was photographed outdoors or indoors.
  9. The image evaluation apparatus according to claim 7 or 8, wherein the photographing scene determination unit determines whether the scene of the photographed image is a casual scene or a formal scene based on the clothes of a photographed person in the photographed image.
  10. The image evaluation apparatus according to any one of claims 7 to 9, wherein the photographing scene determination unit determines whether the scene of the photographed image is a casual scene or a formal scene based on the hairstyle of a photographed person in the photographed image.
  11. The image evaluation apparatus according to any one of claims 7 to 10, wherein, when the photographing scene determination unit determines that the scene of the photographed image is a casual scene, the control unit displays the evaluation of the degree of smile scored by the evaluation unit on the photographed image with priority over the evaluation of the degree of neatness.
  12. The image evaluation apparatus according to claim 11, wherein, when the photographing scene determination unit determines that the scene of the photographed image is a formal scene, the control unit displays the evaluation of the degree of neatness scored by the evaluation unit on the photographed image with priority over the evaluation of the degree of smile.
  13. The image evaluation apparatus according to claim 5, wherein the evaluation unit evaluates the facial expression of a photographed person in each of a plurality of captured images, the apparatus further comprises an order setting unit that sets an order for the plurality of captured images in descending order of the evaluation by the evaluation unit, and the control unit causes the plurality of captured images to be displayed on the display unit based on the order set by the order setting unit.
  14. The image evaluation apparatus according to claim 13, wherein the plurality of captured images are captured images obtained by cutting out each frame of a moving image as a still image.
  15. The image evaluation apparatus according to claim 13, further comprising an output unit configured to output the order set by the order setting unit to an external device in association with each of the captured images.
  16. The image evaluation apparatus according to any one of claims 13 to 15, further comprising a face image synthesizing unit that, when a plurality of photographed persons are shown in each of the plurality of photographed images, extracts, for each photographed person, the face image most highly evaluated by the evaluation unit from among that person's face images in the plurality of photographed images, and synthesizes the extracted face images as one still image.
  17. The image evaluation apparatus according to claim 16, wherein the order setting unit sets the still image synthesized by the face image synthesizing unit to the highest order.
  18. The image evaluation apparatus according to claim 14, further comprising a moving image creating unit that creates a moving image from the plurality of still images in the order set by the order setting unit.
  19. The image evaluation device according to claim 1, further comprising a storage unit configured to store the evaluation scored by the evaluation unit in association with the captured image.
JP2002204099A 2002-07-12 2002-07-12 Picture evaluation device Withdrawn JP2004046591A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002204099A JP2004046591A (en) 2002-07-12 2002-07-12 Picture evaluation device


Publications (1)

Publication Number Publication Date
JP2004046591A true JP2004046591A (en) 2004-02-12

Family

ID=31709795

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2002204099A Withdrawn JP2004046591A (en) 2002-07-12 2002-07-12 Picture evaluation device

Country Status (1)

Country Link
JP (1) JP2004046591A (en)

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005242567A (en) * 2004-02-25 2005-09-08 Canon Inc Movement evaluation device and method
JP2005301008A (en) * 2004-04-14 2005-10-27 Casio Comput Co Ltd Photographic image printer and program
JP2005318515A (en) * 2004-03-31 2005-11-10 Fuji Photo Film Co Ltd Digital still camera, image reproducing apparatus, face image display apparatus, and methods of controlling same
WO2006117942A1 (en) * 2005-04-28 2006-11-09 Konica Minolta Holdings, Inc. Person imaging device and person imaging method
JP2006330800A (en) * 2005-05-23 2006-12-07 Nippon Telegr & Teleph Corp <Ntt> Image synthesis system, image synthesis method, and program of the method
WO2007000685A1 (en) * 2005-06-28 2007-01-04 Koninklijke Philips Electronics N.V. Method of operating a camera for taking electronic images, camera for taking electronic images
JP2007067560A (en) * 2005-08-29 2007-03-15 Canon Inc Imaging apparatus and its control method, computer program and recording medium
JP2007274207A (en) * 2006-03-30 2007-10-18 Fujifilm Corp Image display device, image pickup device, and image display method
JP2008027086A (en) * 2006-07-19 2008-02-07 Sony Computer Entertainment Inc Facial expression inducing device, facial expression inducing method, and facial expression inducing system
JP2008058553A (en) * 2006-08-31 2008-03-13 Casio Comput Co Ltd Imaging apparatus, imaging method, and imaging control program
JP2008071048A (en) * 2006-09-13 2008-03-27 Nippon Telegr & Teleph Corp <Ntt> System for presenting dynamic content and its program
JP2008311818A (en) * 2007-06-13 2008-12-25 Sony Corp Image photographing device and image photographing method, and computer program
JP2009088742A (en) * 2007-09-28 2009-04-23 Casio Comput Co Ltd Imaging apparatus, imaging control program, and imaging method
JP2009177678A (en) * 2008-01-28 2009-08-06 Nikon Corp Camera and image display device
JP2009267695A (en) * 2008-04-24 2009-11-12 Canon Inc Image processing apparatus, control method for the same, and program
JP2009290336A (en) * 2008-05-27 2009-12-10 Sony Corp Image reproducing device, image reproducing method, and program
JP2010028773A (en) * 2008-07-24 2010-02-04 Canon Inc Image processing apparatus, image processing method and program
US7693413B2 (en) 2005-07-21 2010-04-06 Sony Corporation Camera system, information processing device, information processing method, and computer program
JP2010166259A (en) * 2009-01-14 2010-07-29 Sony Corp Image-capture device, image-capture method, and image-capture program
JP2010177811A (en) * 2009-01-27 2010-08-12 Nikon Corp Digital camera
JP2010177894A (en) * 2009-01-28 2010-08-12 Sony Corp Imaging apparatus, image management apparatus, image management method, and computer program
JP2010177812A (en) * 2009-01-27 2010-08-12 Nikon Corp Digital camera and image processing apparatus
US7796785B2 (en) 2005-03-03 2010-09-14 Fujifilm Corporation Image extracting apparatus, image extracting method, and image extracting program
JP2010239467A (en) * 2009-03-31 2010-10-21 Casio Computer Co Ltd Image selection device, method for selecting image and program
JP2011029978A (en) * 2009-07-27 2011-02-10 Casio Computer Co Ltd Image processing unit, method of processing image, and program
US7952618B2 (en) 2006-01-27 2011-05-31 Fujifilm Corporation Apparatus for controlling display of detection of target image, and method of controlling same
WO2011089872A1 (en) 2010-01-22 2011-07-28 パナソニック株式会社 Image management device, image management method, program, recording medium, and integrated circuit
JP2011155605A (en) * 2010-01-28 2011-08-11 Nikon Corp Image processing device, imaging device, and image processing program
WO2011105579A1 (en) 2010-02-26 2011-09-01 シャープ株式会社 Content reproduction device, television receiver, content reproduction method, content reproduction program, and recording medium
US8081227B1 (en) * 2006-11-30 2011-12-20 Adobe Systems Incorporated Image quality visual indicator
EP2405646A1 (en) * 2006-03-23 2012-01-11 Nikon Corporation Camera and image processing program
WO2012011213A1 (en) * 2010-07-21 2012-01-26 パナソニック株式会社 Image management device, image management method, program, recording medium, and integrated circuit for image management
US8238618B2 (en) 2006-08-02 2012-08-07 Sony Corporation Image-capturing apparatus and method, facial expression evaluation apparatus, and program
US8243159B2 (en) 2008-04-09 2012-08-14 Sony Corporation Image capturing device, image processing device, image analysis method for the image capturing device and the image processing device, and program for facial attribute detection
JP2012182841A (en) * 2012-06-11 2012-09-20 Canon Inc Image processing apparatus, control method for the same, and program
JP2012256327A (en) * 2012-06-27 2012-12-27 Olympus Imaging Corp Image display apparatus, image display method, and image display program
JP2013021462A (en) * 2011-07-08 2013-01-31 Fujitsu Mobile Communications Ltd Terminal device, picture imaging method for terminal device, and picture imaging device
JP2013046374A (en) * 2011-08-26 2013-03-04 Sanyo Electric Co Ltd Image processor
WO2013031241A1 (en) * 2011-09-02 2013-03-07 株式会社ニコン Electronic camera, image-processing apparatus, and image-processing program
US8421901B2 (en) 2009-01-23 2013-04-16 Nikon Corporation Display apparatus and imaging apparatus
US8615112B2 (en) 2007-03-30 2013-12-24 Casio Computer Co., Ltd. Image pickup apparatus equipped with face-recognition function
US8711265B2 (en) 2008-04-24 2014-04-29 Canon Kabushiki Kaisha Image processing apparatus, control method for the same, and storage medium
JP2014116957A (en) * 2007-12-28 2014-06-26 Casio Comput Co Ltd Imaging apparatus and program
US8780221B2 (en) 2008-04-09 2014-07-15 Canon Kabushiki Kaisha Facial expression recognition apparatus, image sensing apparatus, facial expression recognition method, and computer-readable storage medium
WO2014162788A1 (en) * 2013-04-02 2014-10-09 Necソリューションイノベータ株式会社 Facial-expression assessment device, dance assessment device, karaoke device, and game device
US9591210B2 (en) 2012-03-06 2017-03-07 Sony Corporation Image processing face detection apparatus, method for controlling the same, and program
JP2018038090A (en) * 2017-12-07 2018-03-08 オリンパス株式会社 Image creation device, image creation method, image creation program, and image creation system
JP2019504514A (en) * 2016-11-29 2019-02-14 北京小米移動軟件有限公司Beijing Xiaomi Mobile Software Co.,Ltd. Photo composition method, apparatus, program, and recording medium

Cited By (76)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005242567A (en) * 2004-02-25 2005-09-08 Canon Inc Movement evaluation device and method
JP4489608B2 (en) * 2004-03-31 2010-06-23 富士フイルム株式会社 Digital still camera, image reproduction device, face image display device, and control method thereof
JP2005318515A (en) * 2004-03-31 2005-11-10 Fuji Photo Film Co Ltd Digital still camera, image reproducing apparatus, face image display apparatus, and methods of controlling same
US7853140B2 (en) 2004-03-31 2010-12-14 Fujifilm Corporation Digital still camera, image reproducing apparatus, face image display apparatus and methods of controlling same including determination device and decision device
JP2005301008A (en) * 2004-04-14 2005-10-27 Casio Comput Co Ltd Photographic image printer and program
JP4506253B2 (en) * 2004-04-14 2010-07-21 カシオ計算機株式会社 Photo image extraction apparatus and program
US7796785B2 (en) 2005-03-03 2010-09-14 Fujifilm Corporation Image extracting apparatus, image extracting method, and image extracting program
JP2011170892A (en) * 2005-03-03 2011-09-01 Fujifilm Corp Image extracting device, image extracting method, and image extracting program
WO2006117942A1 (en) * 2005-04-28 2006-11-09 Konica Minolta Holdings, Inc. Person imaging device and person imaging method
JP2006330800A (en) * 2005-05-23 2006-12-07 Nippon Telegr & Teleph Corp <Ntt> Image synthesis system, image synthesis method, and program of the method
WO2007000685A1 (en) * 2005-06-28 2007-01-04 Koninklijke Philips Electronics N.V. Method of operating a camera for taking electronic images, camera for taking electronic images
US7693413B2 (en) 2005-07-21 2010-04-06 Sony Corporation Camera system, information processing device, information processing method, and computer program
JP2007067560A (en) * 2005-08-29 2007-03-15 Canon Inc Imaging apparatus and its control method, computer program and recording medium
US7952618B2 (en) 2006-01-27 2011-05-31 Fujifilm Corporation Apparatus for controlling display of detection of target image, and method of controlling same
EP2405646A1 (en) * 2006-03-23 2012-01-11 Nikon Corporation Camera and image processing program
US8199242B2 (en) 2006-03-23 2012-06-12 Nikon Corporation Camera and image processing program
US8704856B2 (en) 2006-03-30 2014-04-22 Fujifilm Corporation Image display apparatus, image-taking apparatus and image display method
US9736379B2 (en) 2006-03-30 2017-08-15 Fujifilm Corporation Image display apparatus, image-taking apparatus and image display method
JP4507281B2 (en) * 2006-03-30 2010-07-21 富士フイルム株式会社 Image display device, imaging device, and image display method
JP2007274207A (en) * 2006-03-30 2007-10-18 Fujifilm Corp Image display device, image pickup device, and image display method
JP2008027086A (en) * 2006-07-19 2008-02-07 Sony Computer Entertainment Inc Facial expression inducing device, facial expression inducing method, and facial expression inducing system
CN101867712B (en) 2006-08-02 2013-03-13 索尼株式会社 Image-capturing apparatus and method and expression evaluation apparatus
US8416999B2 (en) 2006-08-02 2013-04-09 Sony Corporation Image-capturing apparatus and method, expression evaluation apparatus, and program
US8238618B2 (en) 2006-08-02 2012-08-07 Sony Corporation Image-capturing apparatus and method, facial expression evaluation apparatus, and program
CN101877766B (en) 2006-08-02 2013-03-13 索尼株式会社 Image-capturing apparatus and method, expression evaluation apparatus
US8406485B2 (en) 2006-08-02 2013-03-26 Sony Corporation Image-capturing apparatus and method, expression evaluation apparatus, and program
KR101401165B1 (en) 2006-08-02 2014-05-29 소니 주식회사 Image-capturing apparatus and method, expression evaluation apparatus, and recording medium
US8416996B2 (en) 2006-08-02 2013-04-09 Sony Corporation Image-capturing apparatus and method, expression evaluation apparatus, and program
US8260041B2 (en) 2006-08-02 2012-09-04 Sony Corporation Image-capturing apparatus and method, expression evaluation apparatus, and program
US8260012B2 (en) 2006-08-02 2012-09-04 Sony Corporation Image-capturing apparatus and method, expression evaluation apparatus, and program
JP2008058553A (en) * 2006-08-31 2008-03-13 Casio Comput Co Ltd Imaging apparatus, imaging method, and imaging control program
JP2008071048A (en) * 2006-09-13 2008-03-27 Nippon Telegr & Teleph Corp <Ntt> System for presenting dynamic content and its program
US8081227B1 (en) * 2006-11-30 2011-12-20 Adobe Systems Incorporated Image quality visual indicator
US8615112B2 (en) 2007-03-30 2013-12-24 Casio Computer Co., Ltd. Image pickup apparatus equipped with face-recognition function
US9042610B2 (en) 2007-03-30 2015-05-26 Casio Computer Co., Ltd. Image pickup apparatus equipped with face-recognition function
JP2008311818A (en) * 2007-06-13 2008-12-25 Sony Corp Image photographing device and image photographing method, and computer program
JP2009088742A (en) * 2007-09-28 2009-04-23 Casio Comput Co Ltd Imaging apparatus, imaging control program, and imaging method
JP2014116957A (en) * 2007-12-28 2014-06-26 Casio Comput Co Ltd Imaging apparatus and program
JP2009177678A (en) * 2008-01-28 2009-08-06 Nikon Corp Camera and image display device
US9147107B2 (en) 2008-04-09 2015-09-29 Canon Kabushiki Kaisha Facial expression recognition apparatus, image sensing apparatus, facial expression recognition method, and computer-readable storage medium
US8780221B2 (en) 2008-04-09 2014-07-15 Canon Kabushiki Kaisha Facial expression recognition apparatus, image sensing apparatus, facial expression recognition method, and computer-readable storage medium
US8243159B2 (en) 2008-04-09 2012-08-14 Sony Corporation Image capturing device, image processing device, image analysis method for the image capturing device and the image processing device, and program for facial attribute detection
US9258482B2 (en) 2008-04-09 2016-02-09 Canon Kabushiki Kaisha Facial expression recognition apparatus, image sensing apparatus, facial expression recognition method, and computer-readable storage medium
JP2009267695A (en) * 2008-04-24 2009-11-12 Canon Inc Image processing apparatus, control method for the same, and program
US8711265B2 (en) 2008-04-24 2014-04-29 Canon Kabushiki Kaisha Image processing apparatus, control method for the same, and storage medium
JP2009290336A (en) * 2008-05-27 2009-12-10 Sony Corp Image reproducing device, image reproducing method, and program
JP2010028773A (en) * 2008-07-24 2010-02-04 Canon Inc Image processing apparatus, image processing method and program
US8199213B2 (en) 2008-07-24 2012-06-12 Canon Kabushiki Kaisha Method for selecting desirable images from among a plurality of images and apparatus thereof
JP2010166259A (en) * 2009-01-14 2010-07-29 Sony Corp Image-capture device, image-capture method, and image-capture program
US8285108B2 (en) 2009-01-14 2012-10-09 Sony Corporation Image-capture device, image-capture method, and image-capture program
US8421901B2 (en) 2009-01-23 2013-04-16 Nikon Corporation Display apparatus and imaging apparatus
JP2010177811A (en) * 2009-01-27 2010-08-12 Nikon Corp Digital camera
JP2010177812A (en) * 2009-01-27 2010-08-12 Nikon Corp Digital camera and image processing apparatus
US8768063B2 (en) 2009-01-28 2014-07-01 Sony Corporation Image processing apparatus, image management apparatus and image management method, and computer program
JP2010177894A (en) * 2009-01-28 2010-08-12 Sony Corp Imaging apparatus, image management apparatus, image management method, and computer program
JP2010239467A (en) * 2009-03-31 2010-10-21 Casio Computer Co Ltd Image selection device, method for selecting image and program
US8416312B2 (en) 2009-03-31 2013-04-09 Casio Computer Co., Ltd. Image selection device and method for selecting image
JP2011029978A (en) * 2009-07-27 2011-02-10 Casio Computer Co Ltd Image processing unit, method of processing image, and program
WO2011089872A1 (en) 2010-01-22 2011-07-28 パナソニック株式会社 Image management device, image management method, program, recording medium, and integrated circuit
JP2011155605A (en) * 2010-01-28 2011-08-11 Nikon Corp Image processing device, imaging device, and image processing program
WO2011105579A1 (en) 2010-02-26 2011-09-01 シャープ株式会社 Content reproduction device, television receiver, content reproduction method, content reproduction program, and recording medium
WO2012011213A1 (en) * 2010-07-21 2012-01-26 パナソニック株式会社 Image management device, image management method, program, recording medium, and integrated circuit for image management
JP5723367B2 (en) * 2010-07-21 2015-05-27 パナソニック インテレクチュアル プロパティ コーポレーション オブアメリカPanasonic Intellectual Property Corporation of America Image management apparatus, image management method, program, recording medium, and integrated circuit for image management
JP2013021462A (en) * 2011-07-08 2013-01-31 Fujitsu Mobile Communications Ltd Terminal device, picture imaging method for terminal device, and picture imaging device
JP2013046374A (en) * 2011-08-26 2013-03-04 Sanyo Electric Co Ltd Image processor
US10489898B2 (en) 2011-09-02 2019-11-26 Nikon Corporation Electronic camera, image-processing device, and image-processing program
US9792677B2 (en) 2011-09-02 2017-10-17 Nikon Corporation Electronic camera, image processing device, and image processing program
WO2013031241A1 (en) * 2011-09-02 2013-03-07 株式会社ニコン Electronic camera, image-processing apparatus, and image-processing program
JP2013055455A (en) * 2011-09-02 2013-03-21 Nikon Corp Electronic camera, image processing device, and image processing program
CN106982331A (en) * 2011-09-02 2017-07-25 株式会社尼康 Camera and electronic equipment
US9591210B2 (en) 2012-03-06 2017-03-07 Sony Corporation Image processing face detection apparatus, method for controlling the same, and program
JP2012182841A (en) * 2012-06-11 2012-09-20 Canon Inc Image processing apparatus, control method for the same, and program
JP2012256327A (en) * 2012-06-27 2012-12-27 Olympus Imaging Corp Image display apparatus, image display method, and image display program
WO2014162788A1 (en) * 2013-04-02 2014-10-09 Necソリューションイノベータ株式会社 Facial-expression assessment device, dance assessment device, karaoke device, and game device
JP2019504514A (en) * 2016-11-29 2019-02-14 北京小米移動軟件有限公司Beijing Xiaomi Mobile Software Co.,Ltd. Photo composition method, apparatus, program, and recording medium
JP2018038090A (en) * 2017-12-07 2018-03-08 オリンパス株式会社 Image creation device, image creation method, image creation program, and image creation system

Similar Documents

Publication Publication Date Title
US6853401B2 (en) Digital camera having specifiable tracking focusing point
US7317485B1 (en) Digital still camera with composition advising function, and method of controlling operation of same
CN100539647C (en) Method for displaying face detection frame, method for displaying character information, and image-taking device
US8659619B2 (en) Display device and method for determining an area of importance in an original image
CN100465757C (en) A photographic device, a method of processing information
JP4904108B2 (en) Imaging apparatus and image display control method
US20110134273A1 (en) Imaging apparatus, control method of imaging apparatus, and computer program
JP2009094725A (en) Imaging method and device
JP4671133B2 (en) Image processing device
US20090102940A1 (en) Imaging device and imaging control method
US8957981B2 (en) Imaging device for capturing self-portrait images
JP4683339B2 (en) Image trimming device
JP3959690B2 (en) Imaging apparatus and imaging method
CN102231801B (en) An electronic camera and an image processing apparatus
JP2007081772A (en) Image processor, method and program
JP4208726B2 (en) Image display device, image display method, and image display program
JP4135100B2 (en) Imaging device
US7720369B2 (en) Image taking apparatus
KR20090039631A (en) Composition determining apparatus, composition determining method, and program
JP5056061B2 (en) Imaging device
JP2006033241A (en) Image pickup device and image acquiring means
JP2006116943A (en) Method and system for printing
JP4640456B2 (en) Image recording apparatus, image recording method, image processing apparatus, image processing method, and program
CN101313565B (en) Electronic camera and image processing device
JP2004040724A (en) Camera apparatus and photographing method of object

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20050610

A761 Written withdrawal of application

Free format text: JAPANESE INTERMEDIATE CODE: A761

Effective date: 20060228