WO2010007865A1 - Imaging apparatus, imaging method, and program
- Publication number
- WO2010007865A1 (PCT/JP2009/061646)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- facial expression
- value
- expression value
- determination
- priority
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
- G06V40/175—Static expression
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00326—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus
- H04N1/00328—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus with an apparatus processing optically-read information
- H04N1/00336—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus with an apparatus processing optically-read information with an apparatus performing pattern recognition, e.g. of a face or a geographic feature
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/21—Intermediate information storage
- H04N1/2104—Intermediate information storage for one or a few pictures
- H04N1/2112—Intermediate information storage for one or a few pictures using still video cameras
- H04N1/2137—Intermediate information storage for one or a few pictures using still video cameras with temporary storage before final recording, e.g. in a frame buffer
- H04N1/2141—Intermediate information storage for one or a few pictures using still video cameras with temporary storage before final recording, e.g. in a frame buffer in a multi-frame buffer
- H04N1/2145—Intermediate information storage for one or a few pictures using still video cameras with temporary storage before final recording, e.g. in a frame buffer in a multi-frame buffer of a sequence of images for selection of a single frame before final recording, e.g. from a continuous sequence captured before and after shutter-release
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/765—Interface circuits between an apparatus for recording and another apparatus
- H04N5/77—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
- H04N5/772—Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/00127—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
- H04N1/00326—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus
- H04N1/00328—Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus with an apparatus processing optically-read information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2101/00—Still video cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/0077—Types of the still picture apparatus
- H04N2201/0084—Digital still camera
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/78—Television signal recording using magnetic recording
- H04N5/781—Television signal recording using magnetic recording on disks or drums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
- H04N5/84—Television signal recording using optical recording
- H04N5/85—Television signal recording using optical recording on discs or drums
Definitions
- The present invention relates to an imaging apparatus, an imaging method, and a program for capturing images using an image sensor, and in particular to an imaging apparatus, imaging method, and program that automatically store an image obtained by imaging by determining the state of the subject.
- Here, "shooting" refers to the operation of storing the video obtained by imaging in a storage unit included in the imaging apparatus.
- In the technique disclosed in Patent Document 1, a human face is detected from the video signal of a subject obtained by imaging, the detected facial expression is evaluated against a certain condition, and shooting is performed automatically when the condition is satisfied. When the subject includes a plurality of persons, each person's facial expression is evaluated against the condition, and shooting is performed automatically when every evaluated expression satisfies it.
- As described above, with the technique disclosed in Patent Document 1, even when the subject includes a plurality of persons, shooting is performed automatically when the facial expressions of those persons become predetermined expressions (for example, smiles), so it appears possible to perform the shooting the user intends.
- To achieve this, the present invention provides: storage means for storing a video of the subject; detection means for detecting a human face from a signal obtained by imaging the subject; evaluation means for, when the detection means detects a plurality of human faces, evaluating each of the detected facial expressions and calculating a plurality of facial expression values obtained by quantifying the evaluated expressions; and control means for calculating, from the plurality of facial expression values, a facial expression value for determination used to determine whether or not to store the video of the subject, and for storing the video of the subject in the storage means when the facial expression value for determination is equal to or greater than a predetermined threshold.
- The present invention also provides an imaging method for an imaging apparatus having storage means for storing a video of a subject, the method comprising: detecting a human face from a signal obtained by imaging the subject; when a plurality of human faces are detected, evaluating each of the detected facial expressions and calculating a plurality of facial expression values obtained by quantifying the evaluated expressions; and calculating, from the plurality of facial expression values, a facial expression value for determination used to determine whether or not to store the video of the subject, and storing the video of the subject in the storage means when the facial expression value for determination is equal to or greater than a predetermined threshold.
- The present invention further provides a program for an imaging apparatus having storage means for storing a video of a subject, the program causing the apparatus to realize: a function of detecting a human face from a signal obtained by imaging the subject; a function of, when a plurality of human faces are detected, evaluating each of the detected facial expressions and calculating a plurality of facial expression values obtained by quantifying the evaluated expressions; and a function of calculating, from the plurality of facial expression values, a facial expression value for determination used to determine whether or not to store the video of the subject, and storing the video of the subject in the storage means when the facial expression value for determination is equal to or greater than a predetermined threshold.
- Because the present invention is configured as described above, shooting intended by the user can be performed automatically, without missing a photo opportunity, in a variety of shooting scenes.
- FIG. 2 is a flowchart for explaining an example of the operation when a subject including a plurality of persons is photographed with the imaging apparatus shown in FIG. 1. FIGS. 3 to 7 are diagrams each showing a plurality of faces enclosed by face display frames displayed on the display unit shown in FIG. 1.
- FIG. 9 is a flowchart for explaining an example of the operation when a subject including a plurality of persons is photographed with the imaging apparatus shown in FIG. 8. FIG. 10 is a diagram showing a plurality of faces enclosed by face display frames displayed on the display unit shown in FIG. 8. A further flowchart explains another example of the operation of the imaging apparatus shown in FIG. 1, together with a corresponding diagram of faces enclosed by face display frames.
- FIG. 1 is a diagram showing a first embodiment of an imaging apparatus according to the present invention.
- the imaging apparatus 100 includes a camera module 101, a control unit 110, a buffer memory 111, a display unit 112, and a recording memory 113 that is a storage unit.
- The camera module 101 includes an optical lens unit 102, an image sensor unit 103, an AD (analog/digital) conversion unit 104, a video signal processing unit 105, an IF (interface) unit 106, a drive pulse supply unit 107, and a microcomputer 108.
- the optical lens unit 102 collects the light reflected by the subject.
- the image sensor unit 103 converts the light collected by the optical lens unit 102 into an analog video signal, and transmits the converted analog video signal to the AD conversion unit 104.
- Upon receiving the analog video signal transmitted from the image sensor unit 103, the AD conversion unit 104 converts it into a digital video signal and transmits the digital video signal to the video signal processing unit 105.
- the video signal processing unit 105 includes a video processing unit 105a, a detection unit 105b, and an evaluation unit 105c.
- The video processing unit 105a performs video processing such as color interpolation, color correction, and image quality adjustment on the video based on the digital video signal transmitted from the AD conversion unit 104, and also performs video processing for displaying a face display frame that clearly indicates the human face detected by the detection unit 105b.
- When the video processing unit 105a receives from the microcomputer 108 a video storage instruction for storing the video based on the digital video signal received from the AD conversion unit 104 in the buffer memory 111, it transmits that digital video signal to the buffer memory 111 via the IF unit 106 and the internal bus 109.
- the detecting unit 105b detects a human face from a video based on the digital video signal transmitted from the AD converting unit 104.
- The evaluation unit 105c evaluates the facial expression of each person detected by the detection unit 105b, calculates a facial expression value obtained by quantifying the evaluated expression, and transmits the calculated facial expression value to the control unit 110 via the IF unit 106 and the internal bus 109.
- the IF unit 106 mediates transmission / reception of control signals between the microcomputer 108 and the control unit 110.
- the drive pulse supply unit 107 supplies a pulse for driving the image sensor unit 103.
- The microcomputer 108 controls the video signal processing unit 105 and the drive pulse supply unit 107 based on control signals received by the camera module 101 via the IF unit 106. Upon receiving a shooting instruction from the control unit 110, the microcomputer 108 transmits to the video signal processing unit 105 a video storage instruction for storing the video based on the digital video signal in the buffer memory 111.
- the internal bus 109 mediates transmission / reception of digital video signals and control signals between the camera module 101 and the control unit 110, the buffer memory 111, the display unit 112, and the recording memory 113.
- the control unit 110 controls the operation of the imaging apparatus 100, and typical operations include the following.
- The control unit 110 stores a threshold value that serves as the reference for determining whether or not to perform shooting.
- It receives the facial expression values transmitted from the video signal processing unit 105 via the IF unit 106 and the internal bus 109.
- It calculates a facial expression value for determination used to determine whether or not to perform shooting.
- It compares the calculated facial expression value for determination with the threshold value to determine whether or not to perform shooting.
- When shooting is to be performed, it transmits a shooting instruction to the microcomputer 108 via the internal bus 109 and the IF unit 106.
- the buffer memory 111 receives the digital video signal transmitted from the video signal processing unit 105 via the IF unit 106 and the internal bus 109, and temporarily stores video based on the received digital video signal.
- the display unit 112 receives the digital video signal transmitted from the video signal processing unit 105 via the IF unit 106, the internal bus 109, and the buffer memory 111, and displays a video based on the received digital video signal.
- the recording memory 113 stores an image based on the digital image signal stored in the buffer memory 111 according to an instruction from the control unit 110.
- FIG. 2 is a flowchart for explaining an example of the operation when the imaging apparatus 100 shown in FIG. 1 captures a subject including a plurality of persons.
- First, the imaging apparatus 100 is turned on by the user and starts up (step S1).
- the control unit 110 executes initialization.
- initial values such as an operation clock, a shooting angle of view, and an image quality setting of the imaging device 100 are set, and shooting with the imaging device 100 becomes possible.
- the optical lens unit 102 condenses the light reflected by the subject.
- the image sensor unit 103 converts the light collected by the optical lens unit 102 into an analog video signal, and transmits the converted analog video signal to the AD conversion unit 104.
- the AD conversion unit 104 that has received the analog video signal transmitted from the image sensor unit 103 converts the received analog video signal into a digital video signal and transmits the digital video signal to the video signal processing unit 105.
- The video processing unit 105a of the video signal processing unit 105, having received the digital video signal transmitted from the AD conversion unit 104, performs video processing such as color interpolation, color correction, and image quality adjustment on the video based on the received signal, and transmits the processed signal to the buffer memory 111.
- the display unit 112 that has received the video signal transmitted from the video signal processing unit 105 via the IF unit 106, the internal bus 109, and the buffer memory 111 displays a subject image based on the received video signal (step S2).
- the user determines the composition of the subject while viewing the video of the subject output on the display unit 112.
- the detection unit 105b of the video signal processing unit 105 detects a human face from the digital video signal received from the AD conversion unit 104, and notifies the video processing unit 105a to that effect.
- Upon receiving the notification from the detection unit 105b, the video processing unit 105a transmits to the buffer memory 111 a digital video signal that has undergone video processing for displaying a face display frame on each detected face.
- The display unit 112, having received the video signal transmitted from the video signal processing unit 105 via the IF unit 106, the internal bus 109, and the buffer memory 111, displays a video in which the face display frames are superimposed on the video of the subject displayed in step S2 (step S3).
- FIG. 3 is a diagram showing an example of a plurality of faces surrounded by a plurality of face display frames displayed on the display unit 112 shown in FIG.
- the description of portions other than the face surrounded by the face display frame is omitted for easy understanding.
- the display unit 112 shown in FIG. 1 displays face display frames 31 to 33 on the face portions of a plurality of persons as subjects.
- the evaluation unit 105c of the video signal processing unit 105 evaluates each facial expression detected by the detection unit 105b and calculates each facial expression value (step S4).
- As a method of calculating the facial expression value, for example, a sample video of an expression may be held in the video signal processing unit 105, and the value calculated by comparing the sample video with the detected facial expression of the person.
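The comparison above can be sketched as follows. This is a minimal hypothetical illustration (the patent does not specify the comparison algorithm), assuming each detected face and each sample is reduced to a numeric feature vector with components in [0, 1]; `expression_value` is an invented name:

```python
# Hypothetical sketch: a facial expression value as a percentage score of
# how closely a detected face's feature vector matches a stored sample
# (e.g. a reference smile). Feature extraction itself is out of scope here.

def expression_value(detected, sample):
    """Return a 0-100 score: 100 means the detected features match the
    sample exactly, 0 means maximal deviation."""
    if not sample or len(detected) != len(sample):
        raise ValueError("feature vectors must be non-empty and equal length")
    # Mean absolute deviation, normalized to [0, 1] for features in [0, 1].
    deviation = sum(abs(d - s) for d, s in zip(detected, sample)) / len(sample)
    return round((1.0 - deviation) * 100)

# A face whose features exactly match the sample scores 100%.
print(expression_value([0.8, 0.5, 0.3], [0.8, 0.5, 0.3]))  # 100
```

Any similarity measure that maps onto a percentage scale would fit the scheme described in the text; the mean absolute deviation here is only one such choice.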
- the evaluation unit 105c transmits the calculated facial expression values to the control unit 110 via the IF unit 106 and the internal bus 109 (step S5).
- the control unit 110 that has received the plurality of facial expression values transmitted from the video signal processing unit 105 calculates a facial expression value for determination from the received plurality of facial expression values using a predetermined method (step S6).
- As for the predetermined method, there are a plurality of methods for calculating the facial expression value for determination from a plurality of facial expression values, and the user can select a method according to the shooting scene. The calculation methods are described later.
- control unit 110 compares the calculated facial expression value for determination with the threshold stored in itself (step S7).
- When the facial expression value for determination is equal to or greater than the threshold, the control unit 110 transmits a shooting instruction to the microcomputer 108 via the internal bus 109 and the IF unit 106 (step S8).
- the microcomputer 108 that has received the shooting instruction transmitted from the control unit 110 transmits a video storage instruction for storing the video based on the digital video signal received by the video signal processing unit 105 in the buffer memory 111 to the video signal processing unit 105. (Step S9).
- When the facial expression value for determination is lower than the threshold, the control unit 110 does not transmit a shooting instruction to the microcomputer 108. In this case, the operation returns to step S4: the video signal processing unit 105 continues calculating facial expression values and transmitting them to the control unit 110, and the control unit 110 continues calculating the facial expression value for determination from the received values using the predetermined method and comparing it with the threshold.
- The video processing unit 105a of the video signal processing unit 105, having received the video storage instruction transmitted from the microcomputer 108, transmits the digital video signal received from the AD conversion unit 104 to the buffer memory 111 via the IF unit 106 and the internal bus 109.
- the buffer memory 111 that has received the digital video signal transmitted from the video signal processing unit 105 via the IF unit 106 and the internal bus 109 temporarily stores a video based on the received digital video signal (step S10).
- The control unit 110 acquires the video stored in the buffer memory 111, performs the video processing necessary for storing it, and stores the processed video in the recording memory 113 (step S11). Shooting is completed by this operation.
- The above is the operation when a subject including a plurality of persons is photographed with the imaging apparatus of the present embodiment.
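The flow of steps S4 to S11 can be condensed into a loop. The following is a hypothetical sketch, not the patent's implementation: the callables `detect_faces`, `evaluate`, and `aggregate` are invented stand-ins for the detection unit 105b, the evaluation unit 105c, and the control unit 110's determination-value calculation:

```python
# Hypothetical sketch of the automatic-shooting loop (steps S4 to S11),
# assuming helper callables for face detection, expression evaluation,
# and aggregation into a facial expression value for determination.

def auto_shoot(frames, detect_faces, evaluate, aggregate, threshold=70):
    """Iterate over video frames; return the first frame whose
    determination value reaches the threshold, else None."""
    for frame in frames:
        faces = detect_faces(frame)
        if not faces:
            continue
        values = [evaluate(face) for face in faces]   # step S4
        determination = aggregate(values)             # step S6
        if determination >= threshold:                # step S7
            return frame                              # steps S8-S11: store it
    return None

# Toy usage: each "frame" here is just its list of per-face percentages.
frames = [[40, 55], [80, 75]]
best = auto_shoot(frames,
                  detect_faces=lambda f: f,   # frame is its own face list
                  evaluate=lambda v: v,       # values are already percentages
                  aggregate=min)              # require every face >= threshold
print(best)  # [80, 75]
```

Passing `min`, `max`, or a mean as `aggregate` corresponds to the selectable calculation methods described later in the text.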
- Next, the calculation of the facial expression value for determination performed by the control unit 110 in step S6 of the operation flow described above, and the comparison between the calculated value and the threshold performed by the control unit 110 in step S7, will be described with reference to the drawings.
- The facial expression value is calculated by comparing the detected facial expression with the sample video held by the video signal processing unit 105. Specifically, the video signal processing unit 105 calculates the ratio by which the detected facial expression deviates from the sample video, and uses a value indicating that ratio, expressed as a percentage, as the facial expression value.
- the threshold is set to 70%.
- FIG. 4 is a diagram showing a plurality of faces surrounded by face display frames displayed on the display unit 112 shown in FIG. 1; (a) shows the case where the highest facial expression value among those of the plurality of faces is lower than the threshold, and (b) shows the case where the highest facial expression value is equal to or greater than the threshold.
- In FIG. 4, portions other than the faces surrounded by the face display frames are omitted for ease of understanding. The same applies to FIGS. 5 to 7 referred to in the following description.
- In the case of FIG. 4(a), in step S7 of the operation flow described above, the control unit 110 determines that the facial expression value for determination (here, the highest facial expression value) is lower than the threshold and does not transmit a shooting instruction to the microcomputer 108.
- In the case of FIG. 4(b), the control unit 110 determines that the facial expression value for determination is equal to or greater than the threshold and transmits a shooting instruction to the microcomputer 108.
- FIG. 5 is a diagram showing a plurality of faces surrounded by face display frames displayed on the display unit 112 shown in FIG. 1; (a) shows the case where the lowest facial expression value among those of the plurality of faces is lower than the threshold, and (b) shows the case where the lowest facial expression value is equal to or greater than the threshold.
- In the case of FIG. 5(a), in step S7 of the operation flow described above, the control unit 110 determines that the facial expression value for determination (here, the lowest facial expression value) is lower than the threshold and does not transmit a shooting instruction to the microcomputer 108.
- In the case of FIG. 5(b), the control unit 110 determines that the facial expression value for determination is equal to or greater than the threshold and transmits a shooting instruction to the microcomputer 108.
- FIG. 6 is a diagram showing a plurality of faces surrounded by face display frames displayed on the display unit 112 shown in FIG. 1; (a) shows the case where the average of the facial expression values of the plurality of faces is lower than the threshold, and (b) shows the case where the average is equal to or greater than the threshold.
- In step S7 of the operation flow described above, in the case of FIG. 6(a), the control unit 110 determines that the facial expression value for determination (here, the average) is lower than the threshold and does not transmit a shooting instruction to the microcomputer 108; in the case of FIG. 6(b), it determines that the value is equal to or greater than the threshold and transmits a shooting instruction to the microcomputer 108.
- FIG. 7 is a diagram showing a plurality of faces surrounded by face display frames displayed on the display unit 112 shown in FIG. 1; here the facial expression value for determination is calculated after excluding the highest and lowest facial expression values, with (a) showing the case where the resulting value is lower than the threshold and (b) the case where it is equal to or greater than the threshold.
- In step S7 of the operation flow described above, in the case of FIG. 7(a), the control unit 110 determines that the facial expression value for determination is lower than the threshold and does not transmit a shooting instruction to the microcomputer 108; in the case of FIG. 7(b), it determines that the value is equal to or greater than the threshold and transmits a shooting instruction to the microcomputer 108.
- The facial expression values to be excluded are not limited to the two values of the highest and the lowest; a predetermined number of the highest and lowest facial expression values can be excluded from the plurality of facial expression values.
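The selectable calculation methods described above (highest value, lowest value, average, and average after excluding the extremes) can be sketched in one place. This is an illustrative sketch; `method` and `trim` are invented parameter names, and the fallback behavior for very small face counts is an assumption:

```python
# Hypothetical sketch of the aggregation methods for deriving a single
# facial expression value for determination from per-face values.

def determination_value(values, method="average", trim=1):
    """Aggregate per-face expression values (percentages) into one
    determination value according to the selected method."""
    if not values:
        raise ValueError("at least one expression value is required")
    if method == "highest":
        return max(values)
    if method == "lowest":
        return min(values)
    if method == "average":
        return sum(values) / len(values)
    if method == "trimmed":
        # Drop the `trim` highest and `trim` lowest values, then average
        # the rest (fall back to a plain average if too few values remain).
        kept = sorted(values)[trim:len(values) - trim]
        kept = kept or values
        return sum(kept) / len(kept)
    raise ValueError(f"unknown method: {method}")

values = [30, 75, 80, 95]
print(determination_value(values, "highest"))   # 95
print(determination_value(values, "lowest"))    # 30
print(determination_value(values, "average"))   # 70.0
print(determination_value(values, "trimmed"))   # 77.5 (average of 75 and 80)
```

The "lowest" method corresponds to requiring every face to reach the threshold, while "trimmed" tolerates a few outliers, matching the shooting-scene trade-offs the text describes.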
- the threshold value that is a reference for photographing is described as 70%, but this threshold value can be arbitrarily set by the user.
- In the first embodiment described above, the control unit 110 has only one threshold value serving as the reference for determining whether or not to perform shooting.
- In the second embodiment, described next, a priority is assigned to one of the facial expression values, and a plurality of threshold values are provided: a priority facial expression value threshold for the facial expression value to which priority is assigned, and a non-priority facial expression value threshold for the facial expression values to which no priority is assigned.
- FIG. 8 is a diagram showing a second embodiment of the imaging apparatus of the present invention.
- Compared with the configuration shown in FIG. 1, the video signal processing unit 105 includes a priority face determination unit 105d in addition to the video processing unit 105a, the detection unit 105b, and the evaluation unit 105c.
- This priority face determination unit 105d determines the priority of the facial expression value of the person detected by the detection unit 105b.
- FIG. 9 is a flowchart for explaining an example of the operation when the imaging apparatus 100 shown in FIG. 8 photographs a subject including a plurality of persons.
- The operations from step S51 to step S54 are the same as those from step S1 to step S4 in the flowchart shown in FIG. 2.
- the priority face determination unit 105d calculates the area of the face display frame displayed for each of the human faces detected by the detection unit 105b (step S55).
- The priority face determination unit 105d determines the facial expression value of the face having the largest face display frame to be the priority facial expression value, that is, the facial expression value with high priority (step S56), and notifies the evaluation unit 105c of the determined priority facial expression value.
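Steps S55 and S56 can be sketched as follows, assuming (hypothetically) that each face display frame is available as a width and height in pixels together with its facial expression value; `pick_priority` is an invented name:

```python
# Hypothetical sketch of steps S55-S56: choose the priority face as the
# one with the largest face display frame, using frame area as a proxy
# for prominence. Faces are (width, height, expression_value) tuples.

def pick_priority(faces):
    """Return the index of the face whose display frame has the largest
    area; its expression value becomes the priority facial expression value."""
    if not faces:
        raise ValueError("no faces detected")
    areas = [w * h for (w, h, _value) in faces]
    return areas.index(max(areas))

faces = [(40, 50, 50), (80, 90, 80), (42, 48, 50)]  # (w, h, value)
i = pick_priority(faces)
print(i, faces[i][2])  # 1 80 -> the face with the 80x90 frame is prioritized
```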
- The evaluation unit 105c transmits the calculated facial expression values to the control unit 110 via the IF unit 106 and the internal bus 109, attaching to the priority facial expression value information indicating that it is the priority value (step S57).
- The control unit 110, having received the facial expression values transmitted from the video signal processing unit 105, extracts the priority facial expression value from the received values (step S58).
- Having extracted the priority facial expression value, the control unit 110 compares it with the priority facial expression value threshold stored in itself, and compares each non-priority facial expression value (that is, each facial expression value other than the priority value) with the stored non-priority facial expression value threshold (step S59).
- FIG. 10 is a diagram showing a plurality of faces surrounded by face display frames displayed on the display unit 112 shown in FIG. 8; (a) shows the case where the priority facial expression value is equal to or greater than the priority facial expression value threshold and the non-priority facial expression values are equal to or greater than the non-priority facial expression value threshold, and (b) shows the case where a non-priority facial expression value is lower than the non-priority facial expression value threshold.
- the priority expression value threshold is set to 70%
- the non-priority expression value threshold is set to 50%.
- the description of parts other than the face surrounded by the face display frame is omitted for easy understanding.
- In FIG. 10(a), the facial expression values of the three faces 81 to 83 are 50%, 80%, and 50%, respectively.
- the face display frame displayed on the face 82 has a larger area than the face display frames displayed on the other faces 81 and 83. Therefore, 80%, which is the facial expression value of the face 82, becomes the priority facial expression value, and this value is equal to or greater than the priority facial expression value threshold.
- the expression values of the face 81 and the face 83 are both 50%, which is equal to or greater than the non-priority facial expression value threshold. Therefore, the control unit 110 transmits a shooting instruction to the microcomputer 108 (step S60).
- the expression values of the three faces 84 to 86 are 0%, 80%, and 50%, respectively.
- the face display frame displayed on the face 85 has a larger area than the face display frames displayed on the other faces 84 and 86. Therefore, 80%, which is the facial expression value of the face 85, becomes the priority facial expression value, and this value is equal to or greater than the priority facial expression value threshold.
- the facial expression value of the face 84 is 0%, which is lower than the non-priority facial expression value threshold. Therefore, the control unit 110 does not transmit a shooting instruction to the microcomputer 108.
- the video signal processing unit 105 continues the calculation of the expression value and the area of the face display frame and the operation of transmitting the calculated expression value to the control unit 110. Further, the control unit 110 continues the operation of extracting the priority facial expression value from the plurality of facial expression values received from the video signal processing unit 105 and comparing the extracted priority facial expression value with the priority facial expression value threshold.
- Steps S61 to S63, which are subsequent operations, are the same as steps S9 to S11 in the flowchart shown in FIG.
- in the above description, the priority is determined by the area of the face display frame; however, the method of determining which facial expression value is used as the priority facial expression value is not limited to this.
- in the above description, each non-priority facial expression value is compared with the non-priority facial expression value threshold. Alternatively, the average value of the facial expression values other than the priority facial expression value may be used as the non-priority facial expression value and compared with the non-priority facial expression value threshold.
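The decision in steps S58 to S60, together with the averaging variant just mentioned, can be sketched as follows. This is an illustrative sketch, not code from the patent: the function names, the (value, area) tuple input, and the fractional representation of expression values are assumptions.

```python
# Sketch of the second embodiment's shooting decision (steps S58 to S60).
# Illustrative only: names, tuple input, and [0, 1] value range are assumptions.

PRIORITY_THRESHOLD = 0.70      # priority facial expression value threshold (70%)
NON_PRIORITY_THRESHOLD = 0.50  # non-priority facial expression value threshold (50%)

def should_shoot(faces, use_average=False):
    """faces: list of (expression_value, face_frame_area) tuples."""
    # The face with the largest face display frame supplies the priority value.
    ranked = sorted(faces, key=lambda f: f[1], reverse=True)
    priority_value = ranked[0][0]
    non_priority = [value for value, _area in ranked[1:]]
    if priority_value < PRIORITY_THRESHOLD:
        return False
    if not non_priority:
        return True  # single face: only the priority condition applies
    if use_average:
        # Variant: compare the average of the non-priority values instead.
        return sum(non_priority) / len(non_priority) >= NON_PRIORITY_THRESHOLD
    return all(v >= NON_PRIORITY_THRESHOLD for v in non_priority)
```

With the FIG. 10 values, `should_shoot([(0.5, 10), (0.8, 30), (0.5, 12)])` corresponds to case (a) and allows shooting, while lowering the first face's value to 0.0 corresponds to case (b) and suppresses it.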
- as described above, in the present embodiment, a priority facial expression value having a high priority is selected from the plurality of facial expression values, and shooting is performed based on that priority facial expression value. It is therefore possible to automatically perform shooting that reflects the user's intention in a shooting scene in which the facial expression of a specific person among a plurality of persons is to be emphasized.
- in the first and second embodiments described above, the threshold serving as the reference for determining whether shooting is performed is a fixed value during one shooting. In the third embodiment described below, a case where the threshold is changed based on the plurality of facial expression values received by the control unit 110 will be described.
- the initially set threshold is 70%.
- the highest facial expression value is compared with the lowest facial expression value, and when the facial expression value difference, i.e., the difference between the two, is equal to or greater than a certain value, the initially set threshold is changed. Specifically, when the facial expression value difference is 80% or more, the threshold is changed from 70% to 50%.
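The threshold change rule above (relax the 70% threshold to 50% when the spread of expression values reaches 80 points) can be sketched as follows; the function and constant names are illustrative assumptions, and the constants mirror the example values in the text.

```python
# Sketch of the third embodiment's threshold change (steps S106 and S107).
# Constants mirror the example values in the text; names are illustrative.
INITIAL_THRESHOLD = 70   # initially set threshold (%)
CHANGED_THRESHOLD = 50   # threshold after the change (%)
DIFF_LIMIT = 80          # expression value difference that triggers the change (%)

def select_threshold(expression_values):
    """Return the threshold to use for this determination."""
    diff = max(expression_values) - min(expression_values)
    return CHANGED_THRESHOLD if diff >= DIFF_LIMIT else INITIAL_THRESHOLD
```

For the faces 91 to 93 in FIG. 12(a) (80%, 50%, 0%) the difference is 80, so the threshold becomes 50%; for the faces 94 to 96 in FIG. 12(b) (100%, 50%, 70%) the difference is 50 and the threshold stays at 70%.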
- the configuration of the imaging device is the same as that of the imaging device 100 shown in FIG.
- FIG. 11 is a flowchart for explaining another example of the operation when a subject including a plurality of persons is photographed by the imaging apparatus shown in FIG.
- steps S101 to S105 are the same as steps S1 to S5 in the flowchart shown in FIG.
- the control unit 110 that has received the plurality of facial expression values transmitted from the video signal processing unit 105 compares the highest facial expression value with the lowest facial expression value among the received facial expression values (step S106).
- FIG. 12 is a diagram showing a plurality of faces surrounded by a plurality of face display frames displayed on the display unit 112 shown in FIG. 1; (a) shows a case where the threshold is changed, and (b) shows a case where the threshold is not changed. In FIG. 12, the description of parts other than the faces surrounded by the face display frames is omitted for ease of understanding.
- the expression values of the three faces 91 to 93 are 80%, 50%, and 0%, respectively.
- the face 91 (expression value 80%) shows the highest expression value
- the face 93 (expression value 0%) shows the lowest expression value.
- the difference in values is 80% or more.
- the control unit 110 changes the threshold value (step S107).
- the control unit 110 then calculates the determination facial expression value (step S108).
- here, since the average value of the plurality of facial expression values is used as the determination facial expression value, the average of the facial expression values of the faces 91 to 93 is calculated.
- the expression values of the three faces 94 to 96 are 100%, 50%, and 70%, respectively.
- the face 94 shows the highest expression value
- the face 95 shows the lowest expression value.
- the difference in facial expression values is less than 80%.
- the control unit 110 proceeds to the operation of step S108 without changing the threshold, and calculates the determination facial expression value.
- Steps S109 to S113, which are subsequent operations, are the same as steps S7 to S11 of the flowchart shown in FIG.
- in the above description, the case where the highest facial expression value is compared with the lowest facial expression value and the threshold is changed when the difference is equal to or greater than a certain value has been described. However, the values to be compared are not limited to the highest and lowest facial expression values. For example, the threshold may be changed when the difference between the lowest facial expression value and the second lowest facial expression value is equal to or greater than a certain value.
- as described above, in the present embodiment, the threshold is automatically changed when the difference between facial expression values among the plurality of facial expression values is large. Shooting can therefore be performed automatically even when there are individual differences among the facial expressions of the plurality of persons in the subject.
- in the above embodiments, the case where the number of detected human faces is three has been described as an example; however, the number of faces is not limited to three and may be more or fewer.
- the imaging device 100 can be applied to any imaging device having a configuration using the camera module 101 shown in FIGS., such as an electronic still camera or a camera-equipped mobile phone.
- in the above description, the detection unit, the evaluation unit, and the priority face determination unit are configured in the video signal processing unit 105; however, they may also be implemented in the control unit 110.
- in addition to being realized by the dedicated hardware described above, the processing in the imaging apparatus may be realized by recording a program for performing the processing on a recording medium readable by the imaging apparatus, and having the imaging apparatus read and execute the program recorded on that medium. The recording medium readable by the imaging apparatus includes removable recording media such as a floppy disk, a magneto-optical disk, a DVD, and a CD, as well as an HDD built into the imaging apparatus.
Abstract
Description
storage means for storing video of a subject;
detection means for detecting a person's face from a signal obtained by capturing the video of the subject;
evaluation means for, when the detection means detects the faces of a plurality of persons, evaluating the facial expression of each of the detected faces and calculating a plurality of facial expression values obtained by quantifying the evaluated facial expressions; and
control means for calculating, from the plurality of facial expression values, a determination facial expression value for determining whether to store the video of the subject, and causing the storage means to store the video of the subject when the determination facial expression value is equal to or greater than a predetermined threshold.
a process of detecting a person's face from a signal obtained by capturing the video of the subject;
a process of, when the faces of a plurality of persons are detected, evaluating the facial expression of each of the detected faces and calculating a plurality of facial expression values obtained by quantifying the evaluated facial expressions; and
a process of calculating, from the plurality of facial expression values, a determination facial expression value for determining whether to store the video of the subject, and causing the storage means to store the video of the subject when the determination facial expression value is equal to or greater than a predetermined threshold.
a function of detecting a person's face from a signal obtained by capturing the video of the subject;
a function of, when the faces of a plurality of persons are detected, evaluating the facial expression of each of the detected faces and calculating a plurality of facial expression values obtained by quantifying the evaluated facial expressions; and
a function of calculating, from the plurality of facial expression values, a determination facial expression value for determining whether to store the video of the subject, and causing the storage means to store the video of the subject when the determination facial expression value is equal to or greater than a predetermined threshold.
(First Embodiment)
FIG. 1 is a diagram showing a first embodiment of the imaging apparatus of the present invention.
(1) Stores a threshold serving as a reference for determining whether to perform shooting.
(2) Receives the facial expression values transmitted from the video signal processing unit 105 via the IF unit 106 and the internal bus 109.
(3) Calculates, from the received facial expression values, a determination facial expression value used to determine whether to perform shooting.
(4) Compares the calculated determination facial expression value with the threshold and determines whether to perform shooting.
(5) When it determines that shooting is to be performed, transmits a shooting instruction for instructing shooting to the microcomputer 108 via the internal bus 109 and the IF unit 106.
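Operations (3) to (5) above, combined with the aggregation choices listed in the claims (highest, lowest, or average of the facial expression values), might be sketched as follows. The function names and the `mode` parameter are illustrative assumptions, not part of the patent.

```python
# Sketch of the control unit's determination (operations (3) to (5)).
# The aggregation choice is left open; three claimed variants are shown.

def determination_value(expression_values, mode="average"):
    """Derive the determination facial expression value from the per-face values."""
    if mode == "highest":
        return max(expression_values)
    if mode == "lowest":
        return min(expression_values)
    # default: average of all facial expression values
    return sum(expression_values) / len(expression_values)

def should_instruct_shooting(expression_values, threshold=70, mode="average"):
    """True when the determination value reaches the stored threshold."""
    return determination_value(expression_values, mode) >= threshold
```

For example, with per-face values of 80%, 50%, and 80%, the average is exactly 70, so a shooting instruction would be issued under the default mode, while the "lowest" mode would withhold it.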
(Second Embodiment)
In the first embodiment described above, the control unit 110 stores only one threshold serving as the reference for determining whether to perform shooting. In the second embodiment described below, a case will be described in which a priority is attached to a facial expression value and a plurality of thresholds are provided: a priority facial expression value threshold for the facial expression value to which the priority is attached, and a non-priority facial expression value threshold for the facial expression values to which no priority is attached.
(Third Embodiment)
In the first and second embodiments described above, the threshold serving as the reference for determining whether shooting is performed is a fixed value during one shooting. In the third embodiment described below, a case will be described in which the threshold is changed based on the plurality of facial expression values received by the control unit 110.
Claims (21)
- An imaging apparatus comprising:
storage means for storing video of a subject;
detection means for detecting a person's face from a signal obtained by capturing the video of the subject;
evaluation means for, when the detection means detects the faces of a plurality of persons, evaluating the facial expression of each of the detected faces and calculating a plurality of facial expression values obtained by quantifying the evaluated facial expressions; and
control means for calculating, from the plurality of facial expression values, a determination facial expression value for determining whether to store the video of the subject, and causing the storage means to store the video of the subject when the determination facial expression value is equal to or greater than a predetermined threshold.
- The imaging apparatus according to claim 1, wherein the control means uses the highest facial expression value among the plurality of facial expression values as the determination facial expression value.
- The imaging apparatus according to claim 1, wherein the control means uses the lowest facial expression value among the plurality of facial expression values as the determination facial expression value.
- The imaging apparatus according to claim 1, wherein the control means uses the average value of the plurality of facial expression values as the determination facial expression value.
- The imaging apparatus according to claim 1, wherein the control means uses, as the determination facial expression value, the average value of the remaining facial expression values obtained by excluding at least one facial expression value from the plurality of facial expression values.
- The imaging apparatus according to claim 1, further comprising priority face determination means for determining a priority face from the faces of the plurality of persons detected by the detection means, wherein the control means causes the storage means to store the video of the subject when a priority facial expression value, which is the facial expression value of the priority face, is equal to or greater than a predetermined priority facial expression value threshold, and a non-priority facial expression value, which is a facial expression value other than the priority facial expression value among the plurality of facial expression values, is equal to or greater than a predetermined non-priority facial expression value threshold.
- The imaging apparatus according to claim 1, wherein the control means selects at least two facial expression values from the plurality of facial expression values, calculates a facial expression value difference that is the difference between the selected facial expression values, and changes the predetermined threshold when the facial expression value difference is equal to or greater than a certain value.
- An imaging method in an imaging apparatus having storage means for storing video of a subject, the method comprising:
a process of detecting a person's face from a signal obtained by capturing the video of the subject;
a process of, when the faces of a plurality of persons are detected, evaluating the facial expression of each of the detected faces and calculating a plurality of facial expression values obtained by quantifying the evaluated facial expressions; and
a process of calculating, from the plurality of facial expression values, a determination facial expression value for determining whether to store the video of the subject, and causing the storage means to store the video of the subject when the determination facial expression value is equal to or greater than a predetermined threshold.
- The imaging method according to claim 8, comprising a process of using the highest facial expression value among the plurality of facial expression values as the determination facial expression value.
- The imaging method according to claim 8, comprising a process of using the lowest facial expression value among the plurality of facial expression values as the determination facial expression value.
- The imaging method according to claim 8, comprising a process of using the average value of the plurality of facial expression values as the determination facial expression value.
- The imaging method according to claim 8, comprising a process of using, as the determination facial expression value, the average value of the remaining facial expression values obtained by excluding at least one facial expression value from the plurality of facial expression values.
- The imaging method according to claim 8, comprising:
a process of determining a priority face from the detected faces of the plurality of persons; and
a process of causing the storage means to store the video of the subject when a priority facial expression value, which is the facial expression value of the priority face, is equal to or greater than a predetermined priority facial expression value threshold, and a non-priority facial expression value, which is a facial expression value other than the priority facial expression value among the plurality of facial expression values, is equal to or greater than a predetermined non-priority facial expression value threshold.
- The imaging method according to claim 8, comprising:
a process of selecting at least two facial expression values from the plurality of facial expression values;
a process of calculating a facial expression value difference that is the difference between the selected facial expression values; and
a process of changing the predetermined threshold when the facial expression value difference is equal to or greater than a certain value.
- A program for causing an imaging apparatus having storage means for storing video of a subject to realize:
a function of detecting a person's face from a signal obtained by capturing the video of the subject;
a function of, when the faces of a plurality of persons are detected, evaluating the facial expression of each of the detected faces and calculating a plurality of facial expression values obtained by quantifying the evaluated facial expressions; and
a function of calculating, from the plurality of facial expression values, a determination facial expression value for determining whether to store the video of the subject, and causing the storage means to store the video of the subject when the determination facial expression value is equal to or greater than a predetermined threshold.
- The program according to claim 15, for realizing a function of using the highest facial expression value among the plurality of facial expression values as the determination facial expression value.
- The program according to claim 15, for realizing a function of using the lowest facial expression value among the plurality of facial expression values as the determination facial expression value.
- The program according to claim 15, for realizing a function of using the average value of the plurality of facial expression values as the determination facial expression value.
- The program according to claim 15, for realizing a function of using, as the determination facial expression value, the average value of the remaining facial expression values obtained by excluding at least one facial expression value from the plurality of facial expression values.
- The program according to claim 15, for realizing:
a function of determining a priority face from the detected faces of the plurality of persons; and
a function of causing the storage means to store the video of the subject when a priority facial expression value, which is the facial expression value of the priority face, is equal to or greater than a predetermined priority facial expression value threshold, and a non-priority facial expression value, which is a facial expression value other than the priority facial expression value among the plurality of facial expression values, is equal to or greater than a predetermined non-priority facial expression value threshold.
- The program according to claim 15, for realizing:
a function of selecting at least two facial expression values from the plurality of facial expression values;
a function of calculating a facial expression value difference that is the difference between the selected facial expression values; and
a function of changing the predetermined threshold when the facial expression value difference is equal to or greater than a certain value.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010520815A JP5482654B2 (ja) | 2008-07-17 | 2009-06-25 | 撮像装置、撮像方法及びプログラム |
CN2009801280276A CN102100062A (zh) | 2008-07-17 | 2009-06-25 | 成像设备、成像方法和程序 |
US13/003,845 US20110109770A1 (en) | 2008-07-17 | 2009-06-25 | Imaging apparatus, imaging method, and program |
EP20090797793 EP2309723A4 (en) | 2008-07-17 | 2009-06-25 | IMAGING DEVICE, IMAGING METHOD, AND PROGRAM |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-186179 | 2008-07-17 | ||
JP2008186179 | 2008-07-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010007865A1 true WO2010007865A1 (ja) | 2010-01-21 |
Family
ID=41550273
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/061646 WO2010007865A1 (ja) | 2008-07-17 | 2009-06-25 | 撮像装置、撮像方法及びプログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20110109770A1 (ja) |
EP (1) | EP2309723A4 (ja) |
JP (2) | JP5482654B2 (ja) |
CN (1) | CN102100062A (ja) |
WO (1) | WO2010007865A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019159939A (ja) * | 2018-03-14 | 2019-09-19 | 京セラドキュメントソリューションズ株式会社 | 電子機器及び画像形成装置 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013186512A (ja) * | 2012-03-06 | 2013-09-19 | Sony Corp | 画像処理装置および方法、並びにプログラム |
US9241102B2 (en) * | 2012-12-05 | 2016-01-19 | Xerox Corporation | Video capture of multi-faceted documents |
WO2015017868A1 (en) * | 2013-08-02 | 2015-02-05 | Emotient | Filter and shutter based on image emotion content |
US9247136B2 (en) | 2013-08-21 | 2016-01-26 | Xerox Corporation | Automatic mobile photo capture using video analysis |
KR102192704B1 (ko) | 2013-10-22 | 2020-12-17 | 엘지전자 주식회사 | 영상 출력 장치 |
CN104899544B (zh) * | 2014-03-04 | 2019-04-12 | 佳能株式会社 | 图像处理装置和图像处理方法 |
US9456123B2 (en) | 2014-12-18 | 2016-09-27 | Xerox Corporation | Method and system to configure mobile electronic device settings using remote data store analytics |
CN106203332A (zh) * | 2016-07-08 | 2016-12-07 | 北京光年无限科技有限公司 | 基于智能机器人视觉识别人脸面部表情变化的方法及系统 |
CN107249100A (zh) * | 2017-06-30 | 2017-10-13 | 北京金山安全软件有限公司 | 一种拍照方法、装置、电子设备及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005234686A (ja) * | 2004-02-17 | 2005-09-02 | Fuji Xerox Co Ltd | 表情認識装置、表情認識方法、およびプログラム |
JP2006237803A (ja) * | 2005-02-23 | 2006-09-07 | Konica Minolta Photo Imaging Inc | 撮像システム、写真撮影スタジオ、撮像システムの制御方法 |
JP2008027086A (ja) * | 2006-07-19 | 2008-02-07 | Sony Computer Entertainment Inc | 表情誘導装置および表情誘導方法、表情誘導システム |
JP2008042319A (ja) | 2006-08-02 | 2008-02-21 | Sony Corp | 撮像装置および方法、表情評価装置およびプログラム |
JP2008186179A (ja) | 2007-01-29 | 2008-08-14 | Hitachi Ltd | ストレージシステム |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7298412B2 (en) * | 2001-09-18 | 2007-11-20 | Ricoh Company, Limited | Image pickup device, automatic focusing method, automatic exposure method, electronic flash control method and computer program |
JP4492273B2 (ja) * | 2004-09-22 | 2010-06-30 | 株式会社ニコン | 撮像装置およびプログラム |
JP2006115406A (ja) * | 2004-10-18 | 2006-04-27 | Omron Corp | 撮像装置 |
WO2007000685A1 (en) * | 2005-06-28 | 2007-01-04 | Koninklijke Philips Electronics N.V. | Method of operating a camera for taking electronic images, camera for taking electronic images |
JP5239126B2 (ja) * | 2006-04-11 | 2013-07-17 | 株式会社ニコン | 電子カメラ |
JP2007282119A (ja) * | 2006-04-11 | 2007-10-25 | Nikon Corp | 電子カメラおよび画像処理装置 |
JP4431547B2 (ja) * | 2006-04-14 | 2010-03-17 | 富士フイルム株式会社 | 画像表示制御装置ならびにそれらの制御方法およびそれらの制御プログラム |
JP4264660B2 (ja) * | 2006-06-09 | 2009-05-20 | ソニー株式会社 | 撮像装置、および撮像装置制御方法、並びにコンピュータ・プログラム |
JP4218712B2 (ja) * | 2006-08-04 | 2009-02-04 | ソニー株式会社 | 顔検出装置、撮像装置および顔検出方法 |
JP4264663B2 (ja) * | 2006-11-21 | 2009-05-20 | ソニー株式会社 | 撮影装置、画像処理装置、および、これらにおける画像処理方法ならびに当該方法をコンピュータに実行させるプログラム |
JP4720880B2 (ja) * | 2008-09-04 | 2011-07-13 | ソニー株式会社 | 画像処理装置、撮像装置、画像処理方法およびプログラム |
-
2009
- 2009-06-25 US US13/003,845 patent/US20110109770A1/en not_active Abandoned
- 2009-06-25 JP JP2010520815A patent/JP5482654B2/ja not_active Expired - Fee Related
- 2009-06-25 WO PCT/JP2009/061646 patent/WO2010007865A1/ja active Application Filing
- 2009-06-25 CN CN2009801280276A patent/CN102100062A/zh active Pending
- 2009-06-25 EP EP20090797793 patent/EP2309723A4/en not_active Withdrawn
-
2013
- 2013-11-06 JP JP2013230208A patent/JP5681871B2/ja not_active Expired - Fee Related
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005234686A (ja) * | 2004-02-17 | 2005-09-02 | Fuji Xerox Co Ltd | 表情認識装置、表情認識方法、およびプログラム |
JP2006237803A (ja) * | 2005-02-23 | 2006-09-07 | Konica Minolta Photo Imaging Inc | 撮像システム、写真撮影スタジオ、撮像システムの制御方法 |
JP2008027086A (ja) * | 2006-07-19 | 2008-02-07 | Sony Computer Entertainment Inc | 表情誘導装置および表情誘導方法、表情誘導システム |
JP2008042319A (ja) | 2006-08-02 | 2008-02-21 | Sony Corp | 撮像装置および方法、表情評価装置およびプログラム |
JP2008186179A (ja) | 2007-01-29 | 2008-08-14 | Hitachi Ltd | ストレージシステム |
Non-Patent Citations (1)
Title |
---|
See also references of EP2309723A4 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019159939A (ja) * | 2018-03-14 | 2019-09-19 | 京セラドキュメントソリューションズ株式会社 | 電子機器及び画像形成装置 |
Also Published As
Publication number | Publication date |
---|---|
JP5482654B2 (ja) | 2014-05-07 |
JPWO2010007865A1 (ja) | 2012-01-05 |
EP2309723A4 (en) | 2011-08-31 |
JP2014090420A (ja) | 2014-05-15 |
CN102100062A (zh) | 2011-06-15 |
US20110109770A1 (en) | 2011-05-12 |
JP5681871B2 (ja) | 2015-03-11 |
EP2309723A1 (en) | 2011-04-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5681871B2 (ja) | 撮像装置、撮像方法及びプログラム | |
US7706674B2 (en) | Device and method for controlling flash | |
JP5144422B2 (ja) | 撮影装置及び撮影方法 | |
JP4356778B2 (ja) | 画像撮影装置及び画像撮影方法、並びにコンピュータ・プログラム | |
US8749687B2 (en) | Apparatus and method of capturing jump image | |
US7787019B2 (en) | Camera and shooting control method therefor | |
US20080181460A1 (en) | Imaging apparatus and imaging method | |
US20080025710A1 (en) | Image taking system | |
JP4657960B2 (ja) | 撮像方法および装置 | |
JP5623256B2 (ja) | 撮像装置、その制御方法及びプログラム | |
JP2007311861A (ja) | 撮影装置及び方法 | |
TWI492618B (zh) | 攝影裝置及電腦可讀取記錄媒體 | |
JP5030022B2 (ja) | 撮像装置及びそのプログラム | |
JP2008139683A (ja) | 撮像装置及びオートフォーカス制御方法 | |
JP4799501B2 (ja) | 撮影装置、撮影装置の制御方法およびプログラム | |
JP2007049484A (ja) | デジタルカメラ | |
KR101721226B1 (ko) | 사용자가 원하는 시점의 정지 영상을 획득할 수 있는 디지털 영상 신호 처리 장치 및 이의 제어 방법 | |
KR101630304B1 (ko) | 디지털 촬영 장치, 그 제어 방법, 및 컴퓨터 판독가능 매체 | |
JP4998122B2 (ja) | 撮像装置及びそのプログラム | |
JP5328528B2 (ja) | 撮像装置、撮像装置の制御方法、及びコンピュータプログラム | |
JP2009017427A (ja) | 撮像装置 | |
JP2009182880A (ja) | 撮像装置及びそのプログラム | |
JP2008172395A (ja) | 撮像装置、画像処理装置、方法およびプログラム | |
US8073319B2 (en) | Photographing method and photographing apparatus based on face detection and photography conditions | |
JP4208796B2 (ja) | 撮像装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980128027.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09797793 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009797793 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010520815 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13003845 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |