JP2006005662A - Electronic camera and electronic camera system - Google Patents

Electronic camera and electronic camera system

Info

Publication number
JP2006005662A
Authority
JP
Japan
Prior art keywords
electronic camera
unit
position
imaging
subject
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2004179983A
Other languages
Japanese (ja)
Other versions
JP4442330B2 (en)
Inventor
Yasuyuki Motoki
康之 元木
Original Assignee
Nikon Corp
株式会社ニコン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp (株式会社ニコン)
Priority to JP2004179983A
Publication of JP2006005662A
Application granted
Publication of JP4442330B2
Application status: Active
Anticipated expiration

Abstract

PROBLEM TO BE SOLVED: To provide a technique for reliably photographing a subject, without missing the shutter chance, at the moment the subject is completely ready.
SOLUTION: This electronic camera has an imaging unit, a detection unit, and a timing determination unit. The imaging unit performs an imaging operation that images the subject and generates image data. The detection unit performs detection processing that detects the position of the subject's face area. The timing determination unit causes the detection unit to repeat the detection processing and, each time it is performed, determines whether the amount of movement of the face-area position is within a predetermined range by comparison with the previously detected position. When the movement is determined to stay within the predetermined range for a predetermined period or a predetermined number of comparisons, the timing determination unit commands the imaging unit to perform the imaging operation. The subject can therefore be photographed reliably, without missing the shutter chance, once the subject is completely ready.
COPYRIGHT: (C)2006,JPO&NCIPI

Description

  The present invention relates to an electronic camera having a function of detecting a face area from an image, and an electronic camera system. The present invention also relates to a technique for performing self-photographing.

  When photographing an animal subject, the release button must be pressed at the exact moment the animal strikes the desired pose or expression. In self-photographing, the time the photographer needs to move to the shooting position after operating the camera must be taken into account, so a self-timer function, for example, is used. However, it is not always easy to press the release button at the instant an animal assumes a predetermined pose, or, in self-photographing, to shoot automatically at the moment the photographer is ready at the shooting position.

To release the shutter at the moment the subject strikes a desired pose, Patent Document 1 detects the orientation of the face of the subject image included in the image and shoots automatically when the face turns in a predetermined direction. Japanese Patent Application Laid-Open No. 2004-228561 instead lets the photographer check whether the captured image obtained by self-timer photographing is appropriate. Specifically, the display time of the captured image after shooting is made longer in the self-timer shooting mode than in the normal shooting mode, so that the display does not end before the photographer, who is also the subject, returns to the camera's installation position after shooting.
JP 2001-357404 A
JP 2001-86374 A

Even if the subject's face is detected to be facing a predetermined direction, the subject is not necessarily ready at the moment of detection. With the method of Patent Document 1, the shutter may therefore be released while the subject is still unprepared, resulting in a failed photograph.
With Patent Document 2, the photographer can easily confirm afterward whether the captured image obtained by self-timer photographing is appropriate. However, this method does not shoot automatically at the moment the photographer is ready at the shooting position, and so cannot prevent a failed photograph in the first place; it does not fundamentally solve the problem. In self-photographing, there has therefore been a demand for a technique that automatically confirms, without burdening the photographer with extra operations, that the photographer has reached the shooting position.

An object of the present invention is to provide a technique for reliably capturing an image without missing a photo opportunity when a subject is ready.
Another object of the present invention is to provide a technique for automatically confirming that a photographer has reached a photographing position and taking a picture without placing an operation burden on the photographer in self-photographing.

  According to a first aspect of the present invention, there is provided an electronic camera including an imaging unit, a detection unit, and a timing determination unit. The imaging unit performs an imaging operation that images a subject and generates image data. The detection unit performs detection processing that detects the position of the subject's face area. The timing determination unit causes the detection unit to repeat the detection processing and, each time it is performed, determines whether the amount of movement of the face-area position is within a predetermined range by comparison with the previously detected position. When the movement is determined to stay within the predetermined range for a predetermined period or a predetermined number of comparisons, the timing determination unit commands the imaging unit to perform the imaging operation.
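Purely as an illustration (the patent discloses no source code), the timing-determination behaviour of this aspect can be sketched as follows. The callback names `detect_face_position` and `capture_still`, the pixel tolerance, and the comparison count are assumptions, not part of the claim:

```python
# Hedged sketch of the claimed timing determination: repeat detection and
# shoot once the face position has stayed within a predetermined range for
# a predetermined number of comparisons. All names and thresholds are assumed.

MAX_MOVE = 8            # "predetermined range" for motion, in pixels (assumed)
STABLE_COMPARISONS = 6  # "predetermined number of comparisons" (assumed)

def wait_and_shoot(detect_face_position, capture_still):
    """Repeat detection; command the imaging operation once the face is still."""
    prev = None
    stable = 0
    while True:
        pos = detect_face_position()      # (x, y) of the face area, or None
        if pos is None or prev is None:
            stable = 0                    # no basis for comparison yet
        else:
            dx, dy = pos[0] - prev[0], pos[1] - prev[1]
            if abs(dx) <= MAX_MOVE and abs(dy) <= MAX_MOVE:
                stable += 1               # movement within the predetermined range
            else:
                stable = 0                # subject moved; start over
        if stable >= STABLE_COMPARISONS:
            return capture_still()        # command the imaging operation
        prev = pos
```

A real implementation would also compare face-area size and count, as the embodiment below does.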

According to a second aspect of the present invention, the electronic camera of the first aspect further includes a focus control unit that, after the detection unit has detected the position of the face area, focuses the photographing lens on the detected face area before the timing determination unit issues its command.
The invention according to claim 3 is the electronic camera according to claim 1 or 2, characterized by the following points. First, the imaging unit repeatedly performs the imaging operation while the timing determination unit repeatedly performs the detection processing. Second, the detection unit detects the position of the face area using the image data generated by the imaging unit.

  According to a fourth aspect of the present invention, the electronic camera according to any one of the first to third aspects is characterized by the following points. First, it has an operation member for instructing the imaging unit, in synchronization with the photographer's operation, to perform an imaging operation for a still image. Second, the processing of the detection unit and the timing determination unit is performed when the electronic camera is set to a shooting mode in which the still-image imaging operation is performed at a timing independent of the operation of that operation member.

  According to a fifth aspect of the present invention, there is provided an electronic camera comprising an imaging unit, a counting unit, and an imaging command unit. The imaging unit performs an imaging operation that images a subject and generates image data. The counting unit performs counting processing that detects each subject and counts the number of subjects. The imaging command unit causes the counting unit to repeat the counting processing, determines each time whether the number of subjects has increased, and, after determining that it has increased, commands the imaging unit to perform the imaging operation.
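Again as an illustration only, the counting behaviour of this aspect can be sketched as below; `count_subjects`, `capture_still`, and the polling limit are assumptions:

```python
# Hedged sketch of the counting/command behaviour: shoot only after the number
# of detected subjects has increased (e.g. the photographer has walked into
# the frame during self-photographing). Callback names are assumed.

def shoot_when_group_grows(count_subjects, capture_still, max_polls=1000):
    """Command the imaging operation once the subject count exceeds its baseline."""
    baseline = count_subjects()       # subjects present when counting starts
    for _ in range(max_polls):
        n = count_subjects()          # repeat the counting process
        if n > baseline:              # the number of subjects has increased
            return capture_still()    # command the imaging operation
    return None                       # safety net, not part of the patent text
```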

  The invention of claim 6 is the electronic camera according to claim 5, characterized by the following points. First, it includes a position detection unit that performs position detection processing for detecting the position of the subject's face area. Second, after determining that the number of subjects has increased, the imaging command unit causes the position detection unit to repeat the position detection processing and, each time it is performed, determines whether the amount of movement of the face-area position is within a predetermined range by comparison with the previously detected position; it issues the imaging command when the movement is determined to stay within the predetermined range for a predetermined period or a predetermined number of comparisons.

According to a seventh aspect of the present invention, the electronic camera of the sixth aspect further includes a focus control unit that, after the position detection unit has detected the position of the face area, focuses the photographing lens on the detected face area before the command is issued.
The invention according to claim 8 is the electronic camera according to claim 6 or 7, characterized by the following points. First, the imaging unit repeatedly performs the imaging operation while the imaging command unit repeatedly performs the position detection processing. Second, the position detection unit detects the position of the face area using the image data generated by the imaging unit.

  The invention of claim 9 is the electronic camera according to any one of claims 5 to 8, characterized by the following points. First, it has an operation member for instructing the imaging unit, in synchronization with the photographer's operation, to perform an imaging operation for a still image. Second, the processing of the counting unit and the imaging command unit is performed when the electronic camera is set to a shooting mode in which the still-image imaging operation is performed at a timing independent of the operation of that operation member.

  According to a tenth aspect of the present invention, there is provided an electronic camera system capable of controlling the timing at which an electronic camera images a subject. This electronic camera system includes a detection unit and a timing determination unit. The detection unit performs detection processing that detects the position of the subject's face area. The timing determination unit causes the detection unit to repeat the detection processing and, each time it is performed, determines whether the amount of movement of the face-area position is within a predetermined range by comparison with the previously detected position. When the movement is determined to stay within the predetermined range for a predetermined period or a predetermined number of comparisons, the timing determination unit instructs the electronic camera to capture an image.

  The invention of claim 11 is an electronic camera system capable of controlling the timing at which an electronic camera images a subject. The system includes a counting unit and an imaging command unit. The counting unit performs counting processing that detects each subject and counts the number of subjects. The imaging command unit causes the counting unit to repeat the counting processing, determines each time whether the number of subjects has increased, and, after determining that it has increased, instructs the electronic camera to capture an image.

In one aspect of the present invention, the position of the face area is detected repeatedly, and the imaging operation is commanded to the imaging unit once the amount of movement is determined to be within a predetermined range. The subject can therefore be photographed reliably, without missing the shutter chance, once the subject is ready.
In another aspect of the present invention, the number of subjects is counted repeatedly, and the imaging operation is commanded to the imaging unit after the number of subjects is determined to have increased. In self-photographing, it is therefore possible to confirm automatically, without burdening the photographer with extra operations, that the photographer has reached the shooting position.

Hereinafter, embodiments of the present invention will be described with reference to the drawings.
<Configuration of this embodiment>
FIG. 1 is a block diagram of the electronic camera of this embodiment. As shown in the figure, the electronic camera 8 includes a photographing lens 12, a focal plane shutter 16, an image sensor 20, an analog signal processing unit 24, an A/D conversion unit 28, a timing generator 32, a focus control unit 36, a signal output unit 38, an operation unit 40, an MPU (Micro Processor Unit) 44, a system bus 48, an image processing unit 52, a memory 56, a card interface 64, a replaceable memory card (recording medium) 68, a USB (Universal Serial Bus) 72, a USB interface 76, a monitor orientation detection unit 80, a monitor control unit 84, and a liquid crystal monitor 88. System control of the electronic camera 8 is performed by the MPU 44.

  FIG. 2 is a perspective view showing the external appearance of the electronic camera 8. As shown in the figure, the camera body of the electronic camera 8 consists of a lens-side body 92 and a monitor-side body 94, connected by a rotating shaft mechanism 96 built into both. The two bodies can rotate relative to each other about the dashed line in the figure (a free-angle monitor). At the time of shooting, the liquid crystal monitor 88 can therefore be directed either toward the photographer, as shown in the figure, or toward the subject (the opposite of the figure).

  The operation unit 40 includes an operation button group 98 consisting of a power button, a shooting mode selection button, a shooting condition input button, a release button, and the like. The electronic camera 8 of this embodiment is mainly characterized by the functions of the MPU 44 and the image processing unit 52 in a face recognition first mode, a face recognition second mode, a self-photographing first mode, and a self-photographing second mode, described later. Functions not specifically described are the same as those of a conventional electronic camera.

<Description of operation of this embodiment>
FIGS. 3, 4, 5, 6, and 7 are flowcharts illustrating the operation of the electronic camera 8. FIG. 3 shows the overall flow of operation. FIG. 4 shows the operation when the face recognition first mode is set in step S3 of FIG. 3. FIG. 5 shows the operation when the face recognition second mode is set in step S3. FIG. 6 shows the operation when the self-photographing first mode is set in step S3. FIG. 7 shows the operation when the self-photographing second mode is set in step S3. The operation of the electronic camera 8 is described below according to the step numbers shown in the figures.

[Step S1] When the power button of the electronic camera 8 is turned on, a power-on process is performed. Thereafter, the photographer operates the button group of the operation unit 40 to set the photographing mode and the like.
[Step S2] The image sensor 20 is exposed, repeatedly accumulating and discharging charge, and continuously outputs image signals for moving images at a predetermined frame rate. The analog signal processing unit 24 and the A/D conversion unit 28 perform clamp processing, sensitivity correction, A/D conversion, and the like on the image signal to generate moving-image data. The moving-image data is usually composed of the pixel values of only some of the effective pixels, obtained by thinning readout. After color processing by the image processing unit 52, the data is displayed as a moving image on the liquid crystal monitor 88.

  [Step S3] The MPU 44 determines which mode is set. If the face recognition first mode is set, the process proceeds to step S11 in FIG. 4; if the face recognition second mode is set, to step S31 in FIG. 5; if the self-photographing first mode is set, to step S51 in FIG. 6; and if the self-photographing second mode is set, to step S71 in FIG. 7. After shooting is performed in one of these modes and still-image data is generated, the process proceeds to step S4. The electronic camera 8 also has other well-known shooting modes, but since they are not directly related to the present invention, they are not described.

[Step S4] The still image data generated in each mode is compressed and then recorded on the memory card 68 via the card interface 64. The operation in each mode will be described below. First, the first face recognition mode (FIG. 4) will be described.
[Step S11] The MPU 44 determines whether or not the release button is half-pressed. If half-pressed, the process proceeds to step S12. If not, the electronic camera 8 stands by until half-pressed.

[Step S12] The output of moving-image signals by the image sensor 20 and the moving-image display, started in step S2, continue until step S21. The image processing unit 52 performs known face recognition processing on the latest image data output from the A/D conversion unit 28 to determine the face area in the image.
The face recognition processing may be performed, for example, by searching the image for facial feature points such as the lips, eyes, and eyebrows. The lips can be found, for example, by detecting an area slightly redder than its surroundings. The eyes can be found, for example, by detecting a black round part corresponding to the pupil with white parts on both sides. The eyebrows can be found, for example, by detecting two areas slightly blacker than their surroundings above the eyes. An area that contains these facial feature points and can be regarded as roughly skin-colored is then detected, and this determines the face area in the image. If no facial feature point is detected, the image processing unit 52 transmits the determination result "no face area" to the MPU 44.
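One of these heuristics, finding pixels "slightly redder than the surroundings" as lip candidates, can be illustrated with a minimal sketch. This is not the patent's actual algorithm; the image format (rows of RGB tuples) and the margin value are assumptions:

```python
# Minimal illustration of one face-feature heuristic mentioned in the text:
# lip candidates as pixels whose red channel clearly dominates green and blue.
# The margin and the list-of-rows image format are assumptions.

def lip_candidates(image, margin=30):
    """Return (row, col) of pixels whose red channel exceeds green and blue by `margin`."""
    hits = []
    for y, row in enumerate(image):
        for x, (r, g, b) in enumerate(row):
            if r > g + margin and r > b + margin:   # noticeably redder pixel
                hits.append((y, x))
    return hits
```

A practical detector would compare against a local neighbourhood average rather than the pixel's own channels, and would combine lip, eye, and eyebrow cues as the text describes.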

  [Step S13] The MPU 44 determines whether a face area was detected in the image data in step S12. If one was detected, the process proceeds to step S14; if not, it returns to step S12, and the same processing is repeated until a face area is detected. Note that the number of frames subjected to face recognition processing per second in the loop of steps S12 and S13 may be lower than the frame rate of the image sensor 20; for example, 3 or 4 frames per second.

  [Step S14] The MPU 44 causes the memory 56 to store the position, size, and number of the face areas detected in step S12. To express the position and size of a face area, all pixels of the image sensor 20 are regarded as arranged in a grid in the horizontal and vertical directions, and this grid is treated as a coordinate system. The size of the stored face area may be, for example, the numbers of pixels in the horizontal and vertical directions of the smallest rectangle that encloses the face area in the image data. The position of the face area may be, for example, the coordinates of the pixel at the upper-right vertex of that rectangle.
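The stored position and size can be derived as sketched below: the smallest rectangle enclosing the face-area pixels, with the position taken as the upper-right vertex as the text suggests. The coordinate convention (x grows rightward, y grows downward) is an assumption:

```python
# Sketch of deriving the stored face-area descriptor from detected pixels:
# smallest enclosing rectangle, position = upper-right vertex (per the text).
# Axis orientation is an assumption.

def face_rect(pixels):
    """pixels: iterable of (x, y) face-area coordinates.
    Returns (position, (width, height)) of the smallest enclosing rectangle."""
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    width = max(xs) - min(xs) + 1
    height = max(ys) - min(ys) + 1
    position = (max(xs), min(ys))     # upper-right vertex of the rectangle
    return position, (width, height)
```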

  The MPU 44 then superimposes the smallest rectangular frame enclosing the face area on the subject image displayed on the liquid crystal monitor 88. FIG. 8 shows a display example of face-area frames on the liquid crystal monitor 88. If a plurality of face areas are detected, as in FIG. 8A, a frame for each is superimposed on the liquid crystal monitor 88 (the same applies to the other steps described later). As the position of the face area, the positions of all pixels corresponding to the outline of the face area may instead be stored, and the frame displayed on the liquid crystal monitor 88 may then be a trace of the face outline rather than a rectangle.

[Step S15] The image processing unit 52 performs face recognition processing on the latest image data output from the A / D conversion unit 28, as in step S12.
[Step S16] The MPU 44 determines whether a face area was detected in the image data in step S15. If not, the process proceeds to step S17.
If one was detected, the MPU 44 stores the position, size, and number of the face areas detected in step S15 in the memory 56, separately from those already stored. The MPU 44 also causes the liquid crystal monitor 88 to display the frame for the face area detected in step S15 in place of the frame displayed until then.

  Steps S12 to S18 form a loop. Therefore, only the frame of the face area in the most recently face-recognized image data remains displayed on the liquid crystal monitor 88 (the same applies to the other modes described later). The memory 56 stores the positions, sizes, and numbers of face areas detected in steps S14 and S16 in the order of the shooting times of the image data used for detection. The process then proceeds to step S18.

[Step S17] The MPU 44 erases all face-area positions, sizes, and numbers stored in the memory 56. The process then returns to step S12.
[Step S18] Regardless of how many times the processing of steps S14 and S16 has been performed, the memory 56 holds the positions, sizes, and numbers of face areas detected from a plurality of image data when step S18 is reached. The MPU 44 compares the position, size, and number of the face areas detected from the most recently captured image data with those detected from the second most recently captured image data. If, as a result of the comparison, the two can be regarded as the same, the process proceeds to step S19; otherwise, it returns to step S17.

  More specifically, the process proceeds to step S19 if the number of face areas matches exactly and the coordinates of the vertex of the frame, the horizontal size, and the vertical size substantially match. If the vertex coordinates, horizontal size, and vertical size are exactly the same, it can clearly be recognized that the subject is intentionally holding the same posture and is ready. Even if these three differ slightly, however, the subject can still be recognized as ready. The range within which they may differ slightly corresponds to "substantially match" above and to "the amount of movement is within a predetermined range" in the claims. Since the number of face areas indicates the number of subjects, it must match exactly in order to proceed to step S19.
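The comparison described here, exact match on the face-area count, tolerance on position and size, can be sketched as follows; the tolerance values and the assumption that the two lists enumerate the faces in the same order are illustrative only:

```python
# Hedged sketch of the step-S18 comparison: the face-area count must match
# exactly, while vertex position and size may differ within small tolerances
# (the "substantially match" test). Tolerances and face ordering are assumed.

def same_scene(prev, curr, pos_tol=5, size_tol=5):
    """prev/curr: lists of ((x, y), (w, h)) face rectangles, same ordering."""
    if len(prev) != len(curr):                 # subject count must be identical
        return False
    for (p_pos, p_size), (c_pos, c_size) in zip(prev, curr):
        if abs(p_pos[0] - c_pos[0]) > pos_tol or abs(p_pos[1] - c_pos[1]) > pos_tol:
            return False                       # face moved too far
        if abs(p_size[0] - c_size[0]) > size_tol or abs(p_size[1] - c_size[1]) > size_tol:
            return False                       # face size changed too much
    return True
```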

[Step S19] The MPU 44 counts the number of image data whose face-area positions, sizes, and numbers are stored in the memory 56, and determines whether the count has reached a predetermined number (for example, 6 or 8 frames). If it has, the process proceeds to step S20; otherwise, it returns to step S15.
Note that the face recognition processing is performed at regular intervals at a rate no higher than the frame rate described above, with steps S15 to S19 repeating as a loop. A determination that the predetermined number has been reached therefore means that the position, size, and number of the subject's face areas have not changed for a predetermined period. Alternatively, the shooting time may be stored in the memory 56 together with the face-area position, size, and number in steps S14 and S16, and a duration of a few seconds (for example, 1 or 2 seconds) may be used as the determination condition instead of the predetermined number. The predetermined number and the predetermined period here may be set in advance by the photographer. The same applies to steps S61 and S82 described later.
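The duration-based alternative mentioned here can be sketched as below, assuming a shooting timestamp is stored with each consecutive "unchanged" detection; the threshold value is a photographer-settable assumption:

```python
# Sketch of the alternative stability criterion: instead of counting a fixed
# number of stable frames, require that the face area has stayed unchanged
# for a predetermined duration. hold_seconds is an assumed default.

def stable_for(timestamps, hold_seconds=1.0):
    """timestamps: shooting times (seconds) of consecutive 'unchanged' frames."""
    if len(timestamps) < 2:
        return False                  # fewer than two frames: nothing to compare
    return timestamps[-1] - timestamps[0] >= hold_seconds
```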

[Step S20] The MPU 44 instructs the focus control unit 36 to focus on the frame of the face area. The focus control unit 36 drives the photographing lens 12 in accordance with the instruction from the MPU 44 to adjust the focus.
If a plurality of face areas have been recognized, the camera may focus, for example, on the face area of the subject located closest to the electronic camera 8. The closest subject may be taken as the one with the largest face area (that is, frame) in the image data, or may be determined by a triangulation method or the like. Alternatively, the camera may focus on the face area closest to the center of the image data, or on the subject at a position set in advance by the photographer.
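Two of the focus-target choices described here, largest rectangle (taken as the nearest subject) and face closest to the image center, can be sketched as below; the face-rectangle format and frame dimensions are assumptions:

```python
# Sketch of selecting the focus target among several detected face areas:
# either the largest rectangle (biggest area ~ nearest subject) or the face
# whose position lies closest to the image center. Formats are assumed.

def pick_focus_face(faces, frame_w, frame_h, policy="largest"):
    """faces: list of ((x, y), (w, h)) rectangles; returns the chosen face."""
    if policy == "largest":
        return max(faces, key=lambda f: f[1][0] * f[1][1])   # biggest area
    cx, cy = frame_w / 2, frame_h / 2                        # policy == "center"
    return min(faces, key=lambda f: (f[0][0] - cx) ** 2 + (f[0][1] - cy) ** 2)
```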

[Step S21] The MPU 44 causes the signal output unit 38 to output a cue signal (for example, sound or light) announcing that the face area of the subject has not moved for the predetermined period. The MPU 44 also stops the moving-image imaging operation that has continued since step S2.
[Step S22] The MPU 44 determines whether the release button has been fully pressed. If it has, the process proceeds to step S23; if not, the electronic camera 8 waits until it is fully pressed.

  [Step S23] In synchronization with the full press of the release button, the MPU 44 controls each unit to start exposing the image sensor 20, and still-image data containing the pixel values of all effective pixels is output from the A/D conversion unit 28. Since the details of this photographing operation are known, their description is omitted. The image processing unit 52 performs processing such as white balance adjustment, gamma correction, color interpolation, color conversion, color correction, and edge enhancement on the still-image data. The process then proceeds to step S4. This concludes the description of the operation in the face recognition first mode. Next, the face recognition second mode (FIG. 5) is described.

[Step S31] The photographer fully presses the release button. If it is not fully pressed, the electronic camera 8 waits until it is.
[Step S32] In synchronization with the full press of the release button, the image processing unit 52 performs face recognition processing on the latest moving-image data output from the A/D conversion unit 28. The face recognition processing here is the same as in step S12. Since the subsequent steps S33 to S40 are the same as steps S13 to S20 described above, apart from the step numbers, their description is omitted.

  [Step S41] The MPU 44 causes the signal output unit 38 to output a confirmation signal in synchronization with the confirmation in step S40 that the face area has not moved for the predetermined period. In synchronization with this, still-image data is generated by the same photographing operation as in step S23, and the process then proceeds to step S4. This concludes the description of the operation in the face recognition second mode. The difference between the face recognition first mode and the face recognition second mode is whether, after it is confirmed that the face area has not moved for a predetermined period, shooting is performed in synchronization with the photographer's instruction or automatically. Next, the self-photographing first mode (FIG. 6) is described.

[Step S51] The photographer fully presses the release button. If it is not fully pressed, the electronic camera 8 waits until it is.
[Step S52] The electronic camera 8 waits for a predetermined period (for example, 2 or 3 seconds). The self-photographing first mode assumes that the electronic camera 8 is fixed, for example on a tripod, and that the photographer is also a subject. The waiting period therefore allows for the time the photographer needs to move from the camera's installation location to the shooting position. The waiting period may also be input or selected in advance by the photographer.

[Step S53] The monitor orientation detection unit 80 determines whether the liquid crystal monitor 88 is facing the subject and transmits the result to the MPU 44. If it is not facing the subject, the process proceeds to step S54; if it is, step S54 is skipped and the process proceeds to step S55.
[Step S54] The MPU 44 turns off the display of the liquid crystal monitor 88 to reduce power consumption. The generation of moving-image data, started in step S2, continues.
[Step S55] The image processing unit 52 performs face recognition processing, as described above, on the latest moving-image data output from the A/D conversion unit 28.

  [Step S56] The MPU 44 causes the memory 56 to store the position, size, and number of the face areas detected in step S55. When the liquid crystal monitor 88 faces the subject, that is, when this step is reached without passing through step S54, a frame indicating the face area is superimposed on the subject image on the liquid crystal monitor 88. The photographer, who is also the subject, can therefore visually confirm the shooting composition by looking at the liquid crystal monitor 88.

When the liquid crystal monitor 88 does not face the subject, that is, when this step is reached via step S54, the liquid crystal monitor 88 remains off thereafter. Alternatively, instead of turning off the display in step S54, power consumption may be reduced by simply not displaying the face-area frame in step S56.
[Step S57] The image processing unit 52 takes in the latest image data output from the A/D conversion unit 28 and performs face recognition processing on it.

[Step S58] The MPU 44 stores in the memory 56 the position, size, and number of the face areas detected in step S57. Since steps S55 to S60 repeat as a loop, when the liquid crystal monitor 88 faces the subject, only the frame of the face area in the most recently face-recognized image data remains displayed on the liquid crystal monitor 88.
[Step S59] The MPU 44 compares the most recent and second most recent face-area positions, sizes, and numbers stored in the memory 56, in the same manner as in step S18. If, as a result of the comparison, the two can be regarded as the same, the process proceeds to step S61; otherwise, it proceeds to step S60.

[Step S60] The MPU 44 erases all face-area positions, sizes, and numbers stored in the memory 56. The process then returns to step S55.
[Step S61] The MPU 44 counts the number of image data whose face-area positions, sizes, and numbers are stored in the memory 56. If the count has not reached the predetermined number, the process returns to step S57; if it has, the process proceeds to step S62. The subsequent steps S62 and S63 are the same as steps S40 and S41 of the face recognition second mode, respectively.

  The above is the description of the operation in the first self-photographing mode. The main differences between the second face recognition mode and the first self-photographing mode are the following two points. First, because the first self-photographing mode assumes that the photographer is also a subject, the camera waits for a predetermined period after the release button is fully pressed. Second, in the first self-photographing mode, the display is turned off according to the orientation of the liquid crystal monitor 88. Next, the second self-photographing mode (FIG. 7) will be described.

[Step S71] The photographer fully presses the release button. If not fully pressed, the electronic camera 8 waits until it is fully pressed.
[Step S72] The image processing unit 52 performs face recognition processing, similar to that described above, on the latest moving-image data output from the A/D conversion unit 28 to detect face areas. The MPU 44 stores the number of face areas thus detected in the memory 56 as the number of subjects. At this point, since the release button has only just been pressed, the photographer has not yet moved to the shooting position. Therefore, the number of face areas here is expected to be smaller than the number of subjects scheduled to be photographed, by one (corresponding to the photographer) or more.

Further, the number of subjects may be detected by a known pattern recognition instead of the face recognition process described above. The subsequent steps S73 and S74 are the same as steps S53 and S54 described above, except for the difference in the step numbers.
[Step S75] The image processing unit 52 performs face recognition processing on the latest moving-image data output from the A/D conversion unit 28 to detect face areas. The MPU 44 stores the number of detected face areas in the memory 56 as the number of subjects, keeping it distinct from the number detected in step S72.

  When the liquid crystal monitor 88 faces the subject, that is, when the process of step S74 has not been performed, the face area frames are superimposed on the subject image on the liquid crystal monitor 88. Since steps S75 and S76 are repeated, only the face area frames of the most recently processed image data are displayed. If the liquid crystal monitor 88 does not face the subject, it remains turned off thereafter.

  [Step S76] The MPU 44 determines whether the number of subjects stored in step S75 is greater than the number of subjects stored in step S72. If it is greater, the photographer has reached the shooting position, and the process proceeds to step S77. Otherwise, the MPU 44 discards the result of the face recognition processing performed by the image processing unit 52 in step S75 and deletes the number of subjects stored in step S75 from the memory 56; the process then returns to step S75. Note that the number of subjects stored in step S72 is not deleted.
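Steps S72 to S76 reduce to recording a baseline face count and then waiting for a frame in which the count exceeds it. The following small sketch is illustrative only, and all names in it are assumptions:

```python
# Illustrative sketch of steps S72-S76: store the number of faces visible
# when the release button is pressed, then wait until a later frame shows
# more faces, i.e. the photographer has entered the picture. All names are
# assumptions for illustration.

def wait_for_photographer(face_counts):
    """face_counts yields the number of face areas detected per movie frame.
    The first value serves as the step S72 baseline. Returns the index of
    the first frame whose count exceeds the baseline (go to step S77), or
    None if the stream ends first."""
    iterator = iter(face_counts)
    baseline = next(iterator)          # step S72: count before self-shooting
    for index, count in enumerate(iterator, start=1):
        if count > baseline:           # step S76: the count has increased
            return index
        # otherwise the step S75 result is discarded and detection repeats
    return None
```

Note that, as in the embodiment, the baseline itself is never discarded; only the per-frame results that fail the comparison are.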

  [Step S77] The MPU 44 causes the memory 56 to store the position, size, and number of the face areas detected in step S75. When the liquid crystal monitor 88 faces the subject, in addition to the face frames displayed so far (see FIG. 8(a)), a frame for the newly added face, usually that of the photographer, is also displayed (see FIG. 8(b)). The subsequent steps S78 to S80 are the same as steps S57 to S59 described above, except for the difference in step numbers.

  [Step S81] The MPU 44 deletes all the positions, sizes, and numbers of face areas stored in the memory 56. Then, the image processing unit 52 performs face recognition processing on the latest moving image image data output from the A / D conversion unit 28 to detect a face area. The MPU 44 stores the detected position, size, and number of face areas in the memory 56. When the liquid crystal monitor 88 faces the subject, the MPU 44 causes the liquid crystal monitor 88 to display a newly detected face area frame. Thereafter, the process returns to step S78.

  The subsequent steps S82 to S84 are the same as steps S61 to S63 described above, except for the difference in step numbers. Note that, besides the candidates mentioned in step S20, the face area to be focused on in step S83 may be that of the last-added subject. The main difference from the first self-photographing mode is that, in the second self-photographing mode, shooting is performed after confirming that the number of subjects has increased. The above is the description of the operation of the electronic camera 8 of the present embodiment.

<Effect of this embodiment>
In the first and second face recognition modes, face recognition processing is performed on continuously generated moving-image data to determine whether the position, size, and number of the face areas remain unchanged for a predetermined period. At the moment it is determined that there is no change, shooting is performed automatically, or a signal is output to prompt the photographer to fully press the release button. The fact that the position, size, and number of the face areas do not change for a predetermined period generally means that the subject is ready. Therefore, when the subject is ready, it is possible to shoot reliably without missing a photo opportunity.

In the first self-photographing mode, after the release button is fully pressed, the electronic camera 8 stands by for a period that allows for the time the photographer needs to move to the shooting position. Thereafter, in the same manner as described above, a picture is taken automatically at the moment it is determined that the position, size, and number of the face areas have not changed for a predetermined period. The same effect as described above is therefore obtained.
In the second self-photographing mode, the number of subjects is detected by face recognition processing, and shooting is performed automatically after an increase in the number of subjects is detected. An increase in the number of subjects means that the photographer has moved to the shooting position. In self-photographing, therefore, it is possible to confirm automatically that the photographer has reached the shooting position, without imposing any operation burden on the photographer. In this case as well, since the image is taken after confirming that the subjects have not moved for a predetermined period, the same effect as described above is obtained.

  In all four modes described above, the MPU 44 controls each unit so that the detected face areas are brought into focus immediately after it is determined that their position, size, and number have not changed for a predetermined period. The captured image is thus focused at an appropriate position within the composition, so there is little risk of a failed photograph. Moreover, the photographer does not need to worry about focusing, whichever of the four modes is set. As a result, the usability of the electronic camera is greatly improved.

  Further, in the first and second self-photographing modes, when the liquid crystal monitor 88 does not face the subject, the display on the liquid crystal monitor 88 is turned off after the release button is fully pressed. In these modes, when the liquid crystal monitor 88 does not face the subject, neither the photographer nor the subjects look at its display after the release button is fully pressed. Unnecessary battery consumption can therefore be prevented.

<Supplementary items of this embodiment>
[1] In the present embodiment, an example has been described in which the focus control of the photographing lens 12 is performed after confirming that the position, size, and number of the detected face areas have not changed for a predetermined period. The present invention is not limited to such an embodiment. The focus control may be performed at any time after a face area is detected; for example, in the second face recognition mode, it may be performed immediately after any of steps S32 to S36. Doing so shortens the release time lag compared with focusing just before shooting as in the present embodiment.

  [2] An example has been described in which the face areas of the subjects are detected, and the number of subjects counted, from the image data generated by the image sensor 20 and others for displaying a moving image before shooting. The present invention is not limited to such an embodiment. A sensor for detecting the subject image in the image space of the photographing lens 12 may be provided in addition to the image sensor 20; based on the output of this sensor, the face areas of the subjects may be detected and the number of subjects counted in the same manner as in this embodiment.

  [3] An example has been described in which the output of moving-image signals by the image sensor 20 and the moving-image display are stopped and an imaging operation for a still image is performed. The present invention is not limited to such an embodiment. The output of moving-image signals and the moving-image display may be continued without performing the still-image capturing operation. In that case, from the plurality of frames (image data) generated continuously for the moving image, it suffices to select and record the frame captured at the moment the amount of motion of the subject's face area was determined to be within the predetermined range (steps S19, S39, S61, and S82).
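This frame-selection variant can be sketched as follows. The sketch is illustrative only; the buffer size and the is_stable callback are assumptions standing in for the steps S19/S39/S61/S82 determination:

```python
from collections import deque

# Illustrative sketch of supplementary item [3]: keep the movie stream
# running and, instead of a separate still-image capture, record the frame
# at which the face-area motion was judged to be within range. The buffer
# size and the is_stable callback are assumptions for illustration.

def record_still_frame(frames, is_stable, buffer_size=16):
    """frames yields movie frames; is_stable(index) returns True at the
    moment corresponding to steps S19, S39, S61, or S82. Returns the frame
    selected for recording, or None if the stream ends first."""
    recent = deque(maxlen=buffer_size)  # recent frames kept for selection
    for index, frame in enumerate(frames):
        recent.append(frame)
        if is_stable(index):
            # A camera could also step back a few frames in the buffer
            # to compensate for detection latency.
            return recent[-1]
    return None
```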

  Further, for example, in the form described in [2] above, in which a sensor for detecting the subject's face area is provided separately from the image sensor 20, the following may be done. Initially, the face area of the subject is detected by this sensor without the image sensor 20 performing a moving-image capturing operation. Then, when the sensor output indicates that the amount of movement of the subject is within the predetermined range, the moving-image capturing operation by the image sensor 20 is started.

[4] The example in which the face area is detected on the assumption that the subject is a person has been described. The present invention is not limited to such an embodiment. The subject may be another animal.
[5] An example has been described in which face recognition is performed by detecting a plurality of feature points in a face, thereby identifying the subject. The present invention is not limited to such an embodiment. A plurality of feature points may instead be detected from a part of the subject other than the face and used to identify the subject.

[6] The correspondence between the claims and the present embodiment will be described below. In addition, the correspondence shown below is one interpretation shown for reference, and does not limit the present invention.
The imaging unit described in the claims corresponds to the photographing lens 12, the focal plane shutter 16, the image sensor 20, the analog signal processing unit 24, the A/D conversion unit 28, the timing generator 32, the focus control unit 36, and the control function of the MPU 44 that causes these units to generate image data. The generation of image data by these units corresponds to the imaging operation described in the claims.

The detection unit and the position detection unit described in the claims correspond to the function of the image processing unit 52 that performs face recognition processing on image data and detects the position, size, and number of face regions. The detection of the position of the face area here corresponds to the detection process and the position detection process described in the claims.
The timing determination unit described in the claims corresponds to “a function of the MPU 44 that automatically shoots or outputs a confirmation signal after determining that the position, size, and number of face regions have not changed for a predetermined period”.

The focus control unit described in the claims corresponds to the focus control unit 36 and the function of the MPU 44 that instructs the focus control unit 36 to focus on the face region after the position of the face region is detected.
The counting unit described in the claims corresponds to the function of the image processing unit 52 that performs face recognition processing on the image data and detects the number of face regions. The detection of the number of face regions here corresponds to the counting process described in the claims.

The imaging command unit described in the claims corresponds to “a function of the image processing unit 52 and the MPU 44 for detecting the number of subjects, and a function of the MPU 44 for imaging after detecting the increase in the number of subjects”.
The operation member described in the claims corresponds to the release button of the operation unit 40.
The “shooting mode in which the imaging operation for a still image is performed at a timing not depending on an operation on the operation member” described in the claims corresponds to the second face recognition mode, the first self-photographing mode, and the second self-photographing mode. The first face recognition mode is a “shooting mode in which the imaging operation for a still image is performed at a timing depending on (synchronized with) an operation on the operation member”.

  [7] An example has been described in which the control shown in the flowcharts of FIGS. 4 to 7 is executed within the electronic camera 8. The present invention is not limited to such an embodiment. The present invention can also be applied to an electronic camera system in which an external device, connected to the electronic camera via various networks including wireless and wired ones, executes the control shown in the flowcharts of FIGS. 4 to 7 using its arithmetic device.

  In that case, the arithmetic device of the external device may include the detection unit, the timing determination unit, the counting unit, and the imaging command unit of supplementary item [6]. The portion serving as the imaging unit may remain on the electronic camera side rather than in the arithmetic device of the external device. The arithmetic device of the external device may then acquire image data from the electronic camera, detect the subject based on that image data, and instruct the electronic camera to perform imaging. Alternatively, as described in supplementary item [2], a sensor for detecting the subject image in the image space of the photographing lens of the electronic camera may be provided in the external device, and the arithmetic device of the external device may detect the subject based on the output of this sensor and instruct the electronic camera to perform imaging.

  As described above in detail, the present invention is highly useful in the fields of electronic cameras and electronic camera systems.

Brief Description of the Drawings

FIG. 1 is a block diagram of the electronic camera of this embodiment.
FIG. 2 is an external view of the electronic camera of this embodiment.
FIG. 3 is a flowchart showing the overall operation of the electronic camera of this embodiment.
FIG. 4 is a flowchart showing the operation when the first face recognition mode is set in step S3 of FIG. 3.
FIG. 5 is a flowchart showing the operation when the second face recognition mode is set in step S3 of FIG. 3.
FIG. 6 is a flowchart showing the operation when the first self-photographing mode is set in step S3 of FIG. 3.
FIG. 7 is a flowchart showing the operation when the second self-photographing mode is set in step S3 of FIG. 3.
FIG. 8 is an explanatory diagram showing a display example of the face area frames on the liquid crystal monitor.

Explanation of symbols

8 Electronic camera
12 Photographing lens
16 Focal plane shutter
20 Image sensor
24 Analog signal processing unit
28 A/D conversion unit
32 Timing generator
36 Focus control unit
38 Signal output unit
40 Operation unit
44 MPU
48 System bus
52 Image processing unit
56 Memory
64 Card interface
68 Memory card
72 USB
76 USB interface
80 Monitor direction detection unit
84 Monitor control unit
88 Liquid crystal monitor
92 Lens-side body
94 Monitor-side body
96 Rotating shaft mechanism
98 Operation button group

Claims (11)

  1. An imaging unit that performs an imaging operation of imaging a subject and generating image data;
    A detection unit that performs detection processing for detecting the position of the face area of the subject;
    a timing determination unit that repeatedly causes the detection unit to perform the detection process, determines each time the detection process is performed whether or not the amount of movement of the position of the face area is within a predetermined range, based on a comparison with the previously detected position of the face area, and instructs the imaging unit to perform the imaging operation upon determining that the amount of movement has remained within the predetermined range for a predetermined period or a predetermined number of comparisons.
  2. The electronic camera according to claim 1,
    further comprising a focus control unit that, after the detection unit detects the position of the face area, focuses the photographing lens on the detected face area before the timing determination unit issues the command.
  3. The electronic camera according to claim 1 or 2,
    wherein the timing determination unit causes the imaging unit to repeatedly perform the imaging operation while the detection process is repeatedly performed,
    The electronic camera characterized in that the detection unit detects the position of the face region using the image data generated by the imaging unit.
  4. The electronic camera according to any one of claims 1 to 3,
    comprising an operation member that, in synchronization with receiving a photographer's operation, commands the imaging unit to perform the imaging operation for a still image,
    The processing of the detection unit and the timing determination unit is performed when the electronic camera is set to “a shooting mode in which the imaging operation for still images is performed at a timing that does not depend on an operation on the operation member”. An electronic camera characterized by that.
  5. An imaging unit that performs an imaging operation of imaging a subject and generating image data;
    A counting unit that performs a counting process of counting the number of subjects by detecting each of the subjects;
    a shooting command unit that repeatedly causes the counting unit to perform the counting process, determines whether or not the number of subjects has increased each time the counting process is performed, and, after determining that the number has increased, commands the imaging unit to perform the imaging operation.
  6. The electronic camera according to claim 5,
    further comprising a position detection unit that performs a position detection process for detecting the position of the face area of the subject after the shooting command unit determines that the number has increased;
    wherein the shooting command unit repeatedly causes the position detection unit to perform the position detection process after determining that the number has increased, determines each time the position detection process is performed whether or not the amount of movement of the position of the face area is within a predetermined range, based on a comparison with the previously detected position of the face area, and issues the command for the imaging operation upon determining that the amount of movement has remained within the predetermined range for a predetermined period or a predetermined number of comparisons.
  7. The electronic camera according to claim 6,
    further comprising a focus control unit that, after the position detection unit detects the position of the face area, focuses the photographing lens on the detected face area before the shooting command unit issues the command.
  8. The electronic camera according to claim 6 or 7,
    wherein the shooting command unit causes the imaging unit to repeatedly perform the imaging operation while the position detection process is repeatedly performed,
    and the position detection unit detects the position of the face area using the image data generated by the imaging unit.
  9. The electronic camera according to any one of claims 5 to 8,
    comprising an operation member that, in synchronization with receiving a photographer's operation, commands the imaging unit to perform the imaging operation for a still image,
    The processing of the counting unit and the shooting command unit is performed when the electronic camera is set to “shooting mode in which the imaging operation for still images is performed at a timing independent of the operation on the operation member”. An electronic camera characterized by that.
  10. An electronic camera system capable of controlling the timing at which the electronic camera images a subject,
    A detection unit that performs detection processing for detecting the position of the face area of the subject;
    a timing determination unit that repeatedly causes the detection unit to perform the detection process, determines each time the detection process is performed whether or not the amount of movement of the position of the face area is within a predetermined range, based on a comparison with the previously detected position of the face area, and instructs the electronic camera to perform the imaging when it determines that the amount of movement is within the predetermined range.
  11. An electronic camera system capable of controlling the timing at which the electronic camera images a subject,
    A counting unit that performs a counting process of counting the number of subjects by detecting each of the subjects;
    a shooting command unit that repeatedly causes the counting unit to perform the counting process, determines whether or not the number of subjects has increased each time the counting process is performed, and, after determining that the number has increased, commands the electronic camera to perform the imaging.
JP2004179983A 2004-06-17 2004-06-17 Electronic camera and electronic camera system Active JP4442330B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004179983A JP4442330B2 (en) 2004-06-17 2004-06-17 Electronic camera and electronic camera system


Publications (2)

Publication Number Publication Date
JP2006005662A true JP2006005662A (en) 2006-01-05
JP4442330B2 JP4442330B2 (en) 2010-03-31

Family

ID=35773669

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004179983A Active JP4442330B2 (en) 2004-06-17 2004-06-17 Electronic camera and electronic camera system

Country Status (1)

Country Link
JP (1) JP4442330B2 (en)

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006174105A (en) * 2004-12-16 2006-06-29 Casio Comput Co Ltd Electronic camera and program
JP2006254358A (en) * 2005-03-14 2006-09-21 Omron Corp Imaging apparatus and method of timer photographing
JP2007208757A (en) * 2006-02-03 2007-08-16 Casio Comput Co Ltd Camera system, and camera control program
JP2007251324A (en) * 2006-03-14 2007-09-27 Hitachi Kokusai Electric Inc Recording method
JP2007267309A (en) * 2006-03-30 2007-10-11 Sanyo Electric Co Ltd Electronic camera
JP2007318225A (en) * 2006-05-23 2007-12-06 Fujifilm Corp Photographing apparatus and photographing method
JP2007329602A (en) * 2006-06-07 2007-12-20 Casio Comput Co Ltd Imaging apparatus, photographing method, and photographing program
JP2008004061A (en) * 2006-05-26 2008-01-10 Fujifilm Corp Object image detection system, matching decision apparatus and sorting apparatus for object image section, and control method thereof
JP2008160280A (en) * 2006-12-21 2008-07-10 Sanyo Electric Co Ltd Imaging apparatus and automatic imaging method
JP2008193411A (en) * 2007-02-05 2008-08-21 Fujifilm Corp Photographing controller, photographing device, and photographing control method
JP2008244976A (en) * 2007-03-28 2008-10-09 Casio Comput Co Ltd Imaging device, and method and program for recording photographic image
JP2008245055A (en) * 2007-03-28 2008-10-09 Fujifilm Corp Image display device, photographing device, and image display method
WO2008131823A1 (en) * 2007-04-30 2008-11-06 Fotonation Vision Limited Method and apparatus for automatically controlling the decisive moment for an image acquisition device
JP2008271132A (en) * 2007-04-19 2008-11-06 Matsushita Electric Ind Co Ltd Imaging apparatus and imaging method
WO2009053863A1 (en) * 2007-10-26 2009-04-30 Sony Ericsson Mobile Communications Ab Automatic timing of a photographic shot
JP2009239923A (en) * 2006-07-25 2009-10-15 Fujifilm Corp Imaging device
JP2010028353A (en) * 2008-07-17 2010-02-04 Canon Inc Image capturing apparatus, imaging method, program and recording medium
JP2010028354A (en) * 2008-07-17 2010-02-04 Canon Inc Imaging apparatus, imaging method, program, and recording medium
JP2010041399A (en) * 2008-08-05 2010-02-18 Canon Inc Imaging device and its control method
JP2010041598A (en) * 2008-08-07 2010-02-18 Canon Inc Imaging apparatus, and control method and control program for the same
EP1855464A3 (en) * 2006-05-12 2010-06-23 FUJIFILM Corporation Method for displaying face detection frame, method for displaying character information, and image-taking device
CN101867718A (en) * 2010-02-26 2010-10-20 深圳市同洲电子股份有限公司 Method and device for taking picture automatically
JP2011101333A (en) * 2009-10-06 2011-05-19 Nec Casio Mobile Communications Ltd Electronic apparatus, image processing method and program
US8004573B2 (en) 2008-06-26 2011-08-23 Casio Computer Co., Ltd. Imaging apparatus, imaged picture recording method, and storage medium storing computer program
US8144205B2 (en) 2007-01-30 2012-03-27 Sanyo Electric Co., Ltd. Electronic camera with feature image recognition
US8284264B2 (en) 2006-09-19 2012-10-09 Fujifilm Corporation Imaging apparatus, method, and program
US8477993B2 (en) 2007-09-28 2013-07-02 Fujifilm Corporation Image taking apparatus and image taking method
US8498446B2 (en) 2003-06-26 2013-07-30 DigitalOptics Corporation Europe Limited Method of improving orientation and color balance of digital images using face detection information
JP2013225876A (en) * 2013-06-04 2013-10-31 Casio Comput Co Ltd Image specifying device, and image specifying program
US8948468B2 (en) 2003-06-26 2015-02-03 Fotonation Limited Modification of viewing parameters for digital images using face detection information
US9007480B2 (en) 2008-07-30 2015-04-14 Fotonation Limited Automatic face and skin beautification using face detection
CN105744165A (en) * 2016-02-25 2016-07-06 深圳天珑无线科技有限公司 Photographing method and device, and terminal
US9648231B2 (en) 2014-06-30 2017-05-09 Canon Kabushiki Kaisha Image pickup apparatus having plurality of image pickup units, control method therefor, and storage medium
US9692964B2 (en) 2003-06-26 2017-06-27 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US10032068B2 (en) 2009-10-02 2018-07-24 Fotonation Limited Method of making a digital camera image of a first scene with a superimposed second scene

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9692964B2 (en) 2003-06-26 2017-06-27 Fotonation Limited Modification of post-viewing parameters for digital images using image region or feature information
US8948468B2 (en) 2003-06-26 2015-02-03 Fotonation Limited Modification of viewing parameters for digital images using face detection information
US8498446B2 (en) 2003-06-26 2013-07-30 DigitalOptics Corporation Europe Limited Method of improving orientation and color balance of digital images using face detection information
JP2006174105A (en) * 2004-12-16 2006-06-29 Casio Comput Co Ltd Electronic camera and program
JP4639869B2 (en) * 2005-03-14 2011-02-23 オムロン株式会社 Imaging apparatus and timer photographing method
JP2006254358A (en) * 2005-03-14 2006-09-21 Omron Corp Imaging apparatus and method of timer photographing
JP2007208757A (en) * 2006-02-03 2007-08-16 Casio Comput Co Ltd Camera system, and camera control program
JP2007251324A (en) * 2006-03-14 2007-09-27 Hitachi Kokusai Electric Inc Recording method
JP4551347B2 (en) * 2006-03-14 2010-09-29 株式会社日立国際電気 Surveillance image recording apparatus and recording method thereof.
JP2007267309A (en) * 2006-03-30 2007-10-11 Sanyo Electric Co Ltd Electronic camera
US8073207B2 (en) 2006-05-12 2011-12-06 Fujifilm Corporation Method for displaying face detection frame, method for displaying character information, and image-taking device
EP1855464A3 (en) * 2006-05-12 2010-06-23 FUJIFILM Corporation Method for displaying face detection frame, method for displaying character information, and image-taking device
JP2007318225A (en) * 2006-05-23 2007-12-06 Fujifilm Corp Photographing apparatus and photographing method
JP4654974B2 (en) * 2006-05-23 2011-03-23 富士フイルム株式会社 Imaging apparatus and imaging method
JP2008004061A (en) * 2006-05-26 2008-01-10 Fujifilm Corp Object image detection system, matching decision apparatus and sorting apparatus for object image section, and control method thereof
JP4577275B2 (en) * 2006-06-07 2010-11-10 カシオ計算機株式会社 Imaging apparatus, image recording method, and program
JP2007329602A (en) * 2006-06-07 2007-12-20 Casio Comput Co Ltd Imaging apparatus, photographing method, and photographing program
JP2009239923A (en) * 2006-07-25 2009-10-15 Fujifilm Corp Imaging device
US8284264B2 (en) 2006-09-19 2012-10-09 Fujifilm Corporation Imaging apparatus, method, and program
JP2008160280A (en) * 2006-12-21 2008-07-10 Sanyo Electric Co Ltd Imaging apparatus and automatic imaging method
US8144205B2 (en) 2007-01-30 2012-03-27 Sanyo Electric Co., Ltd. Electronic camera with feature image recognition
US8150208B2 (en) 2007-02-05 2012-04-03 Fujifilm Corporation Image pickup apparatus having stability checker for specific object feature value, and program and method for control of image pickup including checking stability of specific object feature value
JP2008193411A (en) * 2007-02-05 2008-08-21 Fujifilm Corp Photographing controller, photographing device, and photographing control method
JP2008244976A (en) * 2007-03-28 2008-10-09 Casio Comput Co Ltd Imaging device, and method and program for recording photographic image
JP2008245055A (en) * 2007-03-28 2008-10-09 Fujifilm Corp Image display device, photographing device, and image display method
US8131142B2 (en) 2007-04-19 2012-03-06 Panasonic Corporation Imaging apparatus
JP2008271132A (en) * 2007-04-19 2008-11-06 Matsushita Electric Ind Co Ltd Imaging apparatus and imaging method
US8391704B2 (en) 2007-04-19 2013-03-05 Panasonic Corporation Imaging apparatus
WO2008131823A1 (en) * 2007-04-30 2008-11-06 Fotonation Vision Limited Method and apparatus for automatically controlling the decisive moment for an image acquisition device
US8477993B2 (en) 2007-09-28 2013-07-02 Fujifilm Corporation Image taking apparatus and image taking method
WO2009053863A1 (en) * 2007-10-26 2009-04-30 Sony Ericsson Mobile Communications Ab Automatic timing of a photographic shot
US8004573B2 (en) 2008-06-26 2011-08-23 Casio Computer Co., Ltd. Imaging apparatus, imaged picture recording method, and storage medium storing computer program
JP2010028353A (en) * 2008-07-17 2010-02-04 Canon Inc Image capturing apparatus, imaging method, program and recording medium
US8269851B2 (en) 2008-07-17 2012-09-18 Canon Kabushiki Kaisha Image pickup device and image pickup method to set image capturing condition
JP2010028354A (en) * 2008-07-17 2010-02-04 Canon Inc Imaging apparatus, imaging method, program, and recording medium
US9007480B2 (en) 2008-07-30 2015-04-14 Fotonation Limited Automatic face and skin beautification using face detection
JP2010041399A (en) * 2008-08-05 2010-02-18 Canon Inc Imaging device and its control method
JP2010041598A (en) * 2008-08-07 2010-02-18 Canon Inc Imaging apparatus, and control method and control program for the same
US10032068B2 (en) 2009-10-02 2018-07-24 Fotonation Limited Method of making a digital camera image of a first scene with a superimposed second scene
JP2011101333A (en) * 2009-10-06 2011-05-19 Nec Casio Mobile Communications Ltd Electronic apparatus, image processing method and program
CN101867718A (en) * 2010-02-26 2010-10-20 深圳市同洲电子股份有限公司 Method and device for taking picture automatically
JP2013225876A (en) * 2013-06-04 2013-10-31 Casio Comput Co Ltd Image specifying device, and image specifying program
US9648231B2 (en) 2014-06-30 2017-05-09 Canon Kabushiki Kaisha Image pickup apparatus having plurality of image pickup units, control method therefor, and storage medium
CN105744165A (en) * 2016-02-25 2016-07-06 深圳天珑无线科技有限公司 Photographing method and device, and terminal

Also Published As

Publication number Publication date
JP4442330B2 (en) 2010-03-31

Similar Documents

Publication Publication Date Title
JP4324170B2 (en) Imaging apparatus and display control method
US7920187B2 (en) Image pickup device that identifies portions of a face
US8649574B2 (en) Imaging apparatus, control method of imaging apparatus, and computer program
JP4904243B2 (en) Imaging apparatus and imaging control method
US8194140B2 (en) Image pickup apparatus for performing a desirable self timer shooting and an automatic shooting method using the same
JP2007318225A (en) Photographing apparatus and photographing method
US20080131107A1 (en) Photographing apparatus
JP4518131B2 (en) Imaging method and apparatus
US7868917B2 (en) Imaging device with moving object prediction notification
JP2005318554A (en) Imaging device, control method thereof, program, and storage medium
JP2011004340A (en) Imaging apparatus and control method therefor
CN101399916B (en) Image taking apparatus and image taking method
JP2005128437A (en) Photographing device
JP2007028536A (en) Digital camera
JP2009089077A (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
US7916182B2 (en) Imaging device and method which performs face recognition during a timer delay
US20080273097A1 (en) Image capturing device, image capturing method and controlling program
US20050012833A1 (en) Image capturing apparatus
JP4761146B2 (en) Imaging apparatus and program thereof
US8786760B2 (en) Digital photographing apparatus and method using face recognition function
JP2004206688A (en) Face recognition method, face image cutting out method, and imaging apparatus
US8199221B2 (en) Image recording apparatus, image recording method, image processing apparatus, image processing method, and program
JP4135100B2 (en) Imaging device
US8212895B2 (en) Digital camera system with portrait effect
JP2008289095A (en) Imaging apparatus, display method and program

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20070612

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20090910

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20090929

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20091127

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20091222

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20100104

R150 Certificate of patent or registration of utility model

Ref document number: 4442330

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130122

Year of fee payment: 3

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140122

Year of fee payment: 4

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250