JP4970716B2 - Electronic camera - Google Patents

Electronic camera

Info

Publication number
JP4970716B2
Authority
JP
Japan
Prior art keywords
subject
instruction
unit
electronic camera
composition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2004254396A
Other languages
Japanese (ja)
Other versions
JP2006074368A (en)
Inventor
Minoru Kato (加藤 稔)
Original Assignee
Nikon Corporation (株式会社ニコン)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corporation (株式会社ニコン)
Priority to JP2004254396A
Publication of JP2006074368A
Application granted
Publication of JP4970716B2
Application status: Active

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222 Studio circuitry; studio devices; studio equipment; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/225 Television cameras; cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N5/232 Devices for controlling television cameras, e.g. remote control; control of cameras comprising an electronic image sensor
    • H04N5/23222Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Description

  The present invention relates to an electronic camera capable of displaying a composition assist frame.

  Conventionally, an electronic camera that displays a composition assist frame is known, which makes it easier for the photographer to determine the shooting composition (see, for example, Patent Document 1). One such composition assist frame places the frame that accommodates the subject person off-center, toward either the left or the right of the image. Because a space then extends from the opposite side of the person to the center of the image, a landscape can be effectively accommodated in that space. When this composition assist frame is used, a calm atmosphere is often produced by directing the subject's face toward the center of the image, that is, toward the landscape side.

In general, the composition assist frame is displayed on the liquid crystal display on the back of the electronic camera, superimposed on a through image of the subject. At the time of shooting, the orientation of the electronic camera, the zoom position of the shooting lens, the distance between the subject and the electronic camera, and so on are adjusted so that the position and size of the subject in the displayed image match the composition assist frame, and the photograph is taken once the subject image is aligned with the frame.
Japanese Patent Laid-Open No. 2002-314874 (Section 6-7, FIGS. 4 to 5)

  When the composition assist frame is used as described above, the photographer shoots while looking at the liquid crystal display, but the display may be difficult to see in sunlight. Under such shooting conditions the photographer can still confirm whether or not the subject person is within the composition assist frame, but it is difficult to confirm details such as the orientation of the subject person's face. For this reason, when using an off-center composition assist frame as described above, the photographer may take the picture without noticing that the subject person's face is not turned toward the landscape. It has therefore been desired to bring the captured image as close as possible to the composition intended by the photographer by a simple method.

  An object of the present invention is to provide a technique that, in an electronic camera capable of displaying a composition assist frame, brings the captured image closer to the composition intended by the photographer even under conditions where the display device is difficult to see.

The electronic camera according to claim 1 comprises: a display unit that performs a guide display for determining a shooting composition; an imaging unit that captures an image of a subject and generates image data; an extraction unit that extracts a characteristic part of the subject from the image data; and an instruction unit, arranged on the front side of the apparatus, that outputs an instruction for bringing the subject closer to the shooting composition indicated by the guide display. The display unit performs the guide display based on a predetermined frame, and the predetermined frame is stored in a program memory in association with auxiliary information indicating the subject state desired with respect to that frame. Based on the extracted characteristic part of the subject and the auxiliary information, the instruction unit outputs the instruction so as to be visually recognizable by the subject, thereby bringing the subject closer to the shooting composition indicated by the guide display.

The invention of claim 2 is the electronic camera according to claim 1, characterized by the following points. First, the extraction unit extracts the face region of the subject. Second, based on the extracted face region, the instruction unit estimates, as the subject state, the direction of the subject's line of sight in the image indicated by the image data, and outputs an instruction for bringing that subject state closer to the shooting composition.

According to a third aspect of the present invention, in the electronic camera according to the first or second aspect, the instruction unit includes a first output unit that outputs a voice instruction to the photographer and a second output unit that outputs a voice instruction to the subject.

  According to a fourth aspect of the present invention, the electronic camera according to any one of the first to third aspects is characterized by the following points. First, it includes an operation determination unit that determines, based on the extracted characteristic part of the subject, whether or not the subject matches the shooting composition, and that, after determining a match, determines, again based on the extracted characteristic part, whether or not the subject has performed a shooting cue operation. Second, shooting is started after a predetermined time has elapsed from when the operation determination unit determines that the shooting cue operation has been performed.

  In the electronic camera of the present invention, the display unit performs a guide display for determining the shooting composition, and the extraction unit extracts a region of the subject from the image data obtained by imaging. The instruction unit outputs an instruction for bringing the subject closer to the shooting composition based on the extracted subject region. Therefore, even when the guide display is difficult to see, the subject or the photographer need only follow this instruction to bring the captured image close to the composition intended by the photographer.

Hereinafter, embodiments of the present invention will be described with reference to the drawings.
<Configuration of this embodiment>
FIG. 1 is a block diagram of the electronic camera in the present embodiment. As shown in the figure, the electronic camera 12 includes an interchangeable photographing lens 14, a focal plane shutter 16, a CCD (imaging device) 18, an analog signal processing unit 20, an A/D conversion unit 22, a timing generator 26, an aperture drive mechanism 28, a shutter control unit 30, a lens drive unit 32, an instruction unit 34, an MPU (Micro Processor Unit) 38, an operation unit 40, a system bus 44, an image processing unit 46, an image memory 48, a program memory 50, a monitor control unit 54, a liquid crystal monitor 56, a card interface 58, and a replaceable memory card (recording medium) 60.

  The photographing lens 14 is a zoom lens having a lens group 66 and a diaphragm 68, and can adjust the focal length (that is, the angle of view) without shifting the focus position. The lens driving unit 32 moves the lens group 66 in the direction of the optical axis in accordance with a command from the MPU 38 to adjust the focal length. The aperture driving mechanism 28 adjusts the amount of light transmitted through the photographing lens 14 by adjusting the aperture 68 in accordance with a command from the MPU 38. The shutter control unit 30 controls the travel of the front curtain and rear curtain (not shown) of the focal plane shutter 16 in accordance with a command from the MPU 38. The analog signal processing unit 20 and the A / D conversion unit 22 perform clamp processing, sensitivity correction processing, A / D conversion, and the like on the pixel output from the CCD 18 to generate digital image data. The MPU 38 performs system control of the electronic camera 12 via the system bus 44. The operation unit 40 includes a set of buttons such as a power button, a shooting mode selection button, and a release button (not shown).

  The electronic camera 12 of the present embodiment is mainly characterized by having three shooting modes of a scene assist + face recognition mode, a first self shooting mode, and a second self shooting mode in addition to the conventional shooting mode. Hereinafter, these three shooting modes will be described. In the scene assist + face recognition mode, a composition assist frame is displayed, and instructions such as movement are given to the photographer and the subject person. In the first and second self-photographing modes, a frame for composition assistance is displayed in the same manner, but it is assumed that the photographer himself is the subject. That is, when the first or second self-photographing mode is set and the photographer enters the photographing range, the photographer is instructed to move or the like by a guide display or the like. Thereafter, the image is automatically taken by a self-timer.

  In these three shooting modes, the composition assist frame selected by the photographer is displayed on the liquid crystal monitor 56, superimposed on the subject image. FIG. 2 shows several examples of composition assist frames (dotted line portions) stored in advance in the program memory 50. FIG. 2(a) assumes that the subject person is on the right and the landscape occupies the area from the left to the center of the screen. FIG. 2(b) is its mirror image, with the subject person on the left. FIG. 2(c) is used as a guide for the placement of the face and upper body of the subject person when taking a portrait. FIG. 2(d) is for a two-shot.

  The program memory 50 stores the following supplementary information for each of the four composition assist frames. The supplementary information indicates which direction is desirable for the subject person's face and line of sight. The supplementary information for the frame of FIG. 2(a) indicates that the subject person's face and line of sight are preferably directed slightly to the left: because a space extends from the left side to the center of the image, a calm atmosphere can be produced by directing the subject person's face toward that space. The supplementary information for the frame of FIG. 2(b) indicates that the face and line of sight are preferably directed slightly to the right. The supplementary information for the frame of FIG. 2(c) indicates that the face and line of sight preferably face the front. The supplementary information for the frame of FIG. 2(d) indicates that the face and line of sight of the right subject person are preferably directed slightly to the left, and those of the left subject person slightly to the right, because facing each other produces a friendly atmosphere.
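As a hypothetical illustration (not from the patent), the association described above between assist frames and their supplementary information could be held as a simple lookup table. The frame identifiers and direction labels below are invented for the sketch:

```python
# Illustrative sketch of the program-memory association: each composition
# assist frame carries supplementary information giving the desirable face
# and line-of-sight direction. All names here are assumptions for the sketch.
SUPPLEMENTARY_INFO = {
    "person_right": {"face": "slightly_left",  "gaze": "slightly_left"},   # FIG. 2(a)
    "person_left":  {"face": "slightly_right", "gaze": "slightly_right"},  # FIG. 2(b)
    "portrait":     {"face": "front",          "gaze": "front"},           # FIG. 2(c)
    "two_shot": {   # FIG. 2(d): the two persons face each other
        "right_person": {"face": "slightly_left",  "gaze": "slightly_left"},
        "left_person":  {"face": "slightly_right", "gaze": "slightly_right"},
    },
}

def desired_directions(frame_id):
    """Look up the supplementary information for a selected assist frame."""
    return SUPPLEMENTARY_INFO[frame_id]
```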

  The instruction unit 34 instructs movement and the like so that the position of the subject person matches the composition assist frame and its supplementary information. The instruction is given by sound and by lighting or blinking light emitting units provided on the front of the camera body of the electronic camera 12. FIG. 3 is a front view of the electronic camera 12. As shown in the figure, the instruction unit 34 has, on the front, a front-side audio output window 72 and a first light emitting unit Sa, a second light emitting unit Sb, a third light emitting unit Sc, a fourth light emitting unit Sd, and a fifth light emitting unit Se. Although not shown in the figure, the instruction unit 34 also has a back-side audio output window 74 on the back of the camera body. The first to fourth light emitting units Sa to Sd are arranged above, below, and to the left and right of the fifth light emitting unit Se. The light emitting surfaces of the first to fourth light emitting units Sa to Sd are triangular, while that of the fifth light emitting unit Se is circular. The first to fifth light emitting units Sa to Se emit, for example, green light from light emitting diodes.

  FIG. 4 is a table showing the instruction contents of the first to fifth light emitting units Sa to Se. By blinking, the first to fourth light emitting units Sa to Sd instruct the subject person to move. When the first light emitting unit Sa blinks, it indicates a movement instruction toward the electronic camera 12 (an instruction to move forward). When the third light emitting unit Sc blinks, it indicates a movement instruction away from the electronic camera 12 (an instruction to move backward). When the second light emitting unit Sb blinks, an instruction to move to the right is given to the subject person looking at the front of the camera body. Note that left and right are reversed between the photographer looking at the liquid crystal monitor 56 on the back of the camera body and the subject person looking at the front of the camera body. When the fourth light emitting unit Sd blinks, an instruction to move to the left is given to the subject person looking at the front of the camera body.

  In addition, by lighting steadily, the first to fourth light emitting units Sa to Sd instruct the subject person about the face direction. When the first light emitting unit Sa is lit, it indicates an instruction to turn the face further upward; the third light emitting unit Sc, further downward; the second light emitting unit Sb, further to the right; and the fourth light emitting unit Sd, further to the left. The fifth light emitting unit Se is lit, for example in green, when the position of the subject person, the orientation of the subject person's face, and the direction of the subject person's line of sight all match the selected composition assist frame.
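The signalling convention of FIG. 4 can be summarised as a table. The following sketch is an invented encoding, not the patent's implementation; directions are as experienced by the subject person facing the front of the camera:

```python
# Sketch of the FIG. 4 instruction table: blinking conveys a movement
# instruction, steady lighting a face-direction instruction. The string
# labels are illustrative assumptions.
INSTRUCTIONS = {
    ("Sa", "blink"): "move forward (toward the camera)",
    ("Sc", "blink"): "move backward (away from the camera)",
    ("Sb", "blink"): "move right",
    ("Sd", "blink"): "move left",
    ("Sa", "lit"):   "turn face upward",
    ("Sc", "lit"):   "turn face downward",
    ("Sb", "lit"):   "turn face right",
    ("Sd", "lit"):   "turn face left",
    ("Se", "lit"):   "subject matches the assist frame",
}

def signal_meaning(unit, mode):
    """Decode a (light-emitting unit, mode) pair into its instruction."""
    return INSTRUCTIONS[(unit, mode)]
```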

<Description of operation of this embodiment>
FIGS. 5, 6, and 7 are flowcharts showing the operation of the electronic camera 12 of the present embodiment. FIG. 5 is the introductory part of the operation flow and mainly shows the operation in the scene assist + face recognition mode. FIG. 6 shows the operation after the first self-photographing mode is set in FIG. 5, and FIG. 7 shows the operation after the second self-photographing mode is set in FIG. 5. The operation of the electronic camera 12 is described below according to the step numbers shown in the figures.

  [Step S1] When the power button of the electronic camera 12 is turned on, a power-on process is performed. Thereafter, the user operates the button group of the operation unit 40 to set the shooting mode and the like. When the photographer sets any of the scene assist + face recognition mode, the first self-photographing mode, and the second self-photographing mode, the composition assisting frame can be selected at this time. Then, the process proceeds to step S2.

[Step S2] The MPU 38 determines whether or not the scene assist + face recognition mode is set. If the scene assist + face recognition mode is set, the process proceeds to step S6, and if not, the process proceeds to step S3.
[Step S3] The MPU 38 determines whether or not the first self-photographing mode is set. If the first self-photographing mode is set, the process proceeds to step S16 in FIG. 6; otherwise, the process proceeds to step S4.

[Step S4] The MPU 38 determines whether or not the second self-photographing mode is set. If the second self-photographing mode is set, the process proceeds to step S27 in FIG. 7, and if not, the process proceeds to step S5.
[Step S5] Shooting in the set shooting mode is performed, and the operation of the electronic camera ends. Since the operation in this case is publicly known, the description is omitted.

  [Step S6] The MPU 38 starts continuous shooting for through images at a preset frame rate (for example, 30 fps) by the electronic shutter operation of the CCD 18. A plurality of pieces of image data generated continuously are subjected to image processing such as color interpolation by the image processing unit 46, and then taken into the image memory 48 and displayed on the liquid crystal monitor 56 as a through image. At this time, the selected composition assisting frame is also displayed on the through image. Since the details of this operation are known, the description is omitted. Then, the process proceeds to step S7.

  [Step S7] After a predetermined time has elapsed since the start of continuous shooting for through images, the MPU 38 reads the latest of the plurality of image data generated for through images from the image memory 48. When step S7 is reached again via step S10 described later, the latest data may be read without waiting for the predetermined time to elapse. The MPU 38 then extracts the face region of the subject person in the image indicated by the read image data.

  Specifically, facial feature points such as the lips, eyes, and eyebrows are searched for within the face frame indicated by the selected composition assist frame; if they are not found, the search range is extended outside the frame. The position of the lips can be determined, for example, by detecting an area slightly redder than its surroundings. The position of the eyes can be determined, for example, by detecting a black round part corresponding to the pupil with white parts on both sides of it. The position of the eyebrows can be determined, for example, by detecting two areas slightly darker than their surroundings above the eyes. An area that contains these facial feature points and can be regarded as approximately skin-colored is then detected, thereby determining the position of the face in the image and the area of the face region. Then, the process proceeds to step S8.
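The colour heuristics above can be sketched in a few lines. This is a minimal illustration on a plain RGB pixel array, not the patent's algorithm; the thresholds and function names are assumptions:

```python
# Minimal sketch of the colour heuristics: lip candidates are noticeably
# redder than their surroundings, and the face region is an approximately
# skin-coloured area. Thresholds are illustrative assumptions.
def is_reddish(px):
    """Lip candidate: red clearly dominates green and blue."""
    r, g, b = px
    return r > 120 and r > g + 40 and r > b + 40

def is_skin(px):
    """Face-region candidate (very rough skin-colour test)."""
    r, g, b = px
    return r > 95 and g > 40 and b > 20 and r > g and r > b

def feature_mask(image, predicate):
    """Return the set of (row, col) pixels satisfying a colour predicate."""
    return {(y, x) for y, row in enumerate(image)
                   for x, px in enumerate(row) if predicate(px)}

def bounding_box(pixels):
    """The face position and area follow from the region's bounding box."""
    ys = [p[0] for p in pixels]
    xs = [p[1] for p in pixels]
    return (min(ys), min(xs), max(ys), max(xs))
```

Note that lip pixels also pass the rough skin test, which is acceptable here since the lips lie inside the face region.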

  [Step S8] The MPU 38 compares the selected composition assist frame and its supplementary information with the extracted face region. This comparison is made on four points: the face area, the face position, the direction of the line of sight, and the face orientation in the image. Hereinafter, these four points are collectively referred to as the subject state. The MPU 38 then determines whether or not the subject state matches the composition assist frame and its supplementary information; if they do not match, the MPU 38 determines how they deviate and transmits this to the instruction unit 34. This determination method is described below, taking as an example the case where the composition assist frame of FIG. 2(a) is selected. FIG. 8(a) shows an example of an image displayed on the liquid crystal monitor 56.

First, the MPU 38 determines the area of the face region. In the case of FIG. 8(a), the face region of the subject person in the image is smaller than the face frame indicated by the composition assist frame. Accordingly, the MPU 38 determines that the face area does not match and is too small.
Second, the MPU 38 determines the face position. In the case of FIG. 8(a), the face region of the subject person in the image is located to the right of the face frame indicated by the composition assist frame. Therefore, the MPU 38 determines that the face position does not match and is shifted to the right.
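The first two checks, area and position, amount to comparing two rectangles. The following sketch is an invented formulation under assumed normalised coordinates, with an illustrative tolerance:

```python
# Sketch of the area and position checks of step S8. Each box is
# (top, left, bottom, right) in normalised image coordinates. The
# tolerance value is an assumption for the sketch.
def compare_to_frame(face_box, frame_box, tol=0.1):
    """Compare the extracted face region with the assist frame's face frame."""
    f_area = (face_box[2] - face_box[0]) * (face_box[3] - face_box[1])
    t_area = (frame_box[2] - frame_box[0]) * (frame_box[3] - frame_box[1])
    result = []
    if f_area < t_area * (1 - tol):
        result.append("too_small")      # instruct: move the camera forward
    if f_area > t_area * (1 + tol):
        result.append("too_large")      # instruct: move the camera backward
    f_cx = (face_box[1] + face_box[3]) / 2   # horizontal centres
    t_cx = (frame_box[1] + frame_box[3]) / 2
    if f_cx > t_cx + tol:
        result.append("shifted_right")  # instruct: move the camera to the right
    if f_cx < t_cx - tol:
        result.append("shifted_left")
    return result or ["match"]
```

With a face that is smaller than the frame and lies to its right, as in FIG. 8(a), this yields both deviations at once.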

  Third, the MPU 38 determines the direction of the line of sight. FIG. 9 illustrates the determination method. The MPU 38 first compares the areas of the white regions (the whites of the eye) on both sides of the previously detected pupil. As shown in FIG. 9(a), when the areas of the white of the eye on the right and left sides of the pupil are equal, it is determined that the line of sight is directed neither to the right nor to the left. As shown in FIG. 9(b), if the area of the white of the eye on the right side of the pupil is smaller than that on the left, it is determined that the line of sight is directed to the right. As shown in FIG. 9(c), if the area on the left side of the pupil is smaller than that on the right, it is determined to be directed to the left. If the lower curve of the outer edge of the pupil is visible, it is determined that the line of sight is directed upward; if the upper curve is visible, downward. When, as in FIG. 9(a), the line of sight is directed neither to the right nor to the left and neither the upper nor the lower curve of the pupil's outer edge is visible, it is determined to be directed to the front. In the case of FIG. 8(a), the line of sight of the subject person is directed to the front, whereas the supplementary information of the composition assist frame states, as described above, that the face and line-of-sight directions are preferably slightly leftward. Therefore, the MPU 38 determines that the direction of the line of sight does not match and should be turned further to the left.
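The left-right part of this gaze estimate compares sclera areas on either side of the pupil. The sketch below is an invented formulation of that comparison, with an assumed tolerance:

```python
# Sketch of the FIG. 9 gaze estimate: the side of the pupil with less
# visible white is the side the eye is turned toward. The tolerance is
# an illustrative assumption.
def gaze_direction(white_left, white_right, tol=0.15):
    """Areas in pixels; left/right as seen in the captured image."""
    if white_left + white_right == 0:
        return "unknown"
    if white_right < white_left * (1 - tol):
        return "right"   # FIG. 9(b): less white on the right of the pupil
    if white_left < white_right * (1 - tol):
        return "left"    # FIG. 9(c): less white on the left of the pupil
    return "front"       # FIG. 9(a): roughly equal whites on both sides
```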

  Fourth, the MPU 38 determines the face orientation. FIG. 10 illustrates the determination method. The MPU 38 first compares the areas of the two eyes in the image. As shown in FIG. 10(a), when the area of the right eye is smaller than that of the left eye in the image, it is determined that the face is turned to the right. Conversely, as shown in FIG. 10(b), when the area of the left eye is smaller than that of the right eye, it is determined that the face is turned to the left. Whether the face is turned upward or downward may be determined, for example, to be the same as the previously determined direction of the line of sight. In the case of FIG. 8(a), the face of the subject person faces the front. Therefore, the MPU 38 determines that the face orientation does not match the supplementary information of the composition assist frame and should be turned further to the left.
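The face-orientation check has the same shape as the gaze check: the eye nearer the direction the face is turned appears smaller. A minimal sketch, with an assumed tolerance:

```python
# Sketch of the FIG. 10 face-orientation estimate: compare the apparent
# areas of the two eyes in the image. The tolerance is an illustrative
# assumption.
def face_direction(right_eye_area, left_eye_area, tol=0.15):
    """Eye areas in pixels; left/right as seen in the captured image."""
    if right_eye_area < left_eye_area * (1 - tol):
        return "right"   # FIG. 10(a): right eye appears smaller
    if left_eye_area < right_eye_area * (1 - tol):
        return "left"    # FIG. 10(b): left eye appears smaller
    return "front"
```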

FIG. 8(b) is a display example of the liquid crystal monitor 56 when the two-shot composition assist frame of FIG. 2(d) is selected. In the example of FIG. 8(b), it is determined that the subject state of each subject person matches the composition assist frame and its supplementary information. After determining the subject state as described above, the process proceeds to step S9.
[Step S9] If it is determined in step S8 that the composition assisting frame and its accompanying information match the subject state, the process proceeds to step S11. Otherwise, the process proceeds to step S10.

  [Step S10] The instruction unit 34 instructs the subject person and the photographer based on the determination result of the MPU 38 in step S8. The direction of the face is instructed to the subject person by the first to fourth light emitting units Sa to Sd. The direction of the line of sight is instructed to the subject person through the front side audio output window 72. The moving direction is instructed to the photographer through the back side audio output window 74.

When it is determined that the face area is too small, the instruction unit 34 emits, for example, a voice saying "Please move the camera forward" from the back-side audio output window 74, instructing the photographer to move forward. Similarly, when the face area is determined to be too large, a voice such as "Please move the camera backward" may be emitted.
When it is determined that the face region of the subject person is shifted to the right of the face portion of the composition assist frame, the instruction unit 34 emits, for example, a voice saying "Please move the camera to the right" from the back-side audio output window 74, instructing the photographer to move the electronic camera 12 to the right. Similarly, when the face region is determined to be shifted to the left, a voice such as "Please move the camera to the left" may be emitted.

  When it is determined that the line of sight should be directed further to the left, the instruction unit 34 emits a voice such as "Please turn your eyes to the right" from the front-side audio output window 72, instructing the subject person in the direction of the line of sight. Note that the directions the MPU 38 treats as left and right are those seen when viewing the display image on the liquid crystal monitor 56 on the back of the camera body. Accordingly, the left-right direction output as the determination result of the MPU 38 is the opposite of left and right for the subject person looking at the front of the camera body. The same applies to the face-direction instruction to the subject person described later in this step and to the movement instruction to the subject person in step S20.

  Similarly, when it is determined that the line of sight should be directed further upward, downward, or to the right, a voice may be emitted from the front-side audio output window 72 in the same manner. The line-of-sight instruction to the subject person could instead be given by making the first to fourth light emitting units Sa to Sd emit light in a color different from that of the movement instruction; for example, one of them could be lit in orange to indicate that the line of sight should be directed further upward, to the right, downward, or to the left. However, a voice instruction for the line-of-sight direction is considered preferable, because if the subject person turns the eyes to the right or left, the electronic camera 12 leaves the field of view and the light emitting units cannot be seen.

  When it is determined that the face should be turned further upward, the instruction unit 34 lights the first light emitting unit Sa, instructing the subject person to turn the face upward. When it is determined that the face should be turned further downward, the instruction unit 34 lights the third light emitting unit Sc, instructing the subject person to turn the face further downward. When it is determined that the face should be turned to the left, the instruction unit 34 lights the second light emitting unit Sb, instructing the subject person to turn the face further to the right. When it is determined that the face should be turned to the right, the instruction unit 34 lights the fourth light emitting unit Sd, instructing the subject person to turn the face further to the left. The photographer and the subject person then approach the shooting composition according to these instructions, and the process returns to step S6.
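The left-right reversal between the MPU's monitor-side determination and the instruction given to the subject facing the camera can be captured in a one-line mapping. This is an invented sketch of that convention, not the patent's code:

```python
# Sketch of the left/right mirroring: the MPU judges directions in monitor
# (photographer) coordinates, while the instruction shown to the subject,
# who faces the front of the camera, must be flipped horizontally.
MIRROR = {"left": "right", "right": "left", "up": "up", "down": "down"}

def instruction_for_subject(mpu_direction):
    """Convert an MPU determination into the direction told to the subject."""
    return MIRROR[mpu_direction]
```

For example, the determination "turn further to the left" becomes the voice instruction "Please turn your eyes to the right" when addressed to the subject.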

  Note that when the two-shot composition assist frame is selected, there are two subject persons, so it may be unclear to which subject person the instructions of the first to fourth light emitting units Sa to Sd are directed. In this case, all instructions may be given separately to the two subject persons by voice from the front-side audio output window 72, for example, "The person on the right, please move forward. The person on the left, please turn to the right." Alternatively, a second set of light emitting units similar to the first to fifth light emitting units Sa to Se may be provided on the front of the camera body so that the two subject persons can be instructed separately.

  [Step S11] The instructing unit 34 turns on the fifth light emitting unit Se to notify the subject person that the subject state matches the frame for assisting composition and the accompanying information. In addition, the instruction unit 34 emits a sound such as “Matched” from the back side audio output window 74 to notify the photographer that the subject state matches the frame for assisting composition and its accompanying information. Then, the process proceeds to step S12.

[Step S12] The MPU 38 determines whether or not the release button has been fully pressed. If it is fully pressed, the process proceeds to step S15, and if not, the process proceeds to step S13.
[Step S13] By the same operations as steps S6 to S9, the MPU 38 determines whether or not the subject state still matches the composition assist frame and its supplementary information; if not, how they deviate is transmitted to the instruction unit 34. If they match, the process returns to step S12; if they do not match, the process proceeds to step S14.

  [Step S14] The instruction unit 34 emits, for example, a sound such as "Shifted" simultaneously from both the front-side audio output window 72 and the back-side audio output window 74, informing the photographer and the subject person of the deviation. At the same time, the instruction unit 34 turns off the fifth light emitting unit Se. Thereafter, the MPU 38 transmits how the deviation has occurred to the instruction unit 34, and the process returns to step S10.

  [Step S15] Still image shooting is performed by a known operation, generating image data of a still image. The image processing unit 46 performs image processing such as white balance adjustment, color interpolation, color correction, contour enhancement, gamma correction, and image compression on the generated image data. Thereafter, the image data is recorded on the memory card 60 via the card interface 58. The operation of the electronic camera 12 then ends.
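The still-image processing of step S15 is an ordered sequence of stages. As a hypothetical sketch (the stage names follow the text; the functions themselves are placeholders, not the camera's implementation):

```python
# Sketch of the step S15 pipeline as an ordered list of stage names. The
# registry maps each name to a processing function; here these would be
# supplied by the caller, so this only illustrates the sequencing.
PIPELINE = [
    "white_balance",
    "color_interpolation",
    "color_correction",
    "contour_enhancement",
    "gamma_correction",
    "image_compression",
]

def process(image_data, stages, registry):
    """Apply each named stage to the image data, in order."""
    for name in stages:
        image_data = registry[name](image_data)
    return image_data
```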

[Step S16] Steps S16 to S26 in FIG. 6 are the processes in the first self-photographing mode. In step S1, the photographer sets the shooting mode to the first self-photographing mode and then moves into the shooting range of the electronic camera 12. Thereafter, in step S16, the electronic camera 12 performs the same process as in step S6. Then, the process proceeds to step S17.
[Steps S17 to S19] The electronic camera 12 performs the same processing as steps S7 to S9. If it is determined in step S18 that the composition assisting frame and its accompanying information match the subject state, the process proceeds to step S21. Otherwise, the process proceeds to step S20.

[Step S20] The instruction unit 34 instructs the subject person who is the photographer to move based on the determination result of the MPU 38 in step S18.
When it is determined that the face area is small, the instruction unit 34 instructs movement toward the electronic camera 12 (forward movement) by blinking the first light emitting unit Sa. When it is determined that the face area is large, the instruction unit 34 instructs movement away from the electronic camera 12 (backward movement) by blinking the third light emitting unit Sc. When it is determined that the face area of the subject person is shifted to the right with respect to the face portion of the composition assisting frame, the instruction unit 34 blinks the second light emitting unit Sb to instruct the subject person to "move to the right". Here, "move to the right" means movement in the direction that the MPU 38 processes as left, as described above. When it is determined that the face area is shifted to the left with respect to the face portion of the composition assisting frame, the instruction unit 34 blinks the fourth light emitting unit Sd to instruct movement to the left. The direction of the line of sight and the direction of the face are instructed in the same manner as in step S10. Then, the process returns to step S16.
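As a sketch, the mapping of step S20 from the MPU 38's determination result to the light emitting unit to blink could look as follows. The `Deviation` fields and the unit names as strings are illustrative assumptions; the embodiment describes hardware behavior, not code.

```python
from dataclasses import dataclass

@dataclass
class Deviation:
    """Hypothetical summary of the MPU's comparison against the frame."""
    face_too_small: bool = False  # subject should move toward the camera
    face_too_large: bool = False  # subject should move away from the camera
    shifted_right: bool = False   # face area right of the frame's face portion
    shifted_left: bool = False    # face area left of the frame's face portion

def leds_to_blink(d: Deviation) -> list:
    """Return the light emitting units to blink for a given deviation.

    Note the mirroring described in step S20: the subject's "move to the
    right" corresponds to the direction the MPU 38 processes as left.
    """
    leds = []
    if d.face_too_small:
        leds.append("Sa")  # blink Sa: forward movement
    if d.face_too_large:
        leds.append("Sc")  # blink Sc: backward movement
    if d.shifted_right:
        leds.append("Sb")  # blink Sb: "move to the right"
    if d.shifted_left:
        leds.append("Sd")  # blink Sd: "move to the left"
    return leds
```

A matching subject state produces an empty list, in which case the fifth light emitting unit Se would be lit instead (step S21).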

  [Step S21] The instruction unit 34 turns on the fifth light emitting unit Se to notify the subject person that the subject state matches the composition assisting frame and its accompanying information. The notification may also be given by emitting a sound such as "Matched" from the front-side audio output window 72. The MPU 38 starts counting the elapsed time, taking the time of the match notification as time zero. Then, the process proceeds to step S22.

[Step S22] The MPU 38 determines whether or not the subject state matches the composition assisting frame and the accompanying information by the same operation as in Steps S16 to S19. If they do not match, the process proceeds to step S23, and if they match, the process proceeds to step S24.
[Step S23] The instruction unit 34 emits a sound such as "Deviated" from the front-side audio output window 72 and turns off the fifth light emitting unit Se to notify the deviation. Thereafter, the MPU 38 transmits how the deviation occurred to the instruction unit 34, and the process returns to step S20.

[Step S24] The MPU 38 determines whether or not the elapsed time from the notification of the match has reached a predetermined time A. Here, the predetermined time A may be set by the photographer's input, or a value stored in advance in the program memory 50 (for example, 5 seconds) may be used. If the predetermined time has been reached, the process proceeds to step S25; if not, the process returns to step S22.
[Step S25] The MPU 38 starts the self-timer. The self-timer causes the process to proceed to step S26 after a predetermined time B has elapsed, so that shooting starts automatically. The predetermined time B may be set by the photographer's input, or a value stored in advance in the program memory 50 (for example, 3 seconds) may be used.
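The timing behavior of steps S21 to S26 (the subject state must stay matched for time A, after which the self-timer of time B runs and shooting is performed) can be sketched as follows. The callables `check_match` and `shoot` and the injectable `clock`/`sleep` parameters are illustrative assumptions; the embodiment specifies only the timing, not an API.

```python
import time

def self_portrait_loop(check_match, shoot, time_a=5.0, time_b=3.0,
                       tick=0.1, clock=time.monotonic, sleep=time.sleep):
    """Hold-then-shoot sequence of steps S21 to S26 (a sketch)."""
    matched_at = None
    while True:
        if check_match():
            if matched_at is None:
                matched_at = clock()            # step S21: match notified, time zero
            if clock() - matched_at >= time_a:  # step S24: match held for time A
                break
        else:
            matched_at = None                   # step S23: a deviation resets the count
        sleep(tick)
    deadline = clock() + time_b                 # step S25: self-timer of time B
    while clock() < deadline:
        sleep(tick)
    shoot()                                     # step S26: shoot the still image
```

Injecting `clock` and `sleep` keeps the sketch testable with a simulated clock instead of real delays.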

[Step S26] The electronic camera 12 shoots a still image by the same process as in step S15, and then ends the operation.
[Steps S27 to S30] Steps S27 to S35 in FIG. 7 are processes in the second self-photographing mode. In steps S27 to S30, the electronic camera 12 performs the same processing as steps S16 to S19. If it is determined in step S29 that the composition assisting frame and its accompanying information match the subject state, the process proceeds to step S32. Otherwise, the process proceeds to step S31.

[Step S31] Based on the determination result of the MPU 38 in step S29, the instruction unit 34 instructs the subject person who is the photographer to move, as in step S20.
[Step S32] As in step S21, the instruction unit 34 notifies the subject person that the subject state matches the composition assisting frame and its accompanying information. Then, the process proceeds to step S33.

  [Step S33] The processing of steps S33 to S35 is processing for determining whether or not the subject has performed a shooting cue operation. In the present embodiment, the operation of closing the eyes three times will be described as an example of the photographing cue operation. In step S33, the electronic camera 12 performs continuous shooting for through images at a predetermined frame rate, as in step S6. The MPU 38 starts counting the continuous shooting period in synchronization with the start of the continuous shooting.

[Step S34] When the continuous shooting period reaches a predetermined time C (for example, 3 seconds), the MPU 38 resets the count of the continuous shooting period, restarts the count, and proceeds to step S35. Note that continuous shooting itself continues even after the count is reset.
[Step S35] The MPU 38 reads the image data generated after the start of the continuous shooting period count from the image memory 48, extracts an image area for operation determination from this image data, and stores it in the image memory 48. Since the image area for operation determination here is used to detect the operation of closing the eyes three times, it is an area including both eyes detected in step S28; half of the entire face area is sufficient. This reading and extraction is performed for all image data generated after the start of the count. Note that when the processing of steps S34 and S35 is performed again, image data that has already been read is not read again. Further, the image data read here need not be all of the generated data; the amount may be set arbitrarily according to the processing speed of the MPU 38 and the like. For example, one frame out of every three generated frames may be read.

Next, the MPU 38 arranges the image areas in time order and determines whether or not the shooting cue operation has been performed. Specifically, for example, it may be determined whether the pupils and the whites of the eyes disappear and reappear three times. Since the skin color that covers the eyes when they are closed differs significantly from the color of the pupil or the white of the eye, this operation can be determined from the difference in color.
If it is determined that the shooting cue operation has not been performed, the process returns to step S34. Note that the generation of image data by continuous shooting and the counting of the continuous shooting period continue even during the operation determination processing. The predetermined time C is set sufficiently longer than the period required for this determination. If it is determined that the shooting cue operation has been performed, continuous shooting and the counting of the continuous shooting period are stopped, and the process proceeds to step S25. Thereafter, shooting is performed in the same manner as described above, and the operation of the electronic camera 12 ends. This concludes the description of the operation of the electronic camera 12 of the present embodiment.
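Once each eye region has been classified by color as "eyes open" (pupil and white visible) or "eyes closed" (skin color), the cue determination of steps S33 to S35 reduces to counting close-and-reopen cycles over the time-ordered frames. A sketch under that assumption, with the color classifier abstracted into boolean flags:

```python
def count_blinks(eyes_open_sequence):
    """Count completed close-and-reopen cycles in a time-ordered sequence.

    eyes_open_sequence: per-frame truthy values meaning "eyes open",
    as would be derived from the color difference described above.
    """
    blinks = 0
    prev_open = True
    closed_seen = False
    for open_now in eyes_open_sequence:
        if prev_open and not open_now:
            closed_seen = True   # eyes just closed
        if closed_seen and open_now:
            blinks += 1          # eyes reopened: one blink completed
            closed_seen = False
        prev_open = open_now
    return blinks

def is_shooting_cue(eyes_open_sequence, required=3):
    """True when the subject has closed and reopened the eyes enough times."""
    return count_blinks(eyes_open_sequence) >= required
```

The same counting logic would apply to the hand-waving variant mentioned in supplementary item [2], with the flags instead marking frames in which the eyes are hidden by the hand.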

<Effect of this embodiment>
In the conventional shooting mode using the composition assisting frame, face recognition of the subject person is not performed. For this reason, the photographer had to confirm details such as the direction of the face of the subject person and the direction of the line of sight by looking at the subject image on the liquid crystal display device. Therefore, when the liquid crystal display device is difficult to see, the composition intended by the photographer may differ greatly from the captured image.

  On the other hand, in the scene assist + face recognition mode, the first self-photographing mode, and the second self-photographing mode of the present embodiment, the face area of the subject person is extracted, and the area of the face, the position of the face, the direction of the line of sight, and the direction of the face are determined. Based on these, it is determined whether or not the composition assisting frame and its accompanying information match the subject state and, if they do not match, how they deviate. If they match, this is notified by voice or by lighting the fifth light emitting unit Se; if they do not match, an instruction is given as to what should be done.

  Therefore, even if the liquid crystal monitor 56 cannot be seen because of sunlight, at least one of the photographer and the subject person can follow the instructions of the electronic camera 12, so that shooting can be performed with the composition intended by the photographer. In addition, since detailed instructions such as the direction of the line of sight and the direction of the face are given, a photograph with the intended atmosphere can be taken. Furthermore, in the first and second self-photographing modes, the photographer can obtain the composition intended by himself or herself simply by following the instructions of the electronic camera 12. As a result, user convenience is greatly improved.

<Supplementary items of this embodiment>
[1] In the scene assist + face recognition mode, an example has been described in which the direction of movement is instructed to the photographer and the position of the subject person in the image is adjusted by moving the electronic camera 12. The present invention is not limited to such an embodiment. For example, the position of the subject person in the image may be adjusted by instructing the subject person in the direction of movement in the same manner as in step S20 and having the subject person move. In addition, the movement instruction to the photographer may be given by a guide display instead of voice; specifically, for example, units similar to the first to fifth light emitting units Sa to Se can be provided on the back side of the camera body. In the first and second self-photographing modes, an example has been described in which the moving direction and the face direction are instructed to the subject person by the first to fifth light emitting units Sa to Se, but these instructions may also be given by sound from the front-side audio output window 72.

  [2] As the shooting cue operation, an operation of closing the eyes three times has been described as an example. The present invention is not limited to such an embodiment. For example, an operation of waving the hand three times in front of the face may be used. In this case as well, when the images generated by continuous shooting are arranged in time order, frames in which the pupils and whites of the eyes are hidden by the hand can be detected. In general, an operation whose motion can be detected from a difference in color is preferable as the shooting cue operation, because such motion can be easily determined by image processing.

  [3] An example has been described in which, in step S7, the latest of the plurality of image data generated for the through image is read and processing such as face recognition is performed on it. The present invention is not limited to such an embodiment. The image data to be read may be set arbitrarily according to the processing speed of the MPU 38 and the like. For example, image data may be read at a rate of one frame per second, and the instruction content updated once per second.
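The frame thinning suggested here (run recognition on the most recent frame at most once per fixed interval, skipping the frames in between) can be sketched as follows; the `(timestamp, frame)` representation is an illustrative assumption.

```python
def throttled_updates(frames_with_times, interval=1.0):
    """Yield only the frames on which recognition would run.

    frames_with_times: iterable of (timestamp, frame) pairs in time order,
    standing in for the through images accumulating in the image memory.
    """
    next_due = None
    for ts, frame in frames_with_times:
        if next_due is None or ts >= next_due:
            yield ts, frame           # run face recognition on this frame
            next_due = ts + interval  # skip frames until the next interval
```

At a 10 fps through-image rate with a 1-second interval, roughly one frame in ten would be processed, reducing the load on the MPU 38 at the cost of slower instruction updates.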

  [4] An example has been described in which information stored in advance in the program memory 50 is used as the incidental information indicating which direction is desirable for the direction of the face and the direction of the line of sight of the subject person, and instructions are given accordingly. The present invention is not limited to such an embodiment. The incidental information of each composition assisting frame may be set by the photographer's input. A desired composition assisting frame may likewise be set by the photographer's input.

[5] Finally, the correspondence between the claims and the present embodiment will be described. In addition, the correspondence shown below is one interpretation shown for reference, and does not limit the present invention.
The guide display described in the claims corresponds to a frame for composition assistance.
The display unit described in the claims corresponds to the liquid crystal monitor 56.
The imaging unit described in the claims corresponds to the imaging lens 14, the focal plane shutter 16, the CCD 18, the analog signal processing unit 20, the A/D conversion unit 22, the timing generator 26, and the function of the MPU 38 that causes these to generate image data.

The extraction unit described in the claims corresponds to the function of the MPU 38 for extracting the face area of the subject person in the image data (see step S5).
The instruction unit described in the claims corresponds to the function of the MPU 38 that determines the area of the face, the position of the face, the direction of the line of sight, and the direction of the face, and determines whether or not these four points (the subject state) match the composition assisting frame and its accompanying information, together with the instruction unit 34 (see step S6).

The operation determination unit described in the claims corresponds to the function of the MPU 38 that determines whether or not the subject state matches the composition assisting frame and its accompanying information and, after determining that they match, determines whether or not the subject has performed a shooting cue operation (see steps S27 to S29, S34, and S35).
The predetermined time described in the claims corresponds to the predetermined time B (see step S25).

  As described above in detail, the present invention is highly applicable in the field of electronic cameras.

It is a block diagram of the electronic camera of this embodiment. It is an example of composition assisting frames stored in advance in the electronic camera. It is a front view of the electronic camera. It is a table showing the instruction content of the first to fifth light emitting units. It is a flowchart showing the operation of the electronic camera, mainly the operation in the scene assist + face recognition mode. It is a flowchart showing, continuing from FIG. 5, the operation after the first self-photographing mode is set. It is a flowchart showing, continuing from FIG. 5, the operation after the second self-photographing mode is set. It is an example of images displayed on the liquid crystal monitor, where (a) is an example when the composition assisting frame of FIG. 2A is selected and (b) is an example when the composition assisting frame of FIG. 2B is selected. It is an explanatory diagram showing the method of determining the direction of the line of sight. It is an explanatory diagram showing the method of determining the direction of the face.

Explanation of symbols

12 Electronic Camera 14 Shooting Lens 16 Focal Plane Shutter 18 CCD
20 Analog signal processing unit 22 A / D conversion unit 26 Timing generator 28 Aperture drive mechanism 30 Shutter control unit 32 Lens drive unit 34 Instruction unit 38 MPU
40 operation unit 44 system bus 46 image processing unit 48 image memory 50 program memory 54 monitor control unit 56 liquid crystal monitor 58 card interface 60 memory card 66 lens group 68 aperture 72 front side audio output window 74 rear side audio output window

Claims (4)

  1. A display unit for displaying a guide for determining a composition of shooting;
    An imaging unit that images a subject and generates image data;
    An extraction unit for extracting a characteristic part of the subject in the image data;
    An instruction unit that is arranged on the front side of the apparatus and outputs an instruction to bring the subject closer to the photographing composition indicated by the guide display;
    The display unit performs guide display based on a predetermined frame for performing the guide display,
    The predetermined frame is stored in the program memory in association with incidental information indicating a subject state with respect to the predetermined frame ,
    An electronic camera wherein the instruction unit outputs the instruction to the subject in a visually recognizable manner based on the extracted characteristic part of the subject and the incidental information, and outputs the instruction for bringing the subject closer to the photographing composition indicated by the guide display and the instruction based on the incidental information so that the two are distinguishable by the subject's visual recognition.
  2. The electronic camera according to claim 1.
    The extraction unit extracts a face area of the subject,
    An electronic camera wherein the instruction unit estimates, based on the extracted face area, the direction of the line of sight of the subject in the image indicated by the image data as the subject state, and outputs an instruction for bringing the subject state closer to the shooting composition.
  3. The electronic camera according to claim 1 or 2,
    An electronic camera wherein the instruction unit includes a first output unit that outputs a voice instruction to the subject, and a second output unit that outputs a voice instruction to the photographer.
  4. The electronic camera according to any one of claims 1 to 3,
    An operation determination unit that determines, based on the extracted characteristic part of the subject, whether or not the subject matches the shooting composition and, after determining that the subject matches, whether or not the subject has performed a shooting cue operation,
    An electronic camera, wherein shooting is started after a predetermined time has elapsed from when the operation determination unit determines that the shooting cue operation has been performed.