JP4553346B2 - Focus adjustment device and focus adjustment method - Google Patents

Focus adjustment device and focus adjustment method

Info

Publication number
JP4553346B2
JP4553346B2 (application JP2003361940A)
Authority
JP
Japan
Prior art keywords
focus
focus frame
eye
step
face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2003361940A
Other languages
Japanese (ja)
Other versions
JP2005128156A5 (en)
JP2005128156A (en)
Inventor
Kenji Takahashi (賢司 高橋)
Original Assignee
Canon Inc. (キヤノン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc.
Priority to JP2003361940A
Publication of JP2005128156A5
Publication of JP2005128156A
Application granted
Publication of JP4553346B2
Application status: Active


Description

  The present invention relates to a focus adjustment apparatus and a focus adjustment method, and more particularly to a focus adjustment apparatus and a focus adjustment method that are preferably used in an imaging apparatus.

  When photographing a person, focus should fundamentally be placed on the eyes. Even if the outline of the nose or cheeks is sharp, a photograph in which the eyes are not in focus gives a defocused impression.

  In a conventional imaging apparatus with a fixed focus frame, to take a photograph focused on the eyes with a composition such as that in FIG. 12A, the angle of view is first changed so that the eyes fall within a predetermined frame (focus frame 70), as shown in FIG. 12B; the shutter button is half-pressed to focus on the eyes; and, while the half-press is held to keep that focus position (AF lock), the shot is taken at the desired angle of view, as shown in FIG. 12C.

  To perform focus adjustment more accurately, imaging apparatuses have been proposed that perform focus adjustment by multi-point distance measurement using a plurality of focus frames 80, as shown in FIG. 13A.

  In addition, a method has been proposed in which focus is first placed on the eyes and the depth of field is then set so that a range of approximately 10 cm in front of and behind the in-focus distance is sharp, bringing the subject's entire face into focus (Patent Document 1).

Japanese Patent No. 3164692

  However, when changing the angle of view while holding the shutter button half-pressed as described above, the button may be released accidentally, or, conversely, pressed fully so that the shot is taken before the desired angle of view is reached. In such cases, focus adjustment must be started over, which lengthens the shooting time and can cause a photo opportunity to be missed.

  With an imaging device that supports multi-point distance measurement, the person who is the main subject is brought into focus, but if the person's eyes lie outside every distance-measurement point (focus frame), the eyes may still not be in focus. Furthermore, since the size of the focus frame cannot be set, when the focus frame selected from the plurality of focus frames contains a facial part other than the eyes, for example the nose, the camera focuses on that part instead, as shown in FIG. 13B.

  Also, the technique described in Patent Document 1 has the problem that, when a plurality of persons are photographed, only one person is in focus and the others are blurred.

  The present invention has been made in view of the above-described problems, and an object of the present invention is to make it easy to obtain a photographed image focused on the eyes when photographing a person.

To solve the above problems, the focus adjustment apparatus of the present invention comprises: an imaging unit that converts incident light into an electrical signal corresponding to the amount of light and outputs image data; a face detection unit that detects a face from a subject image based on the image data from the imaging unit; a first detection unit that detects eyes from the subject image; a focus frame setting unit that sets a focus frame indicating the position and size from which image data used for focusing is extracted; and a second detection unit that detects the in-focus position of the focus lens for one or more focus frames based on the image data in each focus frame. The focus frame setting unit sets focus frames on the eyes detected by the first detection unit and, when first and second faces are detected by the face detection unit, selects, based on the detection results of the second detection unit, one eye from the right and left eyes of the first face and one eye from the right and left eyes of the second face whose in-focus positions are close to each other, and resets the focus frames on the selected eyes.

Similarly, the focus adjustment method of the present invention comprises: an imaging step of converting incident light into an electrical signal corresponding to the amount of light and outputting image data; a face detection step of detecting a face from a subject image based on the image data obtained in the imaging step; a first detection step of detecting eyes from the subject image; a focus frame setting step of setting a focus frame indicating the position and size from which image data used for focusing is extracted; and a second detection step of detecting the in-focus position of the focus lens for one or more focus frames based on the image data in each focus frame. In the focus frame setting step, focus frames are set on the eyes detected in the first detection step and, when first and second faces are detected in the face detection step, one eye from the right and left eyes of the first face and one eye from the right and left eyes of the second face whose in-focus positions are close to each other are selected based on the detection results of the second detection step, and the focus frames are reset on the selected eyes.

  According to the above configuration, it is possible to easily obtain a photographed image focused on the eyes when photographing a person.

  The best mode for carrying out the present invention will be described below in detail with reference to the accompanying drawings. However, the dimensions, shapes, relative arrangements, and the like of the components exemplified in the embodiments should be changed as appropriate according to the configuration of the apparatus to which the present invention is applied and to various conditions; the present invention is not limited to the examples given.

<Embodiment 1>
Embodiment 1 of the present invention will be described below.

  FIG. 1 is a block diagram illustrating a simple configuration of the digital camera according to the first embodiment.

  An imaging unit 101 includes a lens system, an aperture, a shutter, a CCD, an A/D converter, and the like; the optical image projected onto the CCD by the lens system is converted into a digital signal and output. Reference numeral 102 denotes an exposure control unit, which performs exposure-related control such as photometry of the luminance of the screen, determination of the aperture value and shutter speed, and setting of the aperture and shutter. Reference numeral 103 denotes a face detection processing unit, 104 an AF control unit, 105 a focus frame display unit, and 106 a main exposure control unit. Reference numeral 107 denotes an image processing unit that applies known image processing, such as white balance correction and γ correction, to the image data obtained by the main exposure, and 108 denotes a data writing unit that converts the processed image data into a predetermined format and records it on a recording medium (not shown).

  Next, the photographing process of the digital camera having the above configuration according to the first embodiment will be described with reference to the flowcharts of FIGS. 2 and 3.

  When preparation for shooting is instructed, for example by half-pressing a shutter button (not shown), processing starts and the imaging unit 101 performs shooting for exposure control. The exposure control unit 102 performs exposure control using the image data obtained by this shooting (step S101). In step S102, the face detection processing unit 103 uses the exposure-control image data to detect whether a human face is present on the screen.

  The face detection processing unit 103 has a face detection function as described in, for example, JP-A-2002-251380.

  First, edges are extracted from the exposure-control image data using luminance information to detect shapes likely to be facial organs such as the eyes and mouth, and the surrounding contour line that encloses them is extracted to determine the face area. Within the determined face area, characteristic points (hereinafter, "feature points") are then detected in detail using luminance information. From the feature points, more detailed shapes and coordinates are obtained for the organs constituting the face, such as both eyes and the nose, and for the face outline. In this way, information such as the spacing of the eyes, the shapes of the nose, mouth, and outline, and their arrangement can be obtained.
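The coarse edge-extraction pass described above can be sketched as a simple luminance-gradient threshold. The function name, the neighbor-difference kernel, and the threshold below are illustrative assumptions, not taken from JP-A-2002-251380:

```python
def edge_map(luma, threshold=30):
    """Binary edge map from a 2-D luminance grid: mark pixels whose
    horizontal or vertical neighbor difference exceeds the threshold."""
    h, w = len(luma), len(luma[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            dx = abs(luma[y][x] - luma[y][x - 1]) if x > 0 else 0
            dy = abs(luma[y][x] - luma[y - 1][x]) if y > 0 else 0
            if max(dx, dy) > threshold:
                edges[y][x] = 1
    return edges
```

Connected regions of such an edge map would then be matched against eye- and mouth-like shapes, as the text describes.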

  If a face exists (YES in step S103), the face detection processing unit 103 obtains position and size information for the person's eyes in step S104. In step S105, the AF control unit 104 sets focus frames from this eye position and size information, and the process proceeds to step S107. Note that the size of each focus frame 120 varies with the size of the eye, as shown in FIG. 4. If only one face is present, one or two sets of eye coordinates and sizes are obtained (the right eye, the left eye, or both, in the case of landscape orientation); with more faces, correspondingly more sets are obtained.
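Setting a focus frame from a detected eye (step S105) amounts to centering a frame on the eye coordinates and scaling it with the eye size. A minimal sketch, in which the function name and the scale factor are illustrative assumptions:

```python
def focus_frame_for_eye(eye_x, eye_y, eye_width, scale=1.5):
    """Center a square focus frame on the eye, sized proportionally
    to the detected eye width (a larger eye yields a larger frame).
    Returns (left, top, width, height)."""
    side = eye_width * scale
    return (eye_x - side / 2, eye_y - side / 2, side, side)
```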

  On the other hand, if no face is detected in step S103, a predetermined focus frame is set in step S106, and the process proceeds to step S107.

  In step S107, the AF control unit 104 moves the focus lens of the optical system of the imaging unit 101 to the closest position. In step S108, for the image data acquired by the imaging unit 101 at that focus lens position, the AF control unit 104 applies high-frequency detection processing to the image data within each focus frame set by the face detection processing unit 103 to obtain a high-frequency signal, and then obtains, for each focus frame, the sum of the high-frequency signal (hereinafter, the "focus signal"). This focus-signal acquisition in step S108 is repeated while the focus lens is moved in predetermined step widths (step S110) from the closest distance to infinity, until the scan completes (YES in step S109).

  When focus signals have been acquired from the closest distance to infinity for every focus frame, the largest focus signal value is found for each frame, and the focus lens step position that produced it is taken as that frame's in-focus position (step S111). That is, when there are M focus frames on the screen, M in-focus positions are obtained. If there is only one focus frame (NO in step S112 in FIG. 3), its step position can be used directly as the in-focus position; when a plurality of focus frames are set (YES in step S112), the closest of the in-focus positions is taken as the final in-focus position (step S113).
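The scan of steps S107 to S113 is, in effect, a contrast-AF hill search: accumulate a high-frequency focus signal per frame at each lens step, keep the step with the maximum signal for each frame, and finally choose the closest of the per-frame winners. A minimal sketch under those assumptions (the names and the contrast measure are illustrative, not from the patent):

```python
def focus_signal(frame_pixels):
    """High-frequency content of one focus frame: sum of absolute
    horizontal neighbor differences (a simple contrast measure)."""
    total = 0
    for row in frame_pixels:
        total += sum(abs(a - b) for a, b in zip(row, row[1:]))
    return total

def scan_focus(frames_at_each_step):
    """frames_at_each_step[s][f] = pixel rows of focus frame f at lens
    step s (step 0 = closest, last step = infinity).
    Returns the best lens step per frame (step S111)."""
    num_frames = len(frames_at_each_step[0])
    best = [(-1, -1)] * num_frames  # (best signal, best step) per frame
    for step, frames in enumerate(frames_at_each_step):
        for f, pixels in enumerate(frames):
            sig = focus_signal(pixels)
            if sig > best[f][0]:
                best[f] = (sig, step)
    return [step for _, step in best]

def final_focus(best_steps):
    """Step S113: with several frames, pick the closest (smallest step)."""
    return min(best_steps)
```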

  The focus frame information corresponding to the final in-focus position determined by the AF control unit 104 is sent to the focus frame display unit 105 in step S114, and the focus frame is displayed on a display (not shown).

  From the displayed focus frame, the user confirms whether focus has been set at the desired point, that is, in the case of portrait shooting, whether focus is on the person's eyes. If it is (YES in step S115), shooting is instructed, for example by fully pressing the shutter button (not shown); if it is not (NO in step S115), the shutter button is released and the process ends without shooting.

  If the focus frame is at the desired position, the process proceeds to step S116: the final in-focus position information obtained by the AF control unit 104 is sent to the main exposure control unit 106, the focus lens is moved to that position, and the main exposure is performed by controlling the aperture and the mechanical shutter.

  As described above, according to the first embodiment, face detection determines the coordinates and size of the eyes, and the size and position of the focus frames are set accordingly, so a photograph of a person with the eyes in focus can easily be taken.

<Embodiment 2>
Next, Embodiment 2 of the present invention will be described below.

  FIG. 5 is a block diagram showing a schematic configuration of the digital camera according to the second embodiment. FIG. 5 shows a configuration in which a bracket shooting parameter determination unit 201 is added to the configuration shown in FIG. 1; the other components are the same as in FIG. 1.

  Hereinafter, the photographing process of the digital camera having the above configuration according to the second embodiment will be described with reference to the flowchart of FIG. 6. Note that the processing from step S101 to step S111 is the same as that described in the first embodiment with reference to FIG. 2.

When the in-focus position of each focus frame has been detected by the processing up to step S111, it is determined in step S201 whether the number of faces detected in step S102 is one. If it is, the in-focus positions of the right eye and the left eye have already been detected, and in step S202 an intermediate in-focus position between these two positions is obtained. With the left-eye in-focus position Fleft and the right-eye in-focus position Fright, the intermediate position Fmiddle is given by equation (1):
Fmiddle = (Fleft + Fright) / 2 (1)

  In step S203, focus frame information (position and size information) corresponding to the left eye and right eye is sent to the focus frame display unit 105, and the focus frame is displayed at the position of the left eye and the position of the right eye.

  Meanwhile, three pieces of in-focus position information, the left-eye position Fleft and right-eye position Fright determined by the AF control unit 104 and the intermediate position Fmiddle between them, are sent to the bracket shooting parameter determination unit 201. If all three in-focus positions differ, the bracket shooting parameter determination unit 201 determines that three focus-bracketed shots are needed, sets the number of shots to three, and sets the focus positions for the three shots to Fleft, Fmiddle, and Fright. If the right eye and left eye share the same in-focus position, it determines that a single shot suffices, sets the number of shots to one, and sets the focus position to Fleft (step S204).
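The single-face bracket logic of steps S202 to S204 reduces to computing the midpoint of equation (1) and collapsing duplicate positions. A sketch with illustrative names:

```python
def bracket_positions(f_left, f_right):
    """Return the focus positions for bracket shooting of one face.
    Equation (1): Fmiddle = (Fleft + Fright) / 2.
    If both eyes share a focal position, one shot suffices (step S204)."""
    if f_left == f_right:
        return [f_left]
    f_middle = (f_left + f_right) / 2
    # Three shots at the left-eye, middle, and right-eye positions;
    # duplicates collapse if any of them coincide.
    positions = []
    for p in (f_left, f_middle, f_right):
        if p not in positions:
            positions.append(p)
    return positions
```

The length of the returned list is the number of bracket shots.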

  Next, a case where a plurality of faces are detected (YES in step S205) will be described. Here, in order to make the explanation easy to understand, a case where two faces are detected will be described as a specific example, but the same processing is also performed when there are three or more faces.

In the processing up to step S111, the AF control unit 104 has already detected the in-focus position of each focus frame corresponding to the right and left eyes of each face. Let the in-focus positions of the first face's left and right eyes be F1left and F1right, and those of the second face be F2left and F2right. Next, in-focus positions close to each other are selected from among the left-eye and right-eye in-focus positions of each person (step S206). Here, suppose the left-eye position F1left is selected for the first person and the right-eye position F2right for the second. The in-focus positions F1left and F2right are sent to the bracket shooting parameter determination unit 201.
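The selection of step S206 can be sketched as a search over eye pairs for the smallest gap between in-focus positions; the data layout and names below are illustrative assumptions for the two-face case:

```python
from itertools import product

def select_close_eyes(face1_eyes, face2_eyes):
    """face*_eyes: dict mapping 'left'/'right' to that eye's in-focus
    position. Returns ((eye1, pos1), (eye2, pos2)), one eye per face,
    minimizing |pos1 - pos2| so both can share nearby focus settings."""
    best = None
    for (e1, p1), (e2, p2) in product(face1_eyes.items(), face2_eyes.items()):
        d = abs(p1 - p2)
        if best is None or d < best[0]:
            best = (d, (e1, p1), (e2, p2))
    return best[1], best[2]
```

With three or more faces the same pairwise comparison extends over all faces.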

  On the other hand, in step S207, focus frame information (position and size information) corresponding to the in-focus positions (in-focus positions F1left and F2right in the above example) selected in step S206 is sent to the focus frame display unit 105. A focus frame is displayed at each position.

In step S208, the bracket shooting parameter determination unit 201 sets the number of shots to two for the differing in-focus positions F1left and F2right, and sets the focus positions for the two shots to F1left and F2right.

  In step S209, the user checks from the displayed focus frames whether the points to be focused are the desired ones. If they are (YES in step S209), shooting is instructed, for example by fully pressing the shutter button (not shown); if they are not, for example if a frame has strayed from the eyes (NO in step S209), the shutter button is released and the process ends without shooting.

  If the focus frames are on the persons' eyes, the process proceeds to step S210, where shooting is performed according to the number of bracket shots and the focus positions determined by the bracket shooting parameter determination unit 201 in step S204 or step S208, moving the focus lens to each position in turn.

  If NO in step S205, that is, if no face has been detected, the processing from step S112 in FIG. 3 onward is performed.

  As described above, according to the second embodiment, face detection determines the coordinates and size of the eyes, and the number of bracket shots and the focus positions are determined from the respective in-focus positions, so a photograph of a person with the eyes in focus can easily be taken.

  In the second embodiment, when only one face is detected, three in-focus positions are used, the left-eye and right-eye in-focus positions and the intermediate position between them, to determine the bracket shooting conditions (the number of shots and the focus positions); however, the bracket shooting conditions may instead be obtained from just the two in-focus positions of the left eye and right eye.

  Also in the second embodiment, the bracket shooting parameter determination unit 201 reduces the number of shots when in-focus positions coincide, but shooting may instead be performed for all in-focus positions obtained by the AF control unit 104, without reducing the number of shots.

  Further, the timing of the focus frame display in steps S203 and S207 is not limited to that described above and can be changed as appropriate. In step S207, not only the focus frames of the in-focus positions selected in step S206 but also those of the positions before selection may be displayed.

<Embodiment 3>
Next, Embodiment 3 of the present invention will be described below.

  FIG. 7 is a block diagram showing a schematic configuration of the digital camera according to the third embodiment. FIG. 7 shows a configuration in which an AvTv determination unit 301 is added to the configuration shown in FIG. 1; the other components are the same as in FIG. 1.

Hereinafter, the photographing process of the digital camera having the above configuration according to the third embodiment will be described with reference to the flowchart of FIG. 8. Note that the processing from step S101 to step S111 is the same as that described in the first embodiment with reference to FIG. 2.
When the in-focus position of each focus frame has been detected by the processing up to step S111, it is determined in step S301 whether the number of faces detected in step S102 is one. If it is, with the left-eye in-focus position Fleft and the right-eye in-focus position Fright, those positions are sent to the AvTv determination unit 301, and the process proceeds to step S304: focus frame information (position and size) for the left eye and right eye is sent to the focus frame display unit 105, and focus frames are displayed at the positions of the left eye and the right eye.

  In step S305, the AvTv determination unit 301 obtains an aperture value (Av value) at which both in-focus positions Fleft and Fright fall within the depth of field; then, in step S306, the shutter speed (Tv value) is determined from this aperture value and the photometric value obtained by the exposure control unit 102. The determined Av and Tv values are sent to the main exposure control unit 106.

  Next, a case where a plurality of faces are detected (YES in step S302) will be described. Here, in order to make the explanation easy to understand, a case where two faces are detected will be described as a specific example, but the same processing is also performed when there are three or more faces.

  In the processing up to step S111, the AF control unit 104 has already detected the in-focus position of each focus frame corresponding to the right and left eyes of each face. Let the in-focus positions of the first face's left and right eyes be F1left and F1right, and those of the second face be F2left and F2right. Next, in-focus positions close to each other are selected from among the left-eye and right-eye in-focus positions of each person (step S303). Here, suppose the left-eye position F1left is selected for the first person and the right-eye position F2right for the second. The in-focus positions F1left and F2right are sent to the AvTv determination unit 301.

  On the other hand, in step S304, focus frame information (position and size information) corresponding to the in-focus positions (in-focus positions F1left and F2right in the above example) selected in step S303 is sent to the focus frame display unit 105. A focus frame is displayed at each position.

  In step S305, the AvTv determination unit 301 determines from the in-focus positions F1left and F2right an aperture value (Av value) that brings all in-focus positions within the depth of field (for example, so that a predetermined distance in front of and behind the positions F1left and F2right is substantially in focus). In step S306, the shutter speed (Tv value) is determined from this aperture value and the photometric value obtained by the exposure control unit 102. The determined Av and Tv values are sent to the main exposure control unit 106.
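The patent gives no formula for step S305, so the sketch below fills the gap with the standard thin-lens depth-of-field approximation: focus between the two subject distances and choose an aperture whose hyperfocal distance just covers both. The circle-of-confusion value, the stop table, and the use of subject distances in millimetres are all illustrative assumptions:

```python
def aperture_for_depth(d_near, d_far, focal_len, coc=0.03):
    """Smallest standard f-number whose depth of field spans
    [d_near, d_far] (all lengths in mm), under the thin-lens
    approximation: focus at s = 2*Dn*Df / (Dn + Df), require
    hyperfocal H = 2*Dn*Df / (Df - Dn), and N = f^2 / (H * c).
    Returns (focus distance, f-number)."""
    if d_near == d_far:
        return d_near, 0.0  # a single plane; any aperture covers it
    h_needed = 2 * d_near * d_far / (d_far - d_near)
    s = 2 * d_near * d_far / (d_near + d_far)
    n_min = focal_len ** 2 / (h_needed * coc)
    # Round up to the next standard stop so the span is fully covered.
    stops = [1.4, 2.0, 2.8, 4.0, 5.6, 8.0, 11.0, 16.0, 22.0]
    n = next((x for x in stops if x >= n_min), stops[-1])
    return s, n
```

For example, covering subjects at 2 m and 3 m with a 50 mm lens calls for roughly f/8 under these assumptions.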

  From the focus frames displayed in step S304, the user determines whether focus will be set at the desired points, that is, in the case of portrait shooting, on the persons' eyes. If so (YES in step S307), shooting is instructed by fully pressing the shutter button (not shown); if not (NO in step S307), the shutter button is released and the process ends without shooting.

  If focus is set at the desired points, the aperture value and shutter speed obtained in steps S305 and S306 are set in the main exposure control unit 106 and shooting is performed.

  If NO in step S302, that is, if no face has been detected, the processing from step S112 in FIG. 3 onward is performed.

  As described above, according to the third embodiment, by taking the depth of field into consideration, a photograph of several people with everyone's eyes in focus can easily be taken.

<Embodiment 4>
Next, Embodiment 4 of the present invention will be described below.

  FIG. 9 is a block diagram showing a schematic configuration of the digital camera according to the fourth embodiment. FIG. 9 shows a configuration in which a focus frame selection UI unit 401 is added to the configuration shown in FIG. 1; the other components are the same as in FIG. 1.

  Hereinafter, the photographing process of the digital camera having the above configuration according to the fourth embodiment will be described with reference to the flowchart of FIG. 10. Note that the processing from step S101 to step S111 is the same as that described in the first embodiment with reference to FIGS. 2 and 3. However, whereas in the first embodiment focus frames were set for the left eye and the right eye individually in step S104, in the fourth embodiment a single focus frame containing both the left eye and the right eye is set for each face.

  A feature of the fourth embodiment is the processing performed with the focus frame selection UI unit 401 when a plurality of faces are detected (YES in step S401). Here, for the sake of simplicity, a case where three faces are detected, as shown in FIG. 11, will be described. For convenience, let f1 be the in-focus position of the focus frame 1111 containing the left and right eyes of the first person 1101, f2 that of the focus frame 1112 containing the left and right eyes of the second person 1102, and f3 that of the focus frame 1113 containing the left and right eyes of the third person 1103.

  In step S402, the AF control unit 104 compares f1, f2, and f3 and obtains the closest in-focus position. The size and coordinates of the focus frame at that closest position are sent to the focus frame display unit 105, and the focus frame is displayed. When the in-focus position f1 is closest, the focus frame 1111 is displayed, as shown in FIG. 11A. The user checks whether the focus frame is on the eyes of the person to be focused (step S404); if it is, the user presses the shutter to take the picture (step S405). To focus on a different person instead, the user moves the focus frame to that person with the focus frame selection UI unit 401, a user interface using left and right buttons (not shown). If the in-focus positions are close in the order f1, f2, f3, the focus frame moves sequentially in that order, from FIG. 11A to FIG. 11B, in response to operations of the focus frame selection UI unit 401.

  In this way, the user sets the focus frame on the person to be focused with the focus frame selection UI unit 401 and presses the shutter to shoot.

  In step S407, it is determined whether the number of detected faces is one. If YES, the process proceeds to step S403; if NO, no face has been detected, so the process proceeds to step S112 in FIG. 3 and the processing described in the first embodiment is performed.

  As described above, according to the fourth embodiment, when a plurality of faces are detected, it is possible to easily capture a person who focuses on the eyes of a desired face.

  In the fourth embodiment, the case where three faces are detected on the screen was described as an example, but the present invention is not limited to this; needless to say, the same operation is possible whenever two or more faces are detected.

  In the fourth embodiment, the focus frame is moved in order of closest in-focus position in response to operations of the focus frame selection UI unit 401, but the present invention is not limited to this. Various applications are conceivable: for example, moving the focus frame in a different distance order, in descending order of the size of the face detected by the face detection processing unit 103, or in ascending order of the distance from the center of the screen to each face.
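Each of the cycling orders suggested above is just a different sort key over the detected faces; a sketch with hypothetical face records (the field names and modes are illustrative assumptions):

```python
import math

def frame_order(faces, mode, screen_center=(160, 120)):
    """faces: list of dicts with 'id', 'focus' (lens step, small = near),
    'size' (face area), and 'center' ((x, y) of the face).
    Returns face ids in the order the focus frame should cycle."""
    if mode == 'near_first':
        key = lambda f: f['focus']          # closest in-focus position first
    elif mode == 'large_first':
        key = lambda f: -f['size']          # biggest detected face first
    elif mode == 'center_first':
        cx, cy = screen_center              # nearest to screen center first
        key = lambda f: math.hypot(f['center'][0] - cx, f['center'][1] - cy)
    else:
        raise ValueError(mode)
    return [f['id'] for f in sorted(faces, key=key)]
```

The UI of step S406 would then simply step through the returned list on each left/right button press.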

<Other embodiments>
In the first to fourth embodiments, the case where the present invention is applied to a digital camera was described as an example, but the present invention is not limited to digital cameras; it can also be applied to a conventional silver-halide camera by providing a focus frame display unit in the finder. In this case, the imaging unit 101 is a two-dimensional image sensor comprising a solid-state imaging device that is used for exposure control and/or focusing rather than for the actual shot.

  Note that the present invention may be applied to a system composed of a plurality of devices (for example, a host computer, an interface device, and a camera head) or to an apparatus consisting of a single device (for example, a silver-halide camera, a digital still camera, or a digital video camera).

  Another object of the present invention can also be achieved by supplying a storage medium (or recording medium) on which the program code of software realizing the functions of the above embodiments is recorded to a system or apparatus, whose computer (CPU or MPU) then reads and executes the program code stored in the medium. In this case, the program code itself read from the storage medium realizes the functions of the above embodiments, and the storage medium storing the program code constitutes the present invention.

  Needless to say, the functions of the above embodiments are realized not only when the computer executes the read program code, but also when an operating system (OS) running on the computer performs part or all of the actual processing based on the instructions of the program code.

  Storage media for supplying the program code include flexible disks, hard disks, ROM, RAM, magnetic tape, nonvolatile memory cards, CD-ROM, CD-R, DVD, optical disks, magneto-optical disks, MO, and the like. A computer network such as a LAN (local area network) or WAN (wide area network) may also be used to supply the program code.

  Furthermore, it goes without saying that the functions of the above embodiments are also realized when the program code read from the storage medium is written into memory provided on a function expansion card inserted into the computer or in a function expansion unit connected to the computer, and a CPU or the like provided on the card or unit then performs part or all of the actual processing based on the instructions of the program code.

  When the present invention is applied to such a storage medium, the medium stores the program code corresponding to the flowcharts of FIGS. 2 and 3 and of FIG. 6, FIG. 8, or FIG. 10 described above.

A block diagram showing the schematic configuration of a digital camera according to Embodiment 1 of the present invention.
A flowchart showing the shooting process in Embodiment 1 of the present invention.
A diagram explaining the focus frame in Embodiment 1 of the present invention.
A block diagram showing the schematic configuration of a digital camera according to Embodiment 2 of the present invention.
A flowchart showing the shooting process in Embodiment 2 of the present invention.
A block diagram showing the schematic configuration of a digital camera according to Embodiment 3 of the present invention.
A flowchart showing the shooting process in Embodiment 3 of the present invention.
A block diagram showing the schematic configuration of a digital camera according to Embodiment 4 of the present invention.
A flowchart showing the shooting process in Embodiment 4 of the present invention.
A diagram explaining the movement of the focus frame in Embodiment 4 of the present invention.
A diagram explaining the AF-lock shooting procedure in a conventional camera.
A diagram explaining shooting with multi-point AF.

Explanation of symbols

DESCRIPTION OF SYMBOLS
101 Imaging unit
102 Exposure control unit
103 Face detection processing unit
104 AF control unit
105 Focus frame display unit
106 Main exposure control unit
107 Image processing unit
108 Data writing unit
201 Bracket imaging parameter setting unit
301 AvTv determination unit
401 Focus frame selection UI unit

Claims (9)

  1. A focus detection apparatus comprising:
    imaging means for converting incident light into an electrical signal corresponding to the amount of light and outputting image data;
    face detection means for detecting a face from a subject image based on the image data from the imaging means;
    first detection means for detecting an eye from the subject image;
    focus frame setting means for setting a focus frame indicating a position and a size for extracting image data used for focusing; and
    second detection means for detecting one or more in-focus positions of a focus lens based on the image data in the focus frame,
    wherein the focus frame setting means sets a focus frame on the eye detected by the first detection means and, when first and second faces are detected by the face detection means, selects, based on the detection result of the second detection means, eyes whose in-focus positions are close to each other from among the right eye and the left eye of the first face and the right eye and the left eye of the second face, and resets the focus frame on the selected eyes.
  2.   The focus detection apparatus according to claim 1, further comprising aperture determination means for determining an aperture value to be used for shooting, wherein the aperture determination means determines the aperture value so that the eyes selected by the focus frame setting means are included in the depth of field.
  3.   The focus detection apparatus according to claim 1, wherein the focus frame setting means includes a user interface and sets the focus frame in accordance with information input via the user interface.
  4.   The focus detection apparatus according to claim 1, further comprising display means for displaying the focus frame set by the focus frame setting means.
  5.   The focus detection apparatus according to claim 3, further comprising display means for displaying a focus frame in accordance with information input via the user interface.
  6.   The focus detection apparatus according to any one of claims 1 to 5, wherein, when a plurality of focus frames are set by the focus frame setting means, the second detection means detects one or more in-focus positions for each focus frame.
  7. A focus detection method comprising:
    an imaging step of converting incident light into an electrical signal corresponding to the amount of light and outputting image data;
    a face detection step of detecting a face from a subject image based on the image data obtained in the imaging step;
    a first detection step of detecting an eye from the subject image;
    a focus frame setting step of setting a focus frame indicating a position and a size for extracting image data used for focusing; and
    a second detection step of detecting one or more in-focus positions of a focus lens based on the image data in the focus frame,
    wherein, in the focus frame setting step, a focus frame is set on the eye detected in the first detection step and, when first and second faces are detected in the face detection step, eyes whose in-focus positions are close to each other are selected, based on the detection result of the second detection step, from among the right eye and the left eye of the first face and the right eye and the left eye of the second face, and the focus frame is reset on the selected eyes.
  8.   A program for causing a computer to execute each step of the focus detection method according to claim 7.
  9.   A computer-readable storage medium storing the program according to claim 8.
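The selection logic recited in claims 1 and 7 can be sketched as follows. This is an illustrative reading of the claim wording only, not code from the patent: the `Eye` structure, the numeric focus positions, and the pairwise-comparison strategy are all hypothetical assumptions made for the example. Given the in-focus position detected for each eye's focus frame, it picks one eye from each of the two detected faces such that the two in-focus positions are closest to each other (so that, per claim 2, both can then be covered by a single depth of field).

```python
from dataclasses import dataclass
from itertools import combinations

@dataclass
class Eye:
    face: int               # 1 or 2: which detected face the eye belongs to
    side: str               # "right" or "left"
    focus_position: float   # in-focus lens position detected for this eye's frame

def select_eyes(eyes):
    """Return the cross-face pair of eyes whose in-focus positions
    are closest to each other (illustration of the claim-1 selection)."""
    candidates = [(a, b) for a, b in combinations(eyes, 2) if a.face != b.face]
    return min(candidates,
               key=lambda p: abs(p[0].focus_position - p[1].focus_position))

# Hypothetical detection results for the four eyes of two faces.
eyes = [
    Eye(1, "right", 10.2), Eye(1, "left", 10.6),
    Eye(2, "right", 10.5), Eye(2, "left", 12.0),
]
a, b = select_eyes(eyes)
# The focus frames would then be reset on these two eyes; claim 2 would
# additionally choose an aperture so both fall inside the depth of field.
print(a.side, b.side)  # prints: left right
```

The brute-force pairwise comparison is fine here because at most four eyes are ever compared; the patent itself does not specify how the closeness comparison is implemented.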
JP2003361940A 2003-10-22 2003-10-22 Focus adjustment device and focus adjustment method Active JP4553346B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003361940A JP4553346B2 (en) 2003-10-22 2003-10-22 Focus adjustment device and focus adjustment method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003361940A JP4553346B2 (en) 2003-10-22 2003-10-22 Focus adjustment device and focus adjustment method

Publications (3)

Publication Number Publication Date
JP2005128156A5 JP2005128156A5 (en) 2005-05-19
JP2005128156A JP2005128156A (en) 2005-05-19
JP4553346B2 true JP4553346B2 (en) 2010-09-29

Family

ID=34641738

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003361940A Active JP4553346B2 (en) 2003-10-22 2003-10-22 Focus adjustment device and focus adjustment method

Country Status (1)

Country Link
JP (1) JP4553346B2 (en)

Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4574459B2 (en) * 2005-06-09 2010-11-04 キヤノン株式会社 Image capturing apparatus, control method therefor, program, and storage medium
JP4567538B2 (en) * 2005-06-22 2010-10-20 富士フイルム株式会社 Exposure amount calculation system, control method therefor, and control program therefor
JP4578334B2 (en) * 2005-06-22 2010-11-10 富士フイルム株式会社 In-focus position determining apparatus and method for imaging lens
JP2007065290A (en) * 2005-08-31 2007-03-15 Nikon Corp Automatic focusing device
JP4422667B2 (en) 2005-10-18 2010-02-24 富士フイルム株式会社 Imaging apparatus and imaging method
JP4579169B2 (en) 2006-02-27 2010-11-10 富士フイルム株式会社 Imaging condition setting method and imaging apparatus using the same
JP4657960B2 (en) * 2006-03-27 2011-03-23 富士フイルム株式会社 Imaging method and apparatus
JP5070728B2 (en) * 2006-04-04 2012-11-14 株式会社ニコン Camera
US8306280B2 (en) 2006-04-11 2012-11-06 Nikon Corporation Electronic camera and image processing apparatus
JP2007282118A (en) * 2006-04-11 2007-10-25 Nikon Corp Electronic camera and image processing apparatus
JP4182117B2 (en) 2006-05-10 2008-11-19 キヤノン株式会社 Imaging device, its control method, program, and storage medium
JP2007311861A (en) 2006-05-16 2007-11-29 Fujifilm Corp Photographic apparatus and method
JP4207980B2 (en) 2006-06-09 2009-01-14 ソニー株式会社 Imaging device, imaging device control method, and computer program
JP4127296B2 (en) 2006-06-09 2008-07-30 ソニー株式会社 Imaging device, imaging device control method, and computer program
JP5055018B2 (en) 2006-06-14 2012-10-24 キヤノン株式会社 Image processing apparatus, imaging apparatus, and control method for image processing apparatus
JP4802884B2 (en) * 2006-06-19 2011-10-26 カシオ計算機株式会社 Imaging apparatus, captured image recording method, and program
JP4629002B2 (en) * 2006-06-21 2011-02-09 三菱電機株式会社 Imaging device
JP4799366B2 (en) * 2006-10-27 2011-10-26 三洋電機株式会社 Imaging apparatus and image reproduction apparatus
JP4732303B2 (en) * 2006-11-02 2011-07-27 キヤノン株式会社 Imaging device
JP4254873B2 (en) 2007-02-16 2009-04-15 ソニー株式会社 Image processing apparatus, image processing method, imaging apparatus, and computer program
JP2008245055A (en) * 2007-03-28 2008-10-09 Fujifilm Corp Image display device, photographing device, and image display method
JP4732397B2 (en) * 2007-04-11 2011-07-27 富士フイルム株式会社 Imaging apparatus and focus control method thereof
JP5019939B2 (en) * 2007-04-19 2012-09-05 パナソニック株式会社 Imaging apparatus and imaging method
JP4782725B2 (en) 2007-05-10 2011-09-28 富士フイルム株式会社 Focusing device, method and program
JP4823964B2 (en) * 2007-05-10 2011-11-24 富士フイルム株式会社 Imaging apparatus and imaging method
JP4767904B2 (en) * 2007-05-10 2011-09-07 富士フイルム株式会社 Imaging apparatus and imaging method
JP4905797B2 (en) * 2007-05-10 2012-03-28 富士フイルム株式会社 Imaging apparatus and imaging method
JP4756005B2 (en) * 2007-05-10 2011-08-24 富士フイルム株式会社 Imaging apparatus and imaging method
JP5046788B2 (en) * 2007-08-10 2012-10-10 キヤノン株式会社 Imaging apparatus and control method thereof
JP4997043B2 (en) * 2007-09-27 2012-08-08 富士フイルム株式会社 Imaging apparatus, method and program
JP4518131B2 (en) * 2007-10-05 2010-08-04 富士フイルム株式会社 Imaging method and apparatus
JP5338197B2 (en) * 2008-08-26 2013-11-13 株式会社ニコン Focus detection apparatus and imaging apparatus
JP5249146B2 (en) * 2009-07-03 2013-07-31 富士フイルム株式会社 Imaging control apparatus and method, and program
JP5548405B2 (en) * 2009-07-27 2014-07-16 キヤノン株式会社 Focus adjustment device
JP5669549B2 (en) 2010-12-10 2015-02-12 オリンパスイメージング株式会社 Imaging device
JP5075288B2 (en) * 2012-05-31 2012-11-21 キヤノン株式会社 Imaging apparatus and control method thereof
JP5385428B2 (en) * 2012-06-11 2014-01-08 パナソニック株式会社 Imaging device
JP5697650B2 (en) * 2012-12-13 2015-04-08 キヤノン株式会社 Imaging apparatus and control method thereof
JP6383251B2 (en) * 2013-12-27 2018-08-29 キヤノン株式会社 Imaging apparatus, control method therefor, program, and storage medium
JP5911557B2 (en) * 2014-12-16 2016-04-27 オリンパス株式会社 Imaging device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02300730A (en) * 1989-05-16 1990-12-12 Nikon Corp Auto-focus camera
JP2001215403A (en) * 2000-02-01 2001-08-10 Canon Inc Image pickup device and automatic focus detection method
JP2002006203A (en) * 2000-06-19 2002-01-09 Olympus Optical Co Ltd Focusing device and image pickup device
JP2002162559A (en) * 2000-11-24 2002-06-07 Olympus Optical Co Ltd Imaging device
JP2003092726A (en) * 2001-09-18 2003-03-28 Ricoh Co Ltd Imaging apparatus
JP2003107335A (en) * 2001-09-28 2003-04-09 Ricoh Co Ltd Image pickup device, automatic focusing method, and program for making computer execute the method
JP2004317699A (en) * 2003-04-15 2004-11-11 Nikon Corp Digital camera


Also Published As

Publication number Publication date
JP2005128156A (en) 2005-05-19

Similar Documents

Publication Publication Date Title
EP1855466B1 (en) Focus adjustment apparatus and method
US7855737B2 (en) Method of making a digital camera image of a scene including the camera user
CN101124816B (en) Image processing apparatus and image processing method
JP4746295B2 (en) Digital camera and photographing method
US8106995B2 (en) Image-taking method and apparatus
CN103081455B (en) The multiple images being captured from handheld device carry out portrait images synthesis
JP5389697B2 (en) In-camera generation of high-quality composite panoramic images
US7791668B2 (en) Digital camera
JP5136669B2 (en) Image processing apparatus, image processing method, and program
JP4671133B2 (en) Image processing device
US7706675B2 (en) Camera
CN102231801B (en) An electronic camera and an image processing apparatus
JP4976160B2 (en) Imaging device
EP1522952B1 (en) Digital camera
KR20090039631A (en) Composition determining apparatus, composition determining method, and program
EP2259573A2 (en) Digital Camera
CN101616261B (en) Image recording apparatus, image recording method, image processing apparatus, and image processing method
JP5046788B2 (en) Imaging apparatus and control method thereof
JP2011010275A (en) Image reproducing apparatus and imaging apparatus
CN100550986C (en) Method and camera with multiple resolution
KR101503333B1 (en) Image capturing apparatus and control method thereof
JPWO2008114499A1 (en) Imaging apparatus and imaging method
JP4315148B2 (en) Electronic camera
JP2006211139A (en) Imaging apparatus
JP5019939B2 (en) Imaging apparatus and imaging method

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20061023

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20061023

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20090714

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20090803

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20091002

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20100215

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100517

A911 Transfer of reconsideration by examiner before appeal (zenchi)

Free format text: JAPANESE INTERMEDIATE CODE: A911

Effective date: 20100607

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20100709

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20100712

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130723

Year of fee payment: 3