JP2005300925A - Imaging apparatus

Publication number
JP2005300925A
Authority
JP
Japan
Prior art keywords
distance measurement
imaging
distance
unit
image
Legal status: Pending
Application number
JP2004117147A
Other languages
Japanese (ja)
Inventor
Kazusane Kageyama (陰山 和実)
Satoshi Yokota (横田 聡)
Original Assignee
Konica Minolta Photo Imaging, Inc. (コニカミノルタフォトイメージング株式会社)
Application filed by Konica Minolta Photo Imaging, Inc.
Priority to JP2004117147A
Publication of JP2005300925A


Abstract

PROBLEM TO BE SOLVED: To provide an imaging apparatus capable of realizing fast and proper focusing control by correcting the parallax between a photographing lens and the optical system of a range finder.
SOLUTION: Parallax information, indicating the relative positional relationship between a plurality of positions measured by the range finder (distance measurement positions) and the photographic area CA as a function of the distance from the imaging device 1, is stored in advance in a ROM or the like. Distance measurement values for the respective distance measurement positions are calculated from the range finder's measurement data, and the actual relative positional relationship between the photographic area CA and the plurality of distance measurement positions MA1 and MA2 is recognized based on the parallax information and those values. Further, based on the recognized positional relationship, the distance measurement position closest to approximately the center of the live view among the plurality of positions is detected as the in-focus position FP, and focusing control of the photographing lens is performed based on the distance measurement value at the detected in-focus position FP.
COPYRIGHT: (C) 2006, JPO & NCIPI

Description

  The present invention relates to a focusing control technique for an imaging apparatus in which a photographing lens and an external-light distance measuring device are provided separately.

  In general, a digital camera displays a preview based on image data from an image sensor so that the user can easily perform framing, and performs contrast-type focusing control using that image data. This focusing control, however, has the disadvantages that (i) focusing takes a long time, because image data must be acquired at a plurality of lens positions while the focus lens is moved, and (ii) when the luminance of the subject is low, the in-focus position of the focus lens cannot be detected accurately. In view of this, a technique has been proposed and put into practice in which an external-light distance measuring device is provided separately in the camera, and the contrast-type focusing control described above is used in combination with focusing control by the distance measuring device (for example, Non-Patent Document 1).

  However, in a digital camera to which an external-light distance measuring device is added separately, parallax occurs between the photographing optical system and the distance measuring device, because the two have distinct optical systems. This parallax varies with the distance between the camera and the subject that the user mainly wants to photograph (the main subject). For example, in an external-light passive distance measuring device in which a plurality of distance measuring regions are arranged along a pair of straight lines, the optical axis of the distance measuring device is usually designed to intersect the optical axis of the photographing optical system at a position about 3 m from the camera, so the position whose distance is measured shifts relative to the captured image according to the distance to the subject. That is, depending on the distance between the camera and the main subject, a deviation arises between the subject captured in the preview and the position at which the distance measuring device is actually measuring. As a result, a position shifted away from the main subject may be measured, and the main subject cannot be brought into focus.

Incidentally, the following techniques have been proposed for correcting parallax in a camera:
(1) A technique for setting the readout area of the imaging device based on the output of a finder imaging device, in order to correct the parallax between the optical viewfinder and the imaging optical system (for example, Patent Document 1); a technique for changing the region displayed in the optical viewfinder according to the zoom amount (for example, Patent Document 2); and a technique for trimming and recording a captured image based on information such as the lens focal length and the subject distance (for example, Patent Document 3).

  (2) Furthermore, in order to correct the parallax between the optical viewfinder and the distance measuring device, Patent Document 4 discloses a technique in which the viewfinder field range is moved according to the distance measured by the distance measuring device and distance measurement is then performed again, so that the main subject can be focused appropriately.

  Prior art documents relating to such technology include the following.

JP 2003-37772 A (Patent Document 1)
JP 2002-258385 A (Patent Document 2)
JP 1999-344746 A (Patent Document 3)
JP 2003-5025 A (Patent Document 4)
"Jet AF", [online], October 10, 2003, Konica Minolta, [retrieved February 4, 2004], Internet <URL: http://www2.konicaminolta.jp/products/consumer/digital#camera/dimage-z1/01.html> (Non-Patent Document 1)

  However, the techniques proposed in Patent Documents 1 to 4 give no consideration to the parallax between the photographing optical system and the distance measuring device. The technique of Patent Document 4 can correct the parallax between the optical finder and the distance measuring device during framing, but it requires moving the finder's field of view and performing distance measurement again; it is therefore inferior in responsiveness and may cause a photo opportunity to be missed.

  The present invention has been made in view of the above problems, and an object thereof is to provide an imaging apparatus capable of realizing high-speed and appropriate focusing control in consideration of the parallax between the photographing optical system and the distance measuring device.

  In order to solve the above problems, the invention of claim 1 is an imaging apparatus comprising: image acquisition means for acquiring an image based on a light image formed by a photographing lens; display means for displaying the image as a preview; distance measurement data acquisition means for acquiring, by external light, distance measurement data on a plurality of distance measurement positions including a plurality of positions spaced apart from the optical axis of the photographing lens in a predetermined direction by mutually different distances; distance measurement value calculation means for calculating distance measurement values for the plurality of distance measurement positions based on the distance measurement data acquired by the distance measurement data acquisition means; storage means for storing parallax information indicating the relationship between the positional relationship of the imaging region of the image acquisition means and the plurality of distance measurement positions, and the distance from the imaging apparatus; recognition means for recognizing, at the time of shooting, the actual positional relationship between the imaging region and the plurality of distance measurement positions based on the parallax information and the distance measurement values calculated by the distance measurement value calculation means; detection means for detecting, based on the actual positional relationship recognized by the recognition means, a distance measurement position satisfying a predetermined condition among the plurality of distance measurement positions as an in-focus position; and control means for performing focusing control of the photographing lens based on the distance measurement value for the in-focus position.

  The invention of claim 2 is the imaging apparatus of claim 1, characterized in that the plurality of distance measurement positions are distributed two-dimensionally on the distance measurement object as viewed from the imaging apparatus.

  The invention of claim 3 is the imaging apparatus of claim 1 or 2, characterized in that the detection means detects, as the in-focus position, the distance measurement position captured at the position closest to a predetermined position in the image among the plurality of distance measurement positions.

  The invention of claim 4 is the imaging apparatus of claim 3, characterized in that the predetermined position is substantially the center of the image.

  The invention of claim 5 is an imaging apparatus comprising: image acquisition means for acquiring an image based on a light image formed by a photographing lens; display means for displaying the image as a preview; distance measurement data acquisition means for acquiring, by external light, distance measurement data on a plurality of distance measurement positions which are spaced apart from the optical axis of the photographing lens in a predetermined direction and which are distributed two-dimensionally on the distance measurement object as viewed from the imaging apparatus; distance measurement value calculation means for calculating distance measurement values for the plurality of distance measurement positions based on the distance measurement data acquired by the distance measurement data acquisition means; storage means for storing parallax information indicating the relationship between the positional relationship of the imaging region and the plurality of distance measurement positions, and the distance from the imaging apparatus; division means for dividing, at the time of shooting, the distance measurement object into a plurality of units based on the distribution of the distance measurement values calculated by the distance measurement value calculation means; recognition means for recognizing the actual positional relationship between the imaging region and the plurality of units based on the parallax information and the distance measurement values calculated by the distance measurement value calculation means; detection means for detecting, based on the actual positional relationship recognized by the recognition means, a unit satisfying a predetermined condition among the plurality of units as an in-focus object; and control means for performing focusing control of the photographing lens based on the distance measurement value for the in-focus object.

  The invention of claim 6 is the imaging apparatus of claim 5, characterized in that the detection means comprises: unit detection means for detecting, based on the shape of each unit, predetermined-type units corresponding to a predetermined type of object among the plurality of units; and means for detecting, as the in-focus object, the predetermined-type unit captured at the position closest to a predetermined position in the image among the predetermined-type units detected by the unit detection means.

  The invention of claim 7 is the imaging apparatus of claim 6, characterized in that the predetermined type of object is a person.

  The invention of claim 8 is the imaging apparatus of claim 5, characterized in that the detection means detects, as the in-focus object, the unit captured at the position closest to a predetermined position in the image among the plurality of units excluding units relating to a distant view.

  According to the invention of any one of claims 1 to 4, the positional relationship between the imaging region and the plurality of distance measurement positions is recognized based on the information indicating the relationship between that positional relationship and the distance from the imaging apparatus, together with the distance measurement values for the plurality of distance measurement positions; a distance measurement position satisfying a predetermined condition is then detected among the plurality of distance measurement positions based on the recognized positional relationship, and focusing control is performed based on the distance measurement value for the detected position. Focusing control can thus be performed in consideration of the deviation between the subject in the preview and the distance measurement positions. Therefore, high-speed and appropriate focusing control can be realized in consideration of the parallax between the photographing optical system and the distance measuring device.

  According to the invention of claim 2, the position to be focused on is detected from a plurality of distance measurement positions distributed two-dimensionally on the distance measurement object as viewed from the imaging apparatus, and focusing control is performed accordingly; a distance measurement object spreading over a wider range in the shooting direction can therefore be focused appropriately.

  According to the invention of claim 3 or 4, focusing control is performed based on the distance measurement value for the distance measurement position captured at the position closest to the predetermined position in the preview image; when framing is performed while viewing the preview, the main subject that the user mainly wants to shoot can therefore be focused appropriately. That is, focusing control that well reflects the user's intention can be performed. In particular, as in the invention of claim 4, by performing focusing control based on the distance measurement value for the distance measurement position captured closest to the approximate center of the preview image, the main subject can be focused appropriately when framing is performed so that it is captured near the center of the preview.

  According to the invention of any one of claims 5 to 7, the distance measurement object is divided into a plurality of units based on the distribution of the distance measurement values; the actual positional relationship between the imaging region and the plurality of units is recognized based on the information indicating the relationship between the positional relationship of the imaging region and the distance measurement positions and the distance from the imaging apparatus, together with the distance measurement values for the units; a unit satisfying a predetermined condition is then detected among the plurality of units based on that positional relationship, and focusing control is performed based on the distance measurement value for the detected unit. High-speed and appropriate focusing control can thus be realized in consideration of the parallax between the photographing optical system and the distance measuring device.

  In particular, according to the invention of claim 6 or 7, a predetermined-type unit corresponding to a predetermined type of object is detected among the plurality of units, and focusing control is performed based on the distance measurement value for the predetermined-type unit captured at the position closest to the predetermined position in the preview image; when framing is performed while viewing the preview, the object of the predetermined type that the user mainly wants to shoot can therefore be focused appropriately. That is, focusing control that well reflects the user's intention can be performed. In particular, as in the invention of claim 7, by setting the predetermined type of object to a person, a person can be focused appropriately.

  According to the invention of claim 8, focusing control is performed based on the distance measurement value for the unit, excluding distant-view units, captured at the position closest to the predetermined position in the preview image; when framing is performed while viewing the preview, the near-field subject that the user mainly wants to shoot can therefore be focused appropriately. That is, focusing control that well reflects the user's intention can be performed.

  Hereinafter, embodiments of the present invention will be described with reference to the drawings.

<(1) First Embodiment>
<(1-1) Overview of imaging device>
FIGS. 1 and 2 are schematic views showing an outline of the imaging apparatus 1 according to the first embodiment of the present invention; FIG. 1 is a front view, and FIG. 2 is a cross-sectional view taken along line A-A of FIG. 1. As shown in FIGS. 1 and 2, the imaging apparatus 1 is configured as a digital camera in which each part is provided on a main body 2. In FIGS. 1 and 2, three mutually orthogonal axes X, Y, and Z are shown in order to clarify the directional relationships in three-dimensional space; the same three axes are shown as appropriate in FIG. 4 and subsequent figures.

  As shown in FIG. 1, the imaging apparatus 1 includes, on the front side of the main body 2, a photographing lens 3, a pop-up flash device 4, a distance measuring device 5, and an AF auxiliary light emitting unit 6 that emits auxiliary light during autofocus (AF).

  Further, the imaging apparatus 1 includes a shutter start button (hereinafter abbreviated as “shutter button”) 7 and a mode switching dial 8 on the upper surface side of the main body 2. Further, as shown in FIG. 2, a liquid crystal display screen (rear LCD) 9 and an electronic viewfinder (EVF) 10 are provided on the back side of the main body 2.

  The photographing lens (photographing optical system) 3 is a so-called zoom lens realizing a zoom function, and mainly includes a zoom lens group 3A, a focus lens group 3B, a lens barrel 3M, and a shutter/aperture mechanism 3S. The photographing lens 3 realizes zooming by changing the lens arrangement of the zoom lens group 3A through driving of the lens barrel 3M and the like, and realizes the AF operation by driving the focus lens group 3B. The photographing lens 3 also includes a lens position detection unit 28 (FIG. 3) that detects the lens arrangement (lens positions) of the zoom lens group 3A and the focus lens group 3B. An image sensor (CCD or the like) 13 is arranged behind the photographing lens 3.

  The distance measuring device 5 measures the distance from the imaging apparatus 1 to the subject, and employs, for example, an external-light passive method using the principle of triangulation. The distance measuring device 5 is disposed between the photographing lens 3 and the flash device 4, that is, above the photographing lens 3 (on the Z-axis direction side). In other words, the distance measuring device 5 is disposed away from the optical axis P1 of the photographing lens 3 in the upward direction (Z-axis direction).

  Here, if the optical axis P1 and the optical axis P2 of the lenses La and Lb (FIG. 4) of the distance measuring device 5 were set parallel, the region of the subject imaged by the image sensor 13 (hereinafter, "imaging area") and the region of the object measured by the distance measuring device 5 (hereinafter, "distance measurement area") would be shifted relative to each other by the offset between the photographing lens 3 and the distance measuring device 5. Therefore, in order to suppress such a shift, the imaging apparatus 1 is configured such that the optical axis P1 and the optical axis P2 intersect at a position separated from the imaging apparatus 1 by a predetermined distance. In the following description, the predetermined distance is assumed to be 3 m.
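
Although the patent specifies only the 3 m crossing distance, the resulting geometry can be sketched as follows. This is a minimal illustration in Python, not part of the patent, and the 30 mm baseline is an invented value.

```python
# Sketch of the converging-axis geometry described above; only the 3 m
# crossing distance comes from the text, the baseline is invented.
BASELINE_M = 0.03  # hypothetical vertical offset of P2 above P1 at the camera
D_CROSS_M = 3.0    # distance at which optical axes P1 and P2 intersect

def axis_offset_m(subject_distance_m: float) -> float:
    """Vertical separation of the two axes at the subject plane (meters).

    Zero at 3 m; positive (P2 above P1) for nearer subjects, negative for
    farther ones, matching the shifts illustrated later in FIGS. 9 to 11.
    """
    return BASELINE_M * (1.0 - subject_distance_m / D_CROSS_M)

for d in (1.0, 3.0, 5.0):
    print(f"{d:.0f} m: {axis_offset_m(d) * 1000:+.0f} mm")  # +20, +0, -20 mm
```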

  FIG. 3 is a block diagram illustrating the internal configuration of the imaging apparatus 1. Hereinafter, the internal configuration of the imaging apparatus 1 will be described with reference to FIG.

  The image sensor 13 acquires an electronic image signal (image) based on the subject image (light image) formed by the photographing lens 3. The timing control circuit 21 generates various timing pulses for controlling the driving of the image sensor 13, and generates a drive control signal for the image sensor 13 based on a reference clock transmitted from the camera control unit 20.

  The analog image signal obtained by the image sensor 13 is subjected to predetermined analog signal processing by the signal processing circuit 14 and converted into a digital image signal by the A / D converter 15.

  The digital image signal output from the A/D converter 15 is subjected to predetermined digital signal processing by the image processing unit 22. The image processing unit 22 includes a black level correction circuit 16, a white balance (WB) circuit 17, a γ correction circuit 18, and an image memory 19, and performs image processing such as black level correction, WB adjustment, and γ correction. The image signal after processing by the image processing unit 22 is temporarily stored in the image memory 19, and the camera control unit 20 performs various processing using this temporarily stored image signal. Note that the image processing unit 22 may be configured as an IC (Integrated Circuit) shared with the CPU of the camera control unit 20 described later, or as a separate IC.

  Prior to the main shooting for acquiring an image for storage, the image signal stored in the image memory 19 is transferred by the camera control unit 20 to the VRAM (Video Random Access Memory) 81 and output (displayed) on the rear LCD 9 as a preview image (live view), or transferred to the VRAM 82 and displayed on the EVF 10 as a live view. Whether the live view is displayed on the rear LCD 9 or the EVF 10 depends on the user's setting. The live view is updated in time sequence, for example every 1/30 second, based on images output from the image sensor 13 every 1/30 second. The user can therefore perform framing appropriately while watching the live view. In the following description, the area of the subject captured in the live view during framing is assumed to be the same as the imaging area of the image sensor 13.

  At the time of the main shooting, the image signal stored in the image memory 19 is subjected to compression processing and the like by the camera control unit 20 to become image data for storage, and is stored in the memory card 90 via the card interface (I/F) unit 80. The image data for storage can also be transferred to an external device through the communication interface (I/F) 95 and a communication line.

  The operation unit 30 includes operation members such as the shutter button 7 and the mode switching dial 8, and transmits various signals to the camera control unit 20 in response to user operations. When the shutter button 7 is pressed while the live view is displayed on the rear LCD 9 or the EVF 10, the AF operation is performed, followed by the main shooting operation. The flash circuit 24 controls the light emission of the flash device 4 under the control of the camera control unit 20.

  The distance measuring device 5 is an external-light passive distance measuring device including a pair of lenses and a light receiving unit. A distance measurement control unit 25, mainly composed of a CPU, controls the operation of the distance measuring device 5 and processes the data acquired by it (hereinafter, "distance measurement data"). In response to the pressing of the shutter button 7, the distance measurement control unit 25, under the control of the camera control unit 20, calculates the distance from the imaging apparatus 1 to the subject (hereinafter, the "distance measurement value") based on the distance measurement data from the distance measuring device 5, and transfers the calculated distance measurement value to the camera control unit 20.

  The AF auxiliary light emitting unit 6 emits auxiliary light in order to raise the luminance of the subject during distance measurement and thereby improve distance measurement accuracy; it emits light under the control of the camera control unit 20 in response to the pressing of the shutter button 7.

  The lens position detection unit 28 detects each lens position in the photographic lens 3, and transfers information regarding each detected lens position to the camera control unit 20.

  The camera control unit 20 mainly includes a CPU and a storage unit (ROM, RAM, and the like) that stores various kinds of information; it performs overall control of each unit in the imaging apparatus 1, and realizes various functions through the CPU executing predetermined programs stored in the ROM or the like.

  The camera control unit 20 has, for example, a function of determining the lens position of the focus lens group 3B (hereinafter, the "in-focus position") at which the subject the user mainly wants to shoot (hereinafter, the "main subject") is in focus, based on the distance measurement value from the distance measurement control unit 25 (hereinafter, this function is referred to as the "in-focus position determination function"), and a function of controlling the operation of each part of the photographing lens 3. The camera control unit 20 controls the focus adjustment control unit 23 to drive the motor M3 and move the focus lens group 3B to the in-focus position determined by the in-focus position determination function (AF operation).

  As described above, since the imaging apparatus 1 can determine the in-focus position using the external-light distance measuring device 5, the in-focus position can be determined without moving the focus lens group to a plurality of lens positions, as is required in a digital camera adopting the contrast method. As a result, a comparatively fast AF operation can be realized, and the possibility of missing a photo opportunity can be reduced.

  In addition, the camera control unit 20 controls the zoom motor control unit 24 in response to user operation of the operation unit 30, thereby driving the motor M1 and changing the lens arrangement of the zoom lens group 3A as appropriate (zooming operation). The camera control unit 20 moves the lens groups 3A and 3B appropriately based on the lens position information from the lens position detection unit 28.

  Further, the camera control unit 20 controls the shutter/aperture control unit 26 in response to user operation of the operation unit 30 and the like, thereby driving the motor M2 and opening and closing the shutter/aperture mechanism 3S. The camera control unit 20 also has an automatic exposure control (AE) function that calculates the luminance of the subject based on the image data acquired by the image sensor 13 and sets the shutter speed, aperture value, and gain.

<(1-2) Distance measuring device>
FIGS. 4 and 5 are diagrams for explaining the configuration of the distance measuring device 5; FIG. 4 is a schematic cross-sectional view of the distance measuring device 5 as viewed from below, and FIG. 5 is a schematic front view of its light receiving unit PS.

  As shown in FIG. 4, the distance measuring device 5 includes two similar lenses La and Lb arranged side by side in the left-right direction (±X direction), a light receiving unit PS that receives the light incident through the lenses La and Lb, and a package member PP that supports the lenses La and Lb and blocks light other than that entering through them from reaching the light receiving unit PS. As shown in FIGS. 4 and 5, the light receiving unit PS has two light receiving sensors PSa and PSb arranged side by side in the left-right direction (±X direction); the light receiving sensors PSa and PSb receive the light incident through the lenses La and Lb, respectively. FIGS. 4 and 5 also show the optical axes P2a and P2b of the lenses La and Lb.

  In this distance measuring device 5, light from the same object (hereinafter, the "distance measurement object") is received by the pair of left and right light receiving sensors PSa and PSb, and the distance measurement control unit 25 performs a correlation operation (a ranging calculation using the principle of triangulation) on the distance measurement data obtained by photoelectric conversion at each of the sensors PSa and PSb, thereby calculating the distance to the distance measurement object (hereinafter, the "distance measurement value").

  As shown in FIG. 5, each of the two light receiving sensors PSa and PSb has seven line sensors, PSa1 to PSa7 and PSb1 to PSb7 respectively (hereinafter also referred to as "multi-line sensors"), whose longitudinal direction is substantially horizontal (±X direction) and which are arranged in parallel with one another in the substantially vertical direction (±Z direction). The seven line sensors are arranged identically in the two light receiving sensors PSa and PSb, and in each of the line sensors PSa1 to PSa7 and PSb1 to PSb7, m light receiving elements are arranged side by side in the substantially horizontal direction (±X direction).

  FIG. 6 is a diagram for explaining the distance measurement method using the light receiving sensors PSa and PSb. As shown in FIG. 6, each line sensor composed of m light receiving elements is divided into a predetermined number n (for example, n = 40) of areas (hereinafter, "light receiving areas"), and a distance measurement value is obtained for each light receiving area by performing a correlation operation on the distance measurement data of the corresponding light receiving areas in the two sensors PSa and PSb. In this way, with the seven line sensors each divided into 40 light receiving areas, a total of 280 distance measurement values can be obtained.
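
As a concrete illustration of the per-area correlation just described, the following Python sketch (not from the patent; the focal length, baseline, and pixel pitch are invented values) derives one distance measurement value per light receiving area by SAD block matching between the corresponding areas of PSa and PSb:

```python
import numpy as np

# Illustrative per-area correlation ranging. F_MM, BASELINE_MM, and
# PITCH_MM are invented example values, not taken from the patent.
F_MM, BASELINE_MM, PITCH_MM = 6.0, 20.0, 0.012
N_AREAS = 40  # light receiving areas per line sensor (n = 40 in the text)

def area_distance_mm(line_a: np.ndarray, line_b: np.ndarray,
                     area: int, area_len: int, max_shift: int = 16) -> float:
    """Distance for one light receiving area via SAD correlation."""
    start = area * area_len
    ref = line_a[start:start + area_len].astype(np.int32)
    best_shift, best_sad = 1, np.inf
    for s in range(1, max_shift):  # search PSb for the best-matching window
        win = line_b[start + s:start + s + area_len].astype(np.int32)
        if len(win) < area_len:    # ran off the end of the line sensor
            break
        sad = np.abs(ref - win).sum()
        if sad < best_sad:
            best_sad, best_shift = sad, s
    return F_MM * BASELINE_MM / (best_shift * PITCH_MM)  # triangulation

# Applied to the 7 line-sensor pairs x 40 areas, this yields the 280
# distance measurement values mentioned above.
```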

  FIG. 7 is a diagram exemplifying the relative positional relationship between the imaging area CA of the imaging apparatus 1 and the distance measurement area MA of the distance measuring device 5 when shooting a subject at around 3 m; the distance measurement area MA is formed by the line sensors PSa1 to PSa7 and PSb1 to PSb7.

  Accordingly, in each of the light receiving sensors PSa and PSb, 280 light receiving areas, arranged two-dimensionally as 40 horizontal × 7 vertical, correspond to a plurality of positions to be measured (hereinafter, "distance measurement positions") distributed on the object to be measured (the distance measurement object) as viewed from the imaging apparatus 1. Here, "the light receiving areas are arranged two-dimensionally" means that, in each of the light receiving sensors PSa and PSb, a large number of light receiving areas are arranged with a certain extent in both the vertical and horizontal directions, for example in a two-dimensional matrix.

  As shown in FIG. 7, when a large number of distance measurement positions are arranged two-dimensionally in a matrix in this way, the distance measurement positions forming the distance measurement area MA appear, as viewed from the imaging apparatus 1, to be distributed on a two-dimensional plane with a certain extent rather than along a straight line. Therefore, in this specification, when a large number of distance measurement positions are distributed with a certain extent on a plane rather than on a straight line as viewed from the imaging apparatus 1, as shown in FIG. 7, the distance measurement positions are said to be two-dimensionally distributed on the distance measurement object as viewed from the imaging apparatus.

  Further, in the distance measuring device 5 in which the 280 light receiving areas are arranged two-dimensionally in this way, the seven line sensors PSa1 to PSa7 and PSb1 to PSb7, arranged in parallel with one another in the substantially vertical direction, make it possible to acquire distance measurement data for a plurality of distance measurement positions at different separation distances from the optical axis P1 (that is, at different Z-axis coordinate values).

  FIG. 8 is a schematic diagram for explaining the parallax between the photographing lens 3 and the distance measuring device 5. In FIG. 8, the horizontal axis indicates the distance from the imaging apparatus 1; the optical axis P1 of the photographing lens 3 is drawn as a one-dot chain line, and the optical axis P2 of the distance measuring device 5 as a relatively thick one-dot chain line. The light beam formed by the photographing lens 3, that is, the light beam 3P corresponding to the imaging area CA, is drawn as a solid line, and the light beam 5P corresponding to the distance measurement area MA of the distance measuring device 5 as a thick line.

  As shown in FIG. 8, the optical axis P1 of the photographing lens 3 and the optical axis P2 of the distance measuring device 5 intersect approximately 3 m ahead of the imaging apparatus 1, and the relative positional relationship between the light beams 3P and 5P changes according to the distance from the imaging apparatus 1. That is, the relative positional relationship between the imaging area CA and the distance measurement area MA changes according to the distance from the imaging apparatus 1.

<(1-3) Problems caused by parallax>
It is generally assumed that the user performs framing, while viewing the live view displayed on the rear LCD 9 or the EVF 10, so that the main subject comes near the approximate vertical center of the live view.

  However, as shown in FIG. 8, since parallax occurs between the photographing lens 3 and the distance measuring device 5, the relative positional relationship between the optical axis P2 of the distance measuring device 5 and the main subject changes according to the distance from the imaging apparatus 1 to the main subject. Specifically, when the distance from the imaging apparatus 1 to the main subject is relatively short (for example, about 1 m) or relatively long (for example, 5 m or more), the main subject is separated considerably from the vicinity of the optical axis P2 due to the parallax.

  Therefore, if, for example, a distance measurement value were obtained for an object located near the optical axis P2 of the distance measuring device 5 and the AF operation were performed based on that value, a problem would arise in that an object different from the main subject is brought into focus.

  FIGS. 9 to 11 are diagrams for explaining the problems caused by the parallax between the photographing lens 3 and the distance measuring device 5, and show the relative positional relationship between the imaging area CA and the distance measurement area MA.

  FIG. 9 shows an example of shooting three persons standing about 3 m from the imaging apparatus 1; the distance measurement area MA in the figure is for a distance measurement object about 3 m from the imaging apparatus 1. In this case, the distance measurement position MMA near the optical axis P2 of the distance measuring device 5 is captured at the approximate center of the live view and also lies on the person who is the main subject, so it may be used as the position to be focused on (hereinafter, the "in-focus position") FP.

  FIG. 10 shows an example of shooting a sunflower about 1 m from the imaging apparatus 1; the distance measurement area MA in the figure is for a distance measurement object about 1 m away. In this case, compared with the case shown in FIG. 9, the distance measurement area MA is shifted largely upward with respect to the imaging area CA. The main subject therefore cannot be brought into focus simply by using the distance measurement position MMA at the approximate center of the distance measurement area MA as the in-focus position. To focus on the main subject in this case, the distance measurement position captured near the center of the live view, which is located in the lower part of the distance measurement area MA as shown in FIG. 10, must be used as the in-focus position FP.

  Further, FIG. 11 shows an example of shooting three persons standing about 5 m from the imaging apparatus 1; the distance measurement area MA in the figure is for a distance measurement object about 5 m away. In this case, compared with the case shown in FIG. 9, the distance measurement area MA is shifted slightly downward with respect to the imaging area CA. Here too, the main subject cannot be brought into focus simply by using the distance measurement position MMA at the approximate center of the distance measurement area MA as the in-focus position. To focus on the main subject in this case, the distance measurement position on the person who is the main subject, captured near the center of the live view and located obliquely above the position MMA as shown in FIG. 11, must be used as the in-focus position FP.

  In this way, the main subject cannot be focused properly unless the in-focus position FP is changed appropriately in consideration of the distance from the imaging apparatus 1 to the main subject.

  Therefore, in the imaging apparatus 1, the relationship between the relative positional relationship of the imaging area CA and the distance measurement area MA (in practice, each distance measurement position) and the distance from the imaging apparatus 1, shown in FIG. 8, is determined in advance by the design of the imaging apparatus 1, and information indicating this relationship (hereinafter, "parallax information") is stored in advance in the ROM of the camera control unit 20. The in-focus position determination function then automatically determines, based on the parallax information and the distance measurement values obtained by the distance measuring device 5 and the distance measurement control unit 25, the distance measurement value for the main subject to be used in determining the in-focus position. An image focused on the main subject can therefore be acquired.
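
The patent does not specify how the parallax information is encoded in the ROM; one plausible form, sketched below with invented numbers, is a table mapping subject distance to the vertical shift of each distance measurement position relative to the 3 m reference state, interpolated at lookup time:

```python
# Hypothetical encoding of the parallax information: subject distance ->
# vertical shift (in distance-measurement-position rows, + = up) relative
# to the 3 m reference state. All values are invented for illustration.
PARALLAX_TABLE = [
    (0.5, +5.0), (1.0, +3.0), (2.0, +1.0), (3.0, 0.0), (5.0, -1.0), (10.0, -1.5),
]

def row_shift(distance_m: float) -> float:
    """Linearly interpolate the vertical shift for a measured distance."""
    if distance_m <= PARALLAX_TABLE[0][0]:
        return PARALLAX_TABLE[0][1]
    for (d0, s0), (d1, s1) in zip(PARALLAX_TABLE, PARALLAX_TABLE[1:]):
        if distance_m <= d1:
            return s0 + (distance_m - d0) / (d1 - d0) * (s1 - s0)
    return PARALLAX_TABLE[-1][1]
```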

  Specific contents of the in-focus position determination function will be described in detail in the following description of the imaging operation of the imaging apparatus 1.

<(1-4) Shooting operation>
FIG. 12 is a flowchart illustrating the shooting operation flow of the imaging apparatus 1; this flow is controlled by the camera control unit 20. When the user performs framing while viewing the live view displayed on the rear LCD 9 or the EVF 10 of the imaging apparatus 1 and presses the shutter button 7, the shooting operation flow starts and proceeds to step S1 of FIG. 12.

  In step S1, AE is performed and the process proceeds to step S2. Here, the luminance of the subject is calculated based on the image data acquired by the image sensor 13, and the shutter speed, aperture value, and gain at the time of actual photographing are set.

  In step S2, distance measurement is performed, and the process proceeds to step S3. Here, the distance measuring device 5 acquires distance measurement data for each distance measurement position, and the distance measurement control unit 25 calculates the distance measurement value for each position based on that data.

  In step S3, the relative positional relationship with the imaging area CA is recognized for each distance measurement position, and the process proceeds to step S4. Here, taking the relative positional relationship between the imaging area CA and the distance measurement area MA shown in FIG. 9 as a reference state, the camera control unit 20 performs, for each distance measurement position, a correction that changes its relative positional relationship with respect to the imaging area CA from the positional relationship in the reference state to the actual positional relationship (hereinafter also referred to as "parallax correction"), based on the distance measurement value obtained in step S2 and the parallax information stored in the ROM.

  Specifically, for example, when the distance measurement value for a certain distance measurement position is obtained as X m, the amount of deviation between the relative positional relationship of that position with respect to the imaging area CA in the reference state and the relative positional relationship when the distance between the imaging apparatus 1 and the position is X m is calculated from the parallax information. The position of the distance measurement position with respect to the imaging area CA is then changed from the reference-state relationship by the calculated deviation amount. Parallax correction is realized by performing this correction for all the distance measurement positions.
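
Continuing the table sketch above, the per-position correction of step S3 might look like the following; the coordinate convention (rows counted upward, so a positive shift moves a position up, as in FIG. 14) is an assumption:

```python
def apply_parallax_correction(positions):
    """Shift each distance measurement position from its reference-state
    location to its actual location in imaging-area coordinates.

    positions: dicts with 'row' and 'col' (reference-state location in the
    imaging area CA) and 'distance_m' (the value measured at that position).
    """
    return [{**p, "row": p["row"] + row_shift(p["distance_m"])}
            for p in positions]
```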

FIGS. 13 and 14 are schematic diagrams showing a specific example of parallax correction, showing the relative positional relationship between the imaging area CA and the distance measurement area MA determined by the distance measurement positions before and after the parallax correction, respectively. FIGS. 13 and 14 show an example of shooting in a state where a person HU as the main subject stands about 2 m from the imaging apparatus 1 and a monument ST stands about 5 m away as the background.

  When shooting with such a composition, distance measurement values of about 2 m are obtained for the distance measurement positions on the person HU, and of about 5 m for those on the monument ST. By the parallax correction, as shown in FIG. 14, starting from the reference-state positional relationship between the imaging area CA and the distance measurement area MA shown in FIG. 13, the plurality of distance measurement positions MA1 whose distance measurement value is 2 m are shifted upward by the amount of deviation from the reference state, and the plurality of distance measurement positions MA2 whose distance measurement value is 5 m are shifted downward by the amount of deviation from the reference state.

  In this example, each distance measurement position is corrected by shifting it vertically from the reference state by an amount corresponding to its distance measurement value; however, the present invention is not limited to this. For example, the actual relative positional relationship between each distance measurement position and the imaging area CA may be obtained directly from the parallax information and the distance measurement value for each position. That is, in step S3, prior to the main shooting, the camera control unit 20 need only recognize the actual relative positional relationship between the imaging area CA and each distance measurement position based on the parallax information and the distance measurement values, whether by the parallax correction described above or by such a direct method.

  In step S4, the camera control unit 20 detects the distance measurement position captured at the position closest to the approximate center of the live view, based on the actual positional relationship recognized in step S3, and the process proceeds to step S5. If exactly one such distance measurement position is detected, it is detected as the in-focus position; for example, the distance measurement position shown in FIG. 14 is detected as the in-focus position FP.
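
Steps S4 to S6 can then be sketched as follows (the live-view center coordinates are invented; a tie in step S4 is resolved in step S6 by taking the nearest subject):

```python
CENTER = (3.0, 19.5)  # hypothetical (row, col) of the live-view center

def detect_focus_candidates(corrected):
    """Step S4: the position(s) captured closest to the live-view center."""
    def dist2(p):
        return (p["row"] - CENTER[0]) ** 2 + (p["col"] - CENTER[1]) ** 2
    best = min(dist2(p) for p in corrected)
    return [p for p in corrected if dist2(p) == best]

def select_focus_position(candidates):
    """Steps S5-S6: among several candidates, the one with the smallest
    distance measurement value (the nearest subject) becomes the in-focus
    position FP."""
    return min(candidates, key=lambda p: p["distance_m"])
```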

  In step S5, it is determined whether or not the in-focus position detected in step S4 is one. If only one in-focus position is detected, the process proceeds to step S7, and if a plurality of in-focus positions are detected, the process proceeds to step S6.

  In step S6, among the plurality of detected positions, the one with the smallest distance measurement value is selected as the in-focus position, and the process proceeds to step S7.

  In step S7, when the process proceeds from step S5, the AF operation is performed based on the distance measurement value related to the distance measurement position (focus position) detected in step S4, and the process proceeds to step S8. On the other hand, when the process proceeds from step S6, the AF operation is performed based on the distance measurement value related to the distance measurement position (focus position) detected in step S6, and the process proceeds to step S8.

  In step S8, an exposure operation for actual photographing is performed, and the process proceeds to step S9.

  In step S9, the image data obtained by the exposure operation in step S8 is subjected to various kinds of image processing and compression processing to become image data for storage, which is stored in, for example, the memory card 90. The shooting operation flow then ends.

  As described above, in the imaging apparatus 1 according to the first embodiment, parallax information indicating the relationship between the positional relationship of the imaging area CA and the plurality of distance measurement positions and the distance from the imaging apparatus 1 is stored in advance in a ROM or the like. At the time of shooting, the actual positional relationship between the imaging area and each distance measurement position is recognized based on the parallax information and the distance measurement values for the distance measurement positions. Furthermore, based on the recognized positional relationship, the distance measurement position captured closest to the approximate center of the live view is detected among the plurality of distance measurement positions as the in-focus position, and focusing control of the photographing lens 3 is performed based on the distance measurement value for the detected in-focus position. Therefore, when framing is performed so that the main subject is captured near the center of the live view, the main subject can be focused appropriately. That is, focusing control can be performed in consideration of the amount of deviation between the subject in the live view and the distance measurement positions, and high-speed and appropriate focusing control can be realized in consideration of the parallax between the photographing optical system 3 and the distance measuring device 5.

  Further, the in-focus position is detected from a plurality of distance measurement positions distributed two-dimensionally on the distance measurement object as viewed from the imaging apparatus 1, and focusing control of the photographing lens 3 is performed based on the distance measurement value for the detected in-focus position. As a result, distance measurement can be performed on a distance measurement object that spreads over a wider range in the shooting direction, and more appropriate focusing control can be realized.

<(2) Second Embodiment>
In the first embodiment described above, the relative positional relationship with the imaging region is recognized for each distance measurement position based on the distance measurement values and the parallax information, and control is performed so as to focus on the distance measurement position captured closest to the center of the live view. In the second embodiment, by contrast, a group of positions having almost the same distance measurement value is regarded as one object, whereby the distance measurement object is divided into a plurality of units (hereinafter, "division units"); the relative positional relationship with the imaging area is recognized for each division unit based on the distance measurement value and parallax information for that unit; and, using that positional relationship, control is performed so as to focus on a division unit satisfying a predetermined condition.

  The imaging apparatus 1A according to the second embodiment has the same configuration as the imaging apparatus 1 according to the first embodiment, differing only in the processing content of the camera control unit 20; the same reference numerals are therefore used and a description of the configuration is omitted. Hereinafter, the shooting operation of the imaging apparatus 1A according to the second embodiment will be described, focusing on the differences from the first embodiment.

<(2-1) Shooting operation>
FIGS. 15 and 16 are flowcharts illustrating the shooting operation flow of the imaging apparatus 1A according to the second embodiment; this flow is controlled by the camera control unit 20 as in the first embodiment. As in the first embodiment, when the user performs framing while viewing the live view displayed on the rear LCD 9 or the EVF 10 of the imaging apparatus 1A and presses the shutter button 7, the shooting operation flow starts and proceeds to step S11 of FIG. 15.

  In step S11, AE is performed in the same manner as in step S1 shown in FIG. 12, and the process proceeds to step S12. In step S12, distance measurement is performed in the same manner as in step S2 in FIG. 12, and the process proceeds to step S13.

  In step S13, the camera control unit 20 divides the distance measurement object into a plurality of division units based on the distribution of the distance measurement values for the plurality of distance measurement positions obtained in step S12, and the process proceeds to step S14.
  Here, a method of dividing the distance measurement object into a plurality of division units will be described, using an example, shown in FIG. 17, of shooting a composition in which one person MN stands in front and one tree TR stands behind.

  FIG. 18 is a diagram illustrating the relationship between the distance measurement object and the distance measurement values when distance measurement is performed on the object shown in FIG. 17; solid lines ML1 to ML7 show the relationship between the distance measurement object and the distance measurement values obtained by the seven line sensors. In FIG. 18, the bottom surface indicates the position on the distance measurement object, and the vertical axis indicates the magnitude of the distance measurement value.

  As shown in FIG. 18, the distance measurement values for the person MN in front are almost equal and the smallest, those for the tree TR behind are almost equal and the next smallest, and those for the distant view BG are almost equal and the largest. Therefore, if distance measurement values whose difference is within a predetermined value are recognized as being almost the same, the values are roughly divided into the values MNL for the person MN, the values TRL for the tree TR, and the values BGL for the distant view BG. Thus, based on the distribution of the distance measurement values, the distance measurement object can be divided into a division unit for the person MN, a division unit for the tree TR, and a division unit for the distant view BG.
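
A minimal sketch of this division (step S13), using a simple one-dimensional grouping with an invented 0.3 m tolerance in place of whatever "predetermined value" the patent intends:

```python
TOL_M = 0.3  # invented tolerance for "almost the same" distance values

def divide_into_units(positions):
    """Group the step-S12 measurements into division units by distance."""
    units = []
    for p in sorted(positions, key=lambda q: q["distance_m"]):
        if units and p["distance_m"] - units[-1][-1]["distance_m"] <= TOL_M:
            units[-1].append(p)   # same unit (e.g., person MN, tree TR, BG)
        else:
            units.append([p])     # distance jump -> start a new unit
    return units
```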

  In step S14, the positional relationship with the imaging area CA is recognized for each division unit obtained in step S13, and the process proceeds to step S15. Here, the camera control unit 20 calculates, for example, the average of the distance measurement values of each division unit (hereinafter, the "average distance measurement value") and, taking the relative positional relationship between the imaging area CA and the distance measurement area MA when shooting a subject near 3 m as the reference state, performs a correction (parallax correction) that changes the relative positional relationship between the imaging area CA and each division unit from the positional relationship in the reference state to the actual positional relationship, based on the average distance measurement value of each division unit and the parallax information stored in the ROM.

  Specifically, suppose, as illustrated in FIG. 18, that the average distance measurement value for the person MN is obtained as 2 m and that for the tree TR as 4 m. For the distance measurement positions of the person MN, the amount of deviation between their relative positional relationship with the imaging area CA in the reference state and that when the distance between the imaging apparatus 1A and the positions is 2 m is calculated from the parallax information, and their positional relationship with respect to the imaging area CA is changed by the calculated deviation from the reference-state relationship. Similarly, for the distance measurement positions of the tree TR, the deviation between the reference-state relationship and that at a distance of 4 m is calculated from the parallax information, and their positional relationship with respect to the imaging area CA is changed by that amount. Parallax correction is realized by performing this correction for all the division units.

  FIGS. 19 and 20 are schematic diagrams showing a specific example of parallax correction, showing the relative positional relationship between the imaging area CA and the distance measurement positions of each division unit before and after the parallax correction, respectively. Here, by the parallax correction, as shown in FIG. 20, starting from the reference-state relationship between the imaging area CA and the distance measurement positions of the person MN and the tree TR shown in FIG. 19, the four rows MNC of distance measurement positions for the person MN are shifted upward by the amount of deviation from the reference state, and the four rows TRC of distance measurement positions for the tree TR are shifted downward by the amount of deviation from the reference state.

  Here, each distance measurement position is corrected by shifting it in the vertical direction from the reference state by an amount corresponding to the average distance measurement value of its divided unit, but the present invention is not limited to this. For example, the actual relative positional relationship between the distance measurement positions of each divided unit and the imaging area CA may be obtained directly from the parallax information and the average distance measurement value for each divided unit. That is, it suffices for the camera control unit 20 to recognize, by the parallax correction described above, by direct calculation, or by a similar method, the actual relative positional relationship between each divided unit and the imaging area CA based on the parallax information and the average distance measurement value for each divided unit.

  In step S15, person determination is performed for each divided unit, and the process proceeds to step S21. Here, the shape of each divided unit is analyzed, and based on that shape, divided units corresponding to a person, a predetermined type of object, are detected among the plurality of divided units (hereinafter referred to as "person units"). For example, for each divided unit it is determined whether the lengths of the rows of distance measurement positions corresponding to the seven line sensors show, in order from the top, the change characteristic of a person. Specifically, as with the person MN shown in FIG. 20, if the lengths of the four rows MNC of distance measurement positions related to the person MN change, in order from the top, in a way corresponding to the head, the neck, and the shoulders, the divided unit related to the person MN can be determined to be a person unit.
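  The shape test can be pictured with the following minimal sketch, which classifies a divided unit from the lengths of its rows of distance measurement positions, ordered top to bottom. The head/neck/shoulder rule below is a simplified stand-in for whatever criterion the apparatus actually applies, and all names are hypothetical.

```python
def looks_like_person(row_lengths):
    """True if the row lengths show a person-like change from the top:
    a head, a narrower neck, then noticeably wider shoulders."""
    if len(row_lengths) < 3:
        return False
    head, neck, shoulders = row_lengths[:3]
    return neck <= head < shoulders

print(looks_like_person([3, 2, 6, 6]))   # True: head, neck, shoulders
print(looks_like_person([5, 5, 5, 5]))   # False: no person-like change
```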

  The presence or absence of person units is detected here because, if a person is present among the subjects, that person can be presumed to be the main subject. If only one person is present, that one person can be presumed to be the main subject; accordingly, when only one person unit is detected, that person unit is detected as the object to be focused on (hereinafter referred to as the "focused object").

  In step S21, based on the result of the person determination in step S15, it is determined whether a person is present in the distance measuring object. If a divided unit was determined to be a person unit in step S15, the process proceeds to step S22; if there is no person unit, the process proceeds to step S24.

  In step S22, it is determined whether there are a plurality of person units. If it was determined in step S15 that there are two or more person units, the process proceeds to step S23; if there is exactly one person unit, the process proceeds to step S27.

  In step S23, the person unit captured at the position closest to the center of the live view is detected, and the process proceeds to step S25. Here, the person unit whose head portion is captured closest to the center of the live view is the one detected.

  A method for detecting, among the plurality of person units, the person unit captured at the position closest to the center of the live view will now be described concretely.

  For example, suppose that, instead of the person MN and the tree TR described above, a person MN1 and a person MN2 stand at about 2 m and about 4 m from the imaging apparatus 1A, respectively, as shown in FIG. 21. FIGS. 21 and 22 are schematic diagrams showing a specific example of the parallax correction and show the relative positional relationship between the imaging area CA and the distance measurement positions of each divided unit before and after the parallax correction, respectively. In this case, the parallax correction shifts the four rows MN1C of distance measurement positions related to the person MN1 upward by their amount of deviation from the reference state, and shifts the seven rows MN2C of distance measurement positions related to the person MN2 downward by their amount of deviation from the reference state, as shown in FIG. 22.

  At this time, for each of the persons MN1 and MN2, the rows of distance measurement positions corresponding to the head are detected based on the change in length of that unit's rows of distance measurement positions. Then, among the rows corresponding to the heads, the person unit having the row closest to the center CO of the live view is detected as the person unit captured at the position closest to the center of the live view. Specifically, as shown in FIG. 22, among the rows HL1a and HL1b of distance measurement positions corresponding to the head of the person MN1 and the row HL2 corresponding to the head of the person MN2, the row HL1b related to the person MN1 is captured at the position closest to the center of the live view, so the person unit related to the person MN1 is detected as the person unit captured closest to the center of the live view.
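  In code, the selection reduces to a nearest-point search over the detected head rows, as in this sketch (coordinates and data structures hypothetical):

```python
import math

def nearest_to_center(person_units, center_xy):
    """Return the person unit whose detected head row lies closest to
    center_xy in the live view."""
    return min(person_units,
               key=lambda u: math.dist(u["head_xy"], center_xy))

units = [
    {"name": "MN1", "head_xy": (130.0, 110.0)},   # e.g. head row HL1b
    {"name": "MN2", "head_xy": (210.0, 90.0)},    # e.g. head row HL2
]
print(nearest_to_center(units, (160.0, 120.0))["name"])   # MN1
```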

  The person unit captured at the position closest to the center of the live view is detected here because, when the subject includes a plurality of persons, framing is assumed to be performed so that the person of interest is captured near the center of the live view. That is, when there are a plurality of person units and exactly one of them is captured at the position closest to the center of the live view, that person unit is detected as the object to be focused on (the focused object).

  In step S24, among the divided units excluding those related to the distant view BG (the remaining units are hereinafter referred to as "foreground units"), the foreground unit captured at the position closest to the center of the live view is detected, and the process proceeds to step S25. Here, among the rows of distance measurement positions related to the plurality of foreground units, the foreground unit having the row captured closest to the center of the live view is detected. The divided units related to the distant view BG are excluded because, when a distant view BG and a foreground (a person or the like) are mixed in the shooting area, it is usual to focus on the foreground. Divided units related to the distant view BG can be distinguished from foreground units by applying a predetermined threshold to the average distance measurement value of each divided unit. Also in step S24, when exactly one foreground unit is detected at the position closest to the center of the live view, that foreground unit is detected as the focused object.
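  The background exclusion amounts to a simple threshold filter, as in this sketch; the 8 m cutoff is an invented example of the "predetermined threshold" mentioned above:

```python
BACKGROUND_THRESHOLD_M = 8.0   # hypothetical distant-view cutoff

def foreground_units(units):
    """Keep divided units whose average distance value marks them as
    near view; units at or beyond the threshold count as distant view."""
    return [u for u in units if u["avg_distance_m"] < BACKGROUND_THRESHOLD_M]

units = [
    {"name": "flower", "avg_distance_m": 1.5},
    {"name": "background BG", "avg_distance_m": 30.0},
]
print([u["name"] for u in foreground_units(units)])   # ['flower']
```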

  In step S25, it is determined whether a plurality of divided units were detected in step S23 or step S24. For example, in step S23, if several of the rows of distance measurement positions corresponding to the heads of the person units lie equally close to the center of the live view, a plurality of person units may be detected. Similarly, in step S24, if several of the rows of distance measurement positions related to the foreground units lie equally close to the center of the live view, a plurality of foreground units may be detected. Step S25 therefore determines, as a preparatory stage for narrowing the final focus target down to one divided unit, whether a plurality of divided units were detected. If so, the process proceeds to step S26; if not, the process proceeds to step S27.

  In step S26, among the plurality of divided units detected in step S23 or step S24, the divided unit closest to the imaging apparatus 1A is detected, and the process proceeds to step S27. Here, the average distance measurement values of the respective divided units are compared, and the divided unit with the smallest average distance measurement value is detected as the one closest to the imaging apparatus 1A.

  The divided unit closest to the imaging apparatus 1A is detected here because, when a plurality of person units, or a plurality of divided units other than person units, are detected near the center of the live view, the divided unit at the shorter distance from the imaging apparatus 1A can be presumed to correspond to the main subject. That is, when a plurality of person units are captured near the center of the live view, the person unit closer to the imaging apparatus 1A is detected as the focused object, and when a plurality of foreground units are captured near the center of the live view, the foreground unit closer to the imaging apparatus 1A is detected as the focused object.

  In step S27, the distance measurement value used for AF is determined, and the process proceeds to step S28. For example, when the process arrives from step S22, the average distance measurement value of the single person unit is adopted as the distance measurement value for AF; when it arrives from step S25, the average distance measurement value of the single divided unit detected in step S23 or step S24 is adopted; and when it arrives from step S26, the average distance measurement value of the single divided unit detected in step S26 is adopted.
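  Steps S21 through S27 can be condensed into one selection routine, sketched below under hypothetical data structures: person units take precedence, proximity to the live-view center breaks the first tie, and proximity to the camera breaks any remaining tie.

```python
import math

def af_distance(units, center_xy, background_threshold_m=8.0):
    """units: dicts with 'is_person', 'avg_distance_m' and 'center_xy'.
    Returns the distance measurement value to use for AF, or None."""
    pool = [u for u in units if u["is_person"]]               # steps S21-S22
    if not pool:
        pool = [u for u in units
                if u["avg_distance_m"] < background_threshold_m]  # step S24
    if not pool:
        return None
    best = min(math.dist(u["center_xy"], center_xy) for u in pool)
    tied = [u for u in pool                        # steps S23, S25
            if math.dist(u["center_xy"], center_xy) == best]  # exact ties only
    focused = min(tied, key=lambda u: u["avg_distance_m"])    # steps S26-S27
    return focused["avg_distance_m"]
```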

  In step S28, the AF operation is performed based on the distance measurement value related to the focused object determined in step S27, and the process proceeds to step S29.

  In step S29 and step S30, processing similar to that in step S8 and step S9 shown in FIG. 12 is performed, and the photographing operation flow ends.

  As described above, in the imaging apparatus 1A according to the second embodiment, the distance measuring object is divided into a plurality of divided units based on the distribution of the distance measurement values. The relative positional relationship between the imaging area CA and each divided unit is then recognized based on the parallax information and the average distance measurement value of each divided unit. Based on this relative positional relationship, a divided unit satisfying a predetermined condition, such as being captured at the position closest to the center of the live view, is detected as the focused object, and focusing control of the photographing lens 3 is performed based on the distance measurement value related to that object. This configuration realizes fast and appropriate focusing control that takes the parallax between the photographing optical system and the distance measuring device into account.

  Further, person units are detected based on the shapes of the divided units, and when a plurality of person units are detected, the one closest to the center of the live view is detected as the focused object. Therefore, when framing is performed while viewing the live view, the apparatus can properly focus on the person who is the main subject the user chiefly wants to shoot; that is, focusing control that well reflects the user's intention can be performed.

  When no person unit is detected, the foreground unit closest to the center of the live view, among the foreground units excluding the divided units related to the distant view, is detected as the focused object. As a result, when framing is performed while viewing the live view, the apparatus can properly focus on the near view that is the main subject the user chiefly wants to shoot; that is, focusing control that well reflects the user's intention can be performed.

<(3) Third embodiment>
In the imaging apparatus 1 according to the first embodiment, as shown in FIG. 5, the light receiving unit PS of the distance measuring device 5 has two light receiving sensors in each of which seven line sensors PSa1 to PSa7 or PSb1 to PSb7 are spaced apart from one another in a substantially vertical direction. In the imaging apparatus 1B according to the third embodiment, by contrast, the light receiving unit PS has two light receiving sensors PSc and PSd, arranged in parallel, in each of which three line sensors PSc1 to PSc3 or PSd1 to PSd3 are spaced apart from one another in a substantially vertical direction, as shown in FIG. 23. Whereas the imaging apparatus 1 obtains a distance measurement value for each of 40 areas per set of line sensors, the imaging apparatus 1B obtains one distance measurement value for each of the sets of line sensors PSc1 and PSd1, PSc2 and PSd2, and PSc3 and PSd3. Because of this difference, the imaging apparatus 1 and the imaging apparatus 1B differ in the parallax correction performed by the camera control unit 20 and in the method of determining the in-focus position. Apart from the number of line sensors, the configuration of the imaging apparatus 1B according to the third embodiment is the same as that of the imaging apparatus 1 according to the first embodiment, so the same reference numerals are used for the same components and their description is omitted.

  The parallax correction and the method of determining the in-focus position, which differ from those of the first embodiment, are described below, taking as an example the shooting of a composition in which an airplane model AP is placed on a desk DK.

  FIGS. 24 and 25 are diagrams for concretely explaining the parallax correction and the determination of the in-focus position; they show the relative positional relationship between the imaging area CA and the distance measurement positions MP1 to MP3 before and after the parallax correction, respectively. FIG. 24 shows the relative positional relationship between the imaging area CA of the imaging apparatus 1B and the three distance measurement positions MP1 to MP3 of the distance measuring device 5 in the state of photographing a subject near 3 m (the reference state). The distance measurement position MP1 is set on the object to be measured by the line sensors PSc1 and PSd1, the position MP2 by the line sensors PSc2 and PSd2, and the position MP3 by the line sensors PSc3 and PSd3.

  For example, when shooting the composition shown in FIG. 24, if the distance measurement values for the distance measurement positions MP1 to MP3 are calculated as 70 cm, 60 cm, and 50 cm, respectively, parallax correction is performed based on these values and the parallax information. Specifically, starting from the relative positional relationship between the imaging area CA and the distance measurement positions MP1 to MP3 in the reference state shown in FIG. 24, each distance measurement position is corrected by shifting it upward by its amount of deviation from the reference state, as shown in FIG. 25.

  Here, since the distance measurement values for the positions MP1 to MP3 are all smaller than 3 m, the parallax correction shifts the positions MP1 to MP3 upward; naturally, if the distance measurement values were larger than 3 m, the correction would shift them downward. Also, although each distance measurement position is corrected here by shifting it in the vertical direction from the reference state by an amount corresponding to its distance measurement value, the actual relative positional relationship between the distance measurement positions MP1 to MP3 and the imaging area CA may instead be obtained and recognized directly from the parallax information and the distance measurement values for the positions MP1 to MP3.
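  One plausible way to hold and apply the parallax information is a small distance-to-offset table interpolated at the measured distance, as in this sketch; the table values are invented, since the real values would be calibrated and stored in ROM:

```python
# Hypothetical parallax information: subject distance (m) -> vertical
# offset of a distance measurement position from the 3 m reference (px).
PARALLAX_TABLE = [(0.5, 40.0), (1.0, 20.0), (3.0, 0.0), (10.0, -7.0)]

def offset_for(distance_m):
    """Piecewise-linear interpolation of the parallax table."""
    pts = sorted(PARALLAX_TABLE)
    if distance_m <= pts[0][0]:
        return pts[0][1]
    for (d0, o0), (d1, o1) in zip(pts, pts[1:]):
        if distance_m <= d1:
            t = (distance_m - d0) / (d1 - d0)
            return o0 + t * (o1 - o0)
    return pts[-1][1]

# 0.7 m, 0.6 m and 0.5 m are all under 3 m, so MP1-MP3 all shift upward.
for d in (0.7, 0.6, 0.5):
    print(round(offset_for(d), 1))   # 32.0, 36.0, 40.0
```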

  Next, based on the relative positional relationship between the actual distance measurement positions MP1 to MP3 and the imaging area CA shown in FIG. 25, the distance measurement position captured at the position closest to the center of the live view (here, the position MP3) is detected as the in-focus position. If a plurality of distance measurement positions are captured at the position closest to the center CE of the live view, the one with the smallest distance measurement value among them is detected as the in-focus position. Since the imaging operation flow of the imaging apparatus 1B according to the third embodiment is the same as that of the first embodiment shown in FIG. 12, its description is omitted here.

  As described above, in the imaging apparatus 1B according to the third embodiment, the light receiving unit PS of the distance measuring device 5 has the two light receiving sensors PSc and PSd, each provided with three line sensors PSc1 to PSc3 or PSd1 to PSd3 spaced apart in a substantially vertical direction, and one distance measurement value is obtained for each set of line sensors. Compared with the imaging apparatus 1 according to the first embodiment, this simplifies the configuration of the distance measuring device 5 and, by reducing the number of distance measurement positions, greatly reduces the amount of calculation required for computing the distance measurement values and performing the parallax correction. Accordingly, fast and appropriate focusing control that takes the parallax between the photographing optical system and the distance measuring device into account can be realized with a simple configuration.

<(4) Modification>
While embodiments of the present invention have been described above, the present invention is not limited to what has been described.

  For example, in the first and second embodiments, the light receiving sensors PSa and PSb of the distance measuring device 5 are each configured as a plurality of line sensors spaced apart in a substantially vertical direction. However, as shown in FIG. 26, the light receiving sensors PSa and PSb may instead be two-dimensional sensors (hereinafter also referred to as "area sensors") in which m × m light receiving elements are arranged adjacently in the substantially horizontal and substantially vertical directions. Each of the light receiving sensors PSa and PSb may then be divided into n areas (for example, n = 40) in each of the substantially horizontal and vertical directions, and a correlation calculation may be performed on the data of the corresponding areas of the two sensors to acquire distance measurement values for, for example, 40 × 40 = 1600 areas.
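  The correlation calculation between corresponding areas of the two sensors is, in essence, a disparity search; the sketch below uses a sum-of-absolute-differences (SAD) criterion on synthetic data as one plausible form of it (window and search sizes invented):

```python
import numpy as np

def area_disparity(img_a, img_b, y0, x0, win=8, max_shift=16):
    """Vertical shift of window (y0, x0) between the two area sensors
    that minimizes the sum of absolute differences (SAD)."""
    ref = img_a[y0:y0 + win, x0:x0 + win].astype(np.int32)
    best_shift, best_sad = 0, None
    for s in range(max_shift + 1):
        cand = img_b[y0 + s:y0 + s + win, x0:x0 + win].astype(np.int32)
        if cand.shape != ref.shape:
            break
        sad = int(np.abs(ref - cand).sum())
        if best_sad is None or sad < best_sad:
            best_shift, best_sad = s, sad
    return best_shift

rng = np.random.default_rng(0)
a = rng.integers(0, 255, (64, 64), dtype=np.uint8)
b = np.roll(a, 3, axis=0)            # simulate a 3-pixel vertical disparity
print(area_disparity(a, b, 10, 10))  # 3; distance follows by triangulation
```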

  Such a configuration makes it possible to measure a larger number of points on the distance measuring object more finely, which improves the accuracy of the parallax correction and of the determination of the in-focus position, that is, the accuracy of the focusing control. However, since the amount of calculation required for computing the distance measurement values and performing the parallax correction grows with the number of measured positions, it is preferable, from the viewpoint of speeding up the focusing control, to configure the light receiving sensors PSa and PSb as a plurality of spaced-apart line sensors as in the first embodiment. The configuration of the distance measuring device 5 may therefore be chosen according to whether accuracy or speed of focusing control is given priority.

  In the embodiments described above, a distance measuring device 5 employing the external-light passive method is used, but the invention is not limited to this; for example, an external-light distance measuring device employing an active method that projects near-infrared light may be used. However, the reach of near-infrared light is limited and, as described above, when the active method is combined with a distance measurement technique using many line sensors or area sensors, light must be projected over the whole measured range, which enlarges the projecting portion and increases power consumption. In view of these problems, using the passive method for the distance measuring device 5 is preferable and considered realistic.

  In the first embodiment described above, the relative positional relationship between the imaging area CA and each distance measurement position is recognized by performing the parallax correction for each distance measurement position individually. Instead, regarding a subject that is close to the imaging apparatus and captured at a height near the center of the live view as the main subject, the parallax correction may shift the whole row of distance measurement positions of each line sensor together based on the smallest distance measurement value obtained for that line sensor; then, among the corrected rows, the row captured at the position closest to the center of the live view may be detected, and focusing control may be performed based on the smallest distance measurement value of that row.
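  A sketch of this collective shift, under hypothetical structures (one entry per line sensor, with a toy offset function standing in for the stored parallax information):

```python
def shift_rows(rows, offset_for):
    """rows: dicts with 'y' (row height in the reference state, px) and
    'values' (that line sensor's distance measurement values, metres).
    Returns corrected row heights after shifting each row as a whole."""
    corrected = []
    for row in rows:
        nearest = min(row["values"])          # nearest object governs the row
        corrected.append(row["y"] - offset_for(nearest))   # up = smaller y
    return corrected

toy_offset = lambda d: 45.0 * (1.0 / d - 1.0 / 3.0)   # positive = upward
print(shift_rows([{"y": 100.0, "values": [2.0, 3.5]}], toy_offset))  # [92.5]
```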

  In the embodiments described above, the line sensors in the distance measuring device 5 are spaced apart in a substantially vertical direction because the distance measuring device 5 is arranged away from the optical axis P1 of the photographing lens 3 in the vertical direction. The invention is not limited to this: for example, the distance measuring device 5 may be arranged away from the optical axis P1 in a predetermined direction such as the horizontal direction, with the line sensors in the distance measuring device 5 spaced apart from one another in substantially that predetermined direction.

  In the embodiments described above, it is assumed that framing is performed so that the main subject is located near the center of the live view, and the distance measurement position, person unit, foreground unit, or the like closest to the center is detected. The invention is not limited to this: for an imaging apparatus that lets the user designate the position of the subject to be focused on, for example by moving a cursor superimposed on the live view, the distance measurement position, person unit, foreground unit, or the like captured closest to the designated position in the live view may be detected instead. When a plurality of distance measurement positions are captured at the position closest to the designated position, the one with the smallest distance measurement value among them can be detected as the in-focus position. That is, it suffices to detect, based on the relative positional relationship between the photographing area CA and the distance measurement positions, a distance measurement position satisfying a predetermined condition among the plurality of distance measurement positions as the in-focus position. Such a configuration obtains the same operations and effects as the imaging apparatus 1 according to the first embodiment.

  In the second embodiment described above, when it is determined in step S22 that there are two or more person units, the person unit captured at the position closest to the center of the live view is detected in step S23. Instead, for example, the person unit closest to the imaging apparatus 1A may be detected among the two or more person units. This configuration makes it easy to detect the person unit to be focused on from among a plurality of person units.

  In the second embodiment described above, the average distance measurement value of the divided unit is determined in step S27 as the distance measurement value for AF, but the invention is not limited to this. For example, the distance measurement value of the distance measurement position at the center of the row corresponding to the head of the person unit may be adopted as the distance measurement value for AF. With this configuration, focus is placed near the center of the face of the person who is the main subject rather than near the contour of the face, so the person can be focused on more appropriately.
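  A minimal sketch of this variant, assuming the head row's distance values are available as a list (names hypothetical):

```python
def head_center_distance(head_row_values):
    """Distance value at the middle of the row detected as the head,
    i.e. near the centre of the face rather than its contour."""
    return head_row_values[len(head_row_values) // 2]

print(head_center_distance([2.10, 2.02, 2.08]))   # 2.02
```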

  In the second embodiment described above, the distant view BG is assumed not to be focused on. However, since there are cases where focusing on the distant view BG is desired, a distant view mode for focusing on the distant view BG may be provided, and in that mode the distant view BG may be selectively detected as the focused object.

  In the second embodiment described above, person units are detected based on the shapes of the divided units, but the invention is not limited to this. For example, divided units matching the horizontally long shape of an automobile may be detected. That is, based on the shape of each divided unit, divided units corresponding to a predetermined type of object such as a person or an automobile (also referred to as "predetermined type units") may be detected among the plurality of divided units. With this configuration, when framing is performed while viewing the live view, the apparatus can properly focus on the predetermined type of object the user chiefly wants to shoot, performing focusing control that well reflects the user's intention.

  In the embodiments described above, distance measurement is performed by the distance measuring device 5 and focusing control is performed accordingly, but the invention is not limited to this. For example, operating the mode switching dial 8 may switch between a first mode in which focusing control is performed by distance measurement with the distance measuring device 5, as in the embodiments described above, and a second mode in which contrast-method focusing control is performed using the image data from the image sensor 13.

  In the embodiments described above, the parallax arising in the vertical direction is corrected because the distance measuring device 5 and the photographing lens 3 are spaced apart in the vertical direction. The invention is not limited to this: even when the distance measuring device 5 and the photographing lens 3 are spaced apart in other directions, the parallax correction required for each such direction can be performed by the same method as in the embodiments described above.

  For example, when the distance measuring device 5, with sensors arranged as shown in FIG. 5, and the photographing lens 3 are spaced apart in the left-right direction, a parallax correction that changes the relative positional relationship of each distance measurement position with respect to the imaging region from the reference-state relationship to the actual relationship can be performed based on the distance measurement value of each position and parallax information for the left-right direction. Furthermore, based on the relative positional relationship of each distance measurement position with respect to the imaging region after the parallax correction, a distance measurement position satisfying a predetermined condition among the plurality of positions can be detected as the in-focus position. That is, the corrected positional relationship can determine, for example, which light receiving area on a given line sensor should supply the distance measurement value on which focusing control is based.
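  As a sketch of the left-right case, the corrected horizontal position can be mapped straight to a light receiving area index; the offset function and area width are invented stand-ins for the stored left-right parallax information:

```python
def receiving_area(x_ref, distance_m, offset_x, area_width_px=8):
    """Index of the line-sensor light receiving area that covers a
    reference-state x position at the given subject distance."""
    x_actual = x_ref + offset_x(distance_m)    # horizontal parallax shift
    return int(x_actual // area_width_px)

toy_offset_x = lambda d: 45.0 * (1.0 / d - 1.0 / 3.0)   # hypothetical
print(receiving_area(60.0, 1.5, toy_offset_x))   # area 9 rather than area 7
```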

FIG. 1 is a front view of an imaging apparatus 1 according to a first embodiment of the present invention.
FIG. 2 is a sectional view of the imaging apparatus 1 as seen from the side.
FIG. 3 is a block diagram showing the internal functions of the imaging apparatus 1.
FIG. 4 is a sectional view of the distance measuring device 5 as seen from above.
FIG. 5 is a diagram showing the arrangement of the ranging sensors in the distance measuring device 5.
FIG. 6 is a diagram explaining the distance measuring method of the light receiving sensors.
FIG. 7 is a diagram illustrating the relative positional relationship between an imaging area and a distance measuring area.
FIG. 8 is a diagram explaining the parallax between the photographing lens and the distance measuring device.
FIG. 9 is a diagram for explaining a problem caused by parallax.
FIG. 10 is a diagram for explaining a problem caused by parallax.
FIG. 11 is a diagram for explaining a problem caused by parallax.
FIG. 12 is a flowchart illustrating the imaging operation flow of the imaging apparatus 1.
FIG. 13 is a diagram for explaining parallax correction.
FIG. 14 is a diagram for explaining parallax correction.
FIG. 15 is a flowchart illustrating the photographing operation flow according to the second embodiment.
FIG. 16 is a flowchart illustrating the photographing operation flow according to the second embodiment.
FIG. 17 is a diagram illustrating a distance measuring object.
FIG. 18 is a diagram illustrating the relationship between a distance measuring object and distance measurement values.
FIG. 19 is a diagram for explaining parallax correction.
FIG. 20 is a diagram for explaining parallax correction.
FIG. 21 is a diagram relating to the method of detecting the person unit near the screen center.
FIG. 22 is a diagram relating to the method of detecting the person unit near the screen center.
FIG. 23 is a schematic front view of the light receiving unit PS of the distance measuring device 5 according to the third embodiment.
FIG. 24 is a diagram for explaining the parallax correction and the determination of the in-focus position.
FIG. 25 is a diagram for explaining the parallax correction and the determination of the in-focus position.
FIG. 26 is a diagram showing light receiving sensors according to a modification.

Explanation of symbols

1, 1A, 1B  Imaging apparatus
3  Photographing lens
5  Distance measuring device
9  Liquid crystal display screen (rear LCD)
10  Electronic viewfinder (EVF)
13  Image sensor
20  Camera control unit
25  Ranging control unit
PS  Light receiving unit

Claims (8)

  1. An imaging apparatus comprising:
    image acquisition means for acquiring an image based on a light image formed by a photographing lens;
    display means for displaying the image as a preview image;
    external-light type distance measurement data acquisition means, arranged away from the optical axis of the photographing lens in a predetermined direction, for acquiring distance measurement data for each of a plurality of distance measurement positions including a plurality of positions whose separations from the optical axis in the predetermined direction differ;
    distance measurement value calculation means for calculating distance measurement values for the plurality of distance measurement positions based on the distance measurement data acquired by the distance measurement data acquisition means;
    storage means for storing parallax information indicating the relation between the positional relationship of the imaging region of the image acquisition means to the plurality of distance measurement positions and the distance from the imaging apparatus;
    recognition means for recognizing, at the time of shooting, the actual positional relationship between the imaging region and the plurality of distance measurement positions based on the parallax information and the distance measurement values for the plurality of distance measurement positions calculated by the distance measurement value calculation means;
    detection means for detecting, as an in-focus position, a distance measurement position satisfying a predetermined condition among the plurality of distance measurement positions based on the actual positional relationship recognized by the recognition means; and
    control means for performing focusing control of the photographing lens based on the distance measurement value related to the in-focus position.
  2. The imaging apparatus according to claim 1, wherein the plurality of distance measurement positions are distributed two-dimensionally on a distance measuring object as viewed from the imaging apparatus.
  3. The imaging apparatus according to claim 1 or 2, wherein the detection means detects, as the in-focus position, the distance measurement position captured at a position closest to a predetermined position in the image among the plurality of distance measurement positions.
  4. The imaging apparatus according to claim 3, wherein the predetermined position is substantially the center of the image.
  5. An imaging apparatus comprising:
    image acquisition means for acquiring an image based on a light image formed by a photographing lens;
    display means for displaying the image as a preview image;
    external-light type distance measurement data acquisition means, arranged away from the optical axis of the photographing lens in a predetermined direction, for acquiring distance measurement data for a plurality of distance measurement positions distributed two-dimensionally on a distance measuring object as viewed from the imaging apparatus;
    distance measurement value calculation means for calculating distance measurement values for the plurality of distance measurement positions based on the distance measurement data acquired by the distance measurement data acquisition means;
    storage means for storing parallax information indicating the relation between the positional relationship of the imaging region of the image acquisition means to the plurality of distance measurement positions and the distance from the imaging apparatus;
    division means for dividing, at the time of shooting, the distance measuring object into a plurality of units based on the distribution of the distance measurement values for the plurality of distance measurement positions calculated by the distance measurement value calculation means;
    recognition means for recognizing the actual positional relationship between the imaging region and the plurality of units based on the parallax information and the distance measurement values for the plurality of distance measurement positions calculated by the distance measurement value calculation means;
    detection means for detecting, as a focused object, a unit satisfying a predetermined condition among the plurality of units based on the actual positional relationship recognized by the recognition means; and
    control means for performing focusing control of the photographing lens based on the distance measurement value related to the focused object.
  6. The imaging apparatus according to claim 5, wherein the detection means comprises:
    unit detection means for detecting, based on the shape of each unit, predetermined type units corresponding to a predetermined type of object among the plurality of units; and
    means for detecting, as the focused object, the predetermined type unit captured at a position closest to a predetermined position in the image among the plurality of predetermined type units detected by the unit detection means.
  7. The imaging apparatus according to claim 6, wherein the predetermined type of object is a person.
  8. The imaging apparatus according to claim 5, wherein the detection means detects, as the focused object, the unit captured at a position closest to a predetermined position in the image, excluding units related to a distant view among the plurality of units.
JP2004117147A 2004-04-12 2004-04-12 Imaging apparatus Pending JP2005300925A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004117147A JP2005300925A (en) 2004-04-12 2004-04-12 Imaging apparatus


Publications (1)

Publication Number Publication Date
JP2005300925A true JP2005300925A (en) 2005-10-27

Family

ID=35332533

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004117147A Pending JP2005300925A (en) 2004-04-12 2004-04-12 Imaging apparatus

Country Status (1)

Country Link
JP (1) JP2005300925A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2552101A3 (en) * 2006-04-11 2014-08-27 Nikon Corporation Electronic camera and image processing apparatus
US9485415B2 (en) 2006-04-11 2016-11-01 Nikon Corporation Electronic camera and image processing apparatus
JP2011059415A (en) * 2009-09-10 2011-03-24 Canon Inc Image pickup apparatus and range-finding method
JP2012137674A (en) * 2010-12-27 2012-07-19 Canon Inc Focus detector and imaging apparatus with the same
