WO2015163350A1 - Image processing device, imaging device and image processing program - Google Patents

Info

Publication number
WO2015163350A1
Authority
WO
WIPO (PCT)
Prior art keywords
parallax
amount
image data
subject
value
Prior art date
Application number
PCT/JP2015/062202
Other languages
French (fr)
Japanese (ja)
Inventor
潤弥 萩原
清茂 芝崎
石賀 健一
文樹 中村
祐介 髙梨
Original Assignee
Nikon Corporation (株式会社ニコン)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corporation (株式会社ニコン)
Publication of WO2015163350A1 publication Critical patent/WO2015163350A1/en

Links

Images

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof

Definitions

  • the present invention relates to an image processing device, an imaging device, and an image processing program.
  • Patent Literature: JP-A-8-47001
  • Stereo image data captured by a stereo imaging device may exhibit extreme parallax between the left and right images, depending on how subjects are placed in the scene, causing viewers discomfort and fatigue while viewing. On the other hand, even when the parallax amount is adjusted to suppress such extreme parallax, there is a demand for the main subject to retain a stereoscopic effect.
  • An image processing apparatus includes: an acquisition unit that acquires image data; a calculation unit that calculates the amount of parallax of each of a plurality of subjects included in the image of the image data; a determination unit that determines, in accordance with the calculated amounts of parallax, the adjustment parameter values to be applied to each of the plurality of subjects when generating parallax image data from the image data; and a generation unit that applies the adjustment parameter values to the image data to generate parallax image data whose parallax amounts are adjusted relative to one another.
  • An image processing program causes a computer to execute: an acquisition step of acquiring image data; a calculation step of calculating the amount of parallax of each of a plurality of subjects included in the image of the image data; a determination step of determining, in accordance with the calculated amounts of parallax, the adjustment parameter values to be applied to each of the plurality of subjects when generating parallax image data from the image data; and a generation step of applying the adjustment parameter values to the image data to generate parallax image data whose parallax amounts are adjusted relative to one another.
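The acquisition → calculation → determination → generation flow described above can be sketched as follows. This is a minimal illustration under assumed interfaces (block matching over named subject regions, and a linear model of the adjusted parallax); function names and the search strategy are hypothetical, not the patent's implementation:

```python
import numpy as np

def calculate_parallax(left, right, regions):
    """Estimate a per-subject parallax amount as the horizontal shift
    (in pixels) that best aligns each subject region across the views."""
    amounts = {}
    for name, (y0, y1, x0, x1) in regions.items():
        ref = left[y0:y1, x0:x1].astype(float)
        best_d, best_err = 0, float("inf")
        for d in range(-8, 9):  # search a small range of candidate shifts
            cand = np.roll(right, d, axis=1)[y0:y1, x0:x1].astype(float)
            err = float(np.mean((ref - cand) ** 2))
            if err < best_err:
                best_d, best_err = d, err
        amounts[name] = best_d
    return amounts

def determine_parameter(parallax, target=4):
    """Choose a stereoscopic adjustment parameter C in [0.5, 1.0] per
    subject.  With adjusted parallax ~ (2C - 1) * d, keeping it within
    the target amount requires C <= 0.5 + target / (2 * |d|)."""
    return {name: min(1.0, 0.5 + target / (2.0 * max(abs(d), 1)))
            for name, d in parallax.items()}
```

A subject with large measured parallax receives a parameter close to 0.5 (strong suppression), while one already within the target keeps C = 1.0 (no suppression).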
  • The digital camera according to the present embodiment, which is one form of the imaging device, is configured to generate a plurality of viewpoint images of a single scene in one shot. Images with mutually different viewpoints are called parallax images.
  • In the present embodiment, a case where a right parallax image and a left parallax image are generated from two viewpoints corresponding to the right eye and the left eye will be described.
  • The digital camera in the present embodiment can also generate, together with the parallax images, a non-parallax image from the central viewpoint.
  • FIG. 1 is a diagram illustrating a configuration of a digital camera 10 according to an embodiment of the present invention.
  • the digital camera 10 includes a photographic lens 20 as a photographic optical system, and guides a subject light beam incident along the optical axis 21 to the image sensor 100.
  • the photographing lens 20 may be an interchangeable lens that can be attached to and detached from the digital camera 10.
  • the digital camera 10 includes an image sensor 100, a control unit 201, an A/D conversion circuit 202, a memory 203, a drive unit 204, an image processing unit 205, a memory card IF 207, an operation unit 208, a display unit 209, and an LCD drive circuit 210.
  • The direction parallel to the optical axis 21 toward the image sensor 100 is defined as the positive Z-axis direction, the direction toward the front of the drawing in the plane orthogonal to the Z axis as the positive X-axis direction, and the upward direction in the drawing as the positive Y-axis direction. In subsequent figures, coordinate axes are displayed so that the orientation of each figure can be understood with reference to the coordinate axes of FIG. 1.
  • the photographing lens 20 is composed of a plurality of optical lens groups, and forms an image of a subject light flux from the scene in the vicinity of its focal plane.
  • the photographic lens 20 is represented by a single virtual lens arranged in the vicinity of the pupil. Further, in the vicinity of the pupil, a diaphragm 22 that restricts the incident light beam concentrically with the optical axis 21 as the center is disposed.
  • the image sensor 100 is disposed in the vicinity of the focal plane of the photographing lens 20.
  • the image sensor 100 is an image sensor such as a CCD or CMOS sensor in which a plurality of photoelectric conversion elements are two-dimensionally arranged.
  • the image sensor 100 is controlled in timing by the drive unit 204, converts the subject image formed on the light receiving surface into an image signal, and outputs the image signal to the A / D conversion circuit 202.
  • the A / D conversion circuit 202 converts the image signal output from the image sensor 100 into a digital image signal and outputs the digital image signal to the memory 203.
  • the image processing unit 205 performs various image processing on the digital image signal using the memory 203 as a work space, and generates captured image data.
  • the captured image data includes reference image data generated from the output of the non-parallax pixels of the image sensor 100 and parallax image data generated from the output of the parallax pixels of the image sensor 100, as will be described later.
  • The control unit 201 controls the digital camera 10 in an integrated manner. For example, it adjusts the aperture of the diaphragm 22 according to the set aperture value, and advances and retracts the photographing lens 20 in the optical-axis direction according to the AF evaluation value. It also detects the position of the photographing lens 20 to keep track of the focal length and focus-lens position of the photographing lens 20. Further, it transmits timing control signals to the drive unit 204, manages the series of imaging controls from the output of the image signal by the image sensor 100 until it is processed into captured image data by the image processing unit 205, and acquires the captured image data.
  • control unit 201 includes a depth information detection unit 230.
  • the depth information detection unit 230 detects the subject distribution in the depth direction with respect to the scene. Specifically, the control unit 201 detects the subject distribution from the defocus amount for each subdivided area using defocus information used for autofocus.
  • the defocus information may use the output of a phase difference sensor provided exclusively, or may use the output of parallax pixels of the image sensor 100.
  • the parallax image data processed by the image processing unit 205 can also be used.
  • The subject distribution can also be detected without using defocus information, by advancing and retracting the focus lens and calculating a contrast-AF evaluation value for each subdivided region.
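The depth-detection idea above can be sketched by classifying each subdivided region according to its defocus amount. This is a simplified illustration; the sign convention, threshold, and names are assumptions, not from the patent:

```python
import numpy as np

def subject_distribution(defocus_map, tol=0.1):
    """Classify each subdivided region by defocus amount: regions with
    negative defocus are taken to lie on the near side of the focal
    plane, positive on the far side, and |d| <= tol count as in focus."""
    d = np.asarray(defocus_map, dtype=float)
    return {
        "near": int(np.sum(d < -tol)),
        "focus": int(np.sum(np.abs(d) <= tol)),
        "far": int(np.sum(d > tol)),
    }
```

The resulting counts approximate the subject distribution in the depth direction, which the determination unit can use when choosing per-subject adjustment parameters.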
  • the image processing unit 205 processes the image signal output from the image sensor 100 to generate captured image data.
  • the image processing unit 205 includes a calculation unit 231, a determination unit 232, a parallax image data generation unit 233, and a moving image generation unit 234.
  • the calculation unit 231 calculates the parallax amount of each subject included in the image of the captured image data, that is, the unadjusted parallax amount from the left and right parallax image data described later.
  • the determination unit 232 determines a change condition for changing the parallax amount according to the parallax amount of each subject calculated by the calculation unit 231. More specifically, the determination unit 232 determines the value of the stereoscopic adjustment parameter so that the amount of parallax between output parallax images falls within the target amount of parallax.
  • This stereoscopic adjustment parameter is a parameter that is applied to adjust the amount of parallax of parallax image data when generating parallax image data from captured image data.
  • the parallax image data generation unit 233 applies the stereoscopic adjustment parameter to the captured image data to generate parallax image data whose parallax amounts are adjusted relative to each other. Details of the calculation unit 231, the determination unit 232, and the parallax image data generation unit 233 will be described later.
  • the moving image generation unit 234 connects the parallax image data and generates a 3D moving image file.
  • the image processing unit 205 also has general image processing functions such as adjusting image data according to the selected image format.
  • the generated captured image data is converted into a display signal by the LCD drive circuit 210 and displayed on the display unit 209.
  • the data is recorded on the memory card 220 attached to the memory card IF 207.
  • the operation unit 208 functions as a part of a reception unit that receives a user operation and transmits an instruction to the control unit 201.
  • the operation unit 208 includes a plurality of operation members such as a shutter button that receives a shooting start instruction.
  • FIG. 2 is a conceptual diagram showing an enlarged part of the image sensor 100.
  • the basic grid 110 consists of Bayer arrays (2 × 2 pixel basic units) arranged four in the Y-axis direction and four in the X-axis direction, i.e. 8 × 8 pixels in total.
  • In each basic unit, a green filter (G filter) is arranged on the upper left and lower right pixels,
  • a blue filter (B filter) on the lower left pixel,
  • and a red filter (R filter) on the upper right pixel.
  • the basic grid 110 includes parallax pixels and non-parallax pixels.
  • the parallax pixel is a pixel that receives a partial light beam that is deviated from the optical axis of the photographing lens 20 out of the incident light beam that is transmitted through the photographing lens 20.
  • the parallax pixel is provided with an aperture mask having an opening decentered from the pixel center so as to transmit only that partial light flux.
  • the opening mask is provided so as to overlap the color filter.
  • The aperture mask defines the parallax Lt pixel, in which the partial light flux reaches the left side with respect to the pixel center, and the parallax Rt pixel, in which the partial light flux reaches the right side with respect to the pixel center.
  • the non-parallax pixel is a pixel that is not provided with an aperture mask, and is a pixel that receives the entire incident light beam that passes through the photographing lens 20.
  • The parallax pixel is not limited to an aperture mask for receiving a partial light flux decentered from the optical axis; various configurations can be adopted, such as a selective reflection film in which the light-receiving region and the reflective region are separated, or a decentered photodiode region. In other words, the parallax pixel only needs to be configured to receive, out of the incident light flux passing through the photographing lens 20, a partial light flux decentered from the optical axis.
  • Pixels in the basic grid 110 are denoted by P_IJ; for example, the upper left pixel is P11 and the upper right pixel is P81.
  • the parallax pixels are arranged as follows.
  • the other pixels are non-parallax pixels, and are any of the non-parallax pixel + R filter, the non-parallax pixel + G filter, and the non-parallax pixel + B filter.
  • Viewed over the image sensor 100 as a whole, the parallax pixels are classified into a first group having a G filter, a second group having an R filter, and a third group having a B filter, and the basic grid 110 includes at least one parallax Lt pixel and one parallax Rt pixel belonging to each group. As in the illustrated example, these parallax pixels and non-parallax pixels may be arranged with randomness within the basic grid 110. By arranging them with randomness, RGB color information can be acquired as the output of the parallax pixels without biasing the spatial resolution of any color component, so that high-quality parallax image data can be obtained.
  • FIG. 3 is a diagram illustrating an example of processing for generating 2D image data and parallax image data.
  • the image processing unit 205 receives RAW image data in which output values (pixel values) are arranged in the pixel order of the image sensor 100, and executes plane separation processing that separates it into a plurality of plane data.
  • the left column of the figure shows an example of processing for generating 2D-RGB plane data as 2D image data.
  • In generating 2D-RGB plane data, the image processing unit 205 first removes the pixel values of the parallax pixels, leaving vacant lattice points. The pixel value at each vacant lattice point is then calculated by interpolation using the pixel values of surrounding pixels. For example, the pixel value at the vacant lattice point P11 is calculated by averaging the pixel values of the diagonally adjacent G-filter pixels.
  • The pixel value at the vacant lattice point P63 is calculated by averaging the pixel values of P43, P61, P83, and P65, the same-color R-filter pixels one pixel away vertically and horizontally.
  • Similarly, the pixel value at the vacant lattice point P76 is calculated by averaging the pixel values of P56, P74, P96, and P78, the same-color B-filter pixels one pixel away vertically and horizontally.
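The interpolation steps above can be sketched as follows. This is a hypothetical helper assuming the Bayer geometry described (same-color G neighbours lie diagonally adjacent, while same-color R/B neighbours are two pixels away vertically and horizontally), not the patent's implementation:

```python
import numpy as np

def interpolate_vacant(plane: np.ndarray, y: int, x: int, color: str) -> float:
    """Fill one vacant lattice point from same-color neighbours.

    G pixels sit diagonally adjacent in a Bayer array, so a vacant G
    point averages its four diagonal neighbours; same-color R and B
    neighbours are two pixels away vertically and horizontally.
    """
    if color == "G":
        offsets = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    else:  # "R" or "B"
        offsets = [(-2, 0), (2, 0), (0, -2), (0, 2)]
    vals = [plane[y + dy, x + dx]
            for dy, dx in offsets
            if 0 <= y + dy < plane.shape[0] and 0 <= x + dx < plane.shape[1]]
    return float(np.mean(vals))
```

Neighbours falling outside the sensor are simply skipped, so border points average over fewer samples.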
  • the image processing unit 205 performs image processing as a general 2D image according to a predetermined format such as JPEG when generating still image data and MPEG when generating moving image data.
  • the image processing unit 205 further separates the 2D-RGB plane data for each color and performs the interpolation processing as described above to generate each plane data as reference image data. That is, three types of data are generated: Gn plane data as green reference image plane data, Rn plane data as red reference image plane data, and Bn plane data as blue reference image plane data.
  • the right column of the figure shows an example of processing for generating two G plane data, two R plane data, and two B plane data as parallax pixel data.
  • the two G plane data are GLt plane data as left parallax image data and GRt plane data as right parallax image data.
  • the two R plane data are RLt plane data as left parallax image data and RRt plane data as right parallax image data.
  • the two B plane data are the BLt plane data as the left parallax image data and the BRt plane data as the right parallax image data.
  • In generating the GLt plane data, the image processing unit 205 removes pixel values other than the pixel values of the G(Lt) pixels from all output values of the image sensor 100, leaving vacant lattice points.
  • Then two pixel values, P11 and P55, remain in the basic grid 110. The basic grid 110 is therefore divided into four equal parts vertically and horizontally; the upper left 16 pixels are represented by the output value of P11, and the lower right 16 pixels by the output value of P55. The upper right 16 pixels and the lower left 16 pixels are interpolated by averaging the neighboring representative values adjacent vertically and horizontally. That is, the GLt plane data has one value per unit of 16 pixels.
  • Similarly, when generating the GRt plane data, the image processing unit 205 removes pixel values other than the pixel value of the G(Rt) pixel from all output values of the image sensor 100, leaving vacant lattice points. Then two pixel values, P51 and P15, remain in the basic grid 110. The basic grid 110 is therefore divided into four equal parts vertically and horizontally; the upper right 16 pixels are represented by the output value of P51, and the lower left 16 pixels by the output value of P15. The upper left 16 pixels and the lower right 16 pixels are interpolated by averaging the neighboring representative values adjacent vertically and horizontally. That is, the GRt plane data has one value per unit of 16 pixels. In this way, GLt plane data and GRt plane data having a resolution lower than that of the 2D-RGB plane data can be generated.
  • In generating the RLt plane data, the image processing unit 205 removes pixel values other than the pixel value of the R(Lt) pixel from all output values of the image sensor 100, leaving vacant lattice points. Then only the pixel value of P27 remains in the basic grid 110, and this pixel value is set as the representative value for the 64 pixels of the basic grid 110. Similarly, when generating the RRt plane data, the image processing unit 205 removes pixel values other than the pixel value of the R(Rt) pixel from all output values of the image sensor 100. Then only the pixel value of P63 remains in the basic grid 110, and this pixel value is set as the representative value for the 64 pixels of the basic grid 110.
  • RLt plane data and RRt plane data having a lower resolution than 2D-RGB plane data are generated.
  • the resolution of the RLt plane data and the RRt plane data is lower than the resolution of the GLt plane data and the GRt plane data.
  • In generating the BLt plane data, the image processing unit 205 removes pixel values other than the pixel values of the B(Lt) pixels from all output values of the image sensor 100, leaving vacant lattice points. Then only the pixel value of P32 remains in the basic grid 110, and this pixel value is set as the representative value for the 64 pixels of the basic grid 110. Similarly, when generating the BRt plane data, the image processing unit 205 removes pixel values other than the pixel value of the B(Rt) pixel from all output values of the image sensor 100. Then only the pixel value of P76 remains in the basic grid 110, and this pixel value is set as the representative value for the 64 pixels of the basic grid 110.
  • BLt plane data and BRt plane data having a resolution lower than that of 2D-RGB plane data are generated.
  • the resolution of the BLt plane data and the BRt plane data is lower than the resolution of the GLt plane data and the GRt plane data, and is equal to the resolution of the RLt plane data and the RRt plane data.
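The representative-value scheme above (one value per 16- or 64-pixel block, depending on how sparsely that parallax pixel type occurs in the basic grid) can be sketched as follows; function and parameter names are illustrative assumptions:

```python
import numpy as np

def plane_from_sparse(raw: np.ndarray, mask: np.ndarray, block: int) -> np.ndarray:
    """Build a low-resolution parallax plane from sparsely placed
    parallax pixels: each block-by-block tile is represented by the
    mean of the parallax-pixel values it contains; tiles with no such
    pixel are left as NaN, to be filled by averaging neighbouring
    representative values."""
    h, w = raw.shape
    out = np.full((h // block, w // block), np.nan)
    for by in range(h // block):
        for bx in range(w // block):
            tile_vals = raw[by*block:(by+1)*block, bx*block:(bx+1)*block]
            tile_mask = mask[by*block:(by+1)*block, bx*block:(bx+1)*block]
            if tile_mask.any():
                out[by, bx] = tile_vals[tile_mask].mean()
    return out
```

With `block=4` this mirrors the 16-pixel units of the G parallax planes; a single parallax pixel per 8 × 8 basic grid corresponds to `block=8` (64-pixel units), as with the R and B parallax planes.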
  • image processing may be performed on the output image data so that the amount of parallax between generated images is within the target amount of parallax.
  • the image processing unit 205 generates left-view color image data and right-view color image data using these plane data.
  • color image data in which the parallax amount as a 3D image is adjusted while maintaining the blur amount of the 2D color image is generated.
  • the generation principle will be described first.
  • FIG. 4 is a diagram for explaining the concept of defocusing.
  • the parallax Lt pixel and the parallax Rt pixel receive a subject light flux that arrives from one of two parallax virtual pupils that are set symmetrically with respect to the optical axis as a partial region of the lens pupil.
  • the parallax pixel outputs an image signal obtained by photoelectrically converting only the partial light flux that has passed through the parallax virtual pupil by the action of the aperture mask that each has. Therefore, the pixel value distribution indicated by the output of the parallax pixel may be considered to be proportional to the light intensity distribution of the partial light flux that has passed through the corresponding parallax virtual pupil.
  • When the object point is at the focal position, the output of the parallax pixels shows a steep pixel value distribution centered on the pixel corresponding to the image point, regardless of which parallax virtual pupil the subject light flux has passed through. If parallax Lt pixels are arranged in the vicinity of the image point, the output value of the pixel corresponding to the image point is the largest, and the output values of the surrounding pixels fall off rapidly. Likewise, if parallax Rt pixels are arranged in the vicinity of the image point, the output value of the pixel corresponding to the image point is the largest, and the output values of the surrounding pixels fall off rapidly. That is, whichever parallax virtual pupil the subject light flux passes through, the resulting pixel value distributions match each other.
  • When the object point is shifted from the focal position, the peak of the pixel value distribution indicated by the parallax Lt pixel appears at a position displaced in one direction from the pixel corresponding to the image point, and its output value decreases compared with the case where the object point is at the focal position. The width of pixels having output values also increases.
  • The peak of the pixel value distribution indicated by the parallax Rt pixel appears at an equal distance from the pixel corresponding to the image point, but displaced in the direction opposite to that of the parallax Lt pixel; its output value similarly decreases and the width of pixels having output values similarly increases.
  • That is, pixel value distributions of identical shape, gentler than when the object point is at the focal position, appear separated from each other by equal distances.
  • As the object point shifts further, pixel value distributions of identical shape, still gentler than in the state of FIG. 4B, appear even farther apart.
  • the amount of blur and the amount of parallax increase as the object point deviates from the focal position.
  • the amount of blur and the amount of parallax change in conjunction with defocus. That is, the amount of blur and the amount of parallax have a one-to-one relationship.
  • FIGS. 4B and 4C show the case where the object point shifts to one side of the focal position; when the object point shifts to the opposite side, the relative positional relationship between the pixel value distribution indicated by the parallax Lt pixel and that indicated by the parallax Rt pixel is reversed.
  • Because of this defocus relationship, when viewing a parallax image, the viewer perceives a subject behind the focal position as being far away and a subject in front of it as being near.
  • FIG. 5 is a graph showing changes in the pixel value distribution described in FIGS. 4B and 4C.
  • the horizontal axis represents the pixel position, and the center position is the pixel position corresponding to the image point.
  • the vertical axis represents the output value (pixel value) of each pixel. As described above, this output value is substantially proportional to the light intensity.
  • the distribution curve 1804 and the distribution curve 1805 represent the pixel value distribution of the parallax Lt pixel and the pixel value distribution of the parallax Rt pixel in FIG. 4B, respectively. As can be seen from the figure, these distributions are line-symmetric about the center position. The combined distribution curve 1806 obtained by adding them has a shape substantially similar to the pixel value distribution of the non-parallax pixels for the situation of FIG. 4B, that is, the pixel value distribution obtained when the entire subject light flux is received.
  • the distribution curve 1807 and the distribution curve 1808 represent the pixel value distribution of the parallax Lt pixel and the pixel value distribution of the parallax Rt pixel in FIG. 4C, respectively. As can be seen from the figure, these distributions are also line-symmetric about the center position. The combined distribution curve 1809 obtained by adding them has a shape substantially similar to the pixel value distribution of the non-parallax pixels for the situation of FIG. 4C.
  • In the present embodiment, the amount of parallax, expressed as the interval between peaks, is adjusted while approximately maintaining the amount of blur, expressed by the spread of the pixel value distribution. That is, while keeping the blur amount of the 2D image almost as it is, the image processing unit 205 generates an image whose parallax amount lies between that of the 2D image generated from the non-parallax pixels and that of the 3D image generated from the parallax pixels.
  • FIG. 6 is a diagram illustrating a pixel value distribution for explaining the concept of the adjusted parallax amount.
  • Lt distribution curve 1901 and Rt distribution curve 1902 indicated by solid lines in the figure are distribution curves in which actual pixel values of Lt plane data and Rt plane data are plotted. For example, it corresponds to the distribution curves 1804 and 1805 in FIG.
  • the distance between the peaks of the Lt distribution curve 1901 and the Rt distribution curve 1902 represents the 3D parallax amount, and the greater the distance, the stronger the stereoscopic effect during image reproduction.
  • the 2D distribution curve 1903 obtained by adding 50% each of the Lt distribution curve 1901 and the Rt distribution curve 1902 has a convex shape with no left-right bias.
  • the 2D distribution curve 1903 corresponds to a shape in which the height of the combined distribution curve 1806 in FIG. 5 is halved. That is, an image based on this distribution is a 2D image with a parallax amount of zero.
  • the adjusted Lt distribution curve 1905 is a curve obtained by adding 80% of the Lt distribution curve 1901 and 20% of the Rt distribution curve 1902.
  • the peak of the adjusted Lt distribution curve 1905 is displaced toward the center relative to the peak of the Lt distribution curve 1901, by an amount corresponding to the added component of the Rt distribution curve 1902.
  • the adjusted Rt distribution curve 1906 is a curve obtained by adding 20% of the Lt distribution curve 1901 and 80% of the Rt distribution curve 1902.
  • the peak of the adjusted Rt distribution curve 1906 is displaced toward the center relative to the peak of the Rt distribution curve 1902, by an amount corresponding to the added component of the Lt distribution curve 1901.
  • the adjusted parallax amount represented by the distance between the peaks of the adjusted Lt distribution curve 1905 and the adjusted Rt distribution curve 1906 is smaller than the 3D parallax amount. Therefore, the stereoscopic effect during image reproduction is alleviated.
  • Since the spread of each of the adjusted Lt distribution curve 1905 and the adjusted Rt distribution curve 1906 is equivalent to the spread of the 2D distribution curve 1903, the amount of blur can be said to be equal to that of the 2D image.
  • As described above, the amount of adjusted parallax can be controlled by the proportions in which the Lt distribution curve 1901 and the Rt distribution curve 1902 are added. By applying this adjusted pixel value distribution to each plane of the color image data generated from the non-parallax pixels, left-viewpoint color image data and right-viewpoint color image data can be generated that give a stereoscopic effect different from that of the parallax image data generated from the parallax pixels.
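The mixing described above can be demonstrated numerically with Gaussian stand-ins for the Lt/Rt pixel value distributions (the curves here are illustrative, not the patent's measured data):

```python
import numpy as np

x = np.arange(-10.0, 10.0, 0.1)
gauss = lambda mu: np.exp(-0.5 * ((x - mu) / 3.0) ** 2)

lt, rt = gauss(-3.0), gauss(3.0)     # stand-ins for curves 1901 / 1902

# 50/50 mixing gives the symmetric 2D distribution (zero parallax)
two_d = 0.5 * lt + 0.5 * rt

# 80/20 mixing pulls each peak toward the centre, shrinking the parallax
adj_lt = 0.8 * lt + 0.2 * rt
adj_rt = 0.2 * lt + 0.8 * rt

peak = lambda curve: x[int(np.argmax(curve))]
raw_parallax = peak(rt) - peak(lt)            # distance between 1901/1902 peaks
adj_parallax = peak(adj_rt) - peak(adj_lt)    # distance between 1905/1906 peaks
assert adj_parallax < raw_parallax            # stereoscopic effect is relaxed
```

Note that each mixed curve keeps roughly the same spread as the originals, so the blur amount is approximately preserved while only the peak separation (the parallax) shrinks.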
  • Left-viewpoint color image data and right-viewpoint color image data are generated from the nine plane data described with reference to FIG. 3.
  • The color image data of the left viewpoint consists of three color parallax plane data corresponding to the left viewpoint: RLtc plane data (red), GLtc plane data (green), and BLtc plane data (blue).
  • Similarly, the color image data of the right viewpoint consists of three color parallax plane data corresponding to the right viewpoint: RRtc plane data (red), GRtc plane data (green), and BRtc plane data (blue).
  • FIG. 7 is a diagram for explaining color parallax plane data generation processing.
  • First, the generation process of the RLtc plane data and the RRtc plane data, the red parallax planes among the color parallax planes, will be described.
  • The red parallax planes are generated using the pixel values of the Rn plane data described with reference to FIG. 3 together with the pixel values of the RLt plane data and the RRt plane data. Specifically, when calculating, for example, the pixel value RLtc_mn at the target pixel position (i_m, j_n) of the RLtc plane data, the parallax image data generation unit 233 of the image processing unit 205 first extracts the pixel value Rn_mn from the same pixel position (i_m, j_n) of the Rn plane data.
  • The parallax image data generation unit 233 also extracts the pixel value RLt_mn from the same pixel position (i_m, j_n) of the RLt plane data and the pixel value RRt_mn from the same pixel position (i_m, j_n) of the RRt plane data. It then calculates the pixel value RLtc_mn by multiplying the pixel value Rn_mn by the value obtained by distributing the pixel values RLt_mn and RRt_mn according to the value of the stereoscopic adjustment parameter C. Specifically, it is calculated by the following equation (1).
  • Similarly, when calculating the pixel value RRtc_mn, the parallax image data generation unit 233 multiplies the extracted pixel value Rn_mn by the value obtained by distributing the pixel values RLt_mn and RRt_mn according to the stereoscopic adjustment parameter C. Specifically, it is calculated by the following equation (2).
  • the parallax image data generation unit 233 executes this processing sequentially from (1, 1), the pixel at the upper left end, to (i_0, j_0), the coordinates of the lower right end.
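Equations (1) and (2) themselves are not reproduced in this extract. A plausible reconstruction of the "distribute by the stereoscopic adjustment parameter C" operation, consistent with the 80%/20% mixing example and the range 0.5 ≤ C ≤ 1 stated later in the text, is sketched below; the exact formula is an assumption, not the patent's verbatim equation:

```python
def distribute(n, lt, rt, c):
    """Reconstructed form of equations (1)/(2): the non-parallax pixel
    value n is modulated by the Lt/Rt values mixed by parameter c,
    normalised by their mean so that c = 0.5 gives left == right == n
    (no parallax) and c = 1.0 applies the full measured parallax.

        left  = n * (c*lt + (1-c)*rt) / ((lt + rt) / 2)
        right = n * ((1-c)*lt + c*rt) / ((lt + rt) / 2)
    """
    mean = (lt + rt) / 2.0
    left = n * (c * lt + (1.0 - c) * rt) / mean
    right = n * ((1.0 - c) * lt + c * rt) / mean
    return left, right
```

Applied per pixel position (i_m, j_n), `n` plays the role of Rn_mn, `lt` of RLt_mn, and `rt` of RRt_mn, yielding RLtc_mn and RRtc_mn.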
  • the same pixel position of RLt plane data (i m, j n) from instead of extracting the pixel value RLt mn extracts the pixel value GLt mn from the same pixel position of GLt plane data (i m, j n).
  • the same pixel position of the RRT plane data (i m, j n) from instead of extracting the pixel value RRT mn extracts the pixel value GRT mn from the same pixel position of GRT plane data (i m, j n) .
• the values of the parameters in equations (1) and (2) are changed accordingly, and the same processing is performed.
• when the generation processing of the GLt c plane data and the GRt c plane data, which are the green parallax planes, is completed, the generation processing of the BLt c plane data and the BRt c plane data, which are the blue parallax planes, is executed next.
• instead of extracting the pixel value Rn mn from the same pixel position (i m , j n ) of the Rn plane data as in the above description, the pixel value Bn mn is extracted from the same pixel position (i m , j n ) of the Bn plane data.
• instead of extracting the pixel value RLt mn from the same pixel position (i m , j n ) of the RLt plane data, the pixel value BLt mn is extracted from the same pixel position (i m , j n ) of the BLt plane data.
• instead of extracting the pixel value RRt mn from the same pixel position (i m , j n ) of the RRt plane data, the pixel value BRt mn is extracted from the same pixel position (i m , j n ) of the BRt plane data.
• the values of the parameters in equations (1) and (2) are changed accordingly, and the same processing is performed.
• in this way, left-viewpoint color image data (RLt c plane data, GLt c plane data, BLt c plane data) and right-viewpoint color image data (RRt c plane data, GRt c plane data, BRt c plane data) are generated. That is, color image data of the left viewpoint and the right viewpoint can be acquired by relatively simple processing as virtual outputs that do not actually exist as pixels of the image sensor 100.
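Equations (1) and (2) themselves are not reproduced in this excerpt, so the following sketch only illustrates the kind of per-pixel computation the text describes; the specific weighting used here (a C-weighted blend of the left and right parallax values modulating the non-parallax plane, with C = 1 keeping the full parallax and C = 0.5 removing it) is an assumption, not the patent's actual formula.

```python
import numpy as np

def generate_parallax_planes(n_plane, lt_plane, rt_plane, c):
    """Sketch of per-pixel generation of adjusted left/right parallax planes.

    Assumed form: the Lt/Rt pixel values are 'distributed' by the
    stereoscopic adjustment parameter C (0.5 <= C <= 1) and modulate the
    no-parallax plane n_plane.
    """
    eps = 1e-8
    mean = (lt_plane + rt_plane) / 2.0 + eps
    # C = 1 reproduces the full parallax; C = 0.5 removes it entirely.
    lt_c = n_plane * (c * lt_plane + (1.0 - c) * rt_plane) / mean
    rt_c = n_plane * (c * rt_plane + (1.0 - c) * lt_plane) / mean
    return lt_c, rt_c
```

With C = 0.5 the two output planes coincide (no parallax); with C = 1 each plane carries its full original parallax, scaled to the brightness of the non-parallax plane.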
• if the value of the stereoscopic adjustment parameter C is changed within the range 0.5 ≤ C ≤ 1, the amount of parallax of the 3D image can be adjusted while maintaining the amount of blur of the 2D color image produced by the non-parallax pixels. Therefore, if these image data are reproduced by a 3D-compatible reproduction device, the viewer of the stereoscopic video display panel can appreciate, as a color image, 3D video in which the stereoscopic effect is appropriately adjusted.
• since the processing is simple, image data can be generated at high speed, and moving images can also be handled.
• the value of the stereoscopic adjustment parameter C is determined and used within the range 0.5 ≤ C ≤ 1 for each subject.
  • the advantage of generating color parallax plane data using the stereoscopic adjustment parameter C determined for each subject will be described.
  • FIG. 8 is a diagram schematically showing the relationship between the parallax amount and the value of the three-dimensional adjustment parameter.
  • the horizontal axis represents the distance from the digital camera 10, that is, the depth with respect to the scene, and the vertical axis represents the amount of parallax.
• the digital camera 10 is focused on the subject located at the distance L 10 (see the focused subject in the drawing).
• the threshold value of the allowable amount of parallax is predetermined as ±m.
  • the plurality of subjects included in the captured image data have unadjusted parallax amounts in the left and right parallax image data.
  • the value of the three-dimensional adjustment parameter C is determined for each subject according to such an unadjusted parallax amount, and is used for adjusting the parallax amount.
• the unadjusted parallax amount m 10 is adjusted to the parallax amount target value m 100 by this value C 10 .
• here, the unadjusted parallax amount m 10 and the parallax amount target value m 100 are both 0.
• the value of the stereoscopic adjustment parameter C is determined as C 20 for the near-point subject, and the unadjusted parallax amount m 20 is adjusted to the parallax amount target value m 200 (m 200 ≤ +m) by this value C 20 .
• the value of the stereoscopic adjustment parameter C is determined as C 30 for the far-point subject, and the unadjusted parallax amount m 30 is adjusted to the parallax amount target value m 300 (m 300 ≥ −m) by this value C 30 .
  • the unadjusted parallax amount for this subject is a value indicated by the parallax amount curve 1622 in the figure.
  • the target value of the parallax amount is a value indicated by the adjusted parallax amount curve 1623 in the drawing.
  • the parallax amount curve 1622 in the figure represents the relationship between the distance from the digital camera 10 and the unadjusted parallax amount generated in each subject when it is assumed that the subject is located at each distance.
• the parallax amount curve 1622 includes a region outside the range of −m to +m. To facilitate understanding, in the figure, the region of the parallax amount curve 1622 outside the range of −m to +m is surrounded by the frame W1, and the region within the range of −m to +m is surrounded by the frame W2.
  • the adjusted parallax amount curve 1623 represents the relationship between the distance from the digital camera 10 and the target value of the parallax amount generated in each subject when it is assumed that the subject is located at each distance.
• the target value of the parallax amount indicated by the adjusted parallax amount curve 1623 is included in the range of −m to +m over the distance range for which the parallax amount curve 1622 is defined.
• the adjusted parallax amount curve 1623 described above has a shape such that different values of the stereoscopic adjustment parameter C are set at least for subjects within the range of −m to +m and for subjects outside that range. The specific shape will be described later.
• further, the depth relationship between subjects, that is, the relationship between the distance from the digital camera 10 and the parallax amount, is not broken. Specifically, the greater the distance from the digital camera 10, the smaller the parallax amount; once the parallax amount falls below 0, its magnitude on the negative side increases. In other words, the parallax amount m a of a subject at a greater distance from the digital camera 10 and the parallax amount m b of a subject at a smaller distance satisfy m a ≤ m b .
• for a subject outside the range, a parallax amount larger than +m is adjusted to +m, the upper limit of the allowable parallax amount, and a parallax amount smaller than −m is adjusted to −m, the lower limit of the allowable parallax amount. In this way, the parallax of subjects that are not the main subject is not suppressed more than necessary, and the depth relationship of the subjects is not disrupted.
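As a minimal illustration of this hard limit, the clipping step can be written as follows (representing per-pixel parallax amounts as an array is an assumption for the sketch):

```python
import numpy as np

def clamp_parallax(parallax_map, m_max):
    """Sketch of the hard limit described above: parallax amounts beyond
    +m are adjusted to the upper limit +m, and those below -m to the
    lower limit -m, leaving in-range values (and hence the depth
    ordering) untouched.
    """
    return np.clip(parallax_map, -m_max, m_max)
```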
• for a subject whose unadjusted parallax amount is within the range of −m to +m, the value of the stereoscopic adjustment parameter C is determined and the parallax amount is adjusted so that the unadjusted parallax amount along the parallax amount curve 1622 is generally maintained. As a result, the stereoscopic effect is maintained for subjects within the range of −m to +m.
• here, adjusting the parallax amount so that the unadjusted parallax amount is generally maintained means adjusting the parallax amount so that it is maintained at, or takes a value close to, the unadjusted parallax amount.
• on the other hand, for a subject outside the range of −m to +m, a value of the stereoscopic adjustment parameter C different from that for subjects within the range is determined, and the parallax amount is adjusted so as to be included in the range of −m to +m. As a result, the parallax amount is suppressed for subjects outside the range of −m to +m, and fatigue and discomfort during viewing are reduced.
  • the amount of parallax between a subject with a large unadjusted amount of parallax and a subject with a small amount of parallax can be intentionally varied. For this reason, it is possible to reduce the amount of parallax for a non-main subject while maintaining a stereoscopic effect for the main subject, and to reduce the sense of discomfort and fatigue during viewing.
• if the unadjusted parallax amounts of a plurality of subjects are the same, the values of the stereoscopic adjustment parameter C for adjusting their parallax amounts are also equal. Accordingly, determining the value of the stereoscopic adjustment parameter C according to the unadjusted parallax amount for each subject included in the image is equivalent to determining the value of the stereoscopic adjustment parameter C for each unadjusted parallax amount of the plurality of subjects included in the image.
  • the principle of determining the three-dimensional adjustment parameter C for each subject will be described.
  • the principle described below is for facilitating understanding of a lookup table 2310 described later. Therefore, the value of the stereoscopic adjustment parameter C does not necessarily have to be determined in the digital camera 10 according to this principle.
• the value of the stereoscopic adjustment parameter C is calculated so that, among the plurality of subjects included in the image, the unadjusted parallax amount is generally maintained for subjects within the range of −m to +m. For a subject outside the range of −m to +m, the value of the stereoscopic adjustment parameter C is determined so that the parallax amount is adjusted to fall within the range of −m to +m.
  • the parallax amount of the subject is calculated from the left and right parallax image data.
  • the unadjusted parallax amounts m 10 , m 20 , and m 30 for the focused subject, the near point subject, and the far point subject are detected, respectively.
  • the distance from the digital camera 10 to the subject is calculated based on the defocus amount from the focal position. Thereby, for example, distances L 10 , L 20 , and L 30 from the digital camera 10 to the focused subject, the near point subject, and the far point subject are calculated.
  • the target value of the parallax amount corresponding to the calculated distance is determined from the adjusted parallax amount curve 1623. Specifically, a point corresponding to the calculated distance is determined from the adjusted parallax amount curve 1623, and the vertical coordinate of this point is determined as the target value of the parallax amount.
  • the target value m 100 , the target value m 200 , and the target value m 300 of the parallax amount for the focused subject, the near point subject, and the far point subject are determined.
  • the value of the stereoscopic adjustment parameter C is calculated so that the unadjusted parallax amount is adjusted to the target value of the parallax amount for each subject.
  • the value of the stereoscopic adjustment parameter C is calculated so that the parallax amount m 10 is substantially maintained.
• the value C 20 is calculated such that the unadjusted parallax amount m 20 for the near-point subject is adjusted to the parallax amount target value m 200 .
  • a value C 30 is calculated such that the unadjusted parallax amount m 30 for the far point subject is adjusted to the target value m 300 for the parallax amount.
• specifically, a 3D image is generated using a certain stereoscopic adjustment parameter C, and then processing for calculating the parallax amount of the subject is performed. By repeating this process while feeding back the calculation result, the value of the stereoscopic adjustment parameter C at which the unadjusted parallax amount is adjusted to the target value of the parallax amount is calculated.
• alternatively, a lookup table may be generated in advance from the correspondence between the unadjusted parallax amount obtained in this way, the target value of the parallax amount, and the value of the stereoscopic adjustment parameter C, and the value of the stereoscopic adjustment parameter C may be calculated using this lookup table.
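The feedback procedure described above can be sketched as follows; `render` and `measure_parallax` are hypothetical stand-ins for the camera's image generation and parallax calculation, and the bisection search over 0.5 ≤ C ≤ 1 is an assumed strategy, not one specified in the text.

```python
def solve_c_by_feedback(render, measure_parallax, target, c_init=0.75,
                        tol=1e-3, max_iter=50):
    """Sketch of the feedback loop: generate a 3D image with a candidate C,
    measure the resulting parallax, and refine C until the measured
    parallax reaches the target value.
    """
    lo, hi = 0.5, 1.0
    c = c_init
    for _ in range(max_iter):
        measured = measure_parallax(render(c))
        if abs(measured - target) < tol:
            break
        # Larger C produces larger parallax (C = 0.5 removes it entirely).
        if measured > target:
            hi = c
        else:
            lo = c
        c = (lo + hi) / 2.0
    return c
```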
• the adjusted parallax amount curve 1623 is composed of a portion in the region surrounded by the frame W1 and a portion in the region surrounded by the frame W2. Accordingly, the adjusted parallax amount curve 1623 can set different values of the stereoscopic adjustment parameter C at least for subjects within the range of −m to +m and for subjects outside that range.
• the portion of the adjusted parallax amount curve 1623 in the region surrounded by the frame W1 is formed so that the target value of the parallax amount is included in the range of −m to +m.
• the portion of the adjusted parallax amount curve 1623 in the region surrounded by the frame W2 is formed so as to approximate the parallax amount curve 1622.
• this allows the parallax amount to transition smoothly between these subjects in the depth direction.
  • the adjusted parallax amount curve 1623 is preferably continuous in the depth direction, that is, the distance direction from the digital camera 10, and more preferably the differential value is continuous.
  • the value of the stereoscopic adjustment parameter C is determined so that the adjusted parallax amount is continuous in the depth direction. Further, the value of the stereoscopic adjustment parameter C is determined so that the differential value of the adjusted parallax amount is continuous in the depth direction.
  • the shape of the portion of the adjusted parallax amount curve 1623 in the region surrounded by the frame W2 may be adjusted. Specifically, among the parts in the region surrounded by the frame W2, the part near the boundary with the frame W1 may be deformed so as to be continuous with the part surrounded by the frame W1.
  • the shape of the portion of the adjusted parallax amount curve 1623 in the region surrounded by the frame W2 may be adjusted. Specifically, among the portions in the region surrounded by the frame W2, a portion near the boundary with the frame W1 may be deformed so that the differential value is continuous with respect to the portion surrounded by the frame W1.
• as the adjusted parallax amount curve 1623, for example, a curve that overlaps the parallax amount curve 1622 in at least part of the section where the parallax amount is within the range of −m to +m and that has ±m as asymptotes is used.
• such a curve can be derived from a hyperbolic tangent curve, for example.
  • a portion surrounded by the frame W2 in the adjusted parallax amount curve 1623 is generated from the parallax amount curve 1622. Then, the portion in the region surrounded by the frame W2 is deformed so that the portion near the boundary with the frame W1 is continuous with the portion surrounded by the frame W1. At this time, the differential value of the adjusted parallax amount curve 1623 is made continuous at the boundary between the frame W1 and the frame W2. Thereby, the adjusted parallax amount curve 1623 is selected.
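A minimal sketch of such a hyperbolic-tangent-based adjusted parallax amount curve, assuming the curve is applied directly to the unadjusted parallax amount:

```python
import math

def adjusted_parallax(m_unadjusted, m_max):
    """Sketch of an adjusted parallax amount curve with +/-m_max as
    asymptotes, built from a hyperbolic tangent as the text suggests.
    Near zero the curve overlaps the unadjusted parallax (tanh(x) ~ x),
    so in-range subjects keep roughly their original parallax, while
    out-of-range values are compressed toward the +/-m_max limits.
    """
    return m_max * math.tanh(m_unadjusted / m_max)
```

Note that both the curve and its derivative are continuous everywhere in the depth direction, matching the preference stated above, and the mapping is monotonic, so the depth ordering of subjects is preserved.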
• the functions of the parallax amount curve 1622 and the adjusted parallax amount curve 1623 described above can each change depending on the imaging conditions that affect the parallax amount (for example, the aperture value, the focus position, and the focal length when the photographing lens 20 is a zoom lens). Therefore, when the value of the stereoscopic adjustment parameter C is determined using the adjusted parallax amount curve 1623 according to the principle described above, the function of the adjusted parallax amount curve 1623 need only be stored in the digital camera 10 for each imaging condition.
  • FIG. 9 is a diagram showing a lookup table 2310 stored by the determining unit 232 in the present embodiment.
  • the lookup table 2310 is a table that is referred to when the determination unit 232 determines the value of the stereoscopic adjustment parameter C.
• This lookup table 2310 is stored in advance in the storage unit of the digital camera 10. As shown in FIG. 9, the lookup table 2310 describes, in pairs, each value (m 10 , m 20 , m 30 , …) that the parallax amount m can take and the corresponding value of the stereoscopic adjustment parameter C (C 10 , C 20 , C 30 , …).
• the value of the stereoscopic adjustment parameter C corresponding to each value within the range of −m to +m is determined so that the unadjusted parallax amount is generally maintained. Further, the value of the stereoscopic adjustment parameter C corresponding to each value outside the range of −m to +m is determined so that the parallax amount is adjusted to fall within the range of −m to +m.
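The determination step can be sketched as a table lookup; the entries below are illustrative placeholders, not values from the patent, and the nearest-key lookup is an assumed interpolation strategy.

```python
# Sketch of how the determination unit might consult a table like 2310:
# each attainable parallax amount m is paired with a value of the
# stereoscopic adjustment parameter C (placeholder values, assuming m = 1.0
# as the allowable threshold here).
LOOKUP_TABLE = {
    0.0: 1.00,   # in range: parallax generally maintained
    0.5: 0.95,
    1.0: 0.90,   # at the +m boundary
    2.0: 0.70,   # out of range: parallax compressed into -m..+m
    3.0: 0.60,
}

def determine_c(m_unadjusted):
    """Return the C paired with the nearest tabulated parallax amount."""
    nearest = min(LOOKUP_TABLE, key=lambda m: abs(m - m_unadjusted))
    return LOOKUP_TABLE[nearest]
```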
• Such a lookup table 2310 is generated through an experiment using a prototype, for example. Specifically, in the prototype of the digital camera 10, the imaging conditions that affect the amount of parallax are set to an arbitrary condition.
  • Each captured image data is associated with a distance to the subject.
  • the amount of parallax of the subject is calculated from the left and right parallax image data in each captured image data, and the correspondence data between the amount of parallax and the distance from the digital camera 10 to the subject at the time of shooting is plotted on the coordinate plane.
  • the horizontal axis is the distance from the digital camera 10, that is, the depth with respect to the scene
  • the vertical axis is the amount of parallax.
• the parallax amount curve 1622 is generated as an approximate curve of these plots. Once the parallax amount curve 1622 is generated, the adjusted parallax amount curve 1623 is then selected as described above.
• when the parallax amount curve 1622 and the adjusted parallax amount curve 1623 are obtained, one point on the horizontal axis is selected, and the unadjusted parallax amount corresponding to this point is detected from the parallax amount curve 1622. That is, a distance from the digital camera 10 is selected, and the unadjusted parallax amount when the subject is located at this distance is detected.
  • the value of the stereoscopic adjustment parameter C is determined such that the detected unadjusted parallax amount is adjusted to the target value of the parallax amount.
• for a subject within the range of −m to +m, the value of the stereoscopic adjustment parameter C (for example, the value C 10 ) is determined so that the unadjusted parallax amount (for example, the parallax amount m 10 ) is generally maintained. For a subject outside that range, the value of the stereoscopic adjustment parameter C (for example, the value C 20 ) is determined so that the unadjusted parallax amount is adjusted to fall within the range of −m to +m. Therefore, since the parallax amount can be suppressed for subjects that are not the main subject while the stereoscopic effect of the main subject is retained, discomfort and fatigue during viewing can be reduced.
• specifically, a 3D image is generated using a certain stereoscopic adjustment parameter C, and then a process for calculating the parallax amount of the subject is performed. By repeating this process while feeding back the calculation result, the value of the stereoscopic adjustment parameter C at which the unadjusted parallax amount is adjusted to the target value of the parallax amount is calculated.
• alternatively, a lookup table may be generated in advance from the correspondence between the unadjusted parallax amount obtained in this way, the target value of the parallax amount, and the value of the stereoscopic adjustment parameter C, and the value of the stereoscopic adjustment parameter C may be calculated using this lookup table.
  • the unadjusted parallax amount for this point and the value of the stereoscopic adjustment parameter C are stored in the lookup table 2310 in association with each other. Thereafter, similarly, another point on the horizontal axis is selected, and the unadjusted parallax amount corresponding to that point and the value of the three-dimensional adjustment parameter C are stored in the lookup table 2310 in association with each other. As a result, a lookup table 2310 is generated.
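The table-generation procedure can be sketched as follows; `c_from_target` is a hypothetical solver (for example, the feedback loop used in the prototype experiment) that returns C for a given pair of unadjusted and target parallax amounts.

```python
def build_lookup_table(unadjusted_curve, adjusted_curve, distances,
                       c_from_target):
    """Sketch of the table generation described above: for each sampled
    distance on the horizontal axis, read the unadjusted parallax from
    the parallax amount curve and the target parallax from the adjusted
    curve, then record the C that maps one to the other.
    """
    table = {}
    for d in distances:
        m_unadjusted = unadjusted_curve(d)
        m_target = adjusted_curve(d)
        table[m_unadjusted] = c_from_target(m_unadjusted, m_target)
    return table
```

Because the table is keyed by the unadjusted parallax amount rather than by distance, it can later be consulted without detecting the subject distance, as the text notes.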
  • the value of the stereoscopic adjustment parameter C can be determined from the unadjusted parallax amount without detecting the subject distance.
  • the value of the stereoscopic adjustment parameter C can be determined by the lookup table 2310 as long as an unadjusted parallax amount can be detected regardless of the arrangement of the subject in the scene.
  • the target value of the parallax amount is determined using the adjusted parallax amount curve 1623, and the depth relationship between the subjects (the relationship between the distance from the digital camera 10 and the parallax amount) is not broken in this curve. Therefore, as long as the depth relationship of the subject is not broken in the captured image, the depth relationship of the subject is not broken even if the parallax amount is adjusted.
• furthermore, the same lookup table 2310 can be used regardless of the imaging conditions that affect the amount of parallax (the aperture value, the focus position, and the focal length when the taking lens 20 is a zoom lens).
  • this point will be described with a specific example.
  • FIG. 10 is a diagram illustrating the relationship between the amount of parallax and the target value of the amount of parallax when the shape of the parallax amount curve varies depending on the shooting conditions.
  • the horizontal axis represents the distance from the digital camera 10
  • the vertical axis represents the amount of parallax.
• in FIGS. 10A and 10B, the digital camera 10 is focused on a subject (see the focused subject in the figure) located at the distance L 11 and the distance L 12 , respectively. Between the two diagrams, the focal position and the aperture value, shooting conditions that affect the parallax amount, are changed; as a result, the parallax amount curves 1626 and 1628 differ from each other.
• the distances from the digital camera 10 to the respective subjects also differ from each other. Specifically, the distance from the digital camera 10 to the focused subject is the distance L 11 in the scene shown in FIG. 10A and the distance L 12 in the scene shown in FIG. 10B.
• similarly, the distance from the digital camera 10 to the near-point subject is the distance L 21 in the scene shown in FIG. 10A and the distance L 22 in the scene shown in FIG. 10B.
• likewise, the distance from the digital camera 10 to the far-point subject is the distance L 31 in the scene shown in FIG. 10A and the distance L 32 in the scene shown in FIG. 10B.
• in both scenes, the unadjusted parallax amount for the near-point subject is the parallax amount m 20 , and the unadjusted parallax amount for the far-point subject is the parallax amount m 30 .
• in the scene of FIG. 10A, the value of the stereoscopic adjustment parameter C is determined so that the unadjusted parallax amounts m 10 , m 20 , and m 30 for the focused subject, the near-point subject, and the far-point subject are adjusted to the parallax amounts m 100 , m 200 , and m 300 , respectively.
• likewise, in the scene of FIG. 10B, the value of the stereoscopic adjustment parameter C is determined so that the unadjusted parallax amounts m 10 , m 20 , and m 30 for the focused subject, the near-point subject, and the far-point subject are adjusted to the parallax amounts m 100 , m 200 , and m 300 .
  • the same effect as when the function of the adjusted parallax amount curve 1623 is stored in the digital camera 10 for each imaging condition can be obtained. That is, since the main subject can have a stereoscopic effect and the amount of parallax can be suppressed in a subject that is not the main subject, it is possible to reduce a sense of discomfort and fatigue during viewing.
  • FIG. 11 is a diagram for explaining changes in RGB pixel value distribution.
• FIG. 11A is a graph in which the output values of the G (Lt) pixel, G (Rt) pixel, R (Lt) pixel, R (Rt) pixel, B (Lt) pixel, and B (Rt) pixel when a white subject light beam from an object point located at a position deviated by a certain amount from the focal position is received are arranged.
• FIG. 11B is a graph in which the output values of the R (N) pixel, G (N) pixel, and B (N) pixel, which are non-parallax pixels, when the white subject light beam from the object point in FIG. 11A is received are arranged. It can be said that this graph also represents the pixel value distribution of each color.
  • FIG. 12 is a diagram illustrating the relationship between the vergence angle of the viewer and the amount of parallax.
  • the eyeball 50 represents the eyeball of the viewer, and the figure shows the right eye 51 and the left eye 52 being separated.
  • the display unit 40 reproduces non-adjusted image data whose parallax amount is not adjusted, and displays a subject 61 for the right-eye image and a subject 62 for the left-eye image.
• the subject 61 and the subject 62 are the same subject; since this subject was present at a position shifted from the focal position at the time of shooting, it is displayed on the display unit 40 separated by the parallax amount D 1 .
• the viewer recognizes that the subject exists at the position of the lifting distance L1 (represented by a square in the drawing) where the straight line connecting the right eye 51 and the subject 61 intersects the straight line connecting the left eye 52 and the subject 62.
• the convergence angle at this time is θ 1 as shown in the figure.
• if the convergence angle is large, the video feels unnatural and causes eye strain. Therefore, when image processing is performed using the stereoscopic adjustment parameter in the present embodiment, adjusted image data in which the parallax amount is adjusted by the stereoscopic adjustment parameter as described above is generated.
  • the figure shows a state where the adjusted image data is reproduced over the non-adjusted image data.
  • the display unit 40 displays a subject 71 of the right-eye image and a subject 72 of the left-eye image of the adjustment image data.
• the subject 71 and the subject 72 are the same subject, and are the same subject as the subjects 61 and 62.
• the subject 71 and the subject 72 are displayed on the display unit 40 separated by the parallax amount D 2 .
  • the viewer recognizes that the subject exists at the position of the lifting distance L2 (represented by a triangle in the figure) where the straight line connecting the right eye 51 and the subject 71 intersects with the straight line connecting the left eye 52 and the subject 72.
• the convergence angle at this time is θ 2 , which is smaller than θ 1 . Therefore, the viewer does not feel an extreme lifting sensation, and the accumulation of eye strain can be reduced. Note that, since the amount of parallax is appropriately adjusted as will be described later, the viewer can appreciate the video with a comfortable floating sensation (or a sinking sensation, a stereoscopic effect in the opposite direction, when the defocus relationship is reversed).
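The convergence-angle geometry of FIG. 12 can be sketched as follows; the 65 mm eye separation is a typical assumed value, not one given in the text, and the crossed-disparity sign convention is an assumption of the sketch.

```python
import math

def convergence_angle(disparity, viewing_distance, eye_separation=0.065):
    """Sketch of the geometry in FIG. 12: two corresponding subjects
    separated by `disparity` (in metres, crossed/positive disparity) on
    the screen are fused at the point where the lines of sight cross,
    and the angle between those lines is the convergence angle.
    """
    # Similar triangles give the perceived (lifting) distance of the
    # fused point; the eyes then subtend the convergence angle there.
    perceived = viewing_distance * eye_separation / (eye_separation + disparity)
    return 2.0 * math.atan(eye_separation / (2.0 * perceived))
```

Reducing the on-screen disparity from D 1 to D 2 reduces the returned angle, which is exactly the θ 1 → θ 2 change the figure describes.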
• although the parallax amount in the description of FIG. 12 was represented by the separation distance on the display unit 40, the parallax amount can be defined in various formats. It may be defined in pixel units in the captured image data, or it may be defined by a shift amount.
• the adjustment of the parallax amount can also be executed by various methods other than changing the value of the stereoscopic adjustment parameter C.
  • a method of adjusting the parallax amount without changing the value of the three-dimensional adjustment parameter C will be described.
  • FIG. 13 is a diagram schematically illustrating the relationship between the contrast indicating the sharpness of an image and the amount of parallax.
  • the horizontal axis represents the distance from the digital camera 10, and the vertical axis represents the amount of parallax and the height of contrast.
• the digital camera 10 is focused on the main subject located at the distance L p .
• the contrast curve 1610 forms a convex curve that peaks at the distance L p , the distance to the focal position, and illustrates how the image gradually blurs as the distance from L p increases in either direction.
• The parallax amount curve 1620 indicates a parallax amount of 0 at the distance L p , and rises with increasing slope as the position approaches the digital camera 10 from the distance L p . That is, the parallax amount curve 1620 shows a positive value on the near side of the distance L p , indicating that the closer the subject is, the more lifted its image appears.
• the parallax amount curve 1620 falls with decreasing slope as the position recedes beyond the distance L p away from the digital camera 10. That is, the parallax amount curve 1620 shows a negative value on the far side of the distance L p , indicating that the more distant the subject is, the more it appears to sink, though gradually.
• if the subjects composing the scene are distributed between the distance L f (the parallax amount at this time is +m) and the distance L r (the parallax amount at this time is −m), the parallax amounts fall within the allowable range. That is, if the near subject closest to the digital camera 10 exists at the distance L f and the farthest far subject exists at the distance L r , the viewer can comfortably enjoy the 3D video without the parallax amount being adjusted in subsequent image processing.
• however, if a near-point subject exists at a distance L f ′ (the parallax amount at this time is +m′) nearer than the distance L f , the allowable parallax amount is exceeded, and the viewer feels discomfort and fatigue.
  • FIG. 14 is a diagram schematically illustrating the relationship between the subject distribution and the amount of parallax.
• FIG. 14 corresponds to the diagram of FIG. 13 with the contrast curve 1610 removed.
• in FIG. 14A, the focused subject exists at the distance L 10 , the near-point subject at L 20 , and the far-point subject at L 30 .
• the parallax amount range set as the allowable range is from −m to +m, and the value of the parallax amount curve 1620 at the distance L 30 of the far-point subject is within this range. However, the value of the parallax amount curve 1620 at the distance L 20 of the near-point subject exceeds +m.
• FIG. 14B is a diagram showing the concept of the parallax amount when, from the subject situation of FIG. 14A, the focused subject is moved rearward from the distance L 10 to the distance L 11 .
• since the distance L 11 is now the focal position, the parallax amount for the image of the near-point subject, which has not moved (the distance L 20 ), becomes considerably larger than in FIG. 14A, as indicated by the parallax amount curve 1620. That is, the excess over the allowable range increases.
• FIG. 14C shows the concept of the parallax amount when, from the subject situation of FIG. 14B, the near-point subject is moved rearward from the distance L 20 to the distance L 21 , and further to the distance L 22 .
• since the focal position remains at the distance L 11 , the parallax amount curve 1620 draws the same curve as in FIG. 14B; however, because the near-point subject has shifted rearward, the parallax amount at the distance L 21 , while still exceeding the allowable range, exceeds it by less than in FIG. 14B. If the subject moves further to the distance L 22 , the parallax amount falls within the allowable range.
  • the subject distribution in the depth direction with respect to the scene and the position of the subject to be focused are parameters that determine whether or not the parallax amount falls within the set allowable range.
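This observation can be sketched as a simple check; `parallax_curve` is a hypothetical stand-in for curve 1620 under the chosen focus position, and representing the subject distribution as a list of distances is an assumption of the sketch.

```python
def parallax_within_limits(subject_distances, parallax_curve, m_max):
    """Sketch of the point above: given the depth distribution of the
    subjects and a parallax amount curve determined by the focus
    position, check whether every subject's parallax stays within the
    allowable range -m_max..+m_max.
    """
    return all(abs(parallax_curve(d)) <= m_max for d in subject_distances)
```

Moving the focus position changes `parallax_curve`, and moving subjects changes `subject_distances`; either can bring the check from failure to success, as FIGS. 14A to 14C illustrate.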
  • FIG. 15 is a diagram schematically illustrating the relationship between the aperture value and the amount of parallax.
  • the horizontal axis represents the distance from the digital camera 10
  • the vertical axis represents the amount of parallax and the height of contrast.
• FIG. 15A shows a state where the aperture value is F1.4
  • FIG. 15B shows a state where the aperture value is F4
  • FIG. 15C shows a state where the aperture value is F8.
• the focal length of the taking lens 20 is the same in all states, and the digital camera 10 is focused on the main subject located at the distance L 10 .
• The contrast curve 1610 is highest at the distance L 10 , the distance to the focal position, in every state.
• however, as the aperture 22 is stopped down, that is, as the aperture value is increased, relatively high contrast is obtained even in front of and behind the focal position. That is, the depth of field becomes deeper as the image is taken with the aperture 22 stopped down further.
  • the parallax amount curve 1620 shows a parallax amount of 0 at the distance L10, and its slope increases as the subject approaches the digital camera 10 from the distance L10.
  • conversely, the slope of the parallax amount curve 1620 becomes smaller with increasing distance beyond L10 from the digital camera 10.
  • the parallax amount curve 1620 becomes gentler as the aperture value increases. That is, compared with the case where the aperture value is F1.4, the parallax amounts in front of and behind the focal position become smaller at F4 and smaller still at F8. If the viewer feels no discomfort or fatigue when the parallax amount falls within the range of −m to +m, the entire parallax amount curve 1620 falls within this range when the aperture value is F8. In that case, no matter at what distance a subject is present, the viewer can comfortably appreciate the 3D video.
  • at F1.4, the parallax amount exceeds +m on the short-distance side of the parallax amount curve 1620, in the region closer than the distance L24.
  • at F4, the parallax amount curve 1620 exceeds +m in the region closer than the distance L25.
  • since the slope of the parallax amount curve 1620 at F4 is gentler than that at F1.4, the relationship L25 < L24 holds.
  • the conditions that affect the parallax amount are changed so that the parallax amount between the generated images falls within the target parallax amount (allowable parallax amount: for example, the range of ±m).
  • an imaging condition that affects the amount of parallax is changed, or a value of a stereoscopic adjustment parameter used for image processing is changed.
  • as described above, the aperture value affects the amount of parallax. Therefore, the aperture value may be changed according to the detected subject distribution so that the amount of parallax between the output parallax images falls within the allowable amount of parallax. For example, in the situation of FIG. 15(a) (the initial aperture value is F1.4 and the focused subject is at the distance L10), when a near-point subject is present at the distance L25, the amount of parallax exceeds +m. Therefore, the determination unit 232 changes the aperture value from F1.4 to F4, the aperture value at which the parallax amount for the subject at the distance L25 is +m.
  • the aperture value is changed to a large value not only when the near-point subject exceeds the allowable parallax amount range but also when the far-point subject exceeds the allowable parallax amount range.
  • conversely, the aperture value may be changed to a small value, that is, in the direction in which the aperture 22 is opened.
  • in that case, to maintain the exposure, the shutter speed can be changed to the high-speed side
  • and the ISO sensitivity can be changed to the low-sensitivity side.
  • for example, the relationship between the in-focus subject distance and the parallax amount curve 1620 for each aperture value can be prepared in advance as a lookup table.
  • the determination unit 232 can then extract and determine the aperture value to be set by referring to the lookup table with the subject distribution and the allowable parallax amount as input values.
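The table consultation described in these two points can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the toy parallax model, the candidate F-numbers, and the selection policy (prefer the most open aperture that keeps every subject within ±m) are all assumptions of the example.

```python
# Hypothetical sketch of choosing an aperture value so that the parallax
# amounts for all subjects in the depth distribution stay within [-m, +m].

def parallax(f_number, focus_dist, subject_dist, k=8.0):
    # Toy monotone model: zero at the focus distance, larger for nearer
    # subjects, and smaller overall as the aperture is stopped down.
    return k / f_number * (focus_dist - subject_dist) / subject_dist

def choose_aperture(f_numbers, focus_dist, subject_dists, m):
    """Return the smallest F-number that keeps all subjects within +/- m."""
    for f in sorted(f_numbers):            # prefer the most open aperture
        amounts = [parallax(f, focus_dist, d) for d in subject_dists]
        if all(-m <= a <= m for a in amounts):
            return f
    return max(f_numbers)                  # fall back to the most stopped down

if __name__ == "__main__":
    f_numbers = [1.4, 4.0, 8.0]
    # Focused subject at 2.0 m, near-point subject at 0.8 m, far point at 10 m.
    print(choose_aperture(f_numbers, 2.0, [0.8, 2.0, 10.0], m=2.0))
```

In a real implementation the `parallax` function would be replaced by the precomputed lookup table entries for each aperture value and in-focus distance.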
  • FIG. 16 is a diagram schematically showing the concept of focus shift.
  • the vertical axis and the horizontal axis are the same as those in FIG.
  • the contrast curve 1610 and the parallax amount curve 1620 represent the contrast and the parallax amount when the focus lens is moved to focus on the subject present at the distance L10.
  • when the peak value of the contrast curve 1610 exceeds the focus threshold Es, the image is evaluated as being in focus.
  • however, when the parallax amount curve 1620 is referred to, the parallax amount for the near-point subject is +m0, which exceeds the allowable parallax amount +m. Therefore, in the focus shift, the focus lens position is corrected, within the range in which the contrast stays above the focus threshold Es, so that the parallax amount at the distance L27 falls within the allowable range.
  • specifically, a parallax amount curve 1621 whose parallax amount for the near-point subject is +m is selected, and the distance Lp at which the parallax amount is 0 on the parallax amount curve 1621 is extracted. Then, the focus lens position is changed so that the distance Lp becomes the in-focus position.
  • the contrast curve 1611 is the contrast curve at this time. Since the subject is actually present at the distance L10, the contrast value for the subject is reduced by Δe as shown. It suffices that the contrast value at this time still exceeds the focus threshold Es. In this way, although the contrast value for the main subject is slightly reduced, an image shot with the changed focus lens position can still be evaluated as in focus, and the parallax amount for the near-point subject falls within the allowable range.
  • note that there are cases where the correction of the focus lens position is not possible. That is, when the parallax amount for the near-point subject on the parallax amount curve 1620 greatly exceeds the allowable parallax amount, the parallax amount cannot be brought within the allowable range by changing the focus lens position within the range above the focus threshold Es. In this case, this method may be used in combination with other methods, such as changing the aperture value to a larger value.
  • also in this case, a lookup table prepared in advance as the relationship between the in-focus subject distance and the parallax amount curve for each aperture value may be used.
  • the determination unit 232 can extract and determine the distance Lp by referring to the lookup table with the subject distribution and the allowable parallax amount as input values.
  • the control unit 201 changes the position of the focus lens so as to correspond to the distance Lp. The control unit 201 then determines whether the resulting contrast value exceeds the focus threshold Es. If it is determined that it exceeds the threshold, the photographing sequence is continued as it is. If it is determined that it does not, the focus lens position is returned and control shifts to another method.
  • alternatively, without actually moving the focus lens, the determination unit 232 may calculate the attenuation of the contrast at L10 that would result from shifting the focal position to Lp, and the control unit 201 may judge whether it still exceeds the focus threshold Es.
  • in the contrast AF method, the evaluation values already obtained during focus adjustment with respect to the distance L10 may also be referred to.
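The focus-shift decision described above (move the in-focus position toward the near-point subject until its parallax amount reaches +m, unless the predicted contrast for the real main subject would fall below the focus threshold Es) can be sketched as follows. The parallax and contrast models here are hypothetical toy functions, not the camera's actual evaluation values.

```python
# Illustrative sketch of the focus-shift idea: walk the in-focus position
# from L10 toward the near-point subject until that subject's parallax
# amount drops to +m, while the predicted contrast for the real main
# subject at L10 stays above the focus threshold Es.

def parallax_at(focus, subject, k=6.0):
    return k * (focus - subject) / subject        # 0 when focus == subject

def contrast_at(focus, subject, peak=1.0, width=0.3):
    return peak / (1.0 + ((focus - subject) / width) ** 2)

def shift_focus(l10, near, m, es, step=0.01):
    """Return the first focus position where the near-point parallax is
    within +m, or None if the contrast for the subject at l10 would fall
    below es first (correction not allowed)."""
    focus = l10
    while parallax_at(focus, near) > m:
        focus -= step                             # move focus toward camera
        if contrast_at(focus, l10) < es:
            return None
    return focus

if __name__ == "__main__":
    print(shift_focus(l10=2.0, near=1.0, m=4.5, es=0.5))
    print(shift_focus(l10=2.0, near=1.0, m=3.0, es=0.5))  # too far out of range
```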
  • the lookup table 2310 for determining the value of the stereoscopic adjustment parameter C can be used unchanged regardless of the imaging conditions that affect the amount of parallax (the aperture value, the focus position, and, when the photographing lens 20 is a zoom lens, the focal length). Therefore, the parallax amount adjustment methods described above can be used in combination with the method of adjusting the parallax amount by changing the value of the stereoscopic adjustment parameter C.
  • FIG. 17 is a diagram schematically illustrating the relationship between the parallax amount, the value of the three-dimensional adjustment parameter, the aperture value, and the like.
  • the horizontal axis represents the distance from the digital camera 10
  • the vertical axis represents the amount of parallax.
  • the digital camera 10 is focused on the subject located at the distance L10 (the focused subject in the drawing). However, the image captured by the digital camera 10 includes, in addition to the subject at the distance L10, subjects located at the distance L20 and the distance L30 (the near-point subject and the far-point subject in the drawing).
  • the unadjusted parallax amounts for these three subjects are the parallax amount m10, the parallax amount m20, and the parallax amount m30, respectively.
  • the determination unit 232 adjusts the parallax amount by changing the aperture value.
  • the parallax amount curve 1622 is transformed into the parallax amount curve 1624, and the parallax amounts given to the focused subject, the near-point subject, and the far-point subject become the parallax amount m101, the parallax amount m201, and the parallax amount m301, respectively.
  • the focus position may be changed instead of the aperture value.
  • the calculation unit 231 calculates, from the left and right parallax image data, the parallax amount m101, the parallax amount m201, and the parallax amount m301, which have already been adjusted by changing the aperture value, the focus position, and the like, as the parallax amounts to be further adjusted by the value of the stereoscopic adjustment parameter C.
  • the determination unit 232 refers to the lookup table 2310 and determines, for each subject, the value of the stereoscopic adjustment parameter C corresponding to the unadjusted parallax amount m101, parallax amount m201, and parallax amount m301.
  • the value of the stereoscopic adjustment parameter C determined in this way is equal to the value determined by the following method. That is, first, the adjusted parallax amount curve 1625 is generated with respect to the parallax amount curve 1624 instead of the parallax amount curve 1622. Then, the value of the stereoscopic adjustment parameter C is calculated so that the unadjusted parallax amount is adjusted to the target value of the parallax amount. In this case, for a subject whose parallax amount exceeds the allowable parallax amount, the parallax can be suppressed so that the viewer does not feel discomfort or fatigue, while for a subject whose parallax amount falls within the allowable parallax amount, the parallax can be further emphasized.
  • by using adjusted parallax amount curves having different shapes as the target, effects different from the one obtained when the lookup table 2310 is used to determine the value can be obtained. Therefore, the variations in the adjustment amount of the parallax amount can be increased.
  • a function used to determine the target value of the parallax amount can be set in advance so that the adjusted parallax amount curve 1625 is obtained for each imaging condition. In this case, the amount of data increases, but the amount of parallax can be adjusted without changing the shooting conditions.
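The per-subject determination of C described above can be sketched as follows. This is a hypothetical sketch: C is modelled here as a simple multiplicative gain on the parallax amount (C = 1 keeps the parallax unchanged), which is an assumption of this example and not the patent's formulas (1) and (2); the target curve and the mild in-range emphasis factor are also illustrative placeholders.

```python
# Hypothetical sketch of determining a per-subject stereoscopic adjustment
# parameter C from the unadjusted parallax amount, so that over-limit
# subjects are suppressed and in-range subjects keep (or slightly gain)
# parallax.

def target_parallax(p, m):
    """Target curve: mildly emphasize parallax inside +/- m, clamp anything
    beyond the allowable amount to the limit."""
    if -m <= p <= m:
        return p * 1.1 if abs(p) * 1.1 <= m else p  # mild emphasis in range
    return m if p > m else -m

def adjustment_parameter(p, m):
    """Per-subject C chosen so that C * p equals the target parallax."""
    if p == 0:
        return 1.0
    return target_parallax(p, m) / p

if __name__ == "__main__":
    m = 3.0
    for p in (0.0, 1.0, 2.9, 5.0, -4.0):   # unadjusted parallax per subject
        print(p, round(adjustment_parameter(p, m), 3))
```

In the patent's scheme the mapping from unadjusted parallax to C is read out of the lookup table 2310 rather than computed from a closed-form target curve.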
  • FIG. 18 is a diagram for explaining subject designation.
  • FIG. 18A shows a subject distribution in the depth direction from the digital camera 10 in a certain scene
  • FIG. 18B is a rear view of the digital camera 10 displaying the scene in a live view.
  • the scene is composed of, in order from the digital camera 10, an adult 301 (distance Lf), a boy 302 (distance Lp), and a girl 303 (distance Lr). Then, as shown in FIG. 18(b), a live view image capturing this scene is displayed on the display unit 209.
  • the boy 302 is the focused subject.
  • it is preferable to suppress the amount of parallax by increasing the adjustment amount of the parallax amount for subjects that are not the photographer's main subject. Accordingly, it is not always necessary to maintain the amount of parallax even for a subject whose parallax amount is within the range of −m to +m.
  • the control unit 201 receives a photographer's instruction as to which subject should be the main subject.
  • the display unit 209 displays a title 320 (for example, “Please set the main subject area”) indicating that the user instruction is accepted. In this state, the user adjusts the position and size of the frame 310 to specify a range of the area including the subject image that is desired to be the main subject.
  • the display unit 209 is provided with a touch panel 2083 as a part of the operation unit 208, and the control unit 201 acquires the output of the touch panel 2083 and determines which subject is the main subject.
  • as a result, the adult 301, who is not included in the frame 310, is designated as a subject whose parallax amount should be reduced.
  • for such a subject, regardless of whether the unadjusted parallax amount is within the range of −m to +m, the determination unit 232 determines, as the value of the stereoscopic adjustment parameter C, a value closer to 0.5 than the value calculated from the lookup table 2310.
  • as a result, compared with the case where the value of the stereoscopic adjustment parameter C calculated from the lookup table 2310 is used as it is, the adjustment amount of the parallax amount becomes larger and the parallax amount is suppressed.
  • the amount of adjustment of the parallax amount for each subject is, for example, as shown superimposed on each subject in FIG.
  • these adjustment amounts mean that the degree of adjustment of the parallax amount is larger as the absolute value is larger.
  • a positive value (for example, +2) means that the amount of parallax is adjusted in the direction in which the distance from the digital camera 10 increases, and a negative value (for example, −1) means that the amount of parallax is adjusted in the direction in which the distance from the digital camera 10 decreases.
  • that is, for the boy 302 and the girl 303, who are the main subjects, the adjustment amount of the parallax amount is kept small and the stereoscopic effect is maintained,
  • while for the adult 301, who is not the main subject, the adjustment amount of the parallax amount is increased and the amount of parallax is suppressed.
  • FIG. 19 is a processing flow in moving image shooting according to the first embodiment.
  • the flow in the figure starts when the mode button is operated by the photographer and the auto 3D video mode is started.
  • the parallax amount range is set in advance by the photographer.
  • the determination unit 232 acquires the parallax amount range set by the photographer from the system memory in step S11. In step S12, the control unit 201 executes AF and AE.
  • in step S13, the control unit 201 receives from the user, via the touch panel 2083, the designation of the range of an area including the subject image desired to be the main subject, as described with reference to FIG. 18.
  • the control unit 201 then waits for the photographer to press the recording start button and issue a recording start instruction.
  • when the control unit 201 detects the recording start instruction (YES in step S15), it proceeds to step S17. If no instruction is detected, the process returns to step S12. Note that after returning to step S12, the designated subject may be tracked and the processing of step S13 may be skipped.
  • in step S17, the control unit 201 executes AF and AE again in accordance with the changed shooting conditions.
  • in step S18, the control unit 201 performs charge accumulation and readout of the image sensor 100 via the drive unit 204, and acquires captured image data as one frame. Depending on the subject distribution and the shooting conditions, the amount of parallax between the parallax images in the captured image data acquired here may not fall within the set parallax amount range.
  • the parallax image data generation unit 233 receives the value of the stereoscopic adjustment parameter C determined by the determination unit 232 and the captured image data, and generates left-viewpoint color image data (RLtc plane data, GLtc plane data, BLtc plane data) and right-viewpoint color image data (RRtc plane data, GRtc plane data, BRtc plane data). The specific processing will be described later.
  • in step S19, for the case where the photographer wants to change the main subject during moving image shooting, the control unit 201 accepts from the user the range designation of the area including the subject image desired to be the main subject. If the control unit 201 determines in step S21 that it has not received a recording stop instruction from the photographer, it returns to step S17 and executes the processing of the next frame. If it determines that a recording stop instruction has been received, the process proceeds to step S22.
  • in step S22, the moving image generating unit 234 connects the continuously generated left-viewpoint color image data and right-viewpoint color image data, executes format processing according to a 3D-compatible moving image format such as Blu-ray 3D, and generates a moving image file. Then, the control unit 201 records the generated moving image file on the memory card 220 via the memory card IF 207, and the series of steps ends.
  • note that the recording to the memory card 220 may be executed sequentially in synchronization with the generation of the left-viewpoint color image data and the right-viewpoint color image data, and the file closing process may be executed in synchronization with the recording stop instruction.
  • the control unit 201 is not limited to recording on the memory card 220, and may be configured to output to an external device via a LAN, for example.
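The recording loop of FIG. 19 (capture a frame, generate left- and right-viewpoint data, repeat until a stop instruction, then format the file) can be sketched roughly as follows. Every function here is a hypothetical placeholder, not the camera's actual API.

```python
# Rough sketch of the FIG. 19 moving-image loop: frames are captured and
# converted to left/right viewpoint data until a stop instruction arrives,
# then all frames are wrapped into one 3D movie file.

def record_3d_movie(capture_frame, generate_views, stop_requested, make_file):
    left_frames, right_frames = [], []
    while not stop_requested():
        frame = capture_frame()               # steps S17-S18
        left, right = generate_views(frame)   # parallax image generation
        left_frames.append(left)
        right_frames.append(right)
    return make_file(left_frames, right_frames)  # step S22

if __name__ == "__main__":
    state = {"frame": 0}

    def capture():
        state["frame"] += 1
        return state["frame"]

    movie = record_3d_movie(
        capture_frame=capture,
        generate_views=lambda f: (("L", f), ("R", f)),
        stop_requested=lambda: state["frame"] >= 3,   # stop after 3 frames
        make_file=lambda l, r: {"left": l, "right": r},
    )
    print(len(movie["left"]), len(movie["right"]))
```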
  • FIG. 20 is the processing flow of step S33, up to the generation of the parallax color image data, that is, the color image data of the left viewpoint and the color image data of the right viewpoint.
  • the parallax image data generation unit 233 acquires captured image data in step S101.
  • the captured image data is plane-separated into image data without parallax and parallax image data.
  • the parallax image data generation unit 233 executes an interpolation process for interpolating vacancies existing in the separated plane data as described with reference to FIG.
  • the parallax image data generation unit 233 initializes each variable in step S104. Specifically, first, 1 is substituted into the color variable Cset.
  • in step S107, the calculation unit 231 calculates, from the left and right parallax image data, the unadjusted parallax amount for the subject displayed at the target pixel position (i, j).
  • the determination unit 232 acquires the value of the stereoscopic adjustment parameter C corresponding to the parallax amount calculated by the calculation unit 231 from the lookup table 2310.
  • note that the determination unit 232 may determine, as the value of the stereoscopic adjustment parameter C, a value closer to 1 than the value calculated from the lookup table 2310 for a subject that the photographer has determined not to be the main subject, even if its parallax amount is within the range of −m to +m. Further, the determination unit 232 may calculate the luminance value of the subject included in the image and further adjust the value of the stereoscopic adjustment parameter C so that the parallax amount decreases as the luminance value decreases.
  • the value of the stereoscopic adjustment parameter C is the value determined in step S108 for the subject displayed by the target pixel.
  • the parallax image data generation unit 233 increments the parallax variable S in step S110.
  • in step S111, it is determined whether the parallax variable S exceeds 2. If not, the process returns to step S109. If it does, the process proceeds to step S112.
  • in step S112, the parallax image data generation unit 233 assigns 1 to the parallax variable S and increments the coordinate variable i. Then, in step S113, it is determined whether the coordinate variable i exceeds i0. If not, the process returns to step S105. If it does, the process proceeds to step S114.
  • in step S114, the parallax image data generation unit 233 assigns 1 to the coordinate variable i and increments the coordinate variable j. Then, in step S115, it is determined whether the coordinate variable j exceeds j0. If not, the process returns to step S105. If it does, the process proceeds to step S116.
  • in step S117, the parallax image data generation unit 233 assigns 1 to the coordinate variable j and increments the color variable Cset.
  • in step S118, it is determined whether the color variable Cset exceeds 3. If not, the process returns to step S105. If it does, the left-viewpoint color image data (RLtc plane data, GLtc plane data, BLtc plane data) and the right-viewpoint color image data (RRtc plane data, GRtc plane data, BRtc plane data) are all complete, and the process returns to the flow of FIG. 19.
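The nested loop of FIG. 20 (color plane Cset = 1..3, pixel coordinates (i, j), parallax variable S = 1..2) can be sketched as follows. `calc_parallax`, `lookup_C`, and `apply_C` are hypothetical stand-ins for steps S107 to S109, and the dictionary representation of the planes is illustrative only.

```python
# Sketch of the FIG. 20 loop structure: for each color plane, each target
# pixel, and each parallax direction (S = 1: left, S = 2: right), an
# adjusted parallax pixel value is produced using the per-pixel
# stereoscopic adjustment parameter C.

def generate_parallax_planes(width, height, calc_parallax, lookup_C, apply_C):
    left = {}
    right = {}
    for cset in (1, 2, 3):                  # R, G, B planes
        for j in range(1, height + 1):
            for i in range(1, width + 1):
                p = calc_parallax(i, j)     # unadjusted parallax (step S107)
                c = lookup_C(p)             # adjustment parameter (step S108)
                for s in (1, 2):            # left / right viewpoint (step S109)
                    value = apply_C(cset, i, j, s, c)
                    (left if s == 1 else right)[(cset, i, j)] = value
    return left, right

if __name__ == "__main__":
    # Toy stand-ins just to exercise the loop structure.
    l, r = generate_parallax_planes(
        2, 2,
        calc_parallax=lambda i, j: float(i - j),
        lookup_C=lambda p: 1.0 if abs(p) < 1 else 0.5,
        apply_C=lambda cset, i, j, s, c: c * (1 if s == 1 else -1),
    )
    print(len(l), len(r))
```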
  • FIG. 21 is a processing flow in moving image shooting according to the second embodiment.
  • processing equivalent to that in the processing flow of FIG. 19 is denoted by the same step numbers, and its description is omitted except for differing and additional processing.
  • when the control unit 201 detects a recording start instruction in step S15 (YES in step S15), the process proceeds to step S16.
  • in step S16, the determination unit 232 changes the shooting conditions, and the process proceeds to step S17.
  • specifically, in step S16, the determination unit 232 changes the aperture value as described with reference to FIG. 15, or changes the focus lens position as described with reference to FIG. 16.
  • in either case, an imaging condition that affects the amount of parallax may be changed so that the parallax amount falls within the set parallax amount range. For example, if the photographing lens 20 is a zoom lens, the focal length can be changed.
  • in step S107 in the processing of step S33, the calculation unit 231 calculates, from the left and right parallax image data, the unadjusted parallax amount for the subject displayed by the pixel at the target pixel position (i, j).
  • this parallax amount is the parallax amount that has already been adjusted by changing the aperture value, the focus position, and the like. Also, in this flow, if the control unit 201 determines in step S21 that a recording stop instruction has not been received from the photographer, it returns to step S16 and executes the processing of the next frame.
  • FIG. 22 is a diagram illustrating a preferred opening shape.
  • each of the openings 105 and 106 has a shape in contact with a virtual center line 322 passing through the center of the pixel or a shape straddling the center line 322.
  • the shape of the opening 105 and the shape of the opening 106 are preferably the same as the respective shapes obtained by dividing the shape of the opening 104 of the non-parallax pixel by the center line 322.
  • the shape of the opening 104 is preferably equal to the shape in which the shape of the opening 105 and the shape of the opening 106 are adjacent to each other.
  • the calculation formula used by the parallax image data generation unit 233 employs the above formulas (1) and (2) using the weighted arithmetic mean, but it is not limited to this; various calculation formulas can be adopted. For example, a weighted geometric mean expressed in the same manner as formulas (1) and (2) can be adopted as the calculation formula. In this case, the amount of blur maintained is not the amount of blur due to the output of the non-parallax pixels but the amount of blur due to the output of the parallax pixels.
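Since formulas (1) and (2) themselves are not reproduced in this excerpt, the following only illustrates the generic difference between a weighted arithmetic mean and a weighted geometric mean of a non-parallax pixel value and a parallax pixel value; the weight `w` is a hypothetical stand-in for the patent's weighting, not its actual coefficients.

```python
# Generic weighted arithmetic vs. geometric mean of a non-parallax pixel
# value n and a parallax pixel value lt, with blending weight w in [0, 1].

def weighted_arithmetic(n, lt, w):
    return (1.0 - w) * n + w * lt

def weighted_geometric(n, lt, w):
    return (n ** (1.0 - w)) * (lt ** w)

if __name__ == "__main__":
    n, lt, w = 100.0, 60.0, 0.5
    print(weighted_arithmetic(n, lt, w))   # 80.0
    print(weighted_geometric(n, lt, w))    # ~77.46, i.e. sqrt(100 * 60)
```

The geometric mean weights the multiplicative structure of the two signals, which is consistent with the text's remark that the blur character of the parallax-pixel output, rather than the non-parallax output, dominates the result.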
  • FIG. 23 is a diagram for explaining cooperation between the digital camera 10 and the TV monitor 80.
  • the TV monitor 80 includes a display unit 40 made of, for example, liquid crystal, a memory card IF 81 that receives the memory card 220 taken out from the digital camera 10, a remote controller 82 that is operated by a viewer at hand, and the like.
  • the TV monitor 80 is compatible with 3D image display.
  • the display format of the 3D image is not particularly limited.
  • the right-eye image and the left-eye image may be displayed in a time-division manner, or may be an interlace in which strips are arranged in a horizontal or vertical direction. Further, it may be a side-by-side format arranged on one side and the other side of the screen.
  • the TV monitor 80 decodes a moving image file that includes the color image data of the left viewpoint and the color image data of the right viewpoint, and displays a 3D image on the display unit 40.
  • the TV monitor 80 serves as a general display device that displays a standardized moving image file.
  • however, the TV monitor 80 can also function as an image processing apparatus that bears at least part of the functions of the control unit 201 and the image processing unit 205 described with reference to FIG. 1.
  • an image processing unit including the calculation unit 231, the determination unit 232, the parallax image data generation unit 233, and the moving image generation unit 234 described in FIG. 1 is incorporated in the TV monitor 80.
  • the digital camera 10 does not perform image processing using the stereoscopic adjustment parameter, and associates the depth information detected by the depth information detection unit 230 with the generated captured image data.
  • the TV monitor 80 determines the value of the stereoscopic adjustment parameter C for each subject with reference to the associated depth information, and performs image processing using the stereoscopic adjustment parameter C for the acquired image data. Execute.
  • the TV monitor 80 displays the 3D image with the parallax amount adjusted in this way on the display unit 40.
  • the viewer may be configured to be able to input some adjustment information during playback on the TV monitor 80.
  • the viewer can input the parallax amount range by operating the remote controller 82.
  • the TV monitor 80 acquires the input parallax amount range as adjustment information, and the determination unit 232 determines the value of the stereoscopic adjustment parameter C according to that range. With this configuration, the TV monitor 80 can display a 3D image that matches each viewer's preference.
  • as described above, in the present embodiment, the parallax amount of each of the plurality of subjects included in the image of the captured image data is calculated, and the value of the stereoscopic adjustment parameter C is set for each of the plurality of subjects according to the calculated parallax amount. Then, the value of the stereoscopic adjustment parameter C is applied to the captured image data to generate parallax image data. Therefore, unlike the case where a single value of the stereoscopic adjustment parameter C is applied to the entire captured image data, the amount of parallax adjustment can be intentionally made different between the main subject and subjects that are not the main subject. Since the parallax amount can be suppressed for subjects that are not the main subject while the stereoscopic effect of the main subject is preserved, a sense of discomfort and fatigue during viewing can be reduced.
  • in particular, when the calculated parallax amount exceeds a threshold value, the value of the stereoscopic adjustment parameter C is determined so that the parallax amount is adjusted to be below the threshold value, so a sense of discomfort and fatigue can be reliably reduced.
  • also, when some subjects are continuous in the depth direction of the scene, the value of the stereoscopic adjustment parameter C is determined so that the adjusted parallax amount is continuous in the depth direction for these subjects. Therefore, the parallax amount can be prevented from changing discretely between subjects that are continuous in the depth direction, and a sense of discomfort and fatigue during viewing can be reliably reduced.
  • furthermore, the value of the stereoscopic adjustment parameter C is determined so that the differential value of the adjusted parallax amount is continuous in the depth direction for these subjects. Therefore, the change in the parallax amount between subjects that are continuous in the depth direction can be made smooth, and a sense of discomfort and fatigue during viewing can be reliably reduced.
  • also, when the user designates, from among the plurality of subjects included in the image, a subject whose parallax amount is to be suppressed by applying the value of the stereoscopic adjustment parameter C, the parallax amount of the designated subject is suppressed. Accordingly, a sense of discomfort and fatigue during viewing can be reliably reduced.
  • also, the luminance value of each of the plurality of subjects included in the image is calculated, and the value of the stereoscopic adjustment parameter C is determined so that the parallax amount decreases as the luminance value decreases. Therefore, a large parallax can be prevented from being given to regions of the image with small luminance values, that is, dark regions, and a sense of discomfort and fatigue during viewing can be reliably reduced.
  • the TV monitor 80 has been described as an example of the image processing apparatus.
  • the image processing apparatus can take various forms.
  • a device such as a PC, a mobile phone, or a game device that includes or is connected to the display unit can be an image processing apparatus.
  • the configuration of outputting image data in which the amount of parallax is adjusted based on the detected depth information can of course be applied to still image shooting.
  • the still image shot in this way does not cause extreme parallax between the left and right images, and does not give the viewer a sense of incongruity.
  • in the above description, the target subject is designated by the user's instruction, but the control unit 201 may select the subject automatically.
  • for example, the control unit 201 can set only the human figures included in the scene as target subjects through person recognition processing.
  • parallax image data may be generated from image data (such as a CG image) that is not a captured image.
  • in the above description, the determination unit 232 is described as determining the value of the stereoscopic adjustment parameter C using one lookup table 2310.
  • however, a plurality of lookup tables 2310 may be stored in the determination unit 232, and the value of the stereoscopic adjustment parameter C may be determined using the lookup table 2310 selected by the photographer.
  • the amount of parallax adjustment for each subject can be selected.
  • selecting the lookup table 2310 is equivalent to selecting a function used to determine the target value of the parallax amount, that is, a function such as the adjusted parallax amount curve 1623.
  • the plurality of lookup tables 2310 can be generated from adjusted parallax amount curves having different shapes.
  • in the above description, the value of the stereoscopic adjustment parameter C is determined using the lookup table 2310. Instead, functions of the adjusted parallax amount curve 1623 may be stored in the digital camera 10 for each imaging condition in accordance with the principle described above, and the value of the stereoscopic adjustment parameter C may be determined using these functions together with information on the subject distribution in the depth direction and the focus position.
  • also in this case, a plurality of functions of the adjusted parallax amount curve 1623 may be stored in the digital camera 10 for the same shooting condition, and the user may select any one of them for use by operating the operation unit 208.
  • in the above description, when some subjects are continuous in the depth direction of the scene, the value of the stereoscopic adjustment parameter C is described as being determined so that the differential value of the adjusted parallax amount is continuous in the depth direction for these continuous subjects. However, in such a case, it is not always necessary to determine the value of the stereoscopic adjustment parameter C so that the differential values are continuous.
  • for example, the value of the stereoscopic adjustment parameter C may be determined so that the parallax amount is adjusted to ±m.
  • in this case, the differential value becomes discontinuous at the boundary between the frame W1 and the frame W2.
  • however, since the parallax within the allowable range is not suppressed, the stereoscopic effect is not lost.
  • on the other hand, if continuity of the differential value is ensured, the parallax amount shifts smoothly between these subjects in the depth direction.
  • the subject that is continuous in the depth direction of the scene may be a part of the subject included in the image.
  • in the above description, the threshold of the allowable parallax amount has been described as ±m.
  • values having different absolute values may be used for the upper limit value and the lower limit value of the allowable amount of parallax.
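Such asymmetric limits amount to clamping with separate upper and lower bounds, as in this minimal sketch (the numeric limits are hypothetical examples):

```python
# If the upper and lower allowable parallax limits differ in absolute
# value, the target clamp simply uses each bound separately.

def clamp_parallax(p, lower=-2.0, upper=3.0):
    return max(lower, min(upper, p))

if __name__ == "__main__":
    print([clamp_parallax(p) for p in (-5.0, 0.0, 2.5, 4.0)])
```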
  • Each processing flow described in this embodiment is executed by a control program that controls the control unit.
  • The control program is recorded in a built-in nonvolatile memory and is loaded into the work memory as appropriate to execute each process.
  • Alternatively, a control program recorded on a server may be transmitted to each device via a network and loaded into the work memory to execute each process.
  • Alternatively, a control program recorded on a server may be executed on the server, and each device may execute processing in accordance with control signals transmitted via the network.
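As a concrete illustration of the lookup-table approach above, here is a minimal sketch in Python. The table contents, the key (maximum measured parallax in pixels), and the linear interpolation between rows are all hypothetical assumptions for this example; the text only states that lookup tables 2310 are used to determine a value of the stereoscopic adjustment parameter C, possibly with separate tables per imaging condition.

```python
# Hedged sketch: one plausible shape for a lookup table 2310. All names,
# table values, and the linear-interpolation step are illustrative
# assumptions, not the patent's actual data.
import bisect

# Hypothetical table: maximum unadjusted parallax (pixels) -> value of the
# stereoscopic adjustment parameter C. Larger scene parallax calls for
# stronger suppression, i.e. a smaller C. Separate tables could be stored
# per imaging condition (aperture value, focal length, ...).
LOOKUP_TABLE_2310 = [
    (2.0, 1.0),   # parallax already small: leave untouched
    (5.0, 0.8),
    (10.0, 0.5),
    (20.0, 0.25),
]

def determine_C(max_parallax_px: float) -> float:
    """Pick C for the largest measured parallax, interpolating linearly
    between table rows and clamping at the table ends."""
    keys = [k for k, _ in LOOKUP_TABLE_2310]
    if max_parallax_px <= keys[0]:
        return LOOKUP_TABLE_2310[0][1]
    if max_parallax_px >= keys[-1]:
        return LOOKUP_TABLE_2310[-1][1]
    i = bisect.bisect_left(keys, max_parallax_px)
    (k0, c0), (k1, c1) = LOOKUP_TABLE_2310[i - 1], LOOKUP_TABLE_2310[i]
    t = (max_parallax_px - k0) / (k1 - k0)
    return c0 + t * (c1 - c0)
```

A stored function of the adjusted parallax amount curve 1623 would play the same role as this table, trading memory for a closed-form evaluation.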

Abstract

Stereo image data captured by a stereo imaging device may exhibit extreme parallax between the left and right images due to the placement of subjects in a scene, and a viewer may feel discomfort or fatigue in such a case. Accordingly, provided is an image processing device equipped with: an acquisition unit that acquires image data; a calculation unit that calculates the amount of parallax for each of a plurality of subjects included in an image of the image data; a determination unit that determines, for each of the plurality of subjects and in accordance with the calculated parallax amounts, an adjustment parameter value that is applied when parallax image data is generated from the image data and that adjusts the amount of parallax; and a generation unit that applies the adjustment parameter values to the image data and generates parallax image data forming images having mutually adjusted parallax amounts.

Description

Image processing apparatus, imaging apparatus, and image processing program
The present invention relates to an image processing device, an imaging device, and an image processing program.
There is known a stereo imaging device that acquires a stereo image composed of a right-eye image and a left-eye image using two photographic optical systems.
[Prior Art Documents]
[Patent Literature]
[Patent Document 1] JP-A-8-47001
Stereo image data captured by a stereo imaging device may exhibit extreme parallax between the left and right images due to the placement of subjects in the scene, and viewers may feel discomfort or fatigue when viewing such images. On the other hand, even when the parallax amount is adjusted to suppress such extreme parallax, there is a demand to preserve the stereoscopic effect for the main subjects.
An image processing device according to a first aspect of the present invention includes: an acquisition unit that acquires image data; a calculation unit that calculates the parallax amount of each of a plurality of subjects included in an image of the image data; a determination unit that determines, for each of the plurality of subjects and in accordance with the calculated parallax amount, an adjustment parameter value for adjusting the parallax amount, applied when parallax image data is generated from the image data; and a generation unit that applies the adjustment parameter values to the image data and generates parallax image data forming images having mutually adjusted parallax amounts.
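The four units of the first aspect can be pictured as plain functions operating on arrays. This is a hedged illustration, not the patented implementation: the disparity measure, the per-subject masks, and the clamp-to-target rule for the adjustment parameter value are all assumptions made for the example.

```python
# Hedged sketch of the calculation, determination, and generation units,
# using NumPy arrays as stand-ins for image data.
import numpy as np

def calculate_parallax(left, right, subject_masks):
    """Calculation unit: mean signed disparity per subject region.
    The per-pixel subtraction is a placeholder for real block matching."""
    disparity = left - right
    return [float(disparity[m].mean()) for m in subject_masks]

def determine_parameters(parallaxes, target=3.0):
    """Determination unit: one adjustment parameter value per subject,
    shrinking any parallax whose magnitude exceeds the target (assumed
    rule; the target value 3.0 is hypothetical)."""
    return [min(1.0, target / abs(p)) if abs(p) > 1e-9 else 1.0
            for p in parallaxes]

def generate_adjusted(parallaxes, params):
    """Generation unit (schematic): the adjusted parallax each subject
    would carry in the output parallax image data."""
    return [p * c for p, c in zip(parallaxes, params)]
```

Note that the parameter is determined per subject, so a subject already within the target range keeps its full parallax while an extreme one is compressed.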
An image processing program according to a second aspect of the present invention causes a computer to execute: an acquisition step of acquiring image data; a calculation step of calculating the parallax amount of each of a plurality of subjects included in an image of the image data; a determination step of determining, for each of the plurality of subjects and in accordance with the calculated parallax amount, an adjustment parameter value for adjusting the parallax amount, applied when parallax image data is generated from the image data; and a generation step of applying the adjustment parameter values to the image data and generating parallax image data forming images having mutually adjusted parallax amounts.
Note that the above summary of the invention does not enumerate all the necessary features of the present invention. Sub-combinations of these feature groups can also constitute inventions.
[Brief Description of Drawings]
FIG. 1 is a diagram illustrating the configuration of a digital camera according to an embodiment of the present invention.
FIG. 2 is a conceptual diagram showing an enlarged portion of the image sensor.
FIG. 3 is a diagram illustrating an example of generation processing of 2D image data and parallax image data.
FIG. 4 is a diagram illustrating the concept of defocus.
FIG. 5 is a diagram showing the light intensity distributions output by the parallax pixels.
FIG. 6 is a diagram showing light intensity distributions for explaining the concept of the adjusted parallax amount.
FIG. 7 is a diagram illustrating generation processing of color parallax plane data.
FIG. 8 is a diagram schematically showing the relationship between the parallax amount and the value of the stereoscopic adjustment parameter.
FIG. 9 is a diagram showing a lookup table.
FIG. 10 is a diagram showing the relationship between the parallax amount and its target value when the shape of the parallax amount curve differs depending on shooting conditions.
FIG. 11 is a diagram illustrating changes in the RGB light intensity distributions.
FIG. 12 is a diagram showing the relationship between the viewer's convergence angle and the parallax amount.
FIG. 13 is a diagram schematically showing the relationship between contrast, which indicates image sharpness, and the parallax amount.
FIG. 14 is a diagram schematically showing the relationship between subject distribution and the parallax amount.
FIG. 15 is a diagram schematically showing the relationship between aperture value and the parallax amount.
FIG. 16 is a diagram schematically showing the concept of focus shift.
FIG. 17 is a diagram schematically showing the relationships among the parallax amount, the value of the stereoscopic adjustment parameter, the aperture value, and so on.
FIG. 18 is a diagram illustrating subject designation.
FIG. 19 is a processing flow for moving image shooting according to the first embodiment.
FIG. 20 is a processing flow up to generation of parallax color image data.
FIG. 21 is a processing flow for moving image shooting according to the second embodiment.
FIG. 22 is a diagram illustrating a preferable aperture shape.
FIG. 23 is a diagram illustrating cooperation between the digital camera and a TV monitor.
FIG. 24 is a diagram schematically showing the relationship between the parallax amount and the value of the stereoscopic adjustment parameter.
Hereinafter, the present invention will be described through embodiments thereof; however, the following embodiments do not limit the invention according to the claims. Moreover, not all combinations of features described in the embodiments are essential to the solving means of the invention.
The digital camera according to the present embodiment, which is one form of the imaging device, is configured to generate images of a plurality of viewpoints of one scene with a single shot. Images having mutually different viewpoints are called parallax images. In the present embodiment, the case of generating a right parallax image and a left parallax image from two viewpoints corresponding to the right eye and the left eye is described in particular. The digital camera in the present embodiment can also generate a parallax-free image from a central viewpoint together with the parallax images.
FIG. 1 is a diagram illustrating the configuration of a digital camera 10 according to an embodiment of the present invention. The digital camera 10 includes a photographic lens 20 as a photographic optical system, and guides the subject light flux incident along the optical axis 21 to the image sensor 100. The photographic lens 20 may be an interchangeable lens that can be attached to and detached from the digital camera 10. The digital camera 10 includes an image sensor 100, a control unit 201, an A/D conversion circuit 202, a memory 203, a drive unit 204, an image processing unit 205, a memory card IF 207, an operation unit 208, a display unit 209, and an LCD drive circuit 210.
As shown in the figure, the direction parallel to the optical axis 21 toward the image sensor 100 is defined as the Z-axis plus direction, the direction toward the front of the drawing in the plane orthogonal to the Z-axis is defined as the X-axis plus direction, and the upward direction on the drawing is defined as the Y-axis plus direction. In several subsequent figures, coordinate axes are displayed with reference to the coordinate axes of FIG. 1 so that the orientation of each figure can be understood.
The photographic lens 20 is composed of a plurality of optical lens groups and forms an image of the subject light flux from the scene in the vicinity of its focal plane. In FIG. 1, for convenience of explanation, the photographic lens 20 is represented by a single virtual lens arranged in the vicinity of the pupil. In the vicinity of the pupil, a diaphragm 22 that restricts the incident light flux concentrically about the optical axis 21 is disposed.
The image sensor 100 is disposed in the vicinity of the focal plane of the photographic lens 20. The image sensor 100 is an image sensor, such as a CCD or CMOS sensor, in which a plurality of photoelectric conversion elements are two-dimensionally arranged. Timing-controlled by the drive unit 204, the image sensor 100 converts the subject image formed on its light-receiving surface into an image signal and outputs it to the A/D conversion circuit 202.
The A/D conversion circuit 202 converts the image signal output from the image sensor 100 into a digital image signal and outputs it to the memory 203. The image processing unit 205 performs various image processing on the digital image signal using the memory 203 as a workspace, and generates captured image data. As will be described later, the captured image data includes reference image data generated from the output of the non-parallax pixels of the image sensor 100 and parallax image data generated from the output of the parallax pixels of the image sensor 100.
The control unit 201 controls the digital camera 10 in an integrated manner. For example, it adjusts the aperture of the diaphragm 22 according to the set aperture value, and advances and retracts the photographic lens 20 in the optical axis direction according to the AF evaluation value. It also detects the position of the photographic lens 20 to obtain the focal length and focus lens position of the photographic lens 20. Further, it transmits timing control signals to the drive unit 204 and manages the series of imaging controls up to the point at which the image signal output from the image sensor 100 is processed into captured image data by the image processing unit 205, thereby acquiring the captured image data.
The control unit 201 also includes a depth information detection unit 230. The depth information detection unit 230 detects the subject distribution in the depth direction of the scene. Specifically, the control unit 201 detects the subject distribution from the defocus amount of each subdivided region, using the defocus information used for autofocus. The defocus information may be obtained from the output of a dedicated phase difference sensor, or from the output of the parallax pixels of the image sensor 100. When the output of the parallax pixels is used, the parallax image data processed by the image processing unit 205 can also be used. Alternatively, without using defocus information, the subject distribution can be detected by advancing and retracting the focus lens and calculating an AF evaluation value by the contrast AF method for each subdivided region.
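The depth detection step can be pictured as follows: assuming a per-region defocus map has already been obtained (from a phase difference sensor or from the parallax pixels), a histogram over defocus values serves as a simple stand-in for the subject distribution in the depth direction. Region granularity, the sign convention, and the bin edges are illustrative assumptions, not values from the specification.

```python
# Hedged sketch of the depth information detection unit 230: regions with
# similar defocus amounts are treated as lying at a similar depth.
import numpy as np

def subject_distribution(defocus_map, bins):
    """Count subdivided regions per depth bin."""
    hist, _ = np.histogram(defocus_map, bins=bins)
    return hist

# Hypothetical per-region defocus amounts for a 3x3 subdivision.
defocus = np.array([[-2.0, -1.9, 0.1],
                    [ 0.0,  0.2, 3.1],
                    [ 3.0,  2.9, 0.1]])
hist = subject_distribution(defocus, bins=[-3, -1, 1, 4])
# Three clusters emerge: behind focus, near focus, and in front
# (sign convention assumed).
```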
As described above, the image processing unit 205 processes the image signal output from the image sensor 100 to generate captured image data. The image processing unit 205 also includes a calculation unit 231, a determination unit 232, a parallax image data generation unit 233, and a moving image generation unit 234.
The calculation unit 231 calculates the parallax amount of each subject included in the image of the captured image data, that is, the unadjusted parallax amount, from the left and right parallax image data described later. The determination unit 232 determines, according to the parallax amount of each subject calculated by the calculation unit 231, a change condition for changing that parallax amount. More specifically, the determination unit 232 determines the value of the stereoscopic adjustment parameter so that the parallax amount between the output parallax images falls within the target parallax amount. This stereoscopic adjustment parameter is a parameter applied to adjust the parallax amount of the parallax image data when the parallax image data is generated from the captured image data.
The parallax image data generation unit 233 applies the stereoscopic adjustment parameter to the captured image data to generate parallax image data forming images having mutually adjusted parallax amounts. The calculation unit 231, the determination unit 232, and the parallax image data generation unit 233 are described in detail later. The moving image generation unit 234 joins the parallax image data together to generate a 3D moving image file.
The image processing unit 205 also performs general image processing functions, such as adjusting image data according to the selected image format. The generated captured image data is converted into a display signal by the LCD drive circuit 210 and displayed on the display unit 209. It is also recorded on the memory card 220 attached to the memory card IF 207.
The operation unit 208 functions as part of a reception unit that receives user operations and transmits instructions to the control unit 201. The operation unit 208 includes a plurality of operation members, such as a shutter button that receives a shooting start instruction.
FIG. 2 is a conceptual diagram showing an enlarged portion of the image sensor 100. In the pixel region, 20 million or more pixels are arranged in a matrix. In the present embodiment, 64 pixels consisting of adjacent 8 × 8 pixels form one basic lattice 110. The basic lattice 110 contains four Bayer arrays in the Y-axis direction and four in the X-axis direction, each Bayer array having 2 × 2 = 4 pixels as its basic unit. As shown in the figure, in the Bayer array, green filters (G filters) are arranged on the upper-left and lower-right pixels, a blue filter (B filter) on the lower-left pixel, and a red filter (R filter) on the upper-right pixel.
The basic lattice 110 includes parallax pixels and non-parallax pixels. A parallax pixel is a pixel that receives, of the incident light flux transmitted through the photographic lens 20, a partial light flux deviated from the optical axis of the photographic lens 20. A parallax pixel is provided with an aperture mask having an opening deviated from the pixel center so as to transmit only that partial light flux. The aperture mask is provided, for example, overlaid on the color filter. In the present embodiment, the aperture mask defines two types of parallax pixels: parallax Lt pixels, for which the partial light flux reaches the left side with respect to the pixel center, and parallax Rt pixels, for which the partial light flux reaches the right side with respect to the pixel center. A non-parallax pixel, on the other hand, is a pixel without an aperture mask, which receives the entire incident light flux transmitted through the photographic lens 20.
In receiving a partial light flux deviated from the optical axis, a parallax pixel is not limited to the aperture mask and can adopt various configurations, such as a selective reflection film in which a light-receiving region and a reflective region are separated, or a deviated photodiode region. That is, a parallax pixel only needs to be configured to receive, of the incident light flux transmitted through the photographic lens 20, a partial light flux deviated from the optical axis.
Pixels in the basic lattice 110 are denoted by PIJ. For example, the upper-left pixel is P11 and the upper-right pixel is P81. As shown in the figure, the parallax pixels are arranged as follows.
P11: parallax Lt pixel + G filter (= G(Lt))
P51: parallax Rt pixel + G filter (= G(Rt))
P32: parallax Lt pixel + B filter (= B(Lt))
P63: parallax Rt pixel + R filter (= R(Rt))
P15: parallax Rt pixel + G filter (= G(Rt))
P55: parallax Lt pixel + G filter (= G(Lt))
P76: parallax Rt pixel + B filter (= B(Rt))
P27: parallax Lt pixel + R filter (= R(Lt))
The other pixels are non-parallax pixels, each being one of non-parallax pixel + R filter, non-parallax pixel + G filter, or non-parallax pixel + B filter.
When the image sensor 100 is viewed as a whole, the parallax pixels are classified into one of a first group having G filters, a second group having R filters, and a third group having B filters, and the basic lattice 110 includes at least one parallax Lt pixel and at least one parallax Rt pixel belonging to each group. As in the illustrated example, these parallax pixels and non-parallax pixels are preferably arranged with randomness within the basic lattice 110. By arranging them with randomness, RGB color information can be acquired as the output of the parallax pixels without biasing the spatial resolution of any color component, so high-quality parallax image data can be obtained.
Next, the concept of processing for generating 2D image data and parallax image data from the captured image data output from the image sensor 100 will be described. FIG. 3 is a diagram illustrating an example of the generation processing of 2D image data and parallax image data.
As can be seen from the arrangement of parallax pixels and non-parallax pixels in the basic lattice 110, simply enumerating the outputs of the image sensor 100 in their pixel-arrangement order does not yield image data representing a specific image. Only when the pixel outputs of the image sensor 100 are separated and gathered for each group of identically characterized pixels is image data representing one image in accordance with those characteristics formed. For example, when the left and right parallax pixels are each gathered together, left and right parallax image data having parallax with respect to each other are obtained. Each set of image data separated and gathered in this way for each group of identically characterized pixels is called plane data.
The image processing unit 205 receives RAW original image data in which the output values (pixel values) are enumerated in the pixel-arrangement order of the image sensor 100, and executes plane separation processing to separate it into a plurality of sets of plane data. The left column of the figure shows an example of the generation processing of 2D-RGB plane data as 2D image data.
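The plane separation step can be sketched with a label map: pixels belonging to the requested group keep their values, and every other position becomes a vacancy (marked NaN) to be filled by interpolation later. The 4 × 4 mosaic and label values below are illustrative, not the actual 8 × 8 basic lattice 110.

```python
# Hedged sketch of plane separation by pixel group.
import numpy as np

def separate_plane(raw, labels, target):
    """Keep only pixels whose label matches `target`; all other
    positions become vacancies (NaN)."""
    return np.where(labels == target, raw.astype(float), np.nan)

raw = np.arange(16.0).reshape(4, 4)          # mock RAW mosaic
labels = np.array([["GLt", "N", "N", "GRt"],  # "N" = non-parallax, mock layout
                   ["N",   "N", "N", "N"],
                   ["N",   "N", "N", "N"],
                   ["GRt", "N", "N", "GLt"]])
glt = separate_plane(raw, labels, "GLt")      # sparse G(Lt) plane with NaNs
```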
In generating the 2D-RGB plane data, the image processing unit 205 first removes the pixel values of the parallax pixels, leaving vacancies. The pixel values of the vacancies are then calculated by interpolation using the pixel values of surrounding pixels. For example, the pixel value of vacancy P11 is calculated by averaging the pixel values of P-1-1, P2-1, P-12, and P22, which are the pixel values of the diagonally adjacent G filter pixels. Likewise, the pixel value of vacancy P63 is calculated by averaging the pixel values of P43, P61, P83, and P65, the pixel values of the R filter pixels adjacent with one pixel skipped vertically and horizontally. Similarly, the pixel value of vacancy P76 is calculated by averaging the pixel values of P56, P74, P96, and P78, the pixel values of the B filter pixels adjacent with one pixel skipped vertically and horizontally.
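The diagonal-neighbour averaging described above can be sketched as follows. Border handling and the choice of neighbour offsets per colour are simplified assumptions; a real implementation would use the skip-one offsets for R and B vacancies as described in the text.

```python
# Hedged sketch of vacancy interpolation for a G vacancy: average the
# four diagonal neighbours, skipping any neighbour that is itself missing.
import numpy as np

def fill_diagonal_average(plane, y, x):
    """Average the four diagonal neighbours of a vacancy at (y, x)."""
    neigh = [plane[y + dy, x + dx] for dy in (-1, 1) for dx in (-1, 1)]
    vals = [v for v in neigh if not np.isnan(v)]
    return sum(vals) / len(vals)

g = np.array([[1.0, np.nan, 3.0],
              [np.nan, np.nan, np.nan],
              [5.0, np.nan, 7.0]])
g[1, 1] = fill_diagonal_average(g, 1, 1)  # (1 + 3 + 5 + 7) / 4 = 4.0
```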
Since the 2D-RGB plane data interpolated in this way is equivalent to the output of a normal image sensor having a Bayer array, various processes can thereafter be performed on it as 2D image data. That is, known Bayer interpolation is performed to generate color image data in which RGB data is complete for each pixel. The image processing unit 205 performs image processing as a general 2D image according to a predetermined format, such as JPEG when generating still image data or MPEG when generating moving image data.
In the present embodiment, the image processing unit 205 further separates the 2D-RGB plane data by color and applies the interpolation processing described above to generate each set of plane data as reference image data. That is, it generates three sets: Gn plane data as green reference image plane data, Rn plane data as red reference image plane data, and Bn plane data as blue reference image plane data.
The right column of the figure shows an example of the generation processing of two sets of G plane data, two sets of R plane data, and two sets of B plane data as parallax pixel data. The two sets of G plane data are GLt plane data as left parallax image data and GRt plane data as right parallax image data; the two sets of R plane data are RLt plane data as left parallax image data and RRt plane data as right parallax image data; and the two sets of B plane data are BLt plane data as left parallax image data and BRt plane data as right parallax image data.
In generating the GLt plane data, the image processing unit 205 removes all pixel values other than those of the G(Lt) pixels from the outputs of the image sensor 100, leaving vacancies. Two pixel values, P11 and P55, then remain in the basic lattice 110. The basic lattice 110 is therefore divided into four equal parts vertically and horizontally; the upper-left 16 pixels are represented by the output value of P11, and the lower-right 16 pixels by the output value of P55. The upper-right 16 pixels and the lower-left 16 pixels are interpolated by averaging the neighboring representative values adjacent vertically and horizontally. That is, the GLt plane data has one value per unit of 16 pixels.
Similarly, in generating the GRt plane data, the image processing unit 205 removes all pixel values other than those of the G(Rt) pixels from the outputs of the image sensor 100, leaving vacancies. Two pixel values, P51 and P15, then remain in the basic lattice 110. The basic lattice 110 is therefore divided into four equal parts vertically and horizontally; the upper-right 16 pixels are represented by the output value of P51, and the lower-left 16 pixels by the output value of P15. The upper-left 16 pixels and the lower-right 16 pixels are interpolated by averaging the neighboring representative values adjacent vertically and horizontally. That is, the GRt plane data has one value per unit of 16 pixels. In this way, GLt plane data and GRt plane data having a lower resolution than the 2D-RGB plane data can be generated.
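The quadrant scheme for the GLt plane can be sketched per basic lattice as follows. Within one isolated lattice only the two stored representatives exist, so the average used for the remaining quadrants degenerates to their mean; in a full image, representatives from adjacent lattices would also contribute to the vertical/horizontal averaging. Names and values are illustrative.

```python
# Hedged sketch: one basic lattice of the GLt plane as a 2x2 grid of
# 16-pixel-block values, built from the P11 and P55 outputs.
import numpy as np

def glt_quadrants(p11, p55):
    """Upper-left block = P11, lower-right block = P55; the other two
    blocks are interpolated from the adjacent representatives (here,
    just the mean of the two, since neighbouring lattices are omitted)."""
    fill = (p11 + p55) / 2.0
    return np.array([[p11, fill],
                     [fill, p55]])

blocks = glt_quadrants(100.0, 140.0)
```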
In generating the RLt plane data, the image processing unit 205 removes all pixel values other than those of the R(Lt) pixels from the outputs of the image sensor 100, leaving vacancies. The pixel value of P27 then remains in the basic lattice 110 and is used as the representative value for the 64 pixels of the basic lattice 110. Similarly, in generating the RRt plane data, the image processing unit 205 removes all pixel values other than those of the R(Rt) pixels, leaving vacancies; the pixel value of P63 remains and is used as the representative value for the 64 pixels of the basic lattice 110. In this way, RLt plane data and RRt plane data having a lower resolution than the 2D-RGB plane data are generated. In this case, the resolution of the RLt plane data and RRt plane data is lower than that of the GLt plane data and GRt plane data.
 BLtプレーンデータを生成するにあたり、画像処理部205は、撮像素子100の全出力値からB(Lt)画素の画素値以外の画素値を除去して空格子とする。すると、基本格子110には、P32の画素値が残る。この画素値を基本格子110の64画素分の代表値とする。同様に、BRtプレーンデータを生成するにあたり、画像処理部205は、撮像素子100の全出力値からB(Rt)画素の画素値以外の画素値を除去して空格子とする。すると、基本格子110には、P76の画素値が残る。この画素値を基本格子110の64画素分の代表値とする。このようにして、2D-RGBプレーンデータよりは解像度の低いBLtプレーンデータとBRtプレーンデータが生成される。この場合、BLtプレーンデータとBRtプレーンデータの解像度は、GLtプレーンデータとGRtプレーンデータの解像度よりも低く、RLtプレーンデータとRRtプレーンデータの解像度と同等である。 In generating the BLt plane data, the image processing unit 205 removes pixel values other than the pixel values of the B (Lt) pixels from all output values of the image sensor 100 to form a vacant lattice. Then, the primitive lattice 110, the pixel values of P 32 remains. This pixel value is set as a representative value for 64 pixels of the basic grid 110. Similarly, when generating the BRt plane data, the image processing unit 205 removes pixel values other than the pixel value of the B (Rt) pixel from all the output values of the image sensor 100 to obtain an empty grid. Then, the primitive lattice 110, the pixel values of P 76 remains. This pixel value is set as a representative value for 64 pixels of the basic grid 110. In this way, BLt plane data and BRt plane data having a resolution lower than that of 2D-RGB plane data are generated. In this case, the resolution of the BLt plane data and the BRt plane data is lower than the resolution of the GLt plane data and the GRt plane data, and is equal to the resolution of the RLt plane data and the RRt plane data.
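The representative-value scheme above (two surviving parallax pixels per basic grid, quadrant assignment, and averaging interpolation for the empty quadrants) can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name and the per-quadrant return shape are hypothetical.

```python
def grt_quadrant_values(p51, p15):
    """Sketch of the G(Rt)-plane scheme described above: the basic grid is
    split into four 16-pixel quadrants; the upper-right quadrant is
    represented by the output of P51, the lower-left by P15, and the two
    remaining quadrants are interpolated as the average of the adjacent
    representative values.
    Returns (upper_left, upper_right, lower_left, lower_right)."""
    interpolated = (p51 + p15) / 2.0  # mean of the neighboring representatives
    return (interpolated, p51, p15, interpolated)
```

For example, with outputs 100 for P51 and 60 for P15, the two interpolated quadrants each take the value 80.0.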
 In the present embodiment, image processing may be applied to the output image data so that the amount of parallax between the generated images falls within a target amount of parallax. In this case, the image processing unit 205 uses these plane data to generate left-viewpoint color image data and right-viewpoint color image data. In particular, by introducing a stereoscopic adjustment parameter, it generates color image data in which the parallax amount of the 3D image is adjusted while the blur amount of the 2D color image is maintained. Before the specific processing is described, the generation principle is explained first.
 FIG. 4 illustrates the concept of defocus. A parallax Lt pixel and a parallax Rt pixel receive subject light flux arriving from one of two parallax virtual pupils set symmetrically about the optical axis as partial regions of the lens pupil. In the optical system of the present embodiment, the actual subject light flux passes through the entire lens pupil, so the light intensity distributions corresponding to the parallax virtual pupils are not distinguished from each other before the light reaches the parallax pixels. Each parallax pixel, however, outputs an image signal obtained by photoelectrically converting only the partial light flux that passed through its parallax virtual pupil, owing to the action of its aperture mask. The pixel value distribution given by the outputs of the parallax pixels can therefore be regarded as proportional to the light intensity distribution of the partial light flux that passed through the corresponding parallax virtual pupil.
 As shown in FIG. 4(a), when the object point constituting the subject is at the focal position, the output of the parallax pixels shows a steep pixel value distribution centered on the pixel at the corresponding image point, regardless of which parallax virtual pupil the subject light flux passed through. If parallax Lt pixels are arrayed near the image point, the output value of the pixel corresponding to the image point is the largest and the output values of the surrounding pixels fall off sharply. The same holds when parallax Rt pixels are arrayed near the image point. That is, whichever parallax virtual pupil the subject light flux passes through, the distribution peaks at the pixel corresponding to the image point and falls off sharply around it, and the two distributions coincide with each other.
 On the other hand, as shown in FIG. 4(b), when the object point deviates from the focal position, the peak of the pixel value distribution of the parallax Lt pixels appears at a position displaced in one direction from the pixel corresponding to the image point, and its output value decreases, compared with the case where the object point is at the focal position. The width of the pixels having output values also broadens. The peak of the pixel value distribution of the parallax Rt pixels appears at a position displaced from the image-point pixel by an equal distance in the direction opposite to that of the Lt pixels; its output value likewise decreases and its width likewise broadens. That is, identical pixel value distributions, gentler than when the object point is at the focal position, appear separated from each other by equal distances. As shown in FIG. 4(c), when the object point deviates further from the focal position, identical, still gentler pixel value distributions appear even farther apart than in FIG. 4(b). In short, the larger the deviation of the object point from the focal position, the larger the blur amount and the parallax amount. Put differently, the blur amount and the parallax amount change together with the defocus: they are in one-to-one correspondence.
 FIGS. 4(b) and 4(c) show the case where the object point shifts away from the focal position in the receding direction; when the object point shifts from the focal position toward the camera, the relative positional relationship between the pixel value distribution of the parallax Lt pixels and that of the parallax Rt pixels is reversed, as shown in FIG. 4(d), compared with FIGS. 4(b) and 4(c). Owing to this defocus relationship, when viewing the parallax images the viewer perceives a subject behind the focal position as far away and a subject in front of it as nearby.
 FIG. 5 graphs the changes in pixel value distribution described with FIGS. 4(b) and 4(c). The horizontal axis represents pixel position, with the center corresponding to the image point; the vertical axis represents the output value (pixel value) of each pixel, which, as described above, is substantially proportional to light intensity.
 Distribution curves 1804 and 1805 represent the pixel value distributions of the parallax Lt and parallax Rt pixels in FIG. 4(b), respectively. As the figure shows, these distributions are mirror-symmetric about the center position. Their sum, composite distribution curve 1806, has a shape approximately similar to the pixel value distribution of a no-parallax pixel for the situation of FIG. 4(b), that is, the distribution obtained when the entire subject light flux is received.
 Distribution curves 1807 and 1808 represent the pixel value distributions of the parallax Lt and parallax Rt pixels in FIG. 4(c), respectively. These distributions are also mirror-symmetric about the center position, and their sum, composite distribution curve 1809, is approximately similar in shape to the no-parallax pixel value distribution for the situation of FIG. 4(c).
 When image processing using the stereoscopic adjustment parameter is performed in the present embodiment, a virtual pixel value distribution is created from the parallax Lt and parallax Rt pixel values of such distributions, actually acquired as output values of the image sensor 100 and with their vacancies interpolated. In doing so, the parallax amount, expressed as the distance between the peaks, is adjusted while the blur amount, expressed by the spread of the pixel value distribution, is roughly maintained. That is, the image processing unit 205 generates an image whose parallax amount lies between that of the 2D image generated from the no-parallax pixels and that of the 3D image generated from the parallax pixels, while keeping the blur amount of the 2D image nearly unchanged. FIG. 6 shows pixel value distributions for explaining the concept of the adjusted parallax amount.
 The Lt distribution curve 1901 and Rt distribution curve 1902 drawn with solid lines plot the actual pixel values of the Lt plane data and Rt plane data; they correspond, for example, to distribution curves 1804 and 1805 in FIG. 5. The distance between the peaks of curves 1901 and 1902 represents the 3D parallax amount; the larger this distance, the stronger the stereoscopic effect on image reproduction.
 The 2D distribution curve 1903, obtained by adding 50% of the Lt distribution curve 1901 and 50% of the Rt distribution curve 1902, is a convex shape with no left-right bias; it corresponds to the composite distribution curve 1806 of FIG. 5 at half its height. An image based on this distribution is therefore a 2D image with zero parallax.
 The adjusted Lt distribution curve 1905 is the sum of 80% of the Lt distribution curve 1901 and 20% of the Rt distribution curve 1902. Its peak is displaced toward the center relative to the peak of curve 1901 by the amount of the added Rt component. Similarly, the adjusted Rt distribution curve 1906 is the sum of 20% of curve 1901 and 80% of curve 1902, and its peak is displaced toward the center relative to the peak of curve 1902 by the amount of the added Lt component.
 Accordingly, the adjusted parallax amount, represented by the distance between the peaks of adjusted curves 1905 and 1906, is smaller than the 3D parallax amount, so the stereoscopic effect on reproduction is moderated. On the other hand, the spread of each of curves 1905 and 1906 is equivalent to that of the 2D distribution curve 1903, so the blur amount can be said to equal that of the 2D image.
 That is, the adjusted parallax amount can be controlled by the proportions in which the Lt distribution curve 1901 and the Rt distribution curve 1902 are added. Applying this adjusted pixel value distribution to each plane of the color image data generated from the no-parallax pixels yields left-viewpoint and right-viewpoint color image data giving a stereoscopic effect different from that of the parallax image data generated from the parallax pixels.
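The 80/20 mixing of curves 1901 and 1902 into the adjusted curves 1905 and 1906 can be illustrated numerically. This is a sketch under assumed Gaussian-shaped distributions (the function names and the specific shapes are illustrative, not from the disclosure); the centroid of each blended distribution moves toward the center, shrinking the peak separation from 4 pixels to about 2.4, while the spread is unchanged.

```python
import math

def gauss(center, n=25, sigma2=8.0):
    """Illustrative bell-shaped pixel value distribution around `center`."""
    return [math.exp(-(x - center) ** 2 / sigma2) for x in range(n)]

def blend(lt, rt, c):
    """Mix the Lt/Rt distributions with stereoscopic adjustment weight c
    (0.5 <= c <= 1): c = 1 keeps full parallax, c = 0.5 gives the 2D case."""
    left = [c * l + (1 - c) * r for l, r in zip(lt, rt)]
    right = [(1 - c) * l + c * r for l, r in zip(lt, rt)]
    return left, right

def centroid(xs):
    """Intensity-weighted mean position, a proxy for the peak location."""
    return sum(i * v for i, v in enumerate(xs)) / sum(xs)

lt, rt = gauss(10), gauss(14)      # Lt peaks near 10, Rt near 14: 3D parallax ~4
adj_l, adj_r = blend(lt, rt, 0.8)  # 80%/20% mix, as for curves 1905/1906
# centroids move to ~10.8 and ~13.2: adjusted parallax ~2.4, blur spread kept
```

With c = 0.5 the two blended distributions coincide, reproducing the zero-parallax 2D distribution curve 1903.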
 In the present embodiment, left-viewpoint color image data and right-viewpoint color image data are generated from the nine sets of plane data described with reference to FIG. 3. The left-viewpoint color image data consists of three color parallax planes corresponding to the left viewpoint: the RLt_c plane data (red), the GLt_c plane data (green), and the BLt_c plane data (blue). Similarly, the right-viewpoint color image data consists of three color parallax planes corresponding to the right viewpoint: the RRt_c plane data (red), the GRt_c plane data (green), and the BRt_c plane data (blue).
 FIG. 7 illustrates the generation of color parallax plane data, in particular the generation of the RLt_c plane data and RRt_c plane data, the red parallax planes among the color parallax planes.
 The red parallax planes are generated using the pixel values of the Rn plane data described with reference to FIG. 3 together with the pixel values of the RLt plane data and the RRt plane data. Specifically, to calculate the pixel value RLt_cmn at a target pixel position (i_m, j_n) of the RLt_c plane data, the parallax image data generation unit 233 of the image processing unit 205 first extracts the pixel value Rn_mn at the same pixel position (i_m, j_n) of the Rn plane data. It then extracts the pixel value RLt_mn at the same position of the RLt plane data and the pixel value RRt_mn at the same position of the RRt plane data. The parallax image data generation unit 233 calculates the pixel value RLt_cmn by multiplying Rn_mn by a value obtained by distributing the pixel values RLt_mn and RRt_mn according to the value of the stereoscopic adjustment parameter C. Specifically, it is calculated by the following equation (1).
Figure JPOXMLDOC01-appb-M000001
 Similarly, to calculate the pixel value RRt_cmn at the target pixel position (i_m, j_n) of the RRt_c plane data, the parallax image data generation unit 233 multiplies the extracted pixel value Rn_mn by a value obtained by distributing the pixel values RLt_mn and RRt_mn according to the value of the stereoscopic adjustment parameter C. Specifically, it is calculated by the following equation (2).
Figure JPOXMLDOC01-appb-M000002
 The parallax image data generation unit 233 executes this processing sequentially from the pixel (1, 1) at the upper-left corner to the pixel (i_0, j_0) at the lower-right corner.
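Equations (1) and (2) survive in this text only as image placeholders, so their exact form is not reproduced here. The sketch below therefore assumes one plausible form consistent with the surrounding description, in which Rn_mn is multiplied by a ratio that distributes RLt_mn and RRt_mn by the parameter C; the function name and the fallback for zero-valued pixels are the author's assumptions, not from the disclosure.

```python
def red_parallax_planes(Rn, RLt, RRt, c):
    """Per-pixel sketch of generating the RLt_c / RRt_c planes.
    Assumed form (NOT the verbatim equations (1)/(2)):
        RLt_c = 2 * Rn * (c*RLt + (1-c)*RRt) / (RLt + RRt)
        RRt_c = 2 * Rn * ((1-c)*RLt + c*RRt) / (RLt + RRt)
    so that c = 0.5 reproduces the 2D value Rn at every pixel."""
    RLt_c, RRt_c = [], []
    for rn, lt, rt in zip(Rn, RLt, RRt):
        s = lt + rt
        if s == 0:  # assumed fallback: no parallax information, keep 2D value
            RLt_c.append(rn)
            RRt_c.append(rn)
            continue
        RLt_c.append(rn * 2 * (c * lt + (1 - c) * rt) / s)
        RRt_c.append(rn * 2 * ((1 - c) * lt + c * rt) / s)
    return RLt_c, RRt_c
```

Under this assumed form, a pixel with Rn = 100, RLt = 80, RRt = 40 yields RLt_c = RRt_c = 100 at C = 0.5 (the 2D image), while C = 0.8 splits it into 120 and 80, restoring most of the left-right difference.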
 When the generation of the red parallax planes, the RLt_c and RRt_c plane data, is complete, the green parallax planes, the GLt_c and GRt_c plane data, are generated next. Specifically, instead of extracting the pixel value Rn_mn from position (i_m, j_n) of the Rn plane data as in the above description, the pixel value Gn_mn is extracted from the same position of the Gn plane data; instead of the pixel value RLt_mn from the RLt plane data, GLt_mn is extracted from the same position of the GLt plane data; and instead of the pixel value RRt_mn from the RRt plane data, GRt_mn is extracted from the same position of the GRt plane data. The parameter values in equations (1) and (2) are changed accordingly and the same processing is applied.
 Further, when the generation of the green parallax planes, the GLt_c and GRt_c plane data, is complete, the blue parallax planes, the BLt_c and BRt_c plane data, are generated. Specifically, instead of the pixel value Rn_mn from position (i_m, j_n) of the Rn plane data, the pixel value Bn_mn is extracted from the same position of the Bn plane data; instead of RLt_mn from the RLt plane data, BLt_mn is extracted from the same position of the BLt plane data; and instead of RRt_mn from the RRt plane data, BRt_mn is extracted from the same position of the BRt plane data. The parameter values in equations (1) and (2) are changed accordingly and the same processing is applied.
 Through the above processing, the left-viewpoint color image data (RLt_c, GLt_c, and BLt_c plane data) and the right-viewpoint color image data (RRt_c, GRt_c, and BRt_c plane data) are generated. That is, left-viewpoint and right-viewpoint color image data can be obtained by relatively simple processing, as virtual outputs that do not actually exist as pixels of the image sensor 100.
 Moreover, since the value of the stereoscopic adjustment parameter C can be varied within the range 0.5 < C < 1, the magnitude of the parallax amount of the 3D image can be adjusted while the blur amount of the 2D color image from the no-parallax pixels is maintained. If these image data are reproduced on a 3D-capable reproduction device, the viewer of the stereoscopic display panel can therefore enjoy 3D color video whose stereoscopic effect is appropriately moderated. In particular, because the processing is simple, the image data can be generated at high speed, so moving images can also be handled.
 In the present embodiment, however, in the color parallax plane generation processing described above, the value of the stereoscopic adjustment parameter C is determined and used per subject within the range 0.5 < C < 1. Before describing specifically how the value of C is determined, the advantage of generating color parallax plane data using a stereoscopic adjustment parameter C determined per subject is explained first.
 FIG. 8 schematically shows the relationship between the parallax amount and the value of the stereoscopic adjustment parameter. The horizontal axis represents the distance from the digital camera 10, that is, the depth of the scene, and the vertical axis represents the parallax amount.
 The digital camera 10 is focused on the subject located at distance L10 (see the in-focus subject in the figure). In the present embodiment, the threshold of the permissible parallax amount is predetermined as ±m.
 The plural subjects included in the captured image data each have an unadjusted parallax amount in the left and right parallax image data. For example, the captured image of the scene shown in FIG. 8 contains, in addition to the subject at distance L10, two subjects located at distances L20 and L30 (see the near-point subject and far-point subject in the figure). The unadjusted parallax amounts of these three subjects are m10 (m10 = 0), m20 (m20 > m), and m30 (m30 < −m), respectively.
 In the present embodiment, the value of the stereoscopic adjustment parameter C is determined for each subject according to such unadjusted parallax amounts and used to adjust the parallax amount. For example, for the in-focus subject the value of C is determined as C10, and the unadjusted parallax amount m10 is adjusted by this value C10 to the parallax target value m100. Since m10 is the parallax amount of the in-focus subject, however, the unadjusted amount m10 and the target value m100 are both zero.
 For the near-point subject, the value of C is determined as C20, by which the unadjusted parallax amount m20 is adjusted to the target value m200 (m200 ≦ m). For the far-point subject, the value of C is determined as C30, by which the unadjusted parallax amount m30 is adjusted to the target value m300 (m300 ≧ −m).
 Similarly, when another subject is present in the scene of FIG. 8, its unadjusted parallax amount takes the value given by the parallax amount curve 1622 in the figure, and its parallax target value takes the value given by the adjusted parallax amount curve 1623.
 Here, the parallax amount curve 1622 represents the relationship between the distance from the digital camera 10 and the unadjusted parallax amount that would arise for a subject located at each distance. This curve includes regions outside the range −m to +m. For ease of understanding, the portions of curve 1622 outside the range −m to +m are enclosed in frame W1 in the figure, and the portion within the range is enclosed in frame W2.
 The adjusted parallax amount curve 1623 represents the relationship between the distance from the digital camera 10 and the parallax target value to be produced for a subject located at each distance. The target values given by curve 1623 fall within the range −m to +m over the range of distances for which the parallax amount curve 1622 is defined.
 The adjusted parallax amount curve 1623 is shaped such that different values of the stereoscopic adjustment parameter C are set at least for subjects within the range −m to +m and for subjects outside it. Its specific shape is described later.
 In the functions of the parallax amount curve 1622 and the adjusted parallax amount curve 1623, the depth relationship between subjects, that is, the relationship between the distance from the digital camera 10 and the parallax amount, is not broken. Specifically, the greater the distance from the digital camera 10, the smaller the parallax amount, and once the parallax amount passes 0 it becomes increasingly negative. In other words, the parallax amount m_a of a subject at a greater distance from the digital camera 10 and the parallax amount m_b of a subject at a smaller distance satisfy m_b ≧ m_a.
 On the adjusted parallax amount curve 1623, parallax amounts larger than +m are adjusted to +m, the upper limit of the permissible parallax amount, and parallax amounts smaller than −m are adjusted to −m, the lower limit. In this way, the parallax of subjects other than the main subject is not suppressed more than necessary, and the depth relationship between subjects is not disrupted.
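The limiting behavior just described, out-of-range parallax pinned to the permissible bounds while in-range values and the depth ordering are preserved, amounts to a clamp. A minimal sketch (the function name is illustrative; the actual curve 1623 may shape in-range values more gradually, as noted below):

```python
def target_parallax(m_raw, m_max):
    """Sketch of the adjusted parallax amount curve 1623 at its limits:
    raw parallax inside [-m_max, +m_max] is kept, values beyond are
    clamped to the permissible bounds. Clamping is monotonic, so the
    depth ordering of subjects (m_b >= m_a) is preserved."""
    return max(-m_max, min(m_max, m_raw))
```

For example, with m = 3, raw parallax amounts of −6, −2, 0, 2, and 6 map to −3, −2, 0, 2, and 3, keeping their order.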
 According to the adjusted parallax amount curve 1623 described above, for a subject whose unadjusted parallax amount lies within the range −m to +m, the value of the stereoscopic adjustment parameter C is determined so that the unadjusted parallax amount along the curve 1622 is roughly maintained, and the parallax amount is adjusted accordingly. The stereoscopic effect is thus maintained for subjects within the range −m to +m. Here, adjusting the parallax amount so that the unadjusted amount is roughly maintained means adjusting it so that the unadjusted amount is either kept as-is or brought to a value close to it.
 For a subject whose unadjusted parallax amount does not fall within the range −m to +m, on the other hand, a value of the stereoscopic adjustment parameter C different from that for subjects within the range is determined so that the adjusted amount falls within −m to +m. The parallax amount of subjects outside the range −m to +m is thereby suppressed, reducing fatigue and discomfort during viewing.
 Thus, when color parallax plane data is generated using a value of the stereoscopic adjustment parameter C determined per subject, the amount of parallax adjustment can be made intentionally different between subjects with large unadjusted parallax amounts and subjects with small ones. The stereoscopic effect can therefore be preserved for the main subjects while the parallax amount of non-main subjects is suppressed, reducing discomfort and fatigue during viewing.
 When plural subjects are located at the same distance from the digital camera 10, their unadjusted parallax amounts are equal, and therefore the values of the stereoscopic adjustment parameter C used to adjust them are also equal. Determining the value of C for each subject included in the image according to its unadjusted parallax amount is thus synonymous with determining the value of C for each of the unadjusted parallax amounts of the plural subjects included in the image.
Next, the principle by which the stereoscopic adjustment parameter C is determined for each subject will be described. The principle described below is intended to make the lookup table 2310, described later, easier to understand. Accordingly, the value of the stereoscopic adjustment parameter C need not necessarily be determined within the digital camera 10 exactly according to this principle.

As described above, in the present embodiment, for subjects within the range -m to +m among the plurality of subjects included in the image, a value of the stereoscopic adjustment parameter C is calculated such that the unadjusted parallax amount is generally maintained. For subjects outside the range -m to +m, the value of C is determined so that the parallax amount is adjusted into the range -m to +m.

To determine the value of the stereoscopic adjustment parameter C in this way, the parallax amount of each subject is first computed from the left and right parallax image data. In this way, for example, the unadjusted parallax amounts m10, m20 and m30 of the focused subject, the near-point subject and the far-point subject are respectively detected.

In addition, the distance from the digital camera 10 to each subject is calculated based on the defocus amount from the focal position. In this way, for example, the distances L10, L20 and L30 from the digital camera 10 to the focused subject, the near-point subject and the far-point subject are calculated.

Next, for each subject located outside the range -m to +m, a target parallax amount corresponding to the calculated distance is determined from the adjusted parallax amount curve 1623. Specifically, the point on the adjusted parallax amount curve 1623 corresponding to the calculated distance is found, and the vertical coordinate of that point is taken as the target parallax amount. In this way, for example, the target values m100, m200 and m300 for the focused subject, the near-point subject and the far-point subject are determined.

Then, for each subject, a value of the stereoscopic adjustment parameter C is calculated such that the unadjusted parallax amount is adjusted to the target parallax amount. For example, for the focused subject with unadjusted parallax amount m10 (m10 = 0), a value of C is calculated such that the parallax amount m10 is generally maintained. For the near-point subject, a value C20 is calculated such that the unadjusted parallax amount m20 is adjusted to the target value m200. Likewise, for the far-point subject, a value C30 is calculated such that the unadjusted parallax amount m30 is adjusted to the target value m300.

To calculate a value of the stereoscopic adjustment parameter C that adjusts the unadjusted parallax amount of a given subject to the target parallax amount, a 3D image is first generated using some value of C, and the parallax amount of that subject is then calculated from the result. By repeating this process while feeding the calculation result back, the value of C at which the unadjusted parallax amount reaches the target value is obtained. Preferably, a lookup table is generated in advance from the correspondence among the unadjusted parallax amount, the target parallax amount and the value of C obtained in this way, and the value of C is then determined using this lookup table.
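The feedback loop just described can be sketched as a simple search over C. The sketch below is purely illustrative: `realized_parallax()` stands in for the actual "generate a 3D image with this C, then measure the subject's parallax" step, which is not specified in this excerpt, and is simulated here by a toy model in which the realized parallax scales linearly with C; bisection further assumes the realized parallax is monotone in C.

```python
def search_C(unadjusted, target, tol=1e-4):
    """Find C in [0, 1] such that the subject's parallax after
    adjustment reaches the target value (bisection on C)."""
    def realized_parallax(C):
        # Placeholder model: the realized parallax scales with C.
        # In the camera this step would generate a 3D image with this C
        # and then measure the subject's parallax in the result.
        return C * unadjusted

    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (hi + lo)
        if abs(realized_parallax(mid)) < abs(target):
            lo = mid   # parallax still below target: weaken the adjustment
        else:
            hi = mid   # parallax at or above target: strengthen the adjustment
    return 0.5 * (lo + hi)

# Example: reduce an unadjusted parallax of m20 = 12 (arbitrary units)
# to a target of m200 = 9.
C20 = search_C(12.0, 9.0)
print(round(C20, 3))  # → 0.75
```

Tabulating the converged C values against their unadjusted parallax amounts is exactly what yields the lookup table described in the text.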
Here, the specific shape of the adjusted parallax amount curve 1623 used to determine the value of the stereoscopic adjustment parameter C as described above will be explained. The adjusted parallax amount curve 1623 is composed of a portion inside the region enclosed by frame W1 and a portion inside the region enclosed by frame W2. This allows the curve to assign different values of the stereoscopic adjustment parameter C at least to subjects within the range -m to +m and to subjects outside that range.

The portion of the adjusted parallax amount curve 1623 inside frame W1 is formed so that the target parallax amount falls within the range -m to +m, while the portion inside frame W2 is formed to approximate the parallax amount curve 1622.

Now consider the case where subjects are continuous in the depth direction of the scene, in other words, where subjects are lined up in the depth direction. In this case, it is preferable that the parallax amount transition smoothly between these subjects along the depth direction.

For this reason, the adjusted parallax amount curve 1623 is preferably continuous in the depth direction, that is, in the direction of distance from the digital camera 10, and more preferably its derivative is also continuous. In this case, the value of the stereoscopic adjustment parameter C is determined so that the adjusted parallax amount is continuous in the depth direction, and further so that the derivative of the adjusted parallax amount is continuous in the depth direction.

To make the adjusted parallax amount curve 1623 continuous in the distance direction, the shape of its portion inside frame W2 may be adjusted. Specifically, the part of the frame-W2 portion near the boundary with frame W1 may be deformed so as to connect continuously with the portion enclosed by frame W1.

Similarly, to make the derivative of the adjusted parallax amount curve 1623 continuous in the distance direction, the shape of its portion inside frame W2 may be adjusted: the part near the boundary with frame W1 is deformed so that its derivative matches that of the portion enclosed by frame W1.

To construct the adjusted parallax amount curve 1623, for example, a curve is first generated that coincides with the parallax amount curve 1622 over at least part of the parallax range within -m to +m and that has ±m as asymptotes; this yields the portion of the adjusted curve enclosed by frame W1. Such a curve can be derived from a hyperbolic tangent curve.

The portion of the adjusted parallax amount curve 1623 enclosed by frame W2 is then generated from the parallax amount curve 1622, and the part of it near the boundary with frame W1 is deformed so as to connect continuously with the frame-W1 portion. At this time, the derivative of the adjusted parallax amount curve 1623 is made continuous at the boundary between frames W1 and W2. The adjusted parallax amount curve 1623 is selected in this way.
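Since the text states that the frame-W1 portion is derived from a hyperbolic tangent curve, one minimal concrete instance (an editorial illustration, not the patent's stated formula) is the single smooth map p ↦ m·tanh(p/m): its slope is 1 at p = 0, so small parallax amounts are nearly preserved, it approaches ±m asymptotically, and its derivative is continuous everywhere, satisfying the smoothness conditions discussed above without any stitching.

```python
import math

def adjusted_parallax(p, m):
    """Map an unadjusted parallax p to an adjusted value that roughly
    preserves small parallax and asymptotically approaches +/-m."""
    return m * math.tanh(p / m)

m = 10.0
print(round(adjusted_parallax(1.0, m), 3))   # small parallax nearly preserved (~0.997)
print(round(adjusted_parallax(50.0, m), 3))  # large parallax compressed toward +m (~9.999)
```

A real implementation would stitch this kind of curve (frame W1) onto a portion approximating the unadjusted curve 1622 (frame W2), matching value and derivative at the boundary as described.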
Note that the functions describing the parallax amount curve 1622 and the adjusted parallax amount curve 1623 can each vary with the imaging conditions that affect the parallax amount (for example, the aperture value, the focus position, and the focal length when the photographing lens 20 is a zoom lens). Therefore, if the value of the stereoscopic adjustment parameter C were determined from the adjusted parallax amount curve 1623 according to the principle described above, a function for the adjusted parallax amount curve 1623 would have to be stored in the digital camera 10 for each imaging condition.

However, there are innumerable imaging conditions, corresponding to all combinations of aperture value, focus position, focal length and so on. Storing a function of the adjusted parallax amount curve 1623 for every imaging condition would therefore require a large amount of data.

For this reason, in the present embodiment, the values of the stereoscopic adjustment parameter C are obtained in advance by experiment according to the above principle, without storing a plurality of functions for the adjusted parallax amount curve 1623 in the digital camera 10. A specific configuration is described below. FIG. 9 shows the lookup table 2310 stored by the determination unit 232 in the present embodiment.

The lookup table 2310 is referred to when the determination unit 232 determines the value of the stereoscopic adjustment parameter C, and is stored in advance in the storage unit of the digital camera 10. As shown in FIG. 9, the lookup table 2310 describes, in pairs, each value that the parallax amount m can take (m10, m20, m30, ...) and the corresponding value of the stereoscopic adjustment parameter C (C10, C20, C30, ...).

Of the parallax amounts m in the lookup table 2310, the value of the stereoscopic adjustment parameter C corresponding to each value within the range -m to +m is determined so that the unadjusted parallax amount is generally maintained, while the value of C corresponding to each value outside the range -m to +m is determined so that the parallax amount is adjusted into the range -m to +m.
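A minimal sketch of how such a table of (parallax amount, C) pairs could be represented and queried follows. The entries and the linear interpolation between stored parallax values are editorial assumptions; the patent only states that the pairs are stored and looked up.

```python
# (parallax amount m, stereoscopic adjustment parameter C) pairs,
# sorted by parallax amount. The values are illustrative only:
# C stays near 1 (parallax maintained) inside the comfortable range
# and drops for large |m| (parallax suppressed).
LUT_2310 = [(-30.0, 0.35), (-10.0, 0.95), (0.0, 1.0), (10.0, 0.95), (30.0, 0.35)]

def lookup_C(parallax):
    """Return C for an unadjusted parallax amount, linearly
    interpolating between the stored table entries and clamping
    outside the tabulated range."""
    if parallax <= LUT_2310[0][0]:
        return LUT_2310[0][1]
    if parallax >= LUT_2310[-1][0]:
        return LUT_2310[-1][1]
    for (m0, c0), (m1, c1) in zip(LUT_2310, LUT_2310[1:]):
        if m0 <= parallax <= m1:
            t = (parallax - m0) / (m1 - m0)
            return c0 + t * (c1 - c0)

print(round(lookup_C(0.0), 2))   # exact table entry → 1.0
print(round(lookup_C(20.0), 2))  # midway between 0.95 and 0.35 → 0.65
```

The key property mirrored here is the one the text emphasizes: the query needs only the unadjusted parallax amount, not the subject distance or the imaging conditions.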
Such a lookup table 2310 is generated, for example, through experiments using a prototype. Specifically, on a prototype of the digital camera 10, the imaging conditions that affect the parallax amount are set to some arbitrary condition.

Images are then captured while the position of the subject is moved in the depth direction of the scene with the shooting conditions held constant, and each piece of captured image data is associated with the distance to the subject.

Next, the parallax amount of the subject is calculated from the left and right parallax image data of each piece of captured image data, and the pairs of this parallax amount and the distance from the digital camera 10 to the subject at the time of shooting are plotted on a coordinate plane. In this coordinate plane, as shown in FIG. 8 described above, the horizontal axis is the distance from the digital camera 10, that is, the depth of the scene, and the vertical axis is the parallax amount.

The parallax amount curve 1622 is then generated as an approximate curve fitted to these plotted points. Once the parallax amount curve 1622 has been generated, the adjusted parallax amount curve 1623 is selected as described above.
Once the parallax amount curve 1622 and the adjusted parallax amount curve 1623 have been obtained, one point on the horizontal axis is selected, and the unadjusted parallax amount corresponding to this point is read from the parallax amount curve 1622. That is, a distance from the digital camera 10 is selected, and the unadjusted parallax amount that a subject located at this distance would have is detected.

A value of the stereoscopic adjustment parameter C is then determined such that the detected unadjusted parallax amount is adjusted to the target parallax amount. Thus, when the unadjusted parallax amount (for example, m10) is within the range -m to +m, a value of C (for example, C10) is determined such that the unadjusted parallax amount is generally maintained.

On the other hand, when the detected unadjusted parallax amount (for example, m20) is outside the range -m to +m, a value of C (for example, C20) is determined such that the unadjusted parallax amount is adjusted into the range -m to +m. The stereoscopic effect is therefore preserved for the main subject while the parallax amount is suppressed for subjects other than the main subject, reducing discomfort and fatigue during viewing.

To calculate a value of the stereoscopic adjustment parameter C that adjusts the unadjusted parallax amount of a given subject to the target parallax amount, as described above, a 3D image is first generated using some value of C, and the parallax amount of that subject is then calculated from the result. By repeating this process while feeding the calculation result back, the value of C at which the unadjusted parallax amount reaches the target value is obtained. Preferably, a lookup table is generated in advance from the correspondence among the unadjusted parallax amount, the target parallax amount and the value of C obtained in this way, and the value of C is then determined using this lookup table.

The unadjusted parallax amount for this point and the value of the stereoscopic adjustment parameter C are then stored in the lookup table 2310 in association with each other. Other points on the horizontal axis are subsequently selected in the same way, and for each point the corresponding unadjusted parallax amount and value of C are stored in association in the lookup table 2310. The lookup table 2310 is generated in this manner.
According to the lookup table 2310 described above, the value of the stereoscopic adjustment parameter C can be determined from the unadjusted parallax amount without detecting the subject distance. Put the other way around, as long as the unadjusted parallax amount can be detected, the value of C can be determined from the lookup table 2310 regardless of the arrangement of subjects in the scene.

Here, the target parallax amount is determined using the adjusted parallax amount curve 1623, and on this curve the depth relationship between subjects (the relationship between distance from the digital camera 10 and parallax amount) is not disrupted. Therefore, as long as the depth relationship between subjects is not disrupted in the captured image, adjusting the parallax amount will not disrupt it either.

For this reason, the same lookup table 2310 can be used regardless of the imaging conditions that affect the parallax amount (aperture value, focus position, and focal length when the photographing lens 20 is a zoom lens). This point is explained below with a specific example.
FIG. 10 illustrates the relationship between the parallax amount and its target value when the shape of the parallax amount curve differs due to differing shooting conditions. As in FIG. 8, the horizontal axis represents the distance from the digital camera 10 and the vertical axis represents the parallax amount.

The digital camera 10 is focused on the subjects located at the distances L11 and L12 (see the focused subjects in the figure). Between the two diagrams of FIGS. 10(a) and 10(b), the focus position and the aperture value, shooting conditions that affect the parallax amount, have been changed; as a result, the parallax amount curves 1626 and 1628 differ from each other.

In the scenes represented by FIGS. 10(a) and 10(b), the distances from the digital camera 10 to the respective subjects differ. Specifically, the distance from the digital camera 10 to the focused subject is L11 in the scene of FIG. 10(a) and L12 in the scene of FIG. 10(b).

Likewise, the distance from the digital camera 10 to the near-point subject is L21 in the scene of FIG. 10(a) and L22 in the scene of FIG. 10(b), and the distance to the far-point subject is L31 in the scene of FIG. 10(a) and L32 in the scene of FIG. 10(b).

On the other hand, the unadjusted parallax amount of the focused subject is m10 (= 0) in both scenes. Similarly, the unadjusted parallax amount of the near-point subject is m20 in both, and that of the far-point subject is m30 in both.

In such a case, when the stereoscopic adjustment parameter C is determined using the lookup table 2310, values of C are determined for the focused subject, the near-point subject and the far-point subject in FIG. 10(a) such that the unadjusted parallax amounts m10, m20 and m30 are adjusted to the parallax amounts m100, m200 and m300. In the same way, values of C are determined for the focused subject, the near-point subject and the far-point subject in FIG. 10(b) such that the unadjusted parallax amounts m10, m20 and m30 are adjusted to m100, m200 and m300.

Likewise, if a subject were assumed to be located at each distance from the digital camera 10, the relationship between that distance and the target parallax amount produced for the subject by the stereoscopic adjustment parameter C would follow the curves 1627 and 1629 drawn with bold broken lines in FIGS. 10(a) and 10(b). This shows that the same lookup table 2310 can be used to determine the value of C regardless of the imaging conditions that affect the parallax amount (aperture value, focus position, and focal length when the photographing lens 20 is a zoom lens).

With the lookup table 2310 described above, the same effect can be obtained as when a function for the adjusted parallax amount curve 1623 is stored in the digital camera 10 for each imaging condition: the stereoscopic effect is preserved for the main subject while the parallax amount is suppressed for subjects other than the main subject, reducing discomfort and fatigue during viewing.
Next, the above processing is described from the viewpoint of pixel value distribution and color. FIG. 11 illustrates changes in the RGB pixel value distributions. FIG. 11(a) is a graph lining up the output values of the G(Lt), G(Rt), R(Lt), R(Rt), B(Lt) and B(Rt) pixels when they receive a white subject light flux from an object point located a certain amount away from the focal position.

FIG. 11(b) is a graph lining up the output values of the non-parallax pixels R(N), G(N) and B(N) when they receive the white subject light flux from the object point of FIG. 11(a). This graph, too, can be said to represent the pixel value distribution of each color.

When the processing described above is applied to each corresponding pixel with C = 0.8, the pixel value distributions shown in the graph of FIG. 11(c) are obtained. As can be seen from the figure, a distribution corresponding to each of the RGB pixel values is obtained.
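The per-pixel operation itself is described in an earlier part of the document and is not reproduced in this excerpt. Purely as an illustrative assumption, one common way such a parameter can modulate parallax is to blend each left-viewpoint pixel value with its right-viewpoint counterpart (and vice versa) about their mean, so that C = 1 keeps the full left/right difference and C = 0 collapses both viewpoints to their average:

```python
def blend_viewpoints(lt, rt, C):
    """Hypothetical parallax modulation: shrink the left/right pixel
    value difference by blending the two viewpoints about their mean.
    C = 1 leaves the pair unchanged; C = 0 makes them identical."""
    avg = 0.5 * (lt + rt)
    return (avg + C * (lt - avg), avg + C * (rt - avg))

# Example with C = 0.8 on one corresponding G(Lt)/G(Rt) pixel pair:
lt2, rt2 = blend_viewpoints(100.0, 60.0, 0.8)
print(lt2, rt2)  # → 96.0 64.0: the left/right difference of 40 shrinks to 32
```

Applied across a whole parallax image pair, shrinking the left/right value differences in this way narrows the blur-induced offset between the two viewpoints, which is what FIG. 11(c) depicts for C = 0.8.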
Next, the relationship between the viewer and the video when the 3D image data is reproduced on a playback device will be described. FIG. 12 illustrates the relationship between the viewer's convergence angle and the parallax amount. The eyeballs 50 represent the viewer's eyes; the figure shows the right eye 51 and the left eye 52 spaced apart.

On the display unit 40, non-adjusted image data whose parallax amount has not been adjusted is reproduced, displaying the subject 61 of the right-eye image and the subject 62 of the left-eye image. The subjects 61 and 62 are the same subject; because it was located at a position shifted from the focal position at the time of shooting, the two are displayed on the display unit 40 separated by the parallax amount D1.

Since the eyes 50 try to fuse the two into a single view, the viewer perceives the subject as existing at the position (represented by a square in the figure) at the pop-out distance L1, where the straight line connecting the right eye 51 and the subject 61 intersects the straight line connecting the left eye 52 and the subject 62.

The convergence angle at this time is θ1, as shown in the figure. In general, as the convergence angle increases, the video comes to feel unnatural and eye strain can result. Therefore, when image processing is performed using the stereoscopic adjustment parameter in the present embodiment, adjusted image data in which the parallax amount has been adjusted by the stereoscopic adjustment parameter is generated as described above. The figure shows the adjusted image data reproduced superimposed on the non-adjusted image data.

On the display unit 40, the subject 71 of the right-eye image and the subject 72 of the left-eye image of the adjusted image data are displayed. The subjects 71 and 72 are the same subject, and are also the same subject as the subjects 61 and 62. On the display unit 40, the subjects 71 and 72 are displayed separated by the parallax amount D2. The viewer perceives the subject as existing at the position (represented by a triangle in the figure) at the pop-out distance L2, where the straight line connecting the right eye 51 and the subject 71 intersects the straight line connecting the left eye 52 and the subject 72.

The convergence angle at this time is θ2, which is smaller than θ1. The viewer therefore does not experience an extreme pop-out sensation, and the accumulation of eye strain is also reduced. Since the parallax amount is adjusted moderately, as described later, the viewer can enjoy the video with a comfortable pop-out sensation (and, together with the sinking sensation when the defocus relationship is reversed, a comfortable stereoscopic effect).
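The geometry of FIG. 12 can be checked numerically. Assuming an interocular distance e, a viewing distance Z to the display unit, and a crossed screen disparity D (all illustrative values below), similar triangles place the perceived subject at Z·e/(e+D) from the viewer, and the convergence angle at that point is θ = 2·atan((e+D)/(2Z)):

```python
import math

def convergence(e_mm, Z_mm, D_mm):
    """Perceived distance from the viewer and convergence angle (deg)
    for a crossed screen disparity D on a display at distance Z,
    with interocular distance e."""
    L = Z_mm * e_mm / (e_mm + D_mm)             # intersection of the two gaze lines
    theta = 2.0 * math.atan((e_mm + D_mm) / (2.0 * Z_mm))
    return L, math.degrees(theta)

e, Z = 65.0, 2000.0                   # 65 mm eye separation, 2 m viewing distance
L1, theta1 = convergence(e, Z, 40.0)  # unadjusted disparity D1
L2, theta2 = convergence(e, Z, 15.0)  # adjusted (smaller) disparity D2
print(round(theta1, 2), round(theta2, 2))
assert theta2 < theta1 and L2 > L1    # smaller disparity: smaller angle, shallower pop-out
```

This reproduces the relation stated in the text: reducing the disparity from D1 to D2 moves the perceived subject back toward the screen and shrinks the convergence angle from θ1 to θ2.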
Although the parallax amount used in the explanation of FIG. 12 was expressed as a separation distance on the display unit 40, the parallax amount can be defined in various forms. It may be defined in units of pixels in the captured image data, or as a shift width relative to the horizontal width of the image.
 また、視差量の調整は、立体調整パラメータCの値を変更する手法を用いなくても、さまざまな手法で実行され得る。以下、立体調整パラメータCの値の変更によらずに視差量を調整する手法について説明する。 Further, the adjustment of the parallax amount can be executed by various methods without using the method of changing the value of the three-dimensional adjustment parameter C. Hereinafter, a method of adjusting the parallax amount without changing the value of the three-dimensional adjustment parameter C will be described.
FIG. 13 is a diagram schematically showing the relationship between the parallax amount and the contrast, which indicates the sharpness of the image. The horizontal axis represents the distance from the digital camera 10, and the vertical axis represents the parallax amount and the contrast level. The digital camera 10 is focused on a main subject located at the distance Lp.
The contrast curve 1610 forms a convex curve that peaks at the distance Lp, the distance to the focal position. That is, it shows that the image gradually blurs as the distance departs from Lp in either direction.
The parallax amount curve 1620 shows a parallax amount of 0 at the distance Lp, and its slope increases as the distance approaches the digital camera 10 from Lp. That is, the parallax amount curve 1620 takes positive values on the near side of the distance Lp, indicating that the closer a subject is, the more strongly it appears to pop out.
On the other hand, the slope of the parallax amount curve 1620 decreases as the distance recedes from Lp away from the digital camera 10. That is, the parallax amount curve 1620 takes negative values on the far side of the distance Lp, indicating that the farther a subject is, the more gently it appears to sink in.
Suppose the viewer feels no discomfort or fatigue as long as the parallax amount falls within the range from -m to +m. Then it suffices for the subjects composing the scene to be distributed between the distance Lf (where the parallax amount is +m) and the distance Lr (where the parallax amount is -m). That is, if the near-point subject closest to the digital camera 10 is at the distance Lf and the farthest far-point subject is at the distance Lr, the viewer can comfortably appreciate the 3D video without the parallax amount being adjusted in subsequent image processing. On the other hand, if the near-point subject is at a distance Lf' closer than the distance Lf (where the parallax amount is +m'), the allowable parallax amount is exceeded, and the viewer experiences discomfort and fatigue.
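As a rough sketch of this relationship (an illustrative assumption, not the patent's actual curve), the parallax amount can be modelled as proportional to the defocus 1/L - 1/Lp, which reproduces the shape of curve 1620: zero at the focal distance, steeply positive toward the camera, gently negative toward infinity. A range check over the subject distribution then follows directly:

```python
def parallax(L, L_focus, k=1.0):
    """Model parallax amount: zero at the focus distance L_focus,
    positive (pop-out) nearer, negative (sink-in) farther, with the
    slope steepening toward the camera like curve 1620.  The gain k
    stands in for the optics (aperture, focal length)."""
    return k * (1.0 / L - 1.0 / L_focus)

def scene_within_range(L_near, L_far, L_focus, m, k=1.0):
    """True if both the near-point and far-point subjects give
    parallax amounts inside the allowable range [-m, +m]."""
    return (parallax(L_near, L_focus, k) <= m and
            parallax(L_far, L_focus, k) >= -m)
```

Only the near side needs the +m bound and only the far side the -m bound, because the model is monotonically decreasing in distance.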
The relationship between the subject distribution and the parallax amount is described further. FIG. 14 is a diagram schematically showing the relationship between the subject distribution and the parallax amount.
Each part of FIG. 14 corresponds to FIG. 13 with the contrast curve 1610 omitted. It is also assumed that a near-point subject and a far-point subject exist in addition to the in-focus subject, the main subject on which focus is placed. In FIG. 14(a), the in-focus subject is at the distance L10, the near-point subject at L20, and the far-point subject at L30.
When the parallax amount range set as the allowable range is from -m to +m, the value of the parallax amount curve 1620 at the far-point subject's distance L30 falls within this range. However, the value of the parallax amount curve 1620 at the near-point subject's distance L20 exceeds +m.
FIG. 14(b) illustrates the parallax amounts when, starting from the subject situation of FIG. 14(a), the in-focus subject moves from the distance L10 to the farther distance L11. In this case the distance L11 becomes the focal position, so the parallax amount for the image of the near-point subject, which has not moved (distance L20), becomes considerably larger than in FIG. 14(a), as shown by the parallax amount curve 1620. That is, the amount by which the allowable range is exceeded increases.
FIG. 14(c) illustrates the parallax amounts when, starting from the subject situation of FIG. 14(b), the near-point subject moves from the distance L20 to the farther distance L21, and then further to the distance L22. Since the focal position remains at the distance L11, the parallax amount curve 1620 is the same as in FIG. 14(b); however, because the near-point subject has shifted toward the back, the parallax amount at the distance L21, while still exceeding the allowable range, exceeds it by less than in FIG. 14(b). If the subject moves as far as the distance L22, its parallax amount falls within the allowable range.
That is, the subject distribution in the depth direction of the scene and the position of the subject on which focus is placed can be said to be parameters that determine whether the parallax amounts fall within the set allowable range.
Next, the relationship between the aperture value and the parallax amount is described. FIG. 15 is a diagram schematically showing the relationship between the aperture value and the parallax amount. As in FIG. 13, the horizontal axis represents the distance from the digital camera 10, and the vertical axis represents the parallax amount and the contrast level. FIG. 15(a) shows the state at an aperture value of F1.4, FIG. 15(b) at F4, and FIG. 15(c) at F8. The focal length of the photographing lens 20 is the same in all three states, and the digital camera 10 is focused on a main subject located at the distance L10.
The contrast curve 1610 is highest at the distance L10, the distance to the focal position, in every state. On the other hand, the more the diaphragm 22 is stopped down, that is, the larger the aperture value, the higher the contrast remains even in front of and behind the focal position. In other words, an image captured with the diaphragm 22 stopped down has a greater depth of field. The parallax amount curve 1620 shows a parallax amount of 0 at the distance L10, with its slope increasing as the distance approaches the digital camera 10 from L10, and decreasing as the distance recedes from L10 away from the digital camera 10.
The larger the aperture value, the gentler the variation of the parallax amount curve 1620. That is, compared with the aperture value of F1.4, as the value moves to F4 and then F8, the parallax amounts in front of and behind the focal position become smaller. Supposing again that the viewer feels no discomfort or fatigue as long as the parallax amount falls within the range from -m to +m, at the aperture value of F8 the entire parallax amount curve 1620 lies within this range, so the viewer can comfortably appreciate the 3D video no matter at which distance a subject exists.
On the other hand, at the aperture values of F1.4 and F4, the parallax amount exceeds +m on the short-distance side of the parallax amount curve 1620. Specifically, at F1.4 it exceeds +m in the region nearer than the distance L24, and at F4 in the region nearer than the distance L25. Since the slope of the parallax amount curve 1620 at F4 is gentler than that at F1.4, the relationship L25 < L24 holds. At these aperture values, if a subject exists nearer than the distance L24 or the distance L25, the viewer experiences discomfort and fatigue when appreciating the captured 3D video.
Therefore, in the present embodiment, the conditions are changed so that the parallax amounts between the generated images fall within the target parallax amount (the allowable parallax amount: for example, a range of ±m). Specifically, imaging conditions that affect the parallax amount are changed, or the value of the stereoscopic adjustment parameter used in image processing is changed.
First, changing the imaging conditions is described. As explained with reference to FIG. 15, the aperture value affects the parallax amount, so the aperture value may be changed according to the detected subject distribution such that the parallax amounts between the output parallax images fall within the allowable parallax amount. For example, in the situation of FIG. 15(a) (initial aperture value F1.4, in-focus subject at the distance L10), if the near-point subject exists at the distance L25, its parallax amount exceeds +m. The determination unit 232 therefore changes the aperture value from F1.4 to F4, the aperture value at which the parallax amount for a subject at the distance L25 becomes +m.
The aperture value is changed to a larger value not only when the near-point subject exceeds the allowable parallax amount range but also when the far-point subject exceeds it. When the parallax amounts of the near-point and far-point subjects have margin with respect to the allowable parallax amount, the aperture value may instead be changed to a smaller value, that is, in the direction of opening the diaphragm 22. In that case, the shutter speed can be changed to the faster side, or the ISO sensitivity to the lower side.
The relationship between the in-focus subject distance and the parallax amount curve 1620 for each aperture value is prepared in advance as a lookup table. By referring to this lookup table with the subject distribution and the allowable parallax amount as input values, the determination unit 232 can extract and determine the aperture value to change to.
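Such a lookup-table decision can be sketched as follows. The table contents and the per-aperture gain model are illustrative assumptions, not values from the patent:

```python
# Hypothetical per-aperture gains: stopping down (larger F-number)
# flattens the parallax curve, so the gain shrinks.
APERTURE_LUT = {1.4: 1.0, 2.8: 0.5, 4.0: 0.35, 8.0: 0.18}

def choose_aperture(L_near, L_far, L_focus, m):
    """Sketch of determination unit 232: pick the widest aperture whose
    modelled parallax stays within [-m, +m] for the subject distribution."""
    for f_number in sorted(APERTURE_LUT):            # widest aperture first
        k = APERTURE_LUT[f_number]
        near = k * (1.0 / L_near - 1.0 / L_focus)    # largest positive parallax
        far = k * (1.0 / L_far - 1.0 / L_focus)      # most negative parallax
        if near <= m and far >= -m:
            return f_number
    return max(APERTURE_LUT)                         # stop down fully as fallback
```

Iterating from the widest aperture keeps the diaphragm as open as the allowable range permits, which also favors faster shutter speeds and lower ISO as noted above.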
Besides changing the aperture value, another change of imaging conditions is the focus-shift technique, which changes the focus position. FIG. 16 is a diagram schematically showing the concept of focus shift. The vertical and horizontal axes are the same as in FIG. 13.
The contrast curve 1610 and the parallax amount curve 1620 represent the contrast curve and the parallax amount curve when the in-focus subject exists at the distance L10 and the focus lens is moved to bring this subject into focus. In this case, the peak value of the contrast curve 1610 exceeds the focus threshold Es at which the image is evaluated as in focus.
If a near-point subject exists at the position of the distance L27, its parallax amount, read from the parallax amount curve 1620, is +m0, which exceeds the allowable parallax amount +m. In focus shift, the focus lens position is therefore corrected within the range in which the contrast remains above the focus threshold Es, bringing the parallax amount at the distance L27 within the allowable range.
In the example of the figure, the parallax amount curve 1621, on which the parallax amount for the near-point subject is +m, is selected, and the distance Lp at which the parallax amount on this curve becomes 0 is extracted. The focus lens position is then changed so that the distance Lp becomes the in-focus position. The contrast curve 1611 is the contrast curve at this time. Since the subject actually exists at the distance L10, the contrast value for that subject decreases by Δe, as illustrated. It suffices that the contrast value at this time exceeds the focus threshold Es. In an image captured with the focus lens position changed in this way, the contrast value for the main subject decreases slightly, but the image can still be evaluated as in focus, and the parallax amount for the near-point subject falls within the allowable range.
If the contrast value at the distance Lp does not exceed the focus threshold Es, correcting the focus lens position is not permitted. That is, when the parallax amount for the near-point subject on the parallax amount curve 1620 greatly exceeds the allowable parallax amount, the parallax amount cannot be brought within the allowable range even by changing the focus lens position within the range that stays above the focus threshold Es. In that case, focus shift may be combined with another technique, for example changing the aperture value to a larger value.
For parallax adjustment by focus shift as well, the lookup table prepared in advance as the relationship between the in-focus subject distance and the parallax amount curve for each aperture value may be used. By referring to this lookup table with the subject distribution and the allowable parallax amount as input values, the determination unit 232 can extract and determine the distance Lp. The control unit 201 changes the position of the focus lens in accordance with the distance Lp. The control unit 201 then judges whether the resulting contrast value exceeds the focus threshold Es. If it judges that the value exceeds the threshold, the shooting sequence continues as is. If it judges that it does not, the focus lens position is returned, and control moves on, for example, to combining other techniques. Alternatively, without the control unit 201 actually moving the focus lens, the determination unit 232 may calculate the contrast attenuation that would result from shifting the focal position from L10 to Lp, and judge whether the result exceeds the focus threshold Es. In that case, with a contrast AF scheme for example, the actual evaluation values already acquired during focus adjustment for the distance L10 can also be referred to.
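The focus-shift decision can be sketched as follows, again using an illustrative defocus-proportional parallax model and a toy contrast curve; the function names and the value of sigma are assumptions. The sketch solves for the focus distance at which the near-point subject's parallax equals exactly +m, then accepts the shift only if the main subject's contrast stays above the focus threshold Es.

```python
import math

def shifted_focus(L_near, m, k=1.0):
    """Focus distance L_p at which the near-point subject's modelled
    parallax k*(1/L_near - 1/L_p) equals exactly +m."""
    inv = 1.0 / L_near - m / k
    if inv <= 0:
        raise ValueError("required shift pushes focus to or beyond infinity")
    return 1.0 / inv

def contrast(L, L_focus, peak=1.0, sigma=0.2):
    """Toy contrast model: peaks at the focus distance, falls off with defocus."""
    d = 1.0 / L - 1.0 / L_focus
    return peak * math.exp(-d * d / (2.0 * sigma * sigma))

def try_focus_shift(L_subject, L_near, m, E_s, k=1.0):
    """Return the shifted focus distance if the main subject would still
    evaluate as in focus (contrast above E_s); otherwise None, meaning
    another technique (e.g. stopping down) must be combined."""
    L_new = shifted_focus(L_near, m, k)
    return L_new if contrast(L_subject, L_new) >= E_s else None
```

The None branch corresponds to the case above in which the correction of the focus lens position is not permitted.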
Here, as described above, the lookup table 2310 for determining the value of the stereoscopic adjustment parameter C can be used unchanged regardless of the imaging conditions that affect the parallax amount (the aperture value, the focus position, and, when the photographing lens 20 is a zoom lens, the focal length). The parallax adjustment techniques described above can therefore be used in combination with the technique of adjusting the parallax amount by changing the value of the stereoscopic adjustment parameter C.
FIG. 17 is a diagram schematically showing the relationship among the parallax amount, the value of the stereoscopic adjustment parameter, the aperture value, and so on. In this figure, as in FIG. 8, the horizontal axis represents the distance from the digital camera 10 and the vertical axis represents the parallax amount.
The digital camera 10 is focused on the subject located at the distance L10 (see the in-focus subject in the figure). The image captured by the digital camera 10, however, contains, in addition to the subject at the distance L10, two subjects located at the distances L20 and L30 (see the near-point subject and the far-point subject in the figure). The unadjusted parallax amounts for these three subjects are the parallax amounts m10, m20, and m30.
When the parallax amount is adjusted by combining the technique of changing the value of the stereoscopic adjustment parameter C with another technique, the determination unit 232 first adjusts the parallax amount by changing the aperture value. As a result, the parallax amount curve 1622 is transformed into the parallax amount curve 1624, and the parallax amounts given to the in-focus subject, the near-point subject, and the far-point subject become the parallax amounts m101, m201, and m301. The focus position may be changed instead of the aperture value.
Next, the calculation unit 231 calculates, from the left and right parallax image data, the parallax amounts m101, m201, and m301 that resulted from the change of the aperture value, focus position, and so on, as unadjusted parallax amounts not yet adjusted by the value of the stereoscopic adjustment parameter C. Then, in the same way as described with reference to FIG. 10, the determination unit 232 uses the lookup table 2310 to determine, for each subject, the value of the stereoscopic adjustment parameter C corresponding to these unadjusted parallax amounts m101, m201, and m301.
The value of the stereoscopic adjustment parameter C determined in this way is equal to the value determined by the following procedure: first generate the adjusted parallax amount curve 1625 from the parallax amount curve 1624 rather than from the parallax amount curve 1622, and then calculate the value of the stereoscopic adjustment parameter C that adjusts the unadjusted parallax amount to the target parallax amount. In this case, for a subject whose parallax amount exceeds the allowable parallax amount, the parallax is suppressed so that the viewer does not feel discomfort or fatigue, while for a subject whose parallax amount falls within the allowable parallax amount, the parallax can be emphasized further.
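One plausible reading of this determination can be sketched numerically. The linear mapping below is an assumption for illustration only, not the patent's actual formula: since values of C closer to 0.5 are treated in this document as suppressing parallax, assume adjusted = (2C - 1) × unadjusted, so that C = 0.5 removes the parallax entirely and C = 1.0 preserves it. Solving for the C that maps an unadjusted parallax onto a target clipped to ±m gives:

```python
def target_parallax(unadjusted, m):
    """Clip the unadjusted parallax amount into the allowable range [-m, +m]."""
    return max(-m, min(m, unadjusted))

def stereo_param(unadjusted, target):
    """Assumed mapping: adjusted = (2*C - 1) * unadjusted, so C = 0.5
    yields zero parallax and C = 1.0 preserves it.  Solve for C and
    clamp it to the meaningful range [0.5, 1.0]."""
    if unadjusted == 0.0:
        return 0.5
    c = 0.5 * (1.0 + target / unadjusted)
    return min(max(c, 0.5), 1.0)
```

A subject already inside the allowable range keeps C = 1.0 (parallax preserved), while one exceeding it gets a C pulled toward 0.5 in proportion to the required suppression.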
Accordingly, performing the parallax adjustment by changing the imaging conditions together with the parallax adjustment by changing the value of the stereoscopic adjustment parameter C, as described above, yields the same effect as using an adjusted parallax amount curve of a different shape to determine the target parallax amounts. The variations of the parallax adjustment amount can therefore be increased.
Alternatively, the function used to determine the target parallax amounts can be set in advance for each imaging condition so that the adjusted parallax amount curve 1625 is obtained. In this case the amount of data grows, but the parallax amount can be adjusted without changing the shooting conditions.
FIG. 18 is a diagram explaining subject designation. Specifically, FIG. 18(a) shows the subject distribution in the depth direction from the digital camera 10 in a certain scene, and FIG. 18(b) is a rear view of the digital camera 10 displaying that scene in live view.
As shown in FIG. 18(a), the scene consists of, in order of proximity to the digital camera 10, an adult 301 (distance Lf), a boy 302 (distance Lp), and a girl 303 (distance Lr). As shown in FIG. 18(b), a live-view image capturing this scene is displayed on the display unit 209. Here the boy 302 is the in-focus subject.
In the description so far, as explained with reference to FIG. 8, for a subject whose unadjusted parallax amount lies outside the range from -m to +m, a value of the stereoscopic adjustment parameter C giving a large parallax adjustment amount was calculated, while for a subject whose unadjusted parallax amount lies within the range from -m to +m, a value of the stereoscopic adjustment parameter C that roughly preserves the unadjusted parallax amount was calculated.
However, regardless of whether the parallax amount lies within the range from -m to +m, for a subject that is not a main subject for the photographer it is preferable to suppress the parallax amount by increasing the parallax adjustment amount. It is therefore not strictly necessary to preserve the parallax amount of a subject whose parallax amount lies within the range from -m to +m.
For example, if the boy 302 and the girl 303 are the main subjects in the scene of FIG. 18(a), the viewer can be expected to gaze at these two during appreciation. Accordingly, the parallax amount for the subject image of the adult 301 may be suppressed so that the viewer does not feel discomfort or fatigue. The control unit 201 therefore accepts the photographer's instruction as to which subject should be the main subject.
The display unit 209 displays a title 320 (for example, "Set the main subject region") indicating that it is in the state of accepting a user instruction. In this state, the user adjusts the position and size of the frame 310 to designate the region containing the subject image to be treated as the main subject. A touch panel 2083 is provided over the display unit 209 as part of the operation unit 208, and the control unit 201 acquires the output of the touch panel 2083 and determines which subject is to be the main subject.
As a result, the subject of the adult 301, which is not contained in the frame 310, is designated as a subject whose parallax amount is to be reduced. For the subject of the adult 301, the determination unit 232 then determines, as the value of the stereoscopic adjustment parameter C, a value closer to 0.5 than the value obtained from the lookup table 2310, regardless of whether the unadjusted parallax amount lies within the range from -m to +m. In this case, however, it is preferable to determine the value of the stereoscopic adjustment parameter C such that the depth relationship between the subject of the adult 301 and the other subjects does not break down.
For the subject of the adult 301, the parallax adjustment amount thereby becomes larger than when the value of the stereoscopic adjustment parameter C obtained from the lookup table 2310 is used as is, and the parallax amount is suppressed. The resulting parallax adjustment amount for each subject is, for example, as shown superimposed on each subject in FIG. 18(b).
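This suppression for non-main subjects can be sketched as follows. The blend factor and function name are assumptions; only the pull toward 0.5, and the suppression it implies, come from the text above.

```python
def suppress_non_main(c_from_lut, is_main, strength=0.5):
    """Leave the lookup-table value of C untouched for range-designated
    main subjects; otherwise pull it toward 0.5, which reduces the
    parallax relative to using the table value as is."""
    if is_main:
        return c_from_lut
    return 0.5 + (c_from_lut - 0.5) * strength
```

With strength = 0.5, a table value of 0.9 for the adult 301 would be replaced by 0.7, halving its residual parallax while keeping it on the same side of 0.5 so that the depth ordering against the other subjects is preserved.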
Here, the larger the absolute value of these adjustment amounts, the greater the degree of parallax adjustment. A positive value (for example, +2) means that the parallax amount is adjusted in the direction of increasing the distance from the digital camera 10, and a negative value (for example, -1) in the direction of decreasing it.
As these adjustment amounts show, with the above processing the parallax adjustment amounts for the boy 302 and the girl 303, the main subjects, remain small and their stereoscopic effect is preserved, while for the adult 301, who is not a main subject, the parallax adjustment amount becomes large and the parallax amount becomes small.
Next, the processing flow for moving-image shooting with the digital camera 10 is described. As described above, to keep the parallax amounts between the generated images within the allowable parallax amount, either the value of the stereoscopic adjustment parameter C used in image processing is changed, or both the imaging conditions that affect the parallax amount and the value of the stereoscopic adjustment parameter C are changed.
<First Embodiment>
As a first embodiment, the processing flow of moving-image shooting in which the parallax amount is adjusted by changing the value of the stereoscopic adjustment parameter C used in image processing is described. FIG. 19 is the processing flow of moving-image shooting according to the first embodiment.
The flow in the figure starts when the photographer operates the mode button and the auto 3D movie mode is started. The parallax amount range has been set in advance by the photographer.
When the auto 3D movie mode is started, the determination unit 232 acquires the parallax amount range set by the photographer from the system memory in step S11. In step S12, the control unit 201 executes AF and AE.
Proceeding to step S13, the control unit 201 accepts from the user, via the touch panel 2083, the designation of the region containing the subject image to be treated as the main subject, for example as explained with reference to FIG. 18. Proceeding to step S15, the control unit 201 waits for a recording start instruction, given by the photographer pressing the recording start button.
On detecting the recording start instruction (YES in step S15), the control unit 201 proceeds to step S17. If no instruction is detected, it returns to step S12. After returning to step S12, the designated subject may be tracked and the processing of step S13 skipped.
In step S17, the control unit 201 executes AF and AE again according to the changed imaging conditions. Then, in step S18, the control unit 201 executes charge accumulation and readout of the image sensor 100 via the drive unit 204 and acquires captured image data as one frame. Depending on the subject distribution and the shooting conditions, the parallax amounts between the parallax images in the captured image data acquired here do not fall within the set parallax amount range.
In step S33, the parallax image data generation unit 233 receives the value of the stereoscopic adjustment parameter C determined by the determination unit 232 and the captured image data, and generates left-viewpoint color image data (RLtc plane data, GLtc plane data, BLtc plane data) and right-viewpoint color image data (RRtc plane data, GRtc plane data, BRtc plane data). The specific processing is described later.
 In step S19, in case the photographer wishes to change the main subject during moving image shooting, the control unit 201 accepts from the user a designation of the range of the area containing the subject image to be treated as the main subject. Then, if the control unit 201 determines in step S21 that no recording stop instruction has been received from the photographer, it returns to step S17 and processes the next frame. If it determines that a recording stop instruction has been received, the process proceeds to step S22.
 In step S22, the moving image generation unit 234 joins the continuously generated left-viewpoint color image data and right-viewpoint color image data, executes format processing in accordance with a 3D-compatible moving image format such as Blu-ray 3D, and generates a moving image file. The control unit 201 then records the generated moving image file on the memory card 220 via the memory card IF 207, and the series of steps ends.
 Note that recording to the memory card 220 may be executed sequentially in synchronization with the generation of the left-viewpoint and right-viewpoint color image data, with file termination processing executed in synchronization with the recording stop instruction. The control unit 201 is also not limited to recording on the memory card 220, and may be configured to output to an external device via a LAN, for example.
 FIG. 20 is the processing flow of step S33, up to the generation of the parallax color image data, that is, the left-viewpoint color image data and the right-viewpoint color image data.
 In step S101, the parallax image data generation unit 233 acquires the captured image data. In step S102, as described with reference to FIG. 3, the captured image data is separated into no-parallax image data planes and parallax image data planes. In step S103, the parallax image data generation unit 233 executes interpolation processing to fill the vacancies existing in each of the separated plane data, as described with reference to FIG. 3.
 In step S104, the parallax image data generation unit 233 initializes each variable. Specifically, 1 is first assigned to the color variable Cset, where 1 = red, 2 = green, and 3 = blue. Similarly, 1 is assigned to the coordinate variables i and j, and 1 is assigned to the parallax variable S, where 1 = left and 2 = right.
 In step S105, the parallax image data generation unit 233 extracts the pixel value at the target pixel position (i, j) of the Cset plane. For example, when Cset = 1 and the target pixel position is (1, 1), the extracted pixel value is Rn11. In step S106, the parallax image data generation unit 233 further extracts the pixel values at the target pixel position (i, j) of the LtCset plane data and the RtCset plane data. For example, when the target pixel position is (1, 1), the extracted pixel values are LtCset11 and RtCset11.
 In step S107, the calculation unit 231 calculates, from the left and right parallax image data, the unadjusted amount of parallax for the subject displayed at the pixel at the target pixel position (i, j). Then, in step S108, the determination unit 232 acquires from the lookup table 2310 the value of the stereoscopic adjustment parameter C corresponding to the amount of parallax calculated by the calculation unit 231.
 At this time, as described with reference to FIG. 18, even for a subject whose amount of parallax is within the range of -m to +m, if the photographer has designated it as not being a main subject, the determination unit 232 may determine, as the value of the stereoscopic adjustment parameter C, a value closer to 1 than the value obtained from the lookup table 2310. The determination unit 232 may also calculate the luminance value of each subject included in the image and further adjust the value of the stereoscopic adjustment parameter C so that the amount of parallax becomes smaller as the luminance value becomes smaller.
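 The determination of the parameter C described above (steps S107 to S108, together with the main-subject and luminance adjustments) can be sketched as follows. This is a hypothetical illustration only: the table contents, the blending factors, and the convention that values of C closer to 1 suppress parallax more strongly are assumptions for the sketch, not specifics taken from this specification.

```python
def determine_c(parallax, table, is_main_subject=True, luminance=255):
    """Map an unadjusted parallax amount to a value of the stereoscopic
    adjustment parameter C.

    `table` stands in for lookup table 2310: a list of
    (parallax_threshold, c_value) pairs sorted by ascending threshold.
    All concrete values here are illustrative assumptions.
    """
    c = table[-1][1]
    for threshold, c_value in table:
        if abs(parallax) <= threshold:
            c = c_value
            break
    if not is_main_subject:
        # Subjects designated as not being the main subject get a value
        # closer to 1 than the table value (cf. FIG. 18), assumed here
        # to mean stronger parallax suppression.
        c = c + (1.0 - c) * 0.5
    # Push C further toward 1 as luminance decreases, so that darker
    # subjects receive a smaller amount of parallax.
    c = c + (1.0 - c) * (1.0 - luminance / 255.0)
    return c
```

 A lookup table with finer parallax quantization, or a table per imaging condition, would slot into the same interface.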
 In step S109, the parallax image data generation unit 233 calculates the pixel value at the target pixel position (i, j) corresponding to the parallax variable S. For example, when Cset = 1, S = 1, and the target pixel position is (1, 1), RLtc11 is calculated, for example by the above-described equation (1). Here, the value of the stereoscopic adjustment parameter C is the value determined in step S108 for the subject displayed at the target pixel.
 In step S110, the parallax image data generation unit 233 increments the parallax variable S. In step S111, it determines whether the parallax variable S exceeds 2. If not, the process returns to step S109; if it does, the process proceeds to step S112.
 In step S112, the parallax image data generation unit 233 assigns 1 to the parallax variable S and increments the coordinate variable i. In step S113, it determines whether the coordinate variable i exceeds i0. If not, the process returns to step S105; if it does, the process proceeds to step S114.
 In step S114, the parallax image data generation unit 233 assigns 1 to the coordinate variable i and increments the coordinate variable j. In step S115, it determines whether the coordinate variable j exceeds j0. If not, the process returns to step S105; if it does, the process proceeds to step S116.
 By the time the process reaches step S116, the pixel values of all the left and right pixels for Cset have been obtained, so the parallax image data generation unit 233 arranges these pixel values to generate plane image data. For example, when Cset = 1, the RLtc plane data and the RRtc plane data are generated.
 Proceeding to step S117, the parallax image data generation unit 233 assigns 1 to the coordinate variable j and increments the color variable Cset. In step S118, it determines whether the color variable Cset exceeds 3. If not, the process returns to step S105. If it does, all of the left-viewpoint color image data (RLtc plane data, GLtc plane data, BLtc plane data) and the right-viewpoint color image data (RRtc plane data, GRtc plane data, BRtc plane data) have been obtained, and the process returns to the flow of FIG. 19.
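 The nested iteration of steps S104 to S118 can be sketched as below. This is a hypothetical illustration: `determine_c` stands in for steps S107-S108, `apply_c` stands in for equations (1)/(2) (which are not reproduced here), and all names and data layouts are assumptions made for the sketch.

```python
def generate_parallax_planes(n_planes, lt_planes, rt_planes,
                             determine_c, apply_c, i0, j0):
    """Sketch of the loop in FIG. 20: for each color plane, each pixel
    position, and each parallax side, compute an adjusted pixel value.

    `n_planes`, `lt_planes`, `rt_planes` index the interpolated
    no-parallax / left / right plane data by color (1=red, 2=green,
    3=blue), row, and column.
    """
    out = {}
    for cset in (1, 2, 3):                      # S104 / S117-S118: color loop
        left, right = [], []
        for j in range(j0):                     # S114-S115: row loop
            row_l, row_r = [], []
            for i in range(i0):                 # S112-S113: column loop
                n = n_planes[cset][j][i]        # S105: no-parallax value
                lt = lt_planes[cset][j][i]      # S106: left/right values
                rt = rt_planes[cset][j][i]
                c = determine_c(lt, rt)         # S107-S108: parameter C
                for s in (1, 2):                # S109-S111: parallax loop
                    value = apply_c(n, lt, rt, c, side=s)
                    (row_l if s == 1 else row_r).append(value)
            left.append(row_l)
            right.append(row_r)
        out[cset] = (left, right)               # S116: assemble plane data
    return out
```

 Running the three color planes through the same pixel loop mirrors the flow returning to step S105 until Cset exceeds 3.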
<Second Embodiment>
 As a second embodiment, a processing flow for moving image shooting that adjusts the amount of parallax by changing both the imaging conditions that affect the amount of parallax and the value of the stereoscopic adjustment parameter C will be described. FIG. 21 is the processing flow for moving image shooting according to the second embodiment. Processing steps corresponding to those in the processing flow of FIG. 19 are given the same step numbers, and their description is omitted except for differing and additional processing.
 In this flow, when the control unit 201 detects the recording start instruction in step S15 (YES in step S15), the process proceeds to step S16. In step S16, the determination unit 232 changes the shooting conditions, and the process proceeds to step S17.
 Specifically, in step S16 the determination unit 232 changes the aperture value as described with reference to FIG. 15, or changes the focus lens position as described with reference to FIG. 16. Other imaging conditions that affect the amount of parallax may also be changed so that the amount of parallax falls within the set parallax range. For example, if the photographing lens 20 is a zoom lens, the focal length can also be changed.
 Then, in the processing of step S107 within step S33, the calculation unit 231 calculates, from the left and right parallax image data, the unadjusted amount of parallax for the subject displayed at the pixel at the target pixel position (i, j). This amount of parallax already reflects the adjustment made by changing the aperture value, the focus position, and so on. Also, in this flow, if the control unit 201 determines in step S21 that no recording stop instruction has been received from the photographer, it returns to step S16 and processes the next frame.
 Next, a preferable opening shape for the aperture mask described with reference to FIG. 2 will be described. FIG. 22 is a diagram illustrating the preferable opening shape.
 The opening 105 of the parallax Lt pixel and the opening 106 of the parallax Rt pixel are preferably displaced in mutually opposite directions relative to their respective pixels while each still including the pixel center. Specifically, each of the openings 105 and 106 preferably has a shape that contacts a virtual center line 322 passing through the pixel center, or a shape that straddles the center line 322.
 In particular, as illustrated, the shape of the opening 105 and the shape of the opening 106 are preferably identical to the respective shapes obtained by dividing the shape of the opening 104 of the no-parallax pixel along the center line 322. In other words, the shape of the opening 104 is preferably equal to the shape formed by placing the shapes of the openings 105 and 106 adjacent to each other.
 In the above description, the calculation formulas used by the parallax image data generation unit 233 were the above equations (1) and (2), which are based on the weighted arithmetic mean, but the formulas are not limited to these, and various other calculation formulas can be adopted. For example, using the weighted geometric mean, expressed in the same manner as equations (1) and (2),

[Math 3]

[Math 4]

 can be adopted as the calculation formulas. In this case, the amount of blur that is maintained is not the amount of blur due to the output of the no-parallax pixels but the amount of blur due to the output of the parallax pixels.
 As other calculation formulas, expressed in the same manner as equations (1) and (2),

[Math 5]

[Math 6]

 may be adopted. In this case, the cube root term does not change when each of GLtcmn, GRtcmn, BLtcmn, and BRtcmn is calculated.
 Furthermore,

[Math 7]

[Math 8]

 may be adopted. In this case as well, the cube root term does not change when each of GLtcmn, GRtcmn, BLtcmn, and BRtcmn is calculated.
 Next, cooperation with a display device will be described. FIG. 23 is a diagram illustrating cooperation between the digital camera 10 and a TV monitor 80. The TV monitor 80 includes a display unit 40 composed of, for example, a liquid crystal panel, a memory card IF 81 that accepts the memory card 220 removed from the digital camera 10, a remote controller 82 operated by the viewer, and so on. The TV monitor 80 supports 3D image display. The display format of the 3D image is not particularly limited. For example, the right-eye image and the left-eye image may be displayed in a time-division manner, or in an interlaced format in which they are arranged in horizontal or vertical strips. A side-by-side format, in which they are arranged on opposite sides of the screen, may also be used.
 The TV monitor 80 decodes a moving image file formatted to include the left-viewpoint color image data and the right-viewpoint color image data, and displays a 3D image on the display unit 40. In this case, the TV monitor 80 functions as a general display device that displays standardized moving image files.
 However, the TV monitor 80 can also be made to function as an image processing device that takes on at least part of the functions of the control unit 201 and at least part of the functions of the image processing unit 205 described with reference to FIG. 1. Specifically, an image processing unit including the calculation unit 231, the determination unit 232, the parallax image data generation unit 233, and the moving image generation unit 234 described in FIG. 1 is incorporated into the TV monitor 80. With this configuration, a division of functions different from that achieved by the combination of the digital camera 10 and the TV monitor 80 in the second embodiment can be realized.
 Specifically, the digital camera 10 does not execute image processing based on the stereoscopic adjustment parameter, and instead associates the depth information detected by the depth information detection unit 230 with the generated captured image data. The TV monitor 80, acting as the image processing device, then refers to the associated depth information to determine the value of the stereoscopic adjustment parameter C for each subject, and executes image processing using the stereoscopic adjustment parameter C on the acquired image data. The TV monitor 80 displays the 3D image with the amount of parallax adjusted in this way on the display unit 40.
 In the above modification, the viewer may be allowed to input some of the adjustment information during playback on the TV monitor 80. For example, the viewer can input a parallax amount range by operating the remote controller 82. The TV monitor 80 acquires the input parallax amount range as adjustment information, and the determination unit 232 determines the value of the stereoscopic adjustment parameter C in accordance with this parallax amount range. With this configuration, the TV monitor 80 can display 3D images tailored to each viewer's preferences.
 According to the embodiments above, the amount of parallax of each of the plural subjects included in the image of the captured image data is calculated, the value of the stereoscopic adjustment parameter C is determined for each of the plural subjects in accordance with the calculated amount of parallax, and this value of the stereoscopic adjustment parameter C is applied to the captured image data to generate parallax image data. Therefore, unlike the case where a single value of the stereoscopic adjustment parameter C is applied to the entire captured image data, the adjustment amount of the parallax can be made intentionally different between the main subject and subjects that are not the main subject. Accordingly, the stereoscopic effect can be retained for the main subject while the amount of parallax is suppressed for subjects that are not the main subject, reducing discomfort and fatigue during viewing.
 Furthermore, for a subject whose amount of parallax exceeds a threshold among the plural subjects included in the image, the value of the stereoscopic adjustment parameter C is determined so that the amount of parallax is adjusted to below the threshold, so discomfort and fatigue during viewing can be reliably reduced.
 Furthermore, when at least some of the plural subjects included in the image are continuous in the depth direction, the value of the stereoscopic adjustment parameter C is determined so that the adjusted amount of parallax is continuous in the depth direction with respect to these subjects. This prevents the amount of parallax from changing discretely between subjects that are continuous in the depth direction, so discomfort and fatigue during viewing can be reliably reduced.
 Furthermore, when at least some of the subjects included in the image are continuous in the depth direction, the value of the stereoscopic adjustment parameter C is determined so that the differential value of the adjusted amount of parallax is continuous in the depth direction with respect to these subjects. The change in the amount of parallax between subjects continuous in the depth direction can therefore be made smooth, so discomfort and fatigue during viewing can be reliably reduced.
 Furthermore, since the user designates, from among the plural subjects included in the image, the subject whose amount of parallax is to be suppressed by applying the value of the stereoscopic adjustment parameter C, the amount of parallax of the designated subject is suppressed. Accordingly, discomfort and fatigue during viewing can be reliably reduced.
 Furthermore, the luminance value of each of the plural subjects included in the image is calculated, and the value of the stereoscopic adjustment parameter C is determined so that the amount of parallax becomes smaller as the luminance value becomes smaller. This prevents a large parallax from being given to regions of the image with small luminance values, that is, regions of low brightness, so discomfort and fatigue during viewing can be reliably reduced.
 In the embodiments above, the TV monitor 80 has been described as an example of the image processing device, but the image processing device can take various forms. For example, any device that includes or is connected to a display unit, such as a PC, a mobile phone, or a game console, can serve as the image processing device.
 The embodiments above have also been described on the premise of moving image shooting, but the configuration of outputting image data whose amount of parallax has been adjusted based on the detected depth information can of course also be applied to still image shooting. A still image shot in this way does not produce extreme parallax between the left and right images and does not give the viewer a sense of discomfort.
 In the embodiments above, the target subject is accepted through a user instruction, but the control unit 201 may automatically select the target subject. For example, through person recognition processing, the control unit 201 can limit the target subjects to the human figures included in the scene.
 The embodiments above have also been described as generating the parallax image data (the Ltc parallax plane data and the Rtc parallax plane data) from captured image data. However, the parallax image data may also be generated from image data that is not a captured image (such as a CG image).
 In the embodiments above, the determination unit 232 has been described as determining the value of the stereoscopic adjustment parameter C using a single lookup table 2310. However, a plurality of lookup tables 2310 may be stored in the determination unit 232, and the value of the stereoscopic adjustment parameter C may be determined using a lookup table 2310 selected by the photographer. In this case, the adjustment amount of the parallax for each subject can be selected. Note that selecting a lookup table 2310 is equivalent to selecting the function used to determine the target value of the amount of parallax, that is, a function such as the adjusted parallax amount curve 1623.
 In this case, the variations in the adjustment amount of the parallax can be increased. Note that the plurality of lookup tables 2310 can be generated from adjusted parallax amount curves having different shapes.
 In the embodiments above, the value of the stereoscopic adjustment parameter C has been described as being determined using the lookup table 2310. However, functions of the adjusted parallax amount curve 1623 may be stored in the digital camera 10 for each imaging condition in accordance with the principle described above, and the value of the stereoscopic adjustment parameter C may be determined using these functions together with information on the subject distribution in the depth direction and the focus position. Furthermore, in this case, a plurality of functions of the adjusted parallax amount curve 1623 may be stored in the digital camera 10 for the same shooting condition, and the user may select one of the functions for use by operating the operation unit 208.
 In the embodiments above, when some subjects are continuous in the depth direction of the scene, the value of the stereoscopic adjustment parameter C has been described as being determined so that the differential value of the adjusted amount of parallax is continuous in the depth direction with respect to these continuous subjects. However, in such a case, the value of the stereoscopic adjustment parameter C need not necessarily be determined so that the differential value is continuous.
 For example, as shown in FIG. 24, for subjects among the plural subjects included in the image whose unadjusted amount of parallax is not within the range of -m to +m, that is, subjects located in the region enclosed by the frame W1 in the figure, the value of the stereoscopic adjustment parameter C may be determined so that the amount of parallax is adjusted to ±m. In this case, the differential value becomes discontinuous at the boundary between the frame W1 and the frame W2. Since only amounts of parallax outside the range of -m to +m need to be adjusted, the processing is simplified, and because the parallax of subjects within the range of -m to +m is not suppressed, their stereoscopic effect is not lost.
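 The FIG. 24 variation amounts to clamping: the sketch below (an illustration, not code from the specification) passes in-range parallax through unchanged and pins out-of-range values to exactly ±m, producing the derivative discontinuity at the ±m boundary noted above.

```python
def clamp_parallax(d, m):
    """Adjust parallax only when it falls outside -m..+m.

    Values within the range pass through unchanged, preserving the
    stereoscopic effect of those subjects; values outside are set to
    exactly +m or -m (the frame W1 region in FIG. 24)."""
    if d > m:
        return m
    if d < -m:
        return -m
    return d
```

 Compared with a smooth curve, this keeps the adjustment trivial to compute at the cost of a non-smooth transition at the boundary.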
 In the embodiments above, when the subjects included in the image are continuous in the depth direction of the scene, the amount of parallax has been described as transitioning smoothly in the depth direction between these subjects. However, the subjects continuous in the depth direction of the scene may be only some of the subjects included in the image.
 In the embodiments above, the threshold for the allowable amount of parallax has been described as ±m. However, values with mutually different absolute values may be used for the upper and lower limits of the allowable amount of parallax.
 Each processing flow described in the embodiments above is executed by a control program that controls the control unit. The control program is recorded in a built-in non-volatile memory and is loaded into a work memory as appropriate to execute each process. Alternatively, a control program recorded on a server may be transmitted to each device via a network and loaded into the work memory to execute each process, or a control program recorded on a server may be executed on the server, with each device executing processing in accordance with control signals transmitted via the network.
 The present invention has been described above using embodiments, but the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various changes or improvements can be made to the above embodiments, and it is apparent from the claims that embodiments incorporating such changes or improvements can also be included in the technical scope of the present invention.
 The order of execution of processes such as operations, procedures, steps, and stages in the devices, systems, programs, and methods shown in the claims, the specification, and the drawings may be realized in any order unless the order is explicitly indicated by expressions such as "before" or "prior to", or unless the output of a previous process is used in a subsequent process. Even where the operational flows in the claims, the specification, and the drawings are described using "first", "next", and the like for convenience, this does not mean that execution in that order is essential.
10 digital camera, 20 photographing lens, 21 optical axis, 22 aperture, 50 eyeball, 51 right eye, 52 left eye, 40 display unit, 61, 62, 71, 72 subject, 80 TV monitor, 81 memory card IF, 82 remote controller, 100 image sensor, 104, 105, 106 opening, 110 basic grid, 201 control unit, 202 A/D conversion circuit, 203 memory, 204 drive unit, 205 image processing unit, 207 memory card IF, 208 operation unit, 209 display unit, 210 LCD drive circuit, 220 memory card, 230 depth information detection unit, 231 calculation unit, 232 determination unit, 233 parallax image data generation unit, 234 moving image generation unit, 301 adult, 302 boy, 303 girl, 310 frame, 320 title, 322 center line, 1610, 1611 contrast curve, 1620, 1621, 1622, 1624, 1626, 1628 parallax amount curve, 1623, 1625 adjusted parallax amount curve, 1627, 1629 curve, 1804, 1805, 1807, 1808 distribution curve, 1806, 1809 composite distribution curve, 1901 Lt distribution curve, 1902 Rt distribution curve, 1903 2D distribution curve, 1905 adjusted Lt distribution curve, 1906 adjusted Rt distribution curve, 2083 touch panel, 2310 lookup table

Claims (14)

  1.  An image processing device comprising:
     an acquisition unit that acquires image data;
     a calculation unit that calculates a parallax amount of a subject included in an image of the image data;
     a determination unit that determines, according to the parallax amount, an adjustment value for adjusting the parallax amount, the adjustment value being applied when parallax image data is generated from the image data; and
     a generation unit that applies the adjustment value to the image data to generate parallax image data representing an image having the adjusted parallax amount.
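The acquire → calculate → determine → generate flow recited in claim 1 can be sketched minimally as follows; the cross-correlation parallax search, the gain-style adjustment value, and every name and parameter here are illustrative assumptions, not the claimed implementation:

```python
import numpy as np

def calculate_parallax(left_row: np.ndarray, right_row: np.ndarray) -> int:
    """Estimate the parallax amount of a subject as the shift that best
    aligns one row of the right image with the left image
    (brute-force cross-correlation search; illustrative only)."""
    shifts = list(range(-8, 9))
    errors = [np.sum((np.roll(right_row, s) - left_row) ** 2) for s in shifts]
    return shifts[int(np.argmin(errors))]

def determine_adjustment(parallax: float, limit: float = 4.0) -> float:
    """Determine an adjustment value according to the parallax amount:
    a gain of 1 inside the limit, a compressing gain beyond it."""
    return 1.0 if abs(parallax) <= limit else limit / abs(parallax)

def generate(parallax: float, adjustment: float) -> float:
    """Apply the adjustment value to obtain the adjusted parallax amount
    used when rendering the output parallax images."""
    return parallax * adjustment
```

With these pieces, a raw parallax of 8 and a limit of 4 yields a gain of 0.5 and an adjusted parallax of 4, i.e. large parallax is compressed down to the limit while small parallax passes through unchanged.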
  2.  The image processing device according to claim 1, wherein, when the absolute value of the parallax amount calculated by the calculation unit exceeds a predetermined threshold, the determination unit determines the adjustment value such that the parallax amount is adjusted to be less than the threshold.
  3.  The image processing device according to claim 1 or 2, wherein, when the subject is continuous in a depth direction of a scene, the determination unit determines the adjustment value such that the adjusted parallax amount of the subject is continuous in the depth direction.
  4.  The image processing device according to claim 3, wherein, when the subject is continuous in the depth direction of the scene, the determination unit determines the adjustment value such that a differential value of the adjusted parallax amount is continuous in the depth direction.
  5.  The image processing device according to any one of claims 1 to 4, wherein the calculation unit calculates a parallax amount for each of a plurality of subjects, and the determination unit determines the adjustment value for each of the plurality of subjects according to the calculated parallax amount.
  6.  The image processing device according to any one of claims 1 to 5, further comprising a selection unit that allows a user to select a function used by the determination unit to define the adjusted parallax amount.
  7.  The image processing device according to any one of claims 1 to 6, further comprising a designation unit that allows a user to designate a subject whose parallax amount is to be suppressed by applying the adjustment value.
  8.  The image processing device according to any one of claims 1 to 7, further comprising an arithmetic unit that computes a luminance value of the subject, wherein the determination unit determines the adjustment value such that the parallax amount decreases as the luminance value decreases.
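Claim 8's rule, smaller parallax for lower luminance, can be sketched with a simple gain applied to the parallax amount; the linear mapping and all names here are assumptions for illustration only:

```python
def luminance_gain(luminance: float, max_luminance: float = 255.0) -> float:
    """Return an adjustment gain in [0, 1] that shrinks as the subject's
    luminance value decreases (claim 8). The linear mapping from
    luminance to gain is an illustrative assumption."""
    return max(0.0, min(1.0, luminance / max_luminance))

def adjust_by_luminance(parallax: float, luminance: float) -> float:
    """Apply the luminance-dependent gain: dark subjects end up with
    a smaller adjusted parallax amount than bright ones."""
    return parallax * luminance_gain(luminance)
```

For example, a subject at full luminance keeps its parallax, one at half luminance gets half of it, and a black subject is rendered with zero parallax.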
  9.  The image processing device according to any one of claims 1 to 8, wherein the image data includes first parallax image data and second parallax image data having parallax with respect to each other, and the generation unit applies the adjustment value to the first parallax image data and the second parallax image data to generate third parallax image data and fourth parallax image data representing images whose mutual parallax amount differs from the parallax amount between the images of the first parallax image data and the second parallax image data.
  10.  An imaging device comprising:
     an image sensor including a first parallax pixel group that receives, from an incident light flux passing through an optical system, a first partial light flux deviated in a first direction with respect to an optical axis of the optical system, and a second parallax pixel group that receives a second partial light flux deviated in a second direction different from the first direction; and
     the image processing device according to any one of claims 1 to 9, wherein the acquisition unit acquires the image data including first parallax image data generated based on an output of the first parallax pixel group and second parallax image data generated based on an output of the second parallax pixel group.
  11.  An image processing device comprising:
     an acquisition unit that acquires image data;
     a calculation unit that calculates a parallax amount of a subject included in an image of the image data;
     a detection unit that detects a distance of the subject in a depth direction based on the parallax amount;
     a determination unit that adjusts the parallax amount for a portion of the subject whose distance in the depth direction exceeds a predetermined range; and
     a generation unit that generates parallax image data having the adjusted parallax amount.
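Claim 11's depth-gated adjustment can be sketched using the standard stereo relation Z = f·B/d; the hard clamp below is only an illustration (a strictly monotone compression would be needed to also preserve depth ordering among clamped points, as claim 13 requires), and all names and parameters are assumptions:

```python
def depth_from_parallax(parallax: float, baseline: float, focal: float) -> float:
    """Detect depth from parallax via the standard stereo relation
    Z = f * B / d (valid for parallax != 0)."""
    return focal * baseline / parallax

def adjust_outside_range(parallax: float, baseline: float, focal: float,
                         near: float, far: float) -> float:
    """Adjust the parallax only for subject portions whose detected
    depth falls outside [near, far] (claim 11): depth is clamped to
    the range edge and re-projected back into parallax. Portions
    inside the range keep their original parallax untouched."""
    z = depth_from_parallax(parallax, baseline, focal)
    z_clamped = min(max(z, near), far)
    if z_clamped == z:
        return parallax                      # inside range: unchanged
    return focal * baseline / z_clamped      # re-project clamped depth
```

With f = 1000 and B = 0.1 (so f·B = 100) and a range of [5, 50], a parallax of 10 (depth 10) is untouched, while a parallax of 100 (depth 1, too close) is reduced to 20, the parallax of the near edge.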
  12.  The image processing device according to claim 11, wherein, for the portion of the subject whose distance in the depth direction exceeds the predetermined range, the determination unit adjusts the parallax amount such that the parallax amount falls below a predetermined range.
  13.  The image processing device according to claim 11 or 12, wherein the determination unit adjusts the parallax amount while maintaining the positional relationship of the subject in the depth direction.
  14.  An image processing program causing a computer to execute:
     an acquisition step of acquiring image data;
     a calculation step of calculating a parallax amount of a subject included in an image of the image data;
     a determination step of determining, for the parallax amount, an adjustment value for adjusting the parallax amount, the adjustment value being applied when parallax image data is generated from the image data; and
     a generation step of applying the adjustment value to the image data to generate parallax image data representing images having the mutually adjusted parallax amount.
PCT/JP2015/062202 2014-04-22 2015-04-22 Image processing device, imaging device and image processing program WO2015163350A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2014-088496 2014-04-22
JP2014088496A JP2017108194A (en) 2014-04-22 2014-04-22 Image processing apparatus, imaging device and image processing program

Publications (1)

Publication Number Publication Date
WO2015163350A1 true WO2015163350A1 (en) 2015-10-29

Family

ID=54332513

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/062202 WO2015163350A1 (en) 2014-04-22 2015-04-22 Image processing device, imaging device and image processing program

Country Status (2)

Country Link
JP (1) JP2017108194A (en)
WO (1) WO2015163350A1 (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011259012A (en) * 2010-06-04 2011-12-22 Fujifilm Corp Three-dimensional image reproduction device and three-dimensional image reproduction method
JP2011259045A (en) * 2010-06-07 2011-12-22 Sony Corp Stereoscopic image display device, parallax conversion device, parallax conversion method and program
JP2012213070A (en) * 2011-03-31 2012-11-01 Toshiba Corp Video signal generation device, video signal generation method, and control program
WO2012153378A1 (en) * 2011-05-06 2012-11-15 富士通株式会社 Stereoscopic image generation device, stereoscopic image generation method, stereoscopic image generation program
JP2013118515A (en) * 2011-12-02 2013-06-13 Canon Inc Image processing apparatus and method
JP2013229764A (en) * 2012-04-25 2013-11-07 Nikon Corp Image processing apparatus, image pickup apparatus, and image processing program


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113099120A (en) * 2021-04-13 2021-07-09 南昌虚拟现实研究院股份有限公司 Depth information acquisition method and device, readable storage medium and depth camera
CN113099120B (en) * 2021-04-13 2023-04-18 南昌虚拟现实研究院股份有限公司 Depth information acquisition method and device, readable storage medium and depth camera

Also Published As

Publication number Publication date
JP2017108194A (en) 2017-06-15

Similar Documents

Publication Publication Date Title
US8885026B2 (en) Imaging device and imaging method
US9077976B2 (en) Single-eye stereoscopic image capturing device
JP5814692B2 (en) Imaging apparatus, control method therefor, and program
JP5788518B2 (en) Monocular stereoscopic photographing apparatus, photographing method and program
JP5368350B2 (en) Stereo imaging device
JP6036840B2 (en) Imaging apparatus, image processing apparatus, control program for imaging apparatus, and control program for image processing apparatus
WO2012001970A1 (en) Image processing device, method, and program
JP5449551B2 (en) Image output apparatus, method and program
JP2014036362A (en) Imaging device, control method therefor, and control program
US9124866B2 (en) Image output device, method, and recording medium therefor
JP6004741B2 (en) Image processing apparatus, control method therefor, and imaging apparatus
WO2015163350A1 (en) Image processing device, imaging device and image processing program
JP6439285B2 (en) Image processing apparatus, imaging apparatus, and image processing program
JP2012124650A (en) Imaging apparatus, and imaging method
JPWO2012001958A1 (en) Image processing apparatus and method, and program
JP6070061B2 (en) Image processing apparatus, photographing apparatus, image processing method, and program
JP6070060B2 (en) Imaging apparatus and program
JP2024008154A (en) Image processing device, display device, and image processing method
JP6255753B2 (en) Image processing apparatus and imaging apparatus
JP2014085608A (en) Imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15782683

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 15782683

Country of ref document: EP

Kind code of ref document: A1