WO2012124331A1 - Three-dimensional image pickup device - Google Patents

Three-dimensional image pickup device

Info

Publication number
WO2012124331A1
WO2012124331A1 PCT/JP2012/001798 JP2012001798W
Authority
WO
WIPO (PCT)
Prior art keywords
camera
image
distance
imaging
cameras
Prior art date
Application number
PCT/JP2012/001798
Other languages
French (fr)
Japanese (ja)
Inventor
東郷 仁麿
孝幸 有馬
井村 康治
和之 田中
中村 剛
郁雄 渕上
山口 徹
Original Assignee
パナソニック株式会社 (Panasonic Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2011058832A external-priority patent/JP2014102266A/en
Priority claimed from JP2011058827A external-priority patent/JP2014103431A/en
Priority claimed from JP2011058826A external-priority patent/JP2014102265A/en
Application filed by パナソニック株式会社 (Panasonic Corporation)
Publication of WO2012124331A1 publication Critical patent/WO2012124331A1/en

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/243Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/25Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor

Definitions

  • the present invention relates to camera-equipped portable electronic devices, such as mobile phones and digital still cameras, having a 3D camera capable of 3D imaging.
  • FIGS. 24 (a), (b) and (c) show a conventional structure example of a 3D camera consisting of two cameras. For example, details are described in Patent Document 1.
  • the two cameras 12 and 13 with an optical zoom are disposed apart from each other on the back surface of the display element 15 of the housing 11 of the imaging device.
  • FIG. 24C shows a block diagram of the imaging apparatus.
  • the first camera 12 and the second camera 13 with optical zoom are controlled by the camera control means 18.
  • the positions of the lenses 16 of the first camera 12 and the second camera 13 are adjusted based on the adjustment value of the look-up table so that the optical zoom magnifications of the first camera 12 and the second camera 13 coincide.
  • FIG. 25 shows a block diagram of the imaging apparatus.
  • Two cameras 22 and 23 are disposed apart from each other on the back surface of the display element of the housing.
  • the first camera 22 has an optical zoom
  • the second camera 23 is a single focus camera without an optical zoom. Further, the number of pixels of the first camera 22 is larger than the number of pixels of the second camera 23.
  • the first camera 22 with an optical zoom and the second camera 23 of a single focus type are controlled by a camera control means 28.
  • the optical zoom magnification of the first camera 22 and the electronic zoom magnification of the second camera 23 are adjusted so that the zoom magnifications of the first camera 22 and the second camera 23 match.
  • resolution enhancement of the second camera image is performed as follows.
  • a region of the first camera image (referred to as a corresponding block) corresponding to a part of the second camera image (referred to as a block) is searched for by image processing, and the block of the second camera image is replaced with the corresponding block of the first camera image.
  • the resolution of the second camera image can be increased, and the difference in resolution between the first camera image and the second camera image can be reduced.
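The corresponding-block search described above is essentially template matching along the stereo baseline. The following is a minimal Python sketch of the idea using the sum of absolute differences (SAD) as the matching cost; the function names and the toy images are illustrative, not from the patent.

```python
# Find the block in the high-resolution first-camera image that best matches
# a block of the low-resolution second-camera image.  Images are plain 2-D
# lists of grey values; the search runs horizontally, along the baseline.

def sad(block_a, block_b):
    """Sum of absolute differences between two equally sized blocks."""
    return sum(abs(a - b) for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def crop(img, x, y, w, h):
    """Cut a w x h block out of img at top-left corner (x, y)."""
    return [row[x:x + w] for row in img[y:y + h]]

def find_corresponding_block(high_img, block, y, max_shift, w, h):
    """Search horizontally for the best-matching corresponding block."""
    best_x, best_cost = 0, float("inf")
    for x in range(0, max_shift + 1):
        cost = sad(crop(high_img, x, y, w, h), block)
        if cost < best_cost:
            best_x, best_cost = x, cost
    return best_x, best_cost

# Toy example: a bright 2x2 patch shifted by 3 pixels between the two images.
high = [[0] * 8 for _ in range(4)]
low = [[0] * 8 for _ in range(4)]
for dy in range(2):
    for dx in range(2):
        high[1 + dy][3 + dx] = 200
        low[1 + dy][0 + dx] = 200

block = crop(low, 0, 1, 2, 2)
x, cost = find_corresponding_block(high, block, 1, 6, 2, 2)
print(x, cost)  # best horizontal offset and its SAD cost
```

In a real implementation the matched high-resolution block would then replace the low-resolution block, as the description states.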
  • FIGS. 26 (a), (b) and (c) show an example of a conventional structure of a 3D camera consisting of two cameras. For example, details are described in Patent Document 3.
  • the two cameras 32 and 33 are disposed apart from each other in the housing 31 of the imaging apparatus. As shown in FIG. 26 (b), it is possible to mechanically change the distance between optical axes (baseline length) of the camera and to change the angle of the optical axis of the camera.
  • FIG. 26C shows a block diagram.
  • the first camera 32 and the second camera 33 are controlled by the camera control means 38.
  • position adjustment (parallax adjustment) of the first camera 32 and the second camera 33 in the horizontal direction is also performed.
  • the parallax means the difference in the horizontal position of the left and right images with respect to a certain subject.
  • the subject can be freely placed in front of or behind the 3D display screen.
  • FIG. 27 shows the relationship between the subject distance from the 3D camera to the subject and the parallax angle (convergence angle) of the left and right images when the optical axes of the 3D camera are parallel.
  • the three-dimensional image reproduced on the 3D display may appear in front of or behind the display screen.
  • When the deviation from the convergence angle at the display screen exceeds a certain value, comfortable stereoscopic viewing becomes difficult; if the deviation grows larger still, stereoscopic viewing itself becomes impossible.
  • For comfortable viewing, the parallax angle should differ from that of the display screen by about ±1 degree or less.
  • When the baseline length is 6.5 cm and the subject distance ranges from about 1 m to infinity, there is no problem because the parallax angle stays within 2 degrees (±1 degree), as shown in FIG. 27.
  • the parallax angle can be made 2 degrees or less even for a short distance object of about 50 cm.
  • As the subject approaches the camera, the convergence angle increases and the subject appears to jump out of the display screen, making stereoscopic viewing difficult.
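The geometry behind this can be sketched numerically. Assuming parallel optical axes and a subject centered between the cameras, the convergence angle for baseline b and subject distance d is 2·atan(b/(2d)). The exact comfort thresholds quoted in the description depend on how the parallax angle relative to the display screen is defined, so the values below only illustrate the trend, not the patent's figures.

```python
import math

# Raw geometric convergence angle for a subject at distance d (metres)
# seen by two cameras separated by baseline b (metres), axes parallel.
def convergence_angle_deg(baseline_m, distance_m):
    return math.degrees(2.0 * math.atan(baseline_m / (2.0 * distance_m)))

b = 0.065  # 6.5 cm baseline, as in the interocular-distance example
for d in (5.0, 1.0, 0.5, 0.25):
    print(f"{d:5.2f} m -> {convergence_angle_deg(b, d):5.2f} deg")
```

The angle grows steeply as the subject approaches, which is exactly why close subjects become hard to view stereoscopically.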
  • Patent Document 4 describes a method of electronically changing the convergence angle.
  • the convergence angle can be changed by adjusting the cutout horizontal position of the left and right images.
  • Patent Document 5 describes a method of generating a parallax image such that a specific object in a subject (object) in the 3D expression space falls within the recommended parallax range.
  • the parallax adjustment is performed so that the designated object is displayed near the surface of the display screen.
  • FIGS. 28 (a) and 28 (b) show an example of the conventional structure of a 3D camera consisting of a twin-lens camera.
  • the first camera 42 and the second camera 43 are disposed apart from each other in the housing 41 of the imaging device.
  • a display element 45 for displaying a 3D photographed image is disposed on the back surface of the housing 41.
  • the display element 45 is, for example, a parallax barrier type naked eye 3D liquid crystal or the like.
  • FIG. 29 shows a state where the display screen is made vertically elongated by rotating the housing 41 by 90 degrees. When the subject is vertically long, shooting is often performed in this orientation.
  • FIG. 30(a) shows landscape shooting, and FIG. 30(b) shows portrait shooting.
  • shooting is performed by two cameras 52 and 53 which are disposed apart from each other in the direction of the long side of the display element 55 as in the conventional case.
  • the camera sensors 57 of the left and right cameras are both arranged horizontally long.
  • the arrangement of the camera sensor 57 is changed from horizontal to vertical by a mechanism in which the first camera 52 and the second camera 53 rotate 90 degrees with respect to the camera optical axis.
  • the display element 55 also rotates 90 degrees in the same direction as the rotation direction of the camera, thereby enabling display of a vertically long 3D image.
  • the display device 55 needs a 3D display device capable of vertical display and horizontal display.
  • Another method of generating a 3D image with a 3D camera uses parallax information (or distance information). After measuring the parallax from the main image and the sub image of a twin-lens camera whose main image sensor and sub image sensor are disposed spatially apart, a 3D image can be generated from the parallax information and the main image.
  • Japanese Patent No. 3303254; Japanese Patent Application Laid-Open No. 2005-210217; Japanese Patent Application Laid-Open No. 07-167633; Japanese Patent Application Laid-Open No. 08-251625; Japanese Patent Application Laid-Open No. 2004-220127; Japanese Patent Application Laid-Open No. 10-224820; Japanese Patent Laid-Open Publication No. 2005-20606
  • The 3D optical zoom camera described in Patent Document 1 of the first prior art needs two expensive and large optical zoom cameras, so there is a problem that the portable terminal becomes expensive and large.
  • The 3D optical zoom camera described in Patent Document 2 of the second prior art is inexpensive and can be miniaturized because one side is a single-focus camera. However, the software processing for resolution conversion takes too much time for moving picture shooting. Further, corresponding blocks (or corresponding pixels) of the optical zoom camera image and the single focus camera image may not be found, in which case the single focus camera image is partially degraded to low resolution.
  • Although the first and second conventional examples assume that the first cameras 12 and 22 have an optical zoom function, the same problem occurs in a simple 3D camera consisting of a high resolution third camera and a low resolution fourth camera. The issues are described below.
  • An image cut out from the third camera with the maximum number of pixels of AL is taken as a third camera generated image, and the number of pixels is BL (BL ⁇ AL).
  • an image cut out of the fourth camera (AR ⁇ AL) with the maximum number of pixels of AR is taken as a fourth camera generated image, and the number of pixels is BR (BR ⁇ AR).
  • the 3D image having the third camera generated image and the fourth camera generated image as the left and right images has a large resolution difference and becomes a 3D image that is difficult to view.
  • When the resolutions of the left and right images differ, the image quality of the 3D image is degraded.
  • The 3D camera described in Patent Document 3 of the third prior art can achieve both the 3D effect and viewability by changing the relative arrangement of the cameras, but since a function for continuously moving the two cameras is required, there is a problem that the camera becomes expensive and large.
  • Since the 3D camera described in Patent Document 4 can change the horizontal distance (parallax) of the subject in the left and right images by the image cutout position, a complicated mechanical structure as in Patent Document 3 is unnecessary. However, only horizontal parallax adjustment is possible, and the relative sense of depth between subjects (the 3D effect) cannot be adjusted. Therefore, in a 3D image containing both a short distance subject and a long distance subject, it has not been possible to adjust all subjects to a comfortable parallax.
  • When a short distance subject pops out of the display screen too strongly, the image is hard to view; but if its parallax is reduced to weaken the pop-out, a distant subject may be pushed so far back that it becomes hard or impossible to view stereoscopically.
  • Patent Document 6 proposes a method capable of realizing vertical 3D shooting, but it requires a mechanism that physically rotates the two cameras about the camera optical axis and a rotation mechanism for the display element, which makes the imaging device larger.
  • An object of the present invention is to provide a simple 3D imaging apparatus capable of generating a 3D image by a method suitable for an imaging state.
  • A 3D imaging apparatus of the present invention includes: two cameras having different optical axes; a distance measuring unit that measures the distance to a subject from the parallax of the two camera images captured by the two cameras; a first 3D image generation unit that generates a 3D image from the two camera images; and a second 3D image generation unit that generates a 3D image based on the distance to the subject measured by the distance measuring unit and a camera image captured by one of the two cameras. The apparatus switches between generation by the first 3D image generation unit and generation by the second 3D image generation unit according to a shooting state based on the positions of the two cameras relative to the subject.
  • In one aspect, the two cameras are a first camera with an optical zoom function and a single-focus second camera. When the zoom magnification of the first camera is smaller than a predetermined magnification, the first 3D image generation unit generates a 3D image from the optical zoom image of the first camera and the electronic zoom image of the second camera; when the zoom magnification is larger than the predetermined magnification, the second 3D image generation unit generates a 3D image based on the distance to the subject measured by the distance measurement unit and the camera image captured by the first camera.
  • In another aspect, the two cameras are a third camera and a fourth camera having fewer pixels than the third camera. When the number of pixels of the 3D image to be generated is equal to or less than a predetermined number, the first 3D image generation unit generates a 3D image from the camera image captured by the third camera and the camera image captured by the fourth camera; when it exceeds the predetermined number, the second 3D image generation unit generates a 3D image based on the distance to the subject measured by the distance measurement unit and the camera image captured by the third camera.
  • In another aspect, the two cameras are a third camera and a fourth camera having fewer pixels than the third camera. For 3D moving image shooting, the first 3D image generation unit generates a 3D image from the camera image captured by the third camera and the camera image captured by the fourth camera; for 3D still image shooting, the second 3D image generation unit generates a 3D image based on the distance to the subject measured by the distance measurement unit and the camera image captured by the third camera.
  • In the long distance shooting mode, in which the distance to the subject measured by the distance measuring unit is equal to or greater than a predetermined value, the first 3D image generation unit generates a first 3D image from the camera image captured by one of the two cameras and the camera image captured by the other. In the short distance shooting mode, in which the measured distance is less than the predetermined value, the second 3D image generation unit generates a second 3D image based on the measured distance and the camera image captured by one of the two cameras; in the second 3D image, the parallax angle difference between a first object and a second object having different parallax angles is smaller than the parallax angle difference between the same objects in the first 3D image.
  • the 3D imaging device of the present invention further includes a display unit that outputs the first 3D image and/or the second 3D image.
  • With this configuration, both long distance and short distance shooting can be performed with an inexpensive camera configuration, realizing 3D shooting that achieves both a 3D effect and easy viewing.
  • the distance measuring unit calculates the closest distance of the third object closest to the 3D imaging device; when the closest distance is equal to or greater than the predetermined value, the long distance shooting mode is selected, and when it is less than the predetermined value, the short distance shooting mode is selected.
  • the 3D imaging device of the present invention includes a housing and a rectangular display element disposed in the housing, and the two cameras are arranged side by side in the long-side direction of the display element. In the vertical 3D imaging mode, in which the long side of the display element is vertical, the distance measuring unit measures the distance to the subject from the parallax of the two camera images captured by the two cameras, and the second 3D image generation unit generates a 3D image based on the measured distance and the camera image captured by one of the two cameras.
  • both vertical 3D imaging and horizontal 3D imaging can be made compact and inexpensive without physically changing the arrangement of the cameras.
  • In the horizontal 3D imaging mode, in which the long side of the display element is horizontal, the first 3D image generation unit generates a 3D image from the camera images captured by each of the two cameras.
  • the 3D imaging apparatus further includes an orientation sensor that detects whether the long side of the display element is horizontal or vertical, and switches to the vertical 3D imaging mode or the horizontal 3D imaging mode according to the detection result of the orientation sensor.
  • the display element has a 3D display function and displays a 3D preview in the horizontal 3D imaging mode; in the vertical 3D imaging mode, the preview display by the display element is performed with a 2D image using the image of one of the two cameras.
  • preview display can be performed without depending on the generation time of the 3D image.
  • Alternatively, in the vertical 3D imaging mode, the display element performs the preview display with a pseudo 3D image in which parallax is added in the short-side direction of the display element to one camera image of the two cameras.
  • the preview in the vertical 3D shooting mode can be displayed in a pseudo 3D manner, and the difference from the 2D shooting can be easily recognized.
  • parallax settings of left and right images in the horizontal 3D imaging mode and the vertical 3D imaging mode can be set independently.
  • the parallax setting value of the vertical 3D shooting mode is set for shorter distances than that of the horizontal 3D shooting mode.
  • This configuration makes it easy to perform short distance shooting during vertical shooting.
  • A 3D imaging apparatus of the present invention includes: a first camera; a second camera having an optical axis different from that of the first camera and a wider angle of view than the first camera; a camera control unit that controls the second camera to synchronously capture, at the shooting time of the first camera, a camera image of a shooting range equal to the angle of view of the first camera; and a 3D image generation unit that generates a 3D image from the camera image of the first camera and the camera image of the second camera.
  • the camera control unit changes the frequency of the clock signal input to the second camera so that the capture time of the camera image of the second camera is synchronized with the capture time of the first camera.
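One way to picture the clock adjustment is a linear readout model in which a frame takes (pixel count) / (pixel clock) seconds; this model and the numbers below are assumptions for illustration, not values from the patent.

```python
# Two sensors with different pixel counts finish a frame at the same time
# if the second sensor's pixel clock is scaled in proportion to its pixels.

def synced_clock_hz(clock1_hz, pixels1, pixels2):
    """Clock for camera 2 so both sensors take the same frame time."""
    frame_time = pixels1 / clock1_hz  # seconds per frame on camera 1
    return pixels2 / frame_time       # camera 2 clock giving the same time

clock1 = 48_000_000       # assumed 48 MHz pixel clock on the first camera
pixels1 = 1920 * 1080     # assumed first (narrow-angle) sensor
pixels2 = 1280 * 720      # assumed second (wide-angle) sensor
clock2 = synced_clock_hz(clock1, pixels1, pixels2)
print(clock2 / 1e6, "MHz")
```

With these assumed counts, the second camera runs its clock slower so that both frames end simultaneously.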
  • Diagram showing the relationship between the number of pixels and the resolution ratio in the first 3D image generation method
  • Flowchart of 3D imaging according to Embodiment 2 of the present invention
  • (a), (b), (c) Schematic structural views of the 3D imaging apparatus according to Embodiment 3 of the present invention
  • (a), (b) Diagrams showing a general 3D image capturing method
  • Diagram showing the 3D image generation method according to Embodiment 3 of the present invention
  • (a), (b) Diagrams showing the details of the 3D image generation method of Embodiment 3 of the present invention
  • Flowchart of 3D imaging of Embodiment 3 of the present invention
  • (a), (b), (c) Schematic structural views of the 3D imaging apparatus according to Embodiment 4 of the present invention
  • (a), (b) Schematic views of the horizontal shooting mode and the vertical shooting mode according to Embodiment 4 of the present invention
  • Diagram showing the 3D image generation method of Embodiment 4 of the present invention
  • Flowchart of 3D imaging of Embodiment 4 of the present invention
  • Flowchart of 3D imaging of Embodiment 5 of the present invention
  • (a), (b) Diagrams showing the 3D image generation method of Embodiment 6 of the present invention
  • (a), (b), (c) Schematic structural views of the 3D imaging apparatus according to Embodiment 7 of the present invention
  • Diagram showing the images acquired by the sensors of the cameras of Embodiment 7 of the present invention
  • FIG. 1A, FIG. 1B, and FIG. 1C are schematic structural views of a 3D imaging apparatus according to the present embodiment.
  • a 3D camera 104 including a first camera 102 and a second camera 103 is disposed on the back of the display element 105 of the rectangular casing 101 of the mobile terminal device.
  • the first camera 102 is a camera with an optical zoom function
  • the second camera 103 is a single focus camera without an optical zoom function. Since the second camera 103 is cheaper and smaller than the first camera 102, the portable terminal can be made smaller or thinner than when the two optical-zoom cameras of FIG. 24A are used.
  • Although the first camera 102 with the optical zoom function is the camera for the left image in FIG. 1A, it may instead be the camera for the right image.
  • FIG. 1C shows a block diagram of the portable terminal device.
  • the first camera 102 and the second camera 103 are controlled by the camera control means 108, and the photographed image is stored in the storage means 110 or displayed using the display element 105.
  • the first camera 102 comprises a sensor 107 and a lens 106 and is provided with an optical zoom function, and the position of the lens 106 is controlled by the zoom control signal 109 to obtain a desired zoom magnification.
  • FIG. 2 shows a 3D image generation method at the time of zooming.
  • a 3D image generation method when the zoom magnification N is smaller than the switching magnification N0 will be described. This is called a first 3D image generation method.
  • a 3D image is generated by using the optical zoom image [1] as a left image and the electronic zoom image [2] as a right image.
  • the first 3D image generation method is performed only when the zoom magnification N is smaller than N0.
  • As the zoom magnification increases, YR (or XR) becomes smaller relative to YL (or XL).
  • the inventors conducted a subjective evaluation as to whether or not the 3D image looks comfortable as a result.
  • A zoom magnification at which YR/YL is, for example, 0.4 or more is set as N0.
  • a 3D image can be generated by the first 3D image generation method described above when N ⁇ N0.
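The switching rule can be sketched as follows. A simple model (an assumption for illustration, not from the patent) is that electronic zoom at magnification N uses 1/N of the second sensor in each direction, so the effective ratio YR/YL falls as N grows and the method switches once it drops below the 0.4 example threshold.

```python
# Choose between the two 3D image generation methods from the effective
# vertical-resolution ratio YR/YL of the single-focus (electronic zoom)
# camera against the optical-zoom camera.

def resolution_ratio(yl, yr_sensor, n):
    """YR / YL for electronic zoom magnification n (n >= 1)."""
    return (yr_sensor / n) / yl

def choose_method(yl, yr_sensor, n, min_ratio=0.4):
    ratio = resolution_ratio(yl, yr_sensor, n)
    return ("first (optical + electronic zoom)" if ratio >= min_ratio
            else "second (image + distance information)")

YL, YR_SENSOR = 1080, 1080  # equal sensor heights assumed for the example
for n in (1.0, 2.0, 3.0):
    print(n, choose_method(YL, YR_SENSOR, n))
```

Under these assumptions the crossover N0 sits where YR/YL = 0.4, i.e. at N = 2.5 for equal sensors.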
  • the second 3D image generation method is a method of calculating or estimating distance information (depth information) for each pixel or minute region, and generating a 3D image from the distance information.
  • the parallax information (distance information) [5] of each pixel can be calculated by establishing correspondence between pixels (or minute regions) using, for example, correlation calculation in image processing.
  • Left and right images [6] and [7] are obtained from the first camera left image [3] and disparity information [5].
  • both left and right images can be made high resolution images.
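The two steps of the second generation method, disparity estimation by correlation and re-rendering left and right views from one high-resolution image, can be sketched on a single scanline. This toy Python example is illustrative only; real implementations use 2-D blocks, sub-pixel matching, and hole filling.

```python
# Step 1: per-pixel disparity on one scanline by SAD matching.
# Step 2: synthesize a new right view by shifting the left image's pixels.

def disparity_1d(left, right, max_d, win=1):
    """Integer disparity per pixel (SAD over a small window, clamped edges)."""
    n = len(left)
    disp = [0] * n
    for x in range(n):
        best, best_d = float("inf"), 0
        for d in range(min(max_d, x) + 1):
            cost = sum(abs(left[min(n - 1, max(0, x + k))] -
                           right[min(n - 1, max(0, x - d + k))])
                       for k in range(-win, win + 1))
            if cost < best:
                best, best_d = cost, d
        disp[x] = best_d
    return disp

def render_right(left, disp):
    """Shift each left pixel by its disparity to synthesize a right view."""
    out = [0] * len(left)
    for x, d in enumerate(disp):
        out[max(0, x - d)] = left[x]
    return out

left = [0, 0, 0, 9, 9, 0, 0, 0]
right = [0, 9, 9, 0, 0, 0, 0, 0]   # same bright edge, shifted 2 px left
disp = disparity_1d(left, right, max_d=3)
print(disp)
print(render_right(left, disp))
```

Because both output views are rendered from the one high-resolution image plus disparity, left and right can share the same resolution, as the description notes.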
  • FIG. 4 shows a 3D imaging flowchart at the time of zooming according to the first embodiment of the present invention.
  • high resolution 3D imaging can be performed in a wide zoom magnification range.
  • the first 3D image generation method is more desirable than the second 3D image generation method.
  • Because the second 3D image generation method takes processing time and lowers the frame rate of the moving image, the first 3D image generation method is more desirable for moving image shooting.
  • FIG. 5 shows a 3D imaging flowchart at this time.
  • For moving image shooting, the first 3D image generation method, which has a high processing speed, is adopted; for still image shooting, the high-resolution second 3D image generation method is adopted.
  • a 3D image can be generated without reducing the frame rate of the 3D moving image.
  • Second Embodiment: In this embodiment, a method of changing the 3D image generation method according to the number of pixels of the 3D image to be generated will be described.
  • The description assumes a mobile terminal device having a single housing, but the same applies when the camera is attached to an electronic device having a small camera, such as a foldable mobile terminal or a digital still camera (DSC).
  • FIG. 6A is a block diagram of the 3D imaging apparatus according to the second embodiment.
  • Although the third camera 111 and the fourth camera 112 are both described here as having no optical zoom function, they may have one.
  • the 3D image generation method shown in FIG. 6B is the same as the first 3D image generation method of the first embodiment.
  • the number of pixels indicates the number of vertical pixels ⁇ the number of horizontal pixels, and the resolution means the number of pixels in the original image. Also, for simplicity, it is assumed that the aspect ratio of the image is constant.
  • An image generated from the third camera 111 with the maximum number of pixels of AL ([6]) is a third camera generated image ([8]), and the number of pixels is BL (BL ⁇ AL).
  • The [8] image resized to pixel number C is the [10] image, and the [9] image resized to pixel number C is the [11] image; a 3D image is obtained from [10] and [11].
  • FIG. 7 shows an example of a graph of resolution ratio P ((right resolution) / (left resolution)) with respect to the number C of pixels of the 3D image.
  • At a resolution ratio of P0 or less, the resolution difference is too large and the first 3D image generation method is not suitable. Therefore, when C > C0, the second 3D image generation method is used to generate the 3D image.
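One simple way to model the curve of FIG. 7 (the model and the pixel counts are assumptions for illustration, not from the patent) is to cap each side's effective resolution at the pixels actually captured, P = min(C, BR) / min(C, BL), and switch methods when P falls below a threshold.

```python
# Resolution ratio P of the right (fourth-camera) image to the left
# (third-camera) image after both are resized to C pixels, and the
# resulting method choice.

def resolution_ratio(c, bl, br):
    """Effective right/left resolution ratio after resizing to c pixels."""
    return min(c, br) / min(c, bl)

def choose_method(c, bl, br, p0=0.4):
    return ("first (two-camera)" if resolution_ratio(c, bl, br) >= p0
            else "second (image + distance)")

BL, BR = 8_000_000, 2_000_000  # e.g. 8 Mpixel third, 2 Mpixel fourth camera
for c in (1_000_000, 3_000_000, 8_000_000):
    print(c, round(resolution_ratio(c, BL, BR), 2), choose_method(c, BL, BR))
```

For small C both sides are downscaled equally and P stays near 1; once C exceeds what the fourth camera captured, P drops and the second method takes over, matching the C > C0 rule in the text.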
  • FIG. 6C shows a second 3D image generation method.
  • FIG. 8 shows a flowchart of 3D imaging according to the second embodiment of the present invention.
  • the distance information [12] is calculated from the third camera image [10] and the fourth camera image [11], and a 3D image is obtained from the high-resolution third camera image [10] of pixel number C and the distance information [12].
  • the generation method is the same as the second 3D image generation method of the first embodiment. Thus, 3D images with the same resolution of the left and right images can be obtained.
  • FIGS. 9A, 9B, and 9C are schematic structural views of the 3D imaging apparatus of the present embodiment.
  • The third embodiment is described on the assumption of a portable terminal device consisting of a single housing, but the same applies when the camera is attached to an electronic device having a small camera, such as a foldable portable terminal or a digital still camera (DSC).
  • the 3D camera 204 including the first camera 202 and the second camera 203 is disposed on the back of the display element 205 of the rectangular casing 201 of the mobile terminal device.
  • FIG. 9(c) shows a block diagram of the portable terminal device.
  • the first camera 202 and the second camera 203 are controlled by the camera control unit 208, and store the photographed image in the storage unit 209 or display it using the display element 205.
  • the distance information measuring means 210 measures distance information of the subject from the images of the first camera 202 and the second camera 203.
  • FIGS. 10A and 10B show a general 3D image capturing method in the case where there is a long distance object 211 and a short distance object 212.
  • FIG. 10A shows the case where both the far-distance object 211 and the short-distance object 212 are behind the virtual projection plane (or 3D display screen).
  • FIG. 10B shows the case where the short distance object 212 is near the 3D camera 204, the long distance object 211 is behind the virtual projection plane, and the short distance object 212 is in front.
  • the parallax angle range D2 of the subject in FIG. 10 (b) is wider than the parallax angle range D1 of the subject in FIG. 10 (a).
  • the parallax (C2L-C2R) of the short distance object 212 can be reduced by shifting the horizontal positions of the left and right images as in Patent Document 4, but then the parallax (C1L-C1R) of the long distance object 211 may increase in the negative direction, making stereoscopic viewing impossible.
  • a 3D image is obtained with the left and right images of the first camera image and the second camera image. This will be referred to as the long distance imaging mode.
  • Since the parallax is small and the parallax angle range D1 is narrow, both the long distance object 211 and the short distance object 212 can be viewed stereoscopically in comfort.
  • the parallax angle can be within ⁇ 1 degree, and a comfortable 3D image can be obtained.
  • When the distance of the near distance object 212 closest to the camera is less than a certain value (for example, 1 m), the mode is called the short distance shooting mode, and the following 3D image generation method is adopted.
  • FIG. 11 shows a 3D image generation method in the short distance shooting mode and the long distance shooting mode. Further, FIG. 13 shows a flowchart of 3D imaging.
  • the long-distance shooting mode is set, and a 3D image is obtained from the left and right images [1] and [2].
  • the short-distance shooting mode is selected, and from left and right images [3] and [4], correlation calculation of image processing is performed on each pixel (or minute block).
  • the parallax information (distance information) [5] of each pixel is calculated.
  • the measurement of the parallax information (distance information) is performed by the distance information measuring means 210.
  • left and right images [6] and [7] are obtained from the first camera image [3] (or the second camera image [4]) and the parallax information [5].
  • the sense of depth can be made smaller than the actual one by multiplying the calculated distance information by a constant value (a) smaller than 1, which makes the parallax angle range narrower than the actual range.
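The depth compression described above can be sketched as follows; the function name and the naive forward-warping synthesis are assumptions for illustration, not the patent's implementation. Scaling every disparity by a constant a < 1 narrows the parallax range before the second view is synthesized.

```python
import numpy as np

def compress_depth(left, disparity, a=0.5):
    """Synthesize a second view from one camera image and a disparity
    map whose values are scaled by a constant a < 1, narrowing the
    parallax angle range. Occlusions are handled naively (later pixels
    simply overwrite earlier ones)."""
    h, w = disparity.shape
    scaled = a * disparity                      # shrink every disparity
    right = np.zeros_like(left)
    xs = np.arange(w)
    for y in range(h):
        # shift each pixel horizontally by its (scaled) disparity
        tx = np.clip(np.round(xs - scaled[y]).astype(int), 0, w - 1)
        right[y, tx] = left[y, xs]
    return right
```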
  • in the short distance shooting mode of FIG. 12 (a), the parallax of the short distance object can be made smaller than the parallax it would have (as the short distance object of FIG. 10 (b)) in the images [1] and [2] of the long distance shooting mode.
  • the parallax angle range D3 becomes narrower than the parallax angle range D2, and even when the short distance object is very close, the fluctuation of the parallax angle is not too large, and comfortable stereoscopic vision can be achieved.
  • the user may switch between the long distance shooting mode and the short distance shooting mode manually, but to switch automatically the 3D image pickup apparatus needs means for measuring the distance of the closest subject.
  • a general distance measuring device using the reflection time of radar or infrared light could be used, but since that would make the portable terminal device expensive and large, a method of estimating the distance from the images of the first camera 202 and the second camera 203 is adopted here. This eliminates the need for additional components for the distance measuring means.
  • with the distance information measuring means 210, if the corresponding pixels (or blocks) of the left and right camera images are known, the distance of those pixels (blocks) can be estimated, and the closest of them can be taken as the short distance subject.
  • distance information may be calculated for pixels (or blocks) of several points in the photographed image.
  • the distance of the closest subject may be estimated from the distance information of the pixels (or blocks) near the center.
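The estimation described above, from a few sample blocks near the image center, could look roughly like this; the SAD block matching, the sample-point scheme, and all names (`block_disparity`, `nearest_distance_m`, `choose_mode`) are illustrative assumptions, with Z = f·B/d being the usual stereo triangulation relation.

```python
import numpy as np

def block_disparity(left, right, y, x, size=8, max_d=32):
    """SAD block matching: horizontal shift of one block between views."""
    ref = left[y:y+size, x:x+size].astype(float)
    best_d, best_cost = 0, float("inf")
    for d in range(0, min(max_d, x) + 1):
        cand = right[y:y+size, x-d:x-d+size].astype(float)
        cost = np.abs(ref - cand).sum()
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

def nearest_distance_m(left, right, focal_px, baseline_m, points):
    """Triangulate Z = f*B/d for a few sample blocks and return the
    smallest distance, i.e. the closest subject."""
    dists = []
    for (y, x) in points:
        d = block_disparity(left, right, y, x)
        if d > 0:
            dists.append(focal_px * baseline_m / d)
    return min(dists) if dists else float("inf")

def choose_mode(left, right, focal_px, baseline_m, points, threshold_m=1.0):
    """Mode switch as in the text: below ~1 m -> short distance mode."""
    z = nearest_distance_m(left, right, focal_px, baseline_m, points)
    return "short" if z < threshold_m else "long"
```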
  • FIGS. 14 (a), (b) and (c) are schematic structural diagrams of the 3D imaging apparatus of the present embodiment.
  • the fourth embodiment will be described assuming a portable terminal device consisting of a single housing, but the same applies to other electronic devices equipped with small cameras, such as folding portable terminals and digital still cameras (DSCs).
  • a binocular 3D camera 304 including a first camera 302 and a second camera 303 is disposed on the back of the display element 305 of the rectangular casing 301 of the portable terminal device.
  • the first camera 302 and the second camera 303 are composed of a camera sensor 307 and a lens 306.
  • FIG. 14C shows a block diagram of the portable terminal device.
  • the first camera 302 and the second camera 303 are controlled by the camera control unit 308, store the captured image in the storage unit 309, and display the captured image using the display element 305.
  • the display element 305 is, for example, a naked-eye 3D liquid crystal having a barrier liquid crystal, and can switch between horizontal and vertical display of a 3D image by switching the barrier direction to vertical and horizontal.
  • otherwise, 3D display itself may not be possible.
  • the distance information extraction unit 310 has a function of extracting distance information of an object from images of the first camera 302 and the second camera 303.
  • this portable terminal incorporates an attitude sensor 311 including an acceleration sensor, an azimuth sensor, and the like, and can detect the attitude of the terminal.
  • the imaging in which the long side of the display element is horizontal is referred to as a horizontal 3D imaging mode
  • the imaging in which the long side of the display element is vertical is referred to as a vertical 3D imaging mode.
  • the posture sensor 311 detects whether the long side of the display element is horizontal or vertical, and automatically determines whether to set the horizontal 3D shooting mode or the vertical 3D shooting mode.
  • the user may select the horizontal 3D photographing mode or the vertical 3D photographing mode, so the posture sensor 311 may not be provided.
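A minimal sketch of the automatic mode selection from the posture sensor; the axis convention and function name are assumptions, since the text only says the sensor detects whether the long side of the display is horizontal or vertical.

```python
def shooting_mode(ax, ay):
    """Pick the 3D shooting mode from the gravity components along the
    display's long axis (ax) and short axis (ay), as reported by an
    acceleration sensor. If gravity lies mostly along the short axis,
    the long side is horizontal -> horizontal 3D shooting mode."""
    return "horizontal_3d" if abs(ay) >= abs(ax) else "vertical_3d"
```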
  • FIG. 15A shows an outline of the horizontal 3D shooting mode
  • FIG. 15B shows an outline of the vertical 3D shooting mode.
  • the long sides of the camera sensors 307 of the first camera 302 and the second camera 303 are parallel to the long sides of the display element.
  • a 3D image is obtained by converting the first camera image and the second camera image into left and right images as in the prior art.
  • the left and right images are 3D images having left and right parallax in the short side direction of the display element.
  • the vertical 3D image, which has left-right parallax in the short side direction, cannot be obtained simply by using the first camera image and the second camera image as the left and right images. Next, a method of generating a vertical 3D image is described.
  • FIG. 16 shows a generation procedure of horizontal 3D images and vertical 3D images.
  • a 3D image is obtained from the left and right images [1] and [2].
  • each pixel (or minute block) is made to correspond to each other by using correlation calculation of image processing or the like from the vertically oriented left and right images [3] and [4].
  • left and right images [6] and [7] having left-right parallax (BL-BR) in the short side direction of the camera image are obtained.
  • a 3D image generation method using distance information may be used.
  • the 3D image generation method using distance information takes a long time to process, it is preferable to generate a 3D image from a conventional twin-lens camera image in the horizontal 3D imaging mode.
  • FIG. 17 shows an example of a 3D imaging flowchart.
  • in the case of vertical installation, the mode is switched to the vertical 3D imaging mode, and in the case of horizontal installation, to the horizontal 3D imaging mode.
  • the 3D preview display image uses the first camera image and the second camera image.
  • the 3D images generated in steps S302 and S303 are used as the 3D preview display image. At this time, the barrier of the parallax-barrier liquid crystal must be switched to the vertical display mode.
  • by combining the 3D image generation method using a normal twin-lens camera with the 3D image generation method using distance information and a single camera image, vertical 3D shooting and horizontal 3D shooting can both be supported compactly and inexpensively, without physically changing the arrangement of the cameras.
  • the basic part of the 3D image generation method of the fifth embodiment is the same as that of the fourth embodiment, so the description will be omitted.
  • FIG. 18 shows an example of a 3D imaging flowchart of the fifth embodiment.
  • the mode is switched to the vertical 3D imaging mode, and in the case of horizontal installation, the mode is switched to the horizontal 3D imaging mode.
  • the difference from FIG. 17 of the fourth embodiment is the method of preview display in the vertical 3D shooting mode in step S312.
  • the preview display during vertical 3D shooting uses a preview image generated from the 3D image produced from the first or second camera image and the distance information; however, generating a 3D image from distance information takes time, so the start of the preview display is delayed or the display frame rate is low.
  • the preview display is performed using a 2D image of the first or second camera image.
  • the 3D barrier liquid crystal is not in 3D display mode but in 2D display mode.
  • FIG. 19 shows an example of a 3D imaging flowchart using another preview display.
  • the difference between FIG. 19 and FIG. 18 is the method of preview display in the vertical 3D shooting mode in step S321.
  • a pseudo 3D image is previewed from the first camera image or the second camera image.
  • the pseudo 3D image refers to the following image.
  • the left and right images ([8] and [9]) are two images that differ only in the amount of horizontal shift (C) applied to the 2D image ([3]) of the first camera image or the second camera image. This is called a pseudo 3D image.
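The pseudo 3D pair above amounts to shifting one 2D image by two different horizontal offsets; a minimal sketch (the split of the total shift C into two halves is an assumption):

```python
import numpy as np

def pseudo_3d(image, shift_px):
    """Make a pseudo 3D pair from one 2D camera image by giving the
    left and right views different horizontal offsets (total offset
    C = shift_px). The content is identical, so every subject receives
    the same parallax -- hence 'pseudo' 3D."""
    half = shift_px // 2
    left = np.roll(image, half, axis=1)
    right = np.roll(image, -(shift_px - half), axis=1)
    return left, right
```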
  • an image with parallax can thus be generated, and a preview different from the 2D-shooting preview can be displayed. This makes it easy for the user to recognize whether 3D or 2D shooting is in progress, so there is no need to display characters or marks on the screen to distinguish the two. Moreover, since this vertical-shooting preview method does not require generation of a true 3D image, the preview can be displayed in a short time.
  • the basic part of the 3D image generation method of the sixth embodiment is the same as that of the fourth and fifth embodiments, so the description will be omitted.
  • 21 (a) and 21 (b) are diagrams showing a 3D image generation method according to the sixth embodiment.
  • FIG. 21A shows a 3D image at the time of horizontal shooting
  • FIG. 21B shows a 3D image at the time of vertical shooting.
  • the parallax of the left and right images can be set independently for horizontal shooting and vertical shooting. It is assumed that the user shoots short distance subjects such as people more frequently in vertical shooting than in horizontal shooting.
  • the relative horizontal distance between the left and right images is adjusted in advance to a predetermined value so that the parallax angle (or parallax C2) of the subject at a short distance does not become too large. That is, the parallax setting value is made to be for the short distance. As a result, a comfortable parallax 3D image can be obtained even in close-up shooting in vertical shooting.
  • in horizontal shooting, the parallax setting value is set for long distance so that the parallax angle (or parallax C1) does not become too small.
  • FIGS. 22 (a), (b) and (c) are schematic structural views of the 3D imaging apparatus of the seventh embodiment.
  • the seventh embodiment will be described assuming a portable terminal device having a single housing, but the same applies to other electronic devices equipped with small cameras, such as folding portable terminals and digital still cameras (DSCs).
  • a 3D camera 404 including a first camera 402 and a second camera 403 is disposed on the back surface of the display element 405 of the rectangular casing 401 of the portable terminal device.
  • FIG. 22 (c) shows a block diagram of the portable terminal device.
  • the first camera 402 and the second camera 403 are controlled by the camera control unit 408, and save the photographed image in the storage unit 409 or display it using the display element 405.
  • the first camera 402 and the second camera 403 are composed of a sensor 407 and a lens 406. However, the angle of view of the second camera 403 is larger than the angle of view of the first camera 402.
  • the camera control means 408 synchronizes the imaging range and imaging time of the first camera image and the second camera image by adjusting the scan range and scan speed for the image acquired by the sensor 407 of the second camera 403. The camera control means 408 then obtains a 3D image using the first and second camera images as the left and right images.
  • the camera control unit 408 limits the scan range in the vertical direction between the position A1 and the position A2 in FIG. 23 so that the second camera 403 scans only the range of the same angle of view as the angle of view of the first camera 402. That is, the camera control unit 408 does not scan the range not captured by the first camera 402 in the image acquired by the sensor 407 of the second camera 403.
  • the camera control means 408 adjusts the speed at which the second camera 403 scans the limited range of the images acquired by its sensor 407, so that the time T2 for scanning the adjusted range equals the time T1 in which the first camera 402 scans the entire image of its sensor 407.
  • the adjustment of the scan speed is performed by changing the sampling period of the pixels of the second camera 403, the blanking period, or the frequency of the clock signal input to the second camera 403.
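The scan-timing arithmetic above can be illustrated as follows; the formulas ignore blanking periods, and the function names are assumptions.

```python
def sync_clock_hz(rows_limited, cols, t1_seconds):
    """Pixel clock (blanking ignored) at which the second camera reads
    its limited row range in exactly the frame time T1 of the first
    camera, so that both images cover the same time span."""
    return rows_limited * cols / t1_seconds

def zoomed_scan(rows_limited, clock_hz, n):
    """Zoom tracking as in the text: when the first camera's optical
    zoom magnification becomes n, the second camera's scan range is
    reduced to 1/n and its clock frequency to 1/n, which leaves the
    frame time unchanged."""
    return rows_limited / n, clock_hz / n
```

For example, scanning 600 limited rows of 800 pixels in a 1/30 s frame needs a 14.4 MHz pixel clock; halving both the row range and the clock for a 2x zoom keeps the frame time at 1/30 s.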
  • in this way, the camera control means 408 matches the vertical angle of view, from the start to the end of the scan, of the two images acquired at the same time by the first camera 402 and the second camera 403, which have different angles of view, and scans images of the same angle of view at the same speed. A first camera image from the first camera 402 and a second camera image from the second camera 403, synchronized with the first camera image in angle of view and imaging time, can therefore be obtained.
  • the imaging time of the first camera 402 and the imaging time of the second camera 403 are matched by reducing the scanning speed of the second camera 403 having a large angle of view.
  • the present invention is not limited to this method, and the imaging time for the same angle of view of two camera images may be synchronized by increasing the scanning speed of the first camera 402 with a small angle of view.
  • the imaging time can be adjusted similarly in other cases. For example, if the first camera 402 performs an optical zoom, its angle of view decreases. The camera control means 408 then adjusts the scan range and scan speed of the second camera 403 in accordance with the change in the optical zoom magnification of the first camera 402, and can thereby synchronize the imaging ranges and imaging times of the two camera images.
  • when the optical zoom magnification of the first camera 402 is n, the camera control means 408 changes the frequency of the clock signal input to the second camera 403 to 1/n. The camera control means 408 can then synchronize the imaging range and imaging time of the two camera images merely by reducing the scan range of the second camera 403 to 1/n in accordance with the magnification.
  • the first camera 402 and the second camera 403 may also have a mechanism for external synchronization and may be configured to synchronize the imaging start time.
  • the angle of view of the first camera and the angle of view of the second camera are different.
  • the scan range and scan speed for the image acquired by the sensor of the camera with the larger angle of view are adjusted so that the imaging range and imaging time of the first camera image and the second camera image are synchronized. Therefore, it is possible to electrically obtain left and right images for a 3D image without performing an operation for changing the angle of view with respect to the camera. For this reason, it is possible to obtain 3D images of a moving subject.
  • the synchronization of the imaging time of the first camera image and the second camera image may be performed by changing the frequency of the clock signal. Therefore, it is possible to cope with the change of the angle of view without changing the setting such as the blank period of the camera.
  • This application is based on Japanese Patent Applications No. 2011-058827, No. 2011-058828, and No. 2011-058832, all filed on March 17, 2011.
  • the present invention is useful as a 3D camera and can be used in various camera-equipped electronic devices such as mobile phones, mobile terminals, and digital still cameras.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Studio Devices (AREA)

Abstract

This three-dimensional image pickup device is provided with: two cameras having different optical axes; a distance measuring unit that measures the distance to an object on the basis of the parallax between the two camera images picked up by the cameras; a first three-dimensional image generating unit that generates a three-dimensional image on the basis of the two camera images; and a second three-dimensional image generating unit that generates a three-dimensional image on the basis of the distance to the object and a camera image picked up by one of the two cameras. According to the shooting state, determined from the positions of the two cameras relative to the object, the device switches between three-dimensional image generation by the first three-dimensional image generating unit and three-dimensional image generation by the second three-dimensional image generating unit.

Description

3D imaging device
The present invention relates to a camera-equipped portable electronic device, such as a mobile phone or digital still camera, having a 3D camera capable of 3D imaging.
One 3D (stereoscopic) imaging method uses two cameras (a twin-lens camera) whose optical axes are separated by roughly 6 cm to 8 cm, the distance between human eyes, to capture a left image along the left line of sight and a right image along the right line of sight.
(First conventional example)
FIGS. 24 (a), (b) and (c) show an example of the conventional structure of a 3D camera consisting of two cameras. Details are described, for example, in Patent Document 1.
As shown in FIGS. 24 (a) and (b), two cameras 12 and 13 with optical zoom are disposed apart from each other on the back surface of the display element 15 of the housing 11 of the imaging device.
FIG. 24 (c) shows a block diagram of the imaging device. The first camera 12 and the second camera 13, both with optical zoom, are controlled by the camera control means 18. During zooming, the positions of the lenses 16 of the first camera 12 and the second camera 13 are adjusted based on adjustment values in a look-up table so that the optical zoom magnifications of the two cameras match.
(Second conventional example)
Next, a second conventional example of a 3D camera will be described. Details are described, for example, in Patent Document 2. FIG. 25 shows a block diagram of the imaging device.
Two cameras 22 and 23 are disposed apart from each other on the back surface of the display element of the housing. The first camera 22 has an optical zoom, while the second camera 23 is a fixed-focal-length camera without optical zoom. Further, the pixel count of the first camera 22 is larger than that of the second camera 23.
The first camera 22 with optical zoom and the fixed-focal-length second camera 23 are controlled by the camera control means 28. During zooming, the optical zoom magnification of the first camera 22 and the electronic zoom magnification of the second camera 23 are adjusted so that the zoom magnifications of the two cameras match.
Since the resolution of the second camera 23 is lower than that of the first camera 22, the resolution of the second camera image is enhanced as follows.
A region of the first camera image (called a corresponding block) that corresponds to a part of the second camera image (called a block) is found by image processing, and the block of the second camera image is replaced with the corresponding block of the first camera image.
In this way, the resolution of the second camera image can be increased, and the resolution difference between the first camera image and the second camera image can be reduced.
(Third conventional example)
FIGS. 26 (a), (b) and (c) show an example of the conventional structure of a 3D camera consisting of two cameras. Details are described, for example, in Patent Document 3.
As shown in FIG. 26 (a), two cameras 32 and 33 are disposed apart from each other in the housing 31 of the imaging device. As shown in FIG. 26 (b), it is possible to mechanically change the distance between the optical axes of the cameras (the baseline length) and to change the angle of each camera's optical axis.
FIG. 26 (c) shows a block diagram. The first camera 32 and the second camera 33 are controlled by the camera control means 38. During 3D imaging, horizontal position adjustment (parallax adjustment) of the first camera 32 and the second camera 33 is also performed.
Here, parallax means the difference in horizontal position between the left and right images for a given subject. By controlling the magnitude of the left-right parallax, the subject can be freely placed in front of or behind the 3D display screen.
FIG. 27 shows the relationship between the subject distance from the 3D camera and the parallax angle (convergence angle) of the left and right images when the optical axes of the 3D camera are parallel.
As the subject distance decreases, the parallax angle increases rapidly. Also, the longer the distance between the optical axes of the two cameras (the baseline length), the larger the parallax angle. This enhances the 3D effect (sense of depth), but if the pop-out from the 3D display screen becomes too strong, stereoscopic viewing becomes impossible.
A stereoscopic image reproduced on a 3D display should be neither too far in front of nor too far behind the display screen. When the deviation from the convergence angle at the display screen exceeds a certain value, comfortable stereoscopic viewing becomes difficult. If the deviation becomes even larger, stereoscopic viewing itself becomes impossible.
In general, it is said that comfortable stereoscopic viewing is possible if the parallax angle differs from that at the display screen by no more than about ±1 degree.
For example, when the baseline length is 6.5 cm, a subject distance of about 1 m or more up to infinity poses no problem because, as shown in FIG. 27, the parallax angle stays within 2 degrees (±1 degree). For close subjects at less than 1 m, however, comfortable stereoscopic viewing is not possible.
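As a rough check of the trends in FIG. 27, the convergence angle of parallel-axis cameras toward a point subject is commonly modeled as θ = 2·arctan(b / 2L); the exact curve in the patent depends on its definition of parallax angle and on viewing conditions, so this sketch only illustrates the monotonic behavior (growing as the subject approaches and as the baseline lengthens).

```python
import math

def convergence_angle_deg(baseline_m, distance_m):
    """Convergence angle theta = 2*atan(b / (2*L)) in degrees, for two
    parallel-axis cameras with baseline b looking at a point subject
    at distance L. A standard geometric model, not FIG. 27's exact
    definition."""
    return math.degrees(2 * math.atan(baseline_m / (2 * distance_m)))
```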
Therefore, Patent Document 3 describes a method of achieving both the 3D effect (sense of depth) and viewability by changing the baseline length of the 3D camera and the optical axis angles of the left and right cameras, that is, the convergence angle, according to the 3D scene being captured.
When near and far subjects are mixed, shortening the baseline length of the camera reduces the left-right parallax and yields an easy-to-view 3D image, though the 3D effect is correspondingly reduced.
For example, when the baseline length is reduced from 6.5 cm to 3 cm, the parallax angle decreases as shown in FIG. 27, so even for a close subject at about 50 cm the parallax angle can be kept at 2 degrees or less.
Also, with the optical axes of the two cameras kept parallel, the convergence angle increases as the subject approaches the camera, and the subject pops out of the display screen, making stereoscopic viewing difficult.
Therefore, when the subject is close, angling the optical axes of the two cameras so that they cross near the cameras, as shown in FIG. 26 (b), reduces the parallax during 3D display, and an easy-to-view 3D image is obtained even at close range.
As described above, adding a mechanism that changes the convergence angle of the two cameras makes the camera large and heavy, so Patent Document 4 describes a method of changing the convergence angle electronically.
The convergence angle can be changed by adjusting the horizontal cropping position of the left and right images.
However, while this electronic method can adjust the parallax of some subjects, it cannot simultaneously reduce the parallax of all subjects from near to far.
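The electronic convergence adjustment, and its limitation, can be sketched as follows; the function name is an assumption. Cropping shifts every subject's disparity by the same constant, so the spread between near and far disparities is untouched, which is why all subjects cannot be brought into a comfortable range at once.

```python
import numpy as np

def shift_crop(left, right, s):
    """Electronic convergence adjustment: crop s pixels from the left
    edge of the left image and s pixels from the right edge of the
    right image. This adds the same offset s to every subject's
    disparity without changing the near-to-far disparity spread."""
    w = left.shape[1]
    return left[:, s:], right[:, : w - s]
```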
Patent Document 5 describes a method of generating parallax images such that a specific object among the subjects (objects) in the 3D representation space falls within a recommended parallax range.
For example, parallax adjustment is performed so that the designated object is displayed near the surface of the display screen.
(Fourth conventional example)
FIGS. 28 (a) and (b) show an example of the conventional structure of a 3D camera consisting of a twin-lens camera.
The first camera 42 and the second camera 43 are disposed apart from each other in the housing 41 of the imaging device. On the back surface of the housing 41, a display element 45 for displaying 3D captured images is disposed. The display element 45 is, for example, a parallax-barrier autostereoscopic (naked-eye) 3D liquid crystal display.
FIG. 29 shows the state in which the housing 41 is rotated 90 degrees so that the display screen is vertically oriented. When the subject is vertically elongated, shooting is often performed in this orientation.
At this time, since the first camera 42 and the second camera 43 are arranged vertically, vertical parallax occurs between the subjects in the two images. For this reason, the conventional twin-lens 3D camera shown in FIGS. 28 (a) and (b) cannot perform 3D vertical shooting, nor can it display such an image in 3D.
Therefore, for vertical shooting, a solution based on physically changing the arrangement of the twin-lens camera is proposed, for example, in Patent Document 6.
FIGS. 30 (a) and (b) show the 3D camera structure of Patent Document 6. FIG. 30 (a) shows landscape shooting, and FIG. 30 (b) shows portrait shooting.
In landscape shooting, as in the conventional case, images are captured by the two cameras 52 and 53 disposed apart from each other along the long side of the display element 55. The camera sensors 57 of the left and right cameras are both arranged in landscape orientation.
In portrait shooting, on the other hand, a mechanism rotates the first camera 52 and the second camera 53 by 90 degrees about their optical axes, changing the orientation of the camera sensors 57 from landscape to portrait.
By also rotating the display element 55 by 90 degrees in the same direction as the cameras, a vertically oriented 3D image can be displayed.
The display element 55 must be a 3D display element capable of both vertical and horizontal display.
For example, using an autostereoscopic 3D liquid crystal display with a barrier liquid crystal, switching the barrier direction between vertical and horizontal enables both landscape and portrait display of 3D images.
Instead of rotating the display element 55 itself, only the displayed image may be rotated.
In addition to the first to fourth conventional examples described above, there is another method of generating a 3D image from a 3D camera that uses parallax information (or distance information), as described, for example, in Patent Document 7. After measuring the parallax between the main image and the sub image of a twin-lens camera consisting of spatially separated main and sub image sensors, a 3D image can be generated from the parallax information and the main image.
Patent Document 1: Japanese Patent No. 3303254
Patent Document 2: Japanese Unexamined Patent Application Publication No. 2005-210217
Patent Document 3: Japanese Unexamined Patent Application Publication No. H07-167633
Patent Document 4: Japanese Unexamined Patent Application Publication No. H08-251625
Patent Document 5: Japanese Unexamined Patent Application Publication No. 2004-220127
Patent Document 6: Japanese Unexamined Patent Application Publication No. H10-224820
Patent Document 7: Japanese Unexamined Patent Application Publication No. 2005-20606
 The 3D optical zoom camera described in Patent Document 1, explained in the first conventional example above, requires two expensive and large optical zoom cameras, so the portable terminal becomes expensive and large.
 On the other hand, the 3D optical zoom camera described in Patent Document 2, explained in the second conventional example, can be made inexpensive and compact because one side is a single-focus camera, but the software processing for increasing the resolution of the single-focus camera image takes time, making video shooting difficult. Furthermore, corresponding blocks (or corresponding pixels) between the optical zoom camera image and the single-focus camera image sometimes cannot be found, in which case the single-focus camera image is partially degraded to low resolution.
 The 3D image generation method using the 3D camera described in Patent Document 7 also requires long processing time for 3D image generation, making video shooting difficult.
 The first and second conventional examples above assumed that the first cameras 12 and 22 have an optical zoom function, but the same problems arise for a 3D camera that simply consists of a high-resolution third camera and a low-resolution fourth camera. These problems are explained below.
 Let an image cropped from the third camera, whose maximum pixel count is AL, be the third-camera generated image, with pixel count BL (BL ≤ AL). Likewise, let an image cropped from the fourth camera, whose maximum pixel count is AR (AR < AL), be the fourth-camera generated image, with pixel count BR (BR ≤ AR). For simplicity, the aspect ratio of the images is assumed to be constant.
 As the ratio of BR to BL (BR/BL) decreases, a 3D image that uses the third-camera and fourth-camera generated images as its left and right images has a larger resolution difference and becomes hard to view. A difference in resolution between the left and right images thus degrades the image quality of the 3D image.
 The 3D camera described in Patent Document 3, explained in the third conventional example above, achieves both a 3D effect and viewability by changing the relative arrangement of the cameras, but because it needs a mechanism that moves the two cameras continuously, the camera becomes expensive and large.
 On the other hand, the 3D camera described in Patent Document 4 can change the horizontal distance (parallax) of a subject between the left and right images by changing the image cropping position, so a complicated mechanical structure like that of Patent Document 3 is unnecessary. However, only the horizontal parallax can be adjusted; the degree of depth (3D effect) between subjects cannot. Consequently, in a 3D image containing both near and distant subjects, the parallax could not be adjusted to be comfortable for all subjects.
 In particular, when a near subject is very close, for example less than 1 m away, the sensation of it popping out of the display screen is too strong and the image becomes hard to view. If the parallax of the near subject is then reduced to weaken the pop-out sensation, distant subjects are pushed further back and become hard to view, or the lines of sight diverge and stereoscopic viewing becomes impossible.
 With the method described in Patent Document 5, the sense of depth and the parallax angle (convergence angle) can be adjusted according to the position of each subject (object), but 3D image generation requires long software processing time.
 With 3D shooting using the twin-lens camera explained in the fourth conventional example above, vertical shooting was not possible with the housing rotated so that the display screen is vertically long.
 Patent Document 6, on the other hand, proposes a method that realizes vertical 3D shooting, but it requires a mechanism that physically rotates the two cameras about the camera optical axis and a rotation mechanism for the display element, so the imaging apparatus becomes large.
 Thus, the appropriate method for generating a 3D image differs depending on the shooting conditions, such as the type or pixel count of the twin-lens camera provided in the 3D imaging apparatus, the distance to the subject, and the arrangement of the twin-lens camera relative to the subject. With conventional technology, however, no 3D imaging apparatus of simple configuration has been realized that can generate a 3D image by a method appropriate to the shooting conditions.
 An object of the present invention is to provide a 3D imaging apparatus of simple configuration that can generate a 3D image by a method suited to the shooting conditions.
 Means for solving the above problems are described next.
 A 3D imaging apparatus of the present invention comprises: two cameras having different optical axes; a distance measuring unit that measures the distance to a subject from the parallax between the two camera images captured by the two cameras; a first 3D image generating unit that generates a 3D image from the two camera images; and a second 3D image generating unit that generates a 3D image based on the distance to the subject measured by the distance measuring unit and a camera image captured by one of the two cameras, wherein the apparatus switches between 3D image generation by the first 3D image generating unit and 3D image generation by the second 3D image generating unit according to a shooting state based on the positions of the two cameras relative to the subject.
 In the 3D imaging apparatus of the present invention, the two cameras are a first camera with an optical zoom function and a single-focus second camera. When the zoom magnification of the first camera is smaller than a predetermined magnification, the first 3D image generating unit generates a 3D image from the optical zoom image of the first camera and an electronic zoom image of the second camera; when the zoom magnification is larger than the predetermined magnification, the second 3D image generating unit generates a 3D image based on the distance to the subject measured by the distance measuring unit and the camera image captured by the first camera.
 With this configuration, at low magnification a 3D image is generated from the images of the first and second cameras, allowing 3D image generation in a short time. At high magnification, generating the 3D image from the optical zoom image realizes 3D shooting at high zoom magnification while maintaining high resolution with an inexpensive camera configuration.
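 The switch described above reduces, at its core, to a single comparison of the zoom magnification against the predetermined switching magnification. A minimal sketch in Python (the function and return-value names are illustrative, not from the patent):

```python
def select_generation_method(zoom_n: float, n0: float) -> str:
    """Choose the 3D-image generation path for zoom magnification zoom_n,
    given the switching magnification n0."""
    if zoom_n < n0:
        # Both camera images are still usable: optical zoom image (left)
        # plus electronic zoom image (right).
        return "first"
    # The electronic zoom image has degraded too far; instead synthesize
    # both views from the optical zoom image plus per-pixel distance data.
    return "second"
```

In practice the result would dispatch to the first or second 3D image generating unit each frame.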
 In the 3D imaging apparatus of the present invention, the two cameras are a third camera and a fourth camera having a smaller pixel count than the third camera. When the pixel count of the 3D image to be generated is smaller than a predetermined pixel count, the first 3D image generating unit generates a 3D image from the camera image captured by the third camera and the camera image captured by the fourth camera; when the pixel count of the 3D image to be generated is larger than the predetermined pixel count, the second 3D image generating unit generates a 3D image based on the distance to the subject measured by the distance measuring unit and the camera image captured by the third camera.
 With this configuration, at low magnification a 3D image is generated from the images of the third and fourth cameras, allowing 3D image generation in a short time. At high magnification, generating the 3D image from the third camera image realizes 3D shooting at high zoom magnification while maintaining high resolution with an inexpensive camera configuration. It also suppresses the image quality degradation of 3D images with a large resolution difference between the left and right images.
 In the 3D imaging apparatus of the present invention, the two cameras are a third camera and a fourth camera having a smaller pixel count than the third camera. When shooting 3D video, the first 3D image generating unit generates a 3D image from the camera image captured by the third camera and the camera image captured by the fourth camera; when shooting 3D still images, the second 3D image generating unit generates a 3D image based on the distance to the subject measured by the distance measuring unit and the camera image captured by the third camera.
 With this configuration, during video shooting a 3D image is generated from the images of the third and fourth cameras, allowing 3D video generation in a short time. During still image shooting, generating the 3D image from the third camera image realizes 3D still image shooting at high zoom magnification while maintaining high resolution with an inexpensive camera configuration. It also suppresses the image quality degradation of 3D images with a large resolution difference between the left and right images.
 In the 3D imaging apparatus of the present invention, in a long-distance shooting mode in which the distance to the subject measured by the distance measuring unit is a predetermined value or more, the first 3D image generating unit generates a first 3D image from the camera image captured by one of the two cameras and the camera image captured by the other; in a short-distance shooting mode in which the distance to the subject measured by the distance measuring unit is less than the predetermined value, the second 3D image generating unit generates a second 3D image based on the distance to the subject measured by the distance measuring unit and the camera image captured by one of the two cameras, and the difference in parallax angle between a first subject and a second subject having different parallax angles is smaller in the second 3D image than in the first 3D image.
 The 3D imaging apparatus of the present invention further comprises a display unit that outputs the first 3D image and/or the second 3D image.
 With this configuration, by using the distance measuring unit in the short-distance shooting mode, both long-distance and short-distance shooting are possible with an inexpensive camera configuration, realizing 3D shooting that achieves both a 3D effect and easy viewing.
 In the 3D imaging apparatus of the present invention, the distance measuring unit calculates the closest distance of a third subject nearest to the 3D imaging apparatus; when the closest distance is the predetermined value or more, the long-distance shooting mode is selected, and when the closest distance is less than the predetermined value, the short-distance shooting mode is selected.
 With this configuration, since the distance measuring unit measures the distance to the closest subject, the user does not need to switch between the short-distance and long-distance shooting modes manually.
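 The automatic mode selection above can be sketched as a small function over the distances reported by the distance measuring unit. The 1 m threshold below stands in for the patent's unspecified "predetermined value" and is an assumed figure:

```python
def select_shooting_mode(subject_distances_m, threshold_m=1.0):
    """Pick the shooting mode from per-subject (or per-region) distances
    in metres.  threshold_m is an assumed stand-in for the patent's
    'predetermined value'."""
    nearest = min(subject_distances_m)  # closest-subject distance
    return "long-distance" if nearest >= threshold_m else "short-distance"
```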
 The 3D imaging apparatus of the present invention further comprises a housing and a rectangular display element disposed in the housing, the two cameras being arranged side by side along the long-side direction of the display element. In a vertical 3D shooting mode in which the long side of the display element is vertical, the distance measuring unit measures the distance to the subject from the parallax, along the short-side direction of the display element, between the two camera images captured by the two cameras, and the second 3D image generating unit generates a 3D image based on the distance to the subject measured by the distance measuring unit and the camera image captured by one of the two cameras.
 With this configuration, by generating a 3D image from the distance to the subject, both vertical and horizontal 3D shooting can be achieved compactly and inexpensively without physically changing the arrangement of the cameras.
 In the 3D imaging apparatus of the present invention, in a horizontal 3D shooting mode in which the long side of the display element is horizontal, the first 3D image generating unit generates a 3D image from the camera images captured by the two cameras.
 With this configuration, the 3D image generation time can be shortened in the horizontal 3D shooting mode.
 The 3D imaging apparatus of the present invention further comprises an orientation sensor that detects whether the long side of the display element is horizontal or vertical, and switches to the vertical 3D shooting mode or the horizontal 3D shooting mode according to the detection result of the orientation sensor.
 With this configuration, automatic switching to the vertical 3D shooting mode according to the output of the orientation sensor makes switching by the user unnecessary.
 In the 3D imaging apparatus of the present invention, the display element has a 3D display function; a 3D preview is displayed in the horizontal 3D shooting mode, and in the vertical 3D shooting mode the display element shows the preview as a 2D image using the camera image of one of the two cameras.
 With this configuration, preview display is possible even in the vertical 3D shooting mode, independent of the 3D image generation time.
 In the 3D imaging apparatus of the present invention, in the vertical 3D shooting mode the display element shows the preview as a pseudo 3D image obtained by adding parallax in the short-side direction of the display element to the camera image of one of the two cameras.
 With this configuration, the preview in the vertical 3D shooting mode can be displayed in pseudo 3D, making the difference from 2D shooting easy to recognize.
 In the 3D imaging apparatus of the present invention, the parallax settings for the left and right images can be set independently for the horizontal 3D shooting mode and the vertical 3D shooting mode.
 With this configuration, the optimal parallax can be set for vertical shooting and for horizontal shooting respectively.
 In the 3D imaging apparatus of the present invention, the parallax setting value in the vertical 3D shooting mode is intended for shorter distances than that in the horizontal 3D shooting mode.
 This configuration makes short-distance shooting easier during vertical shooting.
 A 3D imaging apparatus of the present invention comprises: a first camera; a second camera having an optical axis different from that of the first camera and a larger angle of view than the first camera; a camera control unit that controls the second camera so as to capture a camera image of a shooting range equal to the angle of view of the first camera in synchronization with the imaging time of the first camera; and a 3D image generating unit that generates a 3D image from the camera image of the first camera and the camera image of the second camera.
 With this configuration, the left and right images for a 3D image can be obtained electronically without any operation to change the angle of view of a camera, so 3D video of a moving subject can also be obtained.
 In the 3D imaging apparatus of the present invention, the camera control unit changes the frequency of the clock signal input to the second camera so that the imaging time of the camera image of the second camera is synchronized with the imaging time of the first camera.
 With this configuration, a change in the angle of view can be accommodated without changing settings such as the camera's blanking period.
 According to the present invention, a 3D imaging apparatus of simple configuration can be provided that generates a 3D image by a method suited to the shooting conditions.
(a), (b), (c) Schematic structural views of the 3D imaging apparatus of Embodiment 1 of the present invention
Diagram showing the 3D image generation method during zooming in Embodiment 1 of the present invention
Diagram showing subjective evaluation results for asymmetric 3D images
Flowchart of 3D imaging in Embodiment 1 of the present invention
Flowchart of the second 3D imaging in Embodiment 1 of the present invention
(a), (b), (c) Diagrams showing the 3D image generation method of Embodiment 2 of the present invention
Diagram showing the relationship between pixel count and resolution ratio in the first 3D image generation method
Flowchart of 3D imaging in Embodiment 2 of the present invention
(a), (b), (c) Schematic structural views of the 3D imaging apparatus of Embodiment 3 of the present invention
(a), (b) Diagrams showing a general 3D image capturing method
Diagram showing the 3D image generation method of Embodiment 3 of the present invention
(a), (b) Diagrams showing details of the 3D image generation method of Embodiment 3 of the present invention
Flowchart of 3D imaging in Embodiment 3 of the present invention
(a), (b), (c) Schematic structural views of the 3D imaging apparatus of Embodiment 4 of the present invention
(a), (b) Schematic views of the horizontal and vertical shooting modes of Embodiment 4 of the present invention
Diagram showing the 3D image generation method of Embodiment 4 of the present invention
Flowchart of 3D imaging in Embodiment 4 of the present invention
Flowchart of 3D imaging in Embodiment 5 of the present invention
Second flowchart of 3D imaging in Embodiment 5 of the present invention
Diagram showing the preview display image of the second 3D imaging flowchart of Embodiment 5 of the present invention
(a), (b) Diagrams showing the 3D image generation method of Embodiment 6 of the present invention
(a), (b), (c) Schematic structural views of the 3D imaging apparatus of Embodiment 7 of the present invention
Diagram showing images acquired by the sensor of each camera in Embodiment 7 of the present invention
(a), (b), (c) Schematic structural views of a conventional 3D imaging apparatus
Schematic structural view of a conventional 3D imaging apparatus
(a), (b), (c) Schematic structural views of a conventional 3D imaging apparatus
Graph showing the relationship between subject distance and parallax angle
(a), (b) Schematic structural views of a conventional 3D imaging apparatus
Diagram showing the state when the housing 41 is rotated 90 degrees so that the display screen is vertically long
(a), (b) Schematic structural views of a conventional 3D imaging apparatus
 Embodiments of the present invention are described below with reference to the drawings.
 (Embodiment 1)
 FIGS. 1(a), (b), and (c) are schematic structural views of the 3D imaging apparatus of this embodiment.
 Embodiment 1 below is described assuming a portable terminal device consisting of a single housing, but the same applies when the apparatus is mounted in other electronic devices with small cameras, such as a foldable portable terminal or a digital still camera (DSC).
 As shown in FIGS. 1(a) and (b), a 3D camera 104 consisting of a first camera 102 and a second camera 103 is disposed on the rear face of the display element 105 of the rectangular housing 101 of the portable terminal device. The first camera 102 is a camera with an optical zoom function, and the second camera 103 is a single-focus camera without an optical zoom function. Since the second camera 103 is cheaper and smaller than the first camera 102, the portable terminal can be made smaller or thinner than when two cameras with optical zoom are used as in FIG. 24(a).
 In FIG. 1(a), the first camera 102 with the optical zoom function is the left-image camera, but it may instead be the right-image camera.
 FIG. 1(c) shows a block diagram of the portable terminal device. The first camera 102 and the second camera 103 are controlled by the camera control means 108, and captured images are stored in the storage means 110 or displayed on the display element 105.
 The first camera 102 consists of a sensor 107 and a lens 106 and has an optical zoom function; the desired zoom magnification is obtained by controlling the position of the lens 106 with a zoom control signal 109.
 FIG. 2 shows the 3D image generation method during zooming.
 First, the 3D image generation method used when the zoom magnification N is smaller than the switching magnification N0 is described. This is called the first 3D image generation method.
 The first camera 102 (left camera) acquires an optical zoom image according to the zoom magnification N. From the image of the second camera 103 (right camera), an image with the same angle of view as the optical zoom image is cropped and then resized to the same pixel count as the optical zoom image to create an electronic zoom image. A 3D image is generated by using the optical zoom image [1] as the left image and the electronic zoom image [2] as the right image.
 Since the resolution of the electronic zoom image falls relative to the optical zoom image as the magnification N increases, the first 3D image generation method is used only when the zoom magnification N is smaller than N0.
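 The electronic zoom step [2] is a centre crop covering 1/N of the field of view, followed by a resize to the optical zoom image's pixel count. A minimal sketch, using nearest-neighbour resizing purely for brevity (the document does not specify an interpolation method):

```python
import numpy as np

def electronic_zoom(image: np.ndarray, n: float, out_hw: tuple) -> np.ndarray:
    """Crop the central 1/n field of view of a 2-D image and resize it
    (nearest-neighbour) to out_hw = (rows, cols)."""
    h, w = image.shape[:2]
    ch, cw = int(round(h / n)), int(round(w / n))   # crop size
    top, left = (h - ch) // 2, (w - cw) // 2        # centre the crop
    crop = image[top:top + ch, left:left + cw]
    # nearest-neighbour resize via integer index mapping
    oh, ow = out_hw
    rows = np.arange(oh) * ch // oh
    cols = np.arange(ow) * cw // ow
    return crop[rows][:, cols]
```

A production implementation would use a proper resampling filter, but the crop geometry is the point here.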
 Let YL and YR be the vertical pixel counts, and XL and XR the horizontal pixel counts, of the source images that determine the resolution of the left and right images. As the zoom magnification increases, YR (or XR) becomes small compared with YL (or XL).
 FIG. 3 shows the results of a subjective evaluation conducted by the present inventors of whether a 3D image looks natural when the resolutions of the left and right images differ.
 It shows the evaluation results for four types of 3D images, assessing whether the image is acceptable as a 3D image when the left image is full HD (1920 × 1080 pixels) and the pixel count of the right image is made smaller than full HD.
 The vertical pixel ratio is the ratio of the vertical pixel count YR of the right image to the vertical pixel count YL of the left image (1080 pixels for full HD); the smaller the ratio, the larger the resolution difference between the left and right images.
 The inventors' subjective evaluation showed that if YR/YL is roughly 0.4 or more, the reduced resolution of the right image is not noticeable, and an image is obtained that causes little discomfort as a 3D image.
 Therefore, if N0 is taken as the zoom magnification at which YR/YL becomes, for example, 0.4 or more, a 3D image can be generated by the first 3D image generation method described above whenever N < N0.
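 Under the definitions above, an electronic zoom of magnification N uses only YR ≈ A_R/N of the second camera's A_R vertical sensor pixels before resizing, so the criterion YR/YL ≥ 0.4 fixes the switching magnification at N0 = A_R/(0.4·YL). A sketch with illustrative sensor sizes (the specific pixel counts are assumptions, not from the document):

```python
def switching_magnification(sensor_rows: int, target_rows: int,
                            min_ratio: float = 0.4) -> float:
    """Largest zoom magnification N0 at which the electronic zoom image
    still satisfies YR/YL >= min_ratio.  At magnification N the crop
    from the single-focus camera spans sensor_rows / N rows (YR), which
    is then resized to target_rows (YL)."""
    return sensor_rows / (min_ratio * target_rows)

# Illustrative numbers: a 1944-row (5-megapixel-class) second sensor
# feeding full-HD (1080-row) left images gives N0 = 1944/(0.4*1080) = 4.5.
n0 = switching_magnification(1944, 1080)
```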
 When N > N0, the resolution of the right image drops and a good 3D image cannot be obtained.
 Next, then, a 3D image generation method for zoom magnifications greater than N0 is described. This will be called the second 3D image generation method.
 The second 3D image generation method calculates or estimates distance information (depth information) for each pixel or small region and generates a 3D image from that distance information.
 Depth information can also be estimated from pixel brightness, among other methods, but a more accurate approach is to obtain parallax information (= distance information) for each pixel or small region from the left and right images of a twin-lens camera, as described in, for example, Patent Document 7.
 Here, that method of obtaining parallax information (= distance information) for each pixel or small region from the left and right images of a twin-lens camera is adopted.
 From the left and right images [3] and [4] of FIG. 2, correspondences are established for each pixel (or small region) using, for example, a correlation computation from image processing, yielding the parallax information (distance information) [5] of each pixel.
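The correlation computation can be illustrated with a minimal sum-of-absolute-differences (SAD) block matcher. The block size, search range, and image representation (lists of grayscale rows) are illustrative choices, not the patent's specification.

```python
def block_disparity(left, right, y, x, block=3, max_d=4):
    """Estimate the disparity at pixel (y, x): the horizontal shift d
    that best aligns a small block of the left image with the right
    image, scored by sum of absolute differences (SAD). Images are
    lists of grayscale rows; parameters are illustrative."""
    def sad(d):
        return sum(abs(left[y + dy][x + dx] - right[y + dy][x + dx - d])
                   for dy in range(block) for dx in range(block))
    # The best-matching shift is the one with the smallest SAD score.
    return min(range(max_d), key=sad)
```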
 Left and right images [6] and [7] are then obtained from the first-camera left image [3] and the parallax information [5].
 Because the high-resolution first-camera image [3] is used for 3D image generation, both the left and right images can be high-resolution.
 Therefore, at high magnification (N > N0), a high-resolution 3D image is obtained by using the second 3D image generation method.
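One way to realize the step from [3] and [5] to [6] and [7] is depth-image-based rendering: shift each pixel of the single high-resolution image horizontally by half its disparity, in opposite directions for the two views. The forward mapping and the crude left-neighbor hole filling below are illustrative simplifications, not the patent's method.

```python
def render_view(image, disparity, sign=1):
    """Synthesize one view (sign=+1 or -1) from a single image plus a
    per-pixel disparity map by shifting each pixel horizontally by
    sign * disparity / 2. Holes left by the forward mapping are filled
    with the nearest pixel to their left (a crude simplification)."""
    h, w = len(image), len(image[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            nx = x + sign * disparity[y][x] // 2
            if 0 <= nx < w:
                out[y][nx] = image[y][x]
        prev = image[y][0]
        for x in range(w):          # fill occlusion holes
            if out[y][x] is None:
                out[y][x] = prev
            else:
                prev = out[y][x]
    return out
```

The second view is produced with the opposite sign, so the pair straddles the original viewpoint.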
 FIG. 4 shows a flowchart of 3D shooting during zoom according to Embodiment 1 of the present invention.
 As described above, combining the first and second 3D image generation methods enables high-resolution 3D shooting over a wide zoom magnification range.
 The second 3D image generation method can also be used at low magnification (N < N0), but it takes more processing time, so as long as the reduced resolution of the second-camera image is acceptable, the first 3D image generation method is preferable.
 Likewise, for 3D video shooting, the second method's processing time would lower the video frame rate, so the first method is preferable there as well.
 It is therefore also effective to switch between the first and second 3D image generation methods according to whether video or a still image is being shot, instead of according to zoom magnification.
 FIG. 5 shows the 3D shooting flowchart for this case: the fast first 3D image generation method is used for video shooting, and the high-resolution second method is used for still shooting.
 This allows 3D images to be generated without lowering the frame rate of 3D video.
 (Embodiment 2)
 Embodiment 2 describes a method of changing the 3D image generation method according to the pixel count of the 3D image to be generated.
 Embodiment 2 is described assuming a mobile terminal device with a single housing, but the same applies when the cameras are mounted in other electronic devices with small cameras, such as foldable mobile terminals and digital still cameras (DSCs).
 FIG. 6(a) is a block diagram of the 3D imaging device of Embodiment 2.
 Here, neither the third camera 111 nor the fourth camera 112 has an optical zoom function, although either may have one.
 The 3D image generation method shown in FIG. 6(b) is the same as the first 3D image generation method of Embodiment 1.
 In the following, "pixel count" means vertical pixels × horizontal pixels, and "resolution" means the pixel count of the source image. For simplicity, the aspect ratio of the images is assumed fixed.
 Let the image generated by the third camera 111, whose maximum pixel count is AL ([6]), be the third-camera image ([8]) with pixel count BL (BL ≤ AL). Likewise, let the image generated by the fourth camera 112, whose maximum pixel count is AR (AR < AL) ([7]), be the fourth-camera image ([9]) with pixel count BR (BR ≤ AR).
 Image [8] resized to pixel count C is image [10], image [9] resized to pixel count C is image [11], and the 3D image is obtained from [10] and [11].
 FIG. 7 shows an example graph of the resolution ratio P (= (right resolution)/(left resolution)) against the pixel count C of the 3D image.
 When C is smaller than BR, the resolution of [10] is reduced from BL to C and the resolution of [11] is reduced from BR to C, so the resolution ratio is C/C = 1.
 When C is larger than BR but smaller than BL, the resolution (pixel count) of [10] is reduced from BL to C while that of [11] remains BR, so the ratio is (right resolution)/(left resolution) = BR/C > BR/BL.
 When C is larger than BL, [10] remains at BL and [11] remains at BR, so the resolution ratio P is (right resolution)/(left resolution) = BR/BL.
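The three regimes above collapse into a small piecewise function. The sketch below treats resolutions as scalar pixel counts, matching the text's one-number usage, and relies on the fact that resizing to the target C can only reduce each side's resolution.

```python
def resolution_ratio(c, bl, br):
    """Resolution ratio P = (right resolution) / (left resolution) after
    resizing both camera images toward a target pixel count c. Each
    side ends up at min(c, native pixel count), since resizing never
    adds resolution."""
    return min(c, br) / min(c, bl)
```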
 Now set P0 to a resolution ratio of, for example, 0.3 × 0.3 = 0.09 (a ratio of 0.3 in each dimension), and let C0 be the 3D image pixel count C at which P reaches P0. As the results of FIG. 3 in Embodiment 1 indicate, when C < C0 the resolution ratio P exceeds P0, so a 3D image produced by the first 3D image generation method poses no problem.
 When C > C0, on the other hand, the resolution ratio is at or below P0 and the resolution difference is large, so the first 3D image generation method is unsuitable; in that case the second 3D image generation method is used to generate the 3D image.
 FIG. 6(c) shows the second 3D image generation method, and FIG. 8 shows a flowchart of 3D shooting according to Embodiment 2 of the present invention.
 When C > C0, distance information [12] is calculated from the third-camera image [10] and the fourth-camera image [11], and the 3D image is obtained from the high-resolution, C-pixel third-camera image [10] and the distance information [12]. The generation method is the same as the second 3D image generation method of Embodiment 1. This yields a 3D image whose left and right images have the same resolution.
 As described above, changing the 3D image generation method according to the pixel count C of the 3D image to be generated yields an easy-to-view 3D image with little resolution difference between the left and right images.
 (Embodiment 3)
 FIGS. 9(a), (b), and (c) are schematic structural diagrams of the 3D imaging device of this embodiment.
 Embodiment 3 is described below assuming a mobile terminal device with a single housing, but the same applies to other electronic devices with small cameras, such as foldable mobile terminals and digital still cameras (DSCs).
 As shown in FIGS. 9(a) and (b), a 3D camera 204 consisting of a first camera 202 and a second camera 203 is arranged on the back of the display element 205 of the rectangular housing 201 of the mobile terminal device.
 FIG. 9(c) is a block diagram of the mobile terminal device. The first camera 202 and the second camera 203 are controlled by camera control means 208, and captured images are stored in storage means 209 or displayed on the display element 205.
 Distance information measuring means 210 measures subject distance information from the images of the first camera 202 and the second camera 203.
 FIGS. 10(a) and (b) show typical 3D shooting with a far subject 211 and a near subject 212. In FIG. 10(a), both subjects lie behind the virtual projection plane (or 3D display screen). In FIG. 10(b), the near subject 212 is close to the 3D camera 204 and in front of the virtual projection plane, while the far subject 211 remains behind it. The parallax-angle range D2 of the subjects in FIG. 10(b) is wider than the range D1 in FIG. 10(a).
 When the near subject 212 is very close as in FIG. 10(b), its parallax C2L - C2R is very large, so it appears to pop too far out of the 3D display screen and is hard to view stereoscopically.
 Shifting the horizontal positions of the left and right images, as in, for example, Patent Document 4, can reduce the parallax (C2L - C2R) of the near subject 212, but the parallax (C1L - C1R) of the far subject 211 then grows in the negative direction and stereoscopic viewing may become impossible.
 This is because the difference in parallax (ΔCL - ΔCR) between the near subject 212 and the far subject 211 is large.
 The following 3D image generation method is therefore adopted.
 When the distance to the near subject 212 closest to the camera is at least a certain value, a 3D image is obtained with the first-camera and second-camera images as the left and right images. This is called the far-distance shooting mode.
 As shown in FIG. 10(a), the parallax difference |ΔBL - ΔBR| between the two subjects is small and the parallax-angle range D1 is narrow, so both the far subject 211 and the near subject 212 can be viewed stereoscopically in comfort.
 For example, with a camera baseline of 6.5 cm, if the nearest subject is at least 1 m away, the parallax angle can be kept within ±1 degree and a comfortable 3D image is obtained.
 Next, when the distance to the near subject 212 closest to the camera is less than that value (for example, 1 m), the device enters what is called the near-distance shooting mode and adopts the following 3D image generation method.
 In this method, distance information (depth information) is calculated or estimated for each pixel or small region from the first-camera and second-camera images, and a 3D image is generated from that distance information, as described in, for example, Patent Document 7.
 FIG. 11 shows the 3D image generation methods of the near- and far-distance shooting modes, and FIG. 13 shows a flowchart of the 3D shooting.
 When the nearest subject distance L is greater than L0, the far-distance shooting mode is used and the 3D image is obtained from the left and right images [1] and [2].
 When L is smaller than L0, the near-distance shooting mode is used: from the left and right images [3] and [4], correspondences are established for each pixel (or small block) using, for example, a correlation computation from image processing, and the parallax information (distance information) [5] of each pixel is calculated. This measurement of parallax (distance) information is performed by the distance information measuring means 210. Left and right images [6] and [7] are then obtained from the first-camera image [3] (or the second-camera image [4]) and the parallax information [5].
 In the near-distance shooting mode, to make the parallax-angle range narrower than reality, the calculated distance information is multiplied by a constant value (a) smaller than 1, which makes the apparent depth smaller than reality.
 For example, as shown in FIG. 12(b), let A(x, y) be the parallax angle obtained from the distance information [5] of each pixel (x, y) calculated from [3] and [4]. Setting A'(x, y) = A(x, y) × a (0 < a < 1) makes the parallax-angle range narrower than reality.
 That is, the parallax-angle difference |ΔCL' - ΔCR'| in the near-distance shooting mode of FIG. 12(a) can be made smaller than the parallax-angle difference |ΔCL - ΔCR| between the near and far subjects of FIG. 10(b) in the far-mode images [1] and [2].
 As a result, the parallax-angle range D3 is narrower than D2, so even when the near subject is very close, the parallax angle does not vary excessively and comfortable stereoscopic viewing is possible.
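The scaling A'(x, y) = A(x, y) × a can be sketched directly. The angle map is represented as a list of rows, and a = 0.5 is an illustrative value.

```python
def compress_parallax(angle_map, a=0.5):
    """Narrow the parallax-angle range by scaling every per-pixel
    parallax angle A(x, y) by a constant 0 < a < 1, as in
    A'(x, y) = A(x, y) * a. Pop-out (positive) and depth (negative)
    angles are compressed equally, so depth ordering is preserved."""
    if not 0 < a < 1:
        raise ValueError("a must satisfy 0 < a < 1")
    return [[angle * a for angle in row] for row in angle_map]
```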
 Next, a method of measuring the distance of the nearest subject is described.
 The user may switch between the far- and near-distance shooting modes manually, but automatic switching requires that this 3D imaging device have means for measuring the distance of the nearest subject.
 A general-purpose rangefinder using radar reflection time, infrared light, or the like could be used to measure the near subject distance, but it would make the mobile terminal device larger and more expensive, so here the distance is estimated from the images of the first camera 202 and the second camera 203. This eliminates the need for additional distance-measuring components.
 For example, once the distance information measuring means 210 has found corresponding pixels (or blocks) in the left and right camera images, the distance of each such pixel (block) can be estimated, and the closest of these can be taken to be the near subject.
 Since computing correspondences for every pixel of the left and right images takes processing time, distance information may instead be calculated for only a few sample pixels (or blocks) in the captured image.
 Also, since subjects are generally considered to concentrate near the center of the image, the nearest subject distance may be estimated from the distance information of pixels (or blocks) near the center.
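Sampling a few central blocks and converting the largest disparity to a distance via the standard stereo relation L = f · B / d can be sketched as follows. The focal length in pixels and the sampled disparities are hypothetical values, not from the patent.

```python
def nearest_subject_distance(sampled_disparities_px, baseline_m=0.065,
                             focal_px=1000):
    """Estimate the nearest-subject distance (in meters) from disparities
    (in pixels) sampled at a few central blocks. The largest disparity
    belongs to the closest block; distance follows from the stereo
    relation L = f * B / d. focal_px is a hypothetical parameter."""
    d = max(sampled_disparities_px)   # closest block has largest disparity
    return focal_px * baseline_m / d
```

The result can then be compared against the threshold L0 (for example, 1 m) to pick the shooting mode.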
 As described above, combining the conventional far-distance 3D shooting mode with a near-distance 3D shooting mode whose parallax angles are adjusted by the distance information measuring means enables 3D shooting that balances 3D effect with ease of viewing in an inexpensive, compact camera configuration.
 (Embodiment 4)
 FIGS. 14(a), (b), and (c) are schematic structural diagrams of the 3D imaging device of this embodiment.
 Embodiment 4 is described below assuming a mobile terminal device with a single housing, but the same applies to other electronic devices with small cameras, such as foldable mobile terminals and digital still cameras (DSCs).
 As shown in FIGS. 14(a) and (b), a twin-lens 3D camera 304 consisting of a first camera 302 and a second camera 303 is arranged on the back of the display element 305 of the rectangular housing 301 of the mobile terminal device.
 Each of the first camera 302 and the second camera 303 consists of a camera sensor 307 and a lens 306.
 FIG. 14(c) is a block diagram of the mobile terminal device. The first camera 302 and the second camera 303 are controlled by camera control means 308; captured images are stored in storage means 309 and displayed on the display element 305.
 The display element 305 is, for example, an autostereoscopic 3D liquid crystal display with a barrier liquid crystal; switching the barrier direction between vertical and horizontal enables both landscape and portrait display of 3D images.
 A 3D display element using electronic shutter glasses can likewise display 3D images in both landscape and portrait orientation.
 If there is no need to display 3D images on the display element, portrait 3D display, or even 3D display itself, may be omitted.
 The distance information extraction unit 310 has the function of extracting subject distance information from the images of the first camera 302 and the second camera 303.
 Furthermore, this mobile terminal incorporates an attitude sensor 311, consisting of an acceleration sensor, an azimuth sensor, and the like, and can detect the terminal's orientation.
 Shooting with the long side of the display element horizontal is called the landscape 3D shooting mode, and shooting with the long side vertical is called the portrait 3D shooting mode.
 The attitude sensor 311 detects whether the long side of the display element is horizontal or vertical and automatically selects the landscape or portrait 3D shooting mode accordingly.
 If no attitude sensor 311 is provided, the user can select the landscape or portrait 3D shooting mode manually, so the sensor may be omitted.
 FIG. 15(a) outlines the landscape 3D shooting mode, and FIG. 15(b) the portrait 3D shooting mode.
 The long sides of the camera sensors 307 of the first camera 302 and the second camera 303 are parallel to the long side of the display element.
 In the landscape shooting mode, a 3D image is obtained conventionally by using the first-camera and second-camera images as the left and right images.
 As shown in FIG. 15(a), in playback or preview display this gives a 3D image with left-right parallax in the horizontal direction, that is, along the long side of the display element.
 In the portrait shooting mode, by contrast, the left and right images must form a 3D image with left-right parallax along the short side of the display element.
 A portrait 3D image with left-right parallax along the short side cannot be obtained simply by using the first-camera and second-camera images as the left and right images. A method of generating the portrait 3D image is described next.
 FIG. 16 shows the generation procedures for landscape and portrait 3D images.
 In the landscape 3D shooting mode, the 3D image is obtained from the left and right images [1] and [2].
 In the portrait 3D shooting mode, on the other hand, correspondences are established for each pixel (or small block) of the portrait-oriented left and right images [3] and [4] using, for example, a correlation computation from image processing, and the parallax information (= distance information) [5] of each pixel (or small block) is calculated. Left and right images [6] and [7] having left-right parallax (BL - BR) along the short side of the camera image are then obtained from the first-camera image [3] (or the second-camera image [4]) and the parallax information [5].
 Generating a 3D image after calculating distance information from twin-lens camera images is described in, for example, Patent Document 6; the present invention differs from this conventional example in that, from a twin-lens camera arranged along the long side of the display element, it generates a 3D image with parallax (BL - BR) along the short side of the display element.
 The distance-information-based 3D image generation method may also be used in the landscape 3D shooting mode, just as in the portrait 3D shooting mode.
 However, since the distance-information-based 3D image generation method takes processing time, it is preferable in the landscape 3D shooting mode to generate the 3D image from the conventional twin-lens camera images.
 FIG. 17 shows an example 3D shooting flowchart. When the attitude sensor 311 detects a portrait orientation, the device switches to the portrait 3D shooting mode; when it detects a landscape orientation, to the landscape 3D shooting mode.
 In the landscape case, the 3D preview display uses the first-camera and second-camera images.
 In the portrait case, the 3D preview display uses the 3D image generated in steps S302 and S303. At this time, the barrier liquid crystal of the parallax-barrier display must be switched to the portrait display mode.
 As described above, combining the ordinary twin-lens 3D image generation method with the method of generating a 3D image from distance information and one camera image makes both portrait and landscape 3D shooting possible, compactly and inexpensively, without physically changing the camera arrangement.
 (Embodiment 5)
 Next, a preview display method for 3D shooting according to Embodiment 5 is described.
 The basic parts of the 3D image generation method of Embodiment 5 are the same as in Embodiment 4, so their description is omitted.
 FIG. 18 shows an example 3D shooting flowchart of Embodiment 5.
 When the attitude sensor 311 detects a portrait orientation, the device switches to the portrait 3D shooting mode; when it detects a landscape orientation, to the landscape 3D shooting mode.
 The difference from FIG. 17 of Embodiment 4 is the preview display method in the portrait 3D shooting mode at step S312.
 In Embodiment 4, the preview for portrait 3D shooting used a preview image generated from the 3D image produced from the first- or second-camera image and distance information; but because distance-based 3D image generation takes time, the preview display started late and its frame rate was low.
 Here, therefore, the preview is displayed using the 2D image of the first- or second-camera image, with the 3D barrier liquid crystal set to the 2D display mode rather than the 3D display mode.
 This resolves the above problem of preview delay during portrait 3D shooting.
 So that the user can clearly distinguish plain portrait 2D shooting from the portrait 3D shooting mode, it is desirable to add to the 2D display screen a text or mark indication showing that a 3D shooting preview is in progress.
 FIG. 19 shows an example 3D shooting flowchart using another preview display.
 The difference between FIG. 19 and FIG. 18 is the preview display method in the portrait 3D shooting mode at step S321.
 In step S321, a pseudo 3D image generated from the first-camera or second-camera image is shown as the preview. Here, a pseudo 3D image means the following.
 As shown in FIG. 20, two images differing only in the horizontal shift amount (C) of the 2D image ([3]) from the first or second camera are used as the left and right images ([8] and [9]). This is called a pseudo 3D image.
 This produces an image with parallax, and the preview differs visibly from the preview shown during 2D shooting, so the user can easily tell 3D shooting from 2D shooting and no extra display is needed to distinguish them.
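The pseudo 3D pair can be produced by cropping one 2D image at two horizontal positions, which gives every pixel the same disparity C. Images are represented as lists of pixel rows; this is an illustrative sketch, not the patent's implementation.

```python
def pseudo_3d_pair(image, c):
    """Make a pseudo 3D left/right pair from one 2D image by cropping it
    at two horizontal positions c columns apart. Every pixel gets the
    same disparity c, so the preview shows a depth offset but no real
    per-pixel scene structure."""
    left = [row[:len(row) - c] for row in image]
    right = [row[c:] for row in image]
    return left, right
```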
 As described above, because this preview display method for portrait shooting requires no 3D image generation, the preview can be displayed quickly, and no text or mark is needed on the 2D display screen to distinguish it from a 2D display preview.
 (Embodiment 6)
 Next, a parallax adjustment method in the 3D imaging of Embodiment 6 will be described. Since the basic part of the 3D image generation method of Embodiment 6 is the same as in Embodiments 4 and 5, its description is omitted.
 FIGS. 21(a) and 21(b) are diagrams showing the 3D image generation method of Embodiment 6. FIG. 21(a) shows a 3D image in horizontal (landscape) shooting, and FIG. 21(b) shows a 3D image in vertical (portrait) shooting.
 The parallax of the left and right images can be set independently for horizontal shooting and for vertical shooting. It is assumed that close-range subjects such as people are shot more frequently in vertical shooting than in horizontal shooting.
 In vertical shooting, the relative horizontal distance between the left and right images is therefore adjusted in advance to a predetermined value so that the parallax angle (or parallax C2) of a close-range subject does not become too large; that is, the parallax setting value is made suitable for short distances. A 3D image with comfortable parallax is thus obtained even in close-range vertical shooting.
 In horizontal shooting, on the other hand, distant subjects are assumed to be common, so the parallax setting value is made suitable for long distances so that the parallax angle (or parallax C1) does not become too small.
 In this way, 3D images with optimal parallax are obtained for both vertical and horizontal shooting.
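 A minimal sketch of this orientation-dependent selection follows. It is not part of the disclosure: the function name and the preset offset values are invented for illustration, and real presets would be tuned to the display size and camera baseline:

```python
def parallax_setting(orientation, near_offset_px=24, far_offset_px=8):
    """Return a preset relative horizontal offset between the left and right
    images.  Portrait (vertical) shooting assumes close subjects, so a
    short-distance preset keeps the parallax angle from growing too large;
    landscape (horizontal) shooting assumes distant subjects, so a
    long-distance preset keeps it from becoming too small."""
    if orientation == "portrait":
        return near_offset_px   # short-distance setting (parallax C2 case)
    if orientation == "landscape":
        return far_offset_px    # long-distance setting (parallax C1 case)
    raise ValueError("orientation must be 'portrait' or 'landscape'")
```

 Because the two modes read independent presets, changing one never affects the other, matching the independent vertical/horizontal parallax settings described above.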
 (Embodiment 7)
 FIGS. 22(a), (b), and (c) are schematic structural diagrams of the 3D imaging apparatus of this embodiment.
 In the following, Embodiment 7 is described assuming a portable terminal device consisting of a single housing, but the same applies when the cameras are mounted in other electronic devices having small cameras, such as a folding portable terminal or a digital still camera (DSC).
 As shown in FIGS. 22(a) and (b), a 3D camera 404 consisting of a first camera 402 and a second camera 403 is disposed on the rear face of the display element 405 of the rectangular housing 401 of the portable terminal device.
 FIG. 22(c) shows a block diagram of the portable terminal device. The first camera 402 and the second camera 403 are controlled by camera control means 408, which stores captured images in storage means 409 or displays them on the display element 405.
 The first camera 402 and the second camera 403 each consist of a sensor 407 and a lens 406; however, the angle of view of the second camera 403 is larger than that of the first camera 402. The camera control means 408 synchronizes the imaging range and imaging time of the first camera image and the second camera image by adjusting the scan range and scan speed for the image acquired by the sensor 407 of the second camera 403, and obtains a 3D image using the first and second camera images as the left and right images.
 The adjustment of the scan range by the camera control means 408 will now be described with reference to FIG. 23. The camera control means 408 limits the vertical scan range to between position A1 and position A2 in FIG. 23 so that the second camera 403 scans only the range corresponding to the same angle of view as the first camera 402. That is, of the image acquired by the sensor 407 of the second camera 403, the range not captured by the first camera 402 is not scanned.
 Next, the adjustment of the scan speed by the camera control means 408 will be described, again with reference to FIG. 23. The camera control means 408 adjusts the speed at which the limited range of the image acquired by the sensor 407 of the second camera 403 is scanned, so that the time T2 required to scan that range equals the time T1 required to scan the entire image of the sensor 407 of the first camera 402. The scan speed is adjusted by changing the pixel sampling period of the second camera 403, its blanking period, or the frequency of the clock signal input to the second camera 403.
 In this way, for the two images acquired at the same time by the first camera 402 and the second camera 403, whose angles of view differ, the camera control means 408 matches the vertical angle of view from the start to the end of the scan and scans the same angle-of-view range at the same rate. The result is the first camera image of the first camera 402 together with a second camera image of the second camera 403 that is synchronized with it in both angle of view and imaging time.
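 The scan-range and scan-speed adjustment above can be sketched numerically as follows. This is an illustrative approximation, not the patent's implementation: the angle-to-row mapping is treated as linear, and all names and sample values are hypothetical:

```python
def sync_scan(rows_wide, fov_narrow_deg, fov_wide_deg, frame_time_narrow):
    """Limit the wide camera's vertical scan window (A1..A2) to the narrow
    camera's field of view, then choose a per-row scan time so that scanning
    the window takes exactly the narrow camera's frame time (T2 == T1)."""
    fraction = fov_narrow_deg / fov_wide_deg       # linear angle-to-rows approx.
    scan_rows = round(rows_wide * fraction)        # rows inside the shared view
    a1 = (rows_wide - scan_rows) // 2              # window start (position A1)
    a2 = a1 + scan_rows                            # window end   (position A2)
    row_time = frame_time_narrow / scan_rows       # slowed per-row period
    return a1, a2, row_time

# Wide sensor of 1000 rows, 40 deg vs 50 deg fields of view, 30 fps frame time.
a1, a2, t_row = sync_scan(1000, 40.0, 50.0, 1 / 30)
```

 Scanning only rows `a1..a2` at period `t_row` then takes `(a2 - a1) * t_row` = 1/30 s, the same as the narrow camera's full-frame scan, so the two exposures cover the same vertical angle of view over the same interval.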
 In the above description with reference to FIG. 23, the imaging times of the first camera 402 and the second camera 403 were matched by slowing the scan speed of the second camera 403, which has the larger angle of view. The method is not limited to this; the imaging times for the same angle of view of the two camera images may instead be synchronized by increasing the scan speed of the first camera 402, which has the smaller angle of view.
 Furthermore, in this embodiment the imaging times can be matched in the same way even when only one of the cameras has an optical zoom function. For example, when the first camera 402 performs optical zoom, its angle of view becomes smaller. In this case, the camera control means 408 can synchronize the imaging ranges and imaging times of the two camera images by adjusting the scan range and scan speed of the second camera 403 in accordance with the change in the optical zoom magnification of the first camera 402.
 For example, suppose the scan range and scan speed of the second camera 403 have been adjusted to match the angle of view of the first camera 402 at an optical zoom magnification of 1×. When the optical zoom magnification of the first camera 402 then becomes n×, the camera control means 408 changes the frequency of the clock signal input to the second camera 403 to 1/n. In this case, the camera control means 408 can keep the imaging ranges and imaging times of the two camera images synchronized simply by also reducing the scan range of the second camera 403 to 1/n in accordance with the magnification.
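 The n× zoom case can be sketched the same way (again only an illustration with invented numbers; real hardware would program the sensor's clock divider and windowing registers):

```python
def adjust_for_zoom(base_clock_hz, base_scan_rows, zoom_n):
    """After the narrow camera zooms to n x, divide the wide camera's pixel
    clock by n and shrink its scan window to 1/n of the calibrated size.
    Rows and clock shrink together, so the window scan time is unchanged and
    the two cameras stay synchronized without touching blanking settings."""
    return base_clock_hz / zoom_n, round(base_scan_rows / zoom_n)

# Hypothetical 48 MHz pixel clock and an 800-row calibrated window, 2x zoom.
clock_hz, scan_rows = adjust_for_zoom(48_000_000, 800, 2)
```

 The scan time is proportional to `scan_rows / clock_hz`; since both are divided by the same `n`, that ratio, and hence the synchronization, is preserved.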
 Note that the first camera 402 and the second camera 403 may also have an external synchronization mechanism and be configured to align their imaging start times.
 As described above, according to this embodiment, when the first camera image and the second camera image for obtaining a 3D image are captured and the angles of view of the first and second cameras differ, the scan range and scan speed for the image acquired by the sensor of the camera with the larger angle of view are adjusted so that the imaging range and imaging time of the two camera images are synchronized. Left and right images for a 3D image can therefore be obtained electrically, without any operation on the camera to change its angle of view, so 3D video of a moving subject can also be obtained.
 Moreover, even if the angle of view of one camera is changed, the imaging times of the first and second camera images can be kept synchronized merely by changing the clock signal frequency, so the change in angle of view can be accommodated without altering settings such as the camera's blanking period.
 Although the present invention has been described in detail with reference to specific embodiments, it will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention.
 This application is based on Japanese Patent Application No. 2011-058826, Japanese Patent Application No. 2011-058827, and Japanese Patent Application No. 2011-058832, all filed on March 17, 2011, the contents of which are incorporated herein by reference.
 The present invention is useful as a 3D camera and is applicable to various electronic devices having cameras, such as mobile phones, portable terminals, and digital still cameras.
 101 housing
 102 first camera
 103 second camera
 104 3D camera
 105 display element
 106 lens
 107 sensor
 108 camera control means
 109 zoom control signal
 110 storage means
 111 third camera
 112 fourth camera
 201 housing
 202 first camera
 203 second camera
 204 3D camera
 205 display element (display unit)
 206 lens
 207 sensor
 208 camera control means
 209 storage means
 210 distance information measuring means
 211 long-distance subject
 212 short-distance subject
 301 housing
 302 first camera
 303 second camera
 304 3D camera
 305 display element
 306 lens
 307 camera sensor
 308 camera control means
 309 storage means
 310 distance information extraction unit
 311 attitude sensor

Claims (16)

  1.  A 3D imaging apparatus comprising:
     two cameras having different optical axes;
     a distance measurement unit that measures a distance to a subject from the parallax between two camera images captured by the two cameras;
     a first 3D image generation unit that generates a 3D image from the two camera images; and
     a second 3D image generation unit that generates a 3D image based on the distance to the subject measured by the distance measurement unit and a camera image captured by one of the two cameras,
     wherein the apparatus switches between 3D image generation by the first 3D image generation unit and 3D image generation by the second 3D image generation unit according to a shooting state based on the positions of the two cameras relative to the subject.
  2.  The 3D imaging apparatus according to claim 1, wherein
     the two cameras are a first camera with an optical zoom function and a fixed-focal-length second camera,
     when the zoom magnification of the first camera is smaller than a predetermined magnification, the first 3D image generation unit generates a 3D image from an optical zoom image of the first camera and an electronic zoom image of the second camera, and
     when the zoom magnification is larger than the predetermined magnification, the second 3D image generation unit generates a 3D image based on the distance to the subject measured by the distance measurement unit and a camera image captured by the first camera.
  3.  The 3D imaging apparatus according to claim 1, wherein
     the two cameras are a third camera and a fourth camera having fewer pixels than the third camera,
     when the number of pixels of the 3D image to be generated is smaller than a predetermined number of pixels, the first 3D image generation unit generates a 3D image from a camera image captured by the third camera and a camera image captured by the fourth camera, and
     when the number of pixels of the 3D image to be generated is larger than the predetermined number of pixels, the second 3D image generation unit generates a 3D image based on the distance to the subject measured by the distance measurement unit and a camera image captured by the third camera.
  4.  The 3D imaging apparatus according to claim 1, wherein
     the two cameras are a third camera and a fourth camera having fewer pixels than the third camera,
     during 3D video shooting, the first 3D image generation unit generates a 3D image from a camera image captured by the third camera and a camera image captured by the fourth camera, and
     during 3D still image shooting, the second 3D image generation unit generates a 3D image based on the distance to the subject measured by the distance measurement unit and a camera image captured by the third camera.
  5.  The 3D imaging apparatus according to claim 1, wherein
     in a long-distance shooting mode in which the distance to the subject measured by the distance measurement unit is equal to or greater than a predetermined value, the first 3D image generation unit generates a first 3D image from a camera image captured by one of the two cameras and a camera image captured by the other of the two cameras,
     in a short-distance shooting mode in which the distance to the subject measured by the distance measurement unit is less than the predetermined value, the second 3D image generation unit generates a second 3D image based on the distance to the subject measured by the distance measurement unit and a camera image captured by either one of the two cameras, and
     a parallax angle difference between a first subject and a second subject having different parallax angles in the second 3D image is smaller than the parallax angle difference between the first subject and the second subject in the first 3D image.
  6.  The 3D imaging apparatus according to claim 5, further comprising a display unit that outputs the first 3D image and/or the second 3D image.
  7.  The 3D imaging apparatus according to claim 5 or 6, wherein
     the distance measurement unit calculates the closest distance of a third subject nearest to the 3D imaging apparatus,
     when the closest distance is equal to or greater than the predetermined value, the long-distance shooting mode is selected, and
     when the closest distance is less than the predetermined value, the short-distance shooting mode is selected.
  8.  The 3D imaging apparatus according to claim 1, further comprising:
     a housing; and
     a rectangular display element disposed in the housing,
     wherein the two cameras are arranged side by side along the long-side direction of the display element, and
     in a vertical 3D shooting mode in which the long side of the display element is vertical, the distance measurement unit measures the distance to the subject from the parallax, in the short-side direction of the display element, between the two camera images captured by the two cameras, and the second 3D image generation unit generates a 3D image based on the distance to the subject measured by the distance measurement unit and a camera image captured by one of the two cameras.
  9.  The 3D imaging apparatus according to claim 8, wherein, in a horizontal 3D shooting mode in which the long side of the display element is horizontal, the first 3D image generation unit generates a 3D image from the camera images captured by the two cameras.
  10.  The 3D imaging apparatus according to claim 8 or 9, further comprising an attitude sensor that detects whether the long side of the display element is horizontal or vertical, wherein the apparatus switches to the vertical 3D shooting mode or the horizontal 3D shooting mode according to the detection result of the attitude sensor.
  11.  The 3D imaging apparatus according to any one of claims 8 to 10, wherein the display element has a 3D display function, displays a 3D preview in the horizontal 3D shooting mode, and, in the vertical 3D shooting mode, performs preview display using a 2D image based on the camera image of either one of the two cameras.
  12.  The 3D imaging apparatus according to claim 11, wherein, in the vertical 3D shooting mode, the display element performs the preview display using a pseudo-3D image in which parallax in the short-side direction of the display element is added to the camera image of either one of the two cameras.
  13.  The 3D imaging apparatus according to any one of claims 8 to 12, wherein the parallax settings of the left and right images can be set independently for the horizontal 3D shooting mode and the vertical 3D shooting mode.
  14.  The 3D imaging apparatus according to claim 13, wherein the parallax setting value of the vertical 3D shooting mode is intended for shorter distances than that of the horizontal 3D shooting mode.
  15.  A 3D imaging apparatus comprising:
     a first camera;
     a second camera having an optical axis different from that of the first camera and a larger angle of view than the first camera;
     a camera control unit that controls the second camera so as to capture a camera image of a shooting range equal to the angle of view of the first camera in synchronization with the imaging time of the first camera; and
     a 3D image generation unit that generates a 3D image from the camera image of the first camera and the camera image of the second camera.
  16.  The 3D imaging apparatus according to claim 15, wherein the camera control unit synchronizes the imaging time of the camera image of the second camera with the imaging time of the first camera by changing the frequency of a clock signal input to the second camera.
PCT/JP2012/001798 2011-03-17 2012-03-14 Three-dimensional image pickup device WO2012124331A1 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2011058832A JP2014102266A (en) 2011-03-17 2011-03-17 3d imaging apparatus
JP2011-058832 2011-03-17
JP2011058827A JP2014103431A (en) 2011-03-17 2011-03-17 3d imaging apparatus
JP2011-058827 2011-03-17
JP2011-058826 2011-03-17
JP2011058826A JP2014102265A (en) 2011-03-17 2011-03-17 3d imaging apparatus

Publications (1)

Publication Number Publication Date
WO2012124331A1 true WO2012124331A1 (en) 2012-09-20

Family

ID=46830416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/001798 WO2012124331A1 (en) 2011-03-17 2012-03-14 Three-dimensional image pickup device

Country Status (1)

Country Link
WO (1) WO2012124331A1 (en)


Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0847001A (en) * 1994-08-01 1996-02-16 Minolta Co Ltd Steroscopic television camera
JPH1090814A (en) * 1996-09-11 1998-04-10 Canon Inc Compound eye camera and image processing method
JP2000102040A (en) * 1998-09-28 2000-04-07 Olympus Optical Co Ltd Electronic stereo camera
JP2000299810A (en) * 1999-04-13 2000-10-24 Matsushita Electric Ind Co Ltd Image pickup device
JP2003304561A (en) * 2003-05-01 2003-10-24 Nissan Motor Co Ltd Stereo image processing apparatus
JP2005020606A (en) * 2003-06-27 2005-01-20 Sharp Corp Digital camera
JP2005181377A (en) * 2003-12-16 2005-07-07 Sophia Co Ltd Game machine
JP2006093860A (en) * 2004-09-21 2006-04-06 Olympus Corp Camera mounted with twin lens image pick-up system
JP2007295113A (en) * 2006-04-21 2007-11-08 Matsushita Electric Ind Co Ltd Imaging apparatus
JP2008236642A (en) * 2007-03-23 2008-10-02 Hitachi Ltd Object tracking device
JP2009048181A (en) * 2007-07-25 2009-03-05 Fujifilm Corp Stereoscopic image photographing device
JP2009288657A (en) * 2008-05-30 2009-12-10 Olympus Imaging Corp Stroboscopic photographing device
JP2010154310A (en) * 2008-12-25 2010-07-08 Fujifilm Corp Compound-eye camera, and photographing method
JP2010261877A (en) * 2009-05-11 2010-11-18 Ricoh Co Ltd Stereoscopic camera device and vehicle-outside monitoring apparatus using the same
JP2011017825A (en) * 2009-07-08 2011-01-27 Seiko Epson Corp Electrooptical device and electronic device
JP2011151517A (en) * 2010-01-20 2011-08-04 Jvc Kenwood Holdings Inc Video processor
JP2011211381A (en) * 2010-03-29 2011-10-20 Fujifilm Corp Stereoscopic imaging apparatus
JP2012023557A (en) * 2010-07-14 2012-02-02 Sharp Corp Image pickup device


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6062039B2 (en) * 2013-04-04 2017-01-18 株式会社Amatel Image processing system and image processing program
US9832447B2 (en) 2013-04-04 2017-11-28 Amatel Inc. Image processing system and image processing program
CN105759556A (en) * 2016-04-08 2016-07-13 凯美斯三维立体影像(惠州)有限公司 Mobile phone having three-dimensional image shooting function
CN107640317A (en) * 2016-07-22 2018-01-30 松下知识产权经营株式会社 Unmanned vehicle system
CN107357046A (en) * 2017-05-26 2017-11-17 张家港康得新光电材料有限公司 2D patterns and the detection method and detecting system of 3D mode switch times
CN114025107A (en) * 2021-12-01 2022-02-08 北京七维视觉科技有限公司 Image ghost shooting method and device, storage medium and fusion processor
CN114025107B (en) * 2021-12-01 2023-12-01 北京七维视觉科技有限公司 Image ghost shooting method, device, storage medium and fusion processor

Similar Documents

Publication Publication Date Title
JP5014979B2 (en) 3D information acquisition and display system for personal electronic devices
US8077964B2 (en) Two dimensional/three dimensional digital information acquisition and display device
CN108718373B (en) Image device
US9699440B2 (en) Image processing device, image processing method, non-transitory tangible medium having image processing program, and image-pickup device
KR101824439B1 (en) Mobile Stereoscopic Camera Apparatus and Method of Shooting thereof
JP2011064894A (en) Stereoscopic image display apparatus
KR20140051112A (en) Primary and auxiliary image capture devices for image processing and related methods
KR20090035880A (en) Osmu( one source multi use)-type stereoscopic camera and method of making stereoscopic video content thereof
KR100818155B1 (en) Stereo camera system for mobile device and the method for controlling convergence
WO2012124331A1 (en) Three-dimensional image pickup device
JP2012186612A (en) Imaging device
JP2017041887A (en) Image processing system, imaging apparatus, image processing method and program
JP6155471B2 (en) Image generating apparatus, imaging apparatus, and image generating method
US20120154543A1 (en) Document camera, method for controlling document camera, program, and display processing system
CN103329549B (en) Dimensional video processor, stereoscopic imaging apparatus and three-dimensional video-frequency processing method
CN103339948B (en) 3D video playing device, 3D imaging device, and 3D video playing method
US20120307016A1 (en) 3d camera
EP2566166A1 (en) Three-dimensional imaging device
JP5562122B2 (en) Image processing apparatus and control method thereof
US9325975B2 (en) Image display apparatus, parallax adjustment display method thereof, and image capturing apparatus
CN104041026B (en) Image take-off equipment, method and program and recording medium thereof
JP2013046081A (en) Image capturing device and image generation method
JP2014102265A (en) 3d imaging apparatus
KR20130052582A (en) Osmu( one source multi use)-type stereoscopic camera and method of making stereoscopic video content thereof
JP2014103431A (en) 3d imaging apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12757984

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12757984

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP