WO2012124331A1 - Three-dimensional image capture device - Google Patents

Three-dimensional image capture device (Dispositif de capture d'image tridimensionnelle)

Info

Publication number
WO2012124331A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
image
distance
imaging
cameras
Prior art date
Application number
PCT/JP2012/001798
Other languages
English (en)
Japanese (ja)
Inventor
東郷 仁麿
孝幸 有馬
井村 康治
和之 田中
中村 剛
郁雄 渕上
山口 徹
Original Assignee
パナソニック株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2011058827A external-priority patent/JP2014103431A/ja
Priority claimed from JP2011058826A external-priority patent/JP2014102265A/ja
Priority claimed from JP2011058832A external-priority patent/JP2014102266A/ja
Application filed by パナソニック株式会社 (Panasonic Corporation)
Publication of WO2012124331A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B35/00Stereoscopic photography
    • G03B35/08Stereoscopic photography by simultaneous recording
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/243Image signal generators using stereoscopic image cameras using three or more 2D image sensors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/25Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor

Definitions

  • The present invention relates to camera-equipped portable electronic devices, such as mobile phones and digital still cameras, that have a 3D camera capable of 3D imaging.
  • FIGS. 24(a), (b) and (c) show an example of the structure of a conventional 3D camera consisting of two cameras; details are described in Patent Document 1.
  • The two cameras 12 and 13, each with an optical zoom, are disposed apart from each other on the back surface of the display element 15 of the housing 11 of the imaging device.
  • FIG. 24(c) shows a block diagram of the imaging apparatus.
  • The first camera 12 and the second camera 13, both with optical zoom, are controlled by the camera control means 18.
  • The positions of the lenses 16 of the first camera 12 and the second camera 13 are adjusted based on the adjustment values of a look-up table so that the optical zoom magnifications of the two cameras coincide.
  • FIG. 25 shows a block diagram of another conventional imaging apparatus, described in Patent Document 2.
  • Two cameras 22 and 23 are disposed apart from each other on the back surface of the display element of the housing.
  • The first camera 22 has an optical zoom, while the second camera 23 is a single-focus camera without an optical zoom. The number of pixels of the first camera 22 is larger than that of the second camera 23.
  • The first camera 22 with the optical zoom and the single-focus second camera 23 are controlled by a camera control means 28.
  • The optical zoom magnification of the first camera 22 and the electronic zoom magnification of the second camera 23 are adjusted so that the zoom magnifications of the two cameras match.
  • Resolution enhancement of the second camera image is then performed as follows.
  • A region of the first camera image (a corresponding block) that corresponds to a part of the second camera image (a block) is searched for by image processing, and the block of the second camera image is replaced with the corresponding block of the first camera image.
  • In this way, the resolution of the second camera image can be increased, and the difference in resolution between the first camera image and the second camera image can be reduced.
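  • As a purely illustrative aside (not part of the patent text), the block search and replacement just described can be sketched in a few lines of Python. The sketch assumes grayscale numpy arrays of equal size, a fixed block size, and a sum-of-absolute-differences (SAD) criterion, all of which are assumptions; a real implementation would also reject blocks for which no good correspondence is found, as the later discussion of this prior art notes.

```python
import numpy as np

def enhance_with_corresponding_blocks(low_res, high_res, block=16, search=8):
    """Illustrative sketch: for each block of the low-resolution (second-camera)
    image, search the high-resolution (first-camera) image for the best-matching
    block by SAD and copy it in. Block size and search radius are assumptions."""
    out = low_res.copy()
    h, w = low_res.shape
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            ref = low_res[y:y + block, x:x + block].astype(np.int32)
            best, best_sad = None, None
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy <= h - block and 0 <= xx <= w - block:
                        cand = high_res[yy:yy + block, xx:xx + block].astype(np.int32)
                        sad = np.abs(ref - cand).sum()
                        if best_sad is None or sad < best_sad:
                            best, best_sad = cand, sad
            if best is not None:
                out[y:y + block, x:x + block] = best  # replace block with its best match
    return out
```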
  • FIGS. 26(a), (b) and (c) show an example of the structure of another conventional 3D camera consisting of two cameras; details are described in Patent Document 3.
  • The two cameras 32 and 33 are disposed apart from each other in the housing 31 of the imaging apparatus. As shown in FIG. 26(b), the distance between the optical axes of the cameras (the baseline length) and the angles of the optical axes can be changed mechanically.
  • FIG. 26(c) shows a block diagram.
  • The first camera 32 and the second camera 33 are controlled by the camera control means 38.
  • Position adjustment (parallax adjustment) of the first camera 32 and the second camera 33 in the horizontal direction is also performed.
  • Here, parallax means the difference in the horizontal position of a given subject between the left and right images.
  • By adjusting the parallax, the subject can be placed freely in front of or behind the 3D display screen.
  • FIG. 27 shows the relationship between the subject distance from the 3D camera to the subject and the parallax angle (convergence angle) of the left and right images when the optical axes of the 3D camera are parallel.
  • Depending on the parallax, the three-dimensional image reproduced on a 3D display does not lie on the display screen but appears in front of or behind it.
  • When the deviation from the convergence angle at the display screen exceeds a certain value, comfortable stereoscopic viewing becomes difficult; if the deviation becomes even larger, stereoscopic viewing itself becomes impossible.
  • For comfortable viewing, the parallax angle should differ from that at the display screen by only about ±1 degree or less.
  • When the baseline length is 6.5 cm and the subject distance ranges from about 1 m to infinity, there is no problem because the parallax angle is 2 degrees or less (a deviation of ±1 degree or less), as shown in FIG. 27.
  • If the baseline length is made shorter, the parallax angle can be kept at 2 degrees or less even for a short-distance subject at about 50 cm.
  • However, the convergence angle increases as the subject approaches the camera, and the subject appears to jump far out of the display screen, making it difficult to view stereoscopically.
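  • As a rough numerical illustration (not taken from the patent), the convergence angle of a point subject for parallel optical axes can be approximated by the textbook relation 2·atan(baseline / (2·distance)). The short Python sketch below evaluates it for the 6.5 cm baseline mentioned above and for an assumed shorter baseline; it is not intended to reproduce FIG. 27 exactly, and the exact angle definition used in the figure may differ.

```python
import math

def convergence_angle_deg(baseline_m: float, distance_m: float) -> float:
    """Convergence (parallax) angle of a point subject for parallel optical axes."""
    return math.degrees(2.0 * math.atan(baseline_m / (2.0 * distance_m)))

if __name__ == "__main__":
    for baseline in (0.065, 0.035):   # 6.5 cm (from the text) and an assumed 3.5 cm
        for distance in (0.5, 1.0, 2.0, float("inf")):
            angle = 0.0 if math.isinf(distance) else convergence_angle_deg(baseline, distance)
            print(f"baseline {baseline*100:.1f} cm, subject at {distance} m -> {angle:.2f} deg")
```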
  • Patent Document 4 describes a method of changing the convergence angle electronically.
  • The convergence angle can be changed by adjusting the horizontal cut-out positions of the left and right images.
  • Patent Document 5 describes a method of generating a parallax image such that a specific object among the subjects in the 3D expression space falls within a recommended parallax range.
  • In that method, the parallax is adjusted so that the designated object is displayed near the surface of the display screen.
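  • The cut-out adjustment described for Patent Document 4 can be illustrated (again purely as an assumption-laden sketch, not as the patented method) by cropping the left and right images at opposite horizontal offsets, which shifts every subject's on-screen parallax by the same amount:

```python
import numpy as np

def adjust_convergence(left, right, shift_px):
    """Illustrative sketch of electronically adjusting convergence: crop the left
    image shifted one way and the right image shifted the other way, which moves
    every subject's on-screen parallax by 2*shift_px pixels. The sign convention
    below is an assumption."""
    h, w = left.shape[:2]
    s = abs(int(shift_px))
    if s == 0:
        return left, right
    if shift_px > 0:
        return left[:, s:], right[:, :w - s]
    return left[:, :w - s], right[:, s:]
```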
  • FIGS. 28(a) and 28(b) show an example of the conventional structure of a 3D camera consisting of a twin-lens camera.
  • The first camera 42 and the second camera 43 are disposed apart from each other in the housing 41 of the imaging device.
  • A display element 45 for displaying the 3D photographed image is disposed on the back surface of the housing 41.
  • The display element 45 is, for example, a parallax-barrier naked-eye 3D liquid crystal display.
  • FIG. 29 shows the state in which the display screen is made vertically elongated by rotating the housing 41 by 90 degrees; when the subject is vertically long, shooting is often performed in this way.
  • FIG. 30(a) shows landscape (horizontal) shooting and FIG. 30(b) shows portrait (vertical) shooting.
  • In both cases, shooting is performed by the two cameras 52 and 53, which are disposed apart from each other in the direction of the long side of the display element 55, as in the conventional case.
  • The camera sensors 57 of the left and right cameras are both arranged horizontally long.
  • For vertical shooting, the arrangement of the camera sensors 57 is changed from horizontal to vertical by a mechanism that rotates the first camera 52 and the second camera 53 by 90 degrees about the camera optical axis.
  • The display element 55 also rotates 90 degrees in the same direction as the cameras, which enables display of a vertically long 3D image.
  • The display element 55 therefore needs to be a 3D display device capable of both vertical and horizontal display.
  • Another method of generating a 3D image with a 3D camera uses parallax information (or distance information). After the parallax is measured from the main image and the sub image of a twin-lens camera, which includes a main image sensor and a sub image sensor disposed spatially apart from each other, a 3D image can be generated from the parallax information and the main image.
  • Japanese Patent No. 3303254; Japanese Patent Application Laid-Open No. 2005-210217; Japanese Patent Application Laid-Open No. 07-167633; Japanese Patent Application Laid-Open No. 08-251625; Japanese Patent Application Laid-Open No. 2004-220127; Japanese Patent Application Laid-Open No. 10-224820; Japanese Patent Application Laid-Open No. 2005-20606
  • The 3D optical zoom camera of Patent Document 1, described as the first prior art, needs two expensive and large optical zoom cameras, so it has the problem of making the portable terminal expensive and large.
  • The 3D optical zoom camera of Patent Document 2, described as the second prior art, can be made inexpensive and small because one of its two cameras is a single-focus camera. However, the software processing for resolution conversion takes time, which makes it difficult to shoot moving pictures. Furthermore, corresponding blocks (or corresponding pixels) of the optical zoom camera image may not be found in the single-focus camera image, in which case the single-focus camera image is partially degraded to low resolution.
  • Although the first and second conventional examples assume that the first cameras 12 and 22 have an optical zoom function, the same problem occurs even for a simple 3D camera consisting of a high-resolution third camera and a low-resolution fourth camera. The issue is described below.
  • An image cut out from the third camera, whose maximum number of pixels is AL, is taken as the third camera generated image, and its number of pixels is BL (BL ≤ AL).
  • Similarly, an image cut out from the fourth camera, whose maximum number of pixels is AR (AR < AL), is taken as the fourth camera generated image, and its number of pixels is BR (BR ≤ AR).
  • A 3D image that uses the third camera generated image and the fourth camera generated image as its left and right images has a large resolution difference between the two and becomes difficult to view as a 3D image.
  • When the resolutions of the left and right images differ, the image quality of the 3D image is degraded.
  • The 3D camera of Patent Document 3, described as the third prior art, can achieve both a 3D effect and viewability by changing the relative arrangement of the cameras, but it requires a mechanism for continuously moving the two cameras, so it has the problem of becoming expensive and large.
  • The 3D camera of Patent Document 4 can change the horizontal distance (parallax) of a subject in the left and right images by the image cut-out position, so a complicated mechanical structure such as that of Patent Document 3 is unnecessary. However, only a uniform adjustment of the parallax in the horizontal direction is possible, and the degree of the sense of depth between subjects (the 3D effect) cannot be adjusted. Therefore, in a 3D image containing both a short-distance subject and a long-distance subject, it has not been possible to adjust the image so that all subjects have a comfortable parallax.
  • For example, a short-distance subject may pop out of the display screen so strongly that it is hard to view; if its parallax is reduced to weaken this pop-out effect, a long-distance subject may be pushed too far behind the screen, becoming difficult or impossible to view stereoscopically.
  • Patent Document 6 proposes a method for realizing vertical 3D shooting, but it requires a mechanism for physically rotating the two cameras about their optical axes and a rotation mechanism for the display element, which makes the imaging device larger.
  • An object of the present invention is to provide a simple 3D imaging apparatus capable of generating a 3D image by a method suitable for an imaging state.
  • A 3D imaging apparatus of the present invention includes: two cameras having different optical axes; a distance measuring unit that measures the distance to a subject based on the parallax between the two camera images captured by the two cameras; a first 3D image generation unit that generates a 3D image from the two camera images; and a second 3D image generation unit that generates a 3D image based on the distance to the subject measured by the distance measuring unit and a camera image captured by one of the two cameras. According to the shooting state, which is determined from the positions of the two cameras relative to the subject, the apparatus switches between 3D image generation by the first 3D image generation unit and 3D image generation by the second 3D image generation unit.
  • In the 3D imaging apparatus of the present invention, the two cameras are a first camera with an optical zoom function and a single-focus second camera. When the zoom magnification of the first camera is smaller than a predetermined magnification, the first 3D image generation unit generates a 3D image from the optical zoom image of the first camera and the electronic zoom image of the second camera; when the zoom magnification is larger than the predetermined magnification, the second 3D image generation unit generates a 3D image based on the distance to the subject measured by the distance measuring unit and the camera image captured by the first camera.
  • In the 3D imaging apparatus of the present invention, the two cameras are a third camera and a fourth camera having fewer pixels than the third camera. When the number of pixels of the 3D image to be generated is equal to or less than a predetermined number, the first 3D image generation unit generates a 3D image from the camera image captured by the third camera and the camera image captured by the fourth camera; when the number of pixels of the 3D image to be generated exceeds the predetermined number, the second 3D image generation unit generates a 3D image based on the distance to the subject measured by the distance measuring unit and the camera image captured by the third camera.
  • In the 3D imaging apparatus of the present invention, the two cameras are a third camera and a fourth camera having fewer pixels than the third camera. In the case of 3D moving image shooting, the first 3D image generation unit generates a 3D image from the camera image captured by the third camera and the camera image captured by the fourth camera; in the case of 3D still image shooting, the second 3D image generation unit generates a 3D image based on the distance to the subject measured by the distance measuring unit and the camera image captured by the third camera.
  • In the 3D imaging apparatus of the present invention, in a long-distance shooting mode in which the distance to the subject measured by the distance measuring unit is equal to or greater than a predetermined value, the first 3D image generation unit generates a first 3D image from the camera image captured by one of the two cameras and the camera image captured by the other of the two cameras. In a short-distance shooting mode, in which the distance to the subject measured by the distance measuring unit is less than the predetermined value, the second 3D image generation unit generates a second 3D image based on the distance to the subject measured by the distance measuring unit and the camera image captured by one of the two cameras. The parallax angle difference between a first object and a second object having different parallax angles in the second 3D image is smaller than the parallax angle difference between the first object and the second object in the first 3D image.
  • The 3D imaging apparatus of the present invention further includes a display unit that outputs the first 3D image and/or the second 3D image.
  • With this configuration, both long-distance and short-distance shooting can be performed with an inexpensive camera configuration, and 3D shooting that achieves both a 3D effect and comfortable viewing can be realized.
  • In the 3D imaging apparatus of the present invention, the distance measuring unit calculates the closest distance of the object nearest to the 3D imaging apparatus; when the closest distance is equal to or greater than the predetermined value, the long-distance shooting mode is selected, and when the closest distance is less than the predetermined value, the short-distance shooting mode is selected.
  • The 3D imaging apparatus of the present invention includes a housing and a rectangular display element disposed in the housing, and the two cameras are arranged side by side in the long-side direction of the display element. In a vertical 3D shooting mode, in which the long side of the display element is vertical, the distance measuring unit measures the distance to the subject from the parallax between the two camera images captured by the two cameras, and the second 3D image generation unit generates a 3D image based on the distance to the subject measured by the distance measuring unit and the camera image captured by one of the two cameras.
  • With this configuration, both vertical 3D shooting and horizontal 3D shooting can be realized compactly and inexpensively without physically changing the arrangement of the cameras.
  • In the 3D imaging apparatus of the present invention, in a horizontal 3D shooting mode in which the long side of the display element is horizontal, the first 3D image generation unit generates a 3D image from the camera images captured by the two cameras.
  • The 3D imaging apparatus of the present invention further includes an orientation sensor that detects whether the long side of the display element is horizontal or vertical, and switches between the vertical 3D shooting mode and the horizontal 3D shooting mode according to the detection result of the orientation sensor.
  • In the 3D imaging apparatus of the present invention, the display element has a 3D display function and displays a 3D preview in the horizontal 3D shooting mode; in the vertical 3D shooting mode, the preview display on the display element is performed with a 2D image using the image of one of the two cameras.
  • With this configuration, the preview display can be performed without being limited by the 3D image generation time.
  • In the 3D imaging apparatus of the present invention, in the vertical 3D shooting mode, the preview display on the display element is performed with a pseudo 3D image in which parallax in the short-side direction of the display element is added to one camera image of the two cameras.
  • With this configuration, the preview in the vertical 3D shooting mode can be displayed in a pseudo-3D manner, and the difference from 2D shooting can be easily recognized.
  • In the 3D imaging apparatus of the present invention, the parallax settings of the left and right images in the horizontal 3D shooting mode and the vertical 3D shooting mode can be set independently.
  • In the 3D imaging apparatus of the present invention, the parallax setting value of the vertical 3D shooting mode is set for shorter distances than that of the horizontal 3D shooting mode.
  • This configuration makes it easy to perform short-distance shooting during vertical shooting.
  • A 3D imaging apparatus of the present invention includes: a first camera; a second camera having an optical axis different from that of the first camera and an angle of view larger than that of the first camera; a camera control unit that controls the second camera so that a camera image of a shooting range equal to the angle of view of the first camera is captured in synchronization with the shooting time of the first camera; and a 3D image generation unit that generates a 3D image from the camera image of the first camera and the camera image of the second camera.
  • In the 3D imaging apparatus of the present invention, the camera control unit changes the frequency of the clock signal input to the second camera so that the imaging time of the camera image of the second camera is synchronized with the imaging time of the first camera.
  • FIG. 7: Diagram showing the relationship between the number of pixels and the resolution ratio in the first 3D image generation method. FIG. 8: Flowchart of 3D imaging according to Embodiment 2 of the present invention. FIGS. 9(a), (b), (c): Schematic structural views of the 3D imaging apparatus according to Embodiment 3 of the present invention. FIGS. 10(a), (b): Diagrams showing a general 3D image capturing method.
  • FIG. 11: Diagram showing the 3D image generation method of Embodiment 3 of the present invention. FIGS. 12(a), (b): Diagrams showing details of the 3D image generation method of Embodiment 3 of the present invention. FIG. 13: 3D imaging flowchart of Embodiment 3 of the present invention. FIGS. 14(a), (b), (c): Schematic structural views of the 3D imaging apparatus according to Embodiment 4 of the present invention. FIGS. 15(a), (b): Outlines of the horizontal shooting mode and the vertical shooting mode according to Embodiment 4 of the present invention. FIG. 16: Diagram showing the 3D image generation method of Embodiment 4 of the present invention.
  • FIG. 17: 3D imaging flowchart of Embodiment 4 of the present invention. FIG. 18: 3D imaging flowchart of Embodiment 5 of the present invention.
  • FIGS. 21(a), (b): Diagrams showing the 3D image generation method of Embodiment 6 of the present invention. FIGS. 22(a), (b), (c): Schematic structural views of the 3D imaging apparatus according to Embodiment 7 of the present invention. FIG. 23: Diagram showing the images acquired by the sensors of the cameras of Embodiment 7 of the present invention.
  • FIG. 1(a), FIG. 1(b), and FIG. 1(c) are schematic structural views of the 3D imaging apparatus according to the present embodiment.
  • A 3D camera 104 consisting of a first camera 102 and a second camera 103 is disposed on the back of the display element 105 of the rectangular casing 101 of the mobile terminal device.
  • The first camera 102 is a camera with an optical zoom function, and the second camera 103 is a single-focus camera without an optical zoom function. Since the second camera 103 is cheaper and smaller than the first camera 102, the portable terminal can be made smaller or thinner than when two cameras with optical zoom, as in FIG. 24(a), are used.
  • Although the first camera 102 with the optical zoom function is the camera for the left image in FIG. 1(a), it may instead be the camera for the right image.
  • FIG. 1(c) shows a block diagram of the portable terminal device.
  • The first camera 102 and the second camera 103 are controlled by the camera control means 108, and the photographed image is stored in the storage means 110 or displayed using the display element 105.
  • The first camera 102 consists of a sensor 107 and a lens 106 and has an optical zoom function; the position of the lens 106 is controlled by the zoom control signal 109 to obtain a desired zoom magnification.
  • FIG. 2 shows the 3D image generation methods used at the time of zooming.
  • First, the 3D image generation method used when the zoom magnification N is smaller than a switching magnification N0 is described; this is called the first 3D image generation method.
  • In this method, a 3D image is generated by using the optical zoom image [1] of the first camera as the left image and the electronic zoom image [2] of the second camera as the right image.
  • The first 3D image generation method is used only when the zoom magnification N is smaller than N0, for the following reason.
  • As the zoom magnification becomes larger, YR (or XR) becomes smaller relative to YL (or XL); that is, the electronically zoomed right image retains fewer pixels than the optically zoomed left image.
  • The inventors therefore conducted a subjective evaluation of whether the resulting 3D image is comfortable to view.
  • Based on the evaluation, N0 is set, for example, to the zoom magnification at which YR/YL falls to about 0.4, so that YR/YL is 0.4 or more whenever N < N0.
  • Then, when N < N0, a 3D image can be generated by the first 3D image generation method described above, as sketched below.
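  • The following Python sketch illustrates how such a switching magnification N0 could be derived from the 0.4 ratio and used to pick the generation method. It assumes, purely for illustration, that the electronic zoom simply crops the single-focus sensor (so its usable vertical pixel count falls as 1/N) while the optical zoom keeps the full sensor, and the sensor sizes are made-up numbers; it is not the patent's implementation.

```python
def switching_magnification(yl_full: int, yr_full: int, min_ratio: float = 0.4) -> float:
    """Zoom magnification N0 at which the cropped (electronically zoomed) image's
    vertical pixel count YR = yr_full / N drops to min_ratio of YL."""
    return yr_full / (min_ratio * yl_full)

def choose_generation_method(n: float, n0: float) -> str:
    """First method (stereo pair) below N0, second method (distance-based) otherwise."""
    return "first (optical zoom + electronic zoom pair)" if n < n0 else \
           "second (one camera image + distance information)"

if __name__ == "__main__":
    N0 = switching_magnification(yl_full=1080, yr_full=1080)   # assumed 1080-line sensors
    for n in (1.0, 2.0, N0, 4.0):
        print(f"N = {n:.2f}: use the {choose_generation_method(n, N0)} method")
```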
  • Next, the second 3D image generation method is described. It calculates or estimates distance information (depth information) for each pixel or small region and generates a 3D image from that distance information.
  • The parallax information (distance information) [5] of each pixel can be calculated by establishing correspondences between the pixels (or small regions) of the two camera images using correlation calculation or other image processing.
  • Left and right images [6] and [7] are then obtained from the first camera left image [3] and the disparity information [5].
  • In this way, both the left and right images can be made high-resolution images.
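  • One common way to realize the "one camera image plus per-pixel parallax" step is a simple forward warp, shifting each pixel by plus or minus half of its disparity to synthesize the left and right views. The Python sketch below assumes numpy arrays and omits hole filling and occlusion handling, so it is only an illustration of the general idea, not the patent's algorithm.

```python
import numpy as np

def render_stereo_from_disparity(image, disparity, scale=1.0):
    """Minimal sketch: forward-warp each pixel by +/- half its (scaled) disparity
    to synthesize a left/right pair from a single image."""
    h, w = image.shape[:2]
    left = np.zeros_like(image)
    right = np.zeros_like(image)
    xs = np.arange(w)
    for y in range(h):
        d = (disparity[y] * scale / 2.0).astype(int)
        xl = np.clip(xs + d, 0, w - 1)     # shift one way for the left view
        xr = np.clip(xs - d, 0, w - 1)     # and the other way for the right view
        left[y, xl] = image[y]
        right[y, xr] = image[y]
    return left, right
```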
  • FIG. 4 shows a 3D imaging flowchart at the time of zooming according to the first embodiment of the present invention.
  • With this procedure, high-resolution 3D imaging can be performed over a wide zoom magnification range.
  • For moving image shooting, however, the second 3D image generation method takes processing time and lowers the frame rate of the moving image, so the first 3D image generation method is more desirable.
  • FIG. 5 shows a 3D imaging flowchart for this case.
  • In the case of moving image shooting, the first 3D image generation method, which has a high processing speed, is adopted; in the case of still image shooting, the high-resolution second 3D image generation method is adopted.
  • In this way, a 3D image can be generated without reducing the frame rate of the 3D moving image.
  • (Second Embodiment) In the second embodiment, a method of switching the 3D image generation method according to the number of pixels of the 3D image to be generated is described.
  • The description assumes a mobile terminal device having a single housing, but the same applies to other electronic devices having small cameras, such as foldable mobile terminals and digital still cameras (DSCs).
  • FIG. 6(a) is a block diagram of the 3D imaging apparatus according to the second embodiment.
  • Although the third camera 111 and the fourth camera 112 both have no optical zoom function here, they may have one.
  • The 3D image generation method shown in FIG. 6(b) is the same as the first 3D image generation method of the first embodiment.
  • Here, the number of pixels means the number of vertical pixels × the number of horizontal pixels, and the resolution means the number of pixels of the original image. For simplicity, the aspect ratio of the image is assumed to be constant.
  • An image generated from the third camera 111, whose maximum number of pixels is AL ([6]), is the third camera generated image ([8]), and its number of pixels is BL (BL ≤ AL).
  • The image obtained by resizing the third camera generated image [8] to C pixels is the [10] image, and the image obtained by resizing the fourth camera generated image [9] to C pixels is the [11] image; a 3D image is obtained from [10] and [11].
  • FIG. 7 shows an example of a graph of the resolution ratio P (= (right resolution) / (left resolution)) with respect to the number of pixels C of the 3D image.
  • When the resolution ratio is P0 or less, the resolution difference is too large and the first 3D image generation method is not suitable. Therefore, when C > C0, the second 3D image generation method is used to generate the 3D image (see the sketch below).
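  • The Python sketch below shows one plausible way to read the behaviour of FIG. 7, under the assumption that the effective resolution of each resized image is capped by the pixel count of its source image; the pixel counts and the threshold P0 = 0.4 are made-up illustrative values, not figures taken from the patent.

```python
def resolution_ratio(c: int, bl: int, br: int) -> float:
    """Resolution ratio P = (right effective resolution) / (left effective resolution)
    when both generated images are resized to c pixels; effective resolution is
    assumed to be capped by the source pixel count."""
    return min(c, br) / min(c, bl)

def choose_method(c: int, bl: int, br: int, p0: float = 0.4) -> str:
    return "first (two-camera stereo pair)" if resolution_ratio(c, bl, br) >= p0 else \
           "second (third-camera image + distance information)"

if __name__ == "__main__":
    BL, BR = 8_000_000, 2_000_000          # assumed pixel counts of the generated images
    c0 = int(BR / 0.4)                     # pixel count at which P falls to P0 = 0.4
    for c in (1_000_000, BR, c0, c0 + 1, BL):
        print(f"C = {c:>9,d}: P = {resolution_ratio(c, BL, BR):.2f} -> {choose_method(c, BL, BR)}")
```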
  • FIG. 6C shows a second 3D image generation method.
  • FIG. 8 shows a flowchart of 3D imaging according to the second embodiment of the present invention.
  • In the second 3D image generation method, the distance information [12] is calculated from the third camera image [10] and the fourth camera image [11], and a 3D image is obtained from the high-resolution third camera image [10] of pixel number C and the distance information [12].
  • The generation method is the same as the second 3D image generation method of the first embodiment. In this way, a 3D image whose left and right images have the same resolution can be obtained.
  • FIGS. 9A, 9B, and 9C are schematic structural views of the 3D imaging apparatus of the present embodiment.
  • The third embodiment is described assuming a portable terminal device consisting of a single housing, but the same applies to other electronic devices having small cameras, such as foldable portable terminals and digital still cameras (DSCs).
  • The 3D camera 204, consisting of the first camera 202 and the second camera 203, is disposed on the back of the display element 205 of the rectangular casing 201 of the mobile terminal device.
  • FIG. 9(c) shows a block diagram of the portable terminal device.
  • The first camera 202 and the second camera 203 are controlled by the camera control unit 208, and the photographed image is stored in the storage unit 209 or displayed using the display element 205.
  • The distance information measuring means 210 measures distance information of the subject from the images of the first camera 202 and the second camera 203.
  • FIGS. 10(a) and 10(b) show a general 3D image capturing method for a scene containing a long-distance object 211 and a short-distance object 212.
  • FIG. 10(a) shows the case where both the long-distance object 211 and the short-distance object 212 are behind the virtual projection plane (or 3D display screen).
  • FIG. 10(b) shows the case where the short-distance object 212 is near the 3D camera 204, so that the long-distance object 211 is behind the virtual projection plane and the short-distance object 212 is in front of it.
  • In this case, the parallax angle range D2 of the subjects in FIG. 10(b) is wider than the parallax angle range D1 of the subjects in FIG. 10(a).
  • The parallax (C2L − C2R) of the short-distance object 212 can be reduced by shifting the horizontal positions of the left and right images as in Patent Document 4, but the parallax (C1L − C1R) of the long-distance object 211 may then increase in the negative direction, making it impossible to view stereoscopically.
  • When the closest subject is sufficiently far away, a 3D image is obtained using the first camera image and the second camera image as the left and right images; this is referred to as the long-distance shooting mode.
  • In this mode the parallax is small and the parallax angle range D1 is narrow, so both the long-distance object 211 and the short-distance object 212 can be viewed stereoscopically in comfort.
  • For example, the parallax angle can be kept within ±1 degree, and a comfortable 3D image can be obtained.
  • When the distance of the object 212 closest to the camera is less than a certain value (for example, 1 m), the mode is referred to as the short-distance shooting mode, and the following 3D image generation method is adopted.
  • FIG. 11 shows a 3D image generation method in the short distance shooting mode and the long distance shooting mode. Further, FIG. 13 shows a flowchart of 3D imaging.
  • In the long-distance shooting mode, a 3D image is obtained directly from the left and right images [1] and [2].
  • In the short-distance shooting mode, correlation calculation or other image processing is applied to each pixel (or small block) of the left and right images [3] and [4], and the parallax information (distance information) [5] of each pixel is calculated.
  • The measurement of the parallax information (distance information) is performed by the distance information measuring means 210.
  • Left and right images [6] and [7] are then obtained from the first camera image [3] (or the second camera image [4]) and the parallax information [5].
  • At this time, the sense of depth can be made smaller than the actual one by multiplying the calculated parallax (distance) information by a constant value a smaller than 1, which makes the parallax angle range narrower than the actual one (see the sketch below).
  • As a result, the parallax of the short-distance object in the short-distance shooting mode of FIG. 12(a) can be made smaller than the parallax that the short-distance object of FIG. 10(b) has in the images [1] and [2] of the long-distance shooting mode.
  • Consequently, the parallax angle range D3 becomes narrower than the parallax angle range D2, and even when the short-distance object is very close, the variation of the parallax angle does not become too large, so comfortable stereoscopic viewing is possible.
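  • As a small illustration of the constant-a adjustment (again an assumption-laden sketch rather than the patented processing), scaling the per-pixel parallax values before re-rendering the left and right views compresses the depth range:

```python
import numpy as np

def compress_depth(disparity: np.ndarray, a: float = 0.5) -> np.ndarray:
    """Multiply the measured parallax (distance) information by a constant a < 1,
    narrowing the parallax angle range. The value a = 0.5 is an arbitrary example."""
    if not 0.0 < a < 1.0:
        raise ValueError("a is expected to be between 0 and 1")
    return disparity * a

# The scaled values would then feed the same rendering step sketched earlier,
# e.g. render_stereo_from_disparity(image, compress_depth(disparity, a=0.5)).
```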
  • The user may switch manually between the long-distance shooting mode and the short-distance shooting mode, but for automatic switching the 3D imaging apparatus needs a means for measuring the distance of the closest subject.
  • A general distance measuring device using the reflection time of radar or infrared light could be used, but it would make the portable terminal device more expensive and larger, so here a method of estimating the distance from the images of the first camera 202 and the second camera 203 is adopted. This eliminates the need for additional components for the distance measuring means.
  • In the distance information measuring means 210, if the corresponding pixels (or blocks) of the left and right camera images are known, the distance of those pixels (blocks) can be estimated, and the closest of them can be taken as the short-distance subject.
  • Distance information may also be calculated only for pixels (or blocks) at several points in the photographed image.
  • Alternatively, the distance of the closest subject may be estimated from the distance information of pixels (or blocks) near the center of the image (see the sketch below).
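  • The Python sketch below illustrates this kind of image-based estimate under several assumptions that are not in the patent: grayscale numpy images, a crude SAD block search, an assumed focal length in pixels and baseline, and the pinhole relation Z = f·B/d. It then applies the 1 m threshold mentioned above to pick the shooting mode.

```python
import numpy as np

def block_disparity(left, right, y, x, block=16, max_disp=64):
    """Crude SAD-based disparity estimate for one block; parameters are assumptions."""
    ref = left[y:y + block, x:x + block].astype(np.int32)
    best_d, best_sad = 0, None
    for d in range(0, max_disp + 1):
        if x - d < 0:
            break
        cand = right[y:y + block, x - d:x - d + block].astype(np.int32)
        sad = np.abs(ref - cand).sum()
        if best_sad is None or sad < best_sad:
            best_d, best_sad = d, sad
    return best_d

def closest_subject_distance(left, right, focal_px=1000.0, baseline_m=0.03,
                             sample_points=((0.5, 0.5), (0.4, 0.5), (0.6, 0.5))):
    """Estimate the nearest subject distance from a few blocks near the image centre
    using Z = f * B / d. Focal length, baseline and sampling grid are assumptions."""
    h, w = left.shape[:2]
    dmax = 0
    for fy, fx in sample_points:
        dmax = max(dmax, block_disparity(left, right, int(fy * h), int(fx * w)))
    return float("inf") if dmax == 0 else focal_px * baseline_m / dmax

def select_mode(left, right, threshold_m=1.0):
    """Long-distance mode if the closest subject is at least threshold_m away,
    otherwise short-distance mode (1 m is the example value given above)."""
    return "long-distance" if closest_subject_distance(left, right) >= threshold_m else "short-distance"
```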
  • FIGS. 14 (a), (b) and (c) are schematic structural diagrams of the 3D imaging apparatus of the present embodiment.
  • The fourth embodiment is described assuming a portable terminal device consisting of a single housing, but the same applies to other electronic devices having small cameras, such as foldable portable terminals and digital still cameras (DSCs).
  • A twin-lens 3D camera 304 consisting of a first camera 302 and a second camera 303 is disposed on the back of the display element 305 of the rectangular casing 301 of the portable terminal device.
  • The first camera 302 and the second camera 303 each consist of a camera sensor 307 and a lens 306.
  • FIG. 14(c) shows a block diagram of the portable terminal device.
  • The first camera 302 and the second camera 303 are controlled by the camera control unit 308, and the captured image is stored in the storage unit 309 or displayed using the display element 305.
  • The display element 305 is, for example, a naked-eye 3D liquid crystal display with a barrier liquid crystal, and can switch between horizontal and vertical display of a 3D image by switching the barrier direction between vertical and horizontal.
  • Depending on the display element, 3D display itself may not be possible.
  • The distance information extraction unit 310 has a function of extracting distance information of the subject from the images of the first camera 302 and the second camera 303.
  • This portable terminal also incorporates an orientation sensor 311, consisting of an acceleration sensor, an azimuth sensor, and the like, and can detect the orientation of the terminal.
  • Shooting in which the long side of the display element is horizontal is referred to as the horizontal 3D shooting mode, and shooting in which the long side of the display element is vertical is referred to as the vertical 3D shooting mode.
  • The orientation sensor 311 detects whether the long side of the display element is horizontal or vertical, and the terminal automatically determines whether to set the horizontal 3D shooting mode or the vertical 3D shooting mode, as sketched below.
  • Alternatively, the user may select the horizontal or vertical 3D shooting mode manually, in which case the orientation sensor 311 may be omitted.
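  • A toy Python sketch of this decision (the sensor reading and the axis convention are hypothetical, not from the patent):

```python
def shooting_mode(accel_x: float, accel_y: float) -> str:
    """Pick the shooting mode from a hypothetical accelerometer reading, where
    x is along the display's long side and y along its short side. If gravity lies
    mostly along the short side, the long side is horizontal -> horizontal mode;
    otherwise the long side is vertical -> vertical mode."""
    return "horizontal 3D shooting mode" if abs(accel_y) >= abs(accel_x) else "vertical 3D shooting mode"

# Example: terminal held upright with the long side vertical.
print(shooting_mode(accel_x=9.6, accel_y=0.4))   # -> vertical 3D shooting mode
```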
  • FIG. 15A shows an outline of the horizontal 3D shooting mode
  • FIG. 15B shows an outline of the vertical 3D shooting mode.
  • The long sides of the camera sensors 307 of the first camera 302 and the second camera 303 are parallel to the long side of the display element.
  • In the horizontal 3D shooting mode, a 3D image is obtained by using the first camera image and the second camera image as the left and right images, as in the prior art.
  • In the vertical 3D shooting mode, however, the left and right images must form a 3D image having left-right parallax in the short-side direction of the display element.
  • Such a vertical 3D image, with left-right parallax in the short-side direction, cannot be obtained simply by using the first camera image and the second camera image as the left and right images. A method of generating a vertical 3D image is therefore described next.
  • FIG. 16 shows a generation procedure of horizontal 3D images and vertical 3D images.
  • In the horizontal 3D shooting mode, a 3D image is obtained from the left and right images [1] and [2].
  • In the vertical 3D shooting mode, each pixel (or small block) of the vertically oriented camera images [3] and [4] is matched using correlation calculation or other image processing, and left and right images [6] and [7] having left-right parallax (BL − BR) in the short-side direction of the camera image are generated.
  • In the horizontal 3D shooting mode as well, a 3D image generation method using distance information may be used.
  • However, since the 3D image generation method using distance information takes a long time to process, it is preferable in the horizontal 3D shooting mode to generate the 3D image from the conventional twin-lens camera images.
  • FIG. 17 shows an example of a 3D imaging flowchart.
  • When the terminal is held vertically, the mode is switched to the vertical 3D shooting mode, and when it is held horizontally, the mode is switched to the horizontal 3D shooting mode.
  • In the horizontal 3D shooting mode, the 3D preview display uses the first camera image and the second camera image.
  • In the vertical 3D shooting mode, the 3D images generated in steps S302 and S303 are used as the 3D preview display image; at this time, the barrier liquid crystal of the parallax-barrier display must be switched to the vertical display mode.
  • By combining the 3D image generation method of a normal twin-lens camera with the 3D image generation method based on distance information and a single camera image in this way, both vertical 3D shooting and horizontal 3D shooting can be realized compactly and inexpensively without physically changing the arrangement of the cameras.
  • The basic part of the 3D image generation method of the fifth embodiment is the same as that of the fourth embodiment, so its description is omitted.
  • FIG. 18 shows an example of a 3D imaging flowchart of the fifth embodiment.
  • When the terminal is held vertically, the mode is switched to the vertical 3D shooting mode, and when it is held horizontally, the mode is switched to the horizontal 3D shooting mode.
  • FIG. 18 differs from FIG. 17 of the fourth embodiment in the method of preview display in the vertical 3D shooting mode in step S312.
  • In the fourth embodiment, the preview display during vertical 3D shooting uses a 3D image generated from the first or second camera image and the distance information; however, generating a 3D image from distance information takes time, so the start of the preview display is delayed or the display frame rate is low.
  • In the fifth embodiment, therefore, the preview display is performed using a 2D image, namely the first or second camera image.
  • At this time, the barrier liquid crystal is set not to the 3D display mode but to the 2D display mode.
  • FIG. 19 shows an example of a 3D imaging flowchart using another preview display.
  • The difference between FIG. 19 and FIG. 18 is the method of preview display in the vertical 3D shooting mode in step S321.
  • Here, a pseudo 3D image generated from the first camera image or the second camera image is displayed as the preview.
  • The pseudo 3D image refers to the following image.
  • The left and right images ([8] and [9]) are two images obtained from the 2D image ([3]) of the first camera image or the second camera image with horizontal shifts that differ by an amount C. This pair is called a pseudo 3D image.
  • In this way, an image with parallax can be generated and a preview different from that of 2D shooting can be displayed, which makes it easy for the user to recognize whether 3D or 2D shooting is in progress and to distinguish between the two.
  • Since this preview display method for vertical shooting does not require generation of a true 3D image, the preview can be displayed in a short time, and there is no need to display characters or marks on the screen to distinguish it from the 2D shooting preview.
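  • A minimal Python sketch of such a pseudo 3D pair (the shift amount and the numpy representation are assumptions for illustration only):

```python
import numpy as np

def pseudo_3d_preview(image: np.ndarray, shift_px: int = 8):
    """Build a left/right pair by cropping the same 2D camera image at two
    horizontal offsets that differ by shift_px (8 is an arbitrary example).
    Every subject gets the same parallax, so there is no real depth, but the
    preview looks clearly different from a 2D preview."""
    w = image.shape[1]
    left = image[:, shift_px:w]
    right = image[:, 0:w - shift_px]
    return left, right
```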
  • The basic part of the 3D image generation method of the sixth embodiment is the same as that of the fourth and fifth embodiments, so its description is omitted.
  • FIGS. 21(a) and 21(b) are diagrams showing the 3D image generation method according to the sixth embodiment.
  • FIG. 21(a) shows a 3D image at the time of horizontal shooting, and FIG. 21(b) shows a 3D image at the time of vertical shooting.
  • In the sixth embodiment, the parallax of the left and right images can be set independently for horizontal shooting and vertical shooting. It is assumed that short-distance subjects such as people are shot more frequently in vertical shooting than in horizontal shooting.
  • In the vertical 3D shooting mode, therefore, the relative horizontal offset between the left and right images is adjusted in advance to a predetermined value so that the parallax angle (or parallax C2) of a short-distance subject does not become too large; that is, the parallax setting value is made suitable for short distances. As a result, a 3D image with comfortable parallax can be obtained even in close-up vertical shooting.
  • In the horizontal 3D shooting mode, conversely, the parallax setting value is made suitable for long distances so that the parallax angle (or parallax C1) does not become too small.
  • (Seventh Embodiment) FIGS. 22(a), (b) and (c) are schematic structural views of the 3D imaging apparatus of the present embodiment.
  • The seventh embodiment is described assuming a portable terminal device having a single housing, but the same applies to other electronic devices having small cameras, such as foldable portable terminals and digital still cameras (DSCs).
  • A 3D camera 404 consisting of a first camera 402 and a second camera 403 is disposed on the back surface of the display element 405 of the rectangular casing 401 of the portable terminal device.
  • FIG. 22(c) shows a block diagram of the portable terminal device.
  • The first camera 402 and the second camera 403 are controlled by the camera control unit 408, and the photographed image is stored in the storage unit 409 or displayed using the display element 405.
  • The first camera 402 and the second camera 403 each consist of a sensor 407 and a lens 406; however, the angle of view of the second camera 403 is larger than that of the first camera 402.
  • The camera control means 408 synchronizes the imaging range and the imaging time of the first camera image and the second camera image by adjusting the scan range and the scan speed for the image acquired by the sensor 407 of the second camera 403, and then obtains a 3D image using the first and second camera images as the left and right images.
  • Specifically, the camera control unit 408 limits the vertical scan range to between position A1 and position A2 in FIG. 23 so that the second camera 403 scans only the range having the same angle of view as the first camera 402; that is, it does not scan the part of the image acquired by the sensor 407 of the second camera 403 that is not captured by the first camera 402.
  • The camera control means 408 also adjusts the speed at which this limited range of the sensor 407 of the second camera 403 is scanned, so that the time T2 for scanning the limited range is equal to the time T1 for scanning the entire image of the sensor 407 of the first camera 402.
  • The scan speed is adjusted by changing the sampling period of the pixels of the second camera 403, the blanking period, or the frequency of the clock signal input to the second camera 403.
  • In this way, the camera control unit 408 causes the first camera 402 and the second camera 403, which have different angles of view, to scan images of the same angle of view at the same effective rate, matching the vertical angle of view from the start to the end of the scan for two images acquired at the same time. As a result, a second camera image synchronized with the first camera image in both angle of view and imaging time can be obtained.
  • In the present embodiment, the imaging times of the first camera 402 and the second camera 403 are matched by reducing the scanning speed of the second camera 403, which has the larger angle of view.
  • The present invention is not limited to this method; the imaging times for the same angle of view of the two camera images may instead be synchronized by increasing the scanning speed of the first camera 402, which has the smaller angle of view.
  • The imaging time can be adjusted in the same way when the angle of view changes. For example, if the first camera 402 performs an optical zoom, its angle of view decreases; the camera control unit 408 then adjusts the scan range and scan speed of the second camera 403 in accordance with the change in the optical zoom magnification of the first camera 402, so that the imaging ranges and imaging times of the two camera images remain synchronized.
  • For example, the camera control unit 408 changes the frequency of the clock signal input to the second camera 403 to 1/n; then, simply by limiting the scan range of the second camera 403 to 1/n in accordance with the magnification, the camera control means 408 can keep the imaging range and imaging time of the two camera images synchronized (see the sketch below).
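  • A toy Python calculation of this relationship, assuming that both sensors would nominally run from the same pixel clock, have the same row length, and that the second camera's vertical field of view is n times the first camera's (the row counts and clock rate below are illustrative values, not from the patent):

```python
def second_camera_scan_settings(rows_cam2_total: int, fov_ratio_n: float, clock_cam1_hz: float):
    """Only the rows between A1 and A2 (a 1/n fraction of the second sensor) are
    scanned, and the second camera's clock is divided by n so that scanning that
    limited range takes the same time T1 as a full frame of the first camera."""
    rows_limited = int(round(rows_cam2_total / fov_ratio_n))   # rows between A1 and A2
    clock_cam2_hz = clock_cam1_hz / fov_ratio_n                # clock changed to 1/n
    return rows_limited, clock_cam2_hz

if __name__ == "__main__":
    rows, clk = second_camera_scan_settings(rows_cam2_total=1080, fov_ratio_n=2.0, clock_cam1_hz=96e6)
    print(f"camera 2 scans {rows} rows at {clk/1e6:.0f} MHz instead of 1080 rows at 96 MHz")
```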
  • The first camera 402 and the second camera 403 may also have an external synchronization mechanism and be configured to synchronize their imaging start times.
  • As described above, in the 3D imaging apparatus of the present embodiment, the angle of view of the first camera and the angle of view of the second camera are different.
  • The scan range and scan speed for the image acquired by the sensor of the camera with the larger angle of view are adjusted so that the imaging range and imaging time of the first camera image and the second camera image are synchronized. Left and right images for a 3D image can therefore be obtained electrically, without any mechanical operation to change the camera's angle of view, which also makes it possible to obtain 3D images of a moving subject.
  • The synchronization of the imaging times of the first camera image and the second camera image can be performed simply by changing the frequency of the clock signal, so a change of the angle of view can be handled without changing settings such as the camera's blanking period.
  • Japanese Patent Application No. 2011-058828, Japanese Patent Application No. 2011-058827, and Japanese Patent Application No. 2011-058832, each filed on March 17, 2011.
  • The present invention is useful as a 3D camera and can be applied to various camera-equipped electronic devices such as mobile phones, mobile terminals, and digital still cameras.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Studio Devices (AREA)

Abstract

A three-dimensional image capture device comprising: two cameras having different optical axes; a distance measuring unit that measures the distance to a subject on the basis of the parallax between the two camera images captured by the cameras; a first three-dimensional image generation unit that generates a three-dimensional image on the basis of the two camera images; and a second three-dimensional image generation unit that generates a three-dimensional image on the basis of the distance to the subject and a camera image captured by one of the two cameras. According to shooting states based on the positions of the two cameras with respect to the subject, the device switches between three-dimensional image generation by the first three-dimensional image generation unit and three-dimensional image generation by the second three-dimensional image generation unit.
PCT/JP2012/001798 2011-03-17 2012-03-14 Dispositif de capture d'image tridimensionnelle WO2012124331A1 (fr)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2011-058826 2011-03-17
JP2011058827A JP2014103431A (ja) 2011-03-17 2011-03-17 3d撮像装置
JP2011058826A JP2014102265A (ja) 2011-03-17 2011-03-17 3d撮像装置
JP2011058832A JP2014102266A (ja) 2011-03-17 2011-03-17 3d撮像装置
JP2011-058832 2011-03-17
JP2011-058827 2011-03-17

Publications (1)

Publication Number Publication Date
WO2012124331A1 true WO2012124331A1 (fr) 2012-09-20

Family

ID=46830416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/001798 WO2012124331A1 (fr) 2011-03-17 2012-03-14 Dispositif de capture d'image tridimensionnelle

Country Status (1)

Country Link
WO (1) WO2012124331A1 (fr)

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0847001A (ja) * 1994-08-01 1996-02-16 Minolta Co Ltd 立体テレビカメラ
JPH1090814A (ja) * 1996-09-11 1998-04-10 Canon Inc 複眼カメラ及び画像処理方法
JP2000102040A (ja) * 1998-09-28 2000-04-07 Olympus Optical Co Ltd 電子ステレオカメラ
JP2000299810A (ja) * 1999-04-13 2000-10-24 Matsushita Electric Ind Co Ltd 撮像装置
JP2003304561A (ja) * 2003-05-01 2003-10-24 Nissan Motor Co Ltd ステレオ画像処理装置
JP2005020606A (ja) * 2003-06-27 2005-01-20 Sharp Corp デジタルカメラ
JP2005181377A (ja) * 2003-12-16 2005-07-07 Sophia Co Ltd 遊技機
JP2006093860A (ja) * 2004-09-21 2006-04-06 Olympus Corp 2眼撮像系を搭載したカメラ
JP2007295113A (ja) * 2006-04-21 2007-11-08 Matsushita Electric Ind Co Ltd 撮像装置
JP2008236642A (ja) * 2007-03-23 2008-10-02 Hitachi Ltd 物体追跡装置
JP2009048181A (ja) * 2007-07-25 2009-03-05 Fujifilm Corp 立体画像撮像装置
JP2009288657A (ja) * 2008-05-30 2009-12-10 Olympus Imaging Corp ストロボ撮影装置
JP2010154310A (ja) * 2008-12-25 2010-07-08 Fujifilm Corp 複眼カメラ及び撮影方法
JP2010261877A (ja) * 2009-05-11 2010-11-18 Ricoh Co Ltd ステレオカメラ装置及びそれを用いた車外監視装置
JP2011017825A (ja) * 2009-07-08 2011-01-27 Seiko Epson Corp 電気光学装置、および電子機器
JP2011151517A (ja) * 2010-01-20 2011-08-04 Jvc Kenwood Holdings Inc 映像処理装置
JP2011211381A (ja) * 2010-03-29 2011-10-20 Fujifilm Corp 立体撮像装置
JP2012023557A (ja) * 2010-07-14 2012-02-02 Sharp Corp 画像撮像装置

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6062039B2 (ja) * 2013-04-04 2017-01-18 株式会社Amatel 画像処理システムおよび画像処理用プログラム
US9832447B2 (en) 2013-04-04 2017-11-28 Amatel Inc. Image processing system and image processing program
CN105759556A (zh) * 2016-04-08 2016-07-13 凯美斯三维立体影像(惠州)有限公司 一种具有三维图像拍摄功能的手机
CN107640317A (zh) * 2016-07-22 2018-01-30 松下知识产权经营株式会社 无人驾驶飞行器系统
CN107357046A (zh) * 2017-05-26 2017-11-17 张家港康得新光电材料有限公司 2d模式与3d模式切换时间的检测方法与检测系统
CN114025107A (zh) * 2021-12-01 2022-02-08 北京七维视觉科技有限公司 图像重影的拍摄方法、装置、存储介质和融合处理器
CN114025107B (zh) * 2021-12-01 2023-12-01 北京七维视觉科技有限公司 图像重影的拍摄方法、装置、存储介质和融合处理器

Similar Documents

Publication Publication Date Title
JP5014979B2 (ja) 個人用電子機器の3次元情報取得及び表示システム
CN108718373B (zh) 影像装置
US8077964B2 (en) Two dimensional/three dimensional digital information acquisition and display device
US9699440B2 (en) Image processing device, image processing method, non-transitory tangible medium having image processing program, and image-pickup device
KR101824439B1 (ko) 모바일 스테레오 카메라 장치 및 그 촬영방법
JP2011064894A (ja) 立体画像表示装置
KR20140051112A (ko) 영상 처리를 위한 주 및 보조 영상 캡처 장치들 및 관련 방법들
KR20090035880A (ko) 원소스 멀티유즈 스테레오 카메라 및 스테레오 영상 컨텐츠제작방법
JP5993937B2 (ja) 画像処理装置、撮像装置、画像処理方法、及びプログラム
KR100818155B1 (ko) 모바일 기기용 스테레오 입체카메라 시스템 및 주시각조절방법
WO2012124331A1 (fr) Dispositif de capture d'image tridimensionnelle
JP2012186612A (ja) 撮像装置
JP6155471B2 (ja) 画像生成装置、撮像装置および画像生成方法
US20120154543A1 (en) Document camera, method for controlling document camera, program, and display processing system
CN103329549B (zh) 立体视频处理器、立体成像装置和立体视频处理方法
CN103339948B (zh) 3d视频再现装置、3d成像装置和3d视频再现方法
US20120307016A1 (en) 3d camera
EP2566166A1 (fr) Dispositif d'imagerie en trois dimensions
JP5562122B2 (ja) 画像処理装置及びその制御方法
CN104041026B (zh) 图像输出装置、方法以及程序及其记录介质
JP5586788B2 (ja) 画像表示装置及び画像撮像装置
KR20130052582A (ko) 원소스 멀티유즈 스테레오 카메라 및 스테레오 영상 컨텐츠 제작방법
JP2013046081A (ja) 撮影装置および映像生成方法
JP2014102265A (ja) 3d撮像装置
JP2014103431A (ja) 3d撮像装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12757984

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12757984

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP