WO2012165123A1 - Imaging device, imaging selection method, and recording medium - Google Patents
- Publication number
- WO2012165123A1 (PCT/JP2012/062191)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- photographing
- image data
- camera
- photographing means
- imaging
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/296—Synchronisation thereof; Control thereof
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B2206/00—Systems for exchange of information between different pieces of apparatus, e.g. for exchanging trimming information, for photo finishing
Definitions
- the present invention relates to a photographing apparatus, a photographing selection method, and a recording medium, and particularly to a photographing apparatus, a photographing selection method, and a recording medium for generating stereoscopic image data.
- Patent Document 1 describes a compound eye camera in which a left-eye camera and a right-eye camera can be arranged at positions shifted by a parallax in the horizontal direction even when the camera body is tilted by 90 degrees.
- the compound eye camera described in Patent Document 1 has a revolving mechanism that rotates the left-eye camera and the right-eye camera approximately 90 degrees around a common axis. For this reason, when shooting is performed with the camera body tilted by 90 degrees, the left-eye camera and the right-eye camera can still be arranged at positions shifted by the amount of parallax in the horizontal direction, by rotating both cameras approximately 90 degrees using the revolving mechanism.
- the compound eye camera described in Patent Document 1 has a movable part called a revolving mechanism that rotates a left-eye camera and a right-eye camera.
- the movable part may not operate normally due to wear of the contact portion, and is likely to cause a failure.
- rotating the left-eye camera and the right-eye camera is thus one way to place the two cameras in a positional relationship shifted in the horizontal direction by the amount of parallax, but it requires a movable part.
- a technique that does not require a movable part is desired.
- An object of the present invention is to provide a photographing apparatus, a photographing selection method, and a recording medium that can solve the above-described problems.
- the photographing apparatus of the present invention includes: three or more photographing means; detection means for detecting the tilt of the apparatus itself; and control means for selecting, based on the detection result of the detection means, from among the three or more photographing means, two photographing means that are shifted in the horizontal direction in the situation where the detection means detects the tilt.
- the shooting selection method of the present invention is a shooting selection method in a photographing apparatus including three or more photographing means, in which the tilt of the photographing apparatus is detected and, based on that tilt, two photographing means that are shifted in the horizontal direction in the situation where the tilt is detected are selected from among the three or more photographing means.
- the recording medium of the present invention is a computer-readable recording medium storing a program for causing a computer connected to three or more photographing means to execute: a detection procedure for detecting the tilt of the photographing apparatus; and a control procedure for selecting, based on the tilt of the photographing apparatus, from among the three or more photographing means, two photographing means that are shifted in the horizontal direction in the situation where the tilt is detected.
- According to the present invention, the left-eye camera and the right-eye camera can be placed in a positional relationship shifted in the horizontal direction by the amount of parallax even when the photographing apparatus is tilted by 90 degrees, without requiring a movable part that rotates the two cameras.
- FIG. 1 is a block diagram illustrating a photographing apparatus 1 according to an embodiment of the present invention.
- FIG. 2 is a front view of the photographing apparatus 1.
- FIG. 3 is a diagram showing an example of decision information.
- FIG. 4 is a flowchart for explaining the operation of the photographing apparatus 1.
- FIG. 5 is a block diagram illustrating an imaging device including a camera 11, a camera 12, a camera 13, a tilt sensor 14, and a control unit 15.
- FIG. 6 is a front view of the photographing apparatus 1 to which the additional camera 61 is added.
- FIG. 7 is a diagram showing an example of the decision information 71.
- FIG. 1 is a block diagram showing a photographing apparatus 1 according to an embodiment of the present invention.
- FIG. 2 is a front view of the photographing apparatus 1.
- the photographing apparatus 1 is, for example, a mobile phone or a smartphone. Note that the photographing apparatus 1 is not limited to a mobile phone or a smartphone.
- For example, the photographing apparatus 1 may be a portable game machine, a tablet PC (Personal Computer), a notebook PC, a PHS (Personal Handyphone System), a PDA (Personal Digital Assistant; a personal portable information communication device), a tablet, or a 3D imaging device.
- the photographing apparatus 1 includes cameras 11, 12 and 13, a tilt sensor 14, and a control unit 15.
- the camera 11 can be generally referred to as first photographing means.
- the camera 11 is provided on the front surface 1a of the photographing apparatus 1 and generates photographed image data 111 when the subject 2 is photographed.
- the camera 12 can generally be referred to as second photographing means.
- the camera 12 is provided on the front surface 1a of the photographing apparatus 1 and generates photographed image data 121 when the subject 2 is photographed.
- the camera 12 is provided at a location shifted from the camera 11 in the horizontal direction.
- the camera 12 is provided at a position that is shifted from the camera 11 by the distance r1 in the horizontal direction and not shifted from the camera 11 in the vertical direction in the reference state.
- In the reference state, the horizontal direction A of the photographing apparatus 1 coincides with the horizontal direction and the vertical direction B of the photographing apparatus 1 coincides with the vertical direction.
- the distance r1 is a value larger than 0.
- the camera 13 can generally be referred to as third photographing means.
- the camera 13 is provided on the front surface 1a of the photographing apparatus 1 and generates photographed image data 131 when the subject 2 is photographed.
- the camera 13 is provided at a position shifted from the camera 11 in the vertical direction.
- the camera 13 is provided at a position that is shifted from the camera 11 by a distance r2 in the vertical direction and not shifted from the camera 11 in the horizontal direction in the reference state.
- the distance r2 is a value larger than 0.
- the distance r2 may be longer than, shorter than, or equal to the distance r1.
- the tilt sensor 14 can be generally referred to as detection means.
- the tilt sensor 14 detects the tilt of the photographing apparatus 1.
- the tilt sensor 14 includes, for example, a gravity sensor, and detects the tilt of the photographing apparatus 1 with respect to the direction of gravity, that is, whether gravity is directed toward the bottom surface 1b (see FIG. 2), the right side surface 1c, the top surface 1d, or the left side surface 1e.
- When the photographing apparatus 1 is in the reference state, the tilt of the photographing apparatus 1 is 0 degrees. The rotation angle when the photographing apparatus 1, viewed from the front surface 1a, is rotated counterclockwise from the reference state about an axis normal to the front surface 1a is used as the tilt of the photographing apparatus 1.
- the tilt sensor 14 determines that gravity is directed toward the bottom surface 1b when the tilt of the photographing apparatus 1 is 0 degrees or more and less than 45 degrees, or 315 degrees or more and less than 360 degrees. The tilt sensor 14 determines that gravity is directed toward the right side surface 1c when the tilt is 45 degrees or more and less than 135 degrees, toward the top surface 1d when the tilt is 135 degrees or more and less than 225 degrees, and toward the left side surface 1e when the tilt is 225 degrees or more and less than 315 degrees.
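The quadrant thresholds above can be sketched as a small function (an illustrative sketch; the function name and the direction labels are assumptions, not taken from the patent):

```python
def gravity_direction(tilt_deg: float) -> str:
    """Map the counterclockwise tilt of the device (degrees) to the
    surface that gravity is directed toward, per the thresholds in the text."""
    t = tilt_deg % 360
    if t < 45 or t >= 315:
        return "bottom"  # bottom surface 1b: [0, 45) or [315, 360)
    if t < 135:
        return "right"   # right side surface 1c: [45, 135)
    if t < 225:
        return "top"     # top surface 1d: [135, 225)
    return "left"        # left side surface 1e: [225, 315)
```

With these thresholds, a device held upright reports "bottom", and a device rotated 90 degrees counterclockwise reports "right".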
- Control unit 15 can be generally referred to as control means.
- Based on the detection result of the tilt sensor 14, the control unit 15 selects, from among the cameras 11 to 13, two cameras that are shifted in the horizontal direction in the situation where the tilt of the photographing apparatus 1 is detected, as the two shooting execution cameras used to generate stereoscopic image data.
- the imaging execution camera can be generally referred to as imaging execution means.
- Specifically, based on the detection result of the tilt sensor 14, the control unit 15 selects, as the two shooting execution cameras, the cameras 11 and 12 or the cameras 11 and 13, whichever pair is shifted in the horizontal direction in the situation where the tilt of the photographing apparatus 1 is detected.
- For example, when the detection result of the tilt sensor 14 indicates that the tilt of the photographing apparatus 1 is 0 degrees, the control unit 15 selects the cameras 11 and 12 as the two shooting execution cameras. When the detection result indicates that the tilt of the photographing apparatus 1 is 90 degrees, the control unit 15 selects the cameras 11 and 13 as the two shooting execution cameras.
- the control unit 15 generates stereoscopic image data based on captured image data (hereinafter referred to as “selected image data”) generated by each of the two imaging execution cameras.
- Based on the detection result of the tilt sensor 14, the control unit 15 also determines which of the two pieces of selected image data is the right-eye image data and which is the left-eye image data.
- the control unit 15 includes a storage unit 15a and a processing unit 15b.
- the storage unit 15a stores selection image data, stereoscopic image data, and determination information indicating the relationship between the tilt of the photographing apparatus 1, the photographing execution camera, the right-eye image data, and the left-eye image data.
- FIG. 3 is a diagram showing an example of decision information.
- the determination information 31 indicates that, in the situation of the tilt of the photographing apparatus 1 in which the direction of gravity is on the bottom surface 1b side, the cameras 11 and 12 are the shooting execution cameras, the captured image data 111 is the right-eye image data, and the captured image data 121 is the left-eye image data. In this case, the captured image data 111 and 121 are the selected image data, the camera 11 is the right-eye camera, and the camera 12 is the left-eye camera.
- the determination information 31 also indicates that, in the situation of the tilt of the photographing apparatus 1 in which the direction of gravity is on the right side surface 1c side, the cameras 11 and 13 are the shooting execution cameras, the captured image data 111 is the right-eye image data, and the captured image data 131 is the left-eye image data.
- In this case, the captured image data 111 and 131 are the selected image data, the camera 11 is the right-eye camera, and the camera 13 is the left-eye camera.
- the determination information 31 further indicates that, in the situation of the tilt of the photographing apparatus 1 in which the direction of gravity is on the top surface 1d side, the cameras 11 and 12 are the shooting execution cameras, the captured image data 121 is the right-eye image data, and the captured image data 111 is the left-eye image data.
- In this case, the captured image data 111 and 121 are the selected image data, the camera 12 is the right-eye camera, and the camera 11 is the left-eye camera.
- the determination information 31 also indicates that, in the situation of the tilt of the photographing apparatus 1 in which the direction of gravity is on the left side surface 1e side, the cameras 11 and 13 are the shooting execution cameras, the captured image data 131 is the right-eye image data, and the captured image data 111 is the left-eye image data.
- In this case, the captured image data 111 and 131 are the selected image data, the camera 13 is the right-eye camera, and the camera 11 is the left-eye camera.
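The determination information 31 described above can be sketched as a lookup table from the gravity direction to the camera pair (the dictionary layout, names, and direction labels are illustrative assumptions, not part of the patent):

```python
# Determination information 31: gravity direction -> (right-eye camera,
# left-eye camera). Camera numbers and pairings follow the text above.
DETERMINATION_INFO_31 = {
    "bottom": (11, 12),  # data 111 = right eye, data 121 = left eye
    "right":  (11, 13),  # data 111 = right eye, data 131 = left eye
    "top":    (12, 11),  # data 121 = right eye, data 111 = left eye
    "left":   (13, 11),  # data 131 = right eye, data 111 = left eye
}

def select_cameras(direction: str) -> tuple[int, int]:
    """Return the (right-eye, left-eye) shooting execution cameras."""
    return DETERMINATION_INFO_31[direction]
```

Note that camera 11 appears in every pair because it is the reference camera from which cameras 12 and 13 are offset.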
- the processing unit 15b controls the photographing apparatus 1. For example, the processing unit 15b uses the detection result of the tilt sensor 14 and the determination information 31 in the storage unit 15a to determine the imaging execution camera, the right-eye image data, and the left-eye image data.
- FIG. 4 is a flowchart for explaining the operation of the photographing apparatus 1.
- When the processing unit 15b receives a start instruction from the user via an operation switch (not shown), it operates the tilt sensor 14 to detect the tilt of the photographing apparatus 1 (step S401).
- the processing unit 15b then receives the detection result of the tilt sensor 14, refers to the determination information 31 in the storage unit 15a, and determines the two shooting execution cameras, the right-eye image data, and the left-eye image data according to the detection result (step S402).
- For example, when the detection result of the tilt sensor 14 indicates a tilt of the photographing apparatus 1 in which the direction of gravity is on the bottom surface 1b side (for example, a tilt of 0 degrees), the processing unit 15b selects the cameras 11 and 12 as the shooting execution cameras, determines the captured image data 111 as the right-eye image data, and determines the captured image data 121 as the left-eye image data.
- the processing unit 15b operates the two photographing execution cameras determined in step S402 (step S403) to photograph the subject 2.
- the processing unit 15b generates stereoscopic image data based on the right-eye image data and the left-eye image data from the two photographing execution cameras (step S404).
- the processing unit 15b combines the right-eye image data and the left-eye image data in a 3D image file format, for example the CIPA (Camera & Imaging Products Association) multi-picture format, to generate the stereoscopic image data.
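The CIPA multi-picture format stores the two JPEG images in a single container file. As a self-contained stand-in for that packaging step, the sketch below simply concatenates two equally sized images side by side (illustrative only; it is not the MPF encoding):

```python
def pack_side_by_side(left_rows: list, right_rows: list) -> list:
    """Combine two equally sized images, row by row, into one wide image.
    A toy stand-in for a stereo container format such as CIPA's MPF."""
    if len(left_rows) != len(right_rows):
        raise ValueError("images must have the same number of rows")
    return [l + r for l, r in zip(left_rows, right_rows)]
```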
- When capturing a 3D still image, the processing unit 15b ends still-image capture at the end of step S404.
- When shooting a 3D moving image, the processing unit 15b returns the process to step S401 at the end of step S404. For this reason, when the orientation (tilt) of the photographing apparatus 1 changes during shooting of the 3D moving image, the shooting execution cameras are switched and shooting of the 3D moving image continues.
- When the processing unit 15b receives an end instruction from the user via an operation switch (not shown), it ends the shooting of the 3D moving image.
- the tilt sensor 14 detects the tilt of the photographing apparatus 1. Based on the detection result of the tilt sensor 14, the control unit 15 selects two cameras that are displaced in the horizontal direction from the cameras 11 to 13 when the tilt sensor 14 detects the tilt.
- FIG. 5 is a block diagram illustrating a photographing apparatus including the camera 11, the camera 12, the camera 13, the tilt sensor 14, and the control unit 15.
- the camera 12 is provided at a position shifted from the camera 11 in the horizontal direction
- the camera 13 is provided at a position shifted from the camera 11 in the vertical direction.
- the control unit 15 selects the cameras 11 and 12 or the cameras 11 and 13 that are shifted in the horizontal direction in a situation where the tilt of the photographing apparatus 1 is detected.
- control unit 15 further generates stereoscopic image data based on the captured image data generated by each of the two selected cameras (imaging execution cameras).
- When the quality of the captured image data generated by each of the two selected cameras (shooting execution cameras) differs, the processing unit 15b may reduce the difference in quality between the captured image data.
- the quality of the captured image data generated by each of the two shooting execution cameras (for example, the brightness of the image, the number of pixels, or the width of the shooting range) may differ.
- In this case, the processing unit 15b corrects the captured image data so that the brightness of the images specified by the captured image data generated by each of the two shooting execution cameras matches, or so that the difference in brightness becomes small.
- the processing unit 15b corrects the captured image data so that the number of pixels of the captured image data generated by each of the two imaging execution cameras is equal or the difference in the number of pixels is small.
- Similarly, the processing unit 15b corrects the captured image data so that the widths of the shooting ranges specified by the captured image data generated by each of the two shooting execution cameras match, or so that the difference in the widths of the shooting ranges becomes small.
- the quality of the stereoscopic image data can be improved.
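As one concrete way to reduce a brightness difference, the sketch below scales both captures toward their common mean brightness (the algorithm and names are illustrative assumptions; the patent does not prescribe a specific correction):

```python
def equalize_brightness(left, right):
    """Scale two grayscale images (lists of rows of floats) toward their
    common mean brightness, reducing the brightness difference."""
    def mean(img):
        pixels = [p for row in img for p in row]
        return sum(pixels) / len(pixels)

    # Aim for the midpoint of the two mean brightnesses.
    target = (mean(left) + mean(right)) / 2

    def rescale(img):
        m = mean(img)
        factor = target / m if m else 1.0
        return [[p * factor for p in row] for row in img]

    return rescale(left), rescale(right)
```

A multiplicative gain is only one choice; an offset or a histogram-based match would serve the same purpose of making the two eye images comparable before they are combined.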
- the inclination sensor 14 detects the inclination of the photographing apparatus 1 using a gravity sensor.
- the tilt sensor 14 may instead include an acceleration sensor or a gyro sensor in place of the gravity sensor, and use it to detect the tilt of the photographing apparatus 1.
- the three cameras 11 to 13 are used as the photographing means, but the number of photographing means may be three or more.
- a camera (hereinafter referred to as the "additional camera") that is provided on the front surface 1a of the photographing apparatus 1 and generates captured image data 611 when the subject 2 is photographed may be added as a fourth photographing means.
- FIG. 6 is a front view of the photographing apparatus 1 to which the additional camera 61 is added.
- the additional camera 61 is provided at a location shifted from the camera 11 in each of the horizontal direction and the vertical direction.
- the additional camera 61 is provided at a position shifted from the camera 11 by the distance r1 and the distance r2 in the horizontal direction and the vertical direction in the reference state.
- In this case, the storage unit 15a stores the determination information 71 instead of the determination information 31. The processing unit 15b then determines the shooting execution cameras, the right-eye image data, and the left-eye image data using the detection result of the tilt sensor 14 and the determination information 71 in the storage unit 15a.
- FIG. 7 is a diagram showing an example of the decision information 71.
- the determination information 71 indicates that, in the situation of the tilt of the photographing apparatus 1 in which the direction of gravity is on the bottom surface 1b side, the cameras 11 and 12 are the shooting execution cameras, the captured image data 111 is the right-eye image data, and the captured image data 121 is the left-eye image data. In this case, the captured image data 111 and 121 are the selected image data, the camera 11 is the right-eye camera, and the camera 12 is the left-eye camera.
- the determination information 71 also indicates that, in the situation of the tilt of the photographing apparatus 1 in which the direction of gravity is on the right side surface 1c side, the cameras 12 and 61 are the shooting execution cameras, the captured image data 121 is the right-eye image data, and the captured image data 611 is the left-eye image data.
- In this case, the captured image data 121 and 611 are the selected image data, the camera 12 is the right-eye camera, and the camera 61 is the left-eye camera.
- the determination information 71 further indicates that, in the situation of the tilt of the photographing apparatus 1 in which the direction of gravity is on the top surface 1d side, the cameras 61 and 13 are the shooting execution cameras, the captured image data 611 is the right-eye image data, and the captured image data 131 is the left-eye image data.
- the captured image data 611 and 131 are selected image data, the camera 61 is a right-eye camera, and the camera 13 is a left-eye camera.
- the determination information 71 also indicates that, in the situation of the tilt of the photographing apparatus 1 in which the direction of gravity is on the left side surface 1e side, the cameras 13 and 11 are the shooting execution cameras, the captured image data 131 is the right-eye image data, and the captured image data 111 is the left-eye image data.
- In this case, the captured image data 131 and 111 are the selected image data, the camera 13 is the right-eye camera, and the camera 11 is the left-eye camera.
- When the processing unit 15b receives the detection result of the tilt sensor 14, it refers to the determination information 71 in the storage unit 15a and determines the two shooting execution cameras, the right-eye image data, and the left-eye image data according to the detection result.
- For example, when the detection result indicates a tilt of the photographing apparatus 1 in which the direction of gravity is on the right side surface 1c side, the processing unit 15b selects the cameras 12 and 61 as the shooting execution cameras, and determines the captured image data 121 as the right-eye image data and the captured image data 611 as the left-eye image data.
- the processing unit 15b operates the two shooting execution cameras to generate stereoscopic image data based on the right-eye image data and the left-eye image data from the two shooting execution cameras.
- Based on the detection result of the tilt sensor 14, the processing unit 15b selects the cameras 11 and 12, or the cameras 12 and 13, or the cameras 13 and 61, or the cameras 61 and 11, which are shifted in the horizontal direction in the situation where the tilt sensor 14 detects the tilt. For this reason, in a configuration with four cameras, two cameras shifted by the amount of parallax in the horizontal direction can be selected even if the photographing apparatus 1 is tilted. Moreover, the two cameras located on the upper side at the time of shooting, among the four cameras, can be used as the shooting execution cameras.
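The determination information 71 for the four-camera arrangement can be sketched as a lookup from the gravity direction to the upper-side horizontal pair and its image-data roles (names, direction labels, and structure are illustrative assumptions, not part of the patent):

```python
# Determination information 71: gravity direction ->
# ((right-eye camera, its image data), (left-eye camera, its image data)),
# following the four cases listed in the text above.
DETERMINATION_INFO_71 = {
    "bottom": ((11, 111), (12, 121)),
    "right":  ((12, 121), (61, 611)),
    "top":    ((61, 611), (13, 131)),
    "left":   ((13, 131), (11, 111)),
}

def select_pair(direction: str):
    """Return ((right camera, right data), (left camera, left data))."""
    return DETERMINATION_INFO_71[direction]
```

Each 90-degree step of tilt rotates the selection by one entry, so the chosen pair is always the pair sitting horizontally on the upper side of the device.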
- the photographing apparatus 1 may be realized by a computer connected to three or more cameras.
- the computer reads and executes a program recorded on a computer-readable recording medium, such as a CD-ROM (Compact Disc Read Only Memory), and thereby functions as the tilt sensor 14 and the control unit 15.
- the recording medium is not limited to the CD-ROM and can be changed as appropriate.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Studio Devices (AREA)
- Details Of Cameras Including Film Mechanisms (AREA)
- Stereoscopic And Panoramic Photography (AREA)
- Cameras In General (AREA)
Abstract
Description
The apparatus includes: three or more photographing means; detection means for detecting the tilt of the apparatus itself; and control means for selecting, based on the detection result of the detection means, from among the three or more photographing means, two photographing means that are shifted in the horizontal direction in the situation where the detection means detects the tilt.
The shooting selection method is a shooting selection method in a photographing apparatus including three or more photographing means, in which the tilt of the photographing apparatus is detected and, based on the tilt of the photographing apparatus, two photographing means that are shifted in the horizontal direction in the situation where the detection means detects the tilt are selected from among the three or more photographing means.
The recording medium is a computer-readable recording medium storing a program for causing a computer connected to three or more photographing means to execute: a detection procedure for detecting the tilt of the photographing apparatus; and a control procedure for selecting, based on the tilt of the photographing apparatus, from among the three or more photographing means, two photographing means that are shifted in the horizontal direction in the situation where the detection means detects the tilt.
11 to 13, 61: cameras
14: tilt sensor
15: control unit
15a: storage unit
15b: processing unit
Claims (7)
- 1. A photographing apparatus comprising: three or more photographing means; detection means for detecting the tilt of the apparatus itself; and control means for selecting, based on the detection result of the detection means, from among the three or more photographing means, two photographing means that are shifted in the horizontal direction in the situation where the detection means detects the tilt.
- 2. The photographing apparatus according to claim 1, wherein the three or more photographing means are three photographing means: first photographing means; second photographing means provided at a location shifted from the first photographing means in the horizontal direction; and third photographing means provided at a position shifted from the first photographing means in the vertical direction, and the control means selects, based on the detection result of the detection means, the first photographing means and the second photographing means, or the first photographing means and the third photographing means, whichever pair is shifted in the horizontal direction in the situation where the detection means detects the tilt.
- 3. The photographing apparatus according to claim 1, wherein the three or more photographing means are four photographing means: first photographing means; second photographing means provided at a location shifted from the first photographing means in the horizontal direction; third photographing means provided at a position shifted from the first photographing means in the vertical direction; and fourth photographing means provided at a location shifted from the first photographing means in each of the horizontal direction and the vertical direction, and the control means selects, based on the detection result of the detection means, the first photographing means and the second photographing means, or the second photographing means and the third photographing means, or the third photographing means and the fourth photographing means, or the fourth photographing means and the first photographing means, whichever pair is shifted in the horizontal direction in the situation where the detection means detects the tilt.
- 4. The photographing apparatus according to any one of claims 1 to 3, wherein the control means further generates stereoscopic image data based on the captured image data generated by each of the two selected photographing means.
- 5. The photographing apparatus according to any one of claims 1 to 4, wherein, when the quality of the captured image data generated by each of the two selected photographing means differs, the control means further reduces the difference in quality between the captured image data.
- 6. A shooting selection method in a photographing apparatus including three or more photographing means, comprising: detecting the tilt of the photographing apparatus; and selecting, based on the tilt of the photographing apparatus, from among the three or more photographing means, two photographing means that are shifted in the horizontal direction in the situation where the tilt is detected.
- 7. A computer-readable recording medium storing a program for causing a computer connected to three or more photographing means to execute: a detection procedure for detecting the tilt of the photographing apparatus; and a control procedure for selecting, based on the tilt of the photographing apparatus, from among the three or more photographing means, two photographing means that are shifted in the horizontal direction in the situation where the tilt is detected.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12792167.4A EP2717096A4 (en) | 2011-05-27 | 2012-05-11 | IMAGING DEVICE, PICTURE SELECTION PROCEDURE AND RECORDING MEDIUM |
US14/119,037 US20140098200A1 (en) | 2011-05-27 | 2012-05-11 | Imaging device, imaging selection method and recording medium |
JP2013517945A JP5999089B2 (ja) | 2011-05-27 | 2012-05-11 | Imaging device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011-119200 | 2011-05-27 | ||
JP2011119200 | 2011-05-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012165123A1 true WO2012165123A1 (ja) | 2012-12-06 |
Family
ID=47258987
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/062191 WO2012165123A1 (ja) | Imaging device, imaging selection method, and recording medium |
Country Status (4)
Country | Link |
---|---|
US (1) | US20140098200A1 (ja) |
EP (1) | EP2717096A4 (ja) |
JP (1) | JP5999089B2 (ja) |
WO (1) | WO2012165123A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013148587A1 (en) * | 2012-03-28 | 2013-10-03 | Qualcomm Incorporated | Method and apparatus for managing orientation in devices with multiple imaging sensors |
EP2919067A1 (en) * | 2014-03-12 | 2015-09-16 | Ram Srikanth Mirlay | Multi-planar camera apparatus |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113141497A (zh) * | 2020-01-20 | 2021-07-20 | 北京芯海视界三维科技有限公司 | 3D imaging device, 3D imaging method, and 3D display terminal |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001016591A (ja) * | 1999-07-02 | 2001-01-19 | Fuji Photo Film Co Ltd | Compression encoding apparatus and method |
JP2004264492A (ja) * | 2003-02-28 | 2004-09-24 | Sony Corp | Imaging method and imaging apparatus |
JP2006033476A (ja) * | 2004-07-16 | 2006-02-02 | Sharp Corp | Imaging device, display device, and imaging display device |
JP2009177565A (ja) | 2008-01-25 | 2009-08-06 | Fujifilm Corp | Compound-eye camera and imaging method |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001142166A (ja) * | 1999-09-15 | 2001-05-25 | Sharp Corp | 3D camera |
JP2005210217A (ja) * | 2004-01-20 | 2005-08-04 | Olympus Corp | Stereo camera |
JP4448844B2 (ja) * | 2006-11-22 | 2010-04-14 | Fujifilm Corp | Compound-eye imaging device |
US20100194860A1 (en) * | 2009-02-03 | 2010-08-05 | Bit Cauldron Corporation | Method of stereoscopic 3d image capture using a mobile device, cradle or dongle |
JP5621303B2 (ja) * | 2009-04-17 | 2014-11-12 | Sony Corp | Imaging apparatus |
JP2010258583A (ja) * | 2009-04-22 | 2010-11-11 | Panasonic Corp | Stereoscopic image display device, stereoscopic image reproduction device, and stereoscopic image viewing system |
TW201040581A (en) * | 2009-05-06 | 2010-11-16 | J Touch Corp | Digital image capturing device with stereo image display and touch functions |
KR20110020082A (ko) * | 2009-08-21 | 2011-03-02 | LG Electronics Inc | Control apparatus for a mobile terminal and method thereof |
US8922625B2 (en) * | 2009-11-19 | 2014-12-30 | Lg Electronics Inc. | Mobile terminal and controlling method thereof |
KR20120010764A (ko) * | 2010-07-27 | 2012-02-06 | LG Electronics Inc | Mobile terminal and 3D image control method |
JP2012199759A (ja) * | 2011-03-22 | 2012-10-18 | Konica Minolta Holdings Inc | Information processing apparatus, program therefor, and information processing method |
-
2012
- 2012-05-11 EP EP12792167.4A patent/EP2717096A4/en not_active Withdrawn
- 2012-05-11 WO PCT/JP2012/062191 patent/WO2012165123A1/ja active Application Filing
- 2012-05-11 JP JP2013517945A patent/JP5999089B2/ja not_active Expired - Fee Related
- 2012-05-11 US US14/119,037 patent/US20140098200A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP2717096A4 |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013148587A1 (en) * | 2012-03-28 | 2013-10-03 | Qualcomm Incorporated | Method and apparatus for managing orientation in devices with multiple imaging sensors |
EP2919067A1 (en) * | 2014-03-12 | 2015-09-16 | Ram Srikanth Mirlay | Multi-planar camera apparatus |
CN106575072A (zh) * | 2014-03-12 | 2017-04-19 | Ram Srikanth Mirlay | Multi-planar camera apparatus |
Also Published As
Publication number | Publication date |
---|---|
EP2717096A1 (en) | 2014-04-09 |
EP2717096A4 (en) | 2015-11-25 |
US20140098200A1 (en) | 2014-04-10 |
JPWO2012165123A1 (ja) | 2015-02-23 |
JP5999089B2 (ja) | 2016-09-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20240107172A1 (en) | Image processing device and associated methodology for generating panoramic images | |
JP6518069B2 (ja) | Display device, imaging system, display device control method, program, and recording medium | |
JP5522018B2 (ja) | Image processing apparatus, image processing method, and image processing program | |
US20120263372A1 (en) | Method And Apparatus For Processing 3D Image | |
US20120033046A1 (en) | Image processing apparatus, image processing method, and program | |
JP5640155B2 (ja) | Stereoscopic imaging device and image display method for focus state confirmation | |
WO2015194084A1 (ja) | Information processing apparatus, information processing system, information processing method, and program | |
JP5955417B2 (ja) | Image processing apparatus, imaging apparatus, program, and image processing method | |
JP5547356B2 (ja) | Imaging device, method, storage medium, and program | |
JP5999089B2 (ja) | Imaging device | |
JP2013113877A (ja) | Stereoscopic imaging device and portable terminal device using the same | |
JP2007006162A (ja) | Image processing apparatus | |
JP2011091750A (ja) | Stereoscopic imaging device and control method thereof | |
JP2013253995A (ja) | Stereo imaging device | |
JP6319081B2 (ja) | Terminal device, imaging system, and imaging method | |
US20220385883A1 (en) | Image processing apparatus, image processing method, and storage medium | |
JP2014155126A (ja) | Display device, display method, and program | |
US20150035952A1 (en) | Photographing apparatus, display apparatus, photographing method, and computer readable recording medium | |
JP2011259341A (ja) | Imaging device | |
US20230384236A1 (en) | Information processing apparatus, imaging apparatus, information processing method, and non-transitory computer readable medium | |
KR20150016871A (ko) | 촬상 장치, 표시 장치, 촬상 방법 및 촬상 프로그램 | |
US20230281768A1 (en) | Electronic apparatus, control method of electronic apparatus, and non-transitory computer readable medium | |
JP2007295391A (ja) | Portable information terminal with imaging function | |
JP5875268B2 (ja) | Imaging device | |
JP2013115467A (ja) | Stereoscopic imaging device and portable terminal device using the same | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12792167 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012792167 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14119037 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 2013517945 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |