WO2011135904A1 - Stereoscopic video imaging apparatus - Google Patents
- Publication number
- WO2011135904A1 (PCT/JP2011/053859)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video data
- imaging
- eye video
- eye
- unit
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/08—Stereoscopic photography by simultaneous recording
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/85—Stereo camera calibration
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/25—Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/327—Calibration thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2213/00—Details of stereoscopic systems
- H04N2213/001—Constructional or mechanical details
Definitions
- the present invention relates to a stereoscopic video imaging apparatus that generates video for perceiving stereoscopic video based on binocular parallax.
- the imaging axis indicates an imaging direction and is an axis corresponding to the center (video center) of the video generated by the imaging unit.
- Such imaging axis misalignment and magnification error can be calibrated (corrected) by directly adjusting the hardware or by adjusting the cutout range of the acquired video data.
- a technique is disclosed in which a cutout range is set around the actual imaging axis of one imaging unit, and the cutout range of the other imaging unit is set by shifting the center used when cutting out (the cutout center) with respect to that unit's actual imaging axis (for example, Patent Document 1).
- an object of the present invention is to provide a stereoscopic video imaging apparatus capable of maintaining the image quality of the cut out partial video data by appropriately cutting out the video data generated by the imaging unit.
- a stereoscopic video imaging apparatus according to the present invention includes: an imaging unit that generates right-eye video data and left-eye video data having binocular parallax for perceiving stereoscopic video; an information holding unit that holds position information indicating, with the video center point of the right-eye video data and the left-eye video data as the origin, the position of a right calibration point obtained by moving the index projected on the right-eye video data by a difference vector whose start point is the midpoint of the connection line joining the index projected on the right-eye video data and the index projected on the left-eye video data and whose end point is the video center point, and the position of a left calibration point obtained by moving the index projected on the left-eye video data by the same difference vector; and a cutout control unit that, based on the position information, cuts out partial video data of equal size, from the right-eye video data with the right calibration point as the cutout center and from the left-eye video data with the left calibration point as the cutout center.
- the imaging unit has lenses capable of enlarging and reducing an imaging target on an imaging axis for generating right-eye video data and on an imaging axis for generating left-eye video data
- the information holding unit holds the position information when the lens is set to a plurality of different magnifications in association with each magnification, and the cutout control unit acquires the current lens magnification and, based on the position information associated with the acquired magnification, may cut out partial video data of equal size from the right-eye video data and the left-eye video data.
- the imaging unit has lenses capable of enlarging and reducing an imaging target on an imaging axis for generating right-eye video data and on an imaging axis for generating left-eye video data
- the information holding unit holds ratio information indicating the size ratio of the same imaging target projected on the right-eye video data and the left-eye video data generated by the imaging unit, and the cutout control unit multiplies either the right-eye video data or the left-eye video data by an average value obtained by dividing the sum of the ratio indicated by the held ratio information and 1 by 2, multiplies the other video data by the average value divided by the ratio, and may cut out partial video data of equal size from the video data multiplied by the average value and the video data multiplied by the average value divided by the ratio.
- the information holding unit holds the ratio information when the lens is set to a plurality of different magnifications in association with each magnification, and the cutout control unit acquires the current lens magnification and, based on the ratio information associated with the acquired magnification, may cut out partial video data of equal size from the video data multiplied by the average value and the video data multiplied by the average value divided by the ratio.
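The averaging scheme above can be sketched numerically. This is an illustrative one-dimensional sketch, not the patented implementation: the function name is assumed, and scalar "sizes" stand in for scaling whole video frames.

```python
def equalize_sizes(size_right, size_left):
    """Equalize the projected size of the same target in two frames.

    ratio = size_left / size_right (the held ratio information).
    One frame is scaled by the average (ratio + 1) / 2, the other by the
    average divided by the ratio, so both end up the same size.
    """
    ratio = size_left / size_right
    average = (ratio + 1) / 2
    return size_right * average, size_left * (average / ratio)

# Hypothetical example: the left frame projects the target 4% larger;
# both results come out equal (102.0, up to floating-point rounding).
print(equalize_sizes(100.0, 104.0))
```

Note that both frames are scaled toward the mean rather than scaling one frame to match the other, which halves the worst-case resampling applied to either frame.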
- FIG. 1 is an external view showing an example of a stereoscopic video imaging apparatus 100 according to the first embodiment.
- FIG. 1A shows a video camera as the stereoscopic image capturing apparatus 100
- FIG. 1B shows a so-called digital still camera as the stereoscopic image capturing apparatus 100.
- the stereoscopic video imaging apparatus 100 is portable and includes a main body 102, two imaging lenses 104a and 104b, an operation unit 106, and a display unit (viewfinder) 108.
- the display unit 108 is openable and closable, and the apparatus can be made compact by folding the display unit 108 except during imaging. Since the imaging lens 104b is provided on the back surface of the display unit 108, a sufficient base line length between the imaging lenses 104a and 104b can be secured while avoiding an increase in the size of the stereoscopic video imaging apparatus 100.
- when the photographer grips the main body 102 of the stereoscopic video imaging apparatus 100 horizontally, the imaging axes 110a and 110b of the imaging units lie on the same horizontal plane and are either substantially parallel or intersect in the imaging direction.
- the imaging axes 110a and 110b indicate the imaging direction, and are lines extending in the imaging direction from the center point (video center point) of the video generated by the imaging unit.
- the stereoscopic video imaging apparatus 100 generates two video data (right-eye video data and left-eye video data) having binocular parallax through the two imaging lenses 104a and 104b for perceiving stereoscopic video, records them by a predetermined method for perceiving stereoscopic video such as the side-by-side method, the top-and-bottom method, the line-sequential method, or the frame-sequential method, and adjusts the imaging timing and angle of view according to the photographer's operation input through the operation unit 106.
- the stereoscopic video imaging apparatus 100 receives an input of switching between imaging modes such as outdoor, indoor, and night view from the photographer, and executes image processing corresponding thereto.
- such a stereoscopic video imaging apparatus 100 generally has two imaging units: one imaging unit generates right-eye video data to be visually recognized by the right eye, and the other imaging unit generates left-eye video data to be visually recognized by the left eye.
- if such calibration is not performed, there has been a problem that the photographer (observer) cannot correctly form a stereoscopic image from the two generated video data.
- the stereoscopic video imaging apparatus 100 of the present embodiment it is possible to maintain the image quality of the cut out partial video data by appropriately cutting out the video data generated by the imaging unit.
- a stereoscopic video imaging apparatus 100 will be described.
- FIG. 2 is a functional block diagram showing a schematic configuration of the stereoscopic video imaging apparatus 100.
- taking the video camera shown in FIG. 1A as an example, the stereoscopic video imaging apparatus 100 includes an operation unit 106, two imaging units 120 (indicated by 120a and 120b in FIG. 2), a video buffer 122, a data processing unit 124, a cutout processing unit 126, a video composition unit 128, a display unit 108, a video storage unit 130, an information holding unit 132, and a central control unit 134.
- the solid line indicates the flow of data
- the broken line indicates the flow of the control signal.
- the operation unit 106 includes operation keys such as a release switch, a cross key, and a joystick, and switches such as a touch panel superimposed on the display surface of the display unit 108, and accepts operation input from the photographer.
- the imaging units 120 are arranged such that the imaging axes 110a and 110b are substantially parallel or intersect at an arbitrary convergence point in the imaging direction.
- when generation (recording) of video data is selected by the photographer's operation input through the operation unit 106, the imaging unit 120a generates right-eye video data and the imaging unit 120b generates left-eye video data, and each outputs its video data to the video buffer 122.
- the imaging unit 120 includes an imaging lens 104 (indicated by 104a and 104b in FIG. 2), a zoom lens 140 (magnification lens) capable of enlarging and reducing an imaging target, a focus lens 142 used for focus adjustment, an iris (diaphragm) 144 used for exposure adjustment, an image sensor 146 that photoelectrically converts a light beam incident through the imaging lens 104 into electrical data (video data), and a drive unit 148 that drives the zoom lens 140, the focus lens 142, the diaphragm 144, and the image sensor 146 according to control signals of an imaging control unit 150 described later, and generates two pieces of video data on the two imaging axes 110a and 110b.
- the two imaging units 120 are linked to each other, and the zoom lens 140, the focus lens 142, the diaphragm 144, and the image sensor 146 of each are driven in parallel (synchronously) according to the control signal of the imaging control unit 150.
- the video buffer 122 includes a flash memory, a RAM, and the like, and temporarily stores the right-eye video data generated by one imaging unit 120a and the left-eye video data generated by the other imaging unit 120b in units of frames. Hold on.
- the data processing unit 124 applies video signal processing such as RGB (Red, Green, Blue) processing (γ correction and color correction), enhancement processing, and noise reduction processing to the right-eye video data and left-eye video data output from the video buffer 122.
- under the control of a cutout control unit 154 described later, the cutout processing unit 126 cuts out part of the video data after the video signal processing by the data processing unit 124, performing an imaging axis calibration process that calibrates the imaging axis or an enlargement/reduction calibration process that calibrates the enlargement/reduction ratio, and then outputs the result to the video composition unit 128. Detailed operations of the imaging axis calibration process and the enlargement/reduction calibration process of the cutout processing unit 126 will be described later. Note that the video composition unit 128 may be placed in front of the cutout processing unit 126.
- the video composition unit 128 merges the right-eye video data and the left-eye video data output from the cutout processing unit 126, generates composite data of a predetermined method for perceiving stereoscopic video, such as the side-by-side method, the top-and-bottom method, the line-sequential method, or the frame-sequential method, and outputs it to the display unit 108 and the video storage unit 130.
- the display unit 108 includes a liquid crystal display, an organic EL (Electro Luminescence) display, or the like, and is not limited to a display or viewfinder attached to a video camera or digital still camera; another display may be used. The display unit 108 displays the composite data output from the video composition unit 128. At this time, the display unit 108 may display only the right-eye video data or the left-eye video data constituting the composite data.
- for example, the video composition unit 128 outputs the composite data by the line-sequential method; the photographer, wearing polarized glasses or the like, sees only the right-eye video with the right eye and only the left-eye video with the left eye, and thus perceives the composite data as stereoscopic video.
- the display method is not limited to the line-sequential method. For example, a frame-sequential method in which right-eye and left-eye images are alternately displayed frame by frame and viewed through electronic shutter glasses, or a lenticular method in which the light traveling directions of the right-eye and left-eye images are controlled via a lenticular lens, may be used.
- the photographer can capture the subject at a desired position and occupied area by operating the operation unit 106 while viewing the video displayed on the display unit 108.
- the video storage unit 130 is configured with an HDD, a flash memory, a nonvolatile RAM, and the like, and stores the synthesized data output from the video synthesis unit 128 in accordance with a control signal of a storage control unit 156 described later.
- the video storage unit 130 may be configured as a device that stores the composite data in a removable storage medium such as an optical disk medium (CD, DVD, or BD) or a portable memory card. At this time, the video storage unit 130 can also encode the video data with a predetermined encoding method such as M-JPEG, MPEG-2, or H.264.
- the information holding unit 132 includes an EEPROM, a PROM, a non-volatile RAM, and the like, and holds the position information generated by the calibration value generation unit 152, which will be described later, at the time of factory shipment, for example.
- the information holding unit 132 holds in advance position information when the zoom lens 140 and the focus lens 142 are set to a plurality of different magnifications in association with the respective magnifications.
- the position information is the position of the right calibration point obtained by moving the index projected on the right-eye video data by the difference vector when the video center point of the right-eye video data and the left-eye video data is the origin. It shows the position of the left calibration point obtained by moving the index projected on the left-eye video data by the difference vector.
- the difference vector is a vector whose starting point is the midpoint of the connection between the index projected on the right-eye video data and the index projected on the left-eye video data, and the video center point is the end point.
- a means for generating position information by the calibration value generator 152 will be described in detail later.
- the central control unit 134 manages and controls the entire stereoscopic video imaging apparatus 100 by means of a semiconductor integrated circuit including a central processing unit (CPU), a ROM storing programs, and a RAM as a work area.
- the central control unit 134 also functions as the imaging control unit 150, the calibration value generation unit 152, the extraction control unit 154, and the storage control unit 156.
- the imaging control unit 150 controls the imaging unit 120 in accordance with the operation input of the photographer, that is, the information supplied from the operation unit 106. For example, the imaging control unit 150 causes the driving unit 148 to drive the zoom lens 140, the focus lens 142, the diaphragm 144, and the imaging element 146 so that appropriate video data can be obtained. Further, the imaging control unit 150 moves the zoom lens 140 and the focus lens 142 through the driving unit 148 when performing the zoom function (magnification changing function).
- the calibration value generation unit 152 generates position information used when the cutout control unit 154 performs the imaging axis calibration process, and causes the information holding unit 132 to hold the position information.
- FIGS. 3 to 5 are explanatory diagrams for explaining how the calibration value generation unit 152 generates position information. As described above, there are individual differences between the two imaging units 120, in the formation accuracy and assembly accuracy of the imaging lens 104, the zoom lens 140, and the focus lens 142, and in the drive unit 148 including the lens driving motor. Therefore, a shift occurs between the reference imaging axis set when designing the stereoscopic video imaging apparatus 100 (hereinafter simply referred to as the reference axis) and the actual imaging axes (hereinafter simply referred to as actual imaging axes) 110a and 110b, and because this shift differs for each imaging unit 120, a difference arises between the two imaging units 120.
- the calibration value generation unit 152 first calculates the position of the index projected on the right-eye video data and the position of the index projected on the left-eye video data, taking the video center point (the point indicating the actual imaging axis) of each video data as the origin.
- indexes 162 as shown in FIG. 3C are installed on the reference axes 160a and 160b of the imaging units 120a and 120b, respectively.
- the zoom lens 140 and the focus lens 142 are set to a predetermined magnification and an image is captured, and the calibration value generation unit 152 calculates, for the right-eye video data and the left-eye video data generated by the imaging units 120a and 120b, the position of the index projected on each with the video center point as the origin.
- the index 162 may be installed at any position on the reference axes 160a and 160b.
- the index 162 uses an "x" mark here, but the present invention is not limited to this; any index can be used as long as its position can be expressed as a point.
- when the video data generated by the imaging units 120a and 120b are the right-eye video data 164a and the left-eye video data 164b as shown in FIG. 4A, it can be seen that the index A projected on the right-eye video data 164a (the index 162 as projected on the right-eye video data 164a) is shifted to the right in front view with respect to the video center point M, the point indicating the actual imaging axis of the imaging unit 120a.
- the video center point M is an intersection of the vertical center line 166a and the horizontal center line 168a of the right-eye video data 164a.
- since the index 162 is set on the reference axis, the point indicating the reference axis in the right-eye video data 164a is shifted to the right in front view with respect to the video center point M in the right-eye video data 164a; that is, the actual imaging axis of the imaging unit 120a deviates from the reference axis.
- similarly, it can be seen that the index B projected on the left-eye video data 164b (the index 162 as projected on the left-eye video data 164b) is shifted to the lower left in front view.
- the video center point N is an intersection of the vertical center line 166b and the horizontal center line 168b of the left-eye video data 164b.
- the point indicating the reference axis in the left-eye video data 164b is shifted to the lower left in front view with respect to the video center point N in the left-eye video data 164b; that is, the actual imaging axis of the imaging unit 120b deviates from the reference axis.
- the calibration value generation unit 152 calculates the position of the index A (the point indicating the reference axis) projected on the right-eye video data 164a, taking as the origin the video center point M of the right-eye video data 164a, which is the point indicating the actual imaging axis.
- likewise, the calibration value generation unit 152 calculates the position of the index B (the point indicating the reference axis) projected on the left-eye video data 164b, taking as the origin the video center point N of the left-eye video data 164b, which is the point indicating the actual imaging axis.
- let the xy coordinate value of the index A projected on the right-eye video data 164a (hereinafter simply referred to as the reference point A) be (ax, ay), and the xy coordinate value of the index B projected on the left-eye video data 164b (hereinafter simply referred to as the reference point B) be (bx, by).
- the x coordinate is a coordinate in the horizontal direction of the screen
- the y coordinate is a coordinate in the vertical direction of the screen.
- the calibration value generation unit 152 derives a difference vector E whose start point is the midpoint D of the connection line C joining the calculated reference points A and B and whose end point is the video center point (origin).
- coordinates are expressed as xy values (x, y) with the video center point as the origin. The xy coordinate value of the midpoint D is ((ax+bx)/2, (ay+by)/2); therefore, expressed in xy coordinate values, the difference vector E is a vector having ((ax+bx)/2, (ay+by)/2) as the start point and the origin (0, 0) as the end point.
- the calibration value generation unit 152 generates position information indicating the position of the right calibration point that has moved the reference point A by the difference vector E and the position of the left calibration point that has moved the reference point B by the difference vector E, and stores the information. Held by the unit 132.
- since the xy coordinate value of the reference point A is (ax, ay), that of the reference point B is (bx, by), and the difference vector E has start point ((ax+bx)/2, (ay+by)/2) and end point (0, 0), the right calibration point F is (ax − (ax+bx)/2, ay − (ay+by)/2) and the left calibration point G is (bx − (ax+bx)/2, by − (ay+by)/2).
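The computation of the calibration points can be sketched as follows. This is an illustrative sketch, not the patented implementation; the function name is assumed, and coordinates are measured with each frame's video center point as the origin, as in the description above.

```python
def calibration_points(a, b):
    """Compute the right/left calibration points F and G.

    a, b: (x, y) positions of the index (reference points A and B) in the
    right-eye and left-eye video data, each measured from that frame's
    video center point.
    """
    ax, ay = a
    bx, by = b
    # midpoint D of the connection line joining A and B
    dx, dy = (ax + bx) / 2, (ay + by) / 2
    # difference vector E runs from D to the origin (the video center point)
    ex, ey = -dx, -dy
    # move each reference point by E
    f = (ax + ex, ay + ey)  # right calibration point F
    g = (bx + ex, by + ey)  # left calibration point G
    return f, g

# Hypothetical indices: A = (4, 2), B = (-2, -6)
f, g = calibration_points((4.0, 2.0), (-2.0, -6.0))
print(f, g)  # F = (3.0, 4.0), G = (-3.0, -4.0)
```

Because both points are shifted by the same vector E, F and G always come out symmetric about the video center point, which is what makes the later cutout margins equal for the two frames.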
- the calibration value generation unit 152 sets the zoom lens 140 and the focus lens 142 to a plurality of magnifications, for example 3 to 4 levels, and calculates the position of the right calibration point F and the position of the left calibration point G for the video data 164 captured at each of the set magnifications. The calibration value generation unit 152 then causes the information holding unit 132 to hold position information indicating the calculated positions of the right calibration point F and the left calibration point G, in association with the magnification of the zoom lens 140 and the focus lens 142 at which the video data used for the calculation was captured. For magnifications at which the positions of the right calibration point F and the left calibration point G have not been calculated, the positions are obtained by linear interpolation from the position information for the plural magnifications of the zoom lens 140 and the focus lens 142 held in the information holding unit 132.
- the interpolation method is not limited to linear interpolation, and may be other interpolation processing such as interpolation using a spline function.
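The per-magnification lookup with linear interpolation can be sketched as below; a minimal sketch assuming the held table is a list of (magnification, (x, y)) pairs sorted by magnification, with the 3-4 measured levels as entries (names are illustrative).

```python
from bisect import bisect_left

def interpolate_position(mag, table):
    """Look up or linearly interpolate a calibration-point position.

    table: (magnification, (x, y)) pairs sorted by magnification,
    e.g. the few levels measured before shipping. Magnifications
    outside the measured range are clamped to the nearest entry.
    """
    mags = [m for m, _ in table]
    if mag <= mags[0]:
        return table[0][1]
    if mag >= mags[-1]:
        return table[-1][1]
    i = bisect_left(mags, mag)
    (m0, (x0, y0)), (m1, (x1, y1)) = table[i - 1], table[i]
    t = (mag - m0) / (m1 - m0)
    return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))

# Hypothetical measured levels at 1x, 2x, and 4x zoom
table = [(1.0, (2.0, -1.0)), (2.0, (4.0, 1.0)), (4.0, (8.0, 5.0))]
print(interpolate_position(1.5, table))  # (3.0, 0.0)
```

A spline through the same table, as the description allows, would only change the interpolation formula; the table layout and lookup stay the same.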
- the calibration value generation unit 152 holds the position information in the information holding unit 132 in advance, before shipping, and the extraction center of the video data is calibrated using the position information, so that the deviation between the reference axis and the actual imaging axis of the right-eye video data generated by the stereoscopic video imaging apparatus 100 and the corresponding deviation of the left-eye video data are approximately matched.
- the installation position of the index 162 is not limited to the reference axes; as shown in FIG. 5, any point within the angle of view on the circumference 170 of a circle centered at a point P equidistant from the imaging units 120a and 120b and having as its radius the distance from the point P to the convergence point Q can be set as the installation position of the index 162.
- based on the position information held in the information holding unit 132, the cutout control unit 154 controls the cutout processing unit 126 to cut out partial video data of equal size, with the right calibration point F as the cutout center from the right-eye video data 164a and the left calibration point G as the cutout center from the left-eye video data 164b.
- FIG. 6 is an explanatory diagram for explaining the detailed operation of the imaging axis calibration processing by the cutout processing unit 126.
- when cutting out the partial video data 180a from the right-eye video data 164a, the cutout processing unit 126 uses the right calibration point F as the cutout center in accordance with the control of the cutout control unit 154; likewise, when cutting out the partial video data 180b from the left-eye video data 164b, it uses the left calibration point G as the cutout center.
- the cutout processing unit 126 cuts out the partial video data 180a and the partial video data 180b at the same size with the same aspect ratio (for example, 16:9).
- the cut-out processing unit 126 expands / contracts the cut-out partial video data 180 in accordance with the method of the video composition unit 128.
- for example, when the number of pixels of the right-eye video data 164a and the left-eye video data 164b generated by the image sensor 146 is 3840×2160 and the video composition unit 128 uses a 1920×1080 side-by-side format, the cutout processing unit 126 extracts 1920×1080 partial video data 180a and partial video data 180b from the right-eye video data 164a and the left-eye video data 164b, respectively, and further compresses the number of pixels in the horizontal direction, generating 1920×540 partial video data 180a and partial video data 180b.
- when the cutout center is shifted independently for each of the two video data, that is, when the right calibration point F′ and the left calibration point G′ in FIG. 6B are used as the cutout centers, the allowable vertical or horizontal amount of the cutout range is limited by one of the video data. For example, the vertical cutout range of the partial video data 182b is limited by the left-eye video data 164b, and since the partial video data 182a must be cut out to match the size of the partial video data 182b, its cutout range is also limited; as a result, the partial video data 182a and 182b become smaller.
- in contrast, the cutout control unit 154 shifts the cutout centers of both of the two video data 164 (the right-eye video data 164a and the left-eye video data 164b) by an amount corresponding to the difference between the imaging units 120 in the deviation of the actual imaging axis from the reference axis, so the horizontal and vertical allowable amounts of the cutout range are equal for the two video data 164. Therefore, compared with shifting the cutout centers independently for each of the two video data 164 as shown in FIG. 6B, the distance between the cutout center and the edge of the video data 164 can be secured to the maximum, and the cutout range can be widened.
- the storage control unit 156 causes the video storage unit 130 to store the partial video data cut out by the cut-out control unit 154 by the cut-out processing unit 126.
- the stereoscopic video imaging apparatus 100 can thus maintain the image quality of the cut-out partial video data by appropriately cutting out the video data generated by each imaging unit.
- in the present embodiment, the stereoscopic video imaging apparatus 100 is provided with two imaging units 120 to generate the video data; however, the configuration is not limited to this, and one subject may instead be imaged using an optical technique such as a mirror.
- the position information that the cutout control unit 154 refers to in the information holding unit 132 indicates the cutout center and cutout range used when the cutout processing unit 126 cuts out video data.
- a stereoscopic video imaging apparatus 300 will now be described that can further reduce the magnification difference between the generated right-eye video data and left-eye video data by using the actual magnification error between the imaging units 120.
- FIG. 7 is a functional block diagram showing a schematic configuration of the stereoscopic video imaging apparatus 300 according to the second embodiment.
- the stereoscopic video imaging apparatus 300 includes an operation unit 106, two imaging units 120 (denoted 120a and 120b in FIG. 7), a video buffer 122, a data processing unit 124, a cutout processing unit 326, a video composition unit 128, a display unit 108, a video storage unit 130, an information holding unit 332, and a central control unit 334.
- the central control unit 334 also functions as an imaging control unit 150, a calibration value generation unit 352, a cutout control unit 354, and a storage control unit 156.
- under the control of the cutout control unit 354 described later, the cutout processing unit 326 cuts out part of the video data that has undergone video signal processing by the data processing unit 124, performs imaging axis calibration processing or enlargement/reduction (scaling) calibration processing, and outputs the result to the video composition unit 128.
- the imaging axis calibration processing of the cutout processing unit 326 is substantially the same as that of the first embodiment described above; the detailed operation of the scaling calibration processing will be described later.
- in the present embodiment, the cutout processing unit 326 first performs the imaging axis calibration processing and then performs the scaling calibration processing; however, the order of the two processes is not particularly limited.
- the information holding unit 332 includes an EEPROM, a PROM, a nonvolatile RAM, and the like.
- the information holding unit 332 holds, in advance (for example, at the time of factory shipment), ratio information generated by the calibration value generation unit 352 described later.
- the ratio information indicates the dimensional ratio of the same imaging target projected onto the right-eye video data and the left-eye video data generated by the two imaging units 120a and 120b when they are set to a plurality of different magnifications, and is stored in advance in association with each set magnification.
- the calibration value generation unit 352 of the central control unit 334 generates ratio information used when the cutout control unit 354 performs the enlargement / reduction calibration process, and causes the information holding unit 332 to hold the ratio information.
- FIG. 8 is an explanatory diagram for explaining a method of calculating ratio information by the calibration value generation unit 352.
- the two imaging units 120 each include the imaging lens 104, the zoom lens 140, the focus lens 142, and a driving unit 148 such as a lens driving motor, and individual differences between these components cause a deviation between the magnification of the design value (hereinafter simply called the set magnification) and the actual magnification realized in the video data (hereinafter simply called the actual magnification).
- as a result, a difference (hereinafter referred to as a dimensional difference) arises between the two imaging units 120 in the dimensions at which the same imaging target is projected.
- the calibration value generation unit 352 therefore calculates ratio information for reducing the dimensional difference between the two imaging units 120 at a predetermined set magnification.
- an index 362 is installed on the reference axes 160a and 160b of the imaging units 120a and 120b, at an equal distance R from each imaging unit and perpendicular to the reference axes, and each imaging unit 120 images the index 362.
- in this way, differences in the angle between each imaging unit 120 and the index 362 caused by rotation about the reference axes 160a and 160b, or by rotation about an axis orthogonal to the reference axes 160a and 160b, can be absorbed, and the dimensional difference between the imaging units 120 can be calculated with high accuracy.
- although the index 362 is installed on the reference axes 160a and 160b here, the installation position is not limited to this.
- a perfect circle is used as the index 362 here, but the index is not limited to this; any index may be used as long as the dimensional ratio of the same imaging target projected onto the right-eye video data and the left-eye video data can be calculated from it.
- the calibration value generation unit 352 calculates, as the dimensional ratio, the ratio β/α of the index diameter β projected onto the left-eye video data to the index diameter α projected onto the right-eye video data. Further, the calibration value generation unit 352 calculates the dimensional ratio from video data captured with the set magnification changed to a plurality of values, for example, 3 to 4 levels, and causes the information holding unit 332 to hold ratio information indicating each dimensional ratio in association with its set magnification. Ratio information for a set magnification at which the dimensional ratio has not been calculated is obtained by linearly interpolating the dimensional ratios held in the information holding unit 332 for the plurality of set magnifications. Note that the interpolation method is not limited to linear interpolation, and other interpolation processing, such as interpolation using a spline function, may be used.
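The interpolation step can be sketched minimally as follows; the function name and the table layout are assumptions, as the patent specifies only that dimensional ratios stored for a few set magnifications are linearly interpolated.

```python
def ratio_at(set_mag, ratio_table):
    """ratio_table: list of (set_magnification, dimensional_ratio)
    pairs sorted by magnification, as held in the information holding
    unit. Returns the dimensional ratio at set_mag by linear
    interpolation, clamped to the table's end points."""
    mags = [m for m, _ in ratio_table]
    ratios = [r for _, r in ratio_table]
    if set_mag <= mags[0]:
        return ratios[0]
    if set_mag >= mags[-1]:
        return ratios[-1]
    for (m0, r0), (m1, r1) in zip(ratio_table, ratio_table[1:]):
        if m0 <= set_mag <= m1:
            t = (set_mag - m0) / (m1 - m0)  # position within the segment
            return r0 + t * (r1 - r0)
```

Swapping the linear segment for a spline evaluation would implement the alternative interpolation the text mentions.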
- since the calibration value generation unit 352 stores the ratio information in the information holding unit 332 in advance, before shipment, and the video data is calibrated using this ratio information, the dimensional difference of the video data generated by the stereoscopic video imaging apparatus 300 at the time of shipment is approximately matched between the two imaging units 120.
- the cutout control unit 354 acquires the current set magnification and computes an average value by dividing the sum of the ratio β/α indicated by the ratio information associated with that magnification and 1 by 2. It multiplies the video data onto which the index of diameter α is projected (the right-eye video data) by this average value, and multiplies the video data not multiplied by the average value, that is, the video data onto which the index of diameter β is projected (the left-eye video data), by the average value divided by the ratio β/α. The cutout control unit 354 then controls the cutout processing unit 326 so as to cut out partial video data of equal size from the video data multiplied by the average value (the right-eye video data) and the video data multiplied by the average value divided by the ratio (the left-eye video data).
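The apportioning rule above reduces to two per-eye scale factors; a minimal sketch (the function and parameter names are ours):

```python
def apportioned_scale_factors(beta_over_alpha):
    """beta_over_alpha: dimensional ratio of the index diameter in the
    left-eye video data (beta) to that in the right-eye video data
    (alpha). Returns (right_factor, left_factor): the average value
    (ratio + 1) / 2 is applied to the right-eye data, and the average
    divided by the ratio is applied to the left-eye data, so the same
    imaging target ends up the same size in both images."""
    avg = (beta_over_alpha + 1.0) / 2.0
    return avg, avg / beta_over_alpha
```

For the example of FIG. 9, where α = 1 and β = 2, this gives factors (1.5, 0.75): 1 × 1.5 = 2 × 0.75 = 1.5, so the projected diameters match after scaling.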
- FIGS. 9 and 10 are explanatory diagrams for explaining the detailed operation of the enlargement/reduction calibration processing by the cutout processing unit 326.
- as shown in FIG. 9B, suppose the diameter α of the imaging target projected onto the right-eye video data 364a is 1 and the diameter β of the imaging target projected onto the left-eye video data 364b is 2.
- if, to make the diameters equal, partial video data 366 is cut out from the right-eye video data 364a and enlarged, about 3/4 of the actual information amount (pixels) of the partial video data 366 is lost, and the image quality of the enlarged partial video data 368 deteriorates.
- here, the actual information amount means the information amount when pixel interpolation is not performed.
- when the enlarged partial video data 368 shown in FIG. 9A is compared with the left-eye video data 364b, only the enlarged partial video data 368, whose actual information amount has been reduced to 1/4, suffers image quality degradation. A difference in image quality therefore arises between the partial video data 368 viewed by the right eye and the left-eye video data 364b (actual information amounts of 1/4 and 1, respectively), and when the viewer finally perceives them as a stereoscopic video, the image quality is significantly degraded.
- conversely, to make the diameter α of the imaging target projected onto the right-eye video data 364a and the diameter β of the imaging target projected onto the left-eye video data 364b equal, only the left-eye video data 364b may be reduced to obtain reduced video data 370. In this case, since the image must be enlarged after the reduction to the size required for display on the display unit 108, the actual information amount of the left-eye video data 364b is not reduced, but the actual information amount of the partial video data 366 cut out from the right-eye video data 364a becomes 1/4, and its image quality deteriorates.
- therefore, when the diameter α of the imaging target projected onto the right-eye video data 364a is 1 and the diameter β of the imaging target projected onto the left-eye video data 364b is 2, the cutout processing unit 326 multiplies the right-eye video data 364a by the average value 1.5 to obtain enlarged/reduced video data 372a, and multiplies the left-eye video data 364b by 0.75, the average value divided by the ratio β/α = 2, to obtain enlarged/reduced video data 372b.
- the cutout processing unit 326 outputs the enlarged/reduced video data 372b as it is to the video composition unit 128 as the video data to be viewed by the left eye, and, as the video data to be viewed by the right eye, cuts out from the enlarged/reduced video data 372a partial video data 374a having the same size as the enlarged/reduced video data 372b and outputs it to the video composition unit 128.
- as described above, the cutout control unit 354 apportions the dimensional ratio of the imaging target and controls the cutout processing unit 326 so that the enlargement/reduction calibration processing is applied to both of the two pieces of video data (the right-eye video data 364a and the left-eye video data 364b).
- this eliminates the difference in image quality between the two pieces of video data that would arise from enlarging only one of them, and also suppresses the significant decrease in actual information amount that would arise from reducing only one of them. Therefore, the stereoscopic video imaging apparatus 300 according to the present embodiment can make the image quality uniform while suppressing the loss of information between the pieces of video data, and can reduce the image quality degradation of the final stereoscopic video.
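As a rough back-of-the-envelope model (ours, not from the patent) of why apportioning helps, treat the fraction of non-interpolated information that survives a per-axis rescale by s as 1/s² when enlarging and s² when reducing:

```python
def info_fraction(scale):
    """Rough model of the actual (non-interpolated) information that
    survives rescaling video data by `scale` per axis: enlarging
    (scale > 1) spreads 1/scale**2 real pixels over the output, while
    reducing (scale <= 1) discards all but scale**2 of the pixels."""
    return 1.0 / scale**2 if scale > 1.0 else scale**2

# Enlarge only the right-eye data (FIG. 9): 1/4 vs 1 -- very uneven.
enlarge_only = (info_fraction(2.0), info_fraction(1.0))
# Apportioned scaling (FIG. 10): factors 1.5 and 0.75 -- nearly even.
apportioned = (info_fraction(1.5), info_fraction(0.75))
```

Under this model, enlarge_only evaluates to (0.25, 1.0) while apportioned evaluates to about (0.44, 0.5625): the loss is shared far more evenly between the two eyes, which matches the uniform-quality argument above.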
- in the present embodiment, the cutout processing unit 326 cuts out the partial video data after performing the scaling calibration processing on the video data in accordance with the control of the cutout control unit 354; however, the partial video data may instead be cut out from the video data first, and the scaling calibration processing performed afterwards.
- it is also possible to perform only the scaling calibration processing described in the second embodiment, without performing the imaging axis calibration processing described in the first embodiment.
- the present invention can be used for a stereoscopic video imaging apparatus that generates video for perceiving stereoscopic video.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Studio Devices (AREA)
- Stereoscopic And Panoramic Photography (AREA)
Description
FIG. 1 is an external view showing an example of the stereoscopic video imaging apparatus 100 according to the first embodiment. FIG. 1(a) shows a video camera as the stereoscopic video imaging apparatus 100, and FIG. 1(b) shows a so-called digital still camera as the stereoscopic video imaging apparatus 100. Some stereoscopic video imaging apparatuses 100 are portable, and the apparatus includes a main body 102, two imaging lenses 104a and 104b, an operation unit 106, and a display unit (viewfinder) 108.
In the stereoscopic video imaging apparatus 100 of the first embodiment described above, the cutout control unit 154 adjusts the cutout center and cutout range used when the cutout processing unit 126 cuts out video data, based on the position information held in the information holding unit 132, making it possible to reduce the difference between the imaging units 120 in the deviation of the actual imaging axis from the reference axis. A stereoscopic video imaging apparatus 300 will now be described that can further reduce the magnification difference between the generated right-eye video data and left-eye video data by additionally using the actual magnification error between the imaging units 120.
104 … imaging lens
110a, 110b … imaging axes
120 (120a, 120b) … imaging units
126, 326 … cutout processing units
132, 332 … information holding units
140 … zoom lens (lens capable of enlarging and reducing the imaging target)
142 … focus lens
152, 352 … calibration value generation units
154, 354 … cutout control units
Claims (4)
- An imaging unit that generates right-eye video data and left-eye video data having binocular parallax for causing a stereoscopic video to be perceived;
an information holding unit that holds position information indicating, with the video center points of the right-eye video data and the left-eye video data each taken as an origin, the position of a right calibration point obtained by moving the index projected onto the right-eye video data by a difference vector whose start point is the midpoint of a line connecting the index projected onto the right-eye video data and the index projected onto the left-eye video data and whose end point is the video center point, and the position of a left calibration point obtained by moving the index projected onto the left-eye video data by the difference vector; and
a cutout control unit that, based on the position information, cuts out partial video data of equal size from the right-eye video data with the right calibration point as a cutout center and from the left-eye video data with the left calibration point as a cutout center;
a stereoscopic video imaging apparatus comprising the above. - The stereoscopic video imaging apparatus according to claim 1, wherein the imaging unit has, on the imaging axis for generating the right-eye video data and on the imaging axis for generating the left-eye video data, lenses each capable of enlarging and reducing the imaging target,
the information holding unit holds position information obtained when the lenses are set to a plurality of different magnifications, in association with each of the magnifications, and
the cutout control unit acquires the current magnification of the lenses and cuts out partial video data of equal size from the right-eye video data and the left-eye video data based on the position information associated with the current lens magnification. - The stereoscopic video imaging apparatus according to claim 1, wherein the imaging unit has, on the imaging axis for generating the right-eye video data and on the imaging axis for generating the left-eye video data, lenses each capable of enlarging and reducing the imaging target,
the information holding unit holds ratio information indicating the dimensional ratio of the same imaging target projected onto the right-eye video data and the left-eye video data generated by the imaging unit, and
the cutout control unit multiplies one of the right-eye video data and the left-eye video data by an average value obtained by dividing the sum of the ratio indicated by the held ratio information and 1 by 2, multiplies the video data not multiplied by the average value by a value obtained by dividing the average value by the ratio, and cuts out partial video data of equal size from the video data multiplied by the average value and the video data multiplied by the value obtained by dividing the average value by the ratio. - The stereoscopic video imaging apparatus according to claim 3, wherein the information holding unit holds ratio information obtained when the lenses are set to a plurality of different magnifications, in association with each of the magnifications, and
the cutout control unit acquires the current magnification of the lenses and, based on the ratio information associated with the current lens magnification, cuts out partial video data of equal size from the video data multiplied by the average value and the video data multiplied by the value obtained by dividing the average value by the ratio.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP11774688.3A EP2566166B1 (en) | 2010-04-28 | 2011-02-22 | Three-dimensional imaging device |
CN201180021323.3A CN102893616B (zh) | 2010-04-28 | 2011-02-22 | 立体影像拍摄装置 |
US13/643,593 US9154771B2 (en) | 2010-04-28 | 2011-02-22 | Apparatus for capturing stereoscopic image |
KR1020127027767A KR101385583B1 (ko) | 2010-04-28 | 2011-02-22 | 입체 영상 촬상 장치 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010-102965 | 2010-04-28 | ||
JP2010102965A JP5304721B2 (ja) | 2010-04-28 | 2010-04-28 | 立体映像撮像装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011135904A1 true WO2011135904A1 (ja) | 2011-11-03 |
Family
ID=44861225
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/053859 WO2011135904A1 (ja) | 2010-04-28 | 2011-02-22 | 立体映像撮像装置 |
Country Status (6)
Country | Link |
---|---|
US (1) | US9154771B2 (ja) |
EP (1) | EP2566166B1 (ja) |
JP (1) | JP5304721B2 (ja) |
KR (1) | KR101385583B1 (ja) |
CN (1) | CN102893616B (ja) |
WO (1) | WO2011135904A1 (ja) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015054507A1 (en) | 2013-10-10 | 2015-04-16 | Pronutria, Inc. | Nutritive polypeptide production systems, and methods of manufacture and use thereof |
US20150103162A1 (en) * | 2013-10-14 | 2015-04-16 | Etron Technology, Inc. | System of quickly generating a relationship table of distance to disparity of a camera and related method thereof |
CN104581107B (zh) * | 2013-10-16 | 2018-07-20 | 深圳市云立方信息科技有限公司 | 一种对3d相机进行校正的装置及方法 |
JP2018207259A (ja) * | 2017-06-01 | 2018-12-27 | マクセル株式会社 | ステレオ撮像装置 |
JP2022168781A (ja) * | 2021-04-26 | 2022-11-08 | キヤノン株式会社 | 電子機器、その制御方法、プログラムおよび記憶媒体 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004126905A (ja) * | 2002-10-02 | 2004-04-22 | Honda Motor Co Ltd | 画像処理装置 |
JP2005072674A (ja) * | 2003-08-27 | 2005-03-17 | Sharp Corp | 三次元画像生成装置および三次元画像生成システム |
JP2006162991A (ja) | 2004-12-07 | 2006-06-22 | Fuji Photo Film Co Ltd | 立体画像撮影装置 |
JP2006276743A (ja) * | 2005-03-30 | 2006-10-12 | Fuji Photo Film Co Ltd | 撮像範囲調整システム、立体撮像装置、調整装置、及び撮像範囲調整方法 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7586534B2 (en) * | 2005-03-22 | 2009-09-08 | Fujifilm Corporation | Multi-eye image pickup device, adjusting method and device therefor, and image-area adjusting system and method |
KR100834637B1 (ko) * | 2006-11-27 | 2008-06-02 | 삼성전자주식회사 | 스테레오 카메라 장치에서 이미지들을 정렬하기 위한 장치및 방법 |
JP4794510B2 (ja) * | 2007-07-04 | 2011-10-19 | ソニー株式会社 | カメラシステムおよびカメラの取り付け誤差の補正方法 |
DE102008040985B4 (de) * | 2008-08-05 | 2021-05-27 | Robert Bosch Gmbh | Verfahren zur Kalibrierung eines Mehrkamerasystems |
- 2010
- 2010-04-28 JP JP2010102965A patent/JP5304721B2/ja active Active
- 2011
- 2011-02-22 CN CN201180021323.3A patent/CN102893616B/zh active Active
- 2011-02-22 EP EP11774688.3A patent/EP2566166B1/en active Active
- 2011-02-22 WO PCT/JP2011/053859 patent/WO2011135904A1/ja active Application Filing
- 2011-02-22 KR KR1020127027767A patent/KR101385583B1/ko active IP Right Grant
- 2011-02-22 US US13/643,593 patent/US9154771B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004126905A (ja) * | 2002-10-02 | 2004-04-22 | Honda Motor Co Ltd | 画像処理装置 |
JP2005072674A (ja) * | 2003-08-27 | 2005-03-17 | Sharp Corp | 三次元画像生成装置および三次元画像生成システム |
JP2006162991A (ja) | 2004-12-07 | 2006-06-22 | Fuji Photo Film Co Ltd | 立体画像撮影装置 |
JP2006276743A (ja) * | 2005-03-30 | 2006-10-12 | Fuji Photo Film Co Ltd | 撮像範囲調整システム、立体撮像装置、調整装置、及び撮像範囲調整方法 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2566166A4 |
Also Published As
Publication number | Publication date |
---|---|
US9154771B2 (en) | 2015-10-06 |
KR101385583B1 (ko) | 2014-04-15 |
CN102893616A (zh) | 2013-01-23 |
JP5304721B2 (ja) | 2013-10-02 |
EP2566166A4 (en) | 2014-06-11 |
KR20120139838A (ko) | 2012-12-27 |
JP2011234151A (ja) | 2011-11-17 |
EP2566166B1 (en) | 2015-11-25 |
US20130038682A1 (en) | 2013-02-14 |
CN102893616B (zh) | 2015-06-17 |
EP2566166A1 (en) | 2013-03-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102165785B (zh) | 三维成像装置和方法 | |
JP5789793B2 (ja) | 3次元撮像装置、レンズ制御装置、およびプログラム | |
JP5258722B2 (ja) | 複眼カメラ及びその制御方法 | |
KR101824439B1 (ko) | 모바일 스테레오 카메라 장치 및 그 촬영방법 | |
US20120113278A1 (en) | Imaging apparatus, image processing apparatus, and image processing method, and program | |
JP5597525B2 (ja) | 立体映像撮像装置および立体映像撮像方法 | |
JP5166650B2 (ja) | 立体撮像装置、画像再生装置及び編集ソフトウエア | |
WO2012017684A1 (ja) | レンズユニット | |
JP2012532341A (ja) | 中間像面における空間多重化を採用した立体視投影システム | |
US10045005B2 (en) | 3D camera module | |
JP5304721B2 (ja) | 立体映像撮像装置 | |
WO2012026502A1 (ja) | 立体撮影装置および立体撮影方法 | |
JP2012042912A (ja) | 立体画像撮像装置および立体画像撮像装置におけるレンズ駆動方法 | |
JP2011095431A (ja) | 立体映像撮像装置および立体映像撮像方法 | |
WO2007029686A1 (ja) | 立体画像記録再生システム | |
JP2013051637A (ja) | 符号化装置及び符号化方法 | |
JP2013105000A (ja) | 映像表示装置及び映像表示方法 | |
JP2011008034A (ja) | ステレオビューア装置 | |
JP2012235281A (ja) | 3d画像撮像装置及びプログラム | |
KR20110034737A (ko) | 멀티프레임 타입의 디지털 입체영상 카메라 및 그 촬영방법 | |
JP2005020079A (ja) | 撮像装置 | |
WO2004071101A1 (ja) | 立体映像記録再生装置 | |
JP5362157B1 (ja) | 立体視用映像撮影装置、および立体視用映像撮影方法 | |
US9041777B2 (en) | Stereo camera system and method for controlling convergence | |
JP2023102773A (ja) | 温度変化に対する視差光学素子のキャリブレーション装置及び方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201180021323.3 Country of ref document: CN |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11774688 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 20127027767 Country of ref document: KR Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 2011774688 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 13643593 Country of ref document: US |
NENP | Non-entry into the national phase |
Ref country code: DE |