WO2011114572A1 - Imaging apparatus, method, and program, and recording medium used therefor - Google Patents
- Publication number
- WO2011114572A1 (PCT/JP2010/069464)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- unit
- imaging
- size
- shake correction
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B5/00—Adjustment of optical system relative to image or object surface other than for focusing
- G03B5/02—Lateral adjustment of lens
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B5/00—Adjustment of optical system relative to image or object surface other than for focusing
- G03B5/08—Swing backs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/139—Format conversion, e.g. of frame-rate or size
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/239—Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/246—Calibration of cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/685—Vibration or motion blur correction performed by mechanical compensation
- H04N23/687—Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/2628—Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
Definitions
- the present invention relates to camera shake correction of a compound eye imaging apparatus.
- in Patent Literature 1, a pair of images (stereo images) exhibiting a difference corresponding to binocular parallax is obtained by photographing light incident through a pair of objective lenses of digital binoculars with a pair of imaging elements. After various corrections, the geometric difference in image structure corresponding to binocular parallax is recognized for the pair of images stored in memory, and noise reduction processing is then performed to reduce differences other than the recognized geometric difference (for example, differences in random noise superimposed on the pair of images by the image sensors). The image after noise reduction is displayed on the display device and is visually recognized (stereoscopically) by the user via the eyepieces.
- Patent Document 2 discloses a three-dimensional image capturing apparatus including: an imaging unit that images a subject and generates, via a stereo adapter, image data of two regions, a left-eye imaging region and a right-eye imaging region; a camera shake correction unit that calculates a position correction amount for camera shake; a zoom control unit that performs zoom control of the imaging unit; a size determination unit; a position determination unit; a clipping unit that cuts out left and right image regions for generating appropriate stereoscopic vision; an enlargement/reduction unit; a combining unit that combines the left and right image regions for generating a stereoscopic view; and a recording unit.
- according to Patent Document 3, camera shake correction methods for an imaging apparatus include electronic, optical, and sensor (image sensor) shift methods.
- a subject is imaged using two or more cameras provided at different positions, corresponding points, i.e., pixels that correspond between the acquired images (a base image captured by a base camera and a reference image captured by a reference camera), are searched for (stereo matching), and the difference in position (parallax) between each pixel on the base image and the corresponding pixel on the reference image is calculated.
- from the parallax, it is possible to measure the distance from the base camera or the reference camera to the point on the subject corresponding to the pixel and to generate a distance image representing the three-dimensional shape of the subject.
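As an illustrative sketch (not part of the disclosed invention), the conversion from parallax to distance can be expressed with the standard triangulation relation Z = f·B/d for a rectified stereo pair; the function and parameter names below are hypothetical:

```python
def distance_from_disparity(disparity_px: float,
                            focal_length_px: float,
                            baseline_m: float) -> float:
    """Distance (m) to the scene point, assuming rectified cameras: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite distance")
    return focal_length_px * baseline_m / disparity_px

# Example: 1000 px focal length, 65 mm baseline, 10 px parallax
print(distance_from_disparity(10.0, 1000.0, 0.065))  # prints 6.5
```

Evaluating this relation at every pixel of the base image yields the distance image mentioned above.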
- stereo matching exploits the fact that the plurality of points in real space that map to a given pixel on the base image lie on a single line; a corresponding point, that is, the pixel on the reference image corresponding to that pixel, is searched for.
- specifically, a correlation window containing the pixel whose corresponding point is to be found is set on the base image, the same correlation window as that set for the base image is moved along the epipolar line on the reference image, the correlation between the pixels in the correlation windows of the two images is calculated at each position, and the pixel at the center of the correlation window on the reference image where the correlation is equal to or greater than a predetermined threshold is obtained as the corresponding point.
- as long as there is no optical distortion or the like, the image of a measurement target always lies on mutually corresponding straight lines in the paired images, regardless of its distance and position.
- These straight lines are called epipolar lines, and optical distortions are corrected in advance so that they become straight lines.
- in actual processing, the correlation is computed along these straight lines to determine the positional relationship between the images obtained by the left and right cameras. Furthermore, if the cameras are configured so that the epipolar lines lie at the same horizontal position in both images, the amount of computation in image processing is reduced and processing speed is improved.
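The window search described above can be sketched as follows. This is an illustrative simplification, not the patent's implementation: it assumes horizontally aligned epipolar lines and uses the sum of absolute differences (SAD) as the window (dis)similarity score in place of the correlation-threshold test; images are plain 2D lists of grayscale values.

```python
def sad(base, ref, bx, by, rx, ry, half):
    """Sum of absolute differences between two square windows of radius `half`."""
    total = 0
    for dy in range(-half, half + 1):
        for dx in range(-half, half + 1):
            total += abs(base[by + dy][bx + dx] - ref[ry + dy][rx + dx])
    return total

def find_corresponding_x(base, ref, bx, by, half=1, max_disp=8):
    """Best-matching x on the reference image along the row (epipolar line) `by`."""
    best_x, best_cost = bx, float("inf")
    for d in range(0, max_disp + 1):
        rx = bx - d  # the match shifts toward smaller x for the other eye's image
        if rx - half < 0:
            break
        cost = sad(base, ref, bx, by, rx, by, half)
        if cost < best_cost:
            best_cost, best_x = cost, rx
    return best_x
```

Because the search is confined to one row, restricting the epipolar lines to the same horizontal position turns a 2D search into a 1D one, which is the computation saving noted above.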
- a compound-eye camera is adjusted so that the optical axis centers coincide in the horizontal direction, which enables stereoscopic viewing. Therefore, if camera shake correction is performed by each camera individually and a difference in correction amount arises between the cameras, the initial optical axis center position of each lens shifts and stereoscopic viewing becomes impossible.
- the present invention enables stereoscopic viewing even if camera shake correction is performed with each individual camera of a compound eye camera.
- the present invention provides an imaging apparatus comprising: a plurality of imaging units that capture subject images from different viewpoints; a shake detection unit that detects the shake of each imaging unit; a shake correction unit that corrects, based on the shake of each imaging unit detected by the shake detection unit, the shake of the subject image captured by each imaging unit; a size determination unit that determines a cut-out size for cutting out an output image from the images respectively acquired from the plurality of imaging units, the cut-out size having a predetermined aspect ratio common to the plurality of images and being determined based on the candidate region of minimum size among candidate regions centered on the initial optical axis center; and a cutout unit that cuts out an output image from each of the plurality of images at the common cut-out size determined by the size determination unit, with reference to the initial optical axis center of each imaging unit before shake correction.
- the present invention also provides an imaging apparatus comprising: a plurality of imaging units that capture subject images from different viewpoints; a shake detection unit that detects the shake of each imaging unit; a shake correction unit that corrects, based on the shake of each imaging unit detected by the shake detection unit, the shake of the subject image captured by each imaging unit; a size determination unit that determines a cut-out size for cutting out an output image from the images respectively acquired from the plurality of imaging units by first determining, for each of the plurality of imaging units, a cut-out candidate region centered on the initial optical axis center and included in an invariant imaging region that does not depend on the shake correction of the shake correction unit, and then determining a cut-out size having a predetermined aspect ratio common to the images from the plurality of imaging units based on the minimum of the sizes of the cut-out candidate regions corresponding to the plurality of imaging units; and a cutout unit that cuts out an output image from the image of each of the plurality of imaging units at the common cut-out size, with reference to the initial optical axis center of each imaging unit before shake correction.
- preferably, the size determination unit determines the invariant imaging region for each of the plurality of imaging units based on the common area between two different imaging regions that are maximally displaced in the vertical direction and/or the horizontal direction by the shake correction of the shake correction unit.
- preferably, the size determination unit determines, for each of the plurality of imaging units, the common area between two different imaging regions maximally displaced in the vertical direction and/or the horizontal direction obtained by performing the shake correction of the shake correction unit at least twice, and sets the common area determined for each imaging unit as the invariant imaging region corresponding to that imaging unit.
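As a hypothetical sketch of the invariant-region computation (not the claimed implementation), the common area of two maximally displaced imaging regions is simply the intersection of two axis-aligned rectangles, here given as (x0, y0, x1, y1) in sensor coordinates:

```python
def intersect(a, b):
    """Axis-aligned intersection of two rectangles, or None if disjoint."""
    x0, y0 = max(a[0], b[0]), max(a[1], b[1])
    x1, y1 = min(a[2], b[2]), min(a[3], b[3])
    return (x0, y0, x1, y1) if x0 < x1 and y0 < y1 else None

# Imaging region at its two extreme displacements under shake correction:
shifted_left_up    = (0, 0, 640, 480)
shifted_right_down = (20, 12, 660, 492)
print(intersect(shifted_left_up, shifted_right_down))  # prints (20, 12, 640, 480)
```

Any pixel inside this intersection is imaged regardless of the correction applied, which is what makes the region "invariant".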
- preferably, the imaging apparatus includes an image complementing unit that complements a complementation target region using the image of the effective pixel region of the imaging unit corresponding to that region.
- preferably, when the output image cut out by the cutout unit has a complementation target region that exceeds the specified cutout range of the imaging unit, the imaging apparatus includes a color complementing unit that fills the complementation target region with a predetermined color.
- the imaging apparatus includes a panoramic image creation unit that creates a panoramic image by synthesizing each image based on the initial optical axis center of each output image cut out by the cutout unit.
- preferably, the imaging apparatus includes a stereo matching calculation unit that sets an epipolar line with reference to the initial optical axis center of each output image cut out by the cutout unit and performs stereo matching by calculating the correlation between the output images along the epipolar line.
- the imaging apparatus includes a storage unit that stores each image from each of the plurality of imaging units in association with the initial optical axis center position and the cutout size of each image.
- the imaging apparatus includes a storage unit that stores the output images corresponding to the images acquired from each of the plurality of imaging units at the same shooting time in association with each other in order of shooting time series.
- the imaging apparatus includes a storage unit that stores the coordinates of the complementation target region of the output image and the identification information of the output image having the smallest area of the complementation target region in association with the output image.
- the imaging apparatus includes a parallax adjustment unit that determines the cutout position of the output image so that the parallax between the output images becomes a predetermined amount of parallax while maintaining the cutout size of the output image.
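The parallax adjustment described above can be sketched as follows. This is a hypothetical simplification: parallax is treated as the difference between the left and right cut-out x-offsets (in pixels), and the required change is split between the two windows while their size stays fixed:

```python
def adjust_cutout_positions(left_x, right_x, current_parallax, target_parallax):
    """Return new left/right cut-out x-offsets producing `target_parallax`.

    A positive change moves the left window right and the right window left;
    the cut-out size itself is never altered.
    """
    delta = target_parallax - current_parallax
    # split the change between both windows (opposite directions)
    return left_x + delta // 2, right_x - (delta - delta // 2)

# Shift a zero-parallax pair to a target parallax of 10 px.
print(adjust_cutout_positions(100, 100, 0, 10))  # prints (105, 95)
```

Shifting only the window positions, not their size, preserves the common cut-out size determined earlier while changing the apparent depth of the stereoscopic image.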
- the imaging apparatus includes an output unit that outputs a planar image or a stereoscopic image based on the image cut out by the cutout unit.
- preferably, the imaging apparatus includes a designation unit that receives designation of an enlargement position, and when the enlargement position received by the designation unit reaches a boundary of the region from which the output image is cut out, the cutout unit changes the cutout position of the output image according to the enlargement position.
- the imaging apparatus includes a planar image output unit that outputs an image having the smallest area of the complement target region as a planar image based on the identification information stored in the storage unit.
- preferably, the imaging apparatus includes a color complementing unit that fills a complementation target region stored in the storage unit with a predetermined color, and outputs a planar image or a stereoscopic image based on the image whose color has been complemented by the color complementing unit.
- preferably, the imaging apparatus includes a panoramic image creation unit that cuts out each output image with reference to the initial optical axis center position and cut-out size corresponding to each image stored in the storage unit, and then combines the output images into a panoramic image.
- preferably, the imaging apparatus includes a stereo matching calculation unit that cuts out each output image with reference to the initial optical axis center position and cut-out size corresponding to each image stored in the storage unit, sets an epipolar line for each output image with reference to the initial optical axis center, and performs stereo matching by calculating the correlation of the output images along the epipolar line.
- the present invention provides an imaging method executed by an imaging apparatus comprising a plurality of imaging units that capture subject images from different viewpoints, a shake detection unit that detects the shake of each imaging unit, and a shake correction unit that corrects, based on the detected shake, the shake of the subject image captured by each imaging unit, wherein, in order to determine a cut-out size for cutting out an output image from the images respectively acquired from the plurality of imaging units, a cut-out size having a predetermined aspect ratio common to the plurality of images is determined based on the candidate region of minimum size among candidate regions that are centered on the initial optical axis center and included in the common area between the specified imaging region based on the initial optical axis center of each imaging unit before shake correction and the imaging region of each imaging unit after shake correction.
- the present invention also provides an imaging method executed by an imaging apparatus comprising a plurality of imaging units that capture subject images from different viewpoints, a shake detection unit that detects the shake of each imaging unit, and a shake correction unit that corrects, based on the detected shake, the shake of the subject image captured by each imaging unit, wherein, in order to determine a cut-out size for cutting out an output image from the images respectively acquired from the plurality of imaging units, a cut-out candidate region centered on the initial optical axis center and included in an invariant imaging region that does not depend on the shake correction of the shake correction unit is determined for each of the plurality of imaging units.
- a program for causing the imaging apparatus to execute this imaging method is also included in the present invention.
- a recording medium on which a computer-readable code of the program is recorded is also included in the present invention.
- as the recording medium, various media can be used, such as semiconductor memories, hard disks, and optical or magneto-optical media such as CDs/DVDs.
- according to the present invention, a cut-out size having a predetermined aspect ratio common to the plurality of images acquired from the plurality of imaging units is determined based on the minimum size among the candidate regions centered on the initial optical axis center, and an output image is cut out from each of the plurality of images at the determined common cut-out size, with reference to the initial optical axis center of each imaging unit before shake correction.
- also, based on the minimum of the sizes of the cut-out candidate regions, which do not depend on shake correction, corresponding to the plurality of imaging units, a cut-out size having a predetermined aspect ratio common to the images from the plurality of imaging units is determined, and the output image is cut out from the image of each imaging unit at the determined common cut-out size, with reference to the initial optical axis center of each imaging unit before shake correction.
- FIG. 1A is a block diagram of an imaging apparatus according to the first embodiment
- FIG. 1B is another block diagram of the imaging apparatus according to the first embodiment
- 2A is a front view of the imaging device
- 2B is a rear view of the imaging device
- FIG. 3 is a flowchart of processing according to the first embodiment
- FIG. 4 is a diagram showing an example of the i-th image data and the i-th viewpoint image according to the first embodiment
- FIG. 5A is a block diagram of an imaging apparatus according to the second embodiment
- FIG. 5B is another block diagram of the imaging apparatus according to the second embodiment
- FIG. 6 is a flowchart of processing according to the second embodiment
- FIG. 7 is a diagram showing an example of the i-th image data and the i-th viewpoint image according to the second embodiment
- FIG. 8 is a diagram illustrating a state in which a clipping region Rout that is convenient for processing is acquired from the effective pixel region RA
- FIG. 9A is a block diagram of an imaging apparatus according to the third embodiment
- FIG. 9B is another block diagram of the imaging apparatus according to the third embodiment
- FIG. 10 is a flowchart of the process according to the third embodiment
- FIG. 11A is a diagram showing an example of a complement target area of the first and second viewpoint images
- FIG. 11B is a diagram showing another example of the complement target area of the first and second viewpoint images
- FIG. 12A is a block diagram of an imaging apparatus according to the fourth embodiment
- FIG. 12B is another block diagram of the imaging apparatus according to the fourth embodiment
- FIG. 13 is a flowchart of processing according to the fourth embodiment
- FIG. 14 is a diagram showing an example in which the complement target area is filled
- FIG. 15A is a block diagram of an imaging apparatus according to the fifth embodiment
- FIG. 15B is another block diagram of the imaging apparatus according to the fifth embodiment
- FIG. 16 is a flowchart of processing according to the fifth embodiment
- FIG. 17 shows an example of a panoramic image
- FIG. 18A is a block diagram of an imaging apparatus according to the sixth embodiment
- FIG. 18B is another block diagram of the imaging apparatus according to the sixth embodiment
- FIG. 19 is a flowchart of processing according to the sixth embodiment
- FIG. 20 is a diagram schematically showing a stereo matching operation
- FIG. 21A is a block diagram of an imaging apparatus according to the seventh embodiment
- FIG. 21B is another block diagram of the imaging apparatus according to the seventh embodiment
- FIG. 22 is a flowchart of processing according to the seventh embodiment
- FIG. 23A is a diagram illustrating an example of a method of associating an image with various types of information
- FIG. 23B is a diagram showing another example of a method for associating an image with various types of information
- FIG. 24A is a block diagram of an imaging apparatus according to the eighth embodiment
- FIG. 24B is another block diagram of the imaging apparatus according to the eighth embodiment
- FIG. 25 is a flowchart of processing according to the eighth embodiment
- FIG. 26A is a diagram showing an example of a method for associating an image with various types of information
- FIG. 26B is a diagram illustrating an example of a method of associating an image with various types of information
- FIG. 27 is a flowchart of processing according to the ninth embodiment
- FIG. 28A is a diagram showing an example of a method for associating images
- FIG. 28B is a diagram illustrating an example of a method for associating images
- FIG. 29A is a block diagram of an imaging apparatus according to the tenth embodiment
- FIG. 29B is another block diagram of the imaging apparatus according to the tenth embodiment
- FIG. 30 is a flowchart of processing according to the tenth embodiment
- FIG. 31 is a diagram showing an example of a parallax correction button
- FIG. 32A is a diagram schematically showing a state of parallax correction of a stereoscopic image
- FIG. 32B is another diagram schematically showing a state of parallax correction of a stereoscopic image
- FIG. 32C is another diagram schematically showing the state of parallax correction of a stereoscopic image
- FIG. 32D is another diagram schematically showing a state of parallax correction of a stereoscopic image
- FIG. 33A is a block diagram of an imaging apparatus according to the eleventh embodiment
- FIG. 33B is another block diagram of the imaging apparatus according to the eleventh embodiment
- FIG. 34 is a flowchart of processing according to the eleventh embodiment
- FIG. 35A is a diagram showing a display example of a 3D image
- FIG. 35B shows a display example of a 2D image
- FIG. 36A is a diagram showing a display example of a viewpoint image
- FIG. 36B is a diagram showing a display example of a 2D image
- FIG. 37A is a block diagram of an imaging apparatus according to the twelfth embodiment
- FIG. 37B is another block diagram of the imaging apparatus according to the twelfth embodiment
- FIG. 38 is a flowchart of processing according to the twelfth embodiment
- FIG. 39A is a diagram showing an example of the cutout position of the enlarged region
- FIG. 39B is another diagram showing an example of the cutout position of the enlarged region
- FIG. 39C is another diagram showing an example of the cutout position of the enlarged region
- FIG. 40A is a diagram showing a display example of an enlarged region
- FIG. 40B is another view showing a display example of the enlarged region
- FIG. 40C is another view showing a display example of the enlarged region
- FIG. 41A is a block diagram of an imaging apparatus according to the thirteenth embodiment
- FIG. 41B is another block diagram of the imaging apparatus according to the thirteenth embodiment
- FIG. 42 is a flowchart of processing according to the thirteenth embodiment
- FIG. 43A is a diagram showing another example of image data having the smallest number of pixels in the complement target region
- FIG. 43B is a diagram showing another example of image data in which the number of pixels in the complement target area is minimum
- FIG. 43C is a diagram showing another example of image data having the smallest number of pixels in the complement target region
- FIG. 43D is a diagram showing another example of image data having the smallest number of pixels in the complement target region
- FIG. 44 is a flowchart of processing according to the fourteenth embodiment
- FIG. 45A is a diagram showing an example of a non-pixel area and filling
- FIG. 45B is another diagram showing an example of a non-pixel area and filling
- FIG. 45C is another diagram showing an example of a non-pixel area and filling
- FIG. 45D is another diagram showing an example of a non-pixel area and filling
- FIG. 46 is a flowchart of processing according to the fifteenth embodiment
- FIG. 47 shows an example of search range correction
- FIG. 48 shows an example of a panoramic image
- FIG. 49 is a flowchart of processing according to the sixteenth embodiment
- FIG. 50A is a diagram schematically illustrating stereo matching according to the sixteenth embodiment
- FIG. 50B is another view schematically showing stereo matching according to the sixteenth embodiment
- FIG. 50C is another diagram schematically illustrating stereo matching according to the sixteenth embodiment
- FIG. 1A is a schematic block diagram of an image pickup apparatus 10a including an image pickup element shift type camera shake correction control unit according to the first embodiment of the present invention
- FIG. 1B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the first embodiment of the present invention. Blocks having the same function in the imaging apparatuses 10a/10b of FIGS. 1A and 1B are assigned the same numbers, differing only in the branch letters a and b. Hereinafter, blocks having the same numbers are described together.
- the camera control unit 40 is composed of a CPU and the like, and comprehensively controls the operation of the entire imaging apparatus 10a.
- to the camera control unit 40 are connected n (n is an integer of 2 or more) imaging units, the camera shake correction control unit 15, the zoom button unit 16, the operation unit 17, the monitor unit 18, the recording control unit 19, the recording medium 20, and the like.
- the ROM 41 stores various programs executed by the camera control unit 40, such as a program for executing the imaging method according to the present invention, and the RAM 42 serves as a buffer when the program is executed.
- the ROM 41 may be a rewritable nonvolatile recording medium such as a flash memory.
- as the program for executing the imaging method according to the present invention, a program recorded in advance on a recording medium such as a hard disk or a CD/DVD, or on a server on a network, may be read into the imaging apparatus and used.
- the first to n-th imaging units 11-1 to 11-n have the same configuration, each including an image signal processing unit 14-i.
- the lens 11 is held in a lens barrel and includes a variable power lens and a focus lens.
- the camera control unit 40 controls driving means such as a lens motor in response to input of tele or wide zoom direction information to the zoom button unit 16 (a ring-shaped operation member may be used instead of buttons), and moves the variable power lens along the lens optical axis toward the tele side (extension side) or the wide side (retraction side) to change the focal length (shooting magnification).
- the focus lens of the lens 11 is moved along the lens optical axis, and focus adjustment is performed.
- the position of the focus lens is automatically adjusted so that the focus does not shift as the variable power lens moves.
- the image sensor 12 receives the subject light imaged by the lens 11 and accumulates photocharges corresponding to the amount of received light in the light receiving element.
- the photocharge accumulation and transfer operations of the image sensor 12 are controlled by a timing signal (clock pulse) input from a timing generator (not shown); in the shooting mode, an image signal for one screen is acquired every predetermined period and sequentially input to a correlated double sampling circuit (CDS, not shown).
- the correlated double sampling circuit receives the image signal for one screen input from the image sensor 12 and supplies, to an amplifier (AMP, not shown), R, G, B image data accurately corresponding to the amount of charge accumulated in each light receiving element.
- the AMP amplifies the input image data and inputs it to the A / D converter 13.
- the A / D converter 13 converts the input image data from analog to digital.
- in the case of n = 2, for example, the image pickup signal of the first image sensor 23 becomes the first image data (right-eye image data) via the CDS, AMP, and A/D converter 13.
- the i-th image data output from each A/D converter 13-i is input to the image signal processing unit 14-i.
- the image signal processing unit 14-i performs various image processing such as gradation conversion, white balance correction, and γ correction on each image data.
- the i-th image data output from the image signal processing unit 14-i is input to the frame memory 43.
- the frame memory 43 is a working memory that temporarily stores i-th image data.
- the stereoscopic image processing circuit 45 synthesizes the i-th viewpoint images cut out from the i-th image data stored in the frame memory 43 into stereoscopic image data for stereoscopic display on the monitor unit 18.
- the monitor unit 18 displays the stereoscopic image data synthesized by the stereoscopic image processing circuit 45 as a through image (a stereoscopic image that is continuously updated; hereinafter sometimes referred to as a through stereoscopic image).
- the recording control unit 19 performs compression processing on the i-th image data or the i-th viewpoint image stored in the frame memory 43 in a compression format such as the JPEG method.
- the recording control unit 19 records each compressed image data on a recording medium 20 such as a memory card.
- the recording control unit 19 reads the i-th image data recorded on the recording medium 20, performs decompression processing, and stores the result in the frame memory 43.
- the i-th image data stored in the frame memory 43 is converted into stereoscopic image data by the stereoscopic image processing circuit 45 and then reproduced and displayed on the monitor unit 18.
- the monitor unit 18 includes a parallax barrier display layer on the surface thereof.
- the monitor unit 18 generates, on the parallax barrier display layer, a parallax barrier having a pattern in which light-transmitting portions and light-shielding portions are alternately arranged at a predetermined pitch, and displays strip-shaped fragments of the left and right images alternately arranged on the image display surface below it, thereby allowing the observer to perceive a stereoscopic effect.
- the monitor unit 18 can also display a two-dimensional image to the observer by outputting only the i-th image data acquired from a desired i-th imaging unit 3-i to the frame memory 43.
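The alternating-strip display described above amounts to a column interleave of the left and right images. The following is an illustrative sketch only, not code from the patent: the function name, the one-pixel strip pitch, and the use of numpy arrays as images are all assumptions, shown for the n = 2 case.

```python
import numpy as np

def interleave_for_barrier(left, right):
    """Interleave left/right images into alternating vertical strips
    (one-pixel pitch assumed) for a parallax-barrier display layer."""
    assert left.shape == right.shape
    out = np.empty_like(left)
    out[:, 0::2] = left[:, 0::2]   # even columns come from the left image
    out[:, 1::2] = right[:, 1::2]  # odd columns come from the right image
    return out
```

A real panel would match the strip pitch to the barrier pattern; the interleaving idea is unchanged.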
- the cut-out position / size determining unit 51 can be configured by an arithmetic processing unit such as a CPU, and the initial optical axis center position memory unit 52 can be configured by a storage medium such as a ROM.
- the shake correction control unit 15-ia of the imaging apparatus 10a shown in FIG. 1A includes a drive unit and a vibration detection unit corresponding to the image sensor 12-i.
- the drive unit can be composed of a plunger, a piezoelectric element, or the like.
- the vibration detection unit can be configured by a gyro sensor, an acceleration sensor, a speed sensor, or the like that detects the amount and direction of vibration generated in a three-dimensional direction.
- the shake correction control unit 15-ia controls the drive unit to swing the image sensor 12-i within the XY plane, which is parallel to the imaging surface of the image sensor 12-i and orthogonal to the optical axis of the lens 11-ia, so as to cancel the shake of each imaging unit 3-i whose amount and direction are detected by the vibration detection unit, thereby correcting the shake.
- the camera shake correction control unit 15-ib of the imaging apparatus 10b shown in FIG. 1B includes a vibration detection unit, a correction optical system (anti-vibration lens) for correcting camera shake, and a drive unit therefor.
- the anti-vibration lens is movably supported in an XY plane parallel to the imaging surface of the image sensor 12 orthogonal to the imaging optical axis of the corresponding lens 11-ib.
- the camera shake correction control unit 15-ib includes a vibration detection unit corresponding to the image sensor 12-i.
- the camera shake correction control unit 15-ib drives the correction optical system via the drive unit so as to cancel the shake of each imaging unit 3-i whose amount and direction are detected by the vibration detection unit, thereby preventing camera shake on the imaging surface of the image sensor 12-i.
- FIG. 3 shows a flowchart of processing executed by the imaging apparatus 10a or 10b according to the first embodiment.
- a program according to the first embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41.
- the camera control units 40a and 40b may be collectively referred to as the camera control unit 40; the execution subject of the following processing is the camera control unit 40a in the case of the imaging device 10a, and the camera control unit 40b in the case of the imaging device 10b.
- the control targets of the camera control unit 40a and the camera control unit 40b are a block of the imaging device 10a and a block of the imaging device 10b, respectively.
- the camera control unit 40 performs normal shake correction in response to an instruction to set the shooting mode from the operation unit 17.
- the shake is corrected by swinging the image sensor 12-i or the anti-vibration lens so as to cancel the shake of each imaging unit 3-1 to 3-n whose amount and direction are detected by the vibration detection unit.
- the camera control unit 40 captures the multi-view i-th image data output from each imaging unit 3-i during normal shake correction into the frame memory 43.
- the camera control unit 40 reads the initial optical axis center position of the lens 11-i stored in the initial optical axis center position memory unit 52. Based on this initial optical axis center position and on the shake correction amount and correction direction applied by the drive unit, the camera control unit 40 then calculates the corrected position, i.e., the pixel position in the shake-corrected i-th image data that corresponds to the initial optical axis center position before shake correction. The shake correction amount and direction are assumed to have been converted by the camera control unit 40 into pixel units on the XY plane of the image sensor 12-i.
- the camera control unit 40 sets, as a candidate area, the largest common area that is centered on the corrected optical axis center position, has the aspect ratio (x:y) of the monitor unit 18 stored in the ROM 41, and is included both in the imaging pixel area obtained when there is no shake and in the shake-corrected i-th image data. The camera control unit 40 then calculates the length of the perpendicular from the corrected optical axis center position to each side of the outer edge of the candidate area in the X and Y directions, and obtains the shortest distance Li, which is the minimum of the distances to those sides.
- the camera control unit 40 determines whether the shortest distance Li has been obtained from every i-th image data; if not, ST4 is repeated. When it determines that the shortest distance Li has been obtained from each i-th image data, the camera control unit 40 obtains the minimum value Lmin among the shortest distances Li.
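The corrected position and the shortest distance Li of the last few steps can be sketched as follows. The function names and the (left, top, right, bottom) rectangle convention are illustrative assumptions, not from the patent; distances are in pixel units, as the text assumes.

```python
def corrected_position(initial_center, shift_px):
    """Pixel position in the shake-corrected image that corresponds to the
    initial optical axis center (correction already converted to pixels)."""
    cx, cy = initial_center
    dx, dy = shift_px
    return (cx + dx, cy + dy)

def shortest_distance(center, rect):
    """Li: minimum of the perpendicular distances from `center` to the four
    sides of the candidate area `rect` = (left, top, right, bottom)."""
    cx, cy = center
    left, top, right, bottom = rect
    return min(cx - left, right - cx, cy - top, bottom - cy)

# Lmin is then min(shortest_distance(center_i, rect_i)) over all i.
```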
- the camera control unit 40 reads the aspect ratio (x: y) of the monitor unit 18 from the ROM 41.
- the camera control unit 40 determines whether the minimum value Lmin obtained in ST5 is the distance from the corrected position to a horizontal side parallel to the X direction or the distance from the corrected position to a vertical side parallel to the Y direction. If Lmin is a distance to a horizontal side, the process proceeds to ST8; if Lmin is a distance to a vertical side, the process proceeds to ST8.1.
- the camera control unit 40 controls the cutout position / size determination unit 51 so as to calculate the cutout size of the stereoscopic display image. That is, the cutout position / size determining unit 51 sets the cutout size in the x direction to 2 ⁇ Lmin and the cutout size in the y direction to (y / x) ⁇ (2 ⁇ Lmin).
- the camera control unit 40 controls the cutout position / size determination unit 51 so as to calculate the cutout size of the stereoscopic display image. That is, the cutout position / size determining unit 51 sets the cutout size in the x direction to (x / y) ⁇ (2 ⁇ Lmin) and the cutout size in the y direction to 2 ⁇ Lmin.
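ST8 and ST8.1 amount to scaling 2 × Lmin by the monitor aspect ratio. A minimal sketch, where the function name and the boolean flag distinguishing the two branches are assumptions for illustration:

```python
def cutout_size(lmin, aspect_x, aspect_y, lmin_is_horizontal_side):
    """Derive the cutout width/height from Lmin and the monitor
    aspect ratio (x:y), mirroring ST8 / ST8.1."""
    if lmin_is_horizontal_side:
        # ST8: x direction is 2*Lmin, y direction is (y/x)*(2*Lmin)
        w = 2 * lmin
        h = (aspect_y * 2 * lmin) / aspect_x
    else:
        # ST8.1: x direction is (x/y)*(2*Lmin), y direction is 2*Lmin
        w = (aspect_x * 2 * lmin) / aspect_y
        h = 2 * lmin
    return w, h
```

For example, with Lmin = 30 pixels and a 4:3 monitor, ST8 gives a 60 × 45 cutout and ST8.1 gives 80 × 60.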
- the camera control unit 40 obtains the i-th viewpoint image by cutting out, from the i-th image data, the rectangular region centered on the initial optical axis center position and having the size calculated in ST8 or ST8.1.
- the stereoscopic image processing circuit 45 outputs a stereoscopic image to the monitor unit 18 based on the i-th viewpoint images.
- the processes of ST1 to ST10 are repeated until there is an imaging instruction such as pressing of the shutter button.
- the image continuously output to the monitor unit 18 by repeating the processes of ST1 to ST10 until the shutter button is pressed is referred to as a through image or live view.
- Fi-1 is a real space
- Fi-2 is an imaging pixel area without shake
- Fi-3 is an imaging pixel area after shake correction
- Fi-4 is the i-th viewpoint image cut out based on the initial optical axis center position.
- the XY coordinate system is determined based on the real space Fi-1, where X represents the horizontal direction and Y represents the vertical direction.
- the shake correction amount in the X direction between F1-1 and F1-3 is x′R pixels
- the shake correction amount in the Y direction between F1-1 and F1-3 is y′R pixels.
- the aspect ratio (x:y) of the monitor unit 18 is 3:4.
- since Lmin here is the distance to a horizontal side, per ST8 the cutout size in the x direction of the first and second image data is 2 × Lmin, and the cutout size in the y direction is (y/x) × (2 × Lmin) = (4/3) × (2 × Lmin), i.e., (8/3) × Lmin.
- viewpoint images are cut out from a plurality of image data having different shake correction amounts with a common size and aspect ratio and without displacing the optical axis center position. For this reason, after shake correction it is possible to generate a stereoscopic image of the same quality as before shake correction.
- FIG. 5A is a schematic block diagram of an image pickup apparatus 10a including an image sensor shift type camera shake correction control unit according to the second embodiment of the present invention
- FIG. 5B is a schematic block diagram of the imaging apparatus 10b including an optical camera shake correction control unit according to the second embodiment of the present invention.
- blocks having the same function between the imaging devices 10a/10b in both figures, or in the embodiment already described, are given the same numbers except for branch numbers; only the blocks with numbers other than these are explained below.
- the imaging devices 10a / 10b include a cut-out size memory unit 53 configured with a rewritable storage medium such as a RAM.
- FIG. 6 shows a flowchart of correction processing executed by the imaging apparatus 10a or 10b according to the second embodiment.
- a program according to the second embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41.
- the camera control unit 40 proceeds to ST2 in response to an instruction to set the shooting mode from the operation unit 17.
- the camera control unit 40a or 40b controls the corresponding camera shake correction control unit 15-ia or 15-ib to execute shake correction twice (or more), and during each shake correction, takes the multi-viewpoint i-th image data output in synchronization from each imaging unit 3-i into the frame memory 43.
- This shake correction is performed over the maximum drive range in the X direction and Y direction of each drive unit regardless of the shake detection of the vibration detection unit.
- the start and end timings of the shake correction are arbitrary; for example, it may be performed during a predetermined period, such as until a predetermined time elapses after the imaging device 10 is activated.
- the camera control unit 40 reads the initial optical axis center position of the lens 11-ia or 11-ib stored in the initial optical axis center position memory unit 52.
- the camera control unit 40 reads the aspect ratio (x: y) of the monitor unit 18 from the ROM 41.
- the camera control unit 40 determines the cutout candidate area from the two sets of shake-corrected i-th image data, the initial optical axis center position of the lens 11-ia or 11-ib, and the aspect ratio of the monitor unit 18.
- the camera control unit 40 obtains the intersections of the outer edges of the two sets of shake-corrected i-th image data, and determines, for each i-th image data, the common region Rc(i), which is a rectangular region having those intersections as diagonal points.
- the camera control unit 40 determines the largest rectangular region included in the common region Rc (i) and having the aspect ratio of the monitor unit 18 as a cutout candidate region of the i-th image data. Since the common area Rc (i) is a common part of different maximum driving ranges corresponding to different shake corrections, it is an invariable imaging area where image data can be obtained without any shake correction. That is, image data is obtained in the common region Rc (i) without depending on shake correction.
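The common region Rc(i) is simply the intersection rectangle of the two shake-corrected image areas. A sketch under the assumption that areas are (left, top, right, bottom) rectangles in sensor pixel coordinates (the function name and `None` convention are illustrative):

```python
def common_region(rect_a, rect_b):
    """Rc(i): rectangle whose diagonal points are the intersections of the
    outer edges of two shake-corrected image areas."""
    left = max(rect_a[0], rect_b[0])
    top = max(rect_a[1], rect_b[1])
    right = min(rect_a[2], rect_b[2])
    bottom = min(rect_a[3], rect_b[3])
    if right <= left or bottom <= top:
        return None  # no overlap: no invariant imaging area exists
    return (left, top, right, bottom)
```

Because Rc(i) is the common part of the maximum drive ranges, image data is obtained there regardless of how the shake correction moves the area.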
- the camera control unit 40a or 40b reads the initial optical axis center position of the lens 11-ia or 11-ib stored in the initial optical axis center position memory unit 52. Then, the camera control unit 40a or 40b calculates the distances from the initial optical axis center position to each side of the outer edge of the cutout candidate region in the X and Y directions, and obtains the shortest distance Li among them.
- the camera control unit 40 determines whether the shortest distance Li has been obtained from every cutout candidate region; if not, ST3 and ST4 are repeated. When it determines that the shortest distance Li has been obtained, the camera control unit 40 obtains the minimum value Lmin among the shortest distances Li.
- the camera control unit 40 reads the aspect ratio (x: y) of the monitor unit 18 from the ROM 41.
- the camera control unit 40 determines whether the minimum value Lmin is the distance from the initial optical axis center position to a horizontal side parallel to the X direction or the distance from the initial optical axis center position to a vertical side parallel to the Y direction. If it is a distance to a horizontal side, the process proceeds to ST8; if it is a distance to a vertical side, the process proceeds to ST8.1.
- the camera control unit 40 calculates the cut-out size of the image for stereoscopic display. That is, the cutout size in the x direction is 2 ⁇ Lmin, and the cutout size in the y direction is (y / x) ⁇ (2 ⁇ Lmin).
- the camera control unit 40 calculates a cut-out size of an image for stereoscopic display. That is, the cutout size in the x direction is (x / y) ⁇ (2 ⁇ Lmin), and the cutout size in the y direction is 2 ⁇ Lmin.
- the camera control unit 40 stores, in the cutout size memory unit 53, the rectangular area centered on the initial optical axis center position and having the size calculated in ST8 or ST8.1.
- the camera control unit 40 performs shake correction on the first to n-th imaging units 3-1 to 3-n.
- the shake correction here refers to normal shake correction, that is, correcting the shake by moving the image sensor 12-i or the anti-vibration lens so as to cancel the shake of each imaging unit 3-1 to 3-n whose amount and direction are detected by the vibration detection unit.
- the camera control unit 40 captures, in synchronization, the multi-viewpoint first to n-th image data output from the imaging units 3-1 to 3-n after the correction into the frame memory 43.
- the camera control unit 40 obtains the first to n-th viewpoint images by cutting out, from the shake-corrected first to n-th image data, the rectangular areas having the center position and size stored in the cutout size memory unit 53. Then, the stereoscopic image processing circuit 45 outputs a stereoscopic image to the monitor unit 18 based on the first to n-th viewpoint images. The processes of ST1 to ST10 are repeated until there is an imaging instruction or the shooting mode is canceled. As a result, through images based on sequentially captured image data are continuously displayed on the monitor unit 18.
- Fi-1 is the real space
- Fi-2 is the imaging pixel area when there is no shake
- Fi-4 is the i-th viewpoint image cut out based on the initial optical axis center position.
- the XY coordinate system is determined based on the real space Fi-1.
- the amount of deviation (in pixels) in the X and Y directions between the imaging pixel area F1-3-1 after the first shake correction and the imaging pixel area F1-2 when there is no shake is expressed as (x′R1, y′R1).
- the amount of deviation (in pixels) in the X and Y directions between the imaging pixel area F1-3-2 after the second shake correction and the imaging pixel area F1-2 when there is no shake is expressed as (x′R2, y′R2).
- the X coordinate of the lower right diagonal point of the imaging pixel region F1-2 when there is no shake is xR0_max
- the Y coordinate of the upper left diagonal point of F1-2 is yR0_max.
- the lower left intersection coordinates of F1-3-1 and F1-3-2 are (x′R2, y′R1), and the upper right intersection coordinates are (xR0_max − x′R1, yR0_max − y′R2).
- the common region Rc (1) is determined with the two intersections as diagonal points.
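Plugging hypothetical numbers into these intersection formulas makes the construction concrete. All values below (the two deviations and the diagonal coordinates of F1-2) are invented for illustration; they are not from the patent.

```python
# Hypothetical deviations (in pixels) of the two shake-corrected areas
x_r1, y_r1 = 6, 2          # after the first shake correction
x_r2, y_r2 = 3, 5          # after the second shake correction
x_r0_max, y_r0_max = 640, 480  # diagonal coordinates of F1-2 (no shake)

# Diagonal points of the common region Rc(1), per the formulas above
lower_left = (x_r2, y_r1)                          # (x'R2, y'R1)
upper_right = (x_r0_max - x_r1, y_r0_max - y_r2)   # (xR0_max - x'R1, yR0_max - y'R2)
```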
- Lmin depends on the magnitude of each individual shake correction in ST1, so the cutout size of the i-th viewpoint image would change every time shake correction is performed, and the display pixels of the stereoscopic image could change.
- in the present embodiment, however, the cutout size calculated based on the Lmin determined by the shake correction performed in advance is stored in the cutout size memory unit 53, and the i-th viewpoint images having the stored common center position and size are output to the monitor unit 18; therefore, the display pixels of the through image do not change regardless of the magnitude of the shake correction performed by any i-th imaging unit 3-i.
- control is performed so that shake correction is performed twice (or more), and the multi-viewpoint i-th image data output from each imaging unit 3-i during the shake correction is taken into the frame memory 43; however, this is performed only to obtain the maximum drive range (maximum shake correction amount) in the vertical (Y) and horizontal (X) directions.
- if the maximum drive range in the X and Y directions of each i-th imaging unit 3-i is stored in the ROM 41, or in a rewritable nonvolatile storage medium such as an EEPROM or flash memory, the common area Rc(i) for each i-th imaging unit 3-i can be determined based on those values, and a cutout candidate area can be determined for each i-th imaging unit 3-i. In this case, it is not essential to perform the shake correction twice.
- alternatively, shake correction may be performed once or twice at an arbitrary timing and for an arbitrary period, such as when the imaging device 10 is started, and the maximum drive range of each imaging unit 3-i obtained as a result may be stored in a rewritable medium such as an EEPROM; the common area Rc(i) for each i-th imaging unit 3-i, and hence each cutout candidate area, may then be determined based on those values.
- <Third Embodiment> As illustrated in FIG. 8, if an image I of the maximum size were read from the entire effective pixel area RA of the image sensor 12 at the time of shooting, generating a video signal for sequentially outputting the captured images to the monitor unit 18 or the like would require considerably high-speed signal processing and would increase the circuit scale. Therefore, in a normal camera, default cutout area information TR is stored in advance in the ROM 41 or the like, and at the time of shooting, a cutout area Rout that is convenient for processing is selected from the effective pixel area RA according to the default cutout area information and read out.
- if the cutout size/position of the i-th viewpoint image calculated as in the second embodiment is stored in the cutout size memory unit 53 and the i-th viewpoint image having the stored common center position and size is cut out, the i-th viewpoint image may include a region without image information that does not fall within the default cutout range TR. Therefore, in the present embodiment, the portion of the i-th viewpoint image having no image information is captured from the effective pixel region outside the default cutout range TR and complemented.
- FIG. 9A is a schematic block diagram of an image pickup apparatus 10a including an image sensor shift type camera shake correction control unit according to the third embodiment of the present invention
- FIG. 9B is a schematic block diagram of the imaging apparatus 10b including an optical camera shake correction control unit according to the third embodiment of the present invention.
- blocks having the same function between the imaging devices 10a/10b in both figures, or in the already described embodiments, are assigned the same numbers except for branch numbers.
- the imaging devices 10a/10b include a complement target area calculation unit 70 and a region complementing unit 55, each configured by an arithmetic device such as a CPU.
- FIG. 10 shows a flowchart of processing executed by the imaging apparatus 10a or 10b according to the third embodiment.
- a program according to the third embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41.
- the camera control unit 40 proceeds to ST2 in response to accepting the start of shooting from the operation unit 17.
- the camera control unit 40 receives from the operation unit 17 whether or not the user sets the cutout size. If a selection for setting the cutout size by the user is accepted, the process proceeds to ST3.1. If a selection for not setting the cutout size by the user is accepted, the process proceeds to ST3.
- the camera control unit 40 sets the cutout size stored in the cutout size memory unit 53 in ST9 of the second embodiment as the cutout size of the i-th viewpoint image.
- the camera control unit 40 receives from the operation unit 17 the setting of the cutout size of the i-th viewpoint image, with the initial optical axis center position of the lens 11-ia or 11-ib stored in the initial optical axis center position memory unit 52 as the center of the cutout region.
- the camera control unit 40 determines a cutout candidate area from the initial optical axis center position of the lens 11-ia or 11-ib and the cutout size set in ST3 or ST3.1, and stores its coordinates in the cutout size memory unit 53.
- ST5 to ST7 are the same as ST10 to ST12 of the second embodiment. That is, the camera control unit 40 obtains the first to n-th viewpoint images by cutting out, from the shake-corrected first to n-th image data, the rectangular areas having the center position and size stored in the cutout size memory unit 53.
- however, the cutout sources of the first to n-th viewpoint images are not necessarily included in the default cutout range of the first to n-th image data; that is, pixel information may be missing in the portion of the cutout region corresponding to the first to n-th viewpoint images that is not included in the default cutout range. Therefore, the camera control unit 40 stores the i-th residual data, which is the image data remaining after the i-th viewpoint image is cut out from the i-th image data, in the frame memory 43 or the RAM 42.
- the camera control unit 40 determines, for each of the first to n-th viewpoint images, whether there is a complement target area that is an area without pixel information. This can be determined by whether or not the complement target area calculation unit 70 has extracted an area without color information from each i-th viewpoint image. If there is a region to be complemented for a certain i-th viewpoint image, the process proceeds to ST9. If not, the process proceeds to ST10 without proceeding to ST9.
- BL-1 and BL-2 indicate examples of the complement target areas of the first and second viewpoint images, respectively.
- the camera control unit 40 controls the region complementing unit 55 so as to cut out the image area corresponding to the complement target area from the i-th residual data and superimpose it on the complement target area of each i-th viewpoint image, thereby complementing the area without pixel information.
- FIG. 11B shows examples of the first and second viewpoint images in which the first and second residual data, respectively, have been combined into the complement target areas.
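The complementing step above can be sketched as a masked copy from the residual data kept at cutout time. The mask-based interface below is an assumption for illustration (the complement target area calculation unit effectively produces such a mask of pixels without color information):

```python
import numpy as np

def complement_region(viewpoint_img, residual_img, mask):
    """Fill the complement target area (mask == True) of a viewpoint image
    with the co-located pixels of the residual data kept at cutout time."""
    out = viewpoint_img.copy()
    out[mask] = residual_img[mask]  # copy only the missing pixels
    return out
```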
- the camera control unit 40 causes the stereoscopic image processing circuit 45 to continuously output a through image to the monitor unit 18 based on each i-th viewpoint image whose region without pixel information has been complemented.
- in this way, even if an area without pixel information is generated in a viewpoint image, the area is complemented by the pixel information of the region truncated at cutout. Accordingly, it is possible to secure a surplus (margin) of the cutout range that can cope with a shift of the cutout range due to significant camera shake correction, and to improve the display resolution of the stereoscopic image.
- the center of the optical axis of the image before and after camera shake correction does not change, and a stereoscopic image can be output with the same quality as when there is no camera shake correction.
- FIG. 12A is a schematic block diagram of an image pickup apparatus 10a including an image sensor shift type camera shake correction control unit according to the fourth embodiment of the present invention
- FIG. 12B is a schematic block diagram of the imaging apparatus 10b including an optical camera shake correction control unit according to the fourth embodiment of the present invention.
- blocks having the same function between the imaging devices 10a/10b in both figures, or in the already described embodiments, are assigned the same numbers except for branch numbers.
- the imaging devices 10a/10b include a complement target area calculation unit 70 and a filling unit 56, each configured by an arithmetic device such as a CPU.
- FIG. 13 shows a flowchart of correction processing executed by the imaging apparatus 10a or 10b according to the fourth embodiment.
- a program according to the fourth embodiment for causing the camera control unit 40a or 40b to execute this process is stored in the ROM 41.
- the camera control unit 40 proceeds to ST2 in response to accepting the start of shooting from the operation unit 17.
- the camera control unit 40 receives from the operation unit 17 whether or not the user sets a cutout size and a fill color of a portion without pixel information. If a selection that the user performs the setting is accepted, the process proceeds to ST3.1. If a selection that the user does not perform the setting is accepted, the process proceeds to ST3.
- the camera control unit 40 sets the default fill color stored in advance in the ROM 41 as the fill color of the i-th viewpoint image.
- the camera control unit 40 receives a selection of a fill color of the complement target area from the operation unit 17.
- a color palette of color samples may be displayed on the monitor unit 18 and a desired color may be designated from among them.
- ST4 to ST8 are the same as ST4 to ST8 of the third embodiment.
- the camera control unit 40 controls the filling unit 56 so as to superimpose the fill color set in ST3 or ST3.1 on the complement target area of each i-th viewpoint image, thereby complementing the area without pixel information.
- the camera control unit 40 controls the stereoscopic image processing circuit 45 to continuously output the through image to the monitor unit 18 based on each i-th viewpoint image on which the fill color is superimposed.
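The fill step differs from the complementing of the third embodiment only in that a constant color (the default color from the ROM 41, or a user-selected palette color) is written into the complement target area instead of residual image data. A minimal sketch with an assumed mask-based interface:

```python
import numpy as np

def fill_region(viewpoint_img, mask, color):
    """Fill the complement target area (mask == True) of an RGB viewpoint
    image with a fixed color (default or user-selected)."""
    out = viewpoint_img.copy()
    out[mask] = color  # broadcast the RGB color over the masked pixels
    return out
```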
- FIG. 14(a) shows the default color or the selected color C1 for the complement target areas BL-1 and BL-2 of the first and second viewpoint images
- FIG. 14(b) shows the complement target areas BL-1 and BL-2 of the first and second viewpoint images
- FIG. 14(c) shows an example in which the complement target areas BL-1 and BL-2 are filled with the color C2.
- the area without pixel information is filled with a default color or an arbitrary color selected by the user. Accordingly, it is possible to secure a surplus (margin) of the cutout range that can cope with a shift of the cutout range due to significant camera shake correction, and to improve the display resolution of the stereoscopic image. Further, the center of the optical axis of the image before and after camera shake correction does not change, and a stereoscopic image can be output with the same quality as when there is no camera shake correction. In addition, since the filled area is displayed on the monitor unit 18, the user can recognize that the camera shake correction has been performed to the limit.
- since the optical axis center positions of the i-th viewpoint images cut out in the first to fourth embodiments are aligned, in particular their optical axis center positions in the Y-axis direction are the same, a horizontally long panoramic image can easily be created by joining the i-th viewpoint images side by side.
- FIG. 15A is a schematic block diagram of an image pickup apparatus 10a including an image pickup device shift type camera shake correction control unit according to the fifth embodiment of the present invention
- FIG. 15B is a schematic block diagram of the imaging apparatus 10b including an optical camera shake correction control unit according to the fifth embodiment of the present invention.
- blocks having the same function between the imaging devices 10a/10b in both figures, or in the already described embodiments, are assigned the same numbers except for branch numbers.
- the imaging devices 10a / 10b include a panorama composition calculation unit 57 that is configured by a calculation device such as a CPU.
- FIG. 16 shows a flowchart of processing executed by the imaging apparatus 10a or 10b according to the fifth embodiment.
- a program according to the fifth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41.
- the camera control unit 40 proceeds to ST2 in response to accepting the start of the operation from the operation unit 17.
- the camera control unit 40 receives from the operation unit 17 a selection as to whether or not to create a panoramic image from the extracted i-th viewpoint image. If a selection to create a panoramic image from the cut-out i-th viewpoint image is accepted, the process proceeds to ST3. If a selection not to create a panoramic image from the cut-out i-th image data is received, the process proceeds to ST3.1.
- the camera control unit 40 determines the cut-out size and stores it in the cut-out size memory unit 53 as in the second to fourth embodiments.
- the camera control unit 40 performs camera shake correction in the same manner as ST10 to ST11 of the second embodiment or ST5 and ST6 of the third and fourth embodiments, and outputs the i-th image data from the i-th imaging unit 3a after camera shake correction to the frame memory 43.
- in ST5 to ST7, the i-th viewpoint image is cut out from the i-th image data in the same manner as ST12 of the second embodiment or ST5 to ST7 of the third or fourth embodiment.
- FIG. 17A shows an example of the first and second image data
- FIG. 17B shows an example of the first and second viewpoint images cut out from the first and second image data.
- if the camera control unit 40 has received in ST2 a selection to create a panoramic image from the cut-out i-th viewpoint images, it controls the panorama synthesis calculation unit 57 to synthesize a panoramic image by joining the i-th viewpoint images cut out in ST7 so that their optical axis center positions in the Y-axis direction are aligned.
- otherwise, the camera control unit 40 controls the panorama synthesis calculation unit 57 to synthesize a panoramic image by joining the i-th image data acquired in ST3.1 so that their subjects are aligned.
- the panorama composition calculation unit 57 outputs the synthesized panorama image to the monitor unit 18.
- since the i-th image data is acquired sequentially and continuously, the panoramic image is also continuously output to the monitor unit 18 (through panoramic image).
- part (c) of FIG. 17 shows an example of a panoramic image synthesized from the first and second viewpoint images.
- the optical axis center positions in the X-axis direction of the i-th viewpoint images cut out in the first to fourth embodiments are also aligned. Therefore, if the i-th viewpoint images are joined so that their optical axis center positions in the X-axis direction are aligned, a vertically long panoramic image can easily be created.
- the camera control unit 40 continuously outputs the panorama image synthesized by the panorama synthesis calculation unit 57 to the monitor unit 18 based on the sequentially acquired i-th viewpoint images.
- since the panoramic image is created so that the optical axis centers of the images after camera shake correction are aligned, the accuracy of panoramic image synthesis from the i-th viewpoint images after camera shake correction can be made equivalent to that of synthesis from the original i-th image data without camera shake correction. The amount of calculation can also be reduced.
- FIG. 18A is a schematic block diagram of an imaging apparatus 10a including an image sensor shift type camera shake correction control unit according to the sixth embodiment of the present invention
- FIG. 18B is a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the sixth embodiment of the present invention. Blocks having the same function between the imaging devices 10a/10b in both figures, or in the already described embodiments, are assigned the same numbers except for branch numbers.
- the imaging devices 10a / 10b include a stereo matching calculation unit 58 configured by a calculation device such as a CPU.
- the stereo matching performed by the stereo matching calculation unit 58 is the same as in Patent Documents 4 and 5. That is, the stereo matching calculation unit 58 moves, on the reference image (for example, the second viewpoint image or the second image data), the same correlation window as that set on the base image (for example, the first viewpoint image or the first image data) along the epipolar line, calculates for each position the correlation of the pixels inside the correlation window on the two images, and obtains, as the corresponding point of the pixel, the pixel at the center position of the correlation window at which the correlation on the reference image is equal to or greater than a predetermined threshold.
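A minimal sketch of this correlation-window search, substituting a SAD cost minimum for the patent's correlation threshold; the function name, the leftward search direction, and the fixed disparity range are assumptions:

```python
import numpy as np

def find_corresponding_x(base, ref, y, x, win=3, max_disp=16):
    """Move a correlation window along the horizontal epipolar line (row y)
    of the reference image and return the column whose window best matches
    the window centered at (x, y) in the base image. Sum of absolute
    differences (SAD) stands in for the patent's correlation measure."""
    h = win // 2
    patch = base[y - h:y + h + 1, x - h:x + h + 1].astype(np.int32)
    best_x, best_cost = x, np.inf
    for d in range(max_disp + 1):              # candidate disparities
        xr = x - d                             # candidate corresponding column
        if xr - h < 0:
            break
        cand = ref[y - h:y + h + 1, xr - h:xr + h + 1].astype(np.int32)
        cost = int(np.abs(patch - cand).sum())  # lower SAD = higher correlation
        if cost < best_cost:
            best_cost, best_x = cost, xr
    return best_x
```

In practice the threshold test of the patent maps onto rejecting matches whose best cost is too high; that rejection branch is omitted here for brevity.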
- FIG. 19 shows a flowchart of processing executed by the imaging apparatus 10a or 10b according to the sixth embodiment.
- a program according to the sixth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41.
- the camera control unit 40 proceeds to ST2 in response to accepting the start of stereo matching from the operation unit 17.
- the camera control unit 40 receives from the operation unit 17 a selection as to whether or not to perform stereo matching on the cut-out i-th viewpoint images. If a selection to perform stereo matching on the cut-out i-th viewpoint images is received, the process proceeds to ST3. If a selection to perform stereo matching on the uncut i-th image data is received, the process proceeds to ST3.1.
- ST3 to ST7 are the same as ST3 to ST7 of the fifth embodiment, respectively.
- the camera control unit 40 controls the stereo matching calculation unit 58 so as to perform stereo matching from the extracted i-th viewpoint image.
- the stereo matching calculation unit 58 first sets, as the epipolar line for the ibase-th viewpoint image (ibase is an arbitrary one of 1 to n) and the iref-th viewpoint image (iref is any of 1 to n, iref ≠ ibase), a line passing horizontally through the optical axis center coordinates of each viewpoint image.
- the stereo matching calculation unit 58 then moves the same correlation window as that set on the ibase-th viewpoint image along this epipolar line on the iref-th viewpoint image, calculates for each position the correlation of the pixels inside the correlation window on the two images, and obtains, as the corresponding point, the pixel at the center position of the correlation window at which the correlation on the reference image is equal to or greater than a predetermined threshold.
- the camera control unit 40 controls the stereo matching calculation unit 58 to perform stereo matching from the i-th image data. Since the shift of the optical axis position between the i-th viewpoint images due to camera shake is not corrected, the epipolar line for stereo matching is not always set along the optical axis.
- the camera control unit 40 applies the principle of triangulation to the positional difference (parallax) between each mutually corresponding pair of a pixel on the base image and a pixel on the reference image determined by stereo matching, measures the distance from the base camera or reference camera to the point on the subject corresponding to those pixels, and controls the stereoscopic image processing circuit 455 to generate a distance image representing the three-dimensional shape of the subject.
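The triangulation step can be illustrated with the standard parallel-camera relation, depth = focal length × baseline / parallax; this sketch assumes rectified cameras and a pinhole model, and is not taken verbatim from the patent:

```python
def depth_from_parallax(focal_px, baseline, parallax_px):
    """Triangulation for two parallel cameras: a point seen with
    parallax_px pixels of disparity lies at depth f * B / d. The result
    is in the same units as the baseline."""
    if parallax_px <= 0:
        raise ValueError("parallax must be positive for a finite depth")
    return focal_px * baseline / parallax_px
```

Applying this per corresponding pixel pair yields the distance image: nearer points show larger parallax, farther points smaller.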
- the stereoscopic image processing circuit 455 outputs the generated distance image to the monitor unit 18.
- FIG. 20 schematically shows a stereo matching operation when the base image is the second viewpoint image (left image) and the reference image is the first viewpoint image (right image).
- part (a) of FIG. 20 shows the relationship between the subject in real space, the cameras, the optical axes, and the images; part (b) of FIG. 20 shows the first and second image data before cutout; part (d) of FIG. 20 shows the epipolar line L passing horizontally through the optical axis center coordinates of the first and second viewpoint images; and part (e) of FIG. 20 shows how the correlation window is moved along the epipolar line L to find corresponding points.
- the viewpoint images cut out with the optical axis at their center are set as the object of the stereo matching calculation. That is, since stereo matching is performed along an epipolar line that passes horizontally through the optical axis of each cut-out image, the stereo matching calculation accuracy is improved and the amount of calculation is reduced compared with stereo matching on the i-th image data before cutout. In addition, the stereo matching calculation accuracy can be equally ensured before and after camera shake correction.
- FIG. 21A is a schematic block diagram of an imaging apparatus 10a including an image sensor shift type camera shake correction control unit according to the seventh embodiment of the present invention
- FIG. 21B is a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the seventh embodiment of the present invention. Blocks having the same function between the imaging devices 10a/10b in both figures, or in the already described embodiments, are assigned the same numbers except for branch numbers.
- the imaging devices 10a / 10b include an association processing unit 59 that is configured by an arithmetic device such as a CPU.
- the association processing unit 59 associates the image data after camera shake correction with various related information (the minimum cutout size of the viewpoint image, the cutout position coordinates of the viewpoint image, the initial optical axis center position, the post-correction position, etc.) and saves them to a predetermined recording medium 20.
- FIG. 22 shows a flowchart of the correction processing executed by the imaging device 10a or 10b according to the seventh embodiment.
- a program according to the seventh embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41.
- the camera control unit 40 proceeds to ST2 in response to accepting the start of the photographing operation from the operation unit 17.
- ST2 to ST6 are the same as ST3 to ST7 in the sixth embodiment (FIG. 19). However, the processes in ST2 to ST6 differ from those of the sixth embodiment in that they are executed in response to an instruction to start the shooting operation.
- the displayed stereoscopic image is a stereoscopic image derived from the i-th viewpoint image acquired in response to an instruction to start the shooting operation.
- the camera control unit 40 receives from the operation unit 17 a selection as to whether or not to store the image in association with the various related information. If a selection to store the image in association with the various related information is accepted, the process proceeds to ST9. If a selection not to store them in association is received, the process proceeds to ST14.
- the camera control unit 40 determines whether or not the cutout size saved in ST3 is smaller than the minimum cutout size previously saved in the recording medium 20 such as a flash memory. If yes, go to ST10, if no, go to ST11. If the minimum cut-out size is not stored in the recording medium 20, it is determined as Yes and the process proceeds to ST10.
- the camera control unit 40 stores the cutout size saved in ST3 in the recording medium 20 as the minimum cutout size.
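The ST9 to ST10 decision (keep the smallest cutout size seen so far on the recording medium) can be sketched as below; the scalar size and the `None` sentinel for a medium that holds no minimum yet are assumptions:

```python
def update_min_cutout(recorded_min, new_size):
    """Return the minimum cutout size to keep on the recording medium.
    recorded_min is None when no minimum has been stored yet (ST9's
    'determined as Yes' branch); otherwise keep the smaller value."""
    if recorded_min is None or new_size < recorded_min:
        return new_size
    return recorded_min
```

Repeating this per frame leaves the medium holding the minimum cutout size over the whole shooting session.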
- the camera control unit 40 controls the association processing unit 59 to store the storage information including various related information in the RAM 42. Note that not only the minimum cutout size but also the cutout size of each i-th viewpoint image specified in ST2 may be stored together.
- the camera control unit 40 determines whether or not a selection to end shooting has been received from the operation unit 17. When the selection to end shooting is accepted, the process proceeds to ST13; when it is not accepted, the process returns to ST4.
- the camera control unit 40 associates the image group composed of the frames of the first to n-th image data acquired continuously and periodically between the shooting start instruction and the shooting end instruction with the stored information in the RAM 42, and saves them to a predetermined recording medium 20. The image group stored here may be handled as a moving image or as continuously shot still images during reproduction; when there is only one frame, it is saved as a still image.
- the camera control unit 40 stores, in the RAM 42, an image group composed of the first to n-th image data acquired continuously and periodically during the instruction from the start of shooting to the end of shooting.
- the image group in the RAM 42 is recorded on the recording medium 20.
- the storage method is arbitrary.
- FIG. 23A and FIG. 23B show an example of a method of associating an image with various information.
- FIG. 23A shows a mode in which various information is written in the header of the moving image
- FIG. 23B shows a mode in which a file storing various information is stored together in the moving image storage folder.
- related information unique to each image is stored in the image storage folder.
- a related information file (for example, a text file) in which the minimum cutout size information is recorded for all the i-th image data corresponding to each shooting time is stored.
- the related information file can be stored in another folder, but information indicating the relationship with the image storage folder needs to be stored in the related information file.
- the related information file may be stored in each of the folders that individually store the i-th image data constituting the frame.
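As an illustration of such a related information file, the payload might be serialized as follows; the key names and the JSON format are assumptions (the patent only says, for example, a text file):

```python
import json

def related_info_record(min_cutout, cut_pos, axis_center):
    """Build the sidecar payload associating an image group with its
    related information: minimum cutout size, cutout position coordinates
    of each viewpoint image, and optical axis center positions."""
    return json.dumps({
        "min_cutout_size": list(min_cutout),
        "cutout_position": [list(p) for p in cut_pos],
        "optical_axis_center": [list(c) for c in axis_center],
    })
```

Whatever the concrete format, the file (or the folder holding it) must also record its relationship to the image storage folder, as noted above.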
- a specific method for associating an image with various types of information is not limited to the illustrated one.
- each frame is associated with the frame acquisition time or alternative information (frame acquisition time series order, ie frame number).
- the i-th image data before clipping is stored in the recording medium 20 in association with related information including the minimum clipping size of the viewpoint image, the clipping position coordinates of the viewpoint image, the optical axis center position, and the like.
- the related information and the i-th image data are read from the recording medium 20 by an information processing device such as a personal computer, and based on these, a stereoscopic image, a three-dimensional distance measurement, a panoramic image, or a planar image can be output.
- FIG. 24A is a schematic block diagram of an imaging apparatus 10a including an image pickup device shift type camera shake correction control unit according to the eighth embodiment of the present invention
- FIG. 24B is a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the eighth embodiment of the present invention. Blocks having the same function between the imaging devices 10a/10b in both figures, or in the already described embodiments, are assigned the same numbers except for branch numbers.
- the imaging devices 10a / 10b include an image association processing unit 60 that is configured by an arithmetic device such as a CPU.
- the image association processing unit 60 associates the i-th image data after camera shake correction with each other and stores them in a predetermined recording medium 20 (such as a hard disk or a memory card).
- FIG. 25 shows a flowchart of processing executed by the imaging apparatus 10a or 10b according to the eighth embodiment.
- a program according to the eighth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41.
- ST1-7 are the same as ST1-7 of the seventh embodiment (FIG. 22).
- the camera control unit 40 receives from the operation unit 17 a selection as to whether or not to store i-th image data corresponding to the same acquisition time. If the selection to save is accepted, the process proceeds to ST9, and if the selection to save is not accepted, the process proceeds to ST14.
- the camera control unit 40 controls the image association processing unit 60 so as to save in memory, in association with each frame, image-related information indicating the relationship between the i-th viewpoint images constituting the frame at the same shooting time and the shooting time-series order of each frame.
- ST12 to ST14 are the same as ST14 to ST16 of the seventh embodiment.
- FIG. 26A and FIG. 26B show an example of a method of associating an image group with various information.
- FIG. 26A shows a mode in which image-related information is written in the header of the frame
- FIG. 26B shows a mode in which a file storing the image-related information is stored together with the image group storage folder or in another folder.
- when the frames are connected and recorded as a single image file, image-related information (information indicating the viewpoint of each image in the file, frame numbers, etc.) is recorded in the incidental information (header, tag, etc.) of that file. When the i-th image data constituting the frames at the same acquisition time are recorded individually, image-related information for all the i-th image data corresponding to each acquisition time is stored in the incidental information (file, folder, etc.) of a recording unit containing all the sets of i-th image data corresponding to the same acquisition time.
- alternatively, an image-related information file, for example a text file, is stored in the image storage folder.
- the related information file can be stored in another folder, but it is desirable to store information indicating the relationship with the image storage folder in the related information file.
- the image related information file may be stored in each of the folders that individually store the i-th viewpoint images constituting the frame. A specific method for associating images or associating images with image-related information is not limited to the illustrated one.
- each frame is associated with the frame acquisition time or alternative information (frame acquisition time series order, ie frame number).
- since the image-related information indicating the viewpoint position of each i-th viewpoint image at the same acquisition time constituting each frame is stored in association with the frame and with each i-th viewpoint image, it is possible, without confusing the viewpoint positions, to easily reproduce a three-dimensional image or a panoramic image, or to perform three-dimensional distance measurement, based on the stored i-th images.
- FIG. 27 is a flowchart of the correction process executed by the imaging device 10a or 10b according to the ninth embodiment.
- a program according to the ninth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41. This process can be executed by the imaging device 10a or 10b in FIG.
- ST1 to ST7 are the same as in the seventh embodiment (FIG. 22).
- the camera control unit 40 accepts from the operation unit 17 a selection as to whether or not to store, in association with the i-th image data, non-pixel area information for specifying the non-pixel area without pixel information in the i-th image data. If a selection to save is accepted, the process proceeds to ST9; if a selection not to save is accepted, the process proceeds to ST13.
- the non-pixel area information includes the number and coordinates of the vertices of the polygon forming the area, and the number (branch number) of the imaging unit 3 whose image contains the smallest number of pixels in such an area.
- the camera control unit 40 calculates the total number of pixels in the non-pixel area without pixel information for each imaging unit 3.
- An example of the non-pixel area is the same as the complement target areas BL-1 and BL-2 in FIG.
- the camera control unit 40 controls the association processing unit 59 to associate and store the i-th image data and the non-pixel area information in the recording medium 20.
- ST11 to ST12 and ST13 to ST15 are the same as ST12 to ST13 and ST14 to ST16 of the seventh embodiment, respectively.
- FIG. 28A and 28B show an example of a method for associating an image with various types of information.
- FIG. 28A shows a mode in which non-pixel area information is written in the header of the frame
- FIG. 28B shows a mode in which a file storing the non-pixel area information is stored together with the moving image storage folder or in another folder.
- when the frames are connected and recorded as a single image file, the non-pixel area information is recorded in the incidental information (header, tag, etc.) of that file. When the i-th viewpoint images constituting the frames at the same acquisition time are recorded in individual files, the non-pixel area information for all the i-th viewpoint images corresponding to each acquisition time is stored in a recording unit (file, folder, etc.) containing all the sets of i-th viewpoint images corresponding to the same acquisition time. Alternatively, a non-pixel area information file, for example a text file, is saved.
- the non-pixel area information file can be stored in another folder, but it is desirable to store information indicating the relationship with the image storage folder in the related information file.
- the non-pixel area information file may be stored in each of the folders that individually store the i-th image data constituting the frame. The specific method for associating the i-th image data with the image-related information is not limited to the illustrated one.
- based on the non-pixel area information, i-th image data, and related information stored in the recording medium 20 by the above processing, the camera control unit 40 can fill the non-pixel area with a predetermined color and display a stereoscopic image or a panoramic image. Alternatively, the camera control unit 40 can select, based on the non-pixel area information stored in the recording medium 20 by the above processing, the i-th image data with the smallest non-pixel area and display the selected i-th image data as the highest quality planar image.
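The 2D-fallback selection just described (choose the i-th image data whose non-pixel area is smallest) can be sketched as follows; representing blank pixels with a sentinel value, rather than with the stored polygon description, is an assumption made for brevity:

```python
import numpy as np

def best_plane_image(images, mask_value=0):
    """Return the index of the image with the smallest non-pixel area,
    i.e. the fewest pixels flagged with mask_value, so it can be shown
    as the highest quality planar image."""
    counts = [int((img == mask_value).sum()) for img in images]
    return counts.index(min(counts))
```

With the stored polygon form of the non-pixel area information, the count would instead be the polygon's area, with no need to touch the pixels.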
- FIG. 29A is a schematic block diagram of an imaging apparatus 10a including an image sensor shift type image stabilization control unit according to the tenth embodiment of the present invention
- FIG. 29B is a schematic block diagram of an imaging apparatus 10b including an optical image stabilization control unit according to the tenth embodiment of the present invention. Blocks having the same functions between the imaging devices 10a/10b in both figures, or in the already described embodiments, are assigned the same numbers except for branch numbers.
- the imaging devices 10a / 10b include a parallax correction unit 61 that is configured by an arithmetic device such as a CPU.
- FIG. 30 shows a flowchart of the correction processing executed by the imaging device 10a or 10b according to the tenth embodiment.
- a program according to the tenth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41.
- the camera control unit 40 proceeds to ST2 in response to an instruction to start parallax adjustment from the operation unit 17.
- a “parallax correction button” is provided as the operation unit 17, and the process proceeds to ST2 when the button is pressed.
- the camera control unit 40 accepts from the operation unit 17 an instruction as to whether or not to change the image cutout position in conjunction with parallax correction, using the i-th image data recorded on the recording medium 20 (including data stored according to the embodiments described above). If the instruction is accepted, the process proceeds to ST10; if it is not accepted, the process proceeds to ST3.
- ST3 to ST7 are the same as ST2 to ST6 of the ninth embodiment (FIG. 27).
- the cutout position of the i-th viewpoint image after parallax correction is determined from the i-th image data so that the parallax amount between the i-th viewpoint images cut out in ST7 becomes the predetermined parallax amount stored in the ROM 41.
- the camera control unit 40 cuts out the i-th viewpoint image from the determined cut-out position of the i-th image data (see FIG. 32A).
- the camera control unit 40 outputs the clipped i-th viewpoint image to the monitor unit 18.
- a stereoscopic image adjusted to a predetermined amount of parallax is displayed on the monitor unit 18.
- the size of the stereoscopic image may decrease as a result of parallax adjustment (see FIG. 32B).
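The parallax adjustment above, which shifts one cutout window relative to the other so that the cut-out pair carries the desired on-screen parallax, can be sketched as follows; the sign convention (offsetting the right image's window to the right) and the function name are assumptions:

```python
import numpy as np

def cut_with_parallax(left, right, cut_w, target_parallax):
    """Cut equally sized windows from the left and right images, with the
    right window offset horizontally by target_parallax pixels so the
    displayed pair has the predetermined parallax amount."""
    left_cut = left[:, 0:cut_w]
    right_cut = right[:, target_parallax:target_parallax + cut_w]
    return left_cut, right_cut
```

Because the offset consumes columns at the edge of the right image, the usable cutout width shrinks as the target parallax grows, which is exactly why the stereoscopic image may become smaller (FIG. 32B).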
- the camera control unit 40 receives an instruction from the operation unit 17 as to whether or not to display at the minimum cutout size in all frames. If the instruction is accepted, the process proceeds to ST11. If the instruction is not accepted, the process proceeds to ST12.
- the camera control unit 40 reads the optical axis center coordinates and the minimum cutout size (including those saved in ST10 of the seventh embodiment) from the incidental information of the image file of the recording medium 20.
- the camera control unit 40 reads the optical axis center coordinates and the cutout size (calculated in ST8 or 8.1 in the first embodiment) from the incidental information of the image file of the recording medium 20.
- the camera control unit 40 stores the optical axis center coordinates and cutout size read in ST11 or ST12 in the cutout size memory unit 53.
- the camera control unit 40 cuts out the rectangular regions having the center position and the size stored in the cut-out size memory unit 53 from the first to n-th image data after shake correction, respectively, so that the first to n-th viewpoint images are obtained. Get.
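The cutout of a stored-size rectangle around the stored center position can be sketched as below; clamping at the image border is omitted for brevity, and the (x, y) center / (width, height) size conventions are assumptions:

```python
import numpy as np

def cut_centered(image, center_xy, size_wh):
    """Cut a rectangle of the stored size, centered on the stored optical
    axis position, from shake-corrected image data."""
    cx, cy = center_xy
    w, h = size_wh
    x0, y0 = cx - w // 2, cy - h // 2     # top-left corner of the cutout
    return image[y0:y0 + h, x0:x0 + w]
```

Applying this with the single minimum size stored for all frames is what keeps the angle of view constant across the sequence.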
- the camera control unit 40 outputs the clipped i-th viewpoint image to the monitor unit 18 (see FIG. 32C).
- the i-th viewpoint image is displayed with the minimum size, and the angle of view can be prevented from changing even when camera shake correction is performed. For example, when shooting a moving image, the camera shake correction amount changes constantly; without this measure the angle of view would change as if digital zoom were being repeated, but with it the angle of view does not change.
- the camera control unit 40 controls the parallax correction unit 61 to perform parallax correction.
- the parallax correction unit 61 determines the cutout position of the i-th viewpoint image from the i-th image data so that the parallax between the i-th viewpoint images becomes the predetermined parallax amount stored in the ROM 41, while maintaining the cutout sizes in the x and y directions of the i-th viewpoint image held in the RAM 42 in ST13.
- the camera control unit 40 cuts out the i-th viewpoint image after parallax adjustment from the cut-out position of the determined i-th image data (see FIG. 32D). Then, the camera control unit 40 outputs the cut-out i-th viewpoint image after parallax adjustment to the monitor unit 18. As a result, a stereoscopic image adjusted to a predetermined amount of parallax while the size of the image is maintained is displayed on the monitor unit 18.
- the stereoscopic range is narrowed when parallax correction of the stereoscopic image is performed as shown in FIG. 32B, but in this process the cutout range of each i-th viewpoint image is changed in conjunction with the parallax correction as shown in FIG. 32D, so that the stereoscopic range can be prevented from narrowing.
- FIG. 33A is a schematic block diagram of an imaging apparatus 10a including an image sensor shift type camera shake correction control unit according to the eleventh embodiment of the present invention
- FIG. 33B is a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the eleventh embodiment of the present invention. Blocks having the same function between the imaging devices 10a/10b in both figures, or in the already described embodiments, are assigned the same numbers except for branch numbers.
- the imaging devices 10a / 10b include a 3D / 2D switching image selection unit 62 configured by a user interface.
- FIG. 34 shows a flowchart of processing executed by the imaging apparatus 10a or 10b according to the eleventh embodiment.
- a program according to the eleventh embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41. This process can be executed by the imaging device 10a or 10b in FIG.
- the camera control unit 40 proceeds to ST2 in response to accepting the start of the display operation from the operation unit 17.
- the camera control unit 40 accepts from the operation unit 17 an instruction as to whether or not to display a stereoscopic image (3D image) and a planar image (2D image) using the i-th image data recorded on the recording medium 20 (including data stored according to the embodiments described above). If the instruction is accepted, the process proceeds to ST15; if it is not accepted, the process proceeds to ST3.
- FIG. 35A schematically shows display of a 3D image
- FIG. 35B schematically shows display of a 2D image.
- ST3 to ST8 are the same as in the tenth embodiment (FIG. 30).
- the camera control unit 40 determines whether 2D display is instructed from the 3D / 2D switching image selection unit 62. If YES, the process proceeds to ST10, and if NO, the process returns to ST5.
- as a result of ST5 to ST7, the camera control unit 40 performs camera shake correction on the image obtained from the p-th imaging unit 3 corresponding to the desired number p stored in the ROM 41 among the i-th imaging units 3, for example the first imaging unit 3.
- the camera control unit 40 acquires the image data after the camera shake correction in ST10.
- the camera control unit 40 cuts out a 2D image that is an area determined by the optical axis center and cutout size saved in ST4 from the acquired image data.
- ST14 to ST19 are the same as ST10 to ST15 of the tenth embodiment (FIG. 30).
- ST21-22 are the same as ST12-13. It is also possible to return to ST20 and display a 3D / 2D image based on another recorded image.
- if the p-th image data after camera shake correction were displayed as a 2D image as it is, a change in the angle of view would occur when switching from the 3D image to the 2D image. Therefore, the p-th viewpoint image cut out from the p-th image data after camera shake correction, centered on the initial optical axis position, is displayed as the 2D image.
- FIG. 37A is a schematic block diagram of an imaging apparatus 10a including an image pickup device shift type camera shake correction control unit according to the twelfth embodiment of the present invention
- FIG. 37B is a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the twelfth embodiment of the present invention. Blocks having the same function between the imaging devices 10a/10b in both figures, or in the already described embodiments, are assigned the same numbers except for branch numbers.
- the imaging devices 10a and 10b include a display frame and cut-out frame interlocking operation unit 63 configured by a CPU or the like.
- FIG. 38 shows a flowchart of correction processing executed by the imaging apparatus 10a or 10b according to the twelfth embodiment.
- a program according to the twelfth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41.
- the camera control unit 40 proceeds to ST2 in response to accepting the start of the display operation from the operation unit 17.
- the camera control unit 40 accepts an instruction as to whether or not to cut out and enlarge a part of the stereoscopic image designated by a display frame, using the i-th image data recorded on the recording medium 20 (including data stored according to the embodiments described above) to display the stereoscopic image. If the instruction is accepted, the process proceeds to ST18; if it is not accepted, the process proceeds to ST3.
- the camera control unit 40 receives an instruction to start enlarged display of a partial region of the 3D image from the enlarged display button provided on the operation unit 17. If the instruction is accepted, the process proceeds to ST10, and if the instruction is not accepted, the process returns to ST5.
- the camera control unit 40 accepts designation of an enlargement area, which is an area to be enlarged and displayed, in the 3D image via the cutout frame designation button provided on the operation unit 17.
- the camera control unit 40 determines whether or not the designated enlargement area has reached the outer edge of the 3D image, that is, the boundary of the cut-out area of the i-th viewpoint image from the i-th image data. If YES, the process proceeds to ST11, and if NO, the enlargement area designation and this determination are repeated.
- the camera control unit 40 controls the display-frame/cutout-frame interlocking calculation unit 63 so as to calculate the cutout position of the enlargement area from the p-th image data based on the position of the designated enlargement area.
- as a result of ST5 to ST7, the camera control unit 40 performs camera shake correction on the p-th viewpoint image obtained from one desired imaging unit 3, for example the n-th imaging unit 3, among the first to n-th imaging units 3.
- the camera control unit 40 acquires the image data after the camera shake correction in ST12.
- the camera control unit 40 cuts out the enlargement area determined in ST11 from the acquired image data.
- the camera control unit 40 outputs the enlargement area cut out in ST14 to a predetermined position of the monitor unit 18.
- ST17 to ST22 are the same as ST14 to ST19 of the eleventh embodiment.
- ST23 to ST28 are the same as ST9 to ST16.
- the image from which the enlarged area is cut out is a p-th viewpoint image reproduced from the recording medium 20.
- FIGS. 39A to 39C show examples of arbitrarily designated enlargement region cutout positions Za to Zc, and FIGS. 40A to 40C show examples of enlargement region display corresponding to FIGS. 39A to 39C, respectively.
- Ip indicates the p-th image data.
- an arbitrary partial area of the viewpoint image after camera shake correction can be enlarged and displayed. Even when the designated position of the enlargement area protrudes from the viewpoint image, an image corresponding to the enlargement area is cut out from the p-th image data, so no image loss occurs in the enlargement area.
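The display-frame/cutout-frame interlock described above can be sketched as a coordinate mapping with clamping. The function name, argument layout, and numeric values below are illustrative assumptions, not the patent's implementation.

```python
def enlargement_cutout(zoom_rect, view_origin, full_size):
    """Map an enlargement rectangle given in viewpoint-image coordinates
    to a cutout position in the full p-th image data, clamping so the
    region never leaves the full frame.  Illustrative sketch."""
    zx, zy, zw, zh = zoom_rect          # rect inside the viewpoint image
    ox, oy = view_origin                # viewpoint image's offset in the full data
    fw, fh = full_size
    # translate to full-frame coordinates, then clamp to stay inside the frame
    x = min(max(ox + zx, 0), fw - zw)
    y = min(max(oy + zy, 0), fh - zh)
    return x, y

# a 200x150 zoom window that pokes past the left edge of the viewpoint image
print(enlargement_cutout((-40, 10, 200, 150), (80, 60), (640, 480)))  # (40, 70)
```

Even though the requested window leaves the cropped viewpoint image, the cutout is taken from the larger p-th image data, so the enlargement area stays fully populated with pixels.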
- FIG. 41A shows a schematic block diagram of an imaging apparatus 10a including an image-sensor-shift camera shake correction control unit according to the thirteenth embodiment of the present invention, and FIG. 41B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the thirteenth embodiment of the present invention. Blocks having the same function between the imaging apparatuses 10a/10b in both figures, or in the embodiments already described, are assigned the same numbers except for the branch suffixes.
- the imaging devices 10a / 10b include a pixel count / comparison unit 64 configured by a CPU or the like.
- the pixel count / comparison unit 64 may be shared with the complement target region calculation unit 70.
- FIG. 42 shows a flowchart of the correction processing executed by the imaging device 10a or 10b according to the thirteenth embodiment.
- a program according to the thirteenth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41.
- the camera control unit 40 proceeds to ST2 in response to accepting the start of the display operation from the operation unit 17.
- the camera control unit 40 controls the pixel count/comparison unit 64 so as to calculate, for each of the first to n-th viewpoint images acquired in ST6, a non-pixel area, that is, an area without pixel information, and the number of pixels it contains. In addition, the camera control unit 40 controls the pixel count/comparison unit 64 so as to select, from the first to n-th viewpoint images, the m1-th viewpoint image, which is the viewpoint image with the smallest number of pixels in the complement target region.
- the camera control unit 40 outputs the m1-th viewpoint image selected in ST8 to the monitor unit 18 as a 2D image.
- ST10 to ST12 are the same as in the twelfth embodiment. However, in ST12, if the non-pixel area (or the complement target area) described in the ninth embodiment is stored in the recording medium 20, it is also read.
- the camera control unit 40 determines whether or not non-pixel area information is included in the various information read from the recording medium 20 in ST10 or ST12. If YES, the process proceeds to ST14, and if NO, the process proceeds to ST15.
- based on the non-pixel area information of the first to n-th viewpoint images, the camera control unit 40 controls the pixel count/comparison unit 64 so as to select the m2-th image data, that is, the image data with the smallest number of pixels in the non-pixel area (or the complement target area) among the first to n-th viewpoint images.
- the camera control unit 40 cuts out the m2-th viewpoint image from the m2-th image data based on the optical axis center coordinates and the cutout size read out in ST10 or ST12.
- the camera control unit 40 outputs the cut-out m2-th viewpoint image to the monitor unit 18 as a 2D image.
- FIG. 43A illustrates that, since the non-pixel area BL exists in neither the left nor the right image, either one of them is output as a 2D image.
- FIG. 43B illustrates that the right image is output as a 2D image because the left image includes the no-pixel region BL.
- FIG. 43C illustrates that the right image is output as a 2D image because the non-pixel area BL-1 of the right image is smaller than the non-pixel area BL-2 of the left image.
- FIG. 43D illustrates that, since the non-pixel areas of the left and right images have the same size, either one of them is output as a 2D image.
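The selection rule of FIGS. 43A to 43D — output the image whose non-pixel area is smallest, with either image acceptable on a tie — can be sketched as below. The mask representation and function name are illustrative assumptions; the patent does not prescribe a data structure.

```python
import numpy as np

def select_2d_view(non_pixel_masks):
    """Return the index of the viewpoint image whose non-pixel area
    (mask == True) contains the fewest pixels.  On a tie the first
    image is kept, matching the 'either one' case.  Illustrative."""
    counts = [int(m.sum()) for m in non_pixel_masks]   # missing-pixel counts
    return counts.index(min(counts))

left_mask = np.zeros((4, 6), dtype=bool); left_mask[:, :2] = True   # 8 missing px
right_mask = np.zeros((4, 6), dtype=bool); right_mask[:, :1] = True # 4 missing px
print(select_2d_view([left_mask, right_mask]))  # 1 (the right image is output)
```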
- FIG. 44 is a flowchart of the correction process executed by the imaging device 10a or 10b according to the fourteenth embodiment.
- a program according to the fourteenth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41. This process can be executed by the imaging device 10a or 10b of the fourth embodiment (FIG. 12).
- the camera control unit 40 proceeds to ST2 in response to accepting the start of the display operation from the operation unit 17.
- the camera control unit 40 receives an instruction as to whether or not to fill the non-pixel area of the i-th viewpoint image. If the instruction is accepted, the process proceeds to ST12. If the instruction is not accepted, the process proceeds to ST3.
- ST5 to ST9 are the same as ST3 to ST7 of the thirteenth embodiment.
- ST10 to ST11 are the same as ST9 to ST10 of the fourth embodiment (FIG. 13). However, the filling is performed on the non-pixel area of the i-th viewpoint image (or the first to n-th image data) cut out from the i-th image data from the i-th imaging unit 3.
- ST12 to ST15 are the same as ST10 to ST13 of the thirteenth embodiment (FIG. 42).
- the camera control unit 40 controls the filling unit 56 to fill the non-pixel areas of the first to n-th image data with the color set in ST3 or 3.1.
- ST17 is the same as ST15 of the thirteenth embodiment (FIG. 42).
- the cut-out source of the i-th viewpoint image is the first to n-th image data in which the non-pixel area is filled.
- ST18 is the same as ST16 of the thirteenth embodiment (FIG. 42).
- since the non-pixel area is filled, it can be made less noticeable than when it is simply left blank. Further, the non-pixel area can be distinguished from the other areas.
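Filling the non-pixel area with a set color amounts to a masked assignment. The helper below is a minimal sketch; the mask representation, color tuple, and function name are illustrative assumptions.

```python
import numpy as np

def fill_non_pixel_area(image, mask, color):
    """Paint the non-pixel area (mask == True) of an RGB image with a
    fixed color so the blank region is less noticeable yet still
    distinguishable from real image content.  Illustrative sketch."""
    out = image.copy()      # leave the stored image data untouched
    out[mask] = color
    return out

img = np.full((4, 6, 3), 200, dtype=np.uint8)   # dummy viewpoint image
mask = np.zeros((4, 6), dtype=bool)
mask[:, :2] = True                              # left strip has no pixel info
filled = fill_non_pixel_area(img, mask, (128, 128, 128))
print(filled[0, 0].tolist(), filled[0, 3].tolist())  # [128, 128, 128] [200, 200, 200]
```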
- FIG. 46 is a flowchart of the correction process executed by the imaging device 10a or 10b according to the fifteenth embodiment.
- a program according to the fifteenth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41. This process can be executed by the imaging device 10a or 10b of the fifth embodiment (FIG. 15).
- the camera control unit 40 proceeds to ST2 in response to accepting the start of the display operation from the operation unit 17.
- the camera control unit 40 receives an instruction from the operation unit 17 as to whether or not to synthesize a panoramic image based on the cutout range included in the storage information of the image stored in the recording medium 20. If the instruction is accepted, the process proceeds to ST3. If the instruction is not accepted, the process proceeds to ST9.
- the camera control unit 40 receives an instruction from the operation unit 17 as to whether or not to display at the minimum cutout size in all frames. If the instruction is accepted, the process proceeds to ST4. If the instruction is not accepted, the process proceeds to ST5.
- the camera control unit 40 reads the image stored in the recording medium 20 and the stored information.
- This storage information is the information stored in the seventh embodiment, and includes the optical axis center coordinates, the minimum cutout size, and the image cutout coordinates.
- the camera control unit 40 reads an image stored in the recording medium 20 and storage information incidental thereto.
- This stored information is the information stored in the seventh embodiment (FIG. 22), and includes the optical axis center coordinates, the cutout size of each i-th viewpoint image, and the image cutout coordinates.
- the camera control unit 40 stores the storage information read from the recording medium 20 in the RAM 42.
- the camera control unit 40 controls the panorama synthesis calculation unit 57 so as to search for similar points serving as a reference for the synthesis position of the panoramic image, based on the storage information stored in the RAM 42 in ST5 or ST6. That is, the panorama synthesis calculation unit 57 sets, as the search range for similar points between different i-th image data, a range corrected by the amount of deviation between the image cutout coordinates of the different i-th image data, and searches for similar points within that search range.
- suppose, for example, that the panorama synthesis calculation unit 57 searches the second image data for similar points resembling a pixel group G, where G is a group of vertically arranged pixels in the first image data sharing the same X coordinate, used as reference pixels, and that the search is performed by scanning the pixel group G in the X direction (moving the X coordinate of the group G). In this case, for each pixel Yi constituting the pixel group G, the panorama synthesis calculation unit 57 searches for the corresponding similar point Y'i at the Y coordinate obtained by subtracting, from the Y coordinate of the pixel Yi, the amount of deviation ΔY in the Y direction between the cutout range C-1 of the first image data and the cutout range C-2 of the second image data.
- when the panorama synthesis calculation unit 57 specifies X'0, the X coordinate common to the similar points Y'i corresponding to the pixels Yi of the pixel group G at the common coordinate X0, it determines that the pixel group G' at X'0 is the similar point corresponding to the pixel group G.
- the panorama synthesis calculation unit 57 combines the first image data and the second image data so that the pixel group G of the first image data and the pixel group G' of the second image data coincide, and synthesizes the panoramic image (FIG. 48).
- a panoramic image can thus be synthesized, even from images stored after camera shake correction, with an accuracy equivalent to that of images not subjected to camera shake correction.
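The ΔY-corrected similar-point search can be sketched as below. The patent does not specify the similarity measure, so a sum of absolute differences (SAD) over the pixel group is used here purely for illustration; the function name, cutout offsets, and image sizes are likewise assumptions.

```python
import numpy as np

def find_similar_column(img1, img2, x0, dy):
    """Scan img2 in the X direction for the column most similar to the
    vertical pixel group G at X = x0 in img1, after shifting each row
    by the Y deviation dy between cutout ranges C-1 and C-2.
    Returns X'0, the matching column.  SAD matching is illustrative."""
    ys = np.arange(img1.shape[0])
    ref = img1[ys, x0].astype(int)                  # pixel group G
    ys2 = np.clip(ys - dy, 0, img2.shape[0] - 1)    # Y'i = Yi - deltaY
    costs = [np.abs(img2[ys2, x].astype(int) - ref).sum()
             for x in range(img2.shape[1])]
    return int(np.argmin(costs))

rng = np.random.default_rng(0)
scene = rng.integers(0, 255, (60, 80)).astype(np.uint8)
img1 = scene[5:45, 0:50]    # cutout range C-1
img2 = scene[8:48, 10:60]   # cutout range C-2, deviated by 3 px in Y, 10 px in X
print(find_similar_column(img1, img2, x0=30, dy=3))  # 20
```

Correcting the search row by the stored cutout deviation is what lets images saved after shake correction match as accurately as uncorrected ones.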
- FIG. 49 is a flowchart of the correction process executed by the imaging device 10a or 10b according to the sixteenth embodiment.
- a program according to the sixteenth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41. This process can be executed by the imaging device 10a or 10b of the sixth embodiment (FIG. 18).
- the camera control unit 40 proceeds to ST2 in response to accepting the start of the display operation from the operation unit 17.
- the camera control unit 40 receives an instruction from the operation unit 17 as to whether or not to perform stereo matching based on the cutout range included in the supplementary information of the image stored in the recording medium 20. If the instruction is accepted, the process proceeds to ST3. If the instruction is not accepted, the process proceeds to ST9.
- FIG. 50A shows an example of the i-th image data read from the recording medium 20, FIG. 50B shows an example of an epipolar line L set for the i-th viewpoint image, and FIG. 50C shows a schematic diagram of stereo matching for the i-th viewpoint image.
- the camera control unit 40 reads the image stored in the recording medium 20 into the frame memory 43.
- the camera control unit 40 outputs the image in the frame memory 43 to the monitor unit 18.
- This image may be a 3D image or a 2D image.
- a frame of a moving image recorded on the recording medium 20 can be treated in the same way as a still image. That is, the present invention can be applied to the recording of both moving images and still images.
- when a plurality of still images are recorded, as in continuous shooting, each frame may, but need not, be recorded in shooting time-series order as in a moving image.
- the arrangement direction of the imaging units 3 may be the vertical (Y) direction instead of the horizontal (X) direction. In this case, for example, in the fifth and fifteenth embodiments, a vertically long panoramic image can be obtained.
- alternatively, in the sixth and sixteenth embodiments, corresponding points can be searched for by setting the epipolar line in the Y direction.
- the arrangement direction of the imaging units 3 may be an oblique direction. In short, an epipolar line may be set in the direction in which the lens optical axes Li of the imaging unit 3 are arranged, and corresponding points may be searched in a direction parallel to the epipolar line.
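The rule above — set the epipolar line along the direction in which the lens optical axes Li are arranged, whether horizontal, vertical, or oblique — reduces to computing a unit direction vector from the axis positions. The function and coordinate values are illustrative assumptions.

```python
import math

def epipolar_direction(axis_positions):
    """Unit vector along which corresponding points are searched: the
    direction in which the lens optical axes Li of the imaging units 3
    are arranged.  Illustrative sketch."""
    (x0, y0), (x1, y1) = axis_positions[0], axis_positions[-1]
    dx, dy = x1 - x0, y1 - y0
    n = math.hypot(dx, dy)
    return (dx / n, dy / n)

print(epipolar_direction([(0, 0), (60, 0)]))   # (1.0, 0.0)  horizontal rig
print(epipolar_direction([(0, 0), (0, 40)]))   # (0.0, 1.0)  vertical rig
```

An oblique rig simply yields a diagonal unit vector, and the correspondence search proceeds parallel to it.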
- 51: cutout position/size determination unit, 52: initial optical axis center position memory unit, 53: cutout size memory unit, 70: complement target area calculation unit, 56: filling unit, 57: panorama synthesis calculation unit, 58: stereo matching calculation unit, 59: association processing unit, 60: image association processing unit, 61: parallax correction unit, 62: 3D/2D switching image selection unit, 63: display frame and cutout frame interlocking calculation unit, 64: pixel count/comparison unit
Description
FIG. 1A shows a schematic block diagram of an imaging apparatus 10a including an image-sensor-shift camera shake correction control unit according to the first embodiment of the present invention, and FIG. 1B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the first embodiment of the present invention. Blocks having the same function between the imaging apparatuses 10a/10b of FIGS. 1A and 1B are assigned the same numbers except for the branch suffixes a and b; each of these identically numbered blocks is described together below.
FIG. 5A shows a schematic block diagram of an imaging apparatus 10a including an image-sensor-shift camera shake correction control unit according to the second embodiment of the present invention, and FIG. 5B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the second embodiment of the present invention. Blocks having the same function between the imaging apparatuses 10a/10b in both figures, or in the embodiments already described, are assigned the same numbers except for the branch suffixes; only the blocks with numbers other than these are described below.
As illustrated in FIG. 8, if the maximum-size image I is read out from the entire effective pixel region RA of the sensor 12 at the time of shooting, very high-speed signal processing is required to generate a video signal for sequentially outputting the captured images to the monitor unit 18 and the like, and the circuit scale also becomes large. Therefore, an ordinary camera stores default cutout range information TR in the ROM 41 or the like in advance, and at the time of shooting performs control so as to select and read out from the effective pixel region RA, in accordance with the default cutout range information, a cutout region Rout convenient for processing.
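A rough pixel-rate calculation shows why reading only a default cutout region Rout instead of the whole effective pixel region RA eases the signal-processing load. The sensor and cutout dimensions below are hypothetical figures chosen only for illustration; they do not come from the patent.

```python
def pixel_rate(width, height, fps):
    """Pixels per second the signal chain must process."""
    return width * height * fps

full = pixel_rate(4000, 3000, 30)      # whole effective pixel region RA (hypothetical)
cropped = pixel_rate(1920, 1080, 30)   # default cutout region Rout (hypothetical)
print(full, cropped, full // cropped)  # 360000000 62208000 5
```

Even with these modest example numbers, the full-area readout demands several times the throughput, which is the motivation for the default cutout control described above.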
FIG. 12A shows a schematic block diagram of an imaging apparatus 10a including an image-sensor-shift camera shake correction control unit according to the fourth embodiment of the present invention, and FIG. 12B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the fourth embodiment of the present invention. Blocks having the same function between the imaging apparatuses 10a/10b in both figures, or in the embodiments already described, are assigned the same numbers except for the branch suffixes.
Since the optical axis center positions of the i-th viewpoint images cut out in the first to fourth embodiments, in particular the optical axis center positions in the Y-axis direction, are aligned, a horizontally long panoramic image can easily be created by joining the i-th viewpoint images so that their optical axis center positions in the Y-axis direction coincide.
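The Y-axis alignment of the joining step can be sketched as below. This only demonstrates the vertical alignment on the optical-axis-center rows; the overlap blending of a real panorama synthesis is omitted, and all names and sizes are illustrative assumptions.

```python
import numpy as np

def join_views(views, y_centers):
    """Join viewpoint images side by side after vertically aligning the
    rows containing their optical axis centers.  Illustrative sketch."""
    y_ref = max(y_centers)
    # height available once every image is shifted so its center row aligns
    h = min(v.shape[0] - (y_ref - yc) for v, yc in zip(views, y_centers))
    rows = [v[y_ref - yc:y_ref - yc + h] for v, yc in zip(views, y_centers)]
    return np.hstack(rows)

a = (np.arange(40 * 30).reshape(40, 30) % 255).astype(np.uint8)
b = (np.arange(40 * 30).reshape(40, 30) % 255).astype(np.uint8)
print(join_views([a, b], [20, 18]).shape)  # (38, 60)
```

Because the Y-axis optical-axis centers are already aligned by the cutout of the earlier embodiments, the shift here is small or zero and the strip joins cleanly.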
FIG. 18A shows a schematic block diagram of an imaging apparatus 10a including an image-sensor-shift camera shake correction control unit according to the sixth embodiment of the present invention, and FIG. 18B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the sixth embodiment of the present invention. Blocks having the same function between the imaging apparatuses 10a/10b in both figures, or in the embodiments already described, are assigned the same numbers except for the branch suffixes.
FIG. 21A shows a schematic block diagram of an imaging apparatus 10a including an image-sensor-shift camera shake correction control unit according to the seventh embodiment of the present invention, and FIG. 21B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the seventh embodiment of the present invention. Blocks having the same function between the imaging apparatuses 10a/10b in both figures, or in the embodiments already described, are assigned the same numbers except for the branch suffixes.
FIG. 24A shows a schematic block diagram of an imaging apparatus 10a including an image-sensor-shift camera shake correction control unit according to the eighth embodiment of the present invention, and FIG. 24B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the eighth embodiment of the present invention. Blocks having the same function between the imaging apparatuses 10a/10b in both figures, or in the embodiments already described, are assigned the same numbers except for the branch suffixes.
FIG. 27 shows a flowchart of the correction processing executed by the imaging apparatus 10a or 10b according to the ninth embodiment. A program according to the ninth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41. This processing can be executed by the imaging apparatus 10a or 10b of FIG. 21.
FIG. 29A shows a schematic block diagram of an imaging apparatus 10a including an image-sensor-shift camera shake correction control unit according to the tenth embodiment of the present invention, and FIG. 29B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the tenth embodiment of the present invention. Blocks having the same function between the imaging apparatuses 10a/10b in both figures, or in the embodiments already described, are assigned the same numbers except for the branch suffixes.
FIG. 33A shows a schematic block diagram of an imaging apparatus 10a including an image-sensor-shift camera shake correction control unit according to the eleventh embodiment of the present invention, and FIG. 33B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the eleventh embodiment of the present invention. Blocks having the same function between the imaging apparatuses 10a/10b in both figures, or in the embodiments already described, are assigned the same numbers except for the branch suffixes.
FIG. 37A shows a schematic block diagram of an imaging apparatus 10a including an image-sensor-shift camera shake correction control unit according to the twelfth embodiment of the present invention, and FIG. 37B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the twelfth embodiment of the present invention. Blocks having the same function between the imaging apparatuses 10a/10b in both figures, or in the embodiments already described, are assigned the same numbers except for the branch suffixes.
FIG. 41A shows a schematic block diagram of an imaging apparatus 10a including an image-sensor-shift camera shake correction control unit according to the thirteenth embodiment of the present invention, and FIG. 41B shows a schematic block diagram of an imaging apparatus 10b including an optical camera shake correction control unit according to the thirteenth embodiment of the present invention. Blocks having the same function between the imaging apparatuses 10a/10b in both figures, or in the embodiments already described, are assigned the same numbers except for the branch suffixes.
FIG. 44 shows a flowchart of the correction processing executed by the imaging apparatus 10a or 10b according to the fourteenth embodiment. A program according to the fourteenth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41. This processing can be executed by the imaging apparatus 10a or 10b of the fourth embodiment (FIG. 12).
FIG. 46 shows a flowchart of the correction processing executed by the imaging apparatus 10a or 10b according to the fifteenth embodiment. A program according to the fifteenth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41. This processing can be executed by the imaging apparatus 10a or 10b of the fifth embodiment (FIG. 15).
FIG. 49 shows a flowchart of the correction processing executed by the imaging apparatus 10a or 10b according to the sixteenth embodiment. A program according to the sixteenth embodiment for causing the camera control unit 40a or 40b to execute this processing is stored in the ROM 41. This processing can be executed by the imaging apparatus 10a or 10b of the sixth embodiment (FIG. 18).
In the above embodiments, a frame of a moving image recorded on the recording medium 20 can be treated in the same way as a still image. That is, the present invention is applicable to the recording of both moving images and still images. When a plurality of still images are recorded, as in continuous shooting, each frame may, but need not, be recorded in shooting time-series order as in a moving image.
The arrangement direction of the imaging units 3 may be the vertical (Y) direction instead of the horizontal (X) direction. In this case, for example, in the fifth and fifteenth embodiments, a vertically long panoramic image can be obtained. Alternatively, in the sixth and sixteenth embodiments, corresponding points can be searched for by setting the epipolar line in the Y direction. The arrangement direction of the imaging units 3 may also be oblique. In short, it suffices to set an epipolar line in the direction in which the lens optical axes Li of the imaging units 3 are arranged and to search for corresponding points in a direction parallel to that epipolar line.
Claims (22)
- An imaging apparatus comprising a plurality of imaging units that capture subject images from different viewpoints, a shake detection unit that detects the shake of each of the imaging units, and a shake correction unit that corrects the shake of the subject image captured by each imaging unit based on the shake of each imaging unit detected by the shake detection unit, the imaging apparatus comprising:
a size determination unit that determines a cutout size for cutting out output images from the images respectively acquired from the plurality of imaging units, the size determination unit determining a cutout size having a predetermined aspect ratio common to the plurality of images acquired from each of the plurality of imaging units, based on the candidate region of the smallest size among the sizes of candidate regions centered on the initial optical axis center and included in the common region between a prescribed imaging region referenced to the initial optical axis center of each of the plurality of imaging units before shake correction and the imaging region of each of the plurality of imaging units after shake correction; and
a cutout unit that cuts out an output image from each of the plurality of images with the common cutout size determined by the size determination unit, with reference to the initial optical axis center of each of the plurality of imaging units before shake correction.
- An imaging apparatus comprising a plurality of imaging units that capture subject images from different viewpoints, a shake detection unit that detects the shake of each of the imaging units, and a shake correction unit that corrects the shake of the subject image captured by each imaging unit based on the shake of each imaging unit detected by the shake detection unit, the imaging apparatus comprising:
a size determination unit that determines a cutout size for cutting out output images from the images respectively acquired from the plurality of imaging units, the size determination unit determining, for each of the plurality of imaging units, a cutout candidate region centered on the initial optical axis center and included in an invariant imaging region that does not depend on the shake correction by the shake correction unit for each of the plurality of imaging units, and then determining, based on the minimum value of the sizes of the cutout candidate regions corresponding to the plurality of imaging units, a cutout size having a predetermined aspect ratio common to the images from each of the plurality of imaging units; and
a cutout unit that cuts out an output image from each of the plurality of imaging units with the common cutout size determined by the size determination unit, with reference to the initial optical axis center of each of the plurality of imaging units before shake correction.
- The imaging apparatus according to claim 2, wherein the size determination unit determines the invariant imaging region for each of the plurality of imaging units based on a common region between two different imaging regions displaced to the maximum extent in the vertical and/or horizontal direction by the shake correction of the shake correction unit for each of the plurality of imaging units.
- The imaging apparatus according to claim 3, wherein the size determination unit determines, for each of the plurality of imaging units, a common region between the two different imaging regions, displaced to the maximum extent in the vertical and/or horizontal direction, obtained by shake correction performed at least twice by the shake correction unit for each of the plurality of imaging units, and sets the common region determined for each of the plurality of imaging units as the invariant imaging region corresponding to each imaging unit.
- The imaging apparatus according to any one of claims 1 to 4, comprising an image complementing unit that, when the output image cut out by the cutout unit has a complement target region exceeding the prescribed cutout range, complements the complement target region with the image of the effective pixel region of the imaging unit corresponding to the complement target region.
- The imaging apparatus according to any one of claims 1 to 4, comprising a color complementing unit that, when the output image cut out by the cutout unit has a complement target region exceeding the prescribed cutout range of the imaging unit, complements the complement target region with a predetermined color.
- The imaging apparatus according to any one of claims 1 to 6, comprising a panoramic image creation unit that creates a panoramic image by combining the output images cut out by the cutout unit with reference to the initial optical axis center of each output image.
- The imaging apparatus according to any one of claims 1 to 7, comprising a stereo matching calculation unit that sets an epipolar line with reference to the initial optical axis center of each output image cut out by the cutout unit and performs stereo matching by calculating the correlation between the output images along the epipolar line.
- The imaging apparatus according to any one of claims 1 to 8, comprising a storage unit that stores each image from each of the plurality of imaging units in association with the initial optical axis center position and the cutout size of that image.
- The imaging apparatus according to any one of claims 1 to 8, comprising a storage unit that stores, in association with one another, the output images corresponding to the images acquired from each of the plurality of imaging units at the same shooting time.
- The imaging apparatus according to claim 5 or 6, comprising a storage unit that stores, in association with the output images, the coordinates of the complement target region of each output image and identification information of the output image whose complement target region has the smallest area.
- The imaging apparatus according to any one of claims 1 to 11, comprising a parallax adjustment unit that determines the cutout positions of the output images so that the parallax between the output images becomes a predetermined parallax amount while the cutout size of the output images is maintained.
- The imaging apparatus according to any one of claims 1 to 12, comprising an output unit that outputs a planar image or a stereoscopic image based on the images cut out by the cutout unit.
- The imaging apparatus according to any one of claims 1 to 13, comprising a designation unit that accepts designation of an enlargement position, wherein, when the enlargement position accepted by the designation unit reaches a boundary at which the output image is cut out from the image, the cutout unit changes the position at which the output image is cut out in accordance with the enlargement position.
- The imaging apparatus according to claim 11, comprising a planar image output unit that outputs, as a planar image, the image whose complement target region has the smallest area, based on the identification information stored in the storage unit.
- The imaging apparatus according to claim 11 or 15, comprising: a color complementing unit that complements the complement target region stored in the storage unit with a predetermined color; and an output unit that outputs a planar image or a stereoscopic image based on the image whose color has been complemented by the color complementing unit.
- The imaging apparatus according to claim 9, comprising a panoramic image creation unit that cuts out each output image with reference to the initial optical axis center position and the cutout size corresponding to each image stored in the storage unit, and creates a panoramic image by combining the output images.
- The imaging apparatus according to claim 9, comprising a stereo matching calculation unit that cuts out each output image with reference to the initial optical axis center position and the cutout size corresponding to each image stored in the storage unit, sets an epipolar line on each output image with reference to the initial optical axis center, and performs stereo matching by calculating the correlation between the output images along the epipolar line.
- An imaging method executed by an imaging apparatus comprising a plurality of imaging units that capture subject images from different viewpoints, a shake detection unit that detects the shake of each of the imaging units, and a shake correction unit that corrects the shake of the subject image captured by each imaging unit based on the shake of each imaging unit detected by the shake detection unit, the imaging method comprising:
a step of determining a cutout size for cutting out output images from the images respectively acquired from the plurality of imaging units, by determining a cutout size having a predetermined aspect ratio common to the plurality of images acquired from each of the plurality of imaging units, based on the candidate region of the smallest size among the sizes of candidate regions centered on the initial optical axis center and included in the common region between a prescribed imaging region referenced to the initial optical axis center of each of the plurality of imaging units before shake correction and the imaging region of each of the plurality of imaging units after shake correction; and
a step of cutting out an output image from each of the plurality of images with the determined common cutout size, with reference to the initial optical axis center of each of the plurality of imaging units before shake correction.
- An imaging method executed by an imaging apparatus comprising a plurality of imaging units that capture subject images from different viewpoints, a shake detection unit that detects the shake of each of the imaging units, and a shake correction unit that corrects the shake of the subject image captured by each imaging unit based on the shake of each imaging unit detected by the shake detection unit, the imaging method comprising:
a step of determining a cutout size for cutting out output images from the images respectively acquired from the plurality of imaging units, by determining, for each of the plurality of imaging units, a cutout candidate region centered on the initial optical axis center and included in an invariant imaging region that does not depend on the shake correction by the shake correction unit for each of the plurality of imaging units, and then determining, based on the minimum value of the sizes of the cutout candidate regions corresponding to the plurality of imaging units, a cutout size having a predetermined aspect ratio common to the images from each of the plurality of imaging units; and
a step of cutting out an output image from each of the plurality of imaging units with the determined common cutout size, with reference to the initial optical axis center of each of the plurality of imaging units before shake correction.
- A program for causing an imaging apparatus to execute the imaging method according to claim 19 or 20.
- A recording medium on which computer-readable code of the program according to claim 21 is recorded.
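The size determination of claim 1 can be sketched as follows: for each imaging unit, take the largest rectangle of the required aspect ratio that is centered on the initial optical axis and fits inside the intersection of the pre- and post-correction imaging regions, then keep the smallest such rectangle over all units. This is a minimal sketch under assumed conventions (regions as (x0, y0, x1, y1) corner tuples, axis-aligned rectangles, hypothetical names and sizes), not the claimed implementation itself.

```python
def common_cutout_size(pre_regions, post_regions, centers, aspect):
    """Common cutout size (w, h) with aspect ratio aspect = (aw, ah),
    per the claim-1 sketch described above.  Illustrative only."""
    aw, ah = aspect
    best = None
    for (ax0, ay0, ax1, ay1), (bx0, by0, bx1, by1), (cx, cy) in zip(
            pre_regions, post_regions, centers):
        # intersection of the pre- and post-correction imaging regions
        ix0, iy0 = max(ax0, bx0), max(ay0, by0)
        ix1, iy1 = min(ax1, by1 if False else bx1), min(ay1, by1)
        # largest half-extents centered on the initial optical axis
        hw = min(cx - ix0, ix1 - cx)
        hh = min(cy - iy0, iy1 - cy)
        # shrink to the required aspect ratio
        k = min(hw / aw, hh / ah)
        size = (2 * k * aw, 2 * k * ah)
        best = size if best is None or size[0] < best[0] else best
    return best

# two units; shake correction displaced the second unit's region 20 px right
pre = [(0, 0, 640, 480), (0, 0, 640, 480)]
post = [(0, 0, 640, 480), (20, 0, 660, 480)]
print(common_cutout_size(pre, post, [(320, 240), (320, 240)], (4, 3)))  # (600.0, 450.0)
```

The minimum over units guarantees that every viewpoint image can supply a full cutout of the common size around its own initial optical axis center.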
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP10838382.9A EP2391142B1 (en) | 2010-03-19 | 2010-11-02 | Imaging device, method and program, and recording medium using same |
US13/142,779 US8310538B2 (en) | 2010-03-19 | 2010-11-02 | Imaging apparatus, method, program, and recording medium used in the program |
JP2011525325A JP4843750B2 (ja) | 2010-03-19 | 2010-11-02 | 撮像装置、方法およびプログラム |
CN201080004735.1A CN102282857B (zh) | 2010-03-19 | 2010-11-02 | 拍摄装置以及方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010064385 | 2010-03-19 | ||
JP2010-064385 | 2010-03-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011114572A1 true WO2011114572A1 (ja) | 2011-09-22 |
Family
ID=44648691
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/069464 WO2011114572A1 (ja) | 2010-03-19 | 2010-11-02 | 撮像装置、方法およびプログラム並びにこれに用いる記録媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8310538B2 (ja) |
EP (1) | EP2391142B1 (ja) |
JP (2) | JP4843750B2 (ja) |
CN (1) | CN102282857B (ja) |
WO (1) | WO2011114572A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012070389A (ja) * | 2010-03-19 | 2012-04-05 | Fujifilm Corp | 撮像装置、方法およびプログラム |
WO2019054304A1 (ja) * | 2017-09-15 | 2019-03-21 | 株式会社ソニー・インタラクティブエンタテインメント | 撮像装置 |
JP2021507655A (ja) * | 2018-02-08 | 2021-02-22 | イノベーションズ マインドトリック インコーポレイテッド | 視聴者に合わせて調整された立体画像表示 |
US11785197B2 (en) | 2017-08-30 | 2023-10-10 | Innovations Mindtrick Inc. | Viewer-adjusted stereoscopic image display |
Families Citing this family (79)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102037717B (zh) | 2008-05-20 | 2013-11-06 | 派力肯成像公司 | 使用具有异构成像器的单片相机阵列的图像拍摄和图像处理 |
US11792538B2 (en) | 2008-05-20 | 2023-10-17 | Adeia Imaging Llc | Capturing and processing of images including occlusions focused on an image sensor by a lens stack array |
US8866920B2 (en) | 2008-05-20 | 2014-10-21 | Pelican Imaging Corporation | Capturing and processing of images using monolithic camera array with heterogeneous imagers |
EP2502115A4 (en) | 2009-11-20 | 2013-11-06 | Pelican Imaging Corp | RECORDING AND PROCESSING IMAGES THROUGH A MONOLITHIC CAMERA ARRAY WITH HETEROGENIC IMAGE CONVERTER |
KR101824672B1 (ko) | 2010-05-12 | 2018-02-05 | 포토네이션 케이맨 리미티드 | 이미저 어레이 구조 및 어레이 카메라 |
JP5477349B2 (ja) * | 2010-09-30 | 2014-04-23 | カシオ計算機株式会社 | 画像合成装置、及び画像検索方法、プログラム |
US8878950B2 (en) | 2010-12-14 | 2014-11-04 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using super-resolution processes |
JP5686244B2 (ja) | 2010-12-21 | 2015-03-18 | ソニー株式会社 | 表示制御装置、表示制御方法、及び、プログラム |
US8960860B2 (en) * | 2011-04-27 | 2015-02-24 | Hewlett-Packard Development Company, L.P. | Printhead die |
WO2012155119A1 (en) | 2011-05-11 | 2012-11-15 | Pelican Imaging Corporation | Systems and methods for transmitting and receiving array camera image data |
KR20140045458A (ko) | 2011-06-28 | 2014-04-16 | 펠리칸 이매징 코포레이션 | 어레이 카메라와 함께 사용하는 광학 장치 |
US20130265459A1 (en) | 2011-06-28 | 2013-10-10 | Pelican Imaging Corporation | Optical arrangements for use with an array camera |
WO2013043761A1 (en) | 2011-09-19 | 2013-03-28 | Pelican Imaging Corporation | Determining depth from multiple views of a scene that include aliasing using hypothesized fusion |
WO2013049699A1 (en) | 2011-09-28 | 2013-04-04 | Pelican Imaging Corporation | Systems and methods for encoding and decoding light field image files |
US9412206B2 (en) | 2012-02-21 | 2016-08-09 | Pelican Imaging Corporation | Systems and methods for the manipulation of captured light field image data |
US9348211B2 (en) * | 2012-03-19 | 2016-05-24 | Htc Corporation | Camera module and portable device using the same |
JP6035842B2 (ja) * | 2012-04-25 | 2016-11-30 | ソニー株式会社 | 撮像装置、撮像処理方法、画像処理装置および撮像処理システム |
US9210392B2 (en) | 2012-05-01 | 2015-12-08 | Pelican Imaging Coporation | Camera modules patterned with pi filter groups |
KR101364050B1 (ko) * | 2012-05-29 | 2014-02-19 | 주식회사 코아로직 | 이미지 처리 방법 및 장치 |
CN104508681B (zh) * | 2012-06-28 | 2018-10-30 | Fotonation开曼有限公司 | 用于检测有缺陷的相机阵列、光学器件阵列和传感器的系统及方法 |
US20140002674A1 (en) | 2012-06-30 | 2014-01-02 | Pelican Imaging Corporation | Systems and Methods for Manufacturing Camera Modules Using Active Alignment of Lens Stack Arrays and Sensors |
CN104662589B (zh) | 2012-08-21 | 2017-08-04 | 派力肯影像公司 | 用于使用阵列照相机捕捉的图像中的视差检测和校正的系统和方法 |
CN104685513B (zh) | 2012-08-23 | 2018-04-27 | 派力肯影像公司 | 根据使用阵列源捕捉的低分辨率图像的基于特征的高分辨率运动估计 |
WO2014043641A1 (en) | 2012-09-14 | 2014-03-20 | Pelican Imaging Corporation | Systems and methods for correcting user identified artifacts in light field images |
US20140092281A1 (en) | 2012-09-28 | 2014-04-03 | Pelican Imaging Corporation | Generating Images from Light Fields Utilizing Virtual Viewpoints |
WO2014078443A1 (en) | 2012-11-13 | 2014-05-22 | Pelican Imaging Corporation | Systems and methods for array camera focal plane control |
US9462164B2 (en) | 2013-02-21 | 2016-10-04 | Pelican Imaging Corporation | Systems and methods for generating compressed light field representation data using captured light fields, array geometry, and parallax information |
US9253380B2 (en) | 2013-02-24 | 2016-02-02 | Pelican Imaging Corporation | Thin form factor computational array cameras and modular array cameras |
US9638883B1 (en) | 2013-03-04 | 2017-05-02 | Fotonation Cayman Limited | Passive alignment of array camera modules constructed from lens stack arrays and sensors based upon alignment information obtained during manufacture of array camera modules using an active alignment process |
WO2014138697A1 (en) | 2013-03-08 | 2014-09-12 | Pelican Imaging Corporation | Systems and methods for high dynamic range imaging using array cameras |
US8866912B2 (en) | 2013-03-10 | 2014-10-21 | Pelican Imaging Corporation | System and methods for calibration of an array camera using a single captured image |
US9124831B2 (en) | 2013-03-13 | 2015-09-01 | Pelican Imaging Corporation | System and methods for calibration of an array camera |
US9519972B2 (en) | 2013-03-13 | 2016-12-13 | Kip Peli P1 Lp | Systems and methods for synthesizing images from image data captured by an array camera using restricted depth of field depth maps in which depth estimation precision varies |
US9106784B2 (en) | 2013-03-13 | 2015-08-11 | Pelican Imaging Corporation | Systems and methods for controlling aliasing in images captured by an array camera for use in super-resolution processing |
WO2014164909A1 (en) | 2013-03-13 | 2014-10-09 | Pelican Imaging Corporation | Array camera architecture implementing quantum film sensors |
WO2014159779A1 (en) | 2013-03-14 | 2014-10-02 | Pelican Imaging Corporation | Systems and methods for reducing motion blur in images or video in ultra low light with array cameras |
US9100586B2 (en) | 2013-03-14 | 2015-08-04 | Pelican Imaging Corporation | Systems and methods for photometric normalization in array cameras |
US10122993B2 (en) | 2013-03-15 | 2018-11-06 | Fotonation Limited | Autofocus system for a conventional camera that uses depth information from an array camera |
US9497429B2 (en) | 2013-03-15 | 2016-11-15 | Pelican Imaging Corporation | Extended color processing on pelican array cameras |
WO2014145856A1 (en) | 2013-03-15 | 2014-09-18 | Pelican Imaging Corporation | Systems and methods for stereo imaging with camera arrays |
US9497370B2 (en) | 2013-03-15 | 2016-11-15 | Pelican Imaging Corporation | Array camera architecture implementing quantum dot color filters |
US9633442B2 (en) | 2013-03-15 | 2017-04-25 | Fotonation Cayman Limited | Array cameras including an array camera module augmented with a separate camera |
US9445003B1 (en) | 2013-03-15 | 2016-09-13 | Pelican Imaging Corporation | Systems and methods for synthesizing high resolution images using image deconvolution based on motion and depth information |
WO2014156202A1 (ja) * | 2013-03-29 | 2014-10-02 | 株式会社ニコン | 画像処理装置、撮像装置および画像処理プログラム |
JP5804007B2 (ja) * | 2013-09-03 | 2015-11-04 | カシオ計算機株式会社 | 動画生成システム、動画生成方法及びプログラム |
WO2015048694A2 (en) | 2013-09-27 | 2015-04-02 | Pelican Imaging Corporation | Systems and methods for depth-assisted perspective distortion correction |
US9426365B2 (en) * | 2013-11-01 | 2016-08-23 | The Lightco Inc. | Image stabilization related methods and apparatus |
US9185276B2 (en) | 2013-11-07 | 2015-11-10 | Pelican Imaging Corporation | Methods of manufacturing array camera modules incorporating independently aligned lens stacks |
WO2015074078A1 (en) | 2013-11-18 | 2015-05-21 | Pelican Imaging Corporation | Estimating depth from projected texture using camera arrays |
EP3075140B1 (en) | 2013-11-26 | 2018-06-13 | FotoNation Cayman Limited | Array camera configurations incorporating multiple constituent array cameras |
EP3064122A4 (en) * | 2013-12-05 | 2017-09-06 | Olympus Corporation | Stereoscopic endoscope system |
US10089740B2 (en) | 2014-03-07 | 2018-10-02 | Fotonation Limited | System and methods for depth regularization and semiautomatic interactive matting using RGB-D images |
JP2015177510A (ja) * | 2014-03-18 | 2015-10-05 | 株式会社リコー | カメラシステム、画像処理方法及びプログラム |
US9247117B2 (en) | 2014-04-07 | 2016-01-26 | Pelican Imaging Corporation | Systems and methods for correcting for warpage of a sensor array in an array camera module by introducing warpage into a focal plane of a lens stack array |
US9521319B2 (en) | 2014-06-18 | 2016-12-13 | Pelican Imaging Corporation | Array cameras and array camera modules including spectral filters disposed outside of a constituent image sensor |
EP3201877B1 (en) | 2014-09-29 | 2018-12-19 | Fotonation Cayman Limited | Systems and methods for dynamic calibration of array cameras |
KR20160068407A (ko) * | 2014-12-05 | 2016-06-15 | 삼성전기주식회사 | 촬영 장치 및 촬영 장치의 제어 방법 |
US9942474B2 (en) | 2015-04-17 | 2018-04-10 | Fotonation Cayman Limited | Systems and methods for performing high speed video capture and depth estimation using array cameras |
WO2017045129A1 (zh) * | 2015-09-15 | 2017-03-23 | Huawei Technologies Co Ltd | Image distortion correction method and apparatus |
EP3352448B1 (en) | 2015-09-18 | 2023-03-01 | Sony Group Corporation | Image processing device, image processing method, program and imaging system |
WO2017077906A1 (ja) | 2015-11-06 | 2017-05-11 | Fujifilm Corp | Information processing device, information processing method, and program |
US10051263B2 (en) * | 2015-11-23 | 2018-08-14 | Center For Integrated Smart Sensors Foundation | Multi-aperture camera system using scan line processing |
EP3389268B1 (en) | 2016-01-12 | 2021-05-12 | Huawei Technologies Co., Ltd. | Depth information acquisition method and apparatus, and image collection device |
KR101905813B1 (ko) * | 2016-09-09 | 2018-10-08 | Kwangwoon University Industry-Academic Collaboration Foundation | Image comparison device |
US10482618B2 (en) | 2017-08-21 | 2019-11-19 | Fotonation Limited | Systems and methods for hybrid depth regularization |
KR102042131B1 (ko) * | 2018-01-30 | 2019-11-07 | Kwangwoon University Industry-Academic Collaboration Foundation | Method for stabilizing an image during real-time character recognition on a terminal |
JP2019162371A (ja) * | 2018-03-20 | 2019-09-26 | Sony Olympus Medical Solutions Inc | Medical imaging device and endoscope device |
JP6502003B1 (ja) * | 2018-07-23 | 2019-04-17 | Mitsubishi Electric Corp | Image correction device and image correction method |
WO2021055585A1 (en) | 2019-09-17 | 2021-03-25 | Boston Polarimetrics, Inc. | Systems and methods for surface modeling using polarization cues |
CN114766003B (zh) | 2019-10-07 | 2024-03-26 | Boston Polarimetrics Inc | Systems and methods for augmenting sensor systems and imaging systems with polarization |
EP4066001A4 (en) | 2019-11-30 | 2024-01-24 | Boston Polarimetrics Inc | SYSTEMS AND METHODS FOR TRANSPARENT OBJECT SEGMENTATION USING POLARIZATION GUIDES |
US11727545B2 (en) * | 2019-12-12 | 2023-08-15 | Canon Kabushiki Kaisha | Image processing apparatus and image capturing apparatus |
CN111225201B (zh) * | 2020-01-19 | 2022-11-15 | Shenzhen SenseTime Technology Co Ltd | Parallax correction method and apparatus, and storage medium |
KR20220132620A (ko) | 2020-01-29 | 2022-09-30 | Intrinsic Innovation LLC | Systems and methods for characterizing object pose detection and measurement systems |
JP2023511747A (ja) | 2020-01-30 | 2023-03-22 | Intrinsic Innovation LLC | Systems and methods for synthesizing data for training statistical models on different imaging modalities including polarized images |
US11953700B2 (en) | 2020-05-27 | 2024-04-09 | Intrinsic Innovation Llc | Multi-aperture polarization optical systems using beam splitters |
US11290658B1 (en) | 2021-04-15 | 2022-03-29 | Boston Polarimetrics, Inc. | Systems and methods for camera exposure control |
US11954886B2 (en) | 2021-04-15 | 2024-04-09 | Intrinsic Innovation Llc | Systems and methods for six-degree of freedom pose estimation of deformable objects |
US11689813B2 (en) | 2021-07-01 | 2023-06-27 | Intrinsic Innovation Llc | Systems and methods for high dynamic range imaging using crossed polarizers |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003092768A (ja) * | 2001-09-18 | 2003-03-28 | Canon Inc | Imaging device |
JP2004120600A (ja) | 2002-09-27 | 2004-04-15 | Fuji Photo Film Co Ltd | Digital binoculars |
JP2005045328A (ja) | 2003-07-22 | 2005-02-17 | Sharp Corp | Three-dimensional image capturing device |
JP2008064863A (ja) | 2006-09-05 | 2008-03-21 | Sony Corp | Shake correction mechanism and imaging device |
JP2008164338A (ja) | 2006-12-27 | 2008-07-17 | Nikon Corp | Position detection device |
JP2009014445A (ja) | 2007-07-03 | 2009-01-22 | Konica Minolta Holdings Inc | Distance measuring device |
JP2009205193A (ja) | 2008-02-26 | 2009-09-10 | Fujifilm Corp | Image processing device, method, and program |
JP2010032969A (ja) * | 2008-07-31 | 2010-02-12 | Fujifilm Corp | Compound-eye imaging device |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6741250B1 (en) * | 2001-02-09 | 2004-05-25 | Be Here Corporation | Method and system for generation of multiple viewpoints into a scene viewed by motionless cameras and for presentation of a view path |
JP4699995B2 (ja) * | 2004-12-16 | 2011-06-15 | Panasonic Corp | Compound-eye imaging device and imaging method |
JP4714176B2 (ja) * | 2007-03-29 | 2011-06-29 | Fujifilm Corp | Stereoscopic imaging device and optical axis adjustment method |
JP5129638B2 (ja) * | 2008-04-02 | 2013-01-30 | Pentax Ricoh Imaging Co Ltd | Imaging device |
US8310538B2 (en) * | 2010-03-19 | 2012-11-13 | Fujifilm Corporation | Imaging apparatus, method, program, and recording medium used in the program |
2010
- 2010-11-02 US US13/142,779 patent/US8310538B2/en active Active
- 2010-11-02 WO PCT/JP2010/069464 patent/WO2011114572A1/ja active Application Filing
- 2010-11-02 EP EP10838382.9A patent/EP2391142B1/en not_active Not-in-force
- 2010-11-02 CN CN201080004735.1A patent/CN102282857B/zh not_active Expired - Fee Related
- 2010-11-02 JP JP2011525325A patent/JP4843750B2/ja not_active Expired - Fee Related

2011
- 2011-10-06 JP JP2011222044A patent/JP2012070389A/ja not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP2391142A4 |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012070389A (ja) * | 2010-03-19 | 2012-04-05 | Fujifilm Corp | Imaging device, method, and program |
US11785197B2 (en) | 2017-08-30 | 2023-10-10 | Innovations Mindtrick Inc. | Viewer-adjusted stereoscopic image display |
WO2019054304A1 (ja) * | 2017-09-15 | 2019-03-21 | Sony Interactive Entertainment Inc | Imaging apparatus |
JP2019054463A (ja) * | 2017-09-15 | 2019-04-04 | Sony Interactive Entertainment Inc | Imaging apparatus |
US11064182B2 (en) | 2017-09-15 | 2021-07-13 | Sony Interactive Entertainment Inc. | Imaging apparatus |
US11438568B2 (en) | 2017-09-15 | 2022-09-06 | Sony Interactive Entertainment Inc. | Imaging apparatus |
JP2021507655A (ja) * | 2018-02-08 | 2021-02-22 | Innovations Mindtrick Inc | Viewer-adjusted stereoscopic image display |
JP7339278B2 (ja) | 2018-02-08 | 2023-09-05 | Innovations Mindtrick Inc | Viewer-adjusted stereoscopic image display |
Also Published As
Publication number | Publication date |
---|---|
EP2391142A4 (en) | 2012-10-31 |
JP4843750B2 (ja) | 2011-12-21 |
CN102282857A (zh) | 2011-12-14 |
CN102282857B (zh) | 2014-03-12 |
JP2012070389A (ja) | 2012-04-05 |
EP2391142A1 (en) | 2011-11-30 |
US20110298917A1 (en) | 2011-12-08 |
US8310538B2 (en) | 2012-11-13 |
JPWO2011114572A1 (ja) | 2013-06-27 |
EP2391142B1 (en) | 2013-12-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4843750B2 (ja) | Imaging device, method, and program | |
US9341935B2 (en) | Image capturing device | |
US9210408B2 (en) | Stereoscopic panoramic image synthesis device, image capturing device, stereoscopic panoramic image synthesis method, recording medium, and computer program | |
US8885070B2 (en) | Imaging apparatus, image correction method, and computer-readable recording medium | |
JP5127787B2 (ja) | Compound-eye imaging device and control method thereof | |
JP6415179B2 (ja) | Image processing device, image processing method, imaging device, and control method thereof | |
JP5469258B2 (ja) | Imaging device and imaging method | |
JP5371845B2 (ja) | Imaging device, display control method thereof, and three-dimensional information acquisition device | |
US9648305B2 (en) | Stereoscopic imaging apparatus and stereoscopic imaging method | |
JP5647740B2 (ja) | Parallax adjustment device and method, imaging device, and playback display device | |
JP5929037B2 (ja) | Imaging device | |
US20130027520A1 (en) | 3d image recording device and 3d image signal processing device | |
KR20150003576A (ko) | Apparatus and method for generating or reproducing three-dimensional images | |
US20130083169A1 (en) | Image capturing apparatus, image processing apparatus, image processing method and program | |
JP2009258005A (ja) | Three-dimensional measuring device and three-dimensional measuring method | |
JP2012156747A (ja) | Imaging device, image compositing method, and program | |
JP6732440B2 (ja) | Image processing device, image processing method, and program therefor | |
JP6648916B2 (ja) | Imaging device | |
WO2012086298A1 (ja) | Imaging device, method, and program | |
JP6292785B2 (ja) | Image processing device, image processing method, and program | |
JP6833772B2 (ja) | Image processing device, imaging device, image processing method, and program | |
JP2012220603A (ja) | 3D video signal imaging device | |
JP2007194694A (ja) | Stereoscopic video imaging device and program therefor | |
JP2019035967A (ja) | Image processing device, image processing method, imaging device, and control method thereof | |
JP2012215980A (ja) | Image processing device, image processing method, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 201080004735.1; Country of ref document: CN |
WWE | Wipo information: entry into national phase | Ref document number: 2011525325; Country of ref document: JP |
WWE | Wipo information: entry into national phase | Ref document number: 4593/CHENP/2011; Country of ref document: IN |
WWE | Wipo information: entry into national phase | Ref document number: 13142779; Country of ref document: US; Ref document number: 2010838382; Country of ref document: EP |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10838382; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: PI1006036; Country of ref document: BR |
REG | Reference to national code | Ref country code: BR; Ref legal event code: B01E; Ref document number: PI1006036; Country of ref document: BR |
ENPW | Started to enter national phase and was withdrawn or failed for other reasons | Ref document number: PI1006036; Country of ref document: BR |