WO2012026185A1 - Imaging apparatus and operation control method thereof - Google Patents
Imaging apparatus and operation control method thereof
- Publication number
- WO2012026185A1 (PCT application PCT/JP2011/063799)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- subject
- image data
- parallax
- distance
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B35/00—Stereoscopic photography
- G03B35/02—Stereoscopic photography by sequential recording
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/207—Image signal generators using stereoscopic image cameras using a single 2D image sensor
- H04N13/221—Image signal generators using stereoscopic image cameras using a single 2D image sensor using the relative movement between cameras and objects
Definitions
- the present invention relates to an imaging apparatus and its operation control method.
- To display a stereoscopic image using a digital still camera equipped with a single imaging device, instead of a dedicated digital still camera for stereoscopic imaging, a left-eye image (an image viewed by the viewer with the left eye) and a right-eye image must be obtained by capturing twice while shifting the camera in the horizontal direction by the amount of parallax between the left-eye image and the right-eye image. There is also an apparatus that extracts images with parallax from images captured in advance (Patent Document 1).
- In Patent Document 2, there is a technique that controls the positional difference between a plurality of photographing positions based on the depth of the subject.
- With the invention disclosed in Patent Document 1, however, an appropriate amount of parallax is not necessarily obtained.
- In the invention disclosed in Patent Document 2, although a plurality of shooting positions are controlled, the control becomes complicated. Therefore, an object of the present invention is to obtain image data for a stereoscopic image relatively easily.
- The imaging device (stereoscopic image imaging device) according to the present invention comprises: imaging means that continuously captures a subject included in an imaging range and continuously outputs captured image data; first recording control means that, when a recording command is given, records in recording means the image data captured at the timing of the recording command as image data representing a first subject image; object detection means that detects all objects satisfying a predetermined condition from an object-detection subject image among the subject images represented by the image data continuously output from the imaging means; first distance information calculating means that calculates distance information between the object closest to the imaging device and the object farthest from the imaging device among the plurality of objects detected by the object detection means; parallax amount determining means that determines an amount of parallax based on the distance information calculated by the first distance information calculating means; and second recording control means that, when the imaging device is shifted in the horizontal direction and the amount of parallax between a subject image represented by the image data continuously output from the imaging means and the first subject image becomes equal to the amount of parallax determined by the parallax amount determining means, records in the recording means the image data captured at that timing as image data representing a second subject image, in association with the image data representing the first subject image.
- The present invention also provides an operation control method for the imaging apparatus. That is, in this method, the imaging means continuously captures the subject included in the imaging range and continuously outputs captured image data; the first recording control means, when a recording command is given, records in the recording means the image data captured at the timing of the recording command as image data representing a first subject image; the object detection means detects all objects satisfying a predetermined condition from an object-detection subject image among the subject images represented by the image data continuously output from the imaging means; the first distance information calculating means calculates distance information between the object closest to the imaging device and the object farthest from the imaging device among the plurality of objects detected by the object detection means; the parallax amount determining means determines an amount of parallax based on the distance information calculated by the first distance information calculating means; and the second recording control means, when the imaging device is shifted in the horizontal direction and the amount of parallax between a subject image represented by the image data continuously output from the imaging means and the first subject image becomes equal to the determined amount of parallax, records in the recording means the image data captured at that timing as image data representing a second subject image, in association with the image data representing the first subject image.
- the subject included in the imaging range is continuously imaged.
- When a recording command is given, the image data obtained by imaging at that timing is recorded as image data representing the first subject image (on a recording medium removable from the imaging apparatus or a recording medium built into the imaging apparatus).
- All objects satisfying a predetermined condition (for example, a person's face, or an object having a spatial frequency equal to or higher than a predetermined threshold) are detected.
- the distance information between the object closest to the imaging device among the plurality of detected objects and the object farthest from the imaging device is calculated.
- the amount of parallax (the amount of parallax for making the first subject image appear as a stereoscopic image) is determined.
- When the imaging device is moved by the user and the amount of parallax between the subject image obtained by imaging and the first subject image becomes equal to the determined amount of parallax, the image data obtained by imaging at that timing is recorded on the recording medium as image data representing the second subject image, in association with the image data representing the first subject image.
- a stereoscopic image is obtained using the first subject image and the second subject image.
- the apparatus may further include second distance information calculation means for measuring distance information from the imaging device to each of a plurality of objects included in the imaging range.
- In that case, the first distance information calculating means calculates, for example, the distance information between the nearest object and the farthest object from the distance information to the nearest object and the distance information to the farthest object calculated by the second distance information calculating means.
- the imaging means may include an imaging device and a focus lens.
- The apparatus may further comprise AF evaluation value calculating means for calculating, for each moving position of the focus lens, an AF evaluation value representing the degree of focusing from the image data obtained by imaging at that position while the focus lens is moved. The second distance information calculating means then measures the distance to each of the plurality of objects based on the focus lens positions at which the AF evaluation values calculated by the AF evaluation value calculating means are equal to or greater than a threshold value. Note that the focus lens moves freely in front of the image sensor, that is, on the subject side of the image sensor.
- a predetermined value is determined as the parallax amount.
- the parallax amount determining means determines the parallax amount based on, for example, the size of a display screen for displaying a stereoscopic image and the distance information calculated by the first distance information calculating means.
- a setting means for setting the size of the display screen for displaying the stereoscopic image may be further provided.
- In this case, the parallax amount determining means determines the amount of parallax based on the size of the display screen set by the setting means and the distance information calculated by the first distance information calculating means.
- When a plurality of parallax amounts are determined, the second recording control means repeats, for each of the plurality of parallax amounts, the process of recording on the recording medium, in association with the image data representing the first subject image, the image data captured when the amount of parallax between a subject image represented by the image data continuously output from the imaging means and the first subject image becomes equal to that parallax amount as the imaging device is shifted in the horizontal direction.
- It may further comprise an object type determining means for determining the type of the object included in the object detection subject image.
- the object detection means detects, for example, a predetermined type of object among the object types determined by the object type determination means.
- the object detection unit may detect, for example, an object of a type excluding a predetermined type among the types of objects determined by the object type determination unit.
- distance calculating means for calculating the distance to the object whose type is determined by the object determining means may be further provided.
- In that case, the object detection means detects, for example, among the objects whose types are determined by the object type determination means, objects excluding those whose distance calculated by the distance calculation means is equal to or less than a first threshold value and those whose distance is equal to or greater than a second threshold value that is greater than the first threshold value.
- a display device for displaying the first subject image on the display screen, and a touch panel formed on the display screen are further provided.
- the object detection means will detect, for example, the object displayed at the position where the touch panel is touched.
- A display screen size setting image.
- A flowchart showing the processing procedure of the stereoscopic imaging mode.
- A flowchart showing the processing procedure of the stereoscopic imaging mode.
- The relationship between the focus lens position and the AF evaluation value.
- The relationship between the subject distance and the required amount of parallax.
- The relationship between the subject distance and the required amount of parallax.
- The relationship between the display screen size and the required amount of parallax.
- The relationship among the subject, the display screen size, and the required amount of parallax.
- A flowchart showing the processing procedure of the stereoscopic imaging mode.
- The relationship between the focus lens position and the AF evaluation value.
- A flowchart showing the processing procedure of the stereoscopic imaging mode.
- The distance to the subject.
- An example of the subject image displayed on a display screen.
- The back of the digital still camera.
- The relationship between an imaging position and a subject.
- (A) shows an example of an image for the left eye, and (B) shows an example of an image for the right eye.
- An example of a stereoscopic image.
- The relationship between the amount of parallax and the subject distance.
- A flowchart showing the required parallax amount calculation processing procedure.
- An example of a file structure.
- the relationship between a digital still camera and a subject is shown.
- the relationship between the amount of parallax and the distance between objects is shown.
- the relationship between the amount of parallax and the distance between objects is shown.
- (A) shows an example of an image for the left eye
- (B) shows an example of an image for the right eye.
- To display a stereoscopic image, an image for the left eye viewed by the viewer with the left eye and an image for the right eye viewed by the viewer with the right eye are required.
- In a digital still camera for stereoscopic imaging, two imaging devices are provided; an image for the left eye is captured using one imaging device, and an image for the right eye is captured using the other imaging device.
- Here, the left-eye image and the right-eye image required to display a stereoscopic image are obtained using a digital still camera provided with one imaging device, not a digital still camera for stereoscopic imaging provided with two imaging devices.
- FIG. 1 is a plan view showing the relationship between a digital still camera 1 having one image pickup device and a subject.
- In front of the digital still camera 1 are a tree subject OB1, a person subject OB2, and a car subject OB3.
- the object OB1 of the tree is closest to the digital still camera 1, and the object OB2 of the person is next closest to the digital still camera 1.
- the subject OB3 of the car is farthest from the digital still camera 1.
- the digital still camera 1 is positioned at the reference position PL1, the subjects OB1, OB2, and OB3 are imaged, and image data representing the subject images of these subjects OB1, OB2, and OB3 is recorded.
- the subject image obtained by imaging at the reference position PL1 becomes the left-eye image (may be the right-eye image).
- a parallax amount d1 suitable for displaying a stereoscopic image on a 3-inch display screen and a parallax amount d2 suitable for displaying a stereoscopic image on a 32-inch display screen are calculated.
- the user moves the digital still camera 1 in the right direction while continuously (periodically) imaging the subjects OB1, OB2, and OB3.
- the subjects OB1, OB2, and OB3 are imaged while the digital still camera 1 is moving in the right direction.
- When the digital still camera 1 reaches the position PR1, the parallax of the subject image obtained by imaging reaches the calculated parallax amount d1; the subject image obtained by imaging at that point becomes the right-eye image displayed on a 3-inch display screen, and is recorded as image data representing the right-eye image.
- The user then moves the digital still camera 1 further to the right; when the digital still camera 1 reaches the position PR2, the parallax of the subject image obtained by imaging reaches the calculated parallax amount d2.
- the subject image obtained by imaging becomes a right-eye image displayed on a 32-inch display screen, and is recorded as image data representing the right-eye image.
- FIG. 2 is an example of a display screen size setting image.
- the display screen size setting image is used to set the size of a display screen that displays a stereoscopic image.
- Image data representing a left-eye image and image data representing a right-eye image having a parallax amount corresponding to the size of the display screen set using the display screen size setting image are recorded.
- the setting mode is set by the mode setting button included in the digital still camera 1.
- a display screen size setting image is displayed on the display screen 2 formed on the back surface of the digital still camera 1.
- Display screen size input areas 3, 4 and 5 are formed in the display screen size setting image. In these input areas 3, 4, and 5, the size of the display screen is input using buttons provided in the digital still camera 1.
- FIGS. 3 and 4 are flowcharts showing the processing procedure of the stereoscopic imaging mode for recording the left-eye image and the right-eye image for stereoscopic display using the digital still camera 1 having one imaging device as described above.
- When the stereoscopic imaging mode is entered, the subject is imaged continuously (periodically), and the subject image obtained by the imaging is displayed as a moving image (through image) on the display screen provided on the back of the digital still camera 1. The user determines the camera angle while viewing the moving image displayed on the display screen.
- When the shutter release button is pressed halfway (YES in step 11), the distance to the subject is calculated (step 12).
- As the distance to the subject, the distance to the person subject OB2 at the approximate center of the imaging range is calculated, but the distance to another subject OB1 or OB3 in another part of the imaging range may be calculated instead.
- the distance to the subject can be calculated using the amount of movement of the focus lens.
- FIG. 5 shows the relationship between the focus lens position and the AF evaluation value representing the high frequency component of the image data obtained by imaging.
- The subject is imaged while moving the focus lens from the NEAR position (or home position) to the FAR position.
- The high frequency component (AF evaluation value) of the image data in the center of the imaging range is extracted. From the focus lens movement amount at the focus lens position P0, where the AF evaluation value reaches the maximum value AF0, the distance to the subject OB2 in the center of the imaging range can be calculated.
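The following sketch (not taken from the patent itself) illustrates how such an AF-evaluation sweep can yield a subject distance. The contrast measure and the lens-position-to-distance calibration function are assumptions; an actual camera would use lens-specific calibration data.

```python
# Illustrative sketch: estimate subject distance from an AF evaluation sweep (cf. FIG. 5).
import numpy as np

def af_evaluation(image_region: np.ndarray) -> float:
    """AF evaluation value: sum of absolute horizontal differences (a simple
    high-frequency/contrast measure) over the central region of the frame."""
    region = image_region.astype(np.float64)
    return float(np.abs(np.diff(region, axis=1)).sum())

def estimate_distance(lens_positions, frames, lens_pos_to_distance_m):
    """Sweep the focus lens, compute the AF evaluation value at each position,
    and convert the peak position P0 (maximum value AF0) to a subject distance."""
    af_values = [af_evaluation(frame) for frame in frames]
    peak_index = int(np.argmax(af_values))
    p0 = lens_positions[peak_index]        # focus lens position at the AF peak
    return lens_pos_to_distance_m(p0)      # calibration function supplied by the caller
```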
- Image data representing the subject image (left-eye image, first subject image) captured at the timing when the shutter release button is fully pressed is recorded on the memory card of the digital still camera 1 (step 14).
- the size variable i is reset to 1 (step 15).
- the required amount of parallax is determined for each display screen size set in the display screen size setting (step 16).
- FIG. 6 shows the relationship between the required amount of parallax and the distance to the subject.
- the relationship between the required amount of parallax and the distance to the subject is determined in advance for each display screen size for displaying a stereoscopic image.
- the example shown in FIG. 6 shows the relationship between the required amount of parallax in pixel units and the distance to the subject when a stereoscopic image is displayed on a 32-inch display screen. For example, in the case of a 32-inch display screen, if the distance to the subject is 0.3 m, the required amount of parallax is 40 pixels.
- FIG. 7 is a table showing the relationship between the required amount of parallax in units of pixels and the distance to the subject.
- This table is for a display screen size of 32 inches; a required amount of parallax is defined for each distance to the subject. Such a table is determined for each display screen size.
- Using such a table, the required amount of parallax corresponding to the measured subject distance is determined.
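As an illustration of such a lookup, the sketch below interpolates a required parallax from per-display-size tables. Only the 32-inch, 0.3 m, 40-pixel point is taken from the text; the remaining table values are placeholders for illustration.

```python
# Illustrative sketch: per-display-size lookup of required parallax from subject distance.
import bisect

PARALLAX_TABLES = {
    # display size (inches) -> sorted list of (subject distance [m], parallax [px])
    3:  [(0.3, 10), (1.0, 6), (3.0, 4)],     # placeholder values
    32: [(0.3, 40), (1.0, 25), (3.0, 15)],   # 40 px at 0.3 m is from the text
}

def required_parallax(display_inches: int, subject_distance_m: float) -> float:
    """Linearly interpolate the required parallax for the given display size."""
    table = PARALLAX_TABLES[display_inches]
    distances = [d for d, _ in table]
    i = bisect.bisect_left(distances, subject_distance_m)
    if i == 0:
        return table[0][1]
    if i == len(table):
        return table[-1][1]
    (d0, p0), (d1, p1) = table[i - 1], table[i]
    t = (subject_distance_m - d0) / (d1 - d0)
    return p0 + t * (p1 - p0)
```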
- FIG. 8 is a table showing the relationship between the display screen size and the determined required amount of parallax.
- Depending on the distance to the subject, the required amount of parallax when the display screen size is 3 inches is d1, and the required amount of parallax when the display screen size is 32 inches is d2.
- When the required amount of parallax d1 for the 3-inch display screen size has been calculated (step 16), it is confirmed whether the size variable i has reached the number of set display screen size types (in this case, 2) (step 17). If the size variable i has not reached the number of set display screen size types (NO in step 17), the size variable i is incremented (step 18), and the required amount of parallax for the next display screen size is calculated (step 16).
- When the required amounts of parallax between the left-eye image and the right-eye image have been calculated for all the set display screen sizes (3 inches and 32 inches) (YES in step 17), timing is started (step 19).
- A message prompting the user to move the digital still camera 1 horizontally is displayed on the display screen, and according to the display, the user moves the digital still camera 1 in the horizontal direction (to the right; if the reference image is the right-eye image, the camera is moved to the left) (step 20).
- A deviation amount between the first subject image and the through image is calculated (step 21).
- the movement of the digital still camera 1 (step 20) and the calculation of the shift amount between the first subject image and the through image (step 21) are repeated until the calculated shift amount becomes equal to the required parallax amount.
- Image data representing the subject image (second subject image, right-eye image) captured when the shift amount becomes equal to the required parallax amount is recorded on the memory card.
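The patent does not specify how the shift amount between the first subject image and the through image is measured; the sketch below uses a simple 1-D cross-correlation as a stand-in and records one right-eye frame per required parallax amount (one per display size).

```python
# Illustrative sketch of steps 20-23 as a loop; the shift estimator is an assumption.
import numpy as np

def horizontal_shift_px(reference: np.ndarray, through: np.ndarray) -> int:
    """Estimate the horizontal shift (in pixels) of `through` relative to `reference`
    by cross-correlating the column-mean profiles of the two frames."""
    a = reference.mean(axis=0) - reference.mean()
    b = through.mean(axis=0) - through.mean()
    corr = np.correlate(b, a, mode="full")
    return int(np.argmax(corr) - (len(a) - 1))

def record_right_eye_images(reference, through_images, required_parallaxes, save):
    """Record one right-eye image per required parallax amount (one per display size)."""
    remaining = dict(required_parallaxes)         # e.g. {3: d1, 32: d2} in pixels
    for frame in through_images:                  # frames captured while the user pans
        shift = horizontal_shift_px(reference, frame)
        for size, parallax in list(remaining.items()):
            if abs(shift) >= parallax:            # required parallax reached
                save(size, frame)                 # record as the right-eye image
                del remaining[size]
        if not remaining:
            break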
- An image having an optimal amount of parallax can be recorded without the user being aware of it. Since an image corresponding to the display screen size is recorded, it is possible to prevent the amount of parallax from becoming too large when a stereoscopic image is displayed on a large display screen. In addition, imaging failure can be prevented in advance.
- If subject images having all of the calculated required parallax amounts have not yet been recorded (NO in step 24), the processing from step 20 onward is repeated, provided the time limit has not elapsed (NO in step 25).
- If the time limit has elapsed (YES in step 25), the processing in the stereoscopic imaging mode ends.
- Since the set display screen sizes are 3 inches and 32 inches, when the right-eye image for 3 inches having the parallax amount d1 and the right-eye image for 32 inches having the parallax amount d2 have been obtained, the processing of the stereoscopic imaging mode ends.
- the amount of parallax for stereoscopically displaying a specific subject within the imaging range is calculated, and one right-eye image is generated for each display screen size.
- a parallax amount for displaying each of a plurality of subjects within the imaging range in a three-dimensional manner is calculated.
- One right-eye image is generated for each subject and for each display screen size. As shown in FIG. 1, a right-eye image having a parallax amount corresponding to the display screen size is generated for each of the subjects OB1, OB2, and OB3.
- FIG. 9 is a table showing the required amount of parallax, and corresponds to the table shown in FIG.
- a subject variable j is introduced to represent the number of main subjects within the imaging range.
- the subject variable j is 1 to 3.
- the number of main subjects may be input by the user, or may be the number of AF evaluation value peak values (maximum values) that are equal to or greater than a predetermined threshold, as will be described later.
- The subjects are a foreground subject OB1 (close to the digital still camera 1), a middle-distance subject OB2 (neither near nor far from the digital still camera 1), and a background subject OB3 (far from the digital still camera 1).
- a required amount of parallax suitable for the display screen size is calculated. The calculated required amount of parallax is stored in the table shown in FIG.
- FIG. 10 is a flowchart showing the processing procedure of the stereoscopic imaging mode, and corresponds to the processing procedure described above. In FIG. 10, the same processes as those already shown are denoted by the same reference numerals.
- When the shutter release button is pressed halfway (YES in step 11), the respective distances to a plurality of main subjects within the imaging range are calculated (step 12A).
- FIG. 11 shows the relationship between the focus lens position and the AF evaluation value representing the high frequency component extracted from the image data obtained by imaging.
- a graph of the relationship shown in FIG. 11 is obtained.
- focus lens positions P1, P2, and P3 corresponding to AF evaluation values AF1, AF2, and AF3 having relatively high AF evaluation values (above a predetermined threshold) are obtained.
- the distances from these positions P1, P2 and P3 to the subjects OB1, OB2 and OB3 are known.
- the subject image captured at the timing of the second stage depression is the first subject image (for the right eye).
- the image data representing the first subject image is recorded on the memory card (step 14).
- the subject variable j and the size variable i are each reset to 1 (steps 26 and 15).
- The required amount of parallax is then calculated (step 16). Initially, since the subject variable j is 1 and the size variable i is 1, the required parallax amount suitable for a display screen size of 3 inches is calculated for the foreground subject OB1 (step 16). The required amount of parallax is calculated, using the measured distance to the subject, from the graph of the relationship shown in FIG. 6 corresponding to the display screen size. If the size variable i has not reached the number of display screen size types (NO in step 17), the size variable i is incremented (step 18), and the required amount of parallax suitable for display at the next display screen size is calculated (step 16).
- When the size variable i equals the number of display screen size types (2, since the display screen sizes are 3 inches and 32 inches) (YES in step 17), it is confirmed whether the subject variable j has reached the number of subjects (step 27). If it has not (NO in step 27), the subject variable j is incremented (step 28), and the required parallax amount is calculated for each display screen size for the next subject.
- the calculated required parallax amount is stored in the table shown in FIG.
- When the digital still camera 1 is moved in the horizontal direction by the user and imaging is repeated, and a subject image having one of the calculated required parallax amounts is captured, the image data representing that subject image is recorded on the memory card.
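A minimal sketch of this nested loop (subject variable j over the main subjects, size variable i over the set display screen sizes) follows; `required_parallax` stands in for the FIG. 6 lookup.

```python
# Illustrative sketch: building the per-subject, per-display-size required-parallax table
# of FIG. 9. Subject distances correspond to the AF peaks P1, P2, P3.
def build_parallax_table(subject_distances_m, display_sizes_inches, required_parallax):
    """Return {subject index j: {display size: required parallax in pixels}}."""
    table = {}
    for j, distance in enumerate(subject_distances_m, start=1):  # subject variable j
        table[j] = {}
        for size in display_sizes_inches:                        # size variable i
            table[j][size] = required_parallax(size, distance)
    return table

# Example: three main subjects and two set display sizes give 3 x 2 = 6 entries,
# matching the six right-eye images described in the text.
# build_parallax_table([1.0, 1.5, 3.0], [3, 32], required_parallax)
```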
- the image data representing the reference left-eye image (first image) and the six types of right-eye images are recorded in the memory card.
- image data representing each of the six types of left-eye images may be recorded on the memory card with the right-eye image as a reference.
- FIG. 12 is a flowchart showing a processing procedure in the stereoscopic imaging mode. In this figure, the same processes as those already shown are denoted by the same reference numerals.
- When the shutter release button is depressed to the first stage (YES in step 11), distances to a plurality of main subjects included in the imaging range are calculated (step 12A).
- FIG. 13 is a table showing distances to a plurality of main subjects.
- a table indicating the distance is generated and stored in the digital still camera 1.
- the distance to the foreground subject OB1 is 1 m
- the distance to the middle background subject OB2 is 1.5 m
- the distance to the background subject OB3 is 3 m.
- a representative distance representing the distance to the representative image is calculated (step 28).
- the representative distance may be an average distance to a plurality of main subjects in the imaging range, a distance to a subject closest to the digital still camera 1, or the like.
- If the average distance is used as the representative distance, the amount of parallax of the foreground subject may become large.
- If the closest distance is used as the representative distance, the amount of parallax can be prevented from becoming too large.
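A small sketch of these two choices of representative distance, using the example distances of FIG. 13, might look as follows.

```python
# Illustrative sketch: two ways of choosing the representative distance (step 28).
def representative_distance(subject_distances_m, strategy="nearest"):
    """Return a single representative distance for the main subjects."""
    if strategy == "average":
        return sum(subject_distances_m) / len(subject_distances_m)
    if strategy == "nearest":
        return min(subject_distances_m)
    raise ValueError("strategy must be 'average' or 'nearest'")

# With the distances of FIG. 13 (1 m, 1.5 m, 3 m):
# representative_distance([1.0, 1.5, 3.0], "average")  -> about 1.83 m
# representative_distance([1.0, 1.5, 3.0], "nearest")  -> 1.0 m
```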
- Alternatively, the subject image obtained by imaging may be displayed on the display screen provided on the back of the digital still camera 1, a desired subject image may be selected from the displayed subject images, and the distance to the selected subject image may be used as the representative distance.
- FIG. 14 is an example of a subject image displayed on the display screen.
- the display screen 2 displays a plurality of subject images OB1, OB2, and OB3 (same as those of the subject).
- the user designates a representative image with the finger F from these subject images OB1, OB2, and OB3.
- FIG. 15 shows another method for selecting a representative image, and shows a rear view of the digital still camera 1.
- a display screen 2 is formed over the entire rear surface of the digital still camera 1.
- a plurality of subject images OB1, OB2, and OB3 are displayed on the display screen 2.
- a movement button 6 is provided on the lower right side of the display screen 2.
- a determination button 7 is provided on the movement button 6. Further, a wide button 8 and a tele button 9 are provided on the determination button 7.
- a cursor 10 is displayed on the display screen 2. The cursor 10 moves on the image displayed on the display screen 2 according to the operation of the movement button 6 by the user's finger F. The movement button 6 is operated so that the cursor 10 is positioned on a desired subject image. When the cursor 10 is positioned on the desired subject image, the determination button 7 is pressed by the user's finger F. Then, the subject image with the cursor 10 becomes the representative image.
- The distance to the representative image selected in this way can be known, in the same manner as shown in FIG. 5, from the position of the focus lens at the peak value of the AF evaluation value, which is the high frequency component extracted from the portion of the image data representing the representative image touched by the finger F or designated by the cursor 10, among the image data obtained by repeating imaging while moving the focus lens position as described above.
- The required parallax amount corresponding to the representative distance is calculated, and image data representing the subject image captured when the required parallax amount is reached is recorded on the memory card. Since only image data representing a subject image having a parallax amount corresponding to the representative distance is recorded on the memory card, image data is not recorded unnecessarily.
- FIG. 16 to FIG. 20 show still another modification.
- In this modification, the required amount of parallax calculated as described above is kept less than or equal to an allowable parallax value. If the amount of parallax is large, the viewer of the stereoscopic image may feel discomfort; since the upper limit of the required amount of parallax is limited, such discomfort can be prevented.
- FIG. 16 is a plan view showing the relationship between the subject and the shooting position.
- The shooting position of the left-eye image is represented by X1, and the shooting position of the right-eye image is represented by X2. It is assumed that there is a first subject OB11 that is relatively close to the photographing positions X1 and X2 and a second subject OB12 that is relatively far from the photographing positions X1 and X2.
- the first subject OB11 and the second subject OB12 are imaged from the photographing position X1, and a left-eye image is obtained. Further, the first subject OB11 and the second subject OB12 are imaged from the photographing position X2, and a right-eye image is obtained.
- FIG. 17A is an example of a left-eye image obtained by imaging
- FIG. 17B is an example of a right-eye image obtained by imaging.
- the left-eye image 30L includes a first subject image 31L representing the first subject OB11 and a second subject image 32L representing the second subject OB12.
- the second subject image 32L is located on the left side of the first subject image 31L.
- the right-eye image 30R also includes a first subject image 31R representing the first subject OB11 and a second subject image 32R representing the second subject OB12.
- The second subject image 32R is located on the right side of the first subject image 31R.
- FIG. 18 shows a stereoscopic image 30 in which a left-eye image and a right-eye image are overlaid.
- the first subject image 31 representing the first subject OB11 is not shifted left and right.
- The second subject image 32L representing the second subject OB12 and the second subject image 32R are shifted from each other by a parallax amount L. If this parallax amount L is too large, the viewer of the stereoscopic image will feel discomfort, as described above.
- FIG. 19 shows the relationship between the amount of parallax and the subject distance.
- a graph G1 representing the amount of parallax in which the subject can be viewed stereoscopically is defined.
- For example, the required amount of parallax of the first subject OB11 obtained from the graph G1 is 40 pixels, while the allowable parallax amount of the subject image of the second subject OB12 is 25 pixels.
- In that case, the parallax amount of the first subject OB11 is limited to the allowable parallax value of 25 pixels of the second subject OB12.
- FIG. 20 is a flowchart showing a necessary parallax amount calculation processing procedure (processing procedure of step 16 in FIGS. 10 and 12).
- the required amount of parallax of the subject is calculated from the distance to the subject using the graph G1 shown in FIG. 19 (step 41). It is confirmed whether or not there is a main subject farther than the subject whose required amount of parallax is calculated (a subject whose AF evaluation value is equal to or greater than a predetermined threshold value) (step 42).
- If there is such a subject (YES in step 42), the parallax allowable value of the farthest main subject farther than the subject whose required amount of parallax is calculated is obtained using the graph G2 (step 43).
- If the required amount of parallax exceeds that value (YES in step 44), the parallax allowable value of the farthest subject is set as the required parallax amount (step 45).
- If there is no main subject farther than the subject whose required parallax amount is to be calculated (NO in step 42), the processing of steps 43 to 45 is skipped. If the required amount of parallax does not exceed the parallax allowable value of the main subject farthest away (NO in step 44), the process of step 45 is skipped. Of course, the same processing may be performed for a main subject closer to the subject for which the required amount of parallax is calculated.
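The clamping logic of FIG. 20 can be sketched as below; `graph_g1` and `graph_g2` are assumed to be supplied as callables mapping distance to parallax in pixels.

```python
# Illustrative sketch of the FIG. 20 procedure: the required parallax from graph G1 is
# clamped to the allowable parallax (graph G2) of the farthest main subject.
def required_parallax_with_limit(subject_distance_m, farther_subject_distances_m,
                                 graph_g1, graph_g2):
    required = graph_g1(subject_distance_m)          # step 41
    if farther_subject_distances_m:                  # step 42: any farther main subject?
        farthest = max(farther_subject_distances_m)
        allowable = graph_g2(farthest)               # step 43
        if required > allowable:                     # step 44
            required = allowable                     # step 45: clamp to allowable value
    return required

# With the example in the text: a required parallax of 40 pixels and an allowable value
# of 25 pixels for the farthest subject give a final required parallax of 25 pixels.
```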
- FIG. 21 shows an example of the file structure of a file storing image data representing the left-eye image and the right-eye image described above.
- the file includes a header recording area 51 and a data recording area 52.
- the header recording area 51 stores information for managing files.
- image data representing a plurality of images is recorded.
- a plurality of recording areas 71 to 78 are formed in the data recording area 52.
- the first recording area 71 and the second recording area 72 are areas for the left eye image.
- the third recording area 73 to the eighth recording area 78 are areas for the right eye image. Needless to say, if the number of right-eye images represented by the right-eye image data stored in the file is large, the number of recording areas is further increased.
- In each recording area, an ancillary information area 62 for storing ancillary information such as image information, an area 63 for recording image data, and an EOI area 64 for storing EOI (end-of-image) data indicating the end of the image data are formed.
- image data representing the left eye image is recorded in the area 63 in which the image data of the first recording area 71 is recorded.
- Image data representing a thumbnail image of the left-eye image represented by the left-eye image data recorded in the first recording area 71 is recorded in the area 63 in which the image data in the second recording area 72 is recorded.
- In this way, the odd-numbered recording areas store image data representing the left-eye or right-eye images obtained by imaging, and the even-numbered recording areas store image data representing the thumbnail images of those left-eye or right-eye images.
- the third recording area 73 to the eighth recording area 78 are the same as the first recording area 71 and the second recording area 72 except that image data of the right-eye image is recorded.
- Image data of the right-eye images is recorded in the recording areas for the right-eye image.
- Needless to say, data indicating the display screen size and the position of the main subject is also recorded in the attached information.
- the image data representing the left eye image and the image data representing the plurality of right eye images obtained as described above are stored in a file and recorded on a memory card.
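The sketch below is a simplified in-memory model of such a file, not the actual on-disk format: a header, then pairs of full-image and thumbnail recording areas, each carrying attached information such as the display screen size.

```python
# Illustrative sketch of the file layout of FIG. 21: header area plus recording areas
# 71, 72, 73, ... holding the left-eye image, its thumbnail, and right-eye images.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class RecordingArea:
    attached_info: Dict[str, object]   # e.g. {"role": "right", "display_inches": 32}
    image_data: bytes                  # compressed image data (area 63)

@dataclass
class StereoFile:
    header: Dict[str, object] = field(default_factory=dict)   # file management info
    areas: List[RecordingArea] = field(default_factory=list)  # areas 71, 72, 73, ...

def build_stereo_file(left_eye, left_thumb, right_eyes):
    """right_eyes: list of (display_inches, image_bytes, thumbnail_bytes)."""
    f = StereoFile(header={"num_images": 1 + len(right_eyes)})
    f.areas.append(RecordingArea({"role": "left"}, left_eye))                  # area 71
    f.areas.append(RecordingArea({"role": "left_thumbnail"}, left_thumb))      # area 72
    for inches, img, thumb in right_eyes:                                      # areas 73+
        f.areas.append(RecordingArea({"role": "right", "display_inches": inches}, img))
        f.areas.append(RecordingArea({"role": "right_thumbnail",
                                      "display_inches": inches}, thumb))
    return f
```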
- FIG. 22 is a block diagram showing an electrical configuration of a digital still camera in which the above-described imaging is performed.
- The digital still camera has a stereoscopic imaging mode for generating parallax images, an imaging mode for performing normal two-dimensional imaging, a two-dimensional playback mode for performing two-dimensional playback, a stereoscopic playback mode for displaying stereoscopic images, and a setting mode.
- An operation device 81 including various buttons, such as a mode setting button for setting these modes and a two-stroke shutter release button, is provided. An operation signal output from the operation device 81 is input to the CPU 80.
- the digital still camera includes one image sensor (CCD, CMOS, etc.) 88 that images a subject and outputs an analog video signal representing the subject.
- a focus lens 84, a diaphragm 85, an infrared cut filter 86, and an optical low pass filter 87 are provided in front of the image sensor 88.
- the lens position of the focus lens 84 is controlled by the lens driving device 89.
- a diaphragm amount of the diaphragm 85 is controlled by a diaphragm driving device 90.
- the image sensor 88 is controlled by an image sensor driving device 91.
- When the stereoscopic imaging mode is set, the subject is periodically imaged by the image sensor 88. A video signal representing the subject image is periodically output from the image sensor 88. The video signal output from the image sensor 88 is subjected to predetermined analog signal processing in the analog signal processing device 92 and converted into digital image data in the analog/digital conversion device 96. The digital image data is input to the digital signal processing device 96. The digital signal processing device 96 performs predetermined digital signal processing on the digital image data. Digital image data output from the digital signal processing device is given to the display device 102 via the display control device 101. An image obtained by imaging is displayed as a moving image on the display screen of the display device 102 (through-image display).
- When the shutter release button is depressed to the first stage, the subject is imaged while the focus lens 84 is moved as described above.
- In the subject distance acquisition device 103, a high frequency component is extracted from the image data obtained by imaging, and the distance to the subject is calculated from the peak value of the high frequency component and the amount of movement of the focus lens. The image data is also input to the integrating device 98, and the subject is photometrically measured. Based on the obtained photometric value, the aperture value of the diaphragm 85 and the shutter speed (electronic shutter) of the image sensor 88 are determined.
- The image data captured at the timing of the second-stage depression of the shutter release button represents the left-eye image.
- Image data representing the left-eye image is given to the main memory 95 under the control of the memory control device 94 and temporarily stored.
- Image data is read from the main memory 95 and compressed by the compression / decompression processor 97.
- the compressed image data is recorded on the memory card 100 by the memory controller 99.
- Data representing the distance to the main subject (may be a distance to one subject existing in the center of the imaging range) acquired by the subject distance acquisition device 103 is input to the required parallax amount calculation device 105.
- the required parallax amount is calculated as described above.
- Data representing the distance to the main subject is also given to the representative distance calculation device 104, and the distance to the representative subject is calculated by the representative distance calculation device 104. When a subject is selected by the user as described above, the distance to the selected subject is calculated as the representative distance.
- After the image data representing the left-eye image is recorded on the memory card 100, the digital still camera itself is moved in the horizontal direction (to the right) by the user.
- the subject is continuously imaged even while the camera is moving, and the subject image is continuously obtained.
- Image data obtained by continuous imaging is input to the through image parallax amount calculation device 106.
- In the through image parallax amount calculation device 106, it is confirmed whether or not the parallax amount of the input subject image is equal to the calculated necessary parallax amount. If they are equal, the image data representing the input subject image is recorded on the memory card 100 as right-eye image data. As described above, the image data representing the right-eye image is recorded on the memory card 100 so as to have a parallax amount corresponding to the display screen size.
- the digital still camera includes a light emitting device 82 and a light receiving device 83.
- When the stereoscopic playback mode is set, the left-eye image data recorded on the memory card 100 is read, and if right-eye image data corresponding to the display screen size of the display device 102 is recorded, that right-eye image data is also read. The read left-eye image data and right-eye image data are decompressed by the compression/decompression processor 97. The decompressed left-eye image data and right-eye image data are given to the display device 102, whereby a stereoscopic image is displayed. When right-eye image data corresponding to the display screen size of the display device 102 is not recorded on the memory card 100, the right-eye image data that is recorded on the memory card 100 is read and displayed on the display device 102; in this case, parallax adjustment between the left-eye image and the right-eye image may be performed so that the amount of parallax is suitable for the screen size.
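Reusing the file model sketched earlier, the size-dependent selection during playback might be expressed as follows; the fallback branch corresponds to the case where no right-eye image matches the display size.

```python
# Illustrative sketch: choose the right-eye recording area matching the display size,
# falling back to any recorded right-eye image as described in the text.
def select_right_eye_area(stereo_file, display_inches):
    right_areas = [a for a in stereo_file.areas
                   if a.attached_info.get("role") == "right"]
    for area in right_areas:
        if area.attached_info.get("display_inches") == display_inches:
            return area                                   # exact match for this display
    return right_areas[0] if right_areas else None        # fallback (parallax adjustment
                                                          # would then be applied)
```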
- In the following embodiment, the required amount of parallax is determined based on the distance (inter-object distance, distance information) between the object closest to the digital still camera (stereoscopic image capturing apparatus) 1 and the object farthest from the digital still camera 1, among the objects whose AF evaluation values are equal to or greater than a threshold value.
- FIG. 23 is a plan view showing the relationship between the digital still camera 1A provided with a single imaging device and a plurality of objects included in the imaging range.
- the first object OB10 is closest to the digital still camera 1, and the second object OB20 is next closest to the digital still camera 1.
- the third object OB30 is farthest from the digital still camera 1.
- In this case, the first object OB10 is the nearest object, and the third object OB30 is the farthest object.
- The inter-object distance between the nearest object and the farthest object is then a comparatively short distance L1.
- If, however, the first object OB10 is located at a position indicated by reference sign L12, which is closer to the digital still camera 1 than the position indicated by reference sign L11, and the third object OB30 is located at a position farther from the digital still camera 1 than the position indicated by reference sign L31, the inter-object distance becomes comparatively long.
- When the inter-object distance is short, even if the left-eye image and the right-eye image are obtained as described above, the relative parallax between the main subject (the second object OB20, which is neither the nearest object nor the farthest object) and the nearest or farthest object is small. For this reason, in this embodiment, when the inter-object distance is short, the required amount of parallax between the right-eye image and the left-eye image for forming the stereoscopic image is increased. Conversely, when the inter-object distance is long, the relative parallax between the main subject and the nearest or farthest object increases, so the required amount of parallax between the right-eye image and the left-eye image for forming the stereoscopic image is reduced.
- the digital still camera 1A is positioned at the reference position PL11, and the objects OB10, OB20, and OB30 included in the imaging range are continuously imaged.
- the objects OB10, OB20, and OB30 are detected from the object detection subject image that is one of the continuously captured subject images.
- image data representing the subject images of the objects OB10, OB20, and OB30 captured at the timing when the recording command is given is recorded as one frame image (first subject image).
- the subject image obtained by imaging at the reference position PL11 becomes the left-eye image (may be the right-eye image).
- A parallax amount d11 suitable for displaying a stereoscopic image on a display screen of a predetermined size is determined according to the inter-object distance.
- the user moves the digital still camera 1A in the right direction while continuously (periodically) imaging the objects OB10, OB20, and OB30.
- the subjects OB10, OB20, and OB30 are imaged even while the digital still camera 1 is moving in the right direction.
- When the digital still camera 1A reaches the position PR11 and the parallax of the subject image obtained by imaging becomes the parallax amount d11 determined as described later, the subject image obtained by imaging becomes the right-eye image (second subject image) displayed on the display screen of the predetermined size, and is recorded as image data representing the right-eye image.
- A parallax amount suitable for a display screen of another size is also determined based on the inter-object distance, and when a subject image having the determined parallax amount is captured, image data representing that subject image is also recorded.
- the amount of parallax may be determined based on the distance between objects regardless of the size of the display screen.
- the digital still camera 1 may be provided with setting means for setting the size of a display screen for displaying a stereoscopic image. In that case, the amount of parallax is determined from the size of the display screen set by the setting means and the distance between the objects. Needless to say, a table representing the relationship among the size of the display screen, the distance between the objects, and the amount of parallax is determined in advance, and the amount of parallax is determined using such a table.
- FIG. 24 shows the relationship between the required amount of parallax and the distance between objects.
- the relationship between the required amount of parallax and the distance between objects is determined in advance for each display screen size for displaying a stereoscopic image.
- the example shown in FIG. 24 shows the relationship between the required amount of parallax in pixel units and the distance between objects when a stereoscopic image is displayed on a 3-inch display screen. For example, if the distance between objects is 0.3 m, the required amount of parallax is 40 pixels.
- FIG. 25 is a table showing the relationship between the required amount of parallax in pixel units and the distance between objects.
- This table is for a display screen size of 3 inches; a required amount of parallax is defined for each inter-object distance. Such a table is determined for each display screen size.
- Using such a table, the required amount of parallax corresponding to the inter-object distance is determined.
- the required amount of parallax may be determined according to only the distance between objects without considering the display screen size.
- FIG. 26 is a flowchart showing a part of the processing procedure of the stereoscopic imaging mode for recording the left-eye image and the right-eye image for stereoscopic display using the digital still camera 1 having one imaging device.
- FIG. 26 corresponds to FIG. 3, and the same processes as those in FIG. 3 are denoted by the same reference numerals and description thereof is omitted as necessary.
- The shutter release button is pressed halfway (step 11). Then, all objects satisfying a predetermined condition are detected from the subject image captured at the half-press timing (the object-detection subject image; it is not necessarily limited to the image captured at the half-press timing and may be any one of the continuously captured subject images) (step 29). An inter-object distance representing the distance between the nearest object and the farthest object is calculated from the detected objects (step 12A).
- the distance between objects can be calculated as follows.
- The focus lens can be moved by a predetermined distance between the NEAR position that is closest to the image sensor and the FAR position that is farthest from the image sensor.
- An object is imaged for each moving position, and high frequency components are extracted from the image data obtained for each imaging. From this high frequency component, an AF evaluation value representing the degree of focusing is obtained for each moving position of the focus lens.
- The focus lens positions P1, P2, and P3 (the movement amounts of the focus lens) that give maxima of the AF evaluation value curve exceeding the threshold correspond to the distances to the objects OB10, OB20, and OB30, respectively.
- the distance between the object OB10 and the object OB30 detected in this manner is the distance between the objects. It goes without saying that the distances of the objects OB10, OB20 and OB30 can be determined from the amount of movement of the focus lens.
- the detection of the object itself can also be realized as described above.
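A sketch of this step: find the AF-evaluation maxima above the threshold during the focus sweep, convert the corresponding lens positions to distances, and take the nearest-to-farthest difference as the inter-object distance. The calibration function is again an assumption.

```python
# Illustrative sketch of step 12A: inter-object distance from the AF peaks P1, P2, P3.
import numpy as np

def inter_object_distance(lens_positions, af_values, threshold, lens_pos_to_distance_m):
    af = np.asarray(af_values, dtype=float)
    # local maxima of the AF evaluation curve that exceed the threshold
    peaks = [i for i in range(1, len(af) - 1)
             if af[i] >= threshold and af[i] >= af[i - 1] and af[i] >= af[i + 1]]
    if len(peaks) < 2:
        return None   # only one object detected: a predetermined parallax amount is used
    distances = [lens_pos_to_distance_m(lens_positions[i]) for i in peaks]
    return max(distances) - min(distances)   # farthest minus nearest object distance
```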
- the subject image captured at the timing when the shutter release button is fully pressed becomes the first subject image and is recorded on the memory card (Ste 14).
- The size variable i is reset to 1 (step 15), and the required parallax amount is determined from the table (see FIG. 25) corresponding to the size of the display screen determined by the size variable i (step 16).
- In the above, the inter-object distance between the nearest object and the farthest object is calculated when the shutter release button is pressed halfway, and the first subject image is recorded on the memory card when the shutter release button is fully pressed. Alternatively, when the shutter release button is fully pressed, the first subject image may be recorded on the memory card, all objects satisfying a predetermined condition (for example, a human face image or an object having a spatial frequency equal to or higher than a threshold value) may be detected from the first subject image, the distance between the object closest to the digital still camera (imaging device) 1 and the object farthest from it among the detected objects may be calculated, and the amount of parallax may be determined from the calculated distance.
- The required amount of parallax corresponding to the size of the display screen and the inter-object distance is determined while the size variable i is incremented (step 18) until the size variable i equals the number of display screen size types (step 17).
- the first image is a left-eye image (right-eye image) constituting a stereoscopic image
- the second image is a right-eye image (or left-eye image) constituting a stereoscopic image.
- In the above, the case where the inter-object distance between the nearest object and the farthest object can be calculated has been described. However, when only one object is detected from the imaging range, the inter-object distance cannot be calculated. In such a case, a predetermined required amount of parallax (preferably a predetermined amount of parallax determined in advance corresponding to the display screen) is used.
- FIG. 27 is an example of a file structure of a file that stores image data representing the left-eye image and the right-eye image obtained by the above-described embodiment.
- FIG. 27 corresponds to FIG. 21, and the same components as those shown in FIG. 21 are denoted by the same reference numerals.
- the image data representing the left eye image is stored in the image data recording area 63 of the first recording area 71.
- Image data representing a thumbnail image of the left-eye image is stored in the image data recording area 63 of the second recording area 72.
- In the third recording area 73, the fifth recording area 75, and the seventh recording area 77, image data representing right-eye images having required parallax amounts corresponding to the inter-object distance and the display screen size is recorded. Thumbnail image data is recorded in the fourth recording area 74, the sixth recording area 76, and the eighth recording area 78.
- the image data representing the left-eye image and the right-eye image for a plurality of frames are stored in one file, and the file is recorded on the memory card.
- FIG. 28 (A) is an example of a left-eye image recorded according to this embodiment
- FIG. 28 (B) is an example of a right-eye image recorded according to this embodiment.
- The left-eye image 140L includes a first object image 110L representing the first object OB10, a second object image 120L representing the second object OB20, and a third object image 130L representing the third object OB30.
- Similarly, the right-eye image 140R includes a first object image 110R representing the first object OB10, a second object image 120R representing the second object OB20, and a third object image 130R representing the third object OB30.
- FIG. 29 shows a stereoscopic image 140 in which the left-eye image 140L shown in FIG. 28 (A) and the right-eye image 140R shown in FIG. 28 (B) are superimposed.
- FIG. 30 is a block diagram showing the electrical configuration of the digital still camera according to this embodiment.
- FIG. 30 corresponds to the block diagram shown in FIG. 22, and the same components as those shown in FIG.
- In the digital still camera shown in FIG. 30, an inter-object distance calculation device 104A is provided.
- the subject distance acquisition device 103 calculates the distances to a plurality of objects.
- Data representing the calculated distances to the respective objects is input from the subject distance acquisition device 103 to the inter-object distance calculation device 104A, and the inter-object distance is calculated by the inter-object distance calculation device 104A from the input data.
- the required amount of parallax is determined as described above.
- the subject image having the determined required amount of parallax is recorded on the memory card 100 as described above.
- In the stereoscopic playback mode, of the image data representing the left-eye image and the right-eye images recorded on the memory card 100 as described above, the image data corresponding to the size of the display screen of the display device 102 is read. The read image data is given to the display control device 101, and a stereoscopic image is displayed on the display screen of the display device 102.
- The amount of parallax may be determined based on the size of the display screen for displaying a stereoscopic image and the distance information between the nearest object and the farthest object. Also, as shown in FIG. 2, the size of the display screen may be set, and the amount of parallax may be determined based on the set size of the display screen and the distance information between the nearest object and the farthest object. Further, the image data representing the first subject image and the image data representing the second subject image recorded on the memory card 100 may be read, and the first subject image and the second subject image represented by the read image data may be displayed on the display screen of the display device while being shifted in the horizontal direction by the determined amount of parallax.
- FIG. 31 to FIG. 40 show modified examples.
- These show the process of detecting the objects from which the nearest object and the farthest object are selected (the process corresponding to step 29 in FIG. 26).
- The above-mentioned nearest object and farthest object are determined from the objects detected by these processes.
- FIG. 31 is a flowchart showing a processing procedure for determining the type of object.
- FIG. 32 is an example of the object detection image 160 obtained by imaging.
- The subject is different from that in FIG. 23, but needless to say, it may be the same as the subject in FIG. 23.
- The object detection image 160 includes a road image 162 in the foreground, with an automobile image 161 on the road image 162. There is a person image 163 at the center, and tree images 164 and 165 to the left and right of the person image 163. At the upper left of the object detection image 160 is a cloud image 166, and the upper part of the object detection image 160 is a sky image 167.
- The color of each pixel constituting the object detection subject image 160 is detected, and the image is divided into regions by color (step 151). It is not necessary to divide into regions for every full-color value; it is sufficient to divide into regions of colors that can be considered to represent similar objects (for example, about 32 or 64 colors).
- Feature values are extracted for each of the regions divided by color (step 152). These feature values are predetermined and include the color representing a divided region, its contrast, brightness, position in the object detection subject image 160, and the like. Because the same object may be split across different regions by this division, nearby regions whose feature values are similar are grouped into one region (step 153).
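A minimal sketch of this kind of colour-based segmentation is shown below, assuming an H x W x 3 image array: each pixel is quantised to a coarse palette (64 colours here), connected areas of the same quantised colour are labelled, and simple per-region features are extracted; grouping of similar neighbouring regions is only indicated in the closing comment. The 64-colour choice, the feature set, and the use of scipy's connected-component labelling are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def quantize_colors(rgb, levels=4):
    """Quantise an H x W x 3 uint8 image to levels**3 colours (4 -> 64 colours)."""
    step = 256 // levels
    return (rgb // step).astype(np.int32)

def segment_by_color(rgb, levels=4):
    """Label connected areas that share the same quantised colour.
    Returns an H x W label map with labels 0..n_regions-1."""
    q = quantize_colors(rgb, levels)
    codes = q[..., 0] * levels * levels + q[..., 1] * levels + q[..., 2]
    labels = np.zeros(codes.shape, dtype=np.int32)
    next_label = 0
    for code in np.unique(codes):
        comp, n = ndimage.label(codes == code)
        labels[comp > 0] = comp[comp > 0] + next_label
        next_label += n
    return labels - 1  # every pixel was assigned a label, so shift labels to start at 0

def region_features(rgb, labels):
    """Per-region features: mean colour, brightness, centroid and area (cf. step 152)."""
    feats = {}
    for lab in np.unique(labels):
        mask = labels == lab
        ys, xs = np.nonzero(mask)
        mean_rgb = rgb[mask].mean(axis=0)
        feats[int(lab)] = {
            "mean_rgb": mean_rgb,
            "brightness": float(mean_rgb.mean()),
            "centroid": (float(ys.mean()), float(xs.mean())),
            "area": int(mask.sum()),
        }
    return feats

# Step 153 would then merge neighbouring regions whose features are close,
# e.g. by comparing the distance between their mean_rgb values to a threshold.
```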
- When the object detection subject image 160 has been divided into a plurality of regions in this way, it is determined with reference to a learning database what kind of object each divided region represents (step 154).
- The learning database stores, in advance, feature values such as the color, contrast, brightness, and position of imaged objects in association with the types of those objects. From the feature values of a divided region, it can therefore be determined what kind of object the region represents.
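One way to picture the learning-database lookup is a nearest-neighbour match between a region's feature vector and pre-stored exemplars. The feature encoding, the exemplar values, and the Euclidean metric below are assumptions made only to illustrate the idea; they are not the database actually used.

```python
import numpy as np

# Hypothetical learning database entries: feature vector -> object type.
# Features here are (mean R, mean G, mean B, brightness, relative vertical position).
LEARNING_DB = [
    (np.array([ 70.0, 130.0, 200.0, 133.0, 0.15]), "sky"),
    (np.array([120.0, 120.0, 120.0, 120.0, 0.85]), "road"),
    (np.array([ 40.0, 110.0,  50.0,  67.0, 0.55]), "tree"),
    (np.array([200.0, 170.0, 150.0, 173.0, 0.50]), "person"),
    (np.array([180.0,  30.0,  30.0,  80.0, 0.70]), "car"),
]

def classify_region(feature_vec):
    """Return the object type whose stored exemplar is closest in Euclidean distance."""
    dists = [np.linalg.norm(feature_vec - exemplar) for exemplar, _ in LEARNING_DB]
    return LEARNING_DB[int(np.argmin(dists))][1]

# A bright bluish region near the top of the frame is judged to be "sky".
print(classify_region(np.array([75.0, 128.0, 205.0, 136.0, 0.1])))
```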
- FIG. 33 shows the types of the determined objects.
- The object detection subject image 160 is divided into a plurality of regions 171-177. As the types of objects, the region 171 represents a car, the region 172 a road, the region 173 a person, the regions 174 and 175 trees, the region 176 a cloud, and the region 177 the sky.
- FIG. 34 is a flowchart showing a process (corresponding to step 29 in FIG. 26) for detecting the objects from which the nearest object and the farthest object are to be selected, using the types of objects determined as described above.
- After the type of an object is determined (step 181), it is determined whether or not the determined type is a predetermined type of object (step 182). If the object is of a predetermined type (YES in step 182), it is detected as an object. Among the detected objects, the nearest object and the farthest object are determined as described above, and the inter-object distance between the nearest object and the farthest object is determined as described above.
- Typically, the objects one wants to show stereoscopically are people, cars, trees, buildings, and the like, while the objects one does not want to show stereoscopically are the sky, roads, the sea, and the like. What is or is not to be shown stereoscopically can be decided freely; for example, one may instead want to show the sky, roads, or the sea stereoscopically, or not want to show people, cars, trees, or buildings stereoscopically.
- The types of objects to be displayed stereoscopically (for example, people, automobiles, trees, and buildings) are determined in advance, and it is determined whether or not the determined type of an object is one of the types desired to be displayed stereoscopically.
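The selection step of FIG. 34 then amounts to a simple membership test. The sketch below keeps only regions whose determined type is on a predefined list of types to be shown stereoscopically; the list contents follow the examples in the text and are otherwise arbitrary.

```python
STEREO_TARGET_TYPES = {"person", "car", "tree", "building"}

def detect_objects_by_inclusion(typed_regions):
    """typed_regions: list of (region_id, object_type, distance_m) tuples.
    Keep only the regions whose type is one we want to render stereoscopically."""
    return [r for r in typed_regions if r[1] in STEREO_TARGET_TYPES]

regions = [(1, "car", 5.0), (2, "road", 2.0), (3, "person", 6.5),
           (4, "tree", 8.0), (5, "cloud", 1000.0), (6, "sky", float("inf"))]
kept = detect_objects_by_inclusion(regions)
nearest = min(kept, key=lambda r: r[2])    # (1, "car", 5.0)
farthest = max(kept, key=lambda r: r[2])   # (4, "tree", 8.0)
```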
- FIG. 35 is another flowchart showing the object detection processing procedure.
- The processing procedure shown in FIG. 34 detects objects of predetermined types as the objects.
- In the procedure of FIG. 35, by contrast, exclusion target types are determined in advance, and objects not corresponding to the exclusion target types are detected as the objects.
- The type of the object is determined (step 181). Then, it is determined whether or not the determined type is a predetermined exclusion target type (for example, road, sky, cloud, or sea) (step 184). If the determined type is not an exclusion target (NO in step 184), the object is detected as an object (step 183). If the determined type is an exclusion target (YES in step 184), the object is not detected as an object.
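The exclusion variant of FIG. 35 only flips that test: types on a reject list are dropped and everything else is kept. Again, the list contents simply mirror the examples given in the text.

```python
EXCLUDED_TYPES = {"road", "sky", "cloud", "sea"}

def detect_objects_by_exclusion(typed_regions):
    """Keep every (region_id, object_type, distance_m) tuple whose type is not excluded."""
    return [r for r in typed_regions if r[1] not in EXCLUDED_TYPES]

regions = [(1, "car", 5.0), (2, "road", 2.0), (5, "cloud", 1000.0)]
print(detect_objects_by_exclusion(regions))   # [(1, 'car', 5.0)]
```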
- FIG. 36 is another flowchart showing the object detection processing procedure.
- In this processing procedure, objects closer than a first threshold and objects farther than a second threshold are excluded, and the remaining objects are detected as the objects.
- the distance to the object for which the type has been determined is calculated (step 191).
- The distance to an object can be calculated from the AF evaluation value, obtained by extracting a high-frequency component from the image data captured while moving the focus lens 84, using a graph showing the relationship between the AF evaluation value and the lens position of the focus lens 84. When the image is divided into regions as shown in FIG. 33, an AF evaluation value is obtained by extracting a high-frequency component from the image data corresponding to each region, and the relationship between the obtained AF evaluation values and the lens position of the focus lens 84 is determined. From the lens position of the focus lens 84 that gives the maximum AF evaluation value, the distance to the object represented by that region is known.
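This passage can be condensed into a small sketch: for each region, compute a high-frequency (sharpness) measure at every focus-lens position, take the lens position that maximises it as the in-focus position, and convert that position to a distance. The Laplacian-based measure, the use of the region's bounding box, and the lens-position-to-distance table are stand-ins for whatever the camera actually uses.

```python
import numpy as np

def af_evaluation(gray_region):
    """Simple high-frequency measure: sum of the absolute 4-neighbour Laplacian."""
    g = gray_region.astype(np.float32)
    lap = (-4 * g[1:-1, 1:-1] + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(np.abs(lap).sum())

def distance_from_af_sweep(frames_by_lens_pos, region_mask, lens_pos_to_distance):
    """frames_by_lens_pos: {lens_position: H x W grayscale frame} captured while
    sweeping the focus lens. Returns the distance associated with the lens position
    that maximises the AF evaluation value inside the region's bounding box."""
    ys, xs = np.nonzero(region_mask)
    y0, y1, x0, x1 = ys.min(), ys.max() + 1, xs.min(), xs.max() + 1
    best_pos, best_val = None, -1.0
    for pos, frame in frames_by_lens_pos.items():
        val = af_evaluation(frame[y0:y1, x0:x1])
        if val > best_val:
            best_pos, best_val = pos, val
    return lens_pos_to_distance[best_pos]

# Hypothetical calibration table: focus-lens step index -> subject distance in metres.
lens_pos_to_distance = {0: 0.5, 1: 1.0, 2: 2.0, 3: 5.0, 4: 10.0}
```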
- FIG. 37 shows the relationship between the AF evaluation value obtained from the region 171 representing the car shown in FIG. 33 and the lens position of the focus lens 84. The peak value of the AF evaluation value is AF11, and the lens position of the focus lens 84 at the peak value AF11 is P11. From the lens position P11, the distance to the car is known.
- FIG. 38 shows the relationship between the AF evaluation value obtained from the region 173 representing the person shown in FIG. 33 and the lens position of the focus lens 84. The peak value of the AF evaluation value is AF13, and the lens position of the focus lens 84 at the peak value AF13 is P13. The distance to the person can be determined from how far the lens position P13 is from the home position of the focus lens 84.
- the nearest object and the farthest object are found as described above.
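The distance gating of FIG. 36 can then be expressed as one more filter on top of the type decision: drop objects closer than the first threshold or farther than the second, and take the nearest and farthest of what remains. The threshold values below are placeholders.

```python
def filter_by_distance(objects, first_threshold_m=1.0, second_threshold_m=50.0):
    """objects: list of (name, distance_m). Exclude anything at or inside the first
    threshold and anything at or beyond the larger second threshold."""
    return [o for o in objects if first_threshold_m < o[1] < second_threshold_m]

objects = [("cloud", 3000.0), ("car", 4.0), ("person", 6.0), ("leaf near lens", 0.2)]
kept = filter_by_distance(objects)
nearest = min(kept, key=lambda o: o[1])    # ("car", 4.0)
farthest = max(kept, key=lambda o: o[1])   # ("person", 6.0)
```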
- FIGS. 39 and 40 show yet another object detection procedure. FIG. 39 is a flowchart showing the processing procedure, and FIG. 40 shows the object detection subject image 160 displayed on the display screen 2.
- the object detection subject image 160 is displayed on the display screen 2 (step 201).
- a touch panel is formed on the surface of the display screen 2, and a desired object is touched by the user from the displayed object detection subject image 160 (step 202).
- an object detection subject image 160 is displayed on display screen 2.
- The object detection subject image 160 includes the car image 161, the road image 162, the person image 163, the tree images 164 and 165, the cloud image 166, and the sky image 167, as described above.
- Among these images, the user touches with a finger F the image portions of the desired objects to be treated as the objects. For example, the car image 161, the person image 163, and the tree images 164 and 165 are touched with the finger F.
- The object represented by each touched image portion is detected (step 203 in FIG. 39). The nearest object and the farthest object are then found from among the touched objects.
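Touch selection reduces to a lookup in a label map like the one produced by the earlier segmentation sketch: the label under each touched pixel identifies an object, and the nearest and farthest of the touched objects are then chosen. The toy label map and touch coordinates below are assumptions for illustration.

```python
import numpy as np

def objects_from_touches(label_map, touches):
    """label_map: H x W array of region labels; touches: list of (x, y) coordinates
    already converted from the touch panel to image coordinates.
    Returns the set of region labels that were touched."""
    selected = set()
    h, w = label_map.shape
    for x, y in touches:
        if 0 <= x < w and 0 <= y < h:
            selected.add(int(label_map[y, x]))
    return selected

# Toy 4 x 6 label map: 0 = background, 1 = car region, 2 = person region.
label_map = np.array([[0, 0, 0, 0, 0, 0],
                      [1, 1, 0, 2, 2, 0],
                      [1, 1, 0, 2, 2, 0],
                      [0, 0, 0, 0, 0, 0]])
print(objects_from_touches(label_map, [(0, 1), (4, 2)]))   # {1, 2}
```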
- The type of each object determined in the object type determination process may be displayed near the corresponding object in the object detection subject image 160, so that the user can touch the intended object while referring to it.
- Alternatively, the image may be divided into regions for each object, and the type of each detected object may be displayed in those regions on the display screen 2. In this case as well, the user can understand at a glance which object is being touched.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Stereoscopic And Panoramic Photography (AREA)
- Studio Devices (AREA)
Abstract
Description
80 CPU
88 image sensor (imaging means)
100 memory card
103 subject distance acquisition device
104A inter-object distance calculation device
105 required parallax amount calculation device
106 through-image parallax amount calculation device
Claims (12)
- An imaging apparatus comprising:
imaging means for continuously imaging a subject included in an imaging range and continuously outputting the captured image data;
first recording control means for recording, in response to a recording command being given, the image data obtained by imaging at the timing at which the recording command was given in recording means as image data representing a first subject image;
object detection means for detecting all objects satisfying a predetermined condition from an object detection subject image among the subject images represented by the image data continuously output from the imaging means;
first distance information calculation means for calculating distance information between the object nearest to the imaging apparatus and the object farthest from the imaging apparatus among the plurality of objects detected by the object detection means;
parallax amount determination means for determining an amount of parallax on the basis of the distance information calculated by the first distance information calculation means; and
second recording control means for recording, in response to the horizontal shift between the first subject image and the subject image represented by the image data continuously output from the imaging means becoming equal to the amount of parallax determined by the parallax amount determination means as the imaging apparatus is shifted in the horizontal direction, the image data captured at the timing at which they became equal in the recording means as image data representing a second subject image, in association with the image data representing the first subject image.
- The imaging apparatus according to claim 1, further comprising second distance information calculation means for measuring distance information from the imaging apparatus to each of the plurality of objects included in the imaging range,
wherein the first distance information calculation means calculates the distance information between the nearest object and the farthest object from the distance information to the nearest object and the distance information to the farthest object calculated by the second distance information calculation means.
- The imaging apparatus according to claim 2, wherein the imaging means includes an image sensor and a focus lens,
the apparatus further comprises AF evaluation value calculation means for calculating, from the image data obtained by imaging at each movement position while moving the focus lens, an AF evaluation value representing the degree of focus at that movement position, and
the second distance information calculation means measures the distance to each of the plurality of objects on the basis of the position of the focus lens at which an AF evaluation value calculated by the AF evaluation value calculation means equal to or greater than a threshold value was obtained.
- The imaging apparatus according to any one of claims 1 to 3, wherein the parallax amount determination means determines a predetermined value as the amount of parallax when the second distance information calculation means was able to measure only the distance to one of the plurality of objects detected by the object detection means.
- The imaging apparatus according to any one of claims 1 to 3, wherein the parallax amount determination means determines the amount of parallax on the basis of the size of a display screen on which a stereoscopic image is displayed and the distance information calculated by the first distance information calculation means.
- The imaging apparatus according to any one of claims 1 to 4, further comprising setting means for setting the size of the display screen on which the stereoscopic image is displayed,
wherein the parallax amount determination means determines the amount of parallax on the basis of the size of the display screen set by the setting means and the distance information calculated by the first distance information calculation means, and
the second recording control means repeats, for each of a plurality of amounts of parallax, a process of recording, in response to the horizontal shift between the first subject image and the subject image represented by the image data continuously output from the imaging means becoming equal to any one of the plurality of amounts of parallax determined by the parallax amount determination means as the imaging apparatus is shifted in the horizontal direction, the image data captured at the timing at which they became equal on the recording medium as image data representing a second subject image, in association with the image data representing the first subject image.
- The imaging apparatus according to any one of claims 1 to 6, further comprising:
reading means for reading, in response to a stereoscopic reproduction command, the image data representing the first subject image and the image data representing the second subject image recorded in the recording means from the recording means; and
display control means for controlling a display device so that the first subject image and the second subject image represented by the image data read by the reading means are displayed while being shifted in the horizontal direction by the amount of parallax determined by the parallax amount determination means.
- The imaging apparatus according to any one of claims 1 to 7, further comprising object type determination means for determining the types of the objects included in the object detection subject image,
wherein the object detection means detects, among the object types determined by the object type determination means, objects of predetermined types.
- The imaging apparatus according to any one of claims 1 to 7, further comprising object type determination means for determining the types of the objects included in the object detection subject image,
wherein the object detection means detects, among the object types determined by the object type determination means, objects of types other than predetermined types.
- The imaging apparatus according to claim 8 or 9, further comprising distance calculation means for calculating the distance to each object whose type has been determined by the object type determination means,
wherein the object detection means detects, among the objects of the types determined by the object type determination means, the objects excluding objects whose distance calculated by the distance calculation means is equal to or less than a first threshold value and objects whose distance is equal to or greater than a second threshold value larger than the first threshold value.
- The imaging apparatus according to any one of claims 1 to 7, further comprising:
a display device that displays the first subject image on a display screen; and
a touch panel formed on the display screen,
wherein the object detection means detects the object displayed at the position at which the touch panel was touched.
- An operation control method for an imaging apparatus, wherein
imaging means continuously images a subject included in an imaging range and continuously outputs the captured image data,
first recording control means records, in response to a recording command being given, the image data obtained by imaging at the timing at which the recording command was given in recording means as image data representing a first subject image,
object detection means detects all objects satisfying a predetermined condition from an object detection subject image among the subject images represented by the image data continuously output from the imaging means,
distance information calculation means calculates distance information between the object nearest to the imaging apparatus and the object farthest from the imaging apparatus among the plurality of objects detected by the object detection means,
parallax amount determination means determines an amount of parallax on the basis of the distance information calculated by the first distance information calculation means, and
second recording control means records, in response to the horizontal shift between the first subject image and the subject image represented by the image data continuously output from the imaging means becoming equal to the amount of parallax determined by the parallax amount determination means as the imaging apparatus is shifted in the horizontal direction, the image data captured at the timing at which they became equal in the recording means as image data representing a second subject image, in association with the image data representing the first subject image.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012530566A JP5507693B2 (ja) | 2010-08-24 | 2011-06-16 | 撮像装置およびその動作制御方法 |
CN2011800391226A CN103069819A (zh) | 2010-08-24 | 2011-06-16 | 摄像装置及其动作控制方法 |
US13/765,430 US20130155204A1 (en) | 2010-08-24 | 2013-02-12 | Imaging apparatus and movement controlling method thereof |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010187316 | 2010-08-24 | ||
JP2010-187316 | 2010-08-24 | ||
JP2011-020549 | 2011-02-02 | ||
JP2011020549 | 2011-02-02 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/765,430 Continuation-In-Part US20130155204A1 (en) | 2010-08-24 | 2013-02-12 | Imaging apparatus and movement controlling method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012026185A1 true WO2012026185A1 (ja) | 2012-03-01 |
Family
ID=45723201
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/063799 WO2012026185A1 (ja) | 2010-08-24 | 2011-06-16 | 撮像装置およびその動作制御方法 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20130155204A1 (ja) |
JP (1) | JP5507693B2 (ja) |
CN (1) | CN103069819A (ja) |
WO (1) | WO2012026185A1 (ja) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9336240B2 (en) * | 2011-07-15 | 2016-05-10 | Apple Inc. | Geo-tagging digital images |
JP6042096B2 (ja) * | 2012-05-09 | 2016-12-14 | 東芝メディカルシステムズ株式会社 | X線撮影装置及び医用画像処理装置 |
CN103680385B (zh) * | 2013-11-29 | 2017-01-11 | 合肥京东方光电科技有限公司 | 触控电路及其驱动方法、阵列基板、触控显示装置 |
JP6587995B2 (ja) * | 2016-09-16 | 2019-10-09 | 富士フイルム株式会社 | 画像表示制御システム,画像表示制御方法および画像表示制御プログラム |
Family Cites Families (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE69635101T2 (de) * | 1995-11-01 | 2006-06-01 | Canon K.K. | Verfahren zur Extraktion von Gegenständen und dieses Verfahren verwendendes Bildaufnahmegerät |
EP0830034B1 (en) * | 1996-09-11 | 2005-05-11 | Canon Kabushiki Kaisha | Image processing for three dimensional display of image data on the display of an image sensing apparatus |
US7113634B2 (en) * | 2001-07-31 | 2006-09-26 | Canon Kabushiki Kaisha | Stereoscopic image forming apparatus, stereoscopic image forming method, stereoscopic image forming system and stereoscopic image forming program |
JP3803626B2 (ja) * | 2002-08-26 | 2006-08-02 | オリンパス株式会社 | カメラ |
JP3944188B2 (ja) * | 2004-05-21 | 2007-07-11 | 株式会社東芝 | 立体画像表示方法、立体画像撮像方法及び立体画像表示装置 |
DE102005034597A1 (de) * | 2005-07-25 | 2007-02-08 | Robert Bosch Gmbh | Verfahren und Anordnung zur Erzeugung einer Tiefenkarte |
JP2008026802A (ja) * | 2006-07-25 | 2008-02-07 | Canon Inc | 撮像装置 |
JP2009003609A (ja) * | 2007-06-20 | 2009-01-08 | Nec Corp | 立体画像構築システムおよび立体画像構築方法 |
US8228327B2 (en) * | 2008-02-29 | 2012-07-24 | Disney Enterprises, Inc. | Non-linear depth rendering of stereoscopic animated images |
CN101282492B (zh) * | 2008-05-23 | 2010-07-21 | 清华大学 | 三维影像显示深度调整方法 |
JP5238429B2 (ja) * | 2008-09-25 | 2013-07-17 | 株式会社東芝 | 立体映像撮影装置および立体映像撮影システム |
JP4637942B2 (ja) * | 2008-09-30 | 2011-02-23 | 富士フイルム株式会社 | 3次元表示装置および方法並びにプログラム |
GB2471137B (en) * | 2009-06-19 | 2011-11-30 | Sony Comp Entertainment Europe | 3D image processing method and apparatus |
2011
- 2011-06-16 CN CN2011800391226A patent/CN103069819A/zh active Pending
- 2011-06-16 WO PCT/JP2011/063799 patent/WO2012026185A1/ja active Application Filing
- 2011-06-16 JP JP2012530566A patent/JP5507693B2/ja not_active Expired - Fee Related
2013
- 2013-02-12 US US13/765,430 patent/US20130155204A1/en not_active Abandoned
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07167633A (ja) * | 1993-08-26 | 1995-07-04 | Matsushita Electric Ind Co Ltd | 立体画像撮像及び表示装置 |
JP2006113807A (ja) * | 2004-10-14 | 2006-04-27 | Canon Inc | 多視点画像の画像処理装置および画像処理プログラム |
JP2009103980A (ja) * | 2007-10-24 | 2009-05-14 | Fujifilm Corp | 撮影装置、画像処理装置、及び撮影システム |
JP2009212728A (ja) * | 2008-03-03 | 2009-09-17 | Ntt Docomo Inc | 立体映像処理装置及び立体映像処理方法 |
JP2011146825A (ja) * | 2010-01-13 | 2011-07-28 | Panasonic Corp | ステレオ画像撮影装置およびその方法 |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013145820A1 (ja) * | 2012-03-30 | 2013-10-03 | 富士フイルム株式会社 | 撮影装置、方法、記憶媒体及びプログラム |
JP5547356B2 (ja) * | 2012-03-30 | 2014-07-09 | 富士フイルム株式会社 | 撮影装置、方法、記憶媒体及びプログラム |
CN104185985A (zh) * | 2012-03-30 | 2014-12-03 | 富士胶片株式会社 | 拍摄装置、方法、存储介质以及程序 |
WO2014013695A1 (ja) * | 2012-07-19 | 2014-01-23 | パナソニック株式会社 | 画像符号化方法、画像復号方法、画像符号化装置及び画像復号装置 |
CN103688535A (zh) * | 2012-07-19 | 2014-03-26 | 松下电器产业株式会社 | 图像编码方法、图像解码方法、图像编码装置及图像解码装置 |
JPWO2014013695A1 (ja) * | 2012-07-19 | 2016-06-30 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 画像符号化方法、画像復号方法、画像符号化装置及び画像復号装置 |
CN103688535B (zh) * | 2012-07-19 | 2017-02-22 | 太阳专利托管公司 | 图像编码方法、图像解码方法、图像编码装置及图像解码装置 |
US10104360B2 (en) | 2012-07-19 | 2018-10-16 | Sun Patent Trust | Image encoding method, image decoding method, image encoding apparatus, and image decoding apparatus |
WO2014147957A1 (ja) * | 2013-03-21 | 2014-09-25 | パナソニック株式会社 | 画像処理方法及び画像処理装置 |
JP6016180B2 (ja) * | 2013-03-21 | 2016-10-26 | パナソニックIpマネジメント株式会社 | 画像処理方法及び画像処理装置 |
US9986222B2 (en) | 2013-03-21 | 2018-05-29 | Panasonic Intellectual Property Management Co., Ltd. | Image processing method and image processing device |
Also Published As
Publication number | Publication date |
---|---|
US20130155204A1 (en) | 2013-06-20 |
CN103069819A (zh) | 2013-04-24 |
JP5507693B2 (ja) | 2014-05-28 |
JPWO2012026185A1 (ja) | 2013-10-28 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 201180039122.6; Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11819657; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2012530566; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 11819657; Country of ref document: EP; Kind code of ref document: A1 |