JP2012220888A - Imaging device - Google Patents

Imaging device Download PDF

Info

Publication number
JP2012220888A
Authority
JP
Japan
Prior art keywords
image
imaging
eye
left
right
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2011089385A
Other languages
Japanese (ja)
Inventor
Kajiro Ushio
Nobuyuki Miyake
Yoshikazu Sugiyama
Yuji Kunigome
Yutaka Ichihara
信行 三宅
祐司 國米
裕 市原
喜和 杉山
嘉次郎 潮
Original Assignee
Nikon Corp
株式会社ニコン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corp (株式会社ニコン)
Priority to JP2011089385A
Publication of JP2012220888A
Application status: Pending

Abstract

A baseline length and a shift amount are set appropriately.
An imaging apparatus captures a right-eye image and a left-eye image. A setting unit sets, according to a display effect mode designated from among a plurality of predetermined display effect modes, at least one of the baseline length, which is the distance between the imaging position of the right-eye image and the imaging position of the left-eye image, and the left-right shift amount on the display screen between the right-eye image and the left-eye image. An imaging unit captures the right-eye image and the left-eye image based on the baseline length set by the setting unit, and an adjustment unit adjusts the right-eye image and the left-eye image captured by the imaging unit so that they are displayed based on the shift amount set by the setting unit.
[Selection] Figure 1

Description

  The present invention relates to an imaging apparatus.

  An imaging device that captures a three-dimensional image captures a right-eye image and a left-eye image from imaging positions that are shifted from each other in the left-right direction. A display device that displays the three-dimensional image displays the right-eye image and the left-eye image shifted in the left-right direction.

  The impression that a viewer receives from a three-dimensional image changes when the distance between the imaging position of the right-eye image and the imaging position of the left-eye image (the baseline length) is changed. The impression also changes when the distance between the right-eye image and the left-eye image at the time of display (the shift amount) is changed. It is therefore desirable that an imaging apparatus that captures three-dimensional images be able to set the baseline length and the shift amount appropriately, anticipating in advance the impression the viewer will receive at the time of display.

  In order to solve the above problem, a first aspect of the present invention provides an imaging device that captures a right-eye image and a left-eye image, comprising: a setting unit that, according to a display effect mode designated from among a plurality of predetermined display effect modes, sets at least one of the baseline length, which is the distance between the imaging position of the right-eye image and the imaging position of the left-eye image, and the left-right shift amount on the display screen between the right-eye image and the left-eye image; an imaging unit that captures the right-eye image and the left-eye image based on the baseline length set by the setting unit; and an adjustment unit that adjusts the right-eye image and the left-eye image captured by the imaging unit so that they are displayed based on the shift amount set by the setting unit.

  It should be noted that the above summary of the invention does not enumerate all of the necessary features of the present invention. Sub-combinations of these features may also constitute an invention.

FIG. 1 shows the configuration of an imaging apparatus 10 according to the present embodiment.
FIG. 2 shows an example of the external configuration of a binocular imaging device 10.
FIG. 3 shows an example of the appearance of a single-lens imaging device 10 and an example of capturing a three-dimensional image.
FIG. 4 shows the impression of a three-dimensional image received by the viewer according to differences in the baseline length and the shift amount.
FIG. 5 shows the configuration of the setting unit 20.
FIG. 6 shows the ranges of the baseline length and the shift amount set according to each display effect mode.
FIG. 7 shows the configuration of the imaging device 10 when imaging in the telephoto mode.
FIG. 8 shows the configuration of the imaging apparatus 10 when imaging in the wide-angle mode.
FIG. 9 shows an example of image removal processing by the removal unit 74.
FIG. 10 shows another example of image removal processing by the removal unit 74.
FIG. 11 shows the imaging device 10 connected to the display device 80.
FIG. 12 shows the imaging device 10 imaging the display device 80.
FIG. 13 shows an example of a plurality of pairs of right-eye and left-eye images captured by the imaging device 10 in a single three-dimensional imaging operation.
FIG. 14 shows an example of a plurality of images captured at different imaging positions.
FIG. 15 shows the configuration of a trinocular imaging device 10.
FIG. 16 shows an imaging example when a three-dimensional image is captured by the single-lens imaging device 10 and an example of the viewer image 112.
FIG. 17 shows an example of the viewer image 112 with the imaging range 120 displayed.
FIG. 18 shows an imaging example when a three-dimensional image is captured by the single-lens imaging device 10 using the guide 132.
FIG. 19 illustrates a method for generating a right-eye image and a left-eye image in the slide stereoscopic mode.

  Hereinafter, the present invention will be described through embodiments of the invention, but the following embodiments do not limit the invention according to the claims. In addition, not all the combinations of features described in the embodiments are essential for the solving means of the invention.

  FIG. 1 shows a configuration of an imaging apparatus 10 according to the present embodiment. The imaging device 10 captures a three-dimensional image (a left-eye image and a right-eye image) for making a subject appear stereoscopically to a viewer. The imaging device 10 may capture a still image or a moving image. The imaging device 10 includes a setting unit 20, a display information acquisition unit 22, an imaging unit 24, an adjustment unit 26, and an output unit 28.

  The setting unit 20 sets at least one of the baseline length of the imaging unit 24 and the shift amount at the time of display according to the display effect mode designated by the user from among a plurality of predetermined display effect modes. The setting unit 20 also sets the imaging magnification of the imaging unit 24, according to the designated display effect mode, based on the screen size and the viewing distance acquired by the display information acquisition unit 22.

  Here, the baseline length refers to the distance between the imaging position of the right-eye image captured by the imaging unit 24 and the imaging position of the left-eye image. As an example, the baseline length is a distance between the optical axis of the lens unit that captures the image for the right eye and the optical axis of the lens unit that captures the image for the left eye.

  The shift amount refers to the distance of the left-right shift on the display screen between the right-eye image and the left-eye image when a display device displays the right-eye image and the left-eye image captured by the imaging device 10. More specifically, the shift amount refers to the distance of the horizontal shift on the display screen between a subject at infinity included in the right-eye image and the same subject at infinity included in the left-eye image. Specific display effect modes will be described later in detail.

  The display information acquisition unit 22 acquires the screen size and the viewing distance of the display screen that displays the right-eye image and the left-eye image captured by the imaging device 10. An example of the method for acquiring the screen size and the viewing distance will be described later with reference to FIGS. 11 and 12.

  The imaging unit 24 captures the right eye image and the left eye image based on the baseline length set by the setting unit 20. Further, the imaging unit 24 changes the imaging magnification according to the setting by the setting unit 20.

  The imaging unit 24 may have any configuration as long as it can capture the right-eye image and the left-eye image by changing the baseline length. For example, the imaging unit 24 may be a binocular unit having two lens units 30 that are spaced apart in the left-right direction. The binocular imaging unit 24 captures an image captured by the right lens unit 30 as a right eye image, and captures an image captured by the left lens unit 30 as a left eye image.

  Further, the imaging unit 24 may be a single-lens unit having one lens unit 30. The single-lens imaging unit 24 first captures one of the right-eye image and the left-eye image, is then moved in the left-right direction (horizontal direction), for example manually, and then captures the other of the right-eye image and the left-eye image.

  The adjustment unit 26 adjusts the right-eye image and the left-eye image captured by the imaging unit 24 so that they are displayed with the shift amount set by the setting unit 20. More specifically, the adjustment unit 26 adjusts the shift amount on the image data between the right-eye image and the left-eye image based on the shift amount set by the setting unit 20 and the screen size acquired by the display information acquisition unit 22.

  For example, assume that the screen is 1 meter wide, the image is 3000 dots wide, and the shift amount between the right-eye image and the left-eye image for a subject at infinity is 50 millimeters. In this case, the adjustment unit 26 shifts the right-eye image and the left-eye image by (50 mm / 1000 mm) × 3000 dots = 150 dots on the image data.
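  As a minimal sketch of this conversion (the function and parameter names are illustrative, not part of the embodiment):

```python
def shift_in_pixels(shift_mm: float, screen_width_mm: float, screen_width_px: int) -> float:
    """Convert a physical left-right shift on the display into a shift on the image data.

    Worked example from the text: 50 mm on a 1000 mm wide, 3000-dot wide screen
    corresponds to (50 / 1000) * 3000 = 150 dots.
    """
    return shift_mm / screen_width_mm * screen_width_px

print(shift_in_pixels(50, 1000, 3000))  # 150.0
```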

  The output unit 28 outputs the image for the right eye and the image for the left eye that have been adjusted by the adjustment unit 26 in a predetermined format. For example, the output unit 28 writes the right-eye image and the left-eye image in a storage medium in a predetermined file format. Further, as an example, the output unit 28 outputs a data stream of a right eye image and a left eye image in a predetermined format.

  FIG. 2 shows an example of the external configuration of the binocular imaging device 10. The imaging unit 24 of the twin-lens imaging device 10 includes two lens units 30. Such an imaging device 10 can simultaneously capture a right-eye image and a left-eye image.

  In addition, at least one of the two lens units 30 can move its imaging position in the horizontal direction (left-right direction). As a result, the binocular imaging device 10 can set the distance between the imaging position of the right-eye image and the imaging position of the left-eye image to the designated baseline length and capture the right-eye image and the left-eye image.

  FIG. 3 shows an example of the external appearance of the single-lens imaging device 10 and an example of capturing a three-dimensional image. The imaging unit 24 of the single-lens imaging device 10 includes one lens unit 30. Such an imaging device 10 first captures one of the right-eye image and the left-eye image, is then moved in the left-right direction, for example manually, and then captures the other of the right-eye image and the left-eye image. That is, the single-lens imaging device 10 captures the right-eye image and the left-eye image at different timings.

  In addition, after capturing one of the right-eye image and the left-eye image (the previous image), the single-lens imaging device 10 estimates, from the previous image, a predicted subject image as it would be captured from a position shifted by the designated baseline length from the imaging position of the previous image. The single-lens imaging device 10 then displays the predicted subject image superimposed on the actual viewer image.

  In the single-lens imaging device 10, the imaging position is moved, for example manually by the user, and at the timing when the subject image in the actual viewer image matches the predicted subject image, the other of the right-eye image and the left-eye image (the subsequent image) is captured. Accordingly, the single-lens imaging device 10 can set the distance between the imaging position of the right-eye image and the imaging position of the left-eye image to the designated baseline length and capture the right-eye image and the left-eye image.

  FIG. 4 shows the impression of a three-dimensional image received by the viewer according to differences in the baseline length and the shift amount. A three-dimensional image gives the viewer a different stereoscopic effect as the baseline length changes; specifically, the larger the baseline length, the stronger the stereoscopic effect. Here, the stereoscopic effect means the sense of perceiving the structure of a subject in three dimensions.

  In addition, a three-dimensional image gives the viewer a different sense of depth as the shift amount changes; specifically, the larger the shift amount, the stronger the sense of depth. The sense of depth refers to the sense of perceiving the distance between subjects at different subject distances, for example the distance between the main subject and the background.

  In addition, changing both the baseline length and the shift amount changes the cardboard-cutout effect and the miniature (diorama) effect of the three-dimensional image. The cardboard-cutout effect refers to the sense that the depth between a subject and the background is perceived but the subject itself appears to be a flat, planar object. The miniature effect refers to the sense that the stereoscopic image is perceived as smaller and closer than it actually is.

  In a three-dimensional image, the smaller both the baseline length and the shift amount are, the stronger the cardboard-cutout effect. Further, the larger the baseline length and the smaller the shift amount, the stronger the miniature effect.

  In addition, when the shift amount is set to the eye width distance, the subject in the three-dimensional image is perceived as existing at its actual position (proportional faithful reproduction). The eye width distance (interpupillary distance) is the distance between the viewer's right-eye pupil and left-eye pupil; it is generally around 65 mm, although it varies between individuals. Furthermore, when the shift amount is set to the eye width distance and the baseline length is also set to the eye width distance, the subject is perceived as existing at its actual position and at its actual size (complete faithful reproduction).

  In addition, when the shift amount is set to the eye width distance and the baseline length is set smaller than the eye width distance, the subject is perceived as existing at its actual position while its size is perceived as enlarged from its actual size according to the baseline length (proportional faithful reproduction (enlargement)). Conversely, when the shift amount is set to the eye width distance and the baseline length is set larger than the eye width distance, the subject is perceived as existing at its actual position while its size is perceived as reduced from its actual size according to the baseline length (proportional faithful reproduction (reduction)).

  Even if the shift amount and the baseline length do not exactly match the eye width distance, as long as they are set near the eye width distance, the subject is perceived at approximately its actual position and approximately its actual size (generally faithful reproduction). In addition, if the shift amount is set near the eye width distance and the baseline length is set sufficiently smaller than the eye width distance, the three-dimensional image is perceived with a sense of depth but almost no stereoscopic effect (reproduction with less stereoscopic effect).

  FIG. 5 shows the configuration of the setting unit 20. The setting unit 20 sets each parameter to be set in the imaging device 10 to a value corresponding to the designated display effect mode among a plurality of display effect modes registered in advance.

  The setting unit 20 includes an input unit 32, a proportional faithful reproduction mode setting unit 34, a weak stereoscopic mode setting unit 36, a close-up mode setting unit 38, a telephoto mode setting unit 40, a wide-angle mode setting unit 42, a pop-out mode setting unit 44, a portrait mode setting unit 46, a landscape mode setting unit 48, a landscape stereoscopic enhancement mode setting unit 50, a night view mode setting unit 52, a print mode setting unit 54, a small display mode setting unit 56, and a slide stereoscopic mode setting unit 58.

  The input unit 32 receives operation information from the user and identifies, based on that operation information, the display effect mode designated by the user.

  The proportional faithful reproduction mode setting unit 34 sets parameters when the proportional faithful reproduction mode is set. The weak stereoscopic mode setting unit 36 sets parameters when the weak stereoscopic mode is set. The close-up mode setting unit 38 sets parameters when the close-up mode is set. The telephoto mode setting unit 40 sets parameters when the telephoto mode is set.

  The wide angle mode setting unit 42 sets parameters when the wide angle mode is set. The pop-out mode setting unit 44 sets parameters when the pop-out mode is set. The portrait mode setting unit 46 sets parameters when the portrait mode is set. The landscape mode setting unit 48 sets parameters when the landscape mode is set. The landscape stereoscopic enhancement mode setting unit 50 sets parameters when the landscape stereoscopic enhancement mode is set.

  The night view mode setting unit 52 sets parameters when the night view mode is set. The print mode setting unit 54 sets parameters when the print mode is set. The small display mode setting unit 56 sets parameters when the small display mode is set. The slide stereoscopic mode setting unit 58 sets parameters when the slide stereoscopic mode is set.

  FIG. 6 shows the range of the base line length and the shift amount set according to each display effect mode.

  The range surrounded by A in FIG. 6 indicates the range of the baseline length and the shift amount in the proportional faithful reproduction mode. In the proportional faithful reproduction mode, the setting unit 20 sets the shift amount to the eye width distance and sets the baseline length to a distance according to the enlargement or reduction ratio of the subject. The setting unit 20 sets the baseline length shorter than the eye width distance when the subject is to be displayed enlarged, and longer than the eye width distance when the subject is to be displayed reduced. As a result, in the proportional faithful reproduction mode, the imaging apparatus 10 can capture a three-dimensional image in which the subject is perceived as existing at its actual position.

  Further, in the proportional faithful reproduction mode, the setting unit 20 may set the baseline length based on the imaging distance of the subject so that the subject appears at a predetermined ratio with respect to the imaging distance. For example, suppose the subject is to appear at a ratio X with respect to its imaging distance. In this case, the setting unit 20 sets the baseline length to the value obtained by dividing the eye width distance by the ratio X.
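  A minimal sketch of this rule (the function name is illustrative; the 65 mm figure is the typical eye width distance cited above):

```python
EYE_WIDTH_MM = 65.0  # typical interpupillary distance

def proportional_baseline_mm(ratio_x: float) -> float:
    """Baseline for the proportional faithful reproduction mode:
    the eye width distance divided by the desired ratio X."""
    return EYE_WIDTH_MM / ratio_x

# To make the subject appear twice as large (X = 2), the baseline is halved.
print(proportional_baseline_mm(2.0))  # 32.5
```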

  Furthermore, in the proportional faithful reproduction mode, the setting unit 20 may set the imaging magnification of the imaging unit 24, based on the screen size and the viewing distance, so that the subject appears at a predetermined ratio with respect to its actual size. For example, the setting unit 20 calculates the angle of view during viewing from the screen size and the viewing distance, and sets the imaging magnification of the imaging unit 24 based on that ratio and the calculated viewing angle of view.

  The range surrounded by A′ in FIG. 6 indicates the range of the baseline length and the shift amount in the complete faithful reproduction mode, a special case of the proportional faithful reproduction mode. In the complete faithful reproduction mode, the setting unit 20 sets both the baseline length and the shift amount to the eye width distance. Furthermore, in the complete faithful reproduction mode, the setting unit 20 adjusts the imaging magnification of the imaging unit 24, based on the screen size and the viewing distance, so that the subject appears at its actual size.

  More specifically, the setting unit 20 calculates the angle of view during viewing from the screen size and the viewing distance, and sets the imaging magnification of the imaging unit 24 so that the angle of view during imaging equals the angle of view during viewing. As a result, in the complete faithful reproduction mode, the imaging apparatus 10 can capture a three-dimensional image in which the subject is perceived at its actual position and at its actual size.
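  As a sketch of that calculation under a simple pinhole-camera model (the sensor width and focal length stand in for the imaging magnification here, and all names and example numbers are illustrative, not from the embodiment):

```python
import math

def viewing_angle_of_view_deg(screen_width_mm: float, viewing_distance_mm: float) -> float:
    """Horizontal angle of view subtended by the display screen at the viewer."""
    return math.degrees(2.0 * math.atan(screen_width_mm / (2.0 * viewing_distance_mm)))

def focal_length_for_angle_mm(sensor_width_mm: float, angle_deg: float) -> float:
    """Focal length whose horizontal angle of view equals the given angle."""
    return sensor_width_mm / (2.0 * math.tan(math.radians(angle_deg) / 2.0))

angle = viewing_angle_of_view_deg(1000, 2000)          # ~28 degrees for a 1 m screen at 2 m
print(focal_length_for_angle_mm(36.0, angle))          # 72.0 mm on a 36 mm wide sensor
```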

  The range surrounded by B in FIG. 6 indicates the range of the baseline length and the shift amount in the weak stereoscopic mode. In the weak stereoscopic mode, the setting unit 20 sets the baseline length to about 30% to 60% of the eye width distance and the shift amount to about 60% to 70% of the eye width distance. A three-dimensional image with a relatively small baseline length retains a relatively strong sense of depth but has a reduced stereoscopic effect. Therefore, in the weak stereoscopic mode, the imaging device 10 can capture a three-dimensional image that is easy to view with little visual field conflict.

  The range surrounded by C in FIG. 6 indicates the range of the baseline length and the shift amount in the close-up mode. In the close-up mode, the setting unit 20 sets the baseline length to about 5% to 40% of the eye width distance, and sets the shift amount to about 65% to 95% of the eye width distance.

  As an example, the setting unit 20 sets the baseline length to the value obtained by multiplying the eye width distance by the ratio of the imaging distance to the viewing distance. A three-dimensional image captured with a reduced baseline length and displayed with an increased shift amount shows the subject enlarged according to the ratio of the shift amount to the baseline length. Moreover, in close-up photography a strong stereoscopic effect is obtained even when the baseline length is shorter than the eye width distance. Accordingly, in the close-up mode the imaging device 10 can capture a three-dimensional image in which a close subject is enlarged with a stereoscopic effect.
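  A minimal numeric sketch, assuming the rule means scaling the eye width distance by the ratio of the imaging distance to the viewing distance (an assumed reading; parameter names are illustrative):

```python
EYE_WIDTH_MM = 65.0  # typical interpupillary distance

def close_up_baseline_mm(imaging_distance_mm: float, viewing_distance_mm: float) -> float:
    """Close-up-mode baseline: the eye width distance scaled by the ratio of the
    imaging distance to the viewing distance (assumed interpretation)."""
    return EYE_WIDTH_MM * imaging_distance_mm / viewing_distance_mm

# A subject 30 cm away, viewed on a screen 2 m away, gives a baseline of about 10 mm,
# roughly 15% of the eye width distance, which falls inside range C of FIG. 6.
print(close_up_baseline_mm(300.0, 2000.0))  # 9.75
```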

  The range surrounded by D in FIG. 6 indicates the range of the base line length and the shift amount in the telephoto mode. In the telephoto mode, the setting unit 20 sets the baseline length to a value larger than the eye width distance, and sets the shift amount to about 70% to 100% of the eye width distance.

  In the telephoto mode, the imaging unit 24 captures the right-eye image and the left-eye image at an imaging magnification that enlarges the subject. When telephoto shooting is performed, the sense of distance between foreground and background is compressed, and the cardboard-cutout effect increases.

  Therefore, the setting unit 20 sets the baseline length according to the imaging magnification of the imaging unit 24 in the telephoto mode. More specifically, the setting unit 20 increases the baseline length as the imaging magnification of the imaging unit 24 increases. Thereby, the imaging device 10 can capture a three-dimensional image in which the stereoscopic effect of the subject that is magnified and captured is enhanced in the telephoto mode.
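  A minimal sketch of this setting, assuming a simple linear scaling with the imaging magnification (the text only states that a larger magnification receives a longer baseline, not the exact function):

```python
EYE_WIDTH_MM = 65.0  # typical interpupillary distance

def telephoto_baseline_mm(imaging_magnification: float) -> float:
    """Telephoto-mode baseline: at least the eye width distance, growing with the
    imaging magnification (the linear growth is an assumed illustration)."""
    return max(EYE_WIDTH_MM, EYE_WIDTH_MM * imaging_magnification)

print(telephoto_baseline_mm(2.0))  # 130.0 mm for a 2x magnification
```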

  In the telephoto mode, the setting unit 20 may set a process for blurring the background, or may issue a warning. The configuration of the imaging apparatus 10 that performs such processing will be described further with reference to FIG. 7.

  The range surrounded by E in FIG. 6 indicates the range of the baseline length and the shift amount in the wide angle mode. In the wide-angle mode, the setting unit 20 sets the baseline length to about 30% to 60% of the eye width distance, and sets the shift amount to about 35% to 70% of the eye width distance.

  In the wide-angle mode, the imaging unit 24 images the subject with a wide angle of view. When wide-angle shooting is performed, the sense of distance between foreground and background is stretched, and the reverse of the cardboard-cutout effect (an exaggerated sense of depth) increases. This can be countered by reducing the shift amount, but reducing the shift amount moves the image toward the viewer, so the sense of popping out becomes large or unnatural vignetting occurs in the popped-out image.

  Therefore, the setting unit 20 sets a process for blurring an image whose distance from the viewer at the time of display is shorter than a predetermined distance (an image that protrudes too far toward the viewer). In addition, the setting unit 20 sets a process such as removing an image that is closer to the viewer than a predetermined distance at the time of display and in which part of the subject is cut off by the image frame.

  Thereby, the imaging device 10 can capture a more easily viewable three-dimensional image in the wide-angle mode. The configuration of the imaging apparatus 10 that performs processing in the wide-angle mode will be described further with reference to FIG. 8.

  The range surrounded by F in FIG. 6 indicates the range of the base line length and the shift amount in the pop-out mode. In the pop-out mode, the setting unit 20 sets the base line length to 60% or more of the eye width distance, and sets the shift amount from 0% to about 85% of the eye width distance.

  A three-dimensional image in which an image pops out in front of the display screen can give the viewer a strong stereoscopic effect. Therefore, in the pop-out mode, the setting unit 20 makes settings so that a three-dimensional image is captured with a relatively long baseline length and an emphasized sense of popping out.

  Thereby, in the pop-out mode, the imaging device 10 can capture a three-dimensional image whose stereoscopic effect the viewer can enjoy more fully. Note that when settings are made according to the pop-out mode, the setting unit 20 warns the user, because the captured three-dimensional image has a high likelihood of causing visual field conflict and eye divergence.

  The range surrounded by G in FIG. 6 indicates the range of the baseline length and the shift amount in the portrait mode. In the portrait mode, the setting unit 20 sets the baseline length from about 60% to about 100% of the eye width distance, and sets the shift amount from about 60% to about 90% of the eye width distance.

  In a three-dimensional image obtained by portrait photography, it is preferable that the main subject, such as a person, be placed at a depth position where it is easy to view. Therefore, in the portrait mode, the setting unit 20 sets both the baseline length and the shift amount close to the eye width distance. As a result, in the portrait mode, the imaging apparatus 10 can capture a three-dimensional image in which the subject is perceived at approximately its actual position and approximately its actual size. The setting unit 20 can thus place the main subject approximately on the display screen in the portrait mode.

  In the portrait mode, the setting unit 20 may set the imaging magnification of the imaging unit 24, based on the screen size and the viewing distance, so that the subject appears at a predetermined ratio with respect to its actual size. The setting unit 20 may also set a process for blurring the background in the portrait mode.

  The range surrounded by H in FIG. 6 indicates the range of the baseline length and the shift amount in the landscape mode. In the landscape mode, the setting unit 20 sets the baseline length to about 10% to 100% of the eye width distance, and sets the shift amount to about 70% to 75% of the eye width distance.

  A three-dimensional image obtained by capturing a landscape can make the landscape feel magnificent when the sense of depth is large. Therefore, the setting unit 20 relatively increases the shift amount in the landscape mode. Thereby, the imaging device 10 can capture a three-dimensional image of a magnificent landscape with a sense of depth.

  In the landscape mode, because subjects at infinity will be viewed, the setting unit 20 sets the shift amount to a predetermined value (for example, 50 mm) or less that is shorter than the eye width distance, so that the display does not force the eyes to diverge. In addition, in the landscape mode, the setting unit 20 sets the baseline length in accordance with a user instruction. However, when a three-dimensional image including both a distant view and a near view is captured, it is preferable to set the baseline length small in order to reduce visual field conflict.

  In addition, in the landscape mode, the setting unit 20 may set a process for blurring one of the distant view and the foreground for the purpose of avoiding a visual field conflict when capturing a three-dimensional image including a distant view and a foreground. In this case, as an example, the setting unit 20 refers to the focus information and determines the distant view area and the foreground area.

  The range surrounded by I in FIG. 6 indicates the range of the baseline length and the shift amount in the landscape stereoscopic enhancement mode. The setting unit 20 sets the baseline length to 105% or more of the eye width distance and sets the shift amount to about 60% to 80% of the eye width distance in the landscape stereoscopic enhancement mode.

  A three-dimensional image of a landscape is generally a distant view, in which almost no parallax arises. That is, when a landscape is captured with a baseline length equal to or smaller than the eye width distance, the resulting three-dimensional image has a poor stereoscopic effect.

  Therefore, in the landscape stereoscopic enhancement mode, the setting unit 20 makes settings so that the distant scene is captured with a baseline length longer than the eye width distance. Thereby, the imaging device 10 can capture a three-dimensional image in which the stereoscopic effect of a distant landscape is emphasized. However, when an image is captured with a long baseline length, the miniature effect increases. Accordingly, a three-dimensional image captured in the landscape stereoscopic enhancement mode looks like a distant landscape reproduced in a small box (for example, like a three-dimensional image of a diorama).

  Note that a three-dimensional image including both a distant view and a near view may cause visual field conflict, so the setting unit 20 warns the user so that a near view is not included in the image. Further, the setting unit 20 may set a process for blurring the foreground when a foreground is included in the three-dimensional image. The setting unit 20 may also enhance the sense of depth by increasing the shift amount; in this case, however, the setting unit 20 keeps the shift amount at or below a predetermined value (for example, 50 mm) that is shorter than the eye width distance, so that the display does not force the eyes to diverge.

  A range surrounded by K in FIG. 6 indicates a range of the base line length and the shift amount in the print mode. In the print mode, the setting unit 20 sets the baseline length from about 75% to about 120% of the eye width distance, and sets the shift amount from about 10% to about 20% of the eye width distance.

  The print mode is designated when displaying on an autostereoscopic digital photo frame or producing a three-dimensional print for naked-eye stereoscopic viewing. In such display or printing, the display surface is small, so the shift amount cannot be made large, and the angle of view at the time of viewing is also small. On the other hand, with such display and printing there is no risk of eye divergence.

  Therefore, the setting unit 20 sets the baseline length according to the screen size in the print mode. More specifically, the setting unit 20 sets a longer baseline length for a smaller screen size. Thereby, the imaging device 10 can capture a three-dimensional image appropriate for the screen size.

  The range surrounded by L in FIG. 6 indicates the range of the baseline length and the shift amount in the case of capturing a normal size subject in the small display mode.

  The small display mode is specified when a three-dimensional image to be displayed on a display screen having a predetermined screen size or less is captured. Since the small display has a small display screen, the shift amount cannot be increased. However, the small display is suitable for displaying a relatively small subject.

  Therefore, in the small display mode, the setting unit 20 switches parameters as follows according to the size of the subject and the shooting method. That is, in the small display mode, the setting unit 20 makes the same settings as in the close-up mode when a subject of a predetermined size or smaller (a relatively small subject) is captured enlarged, and makes the same settings as in the complete faithful reproduction mode when such a subject is captured at a standard magnification. Thereby, even when the image is displayed on a small display, the imaging device 10 can capture a three-dimensional image with a strong stereoscopic effect.

  In the small display mode, when imaging a subject larger than the predetermined size, the setting unit 20 sets the baseline length to about 75% to 105% of the eye width distance and the shift amount to about 20% to 40% of the eye width distance. Thereby, the imaging device 10 can capture a three-dimensional image appropriate for the small display.

  The range surrounded by M in FIG. 6 indicates the range of the baseline length and the shift amount in the slide stereoscopic mode. In the slide stereoscopic mode, the setting unit 20 sets the baseline length to 0% of the eye width distance (that is, a setting that captures only one two-dimensional image), and sets the shift amount to about 10% to 75% of the eye width distance.

  The slide stereoscopic mode is designated when there is no need for a sense of popping out, when the user wants to capture an image easily, or when the photographer is a beginner without special imaging technique. In the slide stereoscopic mode, the setting unit 20 causes the imaging unit 24 to capture with the baseline length set to 0% of the eye width distance. The setting unit 20 then shifts the resulting right-eye image and left-eye image left and right by a shift amount that at least does not cause eye divergence. As a result, when imaging a distant view, for example, the imaging device 10 can easily capture a three-dimensional image that gives a strong sense of depth and a certain degree of stereoscopic effect and realism.

  In addition, when capturing a night view, the user may further specify the night view mode in addition to the above modes. An image obtained by capturing a night view has a large contrast. In a three-dimensional image having a large contrast, a bright part (or a dark part) is displayed at a different position between the right-eye image and the left-eye image, so that a ghost is likely to occur.

  Therefore, the setting unit 20 reduces the contrast in each of the right-eye image and the left-eye image in the night view mode. Furthermore, when imaging a main subject such as a person, the setting unit 20 may blur the background in each of the right-eye image and the left-eye image. Thereby, the imaging device 10 can capture a three-dimensional image of a night scene with less ghost.

  For example, the setting unit 20 may compute the correlation between the left and right images while changing the shift amount between the right-eye image and the left-eye image, and adjust the shift amount so that the correlation is maximized. Thereby, the imaging device 10 can capture a three-dimensional image with a shift amount that produces less ghosting.
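  A brute-force sketch of such a correlation search (grayscale NumPy arrays; the wrap-around from np.roll is ignored for simplicity, and all names are illustrative):

```python
import numpy as np

def best_shift_px(right_img: np.ndarray, left_img: np.ndarray, max_shift_px: int) -> int:
    """Find the horizontal shift (in pixels) that maximizes the correlation between the
    right-eye and left-eye images, as a sketch of the ghost-reducing adjustment.
    Both inputs are 2-D grayscale arrays of the same shape."""
    best_shift, best_corr = 0, -np.inf
    for shift in range(-max_shift_px, max_shift_px + 1):
        shifted_left = np.roll(left_img, shift, axis=1)  # horizontal shift (wrap-around ignored)
        corr = np.corrcoef(right_img.ravel(), shifted_left.ravel())[0, 1]
        if corr > best_corr:
            best_shift, best_corr = shift, corr
    return best_shift
```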

  As described above, the imaging device 10 according to the present embodiment registers in advance a plurality of display effect modes corresponding to the impressions that the viewer receives at the time of display, and sets the baseline length and the shift amount appropriately so that the display effect of the designated mode is produced. The imaging device 10 can therefore set the baseline length and the shift amount appropriately while anticipating in advance the impression that the viewer will receive at the time of display.

  FIG. 7 shows the configuration of the imaging apparatus 10 when imaging in the telephoto mode. When operating in the telephoto mode, the imaging apparatus 10 functions with the configuration illustrated in FIG. 7. That is, the imaging apparatus 10 operating in the telephoto mode further includes a blur unit 70 and a warning unit 72 in addition to the configuration shown in FIG. 1.

  In the telephoto mode, the imaging unit 24 sets the baseline length, which is the distance between the imaging position of the right-eye image and the imaging position of the left-eye image, to be equal to or larger than the eye width distance, and magnifies and images the main subject. In this case, as an example, the imaging unit 24 images the subject with a longer baseline length as the imaging magnification increases. Thereby, the imaging device 10 can capture a three-dimensional image in which the stereoscopic effect of the subject that is magnified and captured is enhanced in the telephoto mode.

  Further, the blur unit 70 performs blurring processing on the right-eye image and the left-eye image after the shift amount has been adjusted by the adjustment unit 26. More specifically, the blur unit 70 blurs a subject for which the distance on the display screen between its display position in the right-eye image and its display position in the left-eye image is greater than or equal to a predetermined reference value. For example, the blur unit 70 blurs the target subject by applying filtering to the image portion of that subject, performing the image processing on at least one of the right-eye image and the left-eye image. Thereby, the imaging device 10 can blur a subject that could otherwise cause eye divergence at the time of display, so that divergence does not occur.

  In addition, as an example, the imaging unit 24 captures the right-eye image and the left-eye image while adjusting the baseline length so that the distance on the display screen between the main subject in the right-eye image and the main subject in the left-eye image falls within a predetermined range. In this case, the blur unit 70 blurs any subject for which the distance on the display screen between its display position in the right-eye image and its display position in the left-eye image exceeds the upper limit of the predetermined range. As a result, the blur unit 70 can blur background portions that could otherwise cause eye divergence at the time of display, so that divergence does not occur.

  Further, the blur unit 70 may change the amount of blur according to the difference in imaging distance from the main subject. More specifically, the blur unit 70 may blur a portion more strongly the larger its difference in imaging distance from the main subject. Thereby, the blur unit 70 can apply stronger blur to portions that are more likely to cause eye divergence.
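  A rough sketch of depth-dependent blurring (the per-pixel depth map, the mm_per_sigma scale, and the use of a Gaussian filter are all illustrative assumptions, not elements of the embodiment):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def blur_by_depth(image: np.ndarray, depth_mm: np.ndarray, main_depth_mm: float,
                  mm_per_sigma: float = 500.0) -> np.ndarray:
    """Blur pixels more strongly the further their imaging distance lies from the main
    subject. image and depth_mm are 2-D arrays of equal shape."""
    sigma_map = np.clip(np.abs(depth_mm - main_depth_mm) / mm_per_sigma, 0, 8).astype(int)
    out = image.astype(float).copy()
    for sigma in range(1, int(sigma_map.max()) + 1):
        blurred = gaussian_filter(image.astype(float), sigma=sigma)
        out[sigma_map == sigma] = blurred[sigma_map == sigma]  # stronger blur for larger depth difference
    return out
```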

  The warning unit 72 issues a warning when imaging is performed with the baseline length, which is the distance between the imaging position of the right-eye image and the imaging position of the left-eye image, equal to or larger than the eye width distance. Thereby, in the telephoto mode, the imaging device 10 can notify the user that the captured three-dimensional image has a high likelihood of causing eye divergence.

  FIG. 8 shows the configuration of the imaging apparatus 10 when imaging in the wide-angle mode. When operating in the wide-angle mode, the imaging apparatus 10 functions with the configuration illustrated in FIG. 8. That is, the imaging apparatus 10 operating in the wide-angle mode further includes a blur unit 70 and a removal unit 74 in addition to the configuration shown in FIG. 1.

  In the wide-angle mode, the imaging unit 24 images the subject with a wide angle of view, and the blur unit 70 performs blurring processing on the right-eye image and the left-eye image after the shift amount has been adjusted by the adjustment unit 26. More specifically, the blur unit 70 blurs a stereoscopic image when, as displayed on the display screen, the distance from the viewer to that stereoscopic image is equal to or less than a predetermined ratio of the distance from the viewer to the display screen. For example, when a stereoscopic image whose distance from the display screen is equal to or greater than a predetermined distance is included, the blur unit 70 blurs that stereoscopic image by image processing.

  For example, the blur unit 70 blurs the target subject by performing a filtering process on the image portion of the target subject. Further, the blur unit 70 blurs the subject by performing image processing on at least one of the right-eye image and the left-eye image. As a result, the imaging device 10 can capture a more easily viewable three-dimensional image by blurring a subject with a feeling of popping out at the time of display.

  The removal unit 74 performs a predetermined subject removal process on the right-eye image and the left-eye image after the shift amount is adjusted by the adjustment unit 26. The removing unit 74 may perform a removing process instead of the blurring process by the blur unit 70, or may perform a removing process in addition to the blurring process by the blur unit 70.

  FIG. 9 shows an example of image removal processing by the removal unit 74. As an example, the removal unit 74 removes from the right-eye image and the left-eye image any subject that is partially cut off at the edge of the right-eye image or the left-eye image and whose distance from the viewer, when displayed on the display screen, is equal to or less than a predetermined ratio of the distance from the viewer to the display screen (that is, a subject that is partially cut off and protrudes toward the viewer by a certain amount or more).

  For example, as shown in FIG. 9, the removal unit 74 removes, by image processing, a subject (for example, the soccer ball 76) that protrudes closer to the viewer than the display screen 82 and is partially cut off at the edge. Thereby, the imaging device 10 can capture a three-dimensional image from which such an unnatural subject has been removed.

  FIG. 10 shows another example of image removal processing by the removal unit 74. Instead of the processing shown in FIG. 9, the removal unit 74 may convert a region containing a subject that is partially cut off and protrudes by a certain amount or more, in at least one of the right-eye image and the left-eye image, into an image of a predetermined border color. For example, the border color is the same color as the frame of the display device 80.

  In this case, in at least one of the right-eye image and the left-eye image containing a subject that is partially cut off and protrudes by a certain amount or more, the removal unit 74 may convert at least one of a band of a certain width along the left and right edges of the image and a band of a certain height along the top and bottom edges into an image of the predetermined border color. That is, the removal unit 74 may convert the peripheral portion of the display screen 82 into the frame image 78 shown in FIG. 10. Thereby, the imaging device 10 can capture a natural three-dimensional image without an unnatural subject.
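  A minimal sketch of converting the image periphery into the frame color (border_px and frame_color are illustrative parameters, not values from the embodiment):

```python
import numpy as np

def mask_border_with_frame(image: np.ndarray, border_px: int, frame_color) -> np.ndarray:
    """Replace bands of fixed width along the image edges with the display-bezel color,
    as a sketch of the removal unit 74 producing the frame image 78.
    image: H x W x 3 array; frame_color: an (R, G, B) triple."""
    out = image.copy()
    out[:border_px, :] = frame_color   # top band
    out[-border_px:, :] = frame_color  # bottom band
    out[:, :border_px] = frame_color   # left band
    out[:, -border_px:] = frame_color  # right band
    return out
```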

  FIG. 11 shows the imaging device 10 connected to the display device 80. The display information acquisition unit 22 receives, from the outside, the screen size and the viewing distance of the display device 80 that will display the captured right-eye image and left-eye image.

  In this case, for example, the display information acquisition unit 22 acquires the screen size from the display device 80 connected via a cable or wireless communication. In subsequent imaging, the setting unit 20 then sets the shift amount based on the acquired screen size. Thereby, the imaging device 10 can capture a three-dimensional image suited to the screen size of the display device 80 that the user uses.

  Furthermore, as an example, the display information acquisition unit 22 may store a history of the screen sizes of display devices 80 connected via a cable or wireless communication. The setting unit 20 then sets the shift amount between the right-eye image and the left-eye image based on a screen size included in the stored history. Thereby, when the user uses any one of the plurality of display devices 80, the imaging device 10 can capture a three-dimensional image suited to its screen size without being connected to that display device 80 again.

  Further, the display information acquisition unit 22 may register a plurality of screen sizes in advance. In this case, the display information acquisition unit 22 sets the shift amount based on the screen size selected by the user from among a plurality of screen sizes registered in advance. Thereby, the display information acquisition unit 22 can set the shift amount without being connected to the display device 80.

  Further, the display information acquisition unit 22 calculates the viewing distance by multiplying the screen size by a predetermined coefficient. Alternatively, when a viewing distance is set for each display device 80, the display information acquisition unit 22 may acquire the viewing distance together with the screen size from the display device 80 connected via a cable or wireless communication.

  FIG. 12 shows the imaging device 10 imaging the display device 80. The imaging device 10 may acquire the screen size by imaging the display device 80 from the actual viewing position. In this case, the display information acquisition unit 22 calculates the physical screen size from the screen size on the image obtained by imaging the display device 80, the imaging magnification, and the subject distance. The display information acquisition unit 22 can measure the subject distance using the distance measurement function of the imaging device 10. Thereby, the imaging device 10 can acquire an accurate screen size without being connected to the display device 80.
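  As a sketch of that calculation under a simple pinhole-camera model (the sensor width and focal length stand in for the imaging magnification; all names and example numbers are illustrative):

```python
def estimate_screen_width_mm(screen_width_px: int, image_width_px: int,
                             sensor_width_mm: float, focal_length_mm: float,
                             subject_distance_mm: float) -> float:
    """Estimate the physical screen width from a photograph of the display:
    the width the screen occupies on the sensor, scaled by distance over focal length."""
    width_on_sensor_mm = screen_width_px / image_width_px * sensor_width_mm
    return width_on_sensor_mm * subject_distance_mm / focal_length_mm

# e.g. a screen spanning 2000 of 6000 px on a 36 mm wide sensor with a 50 mm lens at 2.5 m
print(estimate_screen_width_mm(2000, 6000, 36.0, 50.0, 2500.0))  # 600.0 mm
```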

  Furthermore, the display information acquisition unit 22 may acquire, as the viewing distance, the subject distance obtained when the imaging device 10 images the screen of the display device 80 from the viewing position (for example, a subject distance measured by the distance measurement function of the imaging device 10). Thereby, the imaging device 10 can acquire an accurate viewing distance without being connected to the display device 80.

  FIG. 13 shows an example of a plurality of pairs of right-eye and left-eye images captured by the imaging device 10 in a single three-dimensional imaging operation. The adjustment unit 26 may generate a plurality of pairs of right-eye and left-eye images with different shift amounts from the right-eye image and the left-eye image captured in a single three-dimensional imaging operation.

  For example, the adjustment unit 26 generates pairs of right-eye and left-eye images with different shift amounts, one pair for each of a plurality of screen sizes registered in advance. In this case, the adjustment unit 26 may let the pairs share images, using one right-eye image or left-eye image as an image of more than one pair.

  The output unit 28 outputs, to the display device 80 connected to the imaging device 10, the right-eye image and the left-eye image of one of the plurality of pairs. In this case, the output unit 28 outputs to the display device 80 the pair whose shift amount corresponds to the screen size of the connected display device 80. Thereby, the imaging device 10 can output a right-eye image and a left-eye image suited to the connected display device 80 without acquiring its screen size in advance.

  FIG. 14 shows an example of a plurality of images captured at different imaging positions. The imaging unit 24 may capture a plurality of pairs of right-eye images and left-eye images corresponding to a plurality of baseline lengths in a single three-dimensional imaging process.

  For example, the imaging device 10 first captures one image of the right eye image or the left eye image. Subsequently, the main body of the imaging device 10 is manually moved in the horizontal direction. Then, the imaging device 10 captures an image at each of the imaging positions corresponding to the plurality of baseline lengths while the main body is being moved in the horizontal direction.

  That is, the imaging device 10 captures images at each of a plurality of imaging positions by continuously releasing the shutter while sliding in the horizontal direction. Thereby, the imaging device 10 can capture a plurality of pairs of right-eye images and left-eye images corresponding to a plurality of baseline lengths in a single three-dimensional imaging process.

  In this case, the imaging unit 24 may capture images so that one of the right-eye image and the left-eye image of a pair is shared as one image of another pair. Thereby, the imaging unit 24 can reduce the number of exposures and the amount of image data.

  FIG. 15 shows a configuration of the trinocular imaging device 10. The imaging unit 24 may have a configuration including three or more lens units 30 and a selection unit 90. In the example of FIG. 15, the imaging unit 24 includes three lens units 30 (30-1, 30-2, 30-3).

  The three or more lens units 30 are arranged side by side in the horizontal direction. The horizontal distance (baseline length) between any two of the three or more lens units 30 differs from pair to pair.

  The selection unit 90 selects a pair of two lens units 30 among the three or more lens units 30. More specifically, the selection unit 90 selects a pair of two lens units 30 whose horizontal distance is a distance corresponding to the baseline length set by the setting unit 20.

  The selection unit 90 then outputs, of the two images captured by the selected pair of lens units 30, the image captured by the lens unit 30 on the right as the right-eye image and the image captured by the lens unit 30 on the left as the left-eye image. Such an imaging apparatus 10 can set the distance between the imaging position of the right-eye image and the imaging position of the left-eye image to the designated baseline length and capture the right-eye image and the left-eye image.
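  A minimal sketch of that selection, assuming the horizontal lens positions are known in millimetres and the pair whose spacing is closest to the set baseline is chosen (positions and names below are illustrative):

```python
from itertools import combinations

def select_lens_pair(lens_positions_mm: dict, target_baseline_mm: float):
    """Pick the two lens units whose horizontal spacing is closest to the baseline set
    by the setting unit, and label which supplies the right-eye and left-eye images."""
    def spacing(pair):
        (_, xa), (_, xb) = pair
        return abs(xa - xb)
    pair = min(combinations(lens_positions_mm.items(), 2),
               key=lambda p: abs(spacing(p) - target_baseline_mm))
    (name1, x1), (name2, x2) = pair
    # The lens with the larger horizontal coordinate supplies the right-eye image.
    right, left = (name1, name2) if x1 > x2 else (name2, name1)
    return right, left

print(select_lens_pair({"30-1": 0.0, "30-2": 40.0, "30-3": 105.0}, 65.0))  # ('30-3', '30-2')
```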

  FIG. 16 shows an imaging example in which a three-dimensional image is captured by the single-lens imaging device 10, together with an example of the viewer image 112. The single-lens imaging device 10 captures one of the right-eye image and the left-eye image (the previous image), is then slid manually in the horizontal direction, and then captures the other of the right-eye image and the left-eye image (the subsequent image).

  In this case, the imaging unit 24 of the imaging device 10 displays the information indicating the imaging position of the subsequent image superimposed on the viewer image 112 after imaging the previous image. For example, the imaging unit 24 extracts, as the reference image 114, an image of a part of a subject (for example, an image of a main subject in focus) included in the previous image. Subsequently, the imaging unit 24 displays the instruction image 116 obtained by duplicating the extracted reference image 114 at a position shifted in the horizontal direction by the base line length in the viewer image 112.

  Subsequently, the imaging device 10 is manually moved in the horizontal direction. The imaging unit 24 keeps the instruction image 116 displayed at a fixed position even while the imaging device 10 moves in the horizontal direction. The imaging unit 24 then detects that the position of the actual reference image 114 in the viewer image 112 matches the position of the instruction image 116 superimposed on the viewer image 112. Thereby, the imaging unit 24 can detect that the device has slid in the horizontal direction by the baseline length from the imaging position of the previous image, and captures the subsequent image at that position.

  The imaging unit 24 may automatically capture an image when the device has been slid in the horizontal direction by the baseline length from the imaging position of the previous image (that is, when the actual reference image 114 and the instruction image 116 coincide). Thereby, the imaging device 10 can capture the right-eye image and the left-eye image at two imaging positions shifted in the horizontal direction by the baseline length.
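  A brute-force sketch of detecting that the reference image 114 has reached the position of the instruction image 116, using exhaustive normalized-correlation matching (target_x, target_y, and tolerance_px are illustrative parameters):

```python
import numpy as np

def reference_at_instruction(viewer_image: np.ndarray, reference: np.ndarray,
                             target_x: int, target_y: int, tolerance_px: int = 3) -> bool:
    """Return True when the best match of the reference image inside the viewer image
    lies within tolerance_px of the instruction-image position (grayscale 2-D arrays)."""
    h, w = reference.shape
    best_pos, best_corr = (0, 0), -np.inf
    for y in range(viewer_image.shape[0] - h + 1):
        for x in range(viewer_image.shape[1] - w + 1):
            patch = viewer_image[y:y + h, x:x + w]
            corr = np.corrcoef(patch.ravel(), reference.ravel())[0, 1]
            if corr > best_corr:
                best_pos, best_corr = (x, y), corr
    return abs(best_pos[0] - target_x) <= tolerance_px and abs(best_pos[1] - target_y) <= tolerance_px
```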

  In addition, the imaging unit 24 may notify the user when sliding in the horizontal direction by the baseline length from the imaging position of the previous image. In this case, the imaging unit 24 captures a subsequent image in response to the pressing of the shutter button after the notification. Thereby, the imaging unit 24 can reliably capture an image after the user confirms the imaging position of the subsequent image.

  Note that the imaging unit 24 may switch between automatic imaging and notification according to the user setting when the imaging unit 24 moves to the imaging position of a subsequent image. In addition, as an example, the imaging unit 24 may perform automatic imaging after a certain time has elapsed after notification. In addition, the imaging unit 24 may cancel a series of imaging operations for a three-dimensional image when the imaging position of a subsequent image cannot be detected after a certain time has elapsed since the previous image was captured.

  Further, the imaging unit 24 designates the imaging position of the subsequent image so that the subsequent image is formed at a position on the image sensor shifted by a predetermined reference deviation amount with respect to the previous image. For example, the imaging unit 24 forms the image at a position shifted by several percent in the horizontal direction on the image sensor. Thereby, the imaging unit 24 can reduce the slide amount from the imaging position of the previous image to the imaging position of the subsequent image.

  Further, the imaging device 10 may include an acceleration sensor 110. In this case, the imaging unit 24 may capture the subsequent image at the position at which the acceleration sensor 110 detects that the imaging device 10 has slid in the horizontal direction by the baseline length from the imaging position of the previous image. Thereby, the imaging device 10 can capture the subsequent image at a more accurate position.
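
  One simple way to estimate the slide distance from the acceleration sensor 110 is to integrate the horizontal acceleration twice. The sketch below makes that idea concrete; it ignores sensor bias and drift, and the sampling interval and trigger logic are assumptions rather than the patent's method:

# Rough sketch (assumed sensor interface): estimate the horizontal slide by twice
# integrating the acceleration samples and trigger the subsequent capture
# once the displacement reaches the baseline length. Drift handling is omitted.
def slide_distance_reached(accel_samples_m_s2, dt_s, baseline_m):
    velocity = 0.0
    displacement = 0.0
    for a in accel_samples_m_s2:      # horizontal (x-axis) acceleration samples
        velocity += a * dt_s
        displacement += velocity * dt_s
    return abs(displacement) >= baseline_m

# e.g. 0.5 s of constant 0.5 m/s^2 sampled at 100 Hz gives ~6.4 cm of travel
samples = [0.5] * 50
print(slide_distance_reached(samples, 0.01, 0.063))  # -> True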

  Further, after capturing the previous image, the imaging unit 24 detects with the acceleration sensor 110 whether the imaging device 10 has moved to the right or to the left. Based on the detection result of the acceleration sensor 110, the imaging unit 24 can determine whether each of the previous image and the subsequent image is the right-eye image or the left-eye image.

  Further, the acceleration sensor 110 may detect the horizontal orientation of the imaging device 10 during the slide, from when the previous image is captured until the subsequent image is captured. The imaging unit 24 outputs a warning when the imaging device 10 tilts from the horizontal by more than a predetermined angle during the slide.

  The imaging unit 24 may instead detect the horizontal orientation of the imaging device 10 during the slide based on distortion of the viewer image 112, and output a warning when the imaging device 10 tilts from the horizontal by a predetermined angle or more.
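
  The tilt check itself can be as simple as computing the roll angle from the gravity components reported by the acceleration sensor. The axis convention below (x lateral, y pointing down when the camera is level) is an assumption made only for this illustration:

# Hedged sketch: estimate the camera's roll from the gravity components
# reported by the acceleration sensor and warn above a threshold angle.
import math

def tilt_warning(accel_x_g, accel_y_g, max_tilt_deg=5.0):
    roll_deg = math.degrees(math.atan2(accel_x_g, accel_y_g))
    return abs(roll_deg) > max_tilt_deg

print(tilt_warning(0.10, 0.995))   # ~5.7 deg of roll -> True (over the threshold)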

  FIG. 17 shows an example of the viewer image 112 in a state where the imaging range 120 is displayed. In the single-lens imaging device 10, when capturing the first of the right-eye image and the left-eye image, the imaging unit 24 displays, in the viewer image 112, the imaging range 120 for the right-eye image and the left-eye image, for example.

  In this case, for example, the imaging unit 24 may display, as the imaging range 120, an area included in at least one of the right-eye image and the left-eye image in the preview image at the time the previous image is captured. Alternatively, the imaging unit 24 may display, as the imaging range 120, an area included in both the right-eye image and the left-eye image in that preview image. As an example, the imaging unit 24 displays the imaging range 120 as a rectangle. Such an imaging device 10 can present to the user the subject that will actually be displayed on the display screen.
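
  Under the simplifying assumption that the two shots differ only by a known horizontal pixel offset, the region contained in both images (one candidate for the imaging range 120) is just the frame minus a strip of that width on one side, as in this sketch:

# Sketch under stated assumptions: the two shots differ only by a horizontal
# pixel offset `shift_px`, so the region contained in both the right-eye and
# left-eye images is the frame minus a strip of that width on one side.
def common_imaging_range(width_px, height_px, shift_px):
    """Return (x, y, w, h) of the rectangle included in both images,
    expressed in the coordinates of the first (previous) image, assuming the
    camera slides to the right for the second shot."""
    x0 = shift_px                     # left strip of the first image is lost
    return (x0, 0, width_px - shift_px, height_px)

print(common_imaging_range(1920, 1080, 96))   # -> (96, 0, 1824, 1080)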

  FIG. 18 illustrates an imaging example in which a three-dimensional image is captured by the single-lens imaging device 10 using a guide 132. The single-lens imaging device 10 may capture the right-eye image and the left-eye image while placed on the guide 132, which supports the imaging device 10 from the lower surface side.

  The guide 132 is, for example, a table having a flat surface on which the bottom surface of the imaging device 10 is placed. The imaging device 10 according to this example further includes a movement amount detection unit 130, which detects the relative horizontal movement amount of the imaging device 10 with respect to the guide 132. The movement amount detection unit 130 is, for example, an optical sensor, and may be provided in the guide 132 instead of in the imaging device 10.

  The imaging unit 24 according to this example captures the subsequent image in response to the movement amount detection unit 130 detecting that the imaging device has moved by the set movement amount after the previous image was captured. Thereby, the imaging device 10 according to this example can be slid accurately in the horizontal direction and its movement amount can be controlled accurately.
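
  A minimal sketch of this trigger, assuming a sensor interface that reports incremental horizontal displacement; the functions read_dx_mm and capture are hypothetical placeholders:

# Illustrative sketch (assumed sensor interface): accumulate the relative
# displacement reported by the optical sensor and fire the subsequent capture
# once the set movement amount has been covered.
def wait_and_capture(read_dx_mm, capture, target_mm):
    """read_dx_mm() returns the horizontal displacement since the last call."""
    travelled = 0.0
    while travelled < target_mm:
        travelled += read_dx_mm()
    capture()                      # subsequent image taken at the target position
    return travelled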

  FIG. 19 shows a method for generating the right-eye image and the left-eye image in the slide stereoscopic mode. In the slide stereoscopic mode, the imaging device 10 generates the right-eye image and the left-eye image from a single two-dimensional image.

  In this case, the adjustment unit 26 provided in the imaging device 10 shifts the single two-dimensional image 200 (that is, a right-eye image and a left-eye image whose baseline length is zero) to the left and to the right to generate a right-eye image 210 and a left-eye image 220. Thereby, the adjustment unit 26 of the imaging device 10 can generate the right-eye image 210 and the left-eye image 220 for stereoscopic viewing by a simple process.

  More specifically, the adjustment unit 26 shifts the image 200 horizontally to the right to generate the right-eye image 210, and shifts the image 200 horizontally to the left to generate the left-eye image 220. For example, the adjustment unit 26 sets the image obtained by shifting the image 200 to the left by L/2 as the left-eye image 220, and the image obtained by shifting the image 200 to the right by L/2 as the right-eye image 210, where L is the set shift amount.
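
  A minimal NumPy sketch of this shift operation (not the patent's code) is shown below; the vacated strips are left black here, and the paragraphs that follow describe ways of treating them:

# Minimal sketch of the slide stereoscopic mode: shift the single
# two-dimensional image by L/2 pixels in each direction.
import numpy as np

def make_stereo_pair(image, shift_px):
    h, w = image.shape[:2]
    half = shift_px // 2
    right_eye = np.zeros_like(image)
    left_eye = np.zeros_like(image)
    right_eye[:, half:] = image[:, :w - half]   # image 200 shifted to the right
    left_eye[:, :w - half] = image[:, half:]    # image 200 shifted to the left
    return right_eye, left_eye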

  Further, when the right-eye image and the left-eye image are shifted left and right in this way, a blank portion is generated at the left end of the right-eye image, and similarly a blank portion is generated at the right end of the left-eye image. Since these regions present an image to only one of the right eye and the left eye, they can interfere with giving the viewer a natural stereoscopic effect.

  Therefore, for example, the adjustment unit 26 may add a left-end image to the blank portion at the left end of the right-eye image and a right-end image to the blank portion at the right end of the left-eye image. For example, the adjustment unit 26 generates the right-end image by copying the right-end portion of the right-eye image, and generates the left-end image by copying the left-end portion of the left-eye image. In this case, the adjustment unit 26 may blur the right-end image and the left-end image. Thereby, the adjustment unit 26 can present the same image to the right eye and the left eye at the end portions and reduce the unnaturalness felt by the viewer there.

  Further, as an example, the adjustment unit 26 stretches an image within a predetermined range from the left end of the display area of the right-eye image toward the left to generate the left-end image, and stretches an image within a predetermined range from the right end of the display area of the left-eye image toward the right to generate the right-end image. Thereby, the adjustment unit 26 can present the viewer with a natural stereoscopic image in which the images are continuous at the end portions.
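
  The two edge treatments described above, copying the opposite eye's edge strip with a blur and stretching the image's own edge, can be sketched as follows for the left-end blank of the right-eye image. OpenCV is assumed to be available, and the strip widths are arbitrary illustration values:

# Hedged sketch of two of the edge treatments described above.
import cv2

def fill_left_blank_by_copy(right_eye, left_eye, blank_w):
    # (a) give both eyes the same (blurred) content at the left edge
    strip = left_eye[:, :blank_w].copy()
    right_eye[:, :blank_w] = cv2.GaussianBlur(strip, (15, 15), 0)
    return right_eye

def fill_left_blank_by_stretch(right_eye, blank_w, src_w=32):
    # (b) enlarge a narrow band just right of the blank so it reaches the edge
    h = right_eye.shape[0]
    band = right_eye[:, blank_w:blank_w + src_w]
    right_eye[:, :blank_w + src_w] = cv2.resize(band, (blank_w + src_w, h))
    return right_eye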

  For example, the adjustment unit 26 may display a frame image, such as a black frame, at the right end portion and the left end portion of each of the right-eye image and the left-eye image. Thereby, the adjustment unit 26 can give a stereoscopic effect over the entire region inside the frame, with no region at the left and right ends that lacks the stereoscopic effect. Furthermore, displaying such a frame image can give the viewer the sense of looking at an object through a frame (a window frame, the frame of a magnifying glass, and the like), and can therefore provide the viewer with a natural three-dimensional effect. Displaying such a frame image may also help the viewer fuse the right-eye image and the left-eye image. Note that the adjustment unit 26 may perform this processing of the end portions in other display effect modes as well, in addition to the slide stereoscopic mode.
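
  Painting the matching frame bands is straightforward; the sketch below blacks out equal-width strips at both side edges of the two images (the band width is an arbitrary choice, not a value from the patent):

# Simple sketch: paint matching black bands at both side edges of the two
# images so that neither eye sees content there.
def add_frame(right_eye, left_eye, band_px=40):
    for img in (right_eye, left_eye):
        img[:, :band_px] = 0        # left band
        img[:, -band_px:] = 0       # right band
    return right_eye, left_eye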

  Although the present invention has been described using the embodiments, the technical scope of the present invention is not limited to the scope described in the above embodiments. It will be apparent to those skilled in the art that various modifications or improvements can be made to the above embodiments. It is apparent from the scope of the claims that embodiments to which such modifications or improvements have been made can also be included in the technical scope of the present invention.

  The order of execution of the processes, such as operations, procedures, steps, and stages, in the apparatus, system, program, and method shown in the claims, the specification, and the drawings may be realized in any order unless the order is explicitly indicated by terms such as "before" or "prior to", and unless the output of an earlier process is used in a later process. Even where the operation flow in the claims, the specification, or the drawings is described using "first", "next", and the like for convenience, this does not mean that the processes must be carried out in that order.

DESCRIPTION OF SYMBOLS 10 imaging device, 20 setting unit, 22 display information acquisition unit, 24 imaging unit, 26 adjustment unit, 28 output unit, 30 lens unit, 32 input unit, 34 proportional faithful reproduction mode setting unit, 36 weak stereo mode setting unit, 38 close-up mode setting unit, 40 telephoto mode setting unit, 42 wide-angle mode setting unit, 44 pop-up mode setting unit, 46 portrait mode setting unit, 48 landscape mode setting unit, 50 landscape stereoscopic enhancement mode setting unit, 52 night scene mode setting unit, 54 print mode setting unit, 56 small display mode setting unit, 58 slide stereoscopic mode setting unit, 70 blurring unit, 72 warning unit, 74 removal unit, 76 soccer ball, 78 frame image, 80 display device, 82 display screen, 90 selection unit, 110 acceleration sensor, 112 viewer image, 114 reference image, 116 instruction image, 120 imaging range, 130 movement amount detection unit, 132 guide, 200 image, 210 right-eye image, 220 left-eye image

Claims (19)

  1. An imaging device that captures an image for the right eye and an image for the left eye,
    a setting unit that sets at least one of a baseline length, which is a distance between an imaging position of the right-eye image and an imaging position of the left-eye image, and a left-right shift amount on a display screen between the right-eye image and the left-eye image;
    An imaging unit that captures the image for the right eye and the image for the left eye based on the baseline length set by the setting unit;
    An adjustment unit that adjusts the right-eye image and the left-eye image captured by the imaging unit to be displayed based on a shift amount set by the setting unit;
    An imaging apparatus comprising:
  2. The imaging device further comprising a display information acquisition unit that acquires a screen size of a display screen that displays the right-eye image and the left-eye image,
    wherein the adjustment unit adjusts a shift amount on image data between the right-eye image and the left-eye image based on the shift amount set by the setting unit and the screen size.
  3. The imaging device according to claim 2, wherein the display information acquisition unit acquires the screen size and a viewing distance, and
    the setting unit sets the shift amount to an interpupillary distance and, based on the screen size and the viewing distance, sets an imaging magnification of the imaging unit so that the subject appears at a predetermined size relative to its actual size.
  4. The imaging apparatus according to claim 3, wherein the setting unit sets the baseline length so that the subject appears at a position at a predetermined ratio with respect to an imaging distance.
  5. The imaging device, wherein the setting unit adjusts the imaging magnification of the imaging unit, based on the screen size and the viewing distance, so that the subject appears at its actual size, and sets the baseline length to the interpupillary distance.
  6. The imaging apparatus according to claim 2, wherein the setting unit sets the baseline length according to an imaging magnification of the imaging unit.
  7. The imaging apparatus according to claim 6, wherein the setting unit increases the baseline length as the imaging magnification of the imaging unit increases.
  8. The imaging device according to claim 2, wherein the setting unit sets the baseline length according to the screen size.
  9. The imaging apparatus according to claim 6, wherein the setting unit increases the baseline length as the screen size becomes smaller.
  10. The imaging device according to any one of claims 2 to 9, wherein the display information acquisition unit acquires a screen size from a connected display device, and
    the setting unit sets the shift amount in subsequent imaging based on the acquired screen size.
  11. The imaging device according to claim 10, wherein the display information acquisition unit stores a history of screen sizes of connected display devices, and
    the setting unit sets the shift amount based on a screen size included in the stored history.
  12. The imaging apparatus according to claim 2, wherein the display information acquisition unit sets the shift amount based on a screen size selected from a plurality of screen sizes registered in advance.
  13. The imaging device, wherein the display information acquisition unit calculates the screen size based on the size of the screen in an image of the display device captured by the imaging device, the imaging magnification, and the subject distance.
  14. The imaging apparatus according to claim 2, wherein the display information acquisition unit calculates a viewing distance by multiplying the screen size by a predetermined coefficient.
  15. The imaging device according to any one of claims 2 to 13, wherein a subject distance obtained by imaging the screen of the display device with the imaging device from the viewing position is acquired as the viewing distance.
  16. The imaging device according to any one of claims 1 to 15, wherein the adjustment unit extracts a plurality of pairs of the right-eye image and the left-eye image with different shift amounts based on the right-eye image and the left-eye image captured in one three-dimensional imaging process.
  17. The imaging device, further comprising an output unit that outputs, to a connected display device, a pair of the right-eye image and the left-eye image corresponding to a screen size of the display device.
  18. The imaging device, wherein the imaging unit captures a plurality of pairs of right-eye images and left-eye images corresponding to a plurality of baseline lengths in a single three-dimensional imaging process.
  19. The imaging device according to claim 18, wherein the imaging unit shares one of the right-eye image and the left-eye image of each of the plurality of pairs with one of the right-eye image and the left-eye image of another pair.
JP2011089385A 2011-04-13 2011-04-13 Imaging device Pending JP2012220888A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011089385A JP2012220888A (en) 2011-04-13 2011-04-13 Imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2011089385A JP2012220888A (en) 2011-04-13 2011-04-13 Imaging device

Publications (1)

Publication Number Publication Date
JP2012220888A true JP2012220888A (en) 2012-11-12

Family

ID=47272414

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011089385A Pending JP2012220888A (en) 2011-04-13 2011-04-13 Imaging device

Country Status (1)

Country Link
JP (1) JP2012220888A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016181781A1 (en) * 2015-05-14 2016-11-17 オリンパス株式会社 Endoscope device

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6126278B2 (en) * 1980-01-25 1986-06-19 Matsushita Electric Ind Co Ltd
JPH09121370A (en) * 1995-08-24 1997-05-06 Matsushita Electric Ind Co Ltd Stereoscopic television device
JP2001346227A (en) * 2000-06-01 2001-12-14 Minolta Co Ltd Stereoscopic image display device, stereoscopic image display system and data file for stereoscopic image display
JP2002095018A (en) * 2000-09-12 2002-03-29 Canon Inc Image display controller, image display system and method for displaying image data
JP2002125246A (en) * 2000-10-16 2002-04-26 I-O Data Device Inc Stereoscopic image photographing adaptor, stereoscopic image photographing camera, and stereoscopic image processor
JP2003348621A (en) * 2002-05-27 2003-12-05 Canon Inc Means for setting two-viewpoint camera
JP2006254074A (en) * 2005-03-10 2006-09-21 Minoru Inaba Digital stereoscopic camera or digital stereoscopic camcorder, 3d display or 3d projector, printer, and stereoscopic viewer
JP2006303832A (en) * 2005-04-19 2006-11-02 Minoru Inaba Digital stereo camera or digital stereo video camera
JP2010171646A (en) * 2009-01-21 2010-08-05 Sony Corp Signal processing device, image display device, signal processing method, and computer program
WO2010146847A1 (en) * 2009-06-17 2010-12-23 パナソニック株式会社 Information recording medium for reproducing 3d video, and reproduction device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016181781A1 (en) * 2015-05-14 2016-11-17 オリンパス株式会社 Endoscope device
JP6169312B2 (en) * 2015-05-14 2017-07-26 オリンパス株式会社 Endoscope device
JPWO2016181781A1 (en) * 2015-05-14 2017-09-14 オリンパス株式会社 Endoscope device

Similar Documents

Publication Publication Date Title
US9451242B2 (en) Apparatus for adjusting displayed picture, display apparatus and display method
CN104079839B (en) Device and method for the multispectral imaging using parallax correction
CN107105157B (en) Portrait image synthesis from multiple images captured by a handheld device
US9007442B2 (en) Stereo image display system, stereo imaging apparatus and stereo display apparatus
JP5449536B2 (en) Stereoscopic image reproduction apparatus and method, stereoscopic imaging apparatus, and stereoscopic display apparatus
JP5875839B2 (en) Plenoptic camera
JP5917017B2 (en) Image processing apparatus, control method therefor, and program
EP2683169A2 (en) Image blur based on 3D depth information
JP5325255B2 (en) Stereoscopic image display device, stereoscopic image display method, and stereoscopic image display program
JP4750835B2 (en) Digital camera and image processing method
CN101636747B (en) Two dimensional/three dimensional digital information acquisition and display device
EP2357838B1 (en) Method and apparatus for processing three-dimensional images
US8836760B2 (en) Image reproducing apparatus, image capturing apparatus, and control method therefor
JP2014504074A (en) Method, system, apparatus and associated processing logic for generating stereoscopic 3D images and video
US10070045B2 (en) Information processing apparatus, electronic apparatus, server, information processing program, and information processing method
JP4072674B2 (en) Image processing apparatus and method, recording medium, and program
KR20140000135A (en) Image processing device, image processing method and image processing computer program product
JP5931206B2 (en) Image processing apparatus, imaging apparatus, program, and image processing method
US8401380B2 (en) Stereo camera with preset modes
JP4707368B2 (en) Stereoscopic image creation method and apparatus
JP4657313B2 (en) Stereoscopic image display apparatus and method, and program
JP4620271B2 (en) Image processing method
JP5425554B2 (en) Stereo imaging device and stereo imaging method
JP3691444B2 (en) Stereo imaging device
JP5346266B2 (en) Image processing apparatus, camera, and image processing method

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20140403

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20150119

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20150303

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20150804