US20210385385A1 - Apparatus, method of same, and storage medium that utilizes captured images having different angles of view - Google Patents
- Publication number
- US20210385385A1 (U.S. application Ser. No. 17/332,809)
- Authority
- US
- United States
- Prior art keywords
- image
- display
- display device
- subject
- control unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/683—Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
- H04N23/687—Vibration or motion blur correction performed by mechanical compensation by shifting the lens or sensor position
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
- H04N23/695—Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
- H04N9/73—Colour balance circuits, e.g. white balance circuits or colour temperature control
- H04N5/23267; H04N5/23293; H04N5/23296; H04N5/23299 (legacy codes)
Definitions
- the aspect of the embodiments relates to an apparatus, a method of the same, and a storage medium that utilizes captured images having different angles of view.
- There is known an electronic apparatus, such as electronic binoculars, that makes a far-away subject observable by capturing an image of an observation object (subject) using an image capturing element and displaying the captured image.
- Japanese Patent No. 5223486 proposes a technology in which in electronic binoculars including two image capturing units and two display units, an image obtained by correcting variation of movement of a casing, based on a sensor signal, is displayed.
- One aspect of the embodiments provides an apparatus comprising: a first capturing unit configured to obtain a first image; a second capturing unit configured to obtain a second image having a wider angle of view than an angle of view of the first image; and a control unit configured to control each of a first display device corresponding to one eye of a user and a second display device corresponding to the other eye of the user to display an image based on the first image or the second image, wherein, in a case where no subject exists in the first image, the control unit causes at least one of the first display device and the second display device to display the second image.
- Another aspect of the embodiments provides a method of an apparatus which includes a first capturing unit configured to obtain a first image; and a second capturing unit configured to obtain a second image having a wider angle of view than an angle of view of the first image, the method comprising: controlling each of a first display device corresponding to one eye of a user and a second display device corresponding to the other eye of the user to display an image based on the first image or the second image, wherein, in a case where no subject exists in the first image, the controlling includes causing at least one of the first display device and the second display device to display the second image.
- Still another aspect of the embodiments provides a non-transitory computer-readable storage medium comprising instructions for performing a method of an apparatus which includes a first capturing unit configured to obtain a first image; and a second capturing unit configured to obtain a second image having a wider angle of view than an angle of view of the first image, the method comprising: controlling each of a first display device corresponding to one eye of a user and a second display device corresponding to the other eye of the user to display an image based on the first image or the second image, wherein, in a case where no subject exists in the first image, the controlling includes causing at least one of the first display device and the second display device to display the second image.
- FIG. 1 is a block diagram illustrating a configuration example of electronic binoculars related to a first embodiment of the disclosure.
- FIG. 2 is a flowchart illustrating a sequence of operations of display control processing related to the first embodiment.
- FIG. 3 is a figure illustrating an example of image display in a display unit at normal time related to the first embodiment.
- FIG. 4 is a figure illustrating an example of image display in the display unit in the case of losing sight of a subject related to the first embodiment.
- FIG. 5 is a figure illustrating an example of image display in the display unit at the time of zoom driving related to the first embodiment.
- FIG. 6 is a flowchart illustrating a sequence of operations in cropping processing related to the first embodiment.
- FIG. 7 is a figure illustrating a sequence of operations in the cropping processing related to the first embodiment.
- FIG. 8 is a figure illustrating a relation between a distance and a shift amount of an image capture position related to the first embodiment.
- FIG. 9 is a flowchart illustrating a sequence of operations of display control processing related to a second embodiment.
- FIG. 10 is a figure illustrating an example of image display in a display unit at normal time related to the second embodiment.
- An example in which electronic binoculars capable of capturing images having different angles of view with a plurality of image capturing units are used will be described below.
- the present embodiment is not limited to electronic binoculars and is also applicable to other devices that can capture images having different angles of view by a plurality of image capturing units.
- These devices may include, for instance, a spectacle-type information terminal, a digital camera, a mobile phone including a smartphone, a game machine, a tablet terminal, an endoscope, and a medical device for surgery.
- FIG. 1 is a block diagram illustrating a functional configuration example of electronic binoculars 100 as an example of an electronic apparatus of the present embodiment. Note that one or more of functional blocks illustrated in FIG. 1 may be accomplished by hardware such as an ASIC and a programmable logic array (PLA), or may be accomplished by causing a programmable processor such as a CPU and an MPU to execute software. In addition, one or more of the functional blocks illustrated in FIG. 1 may be accomplished by a combination of software and hardware.
- a lens 101 L of the left side and a lens 101 R of the right side are arranged at a predetermined interval in front of the electronic binoculars 100 .
- Each of an image capturing unit 102 L of the left side and an image capturing unit 102 R of the right side captures a subject image having passed through the lens 101 L or the lens 101 R, and outputs an image signal.
- Each of the lens 101 L and the lens 101 R may include a plurality of lenses. For instance, focus adjustment is accomplished by shifting a focus lens among the plurality of lenses along an optical axis.
- the lens 101 L and the lens 101 R each include a zoom lens having a focal distance that becomes variable by a shift of several lens groups.
- the focal distances of the lens 101 L and the lens 101 R are independently changeable, and thus the focal distances of the respective lens 101 L and 101 R can be controlled to be identical or different. Note that, in description made below, description will be made taking as an example the case where the lens 101 L and the lens 101 R are each provided with a zoom lens having a variable focal distance, but the lens 101 L and the lens 101 R may be each provided with a fixed focal length lens having a fixed focal distance.
- the image capturing unit 102 L and the image capturing unit 102 R each include, for instance, an image sensor (image capturing element) of a Complementary Metal Oxide Semiconductor (CMOS) type, or the like, and output a captured image signal.
- the image sensor may include various other types of image sensors such as a Charge Coupled Device (CCD).
- the image capturing unit 102 L and the image capturing unit 102 R are also each provided with a readout circuit or the like suitable for each of the image sensors.
- the image capturing unit 102 L and the image capturing unit 102 R obtain, as images, subject images having passed through the lens 101 L and the lens 101 R having different focal distances, and thus the image capturing unit 102 L and the image capturing unit 102 R can obtain images having different angles of view.
- A linear motor 103 L and a linear motor 103 R can shift the lenses of the lens 101 L and the lens 101 R, respectively, and perform focus adjustment and zoom driving. Zoom magnification of the left and right lenses can be changed individually by separately controlling the linear motor 103 L and the linear motor 103 R.
- A control unit 104 includes one or more processors, a RAM, and a ROM, and executes various types of processing of the electronic binoculars 100 by causing the one or more processors to expand and execute, on the RAM, a program stored in the ROM. For instance, the control unit 104 performs cropping processing such that an image obtained (captured) by the image capturing unit 102 L and an image obtained by the image capturing unit 102 R have an identical angle of view, and displays the image subjected to the cropping processing on at least one (for example, both) of display units 107 L and 107 R.
- control unit 104 also controls display with respect to the display units 107 L and 107 R. Note that details of the cropping processing and the display control will be described below.
- the control unit 104 determines whether or not a subject is captured within an angle of view, or performs white balance adjustment processing by using images captured by the image capturing unit 102 L and the image capturing unit 102 R. In addition, the control unit 104 calculates a camera shake amount in the image capturing unit, based on information from a gyro sensor 105 L, a gyro sensor 105 R, an acceleration sensor 106 L and an acceleration sensor 106 R. In addition, camera-shake correction can be performed by controlling optical axes of the lens 101 L and the lens 101 R by an optical axis control unit (not illustrated).
- Each of the display unit 107 L and the display unit 107 R includes a display panel, and the display units 107 L and 107 R correspond to a left eye and a right eye of a user, respectively.
- each of the display units 107 L and 107 R displays an image or the like that is captured by the image capturing unit 102 L or the image capturing unit 102 R.
- the display unit 107 L and the display unit 107 R are attached to a movable unit 108 .
- the movable unit 108 is configured to slide or to cause the electronic binoculars 100 to axisymmetrically bend such that positions of the display unit 107 L and the display unit 107 R can match an interval between the left and right eyes of a person.
- a distance measuring unit 109 is a unit that measures a distance from the electronic binoculars 100 to a subject.
- A state switching unit 110 enables a user, through a manual operation described below, to switch whether or not a subject is regarded as being captured, or to switch the display contents of the display unit 107 L and the display unit 107 R. For instance, in a case where the state switching unit 110 includes a push button switch, pressing the push button switch once within a predetermined period of time switches the subject capture information, and pressing it twice within the predetermined period of time switches the display contents of the display units 107 L and 107 R.
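The single-press versus double-press behavior described above can be sketched as follows; the class, the 0.5-second grouping window, and the return labels are assumptions for illustration, not details from the patent:

```python
import time

class StateSwitch:
    """Hypothetical sketch of the state switching unit 110: one press within
    the grouping window toggles the subject-capture state; two presses switch
    the display contents.  The window length is an assumption."""

    WINDOW = 0.5  # seconds within which presses are grouped (assumed)

    def __init__(self, clock=time.monotonic):
        self.clock = clock
        self.press_times = []

    def press(self):
        # Record the time of a physical button press.
        self.press_times.append(self.clock())

    def poll(self):
        """Return 'capture', 'display', or None once the window has elapsed."""
        if not self.press_times:
            return None
        if self.clock() - self.press_times[0] < self.WINDOW:
            return None  # still collecting presses
        count = len(self.press_times)
        self.press_times = []
        if count == 1:
            return "capture"   # toggle subject-capture information
        return "display"       # switch display contents of 107L/107R
```

An injected clock keeps the sketch testable without real delays.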
- the zoom magnification of the lens 101 R described above is controlled by an optical zoom mechanism.
- description will be made taking as an example the case where an angle of view of the lens 101 L is controlled to become wider than an angle of view of the lens 101 R and optical zoom is not performed.
- the present processing is accomplished by causing the one or more processors of the control unit 104 to expand and execute, on the RAM, a program stored in the ROM.
- the control unit 104 determines whether or not the lens 101 R is under the zoom driving (that is, the zoom lens is being driven by the optical zoom mechanism). In a case where, for instance, the control unit 104 obtains a zoom state of the lens 101 R and determines that the lens 101 R is under the zoom driving, the processing proceeds to step S 204 , and in a case where the control unit 104 determines that the lens 101 R is not under the zoom driving, the processing proceeds to step S 201 .
- the control unit 104 determines whether or not, in an image obtained through the lens 101 R equipped with the optical zoom mechanism, a subject is captured. This determination can be performed by, for instance, determining whether or not the subject exists in the image, or determining that the subject no longer exists in the image. The control unit 104 performs this determination by, for instance, automatically determining whether or not the subject in the image can be recognized by image recognition processing. Alternatively, the control unit 104 may perform this determination by using a manual operation input to the state switching unit 110 that is performed after a user has confirmed the image obtained through the lens 101 R.
- In a case where the control unit 104 determines that the subject is captured in the image, the processing proceeds to step S 202, and in a case where the control unit 104 determines that the subject is not captured in the image, the processing proceeds to step S 203.
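The branching among steps S 202, S 203, and S 204 can be summarized in a small sketch; the function name and the string labels are illustrative assumptions ("tele" is the image through the zoomable lens 101 R, "wide" the image through the lens 101 L):

```python
def select_display_source(zoom_driving, subject_in_tele_image):
    """Sketch of the decision flow of FIG. 2: decide which obtained image
    feeds each of the left and right displays."""
    if zoom_driving:
        # Step S204: show the tele image as is on both displays.
        return {"left": "tele", "right": "tele"}
    if subject_in_tele_image:
        # Step S202: crop the wide image to match the tele angle of view.
        return {"left": "cropped_wide", "right": "tele"}
    # Step S203: subject lost -- show the wide image on both displays.
    return {"left": "wide", "right": "wide"}
```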
- the control unit 104 causes the display unit 107 L and the display unit 107 R to display (respective) images in which the subject is captured.
- a display method of the present step will be described by referring to FIG. 3 .
- An obtained image 200 L illustrated in FIG. 3 represents an image obtained by the image capturing unit 102 L
- an obtained image 200 R represents an image obtained by the image capturing unit 102 R.
- the angle of view of the lens 101 L is controlled to become wider than the angle of view of the lens 101 R, and thus the subject in the obtained image 200 L becomes smaller than the subject in the obtained image 200 R.
- the control unit 104 performs the cropping processing on a region 202 of a portion of the obtained image 200 L to generate a display image 201 L, and causes the display unit 107 L to display the display image 201 L.
- a display image 201 R is identical to the image of the obtained image 200 R, and the obtained image 200 R is displayed as it is on the display unit 107 R. Note that details of the cropping processing by the control unit 104 will be described below by referring to FIG. 6 . Angles of view of the display images 201 L and 201 R are controlled to become approximately identical by the cropping processing.
- At step S 203, since the control unit 104 has lost sight of the subject captured in the state at step S 202, the control unit 104 causes the display unit 107 L and the display unit 107 R to display the image of the image capturing unit 102 L.
- a specific display method of the present step will be described by referring to FIG. 4 .
- An obtained image 300 L represents an image obtained by the image capturing unit 102 L
- an obtained image 300 R represents an image obtained by the image capturing unit 102 R.
- The subject has moved out of the narrow angle of view of the lens 101 R performing optical zoom, and thus no subject exists in the obtained image 300 R.
- an image having passed through the lens 101 L is captured at a wide angle, and thus the subject exists on the obtained image 300 L.
- control unit 104 causes the display unit 107 L to display, as a display image 301 L, the obtained image 300 L as it is.
- control unit 104 causes the display unit 107 R to display the obtained image 300 L (in place of the obtained image 300 R) as it is.
- the control unit 104 may cause a magnification frame 302 of current optical zoom to be displayed in a display image 301 R to indicate that the display image 301 R is not an image obtained through the lens 101 R.
- the image obtained through the lens 101 L having a wide angle is displayed on both the display unit 107 L and the display unit 107 R.
- the user can find the subject by keeping looking at the display units 107 L and 107 R.
- In the example above, the obtained image 300 L is displayed on both the display units, but the obtained image 300 L may be displayed on only one of the display units (for instance, the display unit 107 L).
- At step S 204, since the user is performing the zoom driving on the lens 101 R by the optical zoom mechanism, the control unit 104 causes the display unit 107 L and the display unit 107 R to display the image of the image capturing unit 102 R.
- an obtained image 400 L illustrated in FIG. 5 represents an image obtained by the image capturing unit 102 L
- an obtained image 400 R represents an image obtained by the image capturing unit 102 R.
- the control unit 104 causes the display unit 107 L and the display unit 107 R to display, as a display image 401 L and a display image 401 R, the obtained image 400 R, as it is, and does not use the obtained image 400 L for display.
- The obtained image 400 R obtained while driving the zoom lens is displayed on each of the display units, and accordingly, an image shift or delay can be reduced in comparison to the case of performing the cropping processing in real time.
- predetermined image processing such as white balance processing and camera-shake correction processing may be applied when an image is displayed.
- Here, only matters particularly relating to the present embodiment are described and a general description of the image processing is omitted; however, the present embodiment is not limited to the matters described.
- an image signal is to be adjusted, based on a combination of characteristics of the image capturing side and the display side. That is, an image is to be corrected in consideration of both manufacture variation of a color sensitivity ratio in the image sensor of the image capturing unit and manufacture variation of color light emission efficiency of the display panel.
- an appropriate adjustment coefficient is used in a combination of the image capturing unit 102 L and the display unit 107 L and, on the other hand, an appropriate adjustment coefficient is used in a combination of the image capturing unit 102 R and the display unit 107 R, and the white balance adjustment is performed.
- an appropriate adjustment coefficient is used in a combination of the image capturing unit 102 R and the display unit 107 L, and on the other hand, an appropriate adjustment coefficient is used in a combination of the image capturing unit 102 R and the display unit 107 R. That is, the combination of the image capturing unit and the display unit is different from the combination of the image capturing unit and the display unit at step S 202 , and thus a different value of the adjustment coefficient is also used.
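One way to realize per-pairing white balance adjustment is a gain table keyed by the (image capturing unit, display unit) combination, so that sensor colour-sensitivity variation and panel emission-efficiency variation are compensated together. The table values below are invented for illustration; real coefficients would come from factory calibration:

```python
# Hypothetical white-balance gain table: (capturing unit, display unit) -> (R gain, B gain).
WB_GAINS = {
    ("102L", "107L"): (1.125, 0.875),
    ("102L", "107R"): (1.25, 0.75),
    ("102R", "107L"): (0.875, 1.125),
    ("102R", "107R"): (1.0, 1.0),
}

def apply_white_balance(pixel, capture_unit, display_unit):
    """Scale R and B by the coefficient chosen for this pairing (G kept fixed)."""
    r_gain, b_gain = WB_GAINS[(capture_unit, display_unit)]
    r, g, b = pixel
    return (r * r_gain, g, b * b_gain)
```

When the display source switches at step S 203, the pairing key changes, which is exactly why a different coefficient value is used.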
- the camera-shake correction is performed in accordance with a combination of zoom ratios in the lens 101 L and the lens 101 R, the gyro sensor 105 L and the gyro sensor 105 R that are of the left and the right, respectively, the acceleration sensor 106 L, and the acceleration sensor 106 R that are of the left and the right, respectively. That is, the camera-shake correction is performed by independently adjusting control of the optical axis of the lens 101 L and the optical axis of the lens 101 R.
- the cropping processing at S 202 described above will be described in more detail. Note that here, a method of cropping while comparing the obtained images via the lenses 101 R and 101 L will be described.
- The present processing is accomplished by causing the one or more processors of the control unit 104 to expand and execute, on the RAM, a program stored in the ROM.
- the control unit 104 extracts, as feature points, edge portions of the subjects from the obtained image 200 L and the obtained image 200 R obtained at step S 202 ( FIG. 2 ).
- FIG. 7 an image example in a case where the feature points in the obtained image 200 L and the obtained image 200 R are extracted is schematically illustrated. In this example, detected feature points are indicated by black dots.
- The control unit 104 calculates a ratio between the areas of the regions formed by connecting the feature points in the two images. In a case where the obtained image 200 R is taken as a reference image, this area ratio corresponds to an enlargement ratio applied after the obtained image 200 L is cropped. Note that the control unit 104 also determines, from the calculated enlargement ratio, the size of the region to be cropped in the obtained image 200 L.
- the control unit 104 calculates respective gravity center positions of the regions formed by connecting the feature points.
- the gravity center positions are indicated by x marks.
- the control unit 104 further calculates a difference in the respective gravity center positions, and determines a crop position from center coordinates of the obtained image 200 L and the “difference in the gravity center positions.”
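Steps S 601 and S 602 as described can be sketched as follows, under the assumption that the feature points outline corresponding polygons in the two images; the helper names, and the reading that the crop center follows the subject centroid in the wide image, are illustrative rather than taken from the patent:

```python
def centroid(points):
    # Gravity-center position of the feature points.
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def polygon_area(points):
    # Shoelace formula over the feature points taken in order.
    s = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def crop_parameters(pts_wide, pts_tele, wide_center):
    # S601: linear enlargement ratio from the square root of the area ratio.
    scale = (polygon_area(pts_tele) / polygon_area(pts_wide)) ** 0.5
    # S602: crop position = image center offset by the centroid difference,
    # i.e. the crop window is re-centered on the subject in the wide image.
    cw = centroid(pts_wide)
    dx, dy = cw[0] - wide_center[0], cw[1] - wide_center[1]
    crop_center = (wide_center[0] + dx, wide_center[1] + dy)
    return scale, crop_center
```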
- the control unit 104 corrects the crop position, based on distance information provided from the distance measuring unit 109 . This is because the left image and the right image can generally correspond to each other in accordance with the crop position determined at step S 602 , but it is actually necessary to correct the crop position in accordance with the distance to the subject.
- FIG. 8 illustrates a relation between a subject distance and a shift amount of subject positions in the left and right image capturing units. Dashed lines illustrated in FIG. 8 indicate the optical axis center of the lens 101 L and the optical axis center of the lens 101 R, respectively.
- a reference image is the obtained image 200 R, and a position of the subject captured by the image capturing unit 102 R invariably corresponds to the optical axis regardless of the distance to the subject.
- a position of the subject captured by the image capturing unit 102 L shifts from a position on the optical axis in accordance with the subject distance.
- The shift amount is indicated by an arrow; the closer the distance to the subject, the larger the shift amount becomes.
- If the crop position determined at step S 602 is used without correction, the subject appears to invariably exist on the optical axes of the left and right image capturing units (since the shift amount according to the subject distance is not considered, this differs from the actual state). That is, the distance to the subject or a stereoscopic effect appears different from the actual state.
- the control unit 104 corrects the position by the shift amount (by the amount indicated by the arrow in FIG. 8 ) from the crop position determined at step S 602 . Specifically, the control unit 104 calculates the shift amount by using a known triangulation method from the distance to the subject detected by the distance measuring unit 109 , and the base line lengths and the focal distances of the lens 101 L and the lens 101 R.
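The shift amount follows from a standard small-angle triangulation relation: a camera offset from the reference axis by the base line length b sees a subject at distance d displaced on the sensor by roughly f·b/d. The sketch below assumes metric units and a pixel-pitch conversion, neither of which is specified in the patent:

```python
def parallax_shift_px(distance_m, baseline_m, focal_mm, pixel_pitch_um):
    """Approximate on-sensor shift (S603): displacement ~= f * b / d,
    converted from millimetres on the sensor to pixels."""
    shift_mm = focal_mm * baseline_m / distance_m  # displacement on the sensor
    return shift_mm * 1000.0 / pixel_pitch_um      # mm -> um -> pixels
```

For example, a 50 mm focal distance, 60 mm base line, and a subject 3 m away give a 1 mm shift on the sensor, which is 200 pixels at a 5 um pixel pitch; the crop position would be corrected by this amount.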
- The control unit 104 extracts an image to be cropped from the obtained image 200 L, and subjects the extracted image to enlargement processing. For instance, the control unit 104 performs the image extraction processing and the enlargement processing with respect to the obtained image 200 L, based on the crop position and the enlargement ratio calculated in the operations at steps S 601 to S 603.
- control unit 104 causes the display unit 107 L to display an extracted and enlarged image.
- The control unit 104 subsequently ends the present processing, and the processing returns to step S 202, which is the call source.
- the obtained image 200 L and the obtained image 200 R are compared to calculate the crop position, and further the crop position is corrected by the shift amount according to the subject distance. In this manner, a cropped image having a reduced sense of unnaturalness can be displayed to the user.
- The control unit 104 may perform the cropping by a method different from the cropping processing described above. For instance, first, a relation between crop magnification and a crop position corresponding to a zoom position is measured in advance at the time of manufacture of the electronic binoculars 100, and data obtained by the measurement is stored in, for instance, the ROM or the like. Then, when the electronic binoculars 100 is used, the control unit 104 reads out the crop magnification and the crop position with respect to zoom position information and performs image processing. In this case, for instance, a configuration where the zoom position can be detected by attaching a potentiometer to the lens 101 L and the lens 101 R may be used. In this configuration, a load of the control unit 104 can be alleviated and an effect of reducing power consumption can be expected, in comparison to the above-described method of performing the cropping processing by comparing the images.
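A minimal sketch of such a lookup-table approach, assuming linear interpolation between calibration points measured at manufacture (all table values are invented for illustration):

```python
# Hypothetical factory-calibration table: zoom position (encoder units) ->
# (crop magnification, crop center x, crop center y).
CROP_LUT = [
    (0,   (1.0, 960.0, 540.0)),
    (50,  (2.0, 955.0, 542.0)),
    (100, (4.0, 950.0, 545.0)),
]

def lookup_crop(zoom_pos):
    """Linearly interpolate the stored calibration between zoom positions,
    avoiding per-frame feature matching."""
    lo_pos, lo_val = CROP_LUT[0]
    for hi_pos, hi_val in CROP_LUT[1:]:
        if zoom_pos <= hi_pos:
            t = (zoom_pos - lo_pos) / (hi_pos - lo_pos)
            return tuple(a + t * (b - a) for a, b in zip(lo_val, hi_val))
        lo_pos, lo_val = hi_pos, hi_val
    return lo_val  # clamp beyond the last calibration point
```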
- optical characteristics of the lens vary due to temperature and humidity in a use environment, and thus more accurate display can be performed by the cropping processing including the comparison of the images.
- control unit 104 is configured to cause the display units of the left and the right to display, under the zoom driving, the image captured at a wide angle.
- the displaying in this manner may reduce the distance to the subject and the stereoscopic effect as described by referring to FIG. 8 .
- the sense of unnaturalness of the display of the left and the right may be alleviated by performing the cropping processing also under the zoom driving.
- Note that, under the zoom driving, the displayed subject is changing, and thus the tolerance for a display error increases.
- a first image and a second image having a wider angle of view than an angle of view of the first image are obtained, and each of the two display units is controlled to display an image based on the first image or the second image.
- the second image having a wide angle of view is displayed on at least one of the two display units. In this manner, even in a case where a subject shifts to the extent that the subject lies beyond an angle of view, it becomes possible to facilitate capturing of the subject.
- the second embodiment differs from the first embodiment in that optical zoom is controllable also in a lens 101 L.
- a subject capturing timer is used to detect that a main subject stably exists in an obtained image.
- This subject capturing timer may be implemented inside a control unit 104; when it is detected that the subject exists in a predetermined range of the image, the timer starts. Then, based on whether the subject continues to be detected for a certain time, it is determined whether or not the subject stably exists in the predetermined range.
- In this manner, the control unit 104 can distinguish between a subject that, moving at high speed, exists in the predetermined range only momentarily and a subject that can be kept captured stably.
- the display control processing in the present embodiment is accomplished by causing one or more processors of the control unit 104 to expand and execute, on a RAM, a program stored in a ROM.
- control unit 104 executes processing of steps S 200 to S 204 , and performs display control in accordance with a zoom driving state or a capture state of the subject.
- the control unit 104 sets a predetermined time in the subject capturing timer, which serves as a capture stability detection unit.
- the predetermined time may be a value set in advance (for instance, three seconds), or may be changed by a user.
- control unit 104 may transition to step S 901 and may simultaneously execute step S 202 .
- At step S902, the control unit 104 starts a countdown by the timer. Then, at step S903, the control unit 104 determines whether the timer value is zero; if so, the processing proceeds to step S904, and otherwise the processing returns to step S902. That is, the control unit 104 repeats the countdown at step S902 until the timer value becomes zero, and then the processing proceeds to step S904.
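The countdown of steps S902 and S903 can be sketched with an elapsed-time check rather than a busy loop; the class and method names here are illustrative assumptions, not the implementation of the present disclosure.

```python
class SubjectCaptureTimer:
    """Illustrative subject capturing timer: starts when the subject
    enters the predetermined range, and reports stable capture only
    if the subject stays there until the hold time expires."""

    def __init__(self, hold_seconds: float = 3.0):
        self.hold_seconds = hold_seconds
        self.deadline = None

    def update(self, subject_in_range: bool, now: float) -> bool:
        if not subject_in_range:
            self.deadline = None  # subject left the range: reset the timer
            return False
        if self.deadline is None:
            self.deadline = now + self.hold_seconds  # start the countdown
        return now >= self.deadline  # True once the subject has been held long enough

timer = SubjectCaptureTimer(hold_seconds=3.0)
assert not timer.update(True, 0.0)   # countdown just started
assert not timer.update(True, 2.9)   # still counting down
assert timer.update(True, 3.0)       # held for the predetermined time
assert not timer.update(False, 4.0)  # subject lost: timer resets
```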
- the control unit 104 changes the cropping size of the image of the image capturing unit 102L while performing the zoom driving on the lens 101L, and causes the display unit 107L to display the cropped image.
- the control unit 104 performs the zoom driving in combination with the change of the cropping size such that the size of the subject displayed on the display unit 107L is maintained. In this manner, it becomes possible to increase the zoom magnification without giving a sense of unnaturalness to the user looking at the image on the display unit.
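The relation implied above can be sketched minimally by assuming that the displayed magnification is the product of the optical magnification and the electronic (crop) enlargement; the function name and this product model are illustrative assumptions.

```python
def crop_magnification(target_display_mag: float, optical_mag: float) -> float:
    """Electronic (crop) enlargement needed so that
    optical_mag * crop_mag == target_display_mag,
    clamped to 1.0 once optical zoom alone reaches the target."""
    return max(1.0, target_display_mag / optical_mag)

# As the lens zooms from 1x toward a 4x target, the crop enlargement
# shrinks, so the resolution loss caused by cropping decreases.
steps = [crop_magnification(4.0, m) for m in (1.0, 2.0, 4.0)]
assert steps == [4.0, 2.0, 1.0]
```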
- the zoom driving of the lens 101L and the change of the cropping size are continued until the zoom magnifications of the lens 101L and a lens 101R become identical.
- the control unit 104 continues display in that state.
- an obtained image 200 L is an image obtained by the image capturing unit 102 L
- an obtained image 200 R is an image obtained by an image capturing unit 102 R.
- An image having passed through the lens 101L has a wider angle of view than an image having passed through the lens 101R, and thus the subject in the obtained image 200L is smaller than the subject in the obtained image 200R.
- the control unit 104 performs the zoom driving on the lens 101 L (such that angles of view of the two obtained images come close to each other)
- the subject is enlarged, and the obtained image 200 L becomes an obtained image 1000 .
- a display image becomes a display image 1001 .
- the control unit 104 further adjusts the cropping size such that the size of the subject becomes identical to the size of the subject in the obtained image 200 L. In this manner, enlargement magnification of the cropping of the display image 1001 is reduced in comparison with enlargement magnification of the cropping of a display image 201 L, and thus a rate of a decrease in resolution of the image due to the cropping is reduced, and degradation in image quality can be suppressed.
- control unit 104 enlarges, by further increasing the zoom magnification, the subject size in the obtained image 1000 until the subject size in the obtained image 1000 becomes equal to the subject size in the obtained image 200 R. In this manner, in a case where the subject can be captured stably, it is unnecessary to perform the cropping of the display image 201 L, and thus degradation in image quality due to the cropping can be prevented.
- the zoom driving of the lens 101 L and the cropping size are continuously changed, and thus degradation in image quality due to the cropping processing of the display image 201 L can be prevented.
- the display image 201L may be held temporarily in a memory and kept for a certain time (for instance, 0.1 seconds), so that the display image is changed step by step, rather than continuously.
- the zoom driving of the lens 101L can be performed at high speed and over a large range, and the zoom magnification at a passing point of the zoom driving can be predicted to determine the cropping size and perform the cropping processing. In other words, it is also possible to reduce the time until completion of the zoom driving of the lens.
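Predicting the magnification at a passing point could look like the following sketch, which assumes, purely for illustration, that the zoom drive progresses linearly in time; all names and the linear model are assumptions.

```python
def predicted_magnification(start_mag: float, end_mag: float,
                            elapsed: float, duration: float) -> float:
    """Predict the optical magnification partway through a zoom drive,
    assuming (for illustration) linear progress over the drive duration."""
    frac = min(max(elapsed / duration, 0.0), 1.0)
    return start_mag + (end_mag - start_mag) * frac

def crop_width_for(sensor_width: int, target_mag: float, current_mag: float) -> int:
    """Crop width that keeps the displayed magnification at target_mag
    while the optical magnification is still at current_mag."""
    return round(sensor_width * current_mag / target_mag)

mag = predicted_magnification(1.0, 4.0, elapsed=0.5, duration=1.0)
assert mag == 2.5
assert crop_width_for(4000, 4.0, mag) == 2500
```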
- In the present embodiment, description was made taking, as an example, the case of using the subject capturing timer for the capture stability detection processing.
- the present embodiment is not limited to this example, and any other method may be used as long as it can be detected that the subject stably exists in the predetermined range of the image.
- the electronic binoculars 100 may be equipped with an acceleration sensor (not illustrated), and when acceleration is almost at zero, it may be determined that the subject is captured.
- image identification may be executed continuously by the control unit 104 , and it may be determined based on continuous identification results that the subject stably exists.
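The alternative stability checks mentioned above (continuous identification results, near-zero acceleration) might be combined as in this illustrative sketch; the function name and the threshold value are assumptions, not part of the present disclosure.

```python
def subject_stable(in_range_history: list[bool],
                   acceleration_magnitude: float,
                   accel_threshold: float = 0.05) -> bool:
    """The subject is considered stably captured if it was detected in
    every recent frame AND the device is nearly motionless."""
    return all(in_range_history) and acceleration_magnitude < accel_threshold

assert subject_stable([True, True, True], 0.01)
assert not subject_stable([True, False, True], 0.01)   # detection dropped out
assert not subject_stable([True, True, True], 0.20)    # device still moving
```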
- zoom in the lens 101 R is optical zoom
- zoom in the lens 101 R is not limited to optical zoom
- electronic zoom (cropping processing) may be used.
- Embodiment(s) of the disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
- the computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
- the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
- the storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read-only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
Abstract
An apparatus includes a first capturing unit configured to obtain a first image; and a second capturing unit configured to obtain a second image having a wider angle of view than an angle of view of the first image. The apparatus controls each of a first display device corresponding to one eye of a user and a second display device corresponding to the other eye of the user to display an image based on the first image or the second image. In a case where no subject exists in the first image, the apparatus causes at least one of the first display device and the second display device to display the second image.
Description
- The aspect of the embodiments relates to an apparatus, a method of the same, and a storage medium that utilizes captured images having different angles of view.
- In the related art, there is known an electronic apparatus such as electronic binoculars that makes a far away subject observable by capturing an image of an observation object (subject) by using an image capturing element, and displaying an image obtained by the image capture.
- In such an electronic apparatus, in a case where the subject slightly shakes and moves, or in a case where a camera shake occurs, the position of the subject within a displayed image shifts, and it becomes difficult to observe the subject. To address such a situation, Japanese Patent No. 5223486 proposes a technology in which, in electronic binoculars including two image capturing units and two display units, an image obtained by correcting variation of movement of a casing, based on a sensor signal, is displayed.
- However, in a case where a user loses sight of a subject due to movement of the subject, such as a shift to the outside of the angle of view, it is difficult to search for the subject by continuing to look through the electronic binoculars. In this case, the user has to perform an operation such as temporarily taking their eyes off the electronic binoculars to identify the subject position with the naked eye, and then looking through the binoculars again.
- One aspect of the embodiments provides an apparatus comprising: a first capturing unit configured to obtain a first image; a second capturing unit configured to obtain a second image having a wider angle of view than an angle of view of the first image; and a control unit configured to control each of a first display device corresponding to one eye of a user and a second display device corresponding to the other eye of the user to display an image based on the first image or the second image, wherein, in a case where no subject exists in the first image, the control unit causes at least one of the first display device and the second display device to display the second image.
- Another aspect of the embodiments provides a method of an apparatus which includes a first capturing unit configured to obtain a first image; and a second capturing unit configured to obtain a second image having a wider angle of view than an angle of view of the first image, the method comprising: controlling each of a first display device corresponding to one eye of a user and a second display device corresponding to the other eye of the user to display an image based on the first image or the second image, wherein, in a case where no subject exists in the first image, the controlling includes causing at least one of the first display device and the second display device to display the second image.
- Still another aspect of the embodiments provides a non-transitory computer-readable storage medium comprising instructions for performing a method of an apparatus which includes a first capturing unit configured to obtain a first image; and a second capturing unit configured to obtain a second image having a wider angle of view than an angle of view of the first image, the method comprising: controlling each of a first display device corresponding to one eye of a user and a second display device corresponding to the other eye of the user to display an image based on the first image or the second image, wherein, in a case where no subject exists in the first image, the controlling includes causing at least one of the first display device and the second display device to display the second image.
- According to the aspect of the embodiments, even in a case where a subject shifts to the extent that the subject lies beyond an angle of view, it becomes possible to facilitate capturing of the subject.
- Further features of the disclosure will become apparent from the following description of exemplary embodiments (with reference to the attached drawings).
- FIG. 1 is a block diagram illustrating a configuration example of electronic binoculars related to a first embodiment of the disclosure.
- FIG. 2 is a flowchart illustrating a sequence of operations of display control processing related to the first embodiment.
- FIG. 3 is a figure illustrating an example of image display in a display unit at normal time related to the first embodiment.
- FIG. 4 is a figure illustrating an example of image display in the display unit in the case of losing sight of a subject related to the first embodiment.
- FIG. 5 is a figure illustrating an example of image display in the display unit at the time of zoom driving related to the first embodiment.
- FIG. 6 is a flowchart illustrating a sequence of operations in cropping processing related to the first embodiment.
- FIG. 7 is a figure illustrating a sequence of operations in the cropping processing related to the first embodiment.
- FIG. 8 is a figure illustrating a relation between a distance and a shift amount of an image capture position related to the first embodiment.
- FIG. 9 is a flowchart illustrating a sequence of operations of display control processing related to a second embodiment.
- FIG. 10 is a figure illustrating an example of image display in a display unit at normal time related to the second embodiment.
- Hereinafter, embodiments will be described in detail with reference to the attached drawings. Note, the following embodiments are not intended to limit the scope of the disclosure. Multiple features are described in the embodiments, but limitation is not made to a disclosure that requires all such features, and multiple such features may be combined as appropriate. Furthermore, in the attached drawings, the same reference numerals are given to the same or similar configurations, and redundant description thereof is omitted.
- As an example of an electronic apparatus, electronic binoculars that can capture images having different angles of view with a plurality of image capturing units will be described below. However, the present embodiment is not limited to electronic binoculars and is also applicable to other devices that can capture images having different angles of view with a plurality of image capturing units. These devices may include, for instance, a spectacle-type information terminal, a digital camera, a mobile phone including a smartphone, a game machine, a tablet terminal, an endoscope, and a medical device for surgery.
- Configuration of Electronic Binoculars 100 -
FIG. 1 is a block diagram illustrating a functional configuration example of electronic binoculars 100 as an example of an electronic apparatus of the present embodiment. Note that one or more of the functional blocks illustrated in FIG. 1 may be accomplished by hardware such as an ASIC or a programmable logic array (PLA), or may be accomplished by causing a programmable processor such as a CPU or an MPU to execute software. In addition, one or more of the functional blocks illustrated in FIG. 1 may be accomplished by a combination of software and hardware. - A
lens 101L of the left side and a lens 101R of the right side are arranged at a predetermined interval in front of the electronic binoculars 100. Each of an image capturing unit 102L of the left side and an image capturing unit 102R of the right side captures a subject image having passed through the lens 101L or the lens 101R, and outputs an image signal. Each of the lens 101L and the lens 101R may include a plurality of lenses. For instance, focus adjustment is accomplished by shifting a focus lens among the plurality of lenses along an optical axis. In addition, the lens 101L and the lens 101R each include a zoom lens having a focal distance that becomes variable by a shift of several lens groups. Note that the focal distances of the lens 101L and the lens 101R are independently changeable, and thus the focal distances of the respective lenses can differ from each other. In the present embodiment, the lens 101L and the lens 101R are each provided with a zoom lens having a variable focal distance, but the lens 101L and the lens 101R may each be provided with a fixed focal length lens having a fixed focal distance. - The
image capturing unit 102L and the image capturing unit 102R each include, for instance, an image sensor (image capturing element) of a Complementary Metal Oxide Semiconductor (CMOS) type, or the like, and output a captured image signal. The image sensor may instead be any of various other types of image sensors, such as a Charge Coupled Device (CCD). In addition, the image capturing unit 102L and the image capturing unit 102R are each provided with a readout circuit or the like suitable for its image sensor. The image capturing unit 102L and the image capturing unit 102R obtain, as images, subject images having passed through the lens 101L and the lens 101R having different focal distances, and thus the image capturing unit 102L and the image capturing unit 102R can obtain images having different angles of view. - A
linear motor 103L and a linear motor 103R can shift the lenses of the lens 101L and the lens 101R, respectively, to perform focus adjustment and zoom driving. The zoom magnification of the left and right lenses can be changed individually by separately controlling the linear motor 103L and the linear motor 103R. - A
control unit 104 includes one or more processors, a RAM and a ROM, and executes various types of processing of the electronic binoculars 100 by causing the one or more processors to expand and execute, on the RAM, a program stored in the ROM. For instance, cropping processing is performed such that an obtained image obtained (captured) by the image capturing unit 102L and an obtained image obtained by the image capturing unit 102R have an identical angle of view, and an image subjected to the cropping processing is displayed on at least one (for example, both) of display units 107L and 107R described below. In addition, while the lens 101L and the lens 101R are being driven, the control unit 104 also controls display with respect to the display units 107L and 107R. - The
control unit 104 determines whether or not a subject is captured within an angle of view, and performs white balance adjustment processing by using images captured by the image capturing unit 102L and the image capturing unit 102R. In addition, the control unit 104 calculates a camera shake amount in the image capturing unit, based on information from a gyro sensor 105L, a gyro sensor 105R, an acceleration sensor 106L and an acceleration sensor 106R. In addition, camera-shake correction can be performed by controlling the optical axes of the lens 101L and the lens 101R by an optical axis control unit (not illustrated). - Each of the
display unit 107L and the display unit 107R includes a display panel; under display control by the control unit 104, each of the display units displays an image obtained by the image capturing unit 102L or the image capturing unit 102R. The display unit 107L and the display unit 107R are attached to a movable unit 108. The movable unit 108 is configured to slide, or to cause the electronic binoculars 100 to bend axisymmetrically, such that the positions of the display unit 107L and the display unit 107R can match the interval between the left and right eyes of a person. - A
distance measuring unit 109 is a unit that measures a distance from the electronic binoculars 100 to a subject. A state switching unit 110 is configured to, according to a manual operation described below, switch whether or not a subject is being captured, or to enable a user to switch display contents of the display unit 107L and the display unit 107R. For instance, in a case where the state switching unit 110 includes a push button switch, when the push button switch is pressed once within a predetermined period of time, switching of subject capture information can be performed, and when the push button switch is pressed twice in succession within a predetermined period of time, switching of display contents of the display unit 107L and the display unit 107R can be performed.
- Sequence of Operations of Display Control Processing
- Next, by referring to FIG. 2, display control processing of controlling display of the display unit 107L and the display unit 107R in the electronic binoculars 100 will be described. In the display control processing of the present embodiment, the zoom magnification of the lens 101R described above is controlled by an optical zoom mechanism. On the other hand, description will be made taking as an example the case where the angle of view of the lens 101L is controlled to be wider than the angle of view of the lens 101R and optical zoom is not performed. Note that the present processing is accomplished by causing the one or more processors of the control unit 104 to expand and execute, on the RAM, a program stored in the ROM. - At step S200, the
control unit 104 determines whether or not the lens 101R is under the zoom driving (that is, whether the zoom lens is being driven by the optical zoom mechanism). In a case where, for instance, the control unit 104 obtains a zoom state of the lens 101R and determines that the lens 101R is under the zoom driving, the processing proceeds to step S204, and in a case where the control unit 104 determines that the lens 101R is not under the zoom driving, the processing proceeds to step S201. - At step S201, the
control unit 104 determines whether or not a subject is captured in an image obtained through the lens 101R equipped with the optical zoom mechanism. This determination can be performed by, for instance, determining whether or not the subject exists in the image, or determining that the subject no longer exists in the image. The control unit 104 performs this determination by, for instance, automatically determining whether or not the subject in the image can be recognized by image recognition processing. Alternatively, the control unit 104 may perform this determination by using a manual operation input to the state switching unit 110 that is performed after a user has confirmed the image obtained through the lens 101R. In a case where the control unit 104 determines that the subject is captured in the image, the processing proceeds to step S202, and in a case where the control unit 104 determines that the subject is not captured in the image, the processing proceeds to step S203. - At step S202, the
control unit 104 causes the display unit 107L and the display unit 107R to display respective images in which the subject is captured. A display method of the present step will be described by referring to FIG. 3. - An obtained
image 200L illustrated in FIG. 3 represents an image obtained by the image capturing unit 102L, and an obtained image 200R represents an image obtained by the image capturing unit 102R. As described above, the angle of view of the lens 101L is controlled to be wider than the angle of view of the lens 101R, and thus the subject in the obtained image 200L becomes smaller than the subject in the obtained image 200R. - The
control unit 104 performs the cropping processing on a region 202 of a portion of the obtained image 200L to generate a display image 201L, and causes the display unit 107L to display the display image 201L. A display image 201R is identical to the obtained image 200R, and the obtained image 200R is displayed as it is on the display unit 107R. Note that details of the cropping processing by the control unit 104 will be described below by referring to FIG. 6. The angles of view of the display image 201L and the display image 201R thereby become identical. - At step S203, the
control unit 104 has lost sight of the subject in the state at step S202, and thus the control unit 104 causes the display unit 107L and the display unit 107R to display the image of the image capturing unit 102L. A specific display method of the present step will be described by referring to FIG. 4. - An obtained
image 300L represents an image obtained by the image capturing unit 102L, and an obtained image 300R represents an image obtained by the image capturing unit 102R. At this time, the subject is not captured in the obtained image 300R having passed through the lens 101R performing optical zoom, and thus no subject exists in the obtained image 300R. On the other hand, the image having passed through the lens 101L is captured at a wide angle, and thus the subject exists in the obtained image 300L. - Thus, the
control unit 104 causes the display unit 107L to display, as a display image 301L, the obtained image 300L as it is. On the other hand, the control unit 104 causes the display unit 107R to display the obtained image 300L (in place of the obtained image 300R) as it is. At this time, the control unit 104 may cause a magnification frame 302 indicating the current optical zoom to be displayed in a display image 301R to indicate that the display image 301R is not an image obtained through the lens 101R. - In this manner, when the subject is not captured at the angle of view of the lens 101R performing optical zoom, the image obtained through the lens 101L having a wide angle is displayed on both the display unit 107L and the display unit 107R. The user can therefore find the subject while continuing to look at the display unit 107L and the display unit 107R. Note that, here, the obtained image 300L is displayed on both the display units, but the obtained image 300L may be displayed on only one of the display units (for instance, the display unit 107L). - At step S204, since the user is performing the zoom driving on the
lens 101R by the optical zoom mechanism, the control unit 104 causes the display unit 107L and the display unit 107R to display the image of the image capturing unit 102R. By referring to FIG. 5, a display method of step S204 will be described. - Note that an obtained
image 400L illustrated in FIG. 5 represents an image obtained by the image capturing unit 102L, and an obtained image 400R represents an image obtained by the image capturing unit 102R. - In a case where the zoom lens is being driven (under the zoom driving) in the
lens 101R, the size of the subject in the obtained image 400R varies. Thus, the control unit 104 causes the display unit 107L and the display unit 107R to display the obtained image 400R as it is, as a display image 401L and a display image 401R, and does not use the obtained image 400L for display. In this manner, under the zoom driving, the obtained image 400R obtained while driving the zoom lens is displayed on each of the display units, and accordingly, an image shift or delay can be reduced in comparison to the case of performing the cropping processing in real time. Thus, it becomes easy for the user to set a display image at a desired angle of view.
- Note that, at steps S202 to S204 described above, predetermined image processing such as white balance processing and camera-shake correction processing may be applied when an image is displayed. Here, the matters particularly relating to the present embodiment are described and general description of the image processing is omitted; however, the present embodiment is not limited to the matters described here.
- In a case where the white balance processing is performed, an image signal is to be adjusted based on a combination of characteristics of the image capturing side and the display side. That is, an image is to be corrected in consideration of both manufacturing variation of the color sensitivity ratio in the image sensor of the image capturing unit and manufacturing variation of the color light emission efficiency of the display panel.
- In the processing at step S202 described above, an appropriate adjustment coefficient is used for the combination of the image capturing unit 102L and the display unit 107L on one hand, an appropriate adjustment coefficient is used for the combination of the image capturing unit 102R and the display unit 107R on the other hand, and the white balance adjustment is performed.
- At step S204, an appropriate adjustment coefficient is used for the combination of the image capturing unit 102R and the display unit 107L on one hand, and an appropriate adjustment coefficient is used for the combination of the image capturing unit 102R and the display unit 107R on the other hand. That is, the combination of the image capturing unit and the display unit is different from the combination at step S202, and thus a different value of the adjustment coefficient is also used.
- In addition, the same applies to the camera-shake correction. That is, the camera-shake correction is performed in accordance with the combination of the zoom ratios of the lens 101L and the lens 101R, the gyro sensor 105L and the gyro sensor 105R of the left and the right, respectively, and the acceleration sensor 106L and the acceleration sensor 106R of the left and the right, respectively. That is, the camera-shake correction is performed by independently adjusting control of the optical axis of the lens 101L and the optical axis of the lens 101R.
- Sequence of Operations Related to Cropping Processing
- Further, the cropping processing at step S202 described above will be described in more detail. Note that here, a method of cropping while comparing the images obtained via the lenses will be described. - At step S600, the
control unit 104 extracts, as feature points, edge portions of the subjects from the obtained image 200L and the obtained image 200R obtained at step S202 (FIG. 2). For instance, FIG. 7 schematically illustrates an image example in a case where the feature points in the obtained image 200L and the obtained image 200R are extracted. In this example, the detected feature points are indicated by black dots. - At step S601, the
control unit 104 calculates a ratio of areas from the areas of the regions formed by connecting the feature points. In a case where the obtained image 200R is taken as a reference image, this ratio of areas corresponds to the enlargement ratio obtained after the obtained image 200L is cropped. Note that the control unit 104 also determines, from the calculated enlargement ratio, the size of the region to be cropped in the obtained image 200L. - At step S602, the
control unit 104 calculates the respective gravity center positions of the regions formed by connecting the feature points. In the example illustrated in FIG. 7, the gravity center positions are indicated by x marks. The control unit 104 further calculates the difference between the respective gravity center positions, and determines a crop position from the center coordinates of the obtained image 200L and the difference between the gravity center positions. - At step S603, the
control unit 104 corrects the crop position, based on distance information provided from the distance measuring unit 109. This is because, although the left image and the right image can generally be made to correspond to each other with the crop position determined at step S602, it is actually necessary to correct the crop position in accordance with the distance to the subject. - In FIG. 8, a relation between the subject distance and the shift amount of the subject positions in the image capturing units of the left and the right is illustrated. The dashed lines illustrated in FIG. 8 indicate the optical axis center of the lens 101L and the optical axis center of the lens 101R, respectively. The reference image is the obtained image 200R, and the position of the subject captured by the image capturing unit 102R invariably corresponds to the optical axis regardless of the distance to the subject. On the other hand, the position of the subject captured by the image capturing unit 102L shifts from the position on the optical axis in accordance with the subject distance. In FIG. 8, the shift amount is indicated by an arrow; the closer the subject, the larger the shift amount. - Thus, when cropping and displaying are performed at the position calculated at step S602, the subject invariably appears on the optical axes of the image capturing units of the left and the right (since the shift amount according to the subject distance is not considered, this differs from the actual state). That is, the distance to the subject or the stereoscopic effect appears different from the actual state. - Then, the
control unit 104 corrects the position by the shift amount (by the amount indicated by the arrow in FIG. 8) from the crop position determined at step S602. Specifically, the control unit 104 calculates the shift amount by a known triangulation method from the distance to the subject detected by the distance measuring unit 109, and the base line lengths and the focal distances of the lens 101L and the lens 101R. - At step S604, the
control unit 104 extracts the image to be cropped from the obtained image 200L, and subjects the extracted image to enlargement processing. For instance, the control unit 104 performs the image extraction processing and the enlargement processing on the obtained image 200L, based on the crop position and the enlargement ratio calculated in the operations at steps S601 to S603. - At step S605, the
control unit 104 causes the display unit 107L to display the extracted and enlarged image. The control unit 104 subsequently ends the present processing, and the processing returns to step S202, which is the call source. - In this manner, in the cropping processing related to the present embodiment, the obtained
image 200L and the obtained image 200R are compared to calculate the crop position, and the crop position is further corrected by the shift amount according to the subject distance. In this manner, a cropped image with a reduced sense of unnaturalness can be displayed to the user. - Note that the
control unit 104 may perform the cropping by a method different from the cropping processing described above. For instance, the relation between the crop magnification and the crop position corresponding to each zoom position is first measured in advance at the time of manufacture of the electronic binoculars 100, and the data obtained by the measurement is stored in, for instance, the ROM. Then, when the electronic binoculars 100 is used, the control unit 104 reads out the crop magnification and the crop position corresponding to the zoom position information and performs the image processing. In this case, for instance, a configuration in which the zoom position can be detected by attaching a potentiometer to the lens 101L and the lens 101R may be used. With this configuration, the load on the control unit 104 can be alleviated, and a reduction in power consumption can be expected, in comparison with the above-described method of performing the cropping processing by comparing the images. - On the other hand, the optical characteristics of the lens (magnification, the lens optical axis center, and the like) actually vary with the temperature and humidity of the use environment, and thus more accurate display can be performed by the cropping processing that includes the comparison of the images.
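The table-based approach just described can be sketched as a lookup with linear interpolation between calibration points measured at manufacture. The table contents, value ranges, and the choice of linear interpolation below are illustrative assumptions, not values from the disclosure.

```python
import bisect

# Hypothetical calibration measured at manufacture and stored in ROM:
# zoom position -> (crop magnification, crop x, crop y). Values are illustrative.
CALIBRATION = [
    (0.0, (1.00, 960, 540)),
    (0.5, (1.50, 940, 530)),
    (1.0, (2.00, 920, 520)),
]

def lookup_crop(zoom_position):
    """Read the pre-measured crop magnification and position for the current
    zoom position (e.g., reported by a potentiometer on the lens), linearly
    interpolating between neighboring calibration points."""
    positions = [p for p, _ in CALIBRATION]
    i = bisect.bisect_left(positions, zoom_position)
    if i == 0:
        return CALIBRATION[0][1]
    if i == len(CALIBRATION):
        return CALIBRATION[-1][1]
    (p0, v0), (p1, v1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (zoom_position - p0) / (p1 - p0)
    return tuple(a + t * (b - a) for a, b in zip(v0, v1))

mag, cx, cy = lookup_crop(0.25)
```

Because this replaces per-frame image comparison with a table read, it matches the text's point that the processing load, and hence power consumption, of the control unit can be reduced.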
- In the first embodiment described above, the
control unit 104 is configured to cause the left and right display units to display, during the zoom driving, the image captured at a wide angle. Displaying in this manner may reduce the apparent distance to the subject and the stereoscopic effect, as described with reference to FIG. 8. The sense of unnaturalness between the left and right displays may therefore be alleviated by performing the cropping processing during the zoom driving as well. In addition, it is also conceivable that the tolerance for a display error increases (due to variation of the subject) during the zoom driving. Thus, it becomes possible to provide a display image at high speed and with sufficient display quality by using, for instance, the above-described method of performing the cropping processing based on the magnification and the position stored in advance, in which the processing load on the control unit 104 is reduced. - As described above, in the present embodiment, a first image and a second image having a wider angle of view than the angle of view of the first image are obtained, and each of the two display units is controlled to display an image based on the first image or the second image. In particular, in a case where no subject exists in the first image having the narrow angle of view, the second image having the wide angle of view is displayed on at least one of the two display units. In this manner, even in a case where the subject moves to the extent that it lies beyond the angle of view, capturing the subject becomes easier.
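As a concrete illustration of the crop-and-enlarge operations of steps S604 and S605 summarized above, the sketch below extracts a region of the wide-angle image at a crop position and enlarges it by nearest-neighbor repetition. The use of NumPy, the integer enlargement ratio, and the array sizes are assumptions for illustration; an actual implementation would typically use proper interpolation for the enlargement.

```python
import numpy as np

def crop_and_enlarge(image, top, left, crop_h, crop_w, ratio):
    """Extract the crop region (step S604) and enlarge it by an integer
    ratio using nearest-neighbor repetition, as a stand-in for the
    enlargement processing performed before display (step S605)."""
    crop = image[top:top + crop_h, left:left + crop_w]
    return np.repeat(np.repeat(crop, ratio, axis=0), ratio, axis=1)

wide = np.arange(64).reshape(8, 8)      # stand-in for the wide-angle obtained image
display = crop_and_enlarge(wide, top=2, left=2, crop_h=4, crop_w=4, ratio=2)
# display has the original 8x8 size, built from the enlarged 4x4 crop
```

The crop position passed in would be the one calculated from the image comparison at step S602 and corrected by the distance-dependent shift at step S603.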
- Next, a second embodiment will be described. The second embodiment differs from the first embodiment in that optical zoom is controllable also in a
lens 101L. In addition, in the present embodiment, when it is determined whether the subject is captured, a subject capturing timer is used to detect that the main subject stably exists in the obtained image. This subject capturing timer may be constituted inside the control unit 104; the timer starts when it is detected that the subject exists in a predetermined range of the image. Then, based on whether the subject continues to be detected for a certain time, it is determined whether the subject stably exists in the predetermined range. In the capture stability detection processing using this subject capturing timer, the control unit 104 can determine whether a subject moving at high speed exists in the predetermined range only momentarily, or whether the subject can be kept captured stably. - Note that the other configurations of
electronic binoculars 100 related to the present embodiment are identical or substantially identical to the configurations in the first embodiment. Thus, the identical or substantially identical configurations are denoted by identical reference signs, description of those configurations is omitted, and the description below focuses on the differences. - Sequence of Operations of Display Control Processing
- By referring to
FIG. 9, display control processing of controlling display on the display unit 107L and the display unit 107R related to the present embodiment will be described. Note that, as in the first embodiment, the display control processing in the present embodiment is accomplished by causing one or more processors of the control unit 104 to load a program stored in the ROM into the RAM and execute it. - First, as in the first embodiment, the
control unit 104 executes the processing of steps S200 to S204, and performs display control in accordance with the zoom driving state and the capture state of the subject. - At step S901, after the
control unit 104 causes, at step S202, the display unit 107L to display an image obtained by cropping the image of the image capturing unit 102L, the control unit 104 sets a predetermined time in the subject capturing timer, which serves as a capture stability detection unit. Note that the predetermined time may be a value set in advance (for instance, three seconds), or may be changed by the user. - Note that, as a result of determining at step S201 that the subject is captured, the
control unit 104 may transition to step S901 and may execute step S202 simultaneously. - At step S902, the
control unit 104 starts a countdown by the timer. Then, at step S903, the control unit 104 determines whether the timer value is zero; in a case where the control unit 104 determines that the timer value is zero, the processing proceeds to step S904. On the other hand, in a case where the control unit 104 determines that the timer value is not zero, the processing returns to step S902. That is, the control unit 104 repeats the countdown at step S902 until the timer value becomes zero, and when the timer value becomes zero, the processing proceeds to step S904. - At step S904, the
control unit 104 changes the cropping size of the image of the image capturing unit 102L while performing the zoom driving on the lens 101L, and causes the display unit 107L to display the cropped image. At this time, the control unit 104 performs the zoom driving in combination with the change of the cropping size such that the size of the subject displayed on the display unit 107L is maintained. In this manner, it becomes possible to increase the zoom magnification without giving a sense of unnaturalness to the user looking at the image on the display unit. The zoom driving of the lens 101L and the change of the cropping size are continued until the zoom magnifications of the lens 101L and the lens 101R become identical. When the zoom magnifications of the lens 101L and the lens 101R become identical, the control unit 104 continues the display in that state. - Further, by referring to
FIG. 10, the display control at step S904 will be described. At the beginning of step S904, the obtained image 200L is the image obtained by the image capturing unit 102L, and the obtained image 200R is the image obtained by the image capturing unit 102R. The image having passed through the lens 101L has a wider angle than the image having passed through the lens 101R, and thus the subject in the obtained image 200L is smaller than the subject in the obtained image 200R. In this state, when the control unit 104 performs the zoom driving on the lens 101L (such that the angles of view of the two obtained images come close to each other), the subject is enlarged, and the obtained image 200L becomes the obtained image 1000. At this time, when the control unit 104 also simultaneously changes the cropping size to the region 1002, the display image becomes the display image 1001. At this time, the control unit 104 further adjusts the cropping size such that the size of the subject becomes identical to the size of the subject in the obtained image 200L. In this manner, the enlargement magnification of the cropping for the display image 1001 is reduced in comparison with the enlargement magnification of the cropping for the display image 201L, and thus the rate of decrease in the resolution of the image due to the cropping is reduced, and degradation in image quality can be suppressed. - Subsequently, the
control unit 104 enlarges, by further increasing the zoom magnification, the subject in the obtained image 1000 until the subject size in the obtained image 1000 becomes equal to the subject size in the obtained image 200R. In this manner, in a case where the subject can be captured stably, it becomes unnecessary to perform the cropping for the display image 201L, and thus degradation in image quality due to the cropping can be prevented. - As described above, in the present embodiment, after the display image obtained by the cropping processing from the obtained
image 200L is displayed, the zoom driving of the lens 101L and the cropping size are continuously changed, and thus degradation in image quality due to the cropping processing of the display image 201L can be prevented. - Note that in the present embodiment, the description took as an example the case of continuously changing the zoom driving of the
lens 101L and the cropping size. However, the display image 201L may be held temporarily in a memory and kept for a certain time (for instance, 0.1 seconds), and the display image may be changed step by step rather than continuously. In this manner, the zoom driving of the lens 101L can be performed at high speed and over a large range, and the zoom magnification at a passing point of the zoom driving can be predicted in order to determine the cropping size and perform the cropping processing. In other words, it is also possible to reduce the time until completion of the zoom driving of the lens. - In addition, in the present embodiment, the description took as an example the case of using the subject capturing timer for the capture stability detection processing. However, the present embodiment is not limited to this example, and any other method may be used as long as it can detect that the subject stably exists in the predetermined range of the image. For instance, the
electronic binoculars 100 may be equipped with an acceleration sensor (not illustrated), and when the acceleration is almost zero, it may be determined that the subject is captured. Alternatively, image identification may be executed continuously by the control unit 104, and it may be determined, based on the continuous identification results, that the subject stably exists. - Further, in the first embodiment and the second embodiment, the description took as an example the case where the zoom of the
lens 101R is optical zoom, but the zoom of the lens 101R is not limited to optical zoom, and electronic zoom (cropping processing) may be used. - Embodiment(s) of the disclosure can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a ‘non-transitory computer-readable storage medium’) to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.
- While the disclosure has been described with reference to exemplary embodiments, it is to be understood that the disclosure is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
- This application claims the benefit of Japanese Patent Application No. 2020-098884, filed Jun. 5, 2020, which is hereby incorporated by reference herein in its entirety.
Claims (20)
1. An apparatus comprising:
a first capturing unit configured to obtain a first image;
a second capturing unit configured to obtain a second image having a wider angle of view than an angle of view of the first image; and
a control unit configured to control each of a first display device corresponding to one eye of a user and a second display device corresponding to the other eye of the user to display an image based on the first image or the second image,
wherein, in a case where no subject exists in the first image, the control unit causes at least one of the first display device and the second display device to display the second image.
2. The apparatus according to claim 1, wherein, in a case where the subject exists in the first image, the control unit causes the first display device to display the first image, and causes the second display device to display a portion of the second image including the subject.
3. The apparatus according to claim 2, wherein the control unit causes the second display device to display a portion of the second image such that the first image and the portion of the second image have an approximately identical angle of view.
4. The apparatus according to claim 2, wherein, in a case where a predetermined operation is being performed on the apparatus, the control unit causes the first display device and the second display device to display the first image regardless of whether or not the subject exists in the first image.
5. The apparatus according to claim 4, wherein the first capturing unit includes a zoom mechanism configured to change an angle of view of the obtained first image, and
the predetermined operation is an operation of changing an angle of view of an image obtained by the zoom mechanism.
6. The apparatus according to claim 1, wherein at least one of the first capturing unit and the second capturing unit includes a zoom mechanism configured to change an angle of view of an image.
7. The apparatus according to claim 6, further comprising a detection unit configured to detect that a subject exists in a predetermined range of the second image for a predetermined period of time,
wherein, in a case where the detection unit detects that a subject exists in the predetermined range of the second image for the predetermined period of time, the control unit controls the zoom mechanism such that an angle of view of the first image and an angle of view of the second image come close to each other.
8. The apparatus according to claim 3, wherein the control unit crops and enlarges a portion of the second image from the second image such that the first image and the portion of the second image have an approximately identical angle of view.
9. The apparatus according to claim 8, wherein the control unit executes first image processing of cropping and enlarging a portion of the second image in accordance with a crop position and an enlargement ratio that are set in advance.
10. The apparatus according to claim 9, wherein the control unit executes second image processing of cropping and enlarging a portion of the second image from the second image by determining a crop position and an enlargement ratio of the portion of the second image, based on comparison between a region of a subject in the first image and a region of the subject in the second image.
11. The apparatus according to claim 10, wherein the first capturing unit comprises a zoom mechanism configured to change an angle of view of an obtained image, wherein in a case where an operation of driving the zoom mechanism is being performed, the control unit
executes the first image processing, and
after the operation of driving the zoom mechanism ends, the control unit executes the second image processing.
12. The apparatus according to claim 9, further comprising a measuring unit configured to measure a distance to a subject,
wherein the control unit corrects the crop position in accordance with a distance to a subject measured by the measuring unit.
13. The apparatus according to claim 1, further comprising a first camera-shake correction unit configured to control a position of an optical axis of the first capturing unit to correct a camera shake, and a second camera-shake correction unit configured to control a position of an optical axis of the second capturing unit to correct a camera shake,
wherein the first camera-shake correction unit and the second camera-shake correction unit independently perform camera-shake correction.
14. The apparatus according to claim 1, wherein the control unit performs different white balance adjustment with respect to each of the first image and the second image.
15. A method of an apparatus which includes a first capturing unit configured to obtain a first image; and a second capturing unit configured to obtain a second image having a wider angle of view than an angle of view of the first image, the method comprising:
controlling each of a first display device corresponding to one eye of a user and a second display device corresponding to the other eye of the user to display an image based on the first image or the second image,
wherein, in a case where no subject exists in the first image, the controlling includes causing at least one of the first display device and the second display device to display the second image.
16. The method according to claim 15, wherein, in a case where the subject exists in the first image, the controlling causes the first display device to display the first image, and causes the second display device to display a portion of the second image including the subject.
17. The method according to claim 15, wherein the controlling performs different white balance adjustment with respect to each of the first image and the second image.
18. A non-transitory computer-readable storage medium comprising instructions for performing a method of an apparatus which includes a first capturing unit configured to obtain a first image; and a second capturing unit configured to obtain a second image having a wider angle of view than an angle of view of the first image, the method comprising:
controlling each of a first display device corresponding to one eye of a user and a second display device corresponding to the other eye of the user to display an image based on the first image or the second image,
wherein, in a case where no subject exists in the first image, the controlling includes causing at least one of the first display device and the second display device to display the second image.
19. The non-transitory computer-readable storage medium according to claim 18, wherein, in a case where the subject exists in the first image, the controlling causes the first display device to display the first image, and causes the second display device to display a portion of the second image including the subject.
20. The non-transitory computer-readable storage medium according to claim 18, wherein the controlling performs different white balance adjustment with respect to each of the first image and the second image.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020-098884 | 2020-06-05 | ||
JP2020098884A JP2021192492A (en) | 2020-06-05 | 2020-06-05 | Electronic apparatus and method for controlling the same, and program |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210385385A1 | 2021-12-09 |
US11206356B1 | 2021-12-21 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| AS | Assignment | Owner name: CANON KABUSHIKI KAISHA, JAPAN. ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: SUZUKI, RYUTA; ICHIMIYA, TAKASHI; IKEDA, KOUJI; SIGNING DATES FROM 20210520 TO 20210524; REEL/FRAME: 056783/0439 |
| STCF | Information on status: patent grant | PATENTED CASE |