WO2015020038A1 - Imaging device - Google Patents
Imaging device
- Publication number
- WO2015020038A1 (PCT/JP2014/070595)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- setting value
- information
- image
- focus
- subject
- Prior art date
Classifications
- G02B27/0075—Optical systems or apparatus with means for altering, e.g. increasing, the depth of field or depth of focus
- G02B7/38—Systems for automatic generation of focusing signals using image sharpness techniques, measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
- H04N23/62—Control of parameters via user interfaces
- H04N23/672—Focus control based on electronic image sensor signals, based on the phase difference signals
- H04N23/673—Focus control based on electronic image sensor signals, based on contrast or high frequency components of image signals, e.g. hill climbing method
- H04N23/676—Bracketing for image capture at varying focusing conditions
- H04N23/69—Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/951—Computational photography systems by using two or more images to influence resolution, frame rate or aspect ratio
- H04N23/958—Computational photography systems for extended depth of field imaging
Definitions
- the present invention relates to an imaging technique for acquiring an image with an increased depth of field.
- when the main subject is in focus, subjects far away from it in the depth direction are blurred, because of the depth of field determined by the characteristics of the optical system and the shooting conditions.
- the depth of field represents the range of subject-side distances over which subjects appear in focus in the image.
- when shooting a scene that includes both a near view and a distant view under conditions with a shallow depth of field, the distant subject is blurred when the near subject is in focus, and the near subject is blurred when the distant subject is in focus.
- an automatic focus control (AF) function for automatically focusing on a subject is widely used.
- AF sometimes focuses on the wrong subject. For example, when photographing a nearby object, the focus position may accidentally be set on the background.
- Patent Document 1 proposes a technique for acquiring an image in which all subjects, from near view to distant view, are in focus (hereinafter also referred to as an "all-focus image"), thereby expanding the depth of field.
- AF and focus bracket photographing are performed by moving the focus lens in the optical axis direction by the focus lens driving unit.
- the subject is continuously photographed while changing the focal position by focus bracket photographing, and an omnifocal image is obtained based on a plurality of acquired images.
- a contrast method, a phase difference method, and the like have been often used for AF.
- in conventional AF, the camera focuses on a subject at some point or fixed points in the image, so only subjects within the depth of field around that subject are in focus. Even if the depth of field is increased by the optical system and shooting conditions, image quality deteriorates because the amount of light decreases. Therefore, in a scene containing subjects whose distances vary greatly in the depth direction, it has been difficult to obtain an omnifocal image focused on all subjects using conventional AF and optics.
- in Patent Document 1, an omnifocal image is acquired using AF and focus bracket shooting, and it is assumed that every subject in the image is in focus at one of the focal positions set during focus bracket shooting. However, it is stated that when a portion A of the image is not in focus in any of the plurality of images, portion A of the omnifocal image is also out of focus.
- when the orientation of the imaging device changes, gravity may shift the focal position. The technique described in Patent Document 1 does not consider this point, so the focal position settings may become inappropriate when the attitude of the imaging apparatus changes. To acquire an omnifocal image with the technique of Patent Document 1 regardless of the posture of the imaging apparatus under such an influence, a large number of focal positions must be set in fine steps so that subjects at all distances are always in focus.
- the present invention has been made in view of the above points, and an object thereof is to easily acquire an image with an expanded depth of field by performing shooting while setting the focal position appropriately.
- An aspect of the invention is an imaging apparatus comprising: an image acquisition unit that acquires a plurality of images having different focal positions; a posture acquisition unit that acquires posture information of the image acquisition unit; and an image processing unit that generates, from the plurality of images, an image with a depth of field larger than the depth of field of any one of the plurality of images.
- the imaging apparatus has a focal position setting value for determining the focal position, and the focal position setting value is corrected based on the posture information.
- the posture information includes angle information formed by the optical axis of the image acquisition unit and the vertical direction, and the sign of the correction amount for correcting the focal position setting value differs depending on the sign of the angle information.
- the posture information includes angle information formed by the optical axis of the image acquisition unit and the vertical direction, and the correction amount for correcting the focal position setting value differs depending on the magnitude of the angle information.
- the focal position setting value is determined based on a reference focal position setting value obtained when focusing on a reference subject that serves as a reference for determining the focal position, and, when the focal position setting value is corrected based on the posture information, the correction amount differs depending on the reference focal position setting value.
- the image acquisition unit It is an imaging device characterized by acquiring.
- FIG. 1 is a schematic block diagram illustrating a configuration example of an imaging apparatus according to the first embodiment of the present invention. FIG. 2 is a diagram showing an example of a shooting scene.
- FIG. 1 is a functional block diagram illustrating a schematic configuration example of the imaging apparatus 1 according to the first embodiment of the present invention.
- the imaging device 1 includes an image acquisition unit 10 that acquires a plurality of pieces of image information while changing the focal position, a posture acquisition unit 11 that acquires posture information of the image acquisition unit 10 or of the entire imaging device 1 including the image acquisition unit 10, an information storage unit 12 that stores the setting information of the focal position, and an image processing unit 13 that performs image processing on the image information output by the image acquisition unit 10 and obtains an image with an expanded depth of field.
- the imaging apparatus 1 includes, for example, a processor such as a DSP (Digital Signal Processor) or a CPU (Central Processing Unit) and a main storage device such as a RAM (Random Access Memory), and the processing of each processing unit described above can be realized by the processor executing a program stored in the storage device.
- alternatively, the processing of each processing unit can be realized by hardware, using a programmable integrated circuit such as an FPGA (Field Programmable Gate Array) or an integrated circuit dedicated to the processing.
- the image acquisition unit 10 includes an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor) sensor that converts received light into an electrical signal and outputs it as image information, an optical system 101 such as a lens that condenses light from the subject onto the image sensor, a lens driving unit 102 that changes the focal position by driving the lens of the optical system 101 in the optical axis direction, and a driving amount setting unit 103 that sets the driving amount of the lens driving unit.
- the lens driving unit 102 drives the lens of the optical system 101 based on the driving amount setting value set by the driving amount setting unit 103 to change the focal position.
- the drive amount setting value is a focus position setting value
- the focus position is set by setting the drive amount setting value.
- the image acquisition unit 10 includes an analog signal processing unit, an A / D (Analog / Digital) conversion unit, and the like (not shown), and outputs a signal from the image sensor as image information.
- the imaging device 1 also has a zoom mechanism 14, a diaphragm mechanism 15, and a notification unit 16. Although these mechanisms are explicitly shown in FIG. 1, the zoom mechanism 14 and the diaphragm mechanism 15 are generally included in the image acquisition unit 10, for example.
- the image acquisition unit 10 has an AF function of a conventionally used method such as a contrast method or a phase difference method, focuses on the subject by AF, and acquires information on the lens position at the time of focusing.
- the contrast-method AF function changes the focal position by driving the lens of the optical system 101 with the lens driving unit 102, acquires image information at a plurality of focal positions, calculates the contrast of each, and sets the focal position with the maximum contrast as the in-focus position, thereby automatically focusing on the subject.
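As an illustrative sketch only (not part of the patent disclosure), the contrast method described above can be expressed as a sweep over candidate focal positions that keeps the position with maximum contrast; the capture callback and its blur model are hypothetical stand-ins for the lens driving unit 102 and the optical system:

```python
import numpy as np

def contrast_score(image):
    """Sum of squared differences between horizontally adjacent
    pixels: a simple contrast (sharpness) measure of an image."""
    img = np.asarray(image, dtype=np.float64)
    return float(np.sum(np.diff(img, axis=1) ** 2))

def contrast_af(capture_at, positions):
    """Capture an image at each candidate focal position, score each
    one, and return the position whose image has maximum contrast
    (taken as the in-focus position)."""
    scores = [contrast_score(capture_at(p)) for p in positions]
    return positions[int(np.argmax(scores))]

# Hypothetical capture model: position 3 yields the sharpest image,
# and each step away from it adds one pass of column averaging (blur).
def fake_capture(position, best=3):
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, (32, 32)).astype(np.float64)
    for _ in range(abs(position - best)):
        img = (img[:, :-1] + img[:, 1:]) / 2.0
    return img
```

Under this toy model, `contrast_af(fake_capture, list(range(7)))` returns the simulated in-focus position, since each averaging pass strictly reduces the high-frequency content that the score measures.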
- focusing information is obtained by focusing on a subject that serves as a reference (hereinafter referred to as the "reference subject"), and a plurality of focal positions are set based on that focusing information to acquire a plurality of images.
- the reference subject may be specified using the above-mentioned AF function, or may be specified directly by the user.
- the in-focus information is focal position information when the subject is focused, and is information on the lens driving amount when focused on the reference subject.
- the focusing information is output from the lens driving unit 102 to the driving amount setting unit 103.
- when the focusing information is acquired by specifying the reference subject with the AF function, the reference subject can be specified by, for example, an AF spot.
- the AF spot is set on a two-dimensional coordinate representing the range of the shooting angle of view. For example, when an AF spot is near the center of the angle of view, the subject located at the spot position (for example, near the center of the image) is focused.
- when AF spots are set at a plurality of positions, it is possible to focus at the focal position of the subject closest to the foreground among the subjects at those spot positions.
- when the user directly designates a subject to obtain the focusing information, the imaging device includes a display device such as a liquid crystal display or an organic EL (Electro Luminescence) display, and a touch panel of, for example, a capacitance type or an electromagnetic induction type.
- a user can designate a reference object by using a button provided in the imaging device from a preview image displayed on the display device.
- the user can directly focus on the subject by directly touching the subject on the display in which the captured image is previewed.
- the reference subject may be set in advance at an arbitrary position within the angle of view of the imaging apparatus, and may be a subject existing at that position.
- a subject existing at an arbitrary preset distance may be used as the reference subject.
- an arbitrary object such as a person, an animal, or a building may be registered in advance and detected using a known image recognition technique, for example, a face recognition technique or an object recognition technique, and used as a reference subject.
- the drive amount setting unit 103 reads the setting information of the drive amount setting value from the information storage unit 12 based on the focusing information from the lens driving unit 102 and the posture information from the posture acquisition unit 11, and sets the lens drive amount setting value. Note that the drive amount setting unit 103 is shown separately for ease of explanation; the drive amount setting value may instead be set inside the lens driving unit 102, or a separate setting unit may be provided outside the image acquisition unit 10.
- the lens of the optical system 101 may include a focus lens for changing the focal position, and may have one or more lenses.
- the lens driving unit 102 drives the lens in order to change the focal position.
- the lens driving unit 102 is, for example, a VCM (Voice Coil Motor) type or SMA (Shape Memory Alloy) type actuator, as adopted in general smartphones and tablet terminals.
- VCM is a system in which the lens is moved by electromagnetic force using a magnet and a coil, and SMA is a system in which the lens is moved up and down by energizing and heating a shape memory alloy.
- the effects of the present imaging apparatus are obtained particularly in terminals or apparatuses that employ the above-described driving methods.
- the optical system and the driving method are not limited to the above, and a liquid lens or the like may be used, for example.
- the liquid lens includes a lens that changes a focal position by an electrical control signal, and a lens that mechanically changes the focal position by an actuator.
- the posture acquisition unit 11 includes a triaxial acceleration sensor such as those mounted on general mobile devices such as smartphones, and measures the posture information of the image acquisition unit 10 or of the entire imaging apparatus 1 including the image acquisition unit 10.
- the posture information is angle information based on the optical axis of the optical system 101 included in the image acquisition unit 10 and the vertical axis that is the direction in which gravity acts.
- the angle information is ±90 degrees when the optical axis coincides with the vertical axis, and 0 degrees when the orientation of the imaging device is horizontal and the optical axis is parallel to the ground, that is, when the optical axis and the vertical axis are at right angles.
- in this way, the sign of the angle information changes at the boundary where the optical axis is horizontal to the ground: when the angle information is negative, the optical axis has a depression angle and the imaging apparatus is tilted downward; when the angle information is positive, the optical axis has an elevation angle and the imaging apparatus is tilted upward.
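As an illustrative sketch only, angle information with this sign convention can be derived from a triaxial acceleration reading taken at rest. The device frame assumed here (optical axis along the device +z axis, and +1 g reported on an axis pointing away from the ground) is a hypothetical convention, not one stated in the text:

```python
import math

def optical_axis_tilt(ax, ay, az):
    """Angle in degrees between the optical axis and the horizontal
    plane, from a 3-axis accelerometer reading at rest.

    0 deg  = optical axis horizontal (parallel to the ground),
    -90 deg = pointing straight down (depression angle),
    +90 deg = pointing straight up (elevation angle)."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g == 0:
        raise ValueError("no gravity reading (free fall?)")
    # The gravity component along the optical (z) axis sets the tilt.
    return math.degrees(math.asin(az / g))
```

For example, a reading of (0, 0, -1) g, with the optical axis aligned with the direction of gravity, gives -90 degrees, matching the depression-angle convention above.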
- a sensor such as a capacitance type, a piezoresistive type, or a heat detection type using MEMS (Micro Electro Mechanical Systems: microelectromechanical elements and their fabrication technology) is used.
- the capacitance type is a method for detecting a change in capacitance between the movable part of the sensor element and the fixed part
- the piezoresistive type is a method for detecting, with a piezoresistive element placed in the spring portion that connects the movable part and the fixed part of the sensor element, the strain generated in the spring portion by acceleration.
- the heat detection type is a method in which a hot air current is generated in a housing by a heater and a convection change due to acceleration is detected by a thermal resistance or the like.
- the posture measurement method is not limited to the method using the above-described three-axis acceleration sensor, and any method may be used as long as posture information can be estimated. However, it is preferable to use the above MEMS because the apparatus can be downsized.
- the posture information may be acquired by estimating the posture of the imaging device from the subject information of the image.
- the posture can be estimated using information on the ground and information on a subject perpendicular to the ground such as a building.
- the posture acquisition unit 11 may be an image processing unit that performs posture estimation based on the image information from the image acquisition unit 10; alternatively, without providing the posture acquisition unit 11, the image processing unit 13 may perform the posture estimation and output the posture information to the drive amount setting unit 103.
- the information storage unit 12 is configured by, for example, a flash memory or a hard disk, and stores a focal position at the time of shooting, that is, setting information of a lens driving amount setting value. These pieces of information are preferably stored in the information storage unit 12 in advance, but may be configured so that the user directly inputs them before photographing.
- the image processing unit 13 is configured by a processor such as a DSP or a CPU, for example, and performs image processing on a plurality of pieces of image information input from the image acquisition unit 10 to synthesize an image with an increased depth of field. Details of the processing will be described later.
- FIG. 2 illustrates a scene in which two subjects, a foreground subject 2a and a distant subject 2b, exist.
- the distance between the two subjects 2a and 2b is greatly separated in the depth direction; when shooting with an imaging device having a shallow depth of field, if one subject is brought into focus, the other subject is blurred.
- hereinafter, when simply referred to as "distance", it represents the distance in the depth direction as viewed from the imaging apparatus.
- photographing is performed while changing the focal position, and an image focused on the foreground subject 2a and an image focused on the distant subject 2b are acquired.
- FIG. 3A is a diagram illustrating an example of a relationship between a lens driving amount and a focusing distance when the optical system 101 is driven by the lens driving unit 102 and the focal position is changed in the imaging apparatus 1.
- FIG. 3A shows a relationship in which a distant view is in focus when the lens driving amount is small and a near view is in focus when the lens driving amount is large.
- the lens drive amount on the horizontal axis in FIG. 3 indicates the drive amount of the lens from that position with reference to the lens position that can focus on an object at infinity.
- the distance on the vertical axis indicates the distance from the imaging device to the subject.
- the curve in FIG. 3 indicates the distance at which focusing is performed at a certain lens driving amount.
- the distance range over which focus is achieved without blur, indicated by the two curves A and B shown in FIG. 3, is hereinafter referred to as the "focusing range".
- FIG. 3A shows that the focusing range is ar when the lens driving amount is a1.
- the criteria for in-focus and out-of-focus can be set appropriately according to the pixel size of the image sensor and the value of the point spread function.
- by shooting with the focusing range changed from ar to br and combining the images, the depth of field can be expanded.
- the characteristics shown in FIG. 3A vary depending on the apparatus characteristics of the lens driving unit 102, the characteristics of the optical system, the photographing conditions, and the like; by taking these characteristics into account, it is possible to synthesize an omnifocal image with an expanded depth of field from an optimal number of shots.
- the driving amount of the lens is set by a driving amount setting value, which is a numerical value indicating the amount of current necessary for driving the lens. Note that the characteristics shown in FIG. 3 are merely examples; the in-focus distance may instead become farther as the lens driving amount increases, or the change may be linear.
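As an illustrative sketch only, selecting an optimal (minimal) number of shots from stored characteristics like those of FIG. 3 can be treated as a greedy interval cover over focusing ranges. The drive-amount values and focusing ranges below are hypothetical numbers, not values from the disclosure:

```python
def plan_bracket(ranges, near, far):
    """Greedy selection of the fewest drive-amount setting values
    whose focusing ranges together cover the distances [near, far].

    `ranges` maps a drive-amount setting value to its (near_edge,
    far_edge) focusing range, as would be read from the stored
    device characteristics."""
    chosen, covered = [], near
    remaining = dict(ranges)
    while covered < far:
        # Among settings whose range starts at or before the distance
        # covered so far, pick the one whose range reaches farthest.
        candidates = {d: r for d, r in remaining.items() if r[0] <= covered}
        if not candidates:
            raise ValueError("gap in coverage at distance %g" % covered)
        best = max(candidates, key=lambda d: candidates[d][1])
        if candidates[best][1] <= covered:
            raise ValueError("cannot extend coverage past %g" % covered)
        chosen.append(best)
        covered = candidates[best][1]
        del remaining[best]
    return chosen
```

For example, with hypothetical ranges `{10: (0.1, 0.3), 20: (0.25, 1.0), 30: (0.8, float("inf"))}` (distances in meters), covering 0.1 m to 10 m needs all three shots, while covering 0.3 m to 0.9 m needs only the setting 20.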
- the optical system 101 and the lens driving unit 102 are affected by gravity, it is necessary to consider that the focal position changes depending on the posture of the imaging apparatus 1.
- FIG. 4 shows the characteristics of the in-focus distance when the posture of the imaging apparatus 1 is changed while the lens driving amount setting value is constant.
- the horizontal axis in FIG. 4 represents angle information representing the inclination of the optical axis, and the vertical axis represents the distance from the imaging device 1 to the subject.
- the curve in FIG. 4 indicates the distance at which the focus is achieved at a certain angle, and indicates the characteristic of the median value of the focus range.
- the characteristics shown in FIG. 4 indicate that as the angle information changes in the minus direction and the depression angle increases, the in-focus distance changes toward the near side; similarly, as the angle information changes in the plus direction and the elevation angle increases, the in-focus distance changes toward the far side.
- the angle information indicating the inclination of the optical axis of the optical system 101 is the angle information of the image acquisition unit 10 including the optical system 101, and the inclination of the image acquisition unit 10 is that of the imaging apparatus 1 including the image acquisition unit 10.
- Note that the characteristics shown in FIG. 4 are an example; the change in the in-focus distance may be opposite for the plus and minus directions of the angle change. In other words, the in-focus distance may move to the near side as the elevation angle increases and to the far side as the depression angle increases. The change in the characteristics may also be linear. As described above, even with the same drive amount setting value, the in-focus distance changes depending on the attitude of the imaging apparatus. Details of the method for setting the lens driving amount setting value will be described later.
- the imaging apparatus 1 sets the focal position at the time of imaging in consideration of the optical system characteristics, imaging conditions, and the attitude of the imaging apparatus.
- FIG. 5 is a flowchart showing the process flow of the imaging apparatus 1.
- the image acquisition unit 10 focuses on the subject by the above-described method, and acquires the lens driving amount when the focus is achieved as focusing information (step S101).
- the imaging device 1 acquires posture information of the imaging device 1 by the posture acquisition unit 11 (step S102).
- the posture information is the inclination information of the image acquisition unit 10 or the entire imaging apparatus 1 including the image acquisition unit 10 with respect to the ground, that is, the inclination information of the optical axis of the optical system 101.
- step S101 and step S102 can be performed as follows.
- in a terminal equipped with a display device and a touch panel, such as a smartphone or a tablet terminal, the focusing information is obtained when the user touches the subject shown in the image previewed on the display device.
- the user's desired subject is then focused, and shooting is performed with a setting corresponding to the posture at the moment the shutter is released; therefore, even if the posture of the terminal changes after the touch panel is touched, this can be handled.
- the subject is focused by a half-pressing operation of the shutter button, and the shutter is released by a full-pressing operation.
- since the posture may change between the shutter button being half-pressed and fully pressed, it is preferable to estimate the posture when the shutter button is fully pressed.
- the driving amount setting unit 103 reads information from the information storage unit 12 based on the above-described focusing information and posture information, and sets a driving amount setting value (step S103). Details of step S103 will be described later.
- the image acquisition unit 10 drives the lens based on the drive amount setting value of the drive amount setting unit 103 to perform photographing, and acquires a plurality of pieces of image information (step S104).
- the image processing unit 13 performs image processing on a plurality of pieces of image information from the image acquisition unit 10, synthesizes and outputs an image with an increased depth of field (step S105). Details of step S105 will be described later.
- the image output in step S105 may be displayed on a display device such as a liquid crystal display or an organic EL display, or may be stored in a storage device such as a flash memory or a hard disk. Further, a plurality of pieces of image information acquired by the image acquisition unit 10 may be stored in the storage device.
- the imaging device 1 can be configured to include these devices.
- Step S103 Setting method of lens driving amount setting value
- the drive amount setting unit 103 reads and sets information from the information storage unit 12 according to the focus information in step S101 and the posture information in step S102.
- information on driving amount setting values corresponding to an arbitrary posture state of the imaging apparatus 1 and an arbitrary reference subject position is stored in the information storage unit 12 in advance, and the optimum driving amount setting value is read and set according to the posture information and the focusing information. When the posture information or the focusing information changes, such as when the posture changes during shooting or the reference subject moves, information is read from the information storage unit 12 again as appropriate.
- the information on the driving amount setting values stored in the information storage unit 12 can be obtained in advance by taking as a reference set value the optimum driving amount setting value when the imaging apparatus 1 is in a certain posture and a reference subject is at a certain distance, and correcting that reference set value in consideration of the posture characteristics shown in FIG. 4 and the characteristics when the position of the reference subject changes.
- FIG. 6A shows the relationship between the lens driving amount setting value and the distance when an imaging apparatus having a certain depth of field characteristic assumes a certain posture.
- the horizontal axis in FIG. 6 indicates the lens drive amount setting value, and the vertical axis indicates the distance from the imaging device to the subject.
- the curve shown in FIG. 6 represents the distance at which a lens driven with a certain drive amount setting value is in focus, and the area between curves C and D is the focus range.
- the driving amount setting unit 103 acquires a driving amount setting value for focusing on the reference subject.
- the driving amount setting value for focusing on the reference subject can be obtained from the focusing information acquired in step S101. Specifically, since the lens driving amount at the moment the reference subject is in focus can be known from the information of the lens driving unit 102, the corresponding driving amount setting value can be determined from it.
- FIG. 6A shows that the drive amount setting value for focusing on the reference subject is c0.
- the driving amount setting value is read from the information storage unit 12 based on the driving amount setting value c0 and the posture information in step S102. That is, the driving amount setting value corresponding to the position of the reference subject and the attitude of the imaging device 1 is read out.
- the drive amount setting values c1 and c2 are read from the information storage unit 12 and set.
- the drive amount setting value at the time of focusing on the reference subject as in c0 is referred to as a reference drive amount setting value. Since the drive amount setting value is the focus position setting value as described above, the reference drive amount setting value is the reference focus position setting value.
- c1 and c2 are set so as to focus on the distance ranges before and after the focusing range cr0 of c0, and values such that the focusing ranges cr0, cr1, and cr2 overlap one another are stored in the information storage unit 12. This prevents the focusing range from being interrupted, that is, prevents in-focus and out-of-focus subjects from appearing alternately. For example, when a subject that is continuous in the depth direction, such as the ground, is photographed, it prevents an out-of-focus region from appearing in the middle of the subject and degrading image quality. It is therefore preferable to store in the information storage unit 12 in advance driving amount setting values that make the overall focusing range continuous.
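As a minimal sketch of the overlap condition above, the check below verifies that a set of focusing ranges forms one gap-free interval; the function name and the numeric ranges are hypothetical illustrations, not values from the specification.

```python
# Hypothetical illustration: verify that chosen focusing ranges overlap so the
# combined in-focus region has no gaps. Ranges are (near_cm, far_cm) tuples.

def ranges_continuous(ranges):
    """Return True if the union of the given (near, far) focusing ranges
    forms one gap-free interval."""
    ordered = sorted(ranges)
    covered_far = ordered[0][1]
    for near, far in ordered[1:]:
        if near > covered_far:   # gap between this range and the region covered so far
            return False
        covered_far = max(covered_far, far)
    return True

# cr1 (near view), cr0 (reference), cr2 (distant view) with overlapping edges
print(ranges_continuous([(30, 60), (50, 120), (100, 400)]))  # True
print(ranges_continuous([(30, 60), (80, 120), (100, 400)]))  # False: gap between 60 and 80
```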
- the focus range does not necessarily have to be continuous. That is, if there is no influence such as image quality deterioration, a driving amount setting value that makes the focusing range discontinuous may be set.
- in this example, where the reference driving amount setting value is c0, the other driving amount setting values are set as c1 for focusing on the near view and c2 for focusing on the distant view.
- when the reference driving amount setting value is c1, that is, when the reference subject is in the near view, c0 for focusing on the middle view and c2 for focusing on the distant view are set.
- when the reference driving amount setting value is c2, that is, when the reference subject is in the distant view, c1 for focusing on the near view and c0 for focusing on the middle view are set.
- in this way, whether the reference subject lies in the near view, the middle view, or the distant view is determined from the distance given by the focusing information, and it is decided whether to shoot with the focal position changed to the near side or the far side of the reference subject. Making this determination is preferable because the depth of field can be expanded appropriately. Depending on the depth-of-field characteristics and the posture state of the imaging apparatus, the number of shots may also be three or more.
- the depth of field can also be expanded appropriately by changing the ratio of the number of images taken on the near side and the far side of the reference subject. For example, when the reference subject is in the middle view, considering that the depth of field becomes narrower at closer distances, the ratio is changed by increasing the number of shots on the near side and decreasing the number on the far side.
- if these pieces of information are also stored in the information storage unit 12 in advance, they can be read and set based on the focusing information and the posture information. This is preferable because images can be taken with appropriate focal positions and an appropriate number of shots according to the posture of the imaging device and the position of the reference subject, and the depth of field can be expanded.
- the determination of near view, middle view, distant view, and so on can be made, for example, by comparing the focusing information with distance threshold values set in advance.
- a rough distance value from the imaging device to the reference subject can be calculated from information on the lens driving amount, which is focusing information, and information such as the angle of view, focal length, and pixel size of the optical system. If the distance value to the reference subject is smaller than a preset distance threshold for foreground determination, it is determined that the reference subject exists in the foreground.
- in this case, if at least one driving amount setting value other than the reference driving amount setting value is set to a value that focuses on a distance larger than the distance threshold, both the reference subject and subjects farther away than the reference subject can be brought into focus.
- the distance value to the reference subject is larger than a preset distance threshold for determining a distant view, it is determined that the reference subject exists in the distant view.
- in this case, if at least one driving amount setting value other than the reference driving amount setting value is set to a value that focuses on a distance smaller than the distance threshold, both the reference subject and subjects closer to the foreground than the reference subject can be brought into focus.
- the number and value of the distance threshold values can be arbitrarily set, and are not limited to the above two distance threshold values.
- if the two distance thresholds, one for determining the near view and one for determining the distant view, are set as described above, the three distance ranges of near view, middle view, and distant view can be defined.
- in the example of FIG. 6, the original depth of field determined by the optical system characteristics and the shooting conditions is wide, so the distance between curves C and D is large and an image with an expanded depth of field can be obtained from three shots; accordingly, two distance thresholds are set, the three distance ranges of near view, middle view, and distant view are defined, and the driving amount setting values are set so as to focus on each distance range. When the depth of field is narrower than in the example of FIG. 6 and more shots are required, the number of distance thresholds is increased according to the number of shots. In this way, the depth of field can be appropriately expanded with the minimum number of images.
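The threshold-based classification described above can be sketched as follows; the two threshold values are hypothetical examples, not values from the specification.

```python
# Minimal sketch of the near/middle/distant-view decision. The threshold
# values below are hypothetical examples.

NEAR_THRESHOLD_CM = 100   # below this: near view
FAR_THRESHOLD_CM = 500    # above this: distant view

def classify_distance(subject_distance_cm):
    """Classify the reference-subject distance into one of three ranges."""
    if subject_distance_cm < NEAR_THRESHOLD_CM:
        return "near"
    if subject_distance_cm > FAR_THRESHOLD_CM:
        return "distant"
    return "middle"

print(classify_distance(60))    # near
print(classify_distance(250))   # middle
print(classify_distance(800))   # distant
```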
- in the above description, the information stored in the information storage unit 12 is read and set as it is. However, instead of storing all of the information in advance, only reference information and correction information may be stored, and the reference information may be corrected according to the focusing information and the posture information.
- a reference drive amount setting value for focusing on a reference object at a certain distance and other drive amount setting values determined according to the reference drive amount setting value, are previously stored in the information storage unit 12 as reference information. Further, information on the correction amount and the correction direction when the position of the reference subject changes or when the posture changes is stored in the information storage unit 12.
- the driving amount setting unit 103 reads the correction amount and correction direction information from the information storage unit 12 based on the focusing information in step S101 and the posture information in step S102, and corrects the reference information stored in advance based on that information. Since the reference information includes a reference driving amount setting value and the other driving amount setting values, each of these values is corrected.
- the horizontal axis of FIG. 7 indicates the lens driving amount setting value, and the vertical axis indicates the distance from the imaging device to the subject.
- the curve in FIG. 7 indicates the distance at which the lens is driven with a certain driving amount setting value and is focused at the focal position at that time, and indicates the characteristic of the median value of the focusing range.
- FIG. 7A shows the characteristics when the optical axis is horizontal to the ground and the angle information is 0 degrees.
- FIG. 7(b) shows the characteristics when the angle information changes in the plus direction and the optical axis has an elevation angle, and FIG. 7(c) shows the characteristics when the angle information changes in the minus direction and the optical axis has a depression angle.
- the curve indicating the characteristic in FIG. 7(b) is the curve in FIG. 7(a) shifted to the far side.
- the curve indicating the characteristic in FIG. 7(c) is the curve in FIG. 7(a) shifted to the foreground side. Note that the characteristics shown in FIG. 7 are merely examples, and the shift directions are not limited to those shown in FIGS. 7(b) and 7(c).
- the shift direction is opposite between the case where the elevation angle is added to the optical axis and the case where the depression angle is added. Further, the amount of shift changes in accordance with the degree of inclination of the posture, and the amount of shift increases as the imaging apparatus tilts greatly and the angle information increases.
- when the characteristic changes depending on the posture as in FIG. 7, the driving amount setting value is corrected as shown in FIG. 8.
- in FIG. 8, the horizontal axis indicates the driving amount setting value, and the vertical axis indicates the distance from the imaging device to the subject.
- the curve shown in FIG. 8 represents the distance to be focused at the focal position of the lens driven with a certain drive amount setting value, and the focus range is indicated by two curves.
- FIGS. 8(a), 8(b), and 8(c) show the characteristics when the posture is horizontal, when the optical axis has an elevation angle, and when the optical axis has a depression angle, respectively.
- the information storage unit 12 stores drive amount setting values d0, d1, and d2 shown in FIG. 8A as reference information.
- d0 is the reference drive amount setting value.
- correction amount and correction direction information corresponding to the posture information are stored in the information storage unit 12, and d0, d1, and d2 are corrected based on this information to obtain the driving amount setting values e0, e1, and e2 in FIG. 8(b) and f0, f1, and f2 in FIG. 8(c).
- FIG. 8 shows an example in which, as in the characteristics of FIG. 7, the focal position shifts to the far side when the angle information changes in the positive direction and to the near side when it changes in the negative direction.
- in FIG. 8(b), where the characteristic shifts to the far side, the driving amount setting value is corrected in the positive direction in which the value increases; in FIG. 8(c), where the characteristic shifts to the foreground side, the driving amount setting value is corrected in the negative direction in which the value decreases.
- the direction (sign of the correction amount) in which the driving amount setting value is corrected is thus varied according to the direction (sign) of the change in the angle information. In an imaging apparatus whose characteristics shift in the direction opposite to that shown in FIG. 7, for example, the correction directions are likewise reversed.
- the drive amount set value can be appropriately set by changing the correction amount of the drive amount set value according to the magnitude of the change in the angle information.
- the reference information d0, d1, and d2 are preferably corrected by different correction amounts, because the focusing range differs for each driving amount setting value and the characteristics differ between imaging devices. For example, if d0, d1, and d2 were all corrected by a fixed amount, the focusing range of each driving amount setting value would end up biased toward the foreground side or the far side, making it difficult to expand the depth of field appropriately.
- the driving amount setting values that form the reference information vary depending on the position of the reference subject; that is, the other driving amount setting values also change according to the reference driving amount setting value obtained when focusing on the reference subject. It is therefore possible, and preferable, to set appropriate driving amounts by varying the correction amount of each driving amount setting value according to the value of the reference driving amount setting value and the depth-of-field characteristics of the imaging device.
- the drive amount set value can be appropriately set by correcting the reference drive amount set value information with the correction amount and the correction direction according to the focus information and the posture information.
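As a rough illustration of this correction rule, the sketch below applies a per-value correction whose sign follows the sign of the angle change and whose magnitude grows with the tilt. The function name and the gain values are hypothetical, standing in for the measured per-value correction tables that a real device would hold in the information storage unit 12.

```python
# Hedged sketch: correct each drive amount set value in proportion to the tilt
# angle. The sign of the correction follows the sign of the angle change, and
# each value gets its own gain (the values below are hypothetical).

def correct_drive_values(reference_values, angle_deg, gains):
    """Apply a per-value correction proportional to the tilt angle.

    reference_values: drive amount set values (d0, d1, d2) for the horizontal posture
    angle_deg:        optical-axis tilt; positive = elevation, negative = depression
    gains:            per-value sensitivity of each setting to the tilt
    """
    return [d + g * angle_deg for d, g in zip(reference_values, gains)]

d = [100, 140, 180]   # d0, d1, d2 in arbitrary units
print(correct_drive_values(d, 10, [0.5, 1.0, 1.5]))    # [105.0, 150.0, 195.0] (elevation)
print(correct_drive_values(d, -10, [0.5, 1.0, 1.5]))   # [95.0, 130.0, 165.0] (depression)
```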
- FIG. 9A is a functional block diagram illustrating a configuration example of the image processing unit 13.
- FIG. 9B is a flowchart illustrating the processing flow of the image processing unit 13.
- the image processing unit 13 includes an image group alignment processing unit 13-1, an in-focus degree calculation unit 13-2, an in-focus degree comparison unit 13-3, and an image composition unit 13-4.
- the image group alignment processing unit 13-1 of the image processing unit 13 performs alignment for a plurality of pieces of image information input from the image acquisition unit 10 (S1051).
- since the focal position differs between the captured images, the angle of view changes for each image and the subject position shifts, so it is necessary to align the images.
- the alignment can be performed by obtaining the amount of change in the angle of view from the lens driving amount, that is, the focal position, using the lens driving amount setting value set in step S103, and enlarging or reducing the images accordingly.
- the image enlargement / reduction processing can be performed using a general interpolation method such as bilinear interpolation or bicubic interpolation.
- alternatively, the amount of movement of feature points between images may be calculated, and alignment may be performed by image processing using affine transformations such as enlargement/reduction, rotation, and translation based on the calculated amounts of movement.
- the feature points can be detected by using a general feature point detection method such as a Harris corner detection method, whereby the same feature point is detected from each image.
- the correspondence between feature points can be evaluated with a matching measure such as SAD (Sum of Absolute Differences) or SSD (Sum of Squared Differences).
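For concreteness, SAD and SSD over two equally sized pixel blocks can be computed as below; this is a minimal sketch over flat lists of pixel values, not the apparatus's actual matching code.

```python
# Minimal SAD/SSD matching costs between two equally sized pixel blocks,
# as used when searching for the displacement of a feature point between images.

def sad(block_a, block_b):
    """Sum of Absolute Differences: lower means a better match."""
    return sum(abs(a - b) for a, b in zip(block_a, block_b))

def ssd(block_a, block_b):
    """Sum of Squared Differences: penalizes large errors more strongly."""
    return sum((a - b) ** 2 for a, b in zip(block_a, block_b))

ref = [10, 12, 11, 13]    # block around a feature point in the reference image
cand = [11, 12, 9, 13]    # candidate block in the other image
print(sad(ref, cand))     # 3  (|10-11| + |12-12| + |11-9| + |13-13|)
print(ssd(ref, cand))     # 5  (1 + 0 + 4 + 0)
```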
- one of the plurality of captured images is taken as a reference image, and alignment is performed so that the subject positions in the other images match the subject position in the reference image.
- This reference image can be set freely.
- since an image with a wide angle of view includes a range that is not captured in an image with a small angle of view, it is preferable to use the image on the foreground side, which has the smallest angle of view, as the reference.
- enlargement processing, which interpolates pixels that carry no information, degrades image quality more than reduction processing, so aligning the wide-angle images by reduction, with the narrow-angle image as the reference as described above, is preferable because deterioration of image quality can be suppressed.
- next, the in-focus degree calculation unit 13-2 of the image processing unit 13 calculates the degree of focus for each pixel of each aligned image (S1052).
- the degree of focus represents how well the subject is in focus, and an in-focus area can be found by exploiting its high contrast. For example, the difference between the maximum and minimum pixel values in a rectangular area centered on the target pixel can be calculated as a contrast value and used as the degree of focus of that pixel.
- the pixel value is a luminance value or RGB value of the pixel.
- the degree of focus is not limited to this; the sharpness may be calculated from the degree of spread of the subject's edge portions, or any other method that allows the degree of focus to be compared between images may be used.
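The max-minus-min contrast measure described above can be sketched as follows, with a grayscale image represented as a list of rows of luminance values; the function name and window size are illustrative assumptions.

```python
# Sketch of the max-minus-min contrast measure of the degree of focus, computed
# over a square window centered on the target pixel of a grayscale image.

def focus_degree(image, y, x, radius=1):
    """Contrast (max - min) of pixel values in a (2*radius+1)^2 window."""
    window = [image[j][i]
              for j in range(y - radius, y + radius + 1)
              for i in range(x - radius, x + radius + 1)]
    return max(window) - min(window)

sharp = [[10, 200, 10],
         [200, 10, 200],
         [10, 200, 10]]
blurred = [[90, 100, 95],
           [100, 95, 100],
           [95, 100, 90]]
print(focus_degree(sharp, 1, 1))    # 190 -> high contrast, in focus
print(focus_degree(blurred, 1, 1))  # 10  -> low contrast, out of focus
```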
- the in-focus degree comparison unit 13-3 of the image processing unit 13 compares the degrees of focus calculated for each image at every pixel position of the aligned images (S1053), and the image composition unit 13-4 of the image processing unit 13 synthesizes the images by a weighted average in which the pixel of the image with the highest degree of focus receives the largest weight (S1054). As a result, an omnifocal image composed of the most in-focus pixels is obtained.
- for each pixel position, the pixel of the image with the highest degree of focus may simply be selected, or a weighted average value based on the degree of focus may be calculated. Within a feature-rich subject area, one of the captured images is very likely to have the clearly highest degree of focus, so its pixel can be given a large weight in the omnifocal image. In a flat region of the subject with few features, however, the degree of focus may hardly differ between the images and take similar values in all of them.
- in such a region, the pixel value of the omnifocal image may be calculated by weighted-averaging the pixel values of the images with coefficients having a small weight gradient. Furthermore, the weight coefficients may be determined by comprehensively evaluating the degrees of focus of the target pixel and its surrounding pixels, and the pixel value at the target pixel position calculated by that weighted average.
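A minimal sketch of the per-pixel composition using the simple "select the most in-focus pixel" variant; images and focus maps are flat lists, and the weighted-average variants described above would replace the selection with a blend. Names are illustrative assumptions.

```python
# Minimal per-pixel composite: for each pixel position, take the pixel from the
# image with the highest degree of focus at that position.

def composite_all_in_focus(images, focus_maps):
    """Build an omnifocal image by picking, per pixel, the value from the
    image whose degree of focus is highest at that position."""
    result = []
    for idx in range(len(images[0])):
        best = max(range(len(images)), key=lambda k: focus_maps[k][idx])
        result.append(images[best][idx])
    return result

imgs = [[50, 60, 70], [55, 65, 75]]   # two aligned images, three pixels each
focus = [[9, 2, 8], [1, 7, 3]]        # per-pixel degree of focus
print(composite_all_in_focus(imgs, focus))   # [50, 65, 70]
```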
- the imaging apparatus 1 of the present embodiment sets the lens driving amount setting value according to the posture information and the focusing information.
- the imaging apparatus 1 of the present embodiment can appropriately set the focal position even when the attitude of the imaging apparatus 1 changes, and can appropriately acquire an image with an expanded depth of field.
- by selecting the reference subject in consideration of the depth-of-field characteristics determined by the optical system and the shooting conditions, and setting the focal positions based on the focusing information on that subject, the reference subject is always captured in focus and an image with an expanded depth of field can be acquired with the minimum number of shots. As a result, the processing amount, memory amount, and power consumption, which grow with the number of images, can be suppressed.
- the imaging apparatus of the present embodiment adds a zoom mechanism 14 to the configuration of the imaging apparatus 1 shown in FIG. 1.
- when the zoom mechanism 14 includes, for example, a varifocal lens whose focal position changes as it zooms, the change characteristic of the focal position with respect to the driving amount of the varifocal lens is acquired in advance.
- driving amount setting values corresponding to this characteristic information are stored in the information storage unit 12, and the information corresponding to the posture information and the driving amount of the varifocal lens is read out, so that shooting at the optimum focal position becomes possible.
- similarly, for a zoom lens, characteristics corresponding to the driving amount of the zoom lens are acquired in advance. Since the characteristic of the distance brought into focus by a given lens driving amount also changes with the posture, information on driving amount setting values corresponding to these characteristics is stored in the information storage unit 12, which makes it possible to shoot at the optimum focal position according to the characteristics of the zoom lens.
- the angle of view becomes narrow when zoomed, so that the subject to be photographed may be limited.
- for example, the foreground subject may fall outside the frame, leaving only distant subjects within the angle of view.
- when the zoom amount is large and the subject distance known from the focusing information in step S101 is on the far side, it is highly likely that the camera is zoomed in on a distant subject and there is no near-side subject. In this case, since no foreground subject is photographed, there is no need to shoot with the focal position set to the foreground.
- therefore, lens driving amount setting values corresponding to the presence and degree of zooming are stored in the information storage unit 12, and the driving amount setting values may also be set according to information obtained from the focusing information, such as the subject distance and whether it lies in the near, middle, or distant view.
- in the above description, driving amount setting values corresponding to the zoom information are stored in the information storage unit 12, but correction amount information corresponding to the zoom information may be stored instead. In that case, the correction amount is read based on the zoom information, and the driving amount setting value set as the reference is corrected.
- as described above, the imaging apparatus of the present embodiment sets the lens driving amount setting values in consideration of the characteristics of an optical system that includes a zoom mechanism, so even with such an optical system the focal position can be set appropriately and an image with an expanded depth of field can be captured.
- further, by setting the lens driving amount setting values in consideration of information such as the presence and degree of zooming together with the focusing information, an image with an expanded depth of field can be obtained with the optimum focal positions and number of shots, minimizing the processing amount, memory amount, and power consumption.
- the imaging apparatus according to the third embodiment of the present invention has the same configuration as the imaging apparatus 1 shown in FIG. 1.
- the characteristic of the distance at which the lens driving amount setting value is brought into focus changes depending on the posture of the imaging device, but this characteristic may also change when a strong impact is applied to the imaging device, over time, or due to other factors. The imaging apparatus according to the present embodiment therefore reduces the influence of such characteristic changes by re-estimating, through calibration, the characteristic of the distance brought into focus by each lens driving amount setting value.
- the horizontal axis of FIG. 10 indicates the lens driving amount setting value, and the vertical axis indicates the distance from the imaging device to the subject.
- the curve in FIG. 10 indicates the distance at which the lens is driven at a certain driving amount setting value and is focused at the focal position at that time, and indicates the characteristic of the median value of the focusing range.
- for example, when the lens is focused at distances of 50 cm and 100 cm, the driving amounts are g1 and g2 in FIG. 10(a) at the time of manufacture, whereas they become g1' and g2' in FIG. 10(b) after an impact is applied.
- the characteristics of FIG. 10(b) are unknown when the imaging device is actually used. Therefore, by placing subjects at distances of 50 cm and 100 cm from the imaging apparatus, focusing on each subject with the AF function, and measuring the lens driving amounts, the changed characteristics can be estimated.
- assuming that the shape of the curve indicating the characteristic does not change, as in FIG. 10, the entire characteristic can be estimated once the lens driving amounts that bring an arbitrary number of points into focus are known, as described above.
- This estimated characteristic may be newly stored in the information storage unit 12, and the lens driving amount setting value may be set based on this information.
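Under the assumption above that the curve only shifts without changing shape, the calibration can be sketched as estimating a single offset from the AF drive amounts measured at known distances; all function names and numbers are hypothetical.

```python
# Hypothetical calibration sketch: if the curve keeps its shape and only shifts,
# the shift can be estimated as the average offset between the drive amounts
# measured by AF at known distances and the values recorded at manufacture.

def estimate_shift(factory_values, measured_values):
    """Average offset between measured and factory drive amounts (g -> g')."""
    offsets = [m - f for f, m in zip(factory_values, measured_values)]
    return sum(offsets) / len(offsets)

def corrected_value(factory_value, shift):
    """Re-estimated drive amount for any point on the factory curve."""
    return factory_value + shift

g = [120, 150]         # g1, g2 at manufacture (focus at 50 cm and 100 cm)
g_prime = [131, 159]   # g1', g2' measured by AF after an impact
shift = estimate_shift(g, g_prime)
print(shift)                          # 10.0
print(corrected_value(200, shift))    # 210.0 for some other point on the curve
```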
- on the other hand, the shape of the curve in FIG. 10 may itself change, for example when the characteristic change is larger on the near view side or on the distant view side.
- if the lens driving amount were read by focusing with AF at every distance, the changed characteristics could be known exactly, but this is unrealistic in terms of shooting effort and processing amount. Therefore, for example, subjects at several points such as 50 cm, 100 cm, and 300 cm are photographed, the lens driving amounts obtained by AF are read, and a new characteristic is estimated so as not to deviate significantly from the shape of the characteristic curve at the time of manufacture. Since the reliability of the estimated characteristic increases with the number of measured distances, it is desirable to set the shooting distances for calibration in consideration of both the effort and processing added by extra shots and the estimation accuracy of the characteristic.
- the above calibration can be performed by giving an instruction such as “Please shoot a subject at a position 50 cm away” to the user.
- however, placing a subject at a specified distance and shooting a plurality of times places a heavy burden on the user. Instead, an object of standard size, such as B4 paper or a newspaper, is photographed, and the distance to the subject is calculated from the actual subject size, its size in pixels on the image, and optical system characteristics such as the focal length. If the subject is then focused by AF and the lens driving amount is read, the relationship between the distance to the subject and the lens driving amount can be obtained.
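The distance calculation can be sketched with the pinhole model, distance = focal length x real size / size on the sensor, where the subject's size on the sensor is its pixel extent times the pixel pitch; all parameter values below are hypothetical examples.

```python
# Sketch of the distance estimate from a known-size subject (pinhole model).
# All numbers below are hypothetical examples.

def subject_distance_mm(real_size_mm, size_px, pixel_pitch_mm, focal_length_mm):
    """Estimate the subject distance from its known physical size."""
    size_on_sensor_mm = size_px * pixel_pitch_mm
    return focal_length_mm * real_size_mm / size_on_sensor_mm

# B4 paper is 257 mm x 364 mm; suppose its 364 mm side spans 1820 pixels on a
# sensor with 0.002 mm pixel pitch, behind a 4 mm focal-length lens.
d = subject_distance_mm(real_size_mm=364, size_px=1820, pixel_pitch_mm=0.002,
                        focal_length_mm=4)
print(round(d))   # distance in mm: 400, i.e. about 40 cm
```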
- if shooting is repeated while the imaging device is moved back and forth, the characteristic of the distance brought into focus by the lens driving amount can be estimated over the range of movement without placing the subject at an accurate distance. However, if the images are all taken at nearly the same distance, only part of the characteristic can be estimated, so it is preferable to shoot while moving the imaging apparatus in the front-rear direction over as wide a distance range as possible to enlarge the estimated range.
- the calibration may be automatically performed using a portrait image or the like taken without the user being aware of the calibration.
- in the above description, the driving amount setting value information stored in the information storage unit 12 at the time of manufacture is updated with the information estimated by calibration, and the updated driving amount setting values are set.
- the change amount of the changed drive amount set value may be acquired by calibration, and the drive amount set value that was originally set may be corrected with a correction amount corresponding to the change amount.
- the imaging apparatus estimates the characteristics of the distance to be brought into focus with the lens driving amount setting value that has changed due to impact or aging, by calibration. Since the drive amount setting value is set based on the estimated characteristics, it is possible to capture an image at an appropriate focal position and easily obtain an image with an expanded depth of field.
- the imaging apparatus according to the fourth embodiment of the present invention assumes a case where the optical system 101 of the imaging apparatus 1 in the above-described embodiments includes a diaphragm mechanism 15 and the depth of field of the imaging apparatus can be varied by user operation.
- the depth of field can be changed by stopping down the lens.
- the degree of aperture of the lens is expressed by the F-number, which is the focal length divided by the effective aperture diameter. If the F-number is changed by the diaphragm and the depth of field changes, the characteristics of the lens driving amount setting value versus distance shown in FIGS. 6 and 7 also change. Therefore, driving amount setting values corresponding to changes in the aperture, that is, changes in the depth of field, are stored in the information storage unit 12, and the driving amount setting values are set by reading information from the information storage unit 12 based on aperture information such as the F-number. Thus, even if the aperture changes and the depth of field changes with it, the focal position can be set appropriately.
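The F-number relation stated above is simply the focal length divided by the effective aperture diameter; the numbers below are illustrative.

```python
# F-number = focal length / effective aperture diameter.
# Stopping down (a larger F-number) widens the depth of field.

def f_number(focal_length_mm, effective_aperture_mm):
    """F-number of a lens (dimensionless)."""
    return focal_length_mm / effective_aperture_mm

print(f_number(50, 25))     # 2.0 -> f/2, shallow depth of field
print(f_number(50, 6.25))   # 8.0 -> f/8, wider depth of field
```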
- in the above description, driving amount setting values corresponding to the aperture information are stored in the information storage unit 12, but correction amount information corresponding to the aperture information may be stored instead. In that case, the correction amount is read based on the aperture information, and the driving amount setting value set as the reference is corrected.
- as described above, the imaging apparatus of the present embodiment sets the driving amount setting values according to the degree of lens aperture. Therefore, even with an imaging device equipped with a diaphragm mechanism, the focal position is set appropriately according to the depth of field, which changes with the aperture, and an image with an expanded depth of field can be appropriately acquired by processing the captured images.
- the imaging apparatus issues a warning or an instruction to the user in order to prevent an attitude of the imaging apparatus in which the driving of the lens becomes unstable.
- the imaging apparatus of the present embodiment has a configuration similar to that of the imaging apparatus 1 in the above-described embodiments shown in FIG. 1, and further includes, as a notification unit 16, a warning/instruction device such as a lamp, a display device, or a speaker.
- for example, when the angle of the optical axis is tilted by 70 degrees or more from the horizontal, the imaging apparatus notifies the user by blinking a warning lamp provided on the imaging apparatus or by emitting a warning sound. The user is thereby informed that the attitude is excessively tilted and can adjust the attitude of the imaging apparatus to perform stable shooting.
- when the imaging apparatus includes a display device, a message such as “Please make the posture close to horizontal” may be displayed on the display device.
- the attitude limit for issuing warnings and instructions, that is, the threshold for attitude determination, can be set arbitrarily, but the threshold is preferably set such that the drive amount setting value falls within the range in which the lens can be driven most stably.
- the imaging apparatus of the present embodiment issues a warning or an instruction to the user according to the attitude information, so that a situation in which the attitude of the imaging apparatus changes excessively and the driving of the lens becomes unstable can be avoided. Accordingly, an image with an expanded depth of field can be appropriately acquired by driving the lens stably while changing the focal position, capturing images, and performing image processing on the captured images.
- the image processing unit 13 may be realized by a computer.
- a program for realizing the functions of the processing unit may be recorded on a computer-readable recording medium, and the program recorded on the recording medium may be read into a computer system and executed.
- the “computer system” here is a computer system built into the imaging apparatus 1 and includes an OS and hardware such as peripheral devices.
- the “computer-readable recording medium” refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built into the computer system.
- furthermore, the “computer-readable recording medium” may include a medium that dynamically holds the program for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and
- a medium that holds the program for a certain period of time, such as a volatile memory inside a computer system serving as a server or a client in that case.
- the program may realize only a part of the functions described above, or may realize those functions in combination with a program already recorded in the computer system.
- a part of the imaging apparatus 1 in the above-described embodiments may be realized as an integrated circuit such as an LSI (Large Scale Integration).
- Each functional block of the imaging apparatus 1 may be implemented as an individual processor, or some or all of the blocks may be integrated into a single processor.
- the method of circuit integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor.
- if a circuit integration technology replacing LSI emerges with advances in semiconductor technology, an integrated circuit based on that technology may also be used.
- although the depth-of-field expansion processing has been described above, the invention can also be applied to depth-of-field adjustment processing that includes reduction.
- (1) An imaging apparatus comprising: an image acquisition unit that acquires a plurality of images having different focal positions;
- a posture acquisition unit that acquires posture information of the image acquisition unit; and
- an image processing unit that generates, from the plurality of images, an image in which the depth of field is expanded beyond the depth of field of one of the plurality of images, wherein the focal position is determined based on a focal position setting value that is corrected based on the posture information acquired by the posture acquisition unit.
- thus, even when the attitude of the imaging apparatus changes, the focal position can be set appropriately and an image with an expanded depth of field can be acquired.
- furthermore, by determining a reference subject in consideration of the depth-of-field characteristics determined by the optical system and the shooting conditions, and setting the focal position based on the focusing information for the reference subject, the reference subject is always captured in focus, and
- an image with an expanded depth of field can be acquired with the minimum number of images. As a result, the processing amount, memory amount, and power consumption, which increase with the number of images, can be suppressed.
- (2) The imaging apparatus according to (1), wherein the posture information includes angle information formed by the optical axis of the image acquisition unit and the vertical direction, and the sign of the correction amount of the focal position setting value is made to differ based on the sign of the angle information.
- (3) The imaging apparatus according to (1) or (2), wherein the posture information includes angle information formed by the optical axis of the image acquisition unit and the vertical direction, and the correction amount of the focal position setting value is varied based on the magnitude of the angle information.
- thus, the correction amount of the drive amount setting value can be varied according to the magnitude of the change in the angle information.
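Items (2) and (3) can be combined into one small sketch: the correction applied to a drive amount setting value takes its sign from the sign of the angle information and its magnitude from the size of that angle. The linear gain is purely an illustrative assumption; the disclosure only requires that sign and magnitude depend on the angle.

```python
CORRECTION_GAIN = 0.5  # drive-amount units per degree (hypothetical value)

def corrected_setting(base_setting, angle_deg):
    """Correct a focal position (drive amount) setting value by a signed,
    angle-dependent amount: magnitude from |angle|, sign from sign(angle)."""
    correction = CORRECTION_GAIN * abs(angle_deg)
    if angle_deg < 0:
        correction = -correction
    return base_setting + correction

up = corrected_setting(100.0, +20.0)    # tilted one way  -> 110.0
down = corrected_setting(100.0, -20.0)  # tilted the other way -> 90.0
```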
- (4) The imaging apparatus according to any one of (1) to (3), wherein the focal position setting values are determined based on one reference focal position setting value, and the correction amount of the other focal position setting values corrected based on the posture information differs depending on the reference focal position setting value.
- thus, the drive amount setting value can be corrected and set appropriately according to the posture information.
- (6) The imaging apparatus according to any one of (1) to (5), further comprising a zoom mechanism and an information storage unit that stores lens drive amount setting values according to the presence or absence and degree of zoom, wherein the drive amount setting value is further set according to distance information of the subject obtained from the focusing information.
- thus, since the drive amount setting value is set based on the estimated characteristic, an image can be captured at an appropriate focal position and an image with an expanded depth of field can be obtained.
- thus, an image with an expanded depth of field can be obtained by setting the focal position appropriately according to the depth of field that changes with the aperture and performing image processing on the captured images.
- (9) The imaging apparatus according to any one of (1) to (5), further comprising a notification unit that performs notification according to the posture information.
- (10) A processing method in an imaging apparatus including an image acquisition unit that acquires a plurality of images having different focal positions, a posture acquisition unit that acquires posture information of the image acquisition unit, and
- an image processing unit that generates, from the plurality of images, an image in which the depth of field is expanded beyond the depth of field of one of the plurality of images,
- wherein the focal position is determined based on a focal position setting value that is corrected based on the posture information acquired by the posture acquisition unit, and
- the focal position setting value is corrected based on the posture information.
- REFERENCE SIGNS: 1 ... imaging apparatus, 10 ... image acquisition unit, 11 ... posture acquisition unit, 12 ... information storage unit, 13 ... image processing unit, 100 ... imaging element, 101 ... optical system, 102 ... lens drive unit, 103 ... drive amount setting unit, 2a ... near-view subject, 2b ... distant-view subject, a1, b1 to b3, c0 to c2, d0 to d2, e0 to e2, f0 to f2, g1, g2, g1', g2' ... drive amount setting values, ar, br, cr0 to cr2 ... in-focus ranges, A, B, C, D ... curves.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Computing Systems (AREA)
- Human Computer Interaction (AREA)
- Studio Devices (AREA)
- Focusing (AREA)
- Details Of Cameras Including Film Mechanisms (AREA)
- Automatic Focus Adjustment (AREA)
Abstract
Description
FIG. 1 is a functional block diagram showing a schematic configuration example of the imaging apparatus 1 according to the first embodiment of the present invention. The imaging apparatus 1 includes: an image acquisition unit 10 that acquires a plurality of pieces of image information by shooting while changing the focal position; a posture acquisition unit 11 that acquires posture information of the image acquisition unit 10 or of the entire imaging apparatus 1 including the image acquisition unit 10; an information storage unit 12 that stores focal position setting information and the like; and an image processing unit 13 that performs image processing on the image information output by the image acquisition unit 10 and synthesizes an image with an expanded depth of field. The imaging apparatus 1 of the present embodiment includes, for example, a processor such as a DSP (Digital Signal Processor) or a CPU (Central Processing Unit) and a main storage device such as a RAM (Random Access Memory), and can realize the processing of each of the above processing units by executing a program stored in the storage device. Alternatively, the processing of the above processing units can be realized in hardware by providing a programmable integrated circuit such as an FPGA (Field Programmable Gate Array) or an integrated circuit dedicated to the above processing.
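To make the data flow between the units of FIG. 1 concrete, the following sketch mocks the three units with plain functions; every name and value here is an illustrative assumption, not the patented implementation. The image acquisition unit shoots once per focal position setting value, the posture acquisition unit supplies the tilt angle used to correct those values, and the image processing unit fuses the stack.

```python
def acquire_images(settings):
    # stand-in for image acquisition unit 10: one "image" per setting value
    return [f"frame@{s}" for s in settings]

def correct_settings(settings, tilt_deg, gain=0.5):
    # stand-in for correcting setting values with the angle from
    # posture acquisition unit 11 (integer drive amounts in this sketch)
    sign = -1 if tilt_deg < 0 else 1
    return [int(s + sign * gain * abs(tilt_deg)) for s in settings]

def fuse(images):
    # stand-in for image processing unit 13 (depth-of-field expansion)
    return "+".join(images)

settings = correct_settings([60, 90, 120], tilt_deg=-20)
result = fuse(acquire_images(settings))
```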
The method by which the drive amount setting unit 103 sets the lens drive amount setting value (step S103) is described below.
The processing procedure and processing method of the image processing unit 13 are described below with reference to the drawings. FIG. 9A is a functional block diagram showing a configuration example of the image processing unit 13. FIG. 9B is a flowchart showing the processing flow of the image processing unit 13.
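This excerpt does not spell out how unit 13 fuses the stack, so the following is a generic focus-stacking sketch rather than the patented method: for each pixel, keep the value from the frame whose local contrast (absolute Laplacian) is highest there, which approximates an extended depth of field.

```python
import numpy as np

def laplacian_abs(img):
    """|4-neighbor Laplacian| as a simple per-pixel sharpness measure."""
    padded = np.pad(img.astype(float), 1, mode="edge")
    lap = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
           padded[1:-1, :-2] + padded[1:-1, 2:] - 4 * padded[1:-1, 1:-1])
    return np.abs(lap)

def stack_focus(frames):
    """Fuse same-size grayscale frames by per-pixel sharpness selection."""
    sharpness = np.stack([laplacian_abs(f) for f in frames])  # (N, H, W)
    best = np.argmax(sharpness, axis=0)                       # (H, W)
    stacked = np.stack(frames)                                # (N, H, W)
    return np.take_along_axis(stacked, best[None], axis=0)[0]
```

Real implementations typically smooth the selection map to avoid seams; that refinement is omitted here for brevity.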
In the imaging apparatus 1 described in the first embodiment above, a fixed-focal-length lens is used as the lens of the optical system 101, and the focal position is changed by driving the focus lens with the lens drive unit 102.
The imaging apparatus according to the third embodiment of the present invention has the same configuration as the imaging apparatus 1 shown in FIG. 1. As described in the above embodiments, the characteristic of the lens drive amount setting value versus the in-focus distance changes with the attitude of the imaging apparatus; this characteristic may also change when a strong impact is applied to the imaging apparatus or as a result of aging. Therefore, the imaging apparatus of the present embodiment reduces the influence of such characteristic changes by re-estimating, through calibration, the characteristic of the lens drive amount setting value versus the in-focus distance.
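One way such a calibration could proceed, as a sketch under assumptions the excerpt does not fix: focus a few targets at known distances, record the drive amount that achieved focus, and refit the drive-amount-versus-distance characteristic. The linear model in 1/distance and the sample values are illustrative; the patent only says the characteristic is re-estimated.

```python
def fit_characteristic(samples):
    """Least-squares fit of: drive = a * (1 / distance_m) + b."""
    xs = [1.0 / d for d, _ in samples]
    ys = [drive for _, drive in samples]
    n = len(samples)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - a * mean_x
    return a, b

# Calibration shots: (known target distance in m, drive amount that focused it)
samples = [(0.5, 210.0), (1.0, 110.0), (2.0, 60.0)]
a, b = fit_characteristic(samples)
drive_for_1m = a * (1.0 / 1.0) + b  # setting value re-derived from the new fit
```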
The imaging apparatus according to the fourth embodiment of the present invention assumes a case where the optical system 101 of the imaging apparatus 1 in the above-described embodiments includes a diaphragm mechanism 15 and the depth of field of the imaging apparatus can be varied by a user operation.
As described in the above embodiments, even when the attitude of the imaging apparatus changes, the focal position can be set appropriately, images can be captured, and an image with an expanded depth of field can be acquired. However, when the attitude changes excessively, the operation of the lens drive unit may become unstable. For example, when the imaging apparatus is pointed straight up or straight down with an excessive change in attitude, the lens may be driven with the maximum or minimum drive amount. For mechanical reasons, the operation of the lens drive unit tends to become unstable when the lens is driven to either end of its drive range. Therefore, even if the drive amount setting value is set based on the posture information, shooting may not be performed at the desired focal position.
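One way to guard against the end-of-range instability described above, sketched with illustrative limit and margin values that are not taken from the patent: clamp the posture-corrected drive amount to a band inside the mechanical limits and report when clamping occurred, so that the warning of the following embodiment can be triggered.

```python
DRIVE_MIN, DRIVE_MAX = 0, 255   # hypothetical full mechanical drive range
STABLE_MARGIN = 10              # keep away from both unstable end positions

def clamp_drive(setting):
    """Return (clamped setting, True if the request left the stable range)."""
    lo = DRIVE_MIN + STABLE_MARGIN
    hi = DRIVE_MAX - STABLE_MARGIN
    clamped = min(max(setting, lo), hi)
    return clamped, clamped != setting

safe, was_clamped = clamp_drive(253)  # near the top end -> clamped to 245
```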
The present invention includes the following disclosure.
(1) An imaging apparatus comprising:
an image acquisition unit that acquires a plurality of images having different focal positions;
a posture acquisition unit that acquires posture information of the image acquisition unit; and
an image processing unit that generates, from the plurality of images, an image in which the depth of field is expanded beyond the depth of field of one of the plurality of images,
wherein the focal position is determined based on a focal position setting value that is corrected based on the posture information acquired by the posture acquisition unit.
(2) The imaging apparatus according to (1), wherein the posture information includes angle information formed by the optical axis of the image acquisition unit and the vertical direction, and
the sign of the correction amount of the focal position setting value is made to differ based on the sign of the angle information.
(3) The imaging apparatus according to (1) or (2), wherein the posture information includes angle information formed by the optical axis of the image acquisition unit and the vertical direction, and
the correction amount of the focal position setting value is varied based on the magnitude of the angle information.
(4) The imaging apparatus according to any one of (1) to (3), wherein the focal position setting values are determined based on one reference focal position setting value, and the correction amount of the other focal position setting values corrected based on the posture information differs depending on the reference focal position setting value.
(5) The imaging apparatus according to (4), wherein, based on the posture information and the reference focal position setting value, a ratio between the number of the other focal position setting values that give a nearer view than the image acquired at the focal position of the reference focal position setting value and the number of the other focal position setting values that give a farther view than the image acquired at the reference focal position is adjusted.
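Item (5) can be sketched as splitting a fixed budget of additional focal position setting values between the near side and the far side of the reference subject, with the split depending on the tilt angle. The specific ratio rule below (linear in angle, biased toward the far side when tilted up) is an illustrative assumption, not taken from the patent.

```python
def split_near_far(total, tilt_deg):
    """Return (near count, far count) for `total` extra focal positions,
    biasing toward the far side when tilted up and the near side when down."""
    clamped = max(-1.0, min(1.0, tilt_deg / 90.0))  # normalize to [-1, 1]
    far_fraction = 0.5 + 0.5 * clamped
    far = round(total * far_fraction)
    return total - far, far

level = split_near_far(4, 0.0)       # no tilt: even split (2, 2)
tilted_up = split_near_far(4, 45.0)  # tilted up: more far-side positions
```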
(6) The imaging apparatus according to any one of (1) to (5), further comprising a zoom mechanism and
an information storage unit that stores lens drive amount setting values according to the presence or absence and degree of zoom,
wherein the drive amount setting value is further set according to distance information of the subject obtained from the focusing information.
(7) The imaging apparatus according to any one of (1) to (5), wherein the characteristic of the in-focus distance versus the changed lens drive amount setting value is estimated by calibration.
(8) The imaging apparatus according to any one of (1) to (5), wherein a drive amount setting value is set according to the degree of the lens aperture.
(9) The imaging apparatus according to any one of (1) to (5), further comprising a notification unit that performs notification according to the posture information.
(10) A processing method in an imaging apparatus including an image acquisition unit that acquires a plurality of images having different focal positions, a posture acquisition unit that acquires posture information of the image acquisition unit, and an image processing unit that generates, from the plurality of images, an image in which the depth of field is expanded beyond the depth of field of one of the plurality of images,
wherein the focal position is determined based on a focal position setting value that is corrected based on the posture information acquired by the posture acquisition unit, and
the focal position setting value is corrected based on the posture information.
(11) A program for causing a computer to execute the processing method according to (10).
(12) A computer-readable recording medium on which the program according to (11) is recorded.
Claims (10)
- An imaging apparatus comprising:
an image acquisition unit that acquires a plurality of images having different focal positions;
a posture acquisition unit that acquires posture information of the image acquisition unit; and
an image processing unit that generates, from the plurality of images, an image in which the depth of field is expanded beyond the depth of field of one of the plurality of images,
the imaging apparatus having a focal position setting value that determines the focal position,
wherein the focal position setting value is corrected based on the posture information. - The imaging apparatus according to claim 1, wherein the posture information includes angle information formed by the optical axis of the image acquisition unit and the vertical direction, and
the sign of a correction amount for correcting the focal position setting value is made to differ based on the sign of the angle information. - The imaging apparatus according to claim 1 or 2, wherein the posture information includes angle information formed by the optical axis of the image acquisition unit and the vertical direction, and
a correction amount for correcting the focal position setting value is varied based on the magnitude of the angle information. - The imaging apparatus according to any one of claims 1 to 3, wherein the focal position setting value is determined based on a reference focal position setting value used when focusing on a reference subject that serves as a reference for determining the focal position, and
the correction amount used when correcting the focal position setting value based on the posture information differs depending on the reference focal position setting value. - The imaging apparatus according to claim 4, wherein, based on the posture information and the reference focal position setting value,
a ratio between the number of focal position setting values for focusing on subjects nearer than the reference subject and the number of focal position setting values for focusing on subjects farther than the reference subject is adjusted, and
the image acquisition unit acquires the plurality of images based on the adjusted focal position setting values. - The imaging apparatus according to any one of claims 1 to 5, further comprising a zoom mechanism capable of zooming on a subject to be imaged,
wherein the focal position setting value is set according to the presence or absence of zoom or the degree of zoom in the zoom mechanism. - The imaging apparatus according to claim 6, wherein the focal position setting value is set according to information on the distance to the subject determined from focusing information obtained when the subject is in focus.
- The imaging apparatus according to any one of claims 1 to 5, further comprising a diaphragm mechanism capable of changing the depth of field,
wherein the focal position setting value is set according to the degree of the lens aperture in the diaphragm mechanism. - The imaging apparatus according to any one of claims 1 to 5, wherein information on a characteristic representing a relationship between a drive amount setting value that sets a lens drive amount and an in-focus distance is estimated by calibration, and
the focal position setting value is set based on the estimated characteristic information. - The imaging apparatus according to any one of claims 1 to 9, wherein the focal position setting value is a drive amount setting value that sets a lens drive amount.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015530897A JP6147347B2 (ja) | 2013-08-07 | 2014-08-05 | 撮像装置 |
US14/909,198 US9888163B2 (en) | 2013-08-07 | 2014-08-05 | Imaging device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013164545 | 2013-08-07 | ||
JP2013-164545 | 2013-08-07 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015020038A1 true WO2015020038A1 (ja) | 2015-02-12 |
Family
ID=52461369
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/070595 WO2015020038A1 (ja) | 2013-08-07 | 2014-08-05 | 撮像装置 |
Country Status (3)
Country | Link |
---|---|
US (1) | US9888163B2 (ja) |
JP (1) | JP6147347B2 (ja) |
WO (1) | WO2015020038A1 (ja) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015186088A (ja) * | 2014-03-25 | 2015-10-22 | オリンパス株式会社 | 撮像装置、撮像装置の制御方法、及びプログラム |
JP2017068763A (ja) * | 2015-10-01 | 2017-04-06 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および位置情報取得方法 |
JP2018010104A (ja) * | 2016-07-13 | 2018-01-18 | 株式会社Screenホールディングス | 撮像装置および撮像方法 |
WO2018012130A1 (ja) * | 2016-07-13 | 2018-01-18 | 株式会社Screenホールディングス | 画像処理方法、画像処理装置、撮像装置および撮像方法 |
WO2018042786A1 (ja) * | 2016-09-05 | 2018-03-08 | 株式会社Screenホールディングス | 画像処理方法、画像処理装置、および撮像装置 |
JP2018101951A (ja) * | 2016-12-21 | 2018-06-28 | キヤノン株式会社 | 撮像装置、撮像方法およびコンピュータのプログラム |
JP2019036876A (ja) * | 2017-08-17 | 2019-03-07 | 京セラドキュメントソリューションズ株式会社 | 画像読取装置、画像形成装置、画像読取方法及び画像読取プログラム |
JP2019047145A (ja) * | 2017-08-29 | 2019-03-22 | キヤノン株式会社 | 画像処理装置、撮像装置、画像処理装置の制御方法およびプログラム |
US10282825B2 (en) | 2016-09-13 | 2019-05-07 | Panasonic Intellectual Property Management Co., Ltd. | Image processing apparatus and imaging apparatus |
JP2019087924A (ja) * | 2017-11-08 | 2019-06-06 | キヤノン株式会社 | 画像処理装置、撮像装置、画像処理方法およびプログラム |
JP2019103097A (ja) * | 2017-12-07 | 2019-06-24 | キヤノン株式会社 | 撮像装置、撮像方法およびプログラム |
CN110023712A (zh) * | 2017-02-28 | 2019-07-16 | 松下知识产权经营株式会社 | 位移计测装置以及位移计测方法 |
JPWO2020012825A1 (ja) * | 2018-07-13 | 2021-07-15 | 富士フイルム株式会社 | 画像生成装置、画像生成方法および画像生成プログラム |
WO2021152933A1 (ja) * | 2020-01-31 | 2021-08-05 | 富士フイルム株式会社 | 撮像装置、撮像装置の作動方法、及びプログラム |
US12041362B2 (en) | 2020-01-31 | 2024-07-16 | Fujifilm Corporation | Imaging apparatus, operation method of imaging apparatus, and program |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5744161B2 (ja) * | 2013-11-18 | 2015-07-01 | シャープ株式会社 | 画像処理装置 |
JP6322099B2 (ja) * | 2014-09-12 | 2018-05-09 | キヤノン株式会社 | 画像処理装置、及び、画像処理方法 |
CN106851117B (zh) * | 2017-03-31 | 2020-01-31 | 联想(北京)有限公司 | 一种获得全景照片的方法及电子设备 |
JP7051373B2 (ja) * | 2017-11-02 | 2022-04-11 | キヤノン株式会社 | 画像処理装置、撮像装置、画像処理方法、プログラム、および、記憶媒体 |
JP6828069B2 (ja) * | 2019-02-18 | 2021-02-10 | キヤノン株式会社 | 撮像装置、撮像方法およびプログラム |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06300962A (ja) * | 1993-04-14 | 1994-10-28 | Canon Inc | カメラの自動焦点調節装置 |
JP2000039642A (ja) * | 1998-07-21 | 2000-02-08 | Canon Inc | レンズ鏡筒及びそれを用いた光学機器 |
JP2008271240A (ja) * | 2007-04-20 | 2008-11-06 | Fujifilm Corp | 撮像装置、画像処理装置、撮像方法、及び画像処理方法 |
JP2012222431A (ja) * | 2011-04-05 | 2012-11-12 | Canon Inc | 撮像装置およびその制御方法 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4524717B2 (ja) | 2008-06-13 | 2010-08-18 | 富士フイルム株式会社 | 画像処理装置、撮像装置、画像処理方法及びプログラム |
JP5800611B2 (ja) * | 2011-07-07 | 2015-10-28 | オリンパス株式会社 | 撮影装置 |
-
2014
- 2014-08-05 US US14/909,198 patent/US9888163B2/en active Active
- 2014-08-05 JP JP2015530897A patent/JP6147347B2/ja not_active Expired - Fee Related
- 2014-08-05 WO PCT/JP2014/070595 patent/WO2015020038A1/ja active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06300962A (ja) * | 1993-04-14 | 1994-10-28 | Canon Inc | カメラの自動焦点調節装置 |
JP2000039642A (ja) * | 1998-07-21 | 2000-02-08 | Canon Inc | レンズ鏡筒及びそれを用いた光学機器 |
JP2008271240A (ja) * | 2007-04-20 | 2008-11-06 | Fujifilm Corp | 撮像装置、画像処理装置、撮像方法、及び画像処理方法 |
JP2012222431A (ja) * | 2011-04-05 | 2012-11-12 | Canon Inc | 撮像装置およびその制御方法 |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015186088A (ja) * | 2014-03-25 | 2015-10-22 | オリンパス株式会社 | 撮像装置、撮像装置の制御方法、及びプログラム |
JP2017068763A (ja) * | 2015-10-01 | 2017-04-06 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および位置情報取得方法 |
WO2017057218A1 (ja) * | 2015-10-01 | 2017-04-06 | 株式会社ソニー・インタラクティブエンタテインメント | 情報処理装置および位置情報取得方法 |
US10447996B2 (en) | 2015-10-01 | 2019-10-15 | Sony Interactive Entertainment Inc. | Information processing device and position information acquisition method |
WO2018012130A1 (ja) * | 2016-07-13 | 2018-01-18 | 株式会社Screenホールディングス | 画像処理方法、画像処理装置、撮像装置および撮像方法 |
CN109417602A (zh) * | 2016-07-13 | 2019-03-01 | 株式会社斯库林集团 | 图像处理方法、图像处理装置、摄像装置及摄像方法 |
CN109417602B (zh) * | 2016-07-13 | 2020-12-22 | 株式会社斯库林集团 | 图像处理方法、图像处理装置、摄像装置及摄像方法 |
JP2018010104A (ja) * | 2016-07-13 | 2018-01-18 | 株式会社Screenホールディングス | 撮像装置および撮像方法 |
US10789679B2 (en) | 2016-07-13 | 2020-09-29 | SCREEN Holdings Co., Ltd. | Image processing method, image processor, image capturing device, and image capturing method for generating omnifocal image |
WO2018042786A1 (ja) * | 2016-09-05 | 2018-03-08 | 株式会社Screenホールディングス | 画像処理方法、画像処理装置、および撮像装置 |
JP2018042006A (ja) * | 2016-09-05 | 2018-03-15 | 株式会社Screenホールディングス | 画像処理方法、画像処理装置、および撮像装置 |
US11323630B2 (en) | 2016-09-05 | 2022-05-03 | SCREEN Holdings Co., Ltd. | Image processing method, image processor, and image capturing device |
US10282825B2 (en) | 2016-09-13 | 2019-05-07 | Panasonic Intellectual Property Management Co., Ltd. | Image processing apparatus and imaging apparatus |
JP2018101951A (ja) * | 2016-12-21 | 2018-06-28 | キヤノン株式会社 | 撮像装置、撮像方法およびコンピュータのプログラム |
CN110023712A (zh) * | 2017-02-28 | 2019-07-16 | 松下知识产权经营株式会社 | 位移计测装置以及位移计测方法 |
JP2019036876A (ja) * | 2017-08-17 | 2019-03-07 | 京セラドキュメントソリューションズ株式会社 | 画像読取装置、画像形成装置、画像読取方法及び画像読取プログラム |
JP2019047145A (ja) * | 2017-08-29 | 2019-03-22 | キヤノン株式会社 | 画像処理装置、撮像装置、画像処理装置の制御方法およびプログラム |
JP2019087924A (ja) * | 2017-11-08 | 2019-06-06 | キヤノン株式会社 | 画像処理装置、撮像装置、画像処理方法およびプログラム |
JP2019103097A (ja) * | 2017-12-07 | 2019-06-24 | キヤノン株式会社 | 撮像装置、撮像方法およびプログラム |
JPWO2020012825A1 (ja) * | 2018-07-13 | 2021-07-15 | 富士フイルム株式会社 | 画像生成装置、画像生成方法および画像生成プログラム |
JP7030986B2 (ja) | 2018-07-13 | 2022-03-07 | 富士フイルム株式会社 | 画像生成装置、画像生成方法および画像生成プログラム |
WO2021152933A1 (ja) * | 2020-01-31 | 2021-08-05 | 富士フイルム株式会社 | 撮像装置、撮像装置の作動方法、及びプログラム |
JPWO2021152933A1 (ja) * | 2020-01-31 | 2021-08-05 | ||
JP7354299B2 (ja) | 2020-01-31 | 2023-10-02 | 富士フイルム株式会社 | 撮像装置、撮像装置の作動方法、及びプログラム |
US12041362B2 (en) | 2020-01-31 | 2024-07-16 | Fujifilm Corporation | Imaging apparatus, operation method of imaging apparatus, and program |
Also Published As
Publication number | Publication date |
---|---|
JPWO2015020038A1 (ja) | 2017-03-02 |
US20160191784A1 (en) | 2016-06-30 |
US9888163B2 (en) | 2018-02-06 |
JP6147347B2 (ja) | 2017-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6147347B2 (ja) | 撮像装置 | |
US10462374B2 (en) | Zooming control apparatus, image capturing apparatus and control methods thereof | |
JP6224362B2 (ja) | 撮像装置 | |
WO2014153950A1 (zh) | 快速自动聚焦的方法和图像采集装置 | |
JP5453573B2 (ja) | 撮像装置、撮像方法およびプログラム | |
US9111129B2 (en) | Subject detecting method and apparatus, and digital photographing apparatus | |
JP7159868B2 (ja) | フォーカス制御装置とフォーカス制御方法とプログラムおよび撮像装置 | |
JP2017098631A (ja) | 画像合成処理装置 | |
JP4487811B2 (ja) | 撮影装置 | |
KR20200064908A (ko) | 제어장치, 촬상장치, 및 기억매체 | |
JP2015106116A (ja) | 撮像装置 | |
WO2013094552A1 (ja) | 撮像装置、その制御方法およびプログラム | |
JP7172601B2 (ja) | フォーカス制御装置とフォーカス制御方法とプログラムおよび撮像装置 | |
WO2013094551A1 (ja) | 撮像装置、その制御方法およびプログラム | |
JP6645711B2 (ja) | 画像処理装置、画像処理方法、プログラム | |
TW201236448A (en) | Auto-focusing camera and method for automatically focusing of the camera | |
TWI542857B (zh) | 具有測距功能的電子裝置及測距方法 | |
JP2015233211A (ja) | 撮像装置およびその制御方法ならびにプログラム | |
JP2019219529A (ja) | 制御装置、撮像装置、制御方法、プログラム、および、記憶媒体 | |
JP2014216692A (ja) | 高解像度化処理付き撮影装置 | |
US11381728B2 (en) | Image capture apparatus and method for controlling the same | |
TWI423661B (zh) | Face block assisted focus method | |
JP2008263386A (ja) | 静止画撮像装置 | |
JP2023041663A (ja) | マルチカメラシステムのためのカメラ切替制御技法 | |
TW201616168A (zh) | 自動對焦方法及電子裝置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14834947 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015530897 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14909198 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 14834947 Country of ref document: EP Kind code of ref document: A1 |