WO2014050189A1 - Imaging Device and Image Processing Method - Google Patents
Imaging Device and Image Processing Method
- Publication number
- WO2014050189A1 (application PCT/JP2013/061842)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject distance
- unit
- lens
- restoration
- restoration filter
- Prior art date
Classifications
- G06T5/73 — Deblurring; Sharpening
- G02B13/0015 — Miniaturised objectives for electronic devices, e.g. portable telephones, webcams, PDAs, small digital cameras, characterised by the lens design
- G02B27/0905 — Beam shaping; Dividing and/or superposing multiple light beams
- G02B7/38 — Systems for automatic generation of focusing signals using image sharpness techniques, measured at different points on the optical axis
- G06T5/00 — Image enhancement or restoration
- H04N23/81 — Camera processing pipelines for suppressing or minimising disturbance in the image signal generation
- H04N23/663 — Remote control of cameras or camera parts for controlling interchangeable camera parts based on electronic image sensor signals
Definitions
- the present invention relates to an imaging apparatus and an image processing method, and more particularly, to an imaging apparatus and an image processing method for performing point image restoration processing of an image captured based on a point spread function (PSF).
- Point image restoration processing obtains in advance the deterioration characteristics (the PSF) caused by aberrations of the photographing lens, creates a restoration filter based on that PSF, and restores the original high-resolution image by filtering the captured image with the restoration filter.
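The restoration filtering described above can be sketched as a frequency-domain deconvolution. The following is only a minimal illustration, not the patent's implementation; the Wiener-style formula and the SNR constant are assumptions.

```python
import numpy as np

def wiener_restoration_filter(psf, snr=100.0):
    """Build a frequency-domain restoration filter from a PSF.

    A Wiener-style filter is used here for illustration; the snr
    constant is an assumed regularization value, not from the patent."""
    H = np.fft.fft2(psf)
    return np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)

def restore(blurred, psf, snr=100.0):
    """Apply the restoration filter to a degraded image."""
    G = np.fft.fft2(blurred)
    return np.real(np.fft.ifft2(G * wiener_restoration_filter(psf, snr)))

# Blur an ideal point image with a 3x3 box PSF, then restore it.
img = np.zeros((32, 32)); img[16, 16] = 1.0           # point source
psf = np.zeros((32, 32)); psf[:3, :3] = 1.0 / 9.0     # 3x3 box blur
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))
restored = restore(blurred, psf)                      # sharper peak at (16, 16)
```

The restored point is concentrated back toward its original position, which is the sense in which the filter "restores the original image with high resolution."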
- Patent Literature 1 describes an image processing apparatus that acquires an imaging state including the subject distance, obtains a restoration filter corresponding to the imaging state for each restoration area of the degraded image, and restores the degraded image area by area using the restoration filter.
- Since the PSF changes with the lens type, zoom magnification, F-number, subject distance, angle of view (image height), and so on, a large number of restoration filters must be prepared.
- the image processing apparatus described in Patent Document 1 includes a memory that holds a plurality of restoration filter lists corresponding to combinations of lens types, zoom magnifications, and F numbers.
- Each restoration filter list includes a plurality of restoration filter tables classified by subject distance, and each restoration filter table holds a restoration filter (a matrix of the filter coefficients constituting the restoration filter) for each image height (restoration region).
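The memory layout just described (filter lists keyed by lens type, zoom magnification, and F-number, containing tables classified by subject distance, each holding one coefficient matrix per image height) could be modeled as nested mappings. All keys and the identity-matrix placeholders below are hypothetical.

```python
import numpy as np

# Hypothetical filter storage: list key -> subject-distance table -> image-height filter.
filter_lists = {
    ("LENS_A", 2.0, 4.0): {                  # (lens type, zoom magnification, F-number)
        1.0: {0: np.eye(3), 1: np.eye(3)},   # subject distance 1.0 m, one filter per image height
        5.0: {0: np.eye(3), 1: np.eye(3)},   # subject distance 5.0 m
    },
}

def lookup_filter(lens, zoom, f_number, subject_distance, image_height):
    """Fetch the restoration filter (coefficient matrix) for one restoration region."""
    return filter_lists[(lens, zoom, f_number)][subject_distance][image_height]

kernel = lookup_filter("LENS_A", 2.0, 4.0, 1.0, 0)
```

This makes concrete why the memory cost grows multiplicatively with each classification axis.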
- This configuration is illustrated in FIG. 3 of Patent Document 1.
- In restoration processing, the optimum restoration filter corresponding to the lens type, zoom magnification, F-number, subject distance, angle of view (image height), and so on is selected and applied to the captured image.
- Consequently, a large number of restoration filters must be stored in memory, which increases the required memory capacity.
- Patent Document 1 describes holding a plurality of restoration filters discretely and generating a restoration filter for an intermediate value by interpolation, thereby reducing the capacity of the memory that holds the restoration filters.
- The present invention has been made in view of such circumstances. An object of the present invention is to provide an imaging apparatus and an image processing method that reduce the number of restoration filters that must be held in advance when restoring a captured image with a restoration filter corresponding to the subject distance, while still improving resolution through the restoration processing.
- In order to achieve the above object, an imaging apparatus according to one aspect of the present invention includes: an imaging unit that captures a subject image formed by a photographing lens and acquires an image representing the subject image; a first subject distance estimation unit that obtains the subject distance estimated by a focus detection unit when the imaging unit captures the image; a restoration filter storage unit that stores one or more restoration filters selected from a plurality of restoration filters created based on at least the point spread function of the photographing lens corresponding to the subject distance; a restoration filter selection unit that selects, based on the subject distance estimated by the first subject distance estimation unit, a restoration filter corresponding to that subject distance from the restoration filters stored in the restoration filter storage unit; and a restoration processing unit that performs restoration processing of the image acquired by the imaging unit using the selected restoration filter.
- The restoration filter storage unit stores restoration filters only for the range of subject distances from infinity down to the distance obtained by adding, to the closest subject distance estimated by the first subject distance estimation unit, the maximum estimation variation on the infinity side of the range of estimation variation for that closest subject distance.
- That is, instead of storing a plurality of restoration filters created over the entire range of subject distances, the restoration filter storage unit stores only one or more restoration filters corresponding to the range between infinity and the subject distance obtained by adding the maximum infinity-side estimation variation to the closest subject distance estimated by the first subject distance estimation unit (hereinafter, this distance is referred to as the "estimated-variation close distance").
- In general, the point spread function (PSF) varies depending on the subject distance, and the PSF on the close side has a wider shape than the PSF on the infinity side; accordingly, a restoration filter prepared for the close side performs stronger restoration processing than one prepared for the infinity side.
- A restoration filter for a range closer than the estimated-variation close distance may therefore apply overcorrecting restoration processing. By storing only the restoration filters corresponding to subject distances between the estimated-variation close distance and infinity, and by not holding restoration filters for the closer range in the first place, the number of restoration filters to be stored in the restoration filter storage unit is reduced.
- the first subject distance estimation unit estimates the subject distance based on the lens position of the focus lens of the photographing lens.
- Preferably, the photographing lens is an interchangeable lens, and the maximum estimation variation on the infinity side of the estimation variation range for the subject distance estimated by the first subject distance estimation unit includes at least one of the estimation variation due to individual differences between interchangeable lenses and the estimation variation due to the temperature characteristics of the interchangeable lens.
- an estimation variation acquisition unit that acquires the maximum estimation variation on the infinity side of the range of estimation variation with respect to the subject distance estimated by the first subject distance estimation unit;
- a second subject distance estimation unit that calculates a subject distance obtained by adding the maximum estimation variation on the infinity side acquired by the estimation variation acquisition unit to the subject distance estimated by the first subject distance estimation unit;
- The restoration filter selection unit selects, from the one or more restoration filters stored in the restoration filter storage unit, the restoration filter on the infinity side closest to the subject distance calculated by the second subject distance estimation unit.
- the subject distance is estimated by the first subject distance estimation unit, and the maximum estimation variation on the infinity side in the range of variation (estimation variation) with respect to the estimated subject distance is acquired by the estimation variation acquisition unit.
- a subject distance is calculated by adding the subject distance estimated by the first subject distance estimation unit and the maximum estimation variation on the infinity side acquired by the estimation variation acquisition unit.
- The calculated subject distance corresponds to the case where the estimated subject distance has varied most toward the infinity side. When a restoration filter to be used for the restoration processing is selected from the one or more restoration filters stored in the restoration filter storage unit, the restoration filter on the infinity side closest to this calculated subject distance is selected.
- By selecting the infinity-side restoration filter closest to the subject distance assumed when the estimate varies fully toward infinity, either a restoration filter corresponding to that worst-case subject distance or one that performs weaker restoration processing is chosen, so overcorrection is avoided.
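The selection rule above can be sketched as follows: add the maximum infinity-side estimation variation to the first estimate, then take the stored filter distance on the infinity side closest to the corrected distance. The filter-bank distances and variation values below are assumed example numbers, not values from the patent.

```python
def select_restoration_filter(filter_distances, first_estimate, max_inf_variation):
    """Pick the infinity-side stored filter closest to the corrected distance.

    filter_distances: subject distances [m] of the stored filters, with
    float('inf') representing the infinity-side filter. The sum of
    first_estimate and max_inf_variation is the second subject distance."""
    corrected = first_estimate + max_inf_variation
    # Filters at or beyond the corrected distance lie on the infinity side.
    candidates = [d for d in filter_distances if d >= corrected]
    return min(candidates)

# Filters held only from the estimated-variation close distance (0.5 m, assumed)
# up to infinity, per the storage scheme described above.
bank = [0.5, 1.0, 2.0, 5.0, float("inf")]
chosen = select_restoration_filter(bank, 1.2, 0.3)   # corrected distance: 1.5 m
```

With the corrected distance at 1.5 m, the 2.0 m filter (the nearest stored filter on the infinity side) is chosen, so the applied restoration is never stronger than the true subject distance warrants.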
- Preferably, the photographing lens is an interchangeable lens that includes the restoration filter storage unit.
- In this case, the restoration filter corresponding to the mounted lens can be acquired from the restoration filter storage unit on the lens side, without providing a restoration filter storage unit on the imaging device main body side.
- Preferably, the photographing lens is an interchangeable lens, and the estimation variation acquisition unit acquires at least one of the estimation variation due to individual differences between interchangeable lenses and the estimation variation due to the temperature characteristics of the interchangeable lens.
- Preferably, the photographing lens is an interchangeable lens, and the estimation variation acquisition unit includes a lens information acquisition unit that acquires lens information from the mounted interchangeable lens and an estimation variation storage unit that stores the estimation variation for each of a plurality of interchangeable lenses, and acquires from the estimation variation storage unit the estimation variation corresponding to the lens information acquired by the lens information acquisition unit.
- Preferably, the imaging apparatus includes a recording unit that records the image obtained by the restoration processing of the restoration processing unit.
- An image processing method according to another aspect of the present invention includes: an image acquisition step of acquiring an image representing a subject image from an imaging unit having a photographing lens; a first subject distance estimation step of obtaining the subject distance estimated by a focus detection unit during imaging by the imaging unit; a step of preparing a restoration filter storage unit that stores one or more restoration filters selected from a plurality of restoration filters created based on at least the point spread function of the photographing lens corresponding to the subject distance; a restoration filter selection step of selecting, based on the subject distance estimated in the first subject distance estimation step, a restoration filter corresponding to that subject distance from the restoration filters stored in the restoration filter storage unit; and a restoration processing step of restoring the acquired image using the selected restoration filter.
- The restoration filter storage unit stores restoration filters only for the range of subject distances between infinity and the subject distance obtained by adding, to the closest subject distance estimated in the first subject distance estimation step, the maximum estimation variation on the infinity side of the range of estimation variation for that distance.
- Also in the image processing method, restoration filters for the range closer than the estimated-variation close distance are not provided from the beginning, thereby reducing the number of restoration filters held in advance. These filters are preferably not used in any case, because a restoration filter for a distance closer than the estimated-variation close distance may apply overcorrecting restoration processing.
- FIG. 1 is an external view of an imaging apparatus which is one embodiment of the present invention.
- FIG. 2 is a rear view of the imaging device shown in FIG.
- FIG. 3 is a principal block diagram of the imaging apparatus shown in FIG.
- FIG. 4 is a diagram showing a flow of contrast autofocus processing.
- FIG. 5 is a diagram illustrating contrast autofocus.
- FIG. 6 is a principal block diagram of the focus lens control unit.
- FIG. 7 is a principal block diagram of a main CPU and a digital signal processing unit that perform point image restoration processing.
- FIG. 8 is a diagram for explaining how to store and select the restoration filter in the first embodiment of the present invention.
- FIG. 9 is a diagram for explaining how to store and select the restoration filter according to the first embodiment of the present invention.
- FIG. 10 is a diagram for explaining how to store and select the restoration filter in the first embodiment of the present invention.
- FIG. 11 is a diagram for explaining how to store and select a restoration filter according to the second embodiment of the present invention.
- FIG. 12 is a diagram for explaining how to store and select the restoration filter according to the second embodiment of the present invention.
- FIG. 13 is a diagram for explaining how to store and select a restoration filter according to the second embodiment of the present invention.
- FIG. 14 is an external view of a smartphone that is another embodiment of the imaging apparatus.
- FIG. 15 is a block diagram illustrating a main configuration of the smartphone.
- FIG. 1 is a perspective view showing an appearance of an imaging apparatus 100 according to one aspect of the present invention.
- the imaging device 100 includes an imaging device body 200 and a lens device 300 that is replaceably attached to the imaging device body 200.
- The imaging apparatus main body 200 and the lens apparatus 300 are interchangeably coupled via a mount 246 (transmission means, reception means) provided on the imaging apparatus main body 200 and a corresponding mount 346 (reception means, transmission means) on the lens apparatus 300 side (see FIG. 3).
- a flash 240 is provided on the front surface of the imaging apparatus main body 200, and a release button 220-1 and a shooting mode setting dial 220-2 are provided on the upper surface.
- the mount 246 is provided with a terminal 247 (transmission means, reception means), and the mount 346 is provided with a terminal 347 (transmission means, reception means) (see FIG. 3).
- the corresponding terminal 247 and terminal 347 come into contact with each other to enable communication.
- the terminals 247 and 347 in FIGS. 1 and 3 are conceptually shown, and the position and number of terminals in the present invention are not limited to those in these drawings.
- a monitor 212, a cross button 260, a MENU / OK button 265, a playback button 270, a BACK button 275, and the like are disposed on the back of the imaging apparatus main body 200.
- the imaging apparatus 100 may be an interchangeable lens imaging apparatus or a lens fixed imaging apparatus.
- FIG. 3 is a block diagram illustrating a configuration of the imaging apparatus 100.
- the operation of the imaging apparatus 100 is controlled by the main CPU 214 of the imaging apparatus body 200 and the lens CPU 340 of the lens apparatus 300.
- Programs (including a program for driving the zoom lens ZL, the focus lens FL, and the aperture I) and data necessary for the operation of the main CPU 214 are stored in the flash ROM 226 and the ROM 228 in the imaging apparatus main body 200, while the programs and data necessary for the operation of the lens CPU 340 are stored in the ROM 344 in the lens CPU 340.
- the imaging apparatus main body 200 is provided with an operation unit 220 including a playback button, a MENU / OK key, a cross key, a BACK key, and the like.
- a signal from the operation unit 220 is input to the main CPU 214.
- The main CPU 214 controls each circuit of the imaging apparatus main body 200 based on the input signal, and transmits and receives signals to and from the lens apparatus 300 via the mount 246 (transmission unit, reception unit, lens information acquisition unit) and the mount communication unit 250 (transmission unit, reception unit, lens information acquisition unit).
- the above-described terminals include, for example, a grounding terminal, a synchronization signal terminal, a serial communication terminal, a control status communication terminal, and a power supply terminal from the battery 242 of the imaging apparatus main body 200 to each part of the lens apparatus 300.
- the subject light is imaged on the light receiving surface of the imaging element 202 of the imaging apparatus main body 200 via the zoom lens ZL, the focus lens FL, and the aperture stop I of the lens apparatus 300.
- the image sensor 202 is a CMOS type, but is not limited to a CMOS type and may be a CCD type.
- The zoom lens ZL, the focus lens FL, and the aperture I are driven respectively by the zoom lens control unit 310, the focus lens control unit 320 (lens driving unit), and the aperture control unit 330, which are controlled by the lens CPU 340; zoom control, focus control, and aperture control are thereby performed.
- the zoom lens control unit 310 moves the zoom lens ZL in the direction of the optical axis in accordance with a command from the lens CPU 340 to change the photographing magnification. Further, the focus lens control unit 320 moves the focus lens FL forward and backward in the direction of the optical axis in accordance with a command from the lens CPU 340 to focus on the subject.
- the aperture control unit 330 changes the aperture value of the aperture I according to a command from the lens CPU 340.
- When the release button 220-1 is pressed to the first stage (half-pressed), the main CPU 214 starts the AF and AE operations, and the image data output from the A/D converter 204 in response is captured by the AE/AWB detection unit 224.
- The main CPU 214 calculates the brightness of the subject (shooting EV value) from the integrated value of the G signals input to the AE/AWB detection unit 224, and based on the result controls the aperture value of the aperture I, the charge accumulation time of the image sensor 202 (corresponding to the shutter speed), the light emission time of the flash 240, and the like.
- the AF detection unit 222 is a part that performs contrast AF processing or phase difference AF processing.
- In the contrast AF process, the focus lens FL in the lens barrel is controlled so that the AF evaluation value indicating the in-focus state, calculated by integrating the high-frequency components of the image data in the focus area, is maximized.
- In the phase difference AF process, the focus lens FL in the lens apparatus 300 is controlled so that the defocus amount, obtained from phase difference data calculated using a plurality of phase-difference pixels in the focus area of the image data, becomes zero.
- the AF detection unit 222 operates as a focus detection unit that detects a focus.
- The flash 240 emits light under the control of the flash control unit 238. Based on a readout signal applied from the image sensor control unit 201, the signal charge accumulated in the image sensor 202 is read out as a voltage signal corresponding to the signal charge and applied to the analog signal processing unit 203.
- The analog signal processing unit 203 samples and holds the R, G, and B signals of each pixel by correlated double sampling of the voltage signal output from the image sensor 202, amplifies them, and applies the amplified signals to the A/D converter 204.
- the A / D converter 204 converts analog R, G, and B signals that are sequentially input into digital R, G, and B signals and outputs them to the image input controller 205.
- When the image sensor 202 is a CMOS image sensor, the A/D converter 204 is often built into the image sensor 202, and the correlated double sampling may be unnecessary.
- Image data output from the image input controller 205 is input to the digital signal processing unit 206 and is subjected to signal processing such as offset control, gain control processing including white balance correction and sensitivity correction, gamma correction processing, restoration processing, and YC processing.
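The gain control, gamma correction, and YC processing steps above can be sketched as follows. The white balance gains, the gamma value, and the BT.601 luma coefficients are illustrative assumptions, not values from the patent.

```python
import numpy as np

def white_balance(rgb, gains):
    """Gain control: apply per-channel white balance gains (assumed values)."""
    return rgb * gains

def gamma_correct(rgb, gamma=2.2):
    """Gamma correction on linear data clipped to [0, 1]."""
    return np.clip(rgb, 0.0, 1.0) ** (1.0 / gamma)

def rgb_to_yc(rgb):
    """YC processing: BT.601 luma (Y) and color-difference (Cb, Cr) signals."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    return y, (b - y) * 0.564, (r - y) * 0.713

pixel = np.array([[[0.2, 0.4, 0.1]]])                     # one linear RGB pixel
balanced = white_balance(pixel, np.array([1.8, 1.0, 1.5]))
y, cb, cr = rgb_to_yc(gamma_correct(balanced))
```

A neutral gray input yields zero color-difference signals, which is a quick sanity check for the YC step.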
- The processed image data is written to and read from the VRAM 230, encoded by the display control unit 210, and output to the monitor 212, whereby the subject image is displayed on the monitor 212.
- The imaging apparatus 100 includes a recording unit as a means for recording the obtained subject image; for example, an image obtained by the restoration processing may be recorded in the memory card 236.
- the image data output from the A / D converter 204 in response to the full press of the release button 220-1 is input from the image input controller 205 to the SDRAM (memory) 232 and temporarily stored.
- The temporarily stored image data is subjected to signal processing such as gain control processing, gamma correction processing, and YC processing in the digital signal processing unit 206, and to compression processing into the JPEG (Joint Photographic Experts Group) format in the compression / decompression processing unit 208.
- The imaging apparatus main body 200 and the lens apparatus 300 communicate via the mount 246 (transmission means, reception means) and the mount communication unit 250 (transmission means, reception means) of the imaging apparatus main body 200, the mount 346 (transmission means, reception means) and the mount communication unit 350 (transmission means, reception means) of the lens apparatus 300, and the terminals provided on the mount 246 and the mount 346.
- The lens movement command includes the control target (zoom lens ZL / focus lens FL / aperture I), the drive mode, a numerical value (the target position of the zoom lens ZL or focus lens FL, the aperture value of the aperture I, etc.), and brake ON/OFF information (information indicating whether or not to apply a brake by short-circuiting the coil of the motor 326 at the target position).
- contrast AF will be described with reference to FIGS.
- FIG. 4 shows a flowchart of the contrast AF process.
- First, in order to detect the lens position of the focus lens FL, the main CPU 214 outputs to the lens device 300 a lens movement command for moving the focus lens FL to the home position (HP), and the focus lens FL is moved to the HP (step S10).
- the HP is provided with an HP sensor 322 (FIG. 6) such as a photo interrupter.
- The HP sensor 322 outputs a lens detection signal when the focus lens FL, moving in accordance with a lens movement command, reaches the HP.
- This lens detection signal is used as a reset signal for resetting the count value of the up / down counter 320-8 functioning as a lens position detection means to zero.
- the movement of the focus lens FL to the HP may be performed when the lens apparatus 300 is attached to the imaging apparatus main body 200 or may be performed when the power is turned on.
- the main CPU 214 outputs a lens movement command for moving the focus lens FL from the infinity side (INF) to the closest end (AF search) to the lens apparatus 300 (step S14).
- the main CPU 214 acquires the count value as a lens position signal from the up / down counter 320-8 of the lens apparatus 300 (step S16), and acquires image data in the focus area for each appropriate lens position. Then, an AF evaluation value is calculated by the AF detection unit 222 (FIG. 3) (step S18).
- the AF detection unit 222 calculates an AF evaluation value indicating the in-focus state by integrating the high frequency components of the image data in the focus area, and outputs the calculated AF evaluation value to the main CPU 214.
- The main CPU 214 determines whether or not the AF search has ended, that is, whether or not the count value i (the position of the focus lens FL) has reached the closest end (step S20); if the count value i has not reached the closest end, the process returns to step S14 and the AF search continues.
- When the AF search has ended, the main CPU 214 calculates the lens position (focus position) at which the AF evaluation value is maximized, based on the AF evaluation values and lens positions acquired in steps S16 and S18 (step S22, (A) of FIG. 5).
- the main CPU 214 outputs a lens position command for moving the focus lens FL to the calculated focus position to the lens apparatus 300, and moves the focus lens FL to the focus position (step S24, FIG. 5B).
- the main CPU 214 calculates the subject distance from the lens position calculated in step S22 (step S26).
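The AF search loop in steps S14 to S22 can be sketched as follows: sweep the lens, compute an AF evaluation value from the high-frequency components of the focus area, and keep the position with the maximum value. The difference-based high-pass measure and the toy defocus model are assumptions for illustration only.

```python
import numpy as np

def af_evaluation(focus_area):
    """AF evaluation value: integrate high-frequency components of the
    focus area (a simple horizontal-difference high-pass, an assumption)."""
    return float(np.abs(np.diff(focus_area.astype(float), axis=1)).sum())

def contrast_af_search(capture_at, lens_positions):
    """Sweep the focus lens and return the position that maximizes the
    AF evaluation value (steps S14 to S22)."""
    values = [af_evaluation(capture_at(pos)) for pos in lens_positions]
    return lens_positions[int(np.argmax(values))]

# Toy optics: the image is sharpest at lens position 12 (hypothetical).
rng = np.random.default_rng(0)
scene = rng.random((16, 16))

def capture_at(pos):
    width = abs(pos - 12) + 1                 # more defocus -> wider box blur
    kernel = np.ones(width) / width
    return np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), 1, scene)

focus_position = contrast_af_search(capture_at, list(range(25)))
```

Because any blur attenuates the high-frequency content, the evaluation value peaks at the in-focus lens position, exactly the property the main CPU 214 exploits in step S22.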
- FIG. 6 shows a functional block diagram of the focus lens control unit 320.
- a lens movement command is sent from the main CPU 214 to the focus lens control unit 320 via the lens CPU 340.
- the focus lens control unit 320 mainly includes a controller 320-2, a driver 320-4, a motor 320-6, and an up / down counter 320-8.
- a lens movement command from the lens CPU 340 is received by the controller 320-2.
- a pulse signal corresponding to the lens movement command is output from the controller 320-2.
- the pulse signal output from the controller 320-2 is input to the driver 320-4.
- the driver 320-4 drives the motor 320-6 according to the input pulse signal.
- the motor 320-6 is preferably a stepping motor.
- the pulse signal output from the controller 320-2 is also sent to the up / down counter 320-8.
- the up / down counter 320-8 counts the pulse signals sent to it and outputs a count value. Since the up / down counter 320-8 is reset to 0 when the HP sensor 322 detects that the focus lens FL has reached HP, as described above, the count value of the up / down counter 320-8 indicates the lens position of the focus lens FL relative to HP.
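The pulse-counting scheme can be sketched as follows (a hypothetical model for illustration, not the patent's implementation): the counter increments or decrements with each drive pulse and is zeroed when the HP sensor fires, so its value is always the displacement from HP.

```python
class UpDownCounter:
    """Models the up/down counter 320-8: counts drive pulses and is
    reset to 0 when the focus lens reaches the home position (HP)."""
    def __init__(self) -> None:
        self.count = 0

    def pulse(self, direction: int) -> None:
        # direction is +1 (toward INF) or -1 (toward the closest end)
        self.count += direction

    def reset_at_hp(self) -> None:
        # called when the HP sensor 322 detects the home position
        self.count = 0

counter = UpDownCounter()
counter.reset_at_hp()
for _ in range(120):
    counter.pulse(+1)   # drive 120 pulses toward INF
for _ in range(20):
    counter.pulse(-1)   # back off 20 pulses
assert counter.count == 100  # lens position relative to HP
```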
- FIG. 7 is a block diagram showing the main parts of the main CPU 214 and the digital signal processing unit 206 that perform point image restoration processing.
- the main CPU 214 includes, as functions for the restoration processing, a lens position input unit 214-2, a first subject distance estimation unit 214-4, an estimation variation acquisition unit 214-6, and a second subject distance estimation unit 214-8.
- the lens position input unit 214-2 inputs the current count value of the up / down counter 320-8 after the focus control of the focus lens FL as information indicating the lens position.
- the first subject distance estimation unit 214-4 receives the information indicating the lens position from the lens position input unit 214-2 and estimates the distance (subject distance) of the subject focused by the focus lens FL at the current lens position. Since the lens position of the focus lens FL and the subject distance in focus at that position are in a fixed relationship, the subject distance can be estimated from the lens position of the focus lens FL.
- Information indicating the subject distance (first subject distance) estimated by the first subject distance estimation unit 214-4 is output to the estimation variation acquisition unit 214-6 and the second subject distance estimation unit 214-8.
- the estimation variation acquisition unit 214-6 calculates an error (estimation variation) of the first subject distance based on the information indicating the first subject distance input from the first subject distance estimation unit 214-4.
- the estimation variation is variation in the first subject distance estimated from the lens position of the focus lens FL, and is caused by the detection accuracy of the focus stop position of the focus lens FL. For example, if the lens barrel characteristics change with temperature, the focus stop position of the focus lens FL changes even if the subject distance is constant, and erroneous distance measurement occurs (estimation variation occurs).
- the estimation variation has an error range (estimation variation range) extending from the first object distance to the infinity side and the closest end side.
- the estimated variation acquisition unit 214-6 calculates the maximum estimated variation on the infinity side in the estimated variation range with respect to the first subject distance.
- the estimation variation acquisition unit 214-6 calculates the estimation variation of the first subject distance based on the information indicating the first subject distance and on the error information (n pulses) of the count value.
- the error may be calculated according to various conditions, or may be selected from an error database stored in an error storage unit (not shown) that is a means for storing the error.
- the error is specific to each individual interchangeable lens (individual difference) or depends on the temperature characteristics of the interchangeable lens.
- an estimated variation storage unit, which is a means for storing the error (estimated variation), may be included in the estimation variation acquisition unit 214-6, and an error (estimated variation) for each interchangeable lens may be stored therein.
- when the lens device 300 is attached, lens information related to the lens device 300 is acquired, and the corresponding estimated variation is acquired from the estimated variation storage unit based on the lens information.
- the estimated variation storage unit may be installed in the lens apparatus 300.
- the second subject distance estimation unit 214-8 calculates a second subject distance by adding the maximum error (estimation variation) on the infinity side to the estimated first subject distance, and outputs information indicating the calculated second subject distance to the restoration filter selection unit 206-4 of the digital signal processing unit 206.
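In other words, the second subject distance is the first (lens-position-derived) estimate shifted toward infinity by the worst-case estimation error. A one-line sketch (the error value is a hypothetical per-lens constant, not from the patent):

```python
def second_subject_distance(first_distance_m: float,
                            max_infinity_error_m: float) -> float:
    """Shift the estimate toward infinity by the maximum estimation
    variation, as done by the second subject distance estimation unit."""
    return first_distance_m + max_infinity_error_m

# e.g. a first estimate of 0.2 m with a worst-case +0.05 m infinity-side error
assert second_subject_distance(0.2, 0.05) == 0.25
```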
- the main parts of the digital signal processing unit 206 related to the restoration process are a restoration processing unit 206-2, a restoration filter selection unit 206-4, and a restoration filter storage unit 206-6.
- the restoration filter storage unit 206-6 stores, for each lens, a plurality of restoration filters created based on point spread functions of various lenses including the lens device 300.
- the plurality of restoration filters correspond to combinations of subject distance, zoom magnification, F-number, and image height (restoration area), and therefore amount to a large number of restoration filters.
- the restoration filter storage unit 206-6 stores these restoration filters discretely with respect to the subject distance.
- the restoration filter selection unit 206-4 receives the information indicating the second subject distance from the second subject distance estimation unit 214-8 and, based on this input information, selects from the restoration filter storage unit 206-6 the restoration filter corresponding to the second subject distance.
- specifically, the restoration filter on the infinity side closest to the second subject distance is selected. Details of the restoration filter selection method will be described later.
- the restoration processing unit 206-2 performs restoration processing on the image data using the restoration filter selected by the restoration filter selection unit 206-4.
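The selection rule described above (pick the stored filter whose design distance is the nearest one at or beyond the second subject distance, i.e. on the infinity side) can be sketched as follows; the distance set and the use of `math.inf` to stand for INF are illustrative assumptions:

```python
import math

# Stored filter design distances in metres; math.inf stands for INF.
STORED_DISTANCES = [0.2, 0.25, 0.33, 0.5, 1.0, math.inf]

def select_restoration_filter(second_distance_m: float) -> float:
    """Return the design distance of the stored filter on the infinity
    side closest to the second subject distance, preventing the
    overcorrection that a closer (stronger) filter would cause."""
    candidates = [d for d in STORED_DISTANCES if d >= second_distance_m]
    return min(candidates)  # nearest one on the infinity side

assert select_restoration_filter(0.25) == 0.25   # exact match
assert select_restoration_filter(0.26) == 0.33   # next filter toward INF
assert select_restoration_filter(2.0) == math.inf
```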
- the former case is suitable for PSF measurement corresponding only to the lens apparatus 300, and the latter case is suitable for PSF measurement considering the influence of the image sensor 202 (color filter or the like).
- when the blurred image acquired by imaging a point image with the imaging unit including the lens device 300 and the imaging element 202 is G(X, Y), and the original point image is F(X, Y), the two are related by the convolution G(X, Y) = H(X, Y) * F(X, Y) ([Expression 1]), from which H(X, Y), that is, the point spread function (PSF), is obtained.
- a restoration filter R(X, Y) is then determined such that applying it to the blurred image recovers the original point image, R(X, Y) * G(X, Y) = F(X, Y); this R(X, Y) is called a restoration filter.
- examples of the restoration filter include a least-squares filter (Wiener filter), a limited (constrained) deconvolution filter, a recursive filter, and a homomorphic filter.
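As a concrete instance of such a restoration filter, a frequency-domain Wiener filter can be sketched with NumPy; the PSF, the noise-to-signal constant, and the circular-convolution assumption are illustrative simplifications, not the patent's implementation:

```python
import numpy as np

def wiener_restore(blurred: np.ndarray, psf: np.ndarray,
                   noise_to_signal: float = 1e-3) -> np.ndarray:
    """Restore F from G = H * F using the Wiener-type filter
    R = conj(H) / (|H|^2 + k) applied in the frequency domain."""
    H = np.fft.fft2(psf, s=blurred.shape)
    R = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * R))

# Blur a point image with a 3x3 box PSF, then restore it.
point = np.zeros((16, 16))
point[8, 8] = 1.0
psf = np.ones((3, 3)) / 9.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(point) * np.fft.fft2(psf, s=point.shape)))
restored = wiener_restore(blurred, psf)
# The restored image is far closer to the original point than the blur.
assert np.abs(restored - point).max() < np.abs(blurred - point).max()
```

The constant `noise_to_signal` plays the regularizing role that keeps the filter from amplifying frequencies where H is small, which is the same reason a weaker filter is preferred when the subject distance is uncertain.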
- FIGS. 8 to 10 are diagrams for explaining how to store and select the restoration filter in the first embodiment of the present invention, which is performed by the restoration filter selection unit 206-4 and the restoration filter storage unit 206-6.
- FIG. 8 is an image diagram of a point spread function (PSF) according to a plurality of object distances within a range from the closest end (0.2 m) to infinity (INF). As described above, a restoration filter is generated based on these PSFs.
- the PSF on the infinity side has a smaller spread than the PSF on the near side; therefore, the restoration process by an infinity-side restoration filter is weaker than the restoration process by a near-side restoration filter.
- it is assumed that restoration filters corresponding to six distances (subject distances), namely the closest end (0.2 m), 0.25 m, 0.33 m, 0.5 m, 1 m, and INF, are stored. These subject distances correspond to the lens positions obtained by dividing the amount of movement of the focus lens FL from the closest end to INF into approximately five equal parts.
- the range indicated by each arrow in FIG. 8 shows the estimation variation of the subject distance when the first subject distance estimation unit 214-4 estimates 0.2 m, 0.25 m, 0.33 m, 0.5 m, 1 m, or INF.
- the circle at the tip of each arrow in the figure indicates, when the subject distance (first subject distance) estimated by the first subject distance estimation unit 214-4 is 0.2 m, 0.25 m, 0.33 m, 0.5 m, 1 m, or INF, the distance (second subject distance) obtained by adding to the first subject distance the maximum estimated variation on the infinity side of the estimated variation range for that first subject distance.
- for example, when the first subject distance is estimated to be 0.2 m and the error range is from 0.25 m to 0.2 m, the second subject distance estimated by the second subject distance estimation unit 214-8 in consideration of the maximum estimation variation on the infinity side is 0.25 m.
- in this case, the restoration filter on the infinity side closest to the second subject distance, that is, the restoration filter generated corresponding to a subject distance of 0.25 m, is selected to prevent overcorrection.
- overcorrection in the restoration process means estimating a subject distance closer than the actual subject distance and performing the restoration process using a restoration filter that is stronger (larger) than the restoration filter that should originally be used. If overcorrection occurs in the restoration process, the quality of the obtained image deteriorates.
- for example, when the first subject distance estimated by the first subject distance estimation unit 214-4 is 0.25 m and the error range is 0.33 m to 0.2 m, the restoration process may be overcorrected: since the actual subject distance lies between 0.33 m and 0.2 m, if the actual subject distance is 0.33 m, restoration using a filter generated corresponding to a subject distance of 0.25 m or 0.2 m is overcorrected.
- on the other hand, if a restoration filter generated corresponding to a subject distance far on the infinity side of the second subject distance is used, the effect of the restoration process cannot be sufficiently obtained. For example, if restoration is performed using the filter corresponding to INF even though the second subject distance is 0.25 m, the restoration is too weak and its effect cannot be sufficiently obtained.
- therefore, when the first subject distance estimated by the first subject distance estimation unit 214-4 is 0.25 m and the second subject distance estimated by the second subject distance estimation unit 214-8 is 0.33 m, overcorrection can be prevented by using the restoration filter generated corresponding to a subject distance of 0.33 m.
- in FIG. 8, the range of each arrow indicates the error range of the subject distance, and the restoration filter corresponding to the infinity side closest to the second subject distance within that error range is selected. Therefore, when 0.25 m is estimated as the first subject distance, the restoration filter generated corresponding to the subject distance of 0.33 m is used; when 0.33 m is estimated, the restoration filter generated corresponding to the subject distance of 0.5 m is used; when 0.5 m is estimated, the restoration filter generated corresponding to the subject distance of 1 m is used; and when 1 m or INF is estimated, the restoration filter generated corresponding to the subject distance of INF is used.
- for example, when the first subject distance is 0.5 m, the second subject distance considering the maximum estimation variation lies between 0.5 m and 1 m. Therefore, the restoration filter on the infinity side closest to the second subject distance is the restoration filter generated corresponding to the subject distance of 1 m.
- alternatively, the restoration filter generated corresponding to the INF subject distance may be used regardless of the second subject distance.
- FIG. 9 shows distance measurement errors and the like corresponding to a lens (interchangeable lens) different from the lens apparatus 300. Since temperature characteristics differ depending on the type of lens and there are individual differences, the estimation variation of the subject distance estimated from the focus stop position of the focus lens differs from lens to lens.
- FIG. 9 shows a case where the estimation variation (error range) indicated by the arrow ranges is larger (longer) than in the case of FIG. 8.
- in this case as well, the restoration filter on the infinity side closest to the second subject distance is used. That is, when 0.2 m is estimated as the first subject distance, the restoration filter generated corresponding to the subject distance of 0.33 m is used; when 0.25 m is estimated, the restoration filter generated corresponding to the subject distance of 0.5 m is used; when 0.33 m is estimated, the restoration filter generated corresponding to the subject distance of 1 m is used; and when 1 m or more is estimated, the restoration filter generated corresponding to the INF subject distance is used.
- FIG. 10 shows a case where the error range is larger (longer) than in the case of FIG. 9.
- in this case, whatever the first subject distance, the subject distance considering the estimation variation (second subject distance) is larger than 1 m. Therefore, the only restoration filter that does not cause overcorrection is the restoration filter generated corresponding to INF; that is, for the lens having the estimation variation shown in FIG. 10, only the restoration filter generated corresponding to INF is used regardless of the estimated subject distance.
- FIGS. 11 to 13 are diagrams for explaining how to store and select the restoration filter according to the second embodiment of the present invention.
- the second embodiment shown in FIGS. 11 to 13 stores restoration filters corresponding to the subject distance in the same manner as the first embodiment shown in FIGS. 8 to 10, but differs in how the restoration filters are stored in the restoration filter storage unit 206-6.
- in the second embodiment, the restoration filter storage unit 206-6, which is a means for storing restoration filters, stores restoration filters corresponding to the range between infinity and the subject distance (hereinafter referred to as the "estimated variation close distance") obtained by adding, to the closest subject distance that can be estimated by the first subject distance estimation unit 214-4, the maximum estimated variation on the infinity side of the estimation variation range for that subject distance.
- restoration filters for the range closer than the estimated variation close distance could cause overcorrected restoration processing. Therefore, the restoration filter storage unit 206-6 stores only the plurality of restoration filters corresponding to the range between the estimated variation close distance and infinity, and does not store restoration filters for the range closer than the estimated variation close distance in the first place. This reduces the number of restoration filters stored in the restoration filter storage unit 206-6.
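The storage saving of the second embodiment can be sketched as a simple pruning of the filter table: only design distances at or beyond the estimated variation close distance are kept. The distance set below is the illustrative one used for FIG. 8, not data from the patent:

```python
import math

def prune_filter_table(design_distances: list[float],
                       estimated_variation_close_distance_m: float) -> list[float]:
    """Keep only filters usable without overcorrection, i.e. those whose
    design distance is not closer than the estimated variation close
    distance; closer filters are never stored in the first place."""
    return [d for d in design_distances
            if d >= estimated_variation_close_distance_m]

all_filters = [0.2, 0.25, 0.33, 0.5, 1.0, math.inf]
# FIG. 11 case: closest estimable distance 0.2 m plus its worst-case
# infinity-side error gives an estimated variation close distance of 0.25 m.
assert prune_filter_table(all_filters, 0.25) == [0.25, 0.33, 0.5, 1.0, math.inf]
```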
- in the first embodiment, a restoration filter is stored in the restoration filter storage unit 206-6 for each of a plurality of preset subject distances regardless of whether or not it is used; in the second embodiment, restoration filters that are not used in relation to the error range of the subject distance are not stored in the restoration filter storage unit 206-6 (see FIGS. 11 to 13).
- FIG. 11 is a diagram showing the second embodiment corresponding to the first embodiment shown in FIG. 8. Comparing the two, the second embodiment differs in that there is no restoration filter corresponding to a subject distance of 0.2 m.
- the restoration filter corresponding to the infinity side closest to the estimated variation close distance is a restoration filter generated corresponding to the subject distance of 0.25 m.
- as shown in FIG. 11, the restoration filter generated corresponding to the subject distance of 0.2 m is not used regardless of the estimated subject distance; unused restoration filters are not stored in the restoration filter storage unit 206-6.
- FIG. 12 is a diagram showing the second embodiment corresponding to the first embodiment shown in FIG. 9. Comparing the two, the second embodiment differs in that there are no restoration filters corresponding to subject distances of 0.2 m and 0.25 m.
- the restoration filter corresponding to the infinity side closest to the estimated variation close distance is a restoration filter generated corresponding to the subject distance of 0.33 m.
- as shown in FIG. 12, the restoration filters generated corresponding to the subject distances of 0.2 m and 0.25 m are not used regardless of the estimated subject distance; unused restoration filters are not stored in the restoration filter storage unit 206-6.
- FIG. 13 is a diagram showing the second embodiment corresponding to the first embodiment shown in FIG. 10. Comparing the two, the second embodiment differs in that there are no restoration filters corresponding to subject distances of 0.2 m, 0.25 m, 0.33 m, 0.5 m, and 1 m.
- the restoration filter corresponding to the infinity side closest to the estimated variation close distance is the restoration filter of INF (1).
- the unused restoration filters (2) to (6) are not stored in the restoration filter storage unit 206-6, while the restoration filter (1), which is selected in relation to the error range, is stored in the restoration filter storage unit 206-6.
- the restoration filter that is not used in relation to the error range is not stored in the restoration filter storage unit 206-6. Thereby, the memory capacity for storing the restoration filter can be reduced.
- as a modification, the restoration filter storage unit, which is a means for storing restoration filters, may have a plurality of tables each storing a plurality of restoration filters corresponding to an estimation variation, and the restoration filter selection unit, which is a means for selecting a restoration filter, may select the corresponding table from the plurality of tables according to the magnitude of the estimation variation acquired by the estimation variation acquisition unit 214-6, which is a means for acquiring the estimation variation, and then select a restoration filter from the selected table.
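This modification, one filter table per estimation-variation class chosen at runtime, can be sketched as follows; the variation thresholds and table contents are illustrative assumptions rather than values from the patent:

```python
import math

# Hypothetical tables keyed by an upper bound on the estimation variation (m).
FILTER_TABLES = {
    0.05: [0.25, 0.33, 0.5, 1.0, math.inf],  # small variation (cf. FIG. 11)
    0.15: [0.33, 0.5, 1.0, math.inf],        # medium variation (cf. FIG. 12)
    math.inf: [math.inf],                    # large variation (cf. FIG. 13)
}

def select_table(estimation_variation_m: float) -> list[float]:
    """Pick the table whose variation bound first covers the variation
    acquired by the estimation variation acquisition unit."""
    for bound in sorted(FILTER_TABLES):
        if estimation_variation_m <= bound:
            return FILTER_TABLES[bound]
    raise ValueError("variation exceeds all table bounds")

assert select_table(0.03) == [0.25, 0.33, 0.5, 1.0, math.inf]
assert select_table(0.5) == [math.inf]
```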
- FIG. 14 shows an appearance of a smartphone 500 that is another embodiment of the imaging apparatus 100.
- the smartphone 500 illustrated in FIG. 14 includes a flat housing 502 and, on one surface of the housing 502, a display input unit 520 in which a display panel 521 serving as the monitor 212 and an operation panel 522 serving as an input unit are integrated.
- the housing 502 includes a speaker 531, a microphone 532, an operation unit 540, and a camera unit 541. Note that the configuration of the housing 502 is not limited to this, and for example, a configuration in which the monitor 212 and the input unit are independent may be employed, or a configuration having a folding structure or a slide mechanism may be employed.
- FIG. 15 is a block diagram showing a configuration of the smartphone 500 shown in FIG.
- the main components of the smartphone include a wireless communication unit 510, a display input unit 520, a call unit 530, an operation unit 540, a camera unit 541, a storage unit 550, an external input / output unit 560, a GPS (Global Positioning System) receiving unit 570, a motion sensor unit 580, a power supply unit 590, and a main control unit 501.
- the smartphone 500 also has a wireless communication function for performing mobile wireless communication via a base station device BS and a mobile communication network NW.
- the wireless communication unit 510 performs wireless communication with the base station apparatus BS accommodated in the mobile communication network NW according to an instruction from the main control unit 501. Using this wireless communication, transmission and reception of various file data such as audio data and image data, e-mail data, and reception of Web data and streaming data are performed.
- the display input unit 520 is a so-called touch panel that includes a display panel 521 and an operation panel 522; under the control of the main control unit 501, it displays images (still images and moving images), character information, and the like to visually convey information to the user, and detects user operations on the displayed information.
- the display panel 521 is preferably a 3D display panel.
- the display panel 521 uses an LCD (Liquid Crystal Display), an OELD (Organic Electro-Luminescence Display), or the like as a display device.
- the operation panel 522 is a device that is placed so that an image displayed on the display surface of the display panel 521 is visible and detects one or a plurality of coordinates operated by a user's finger or stylus. When this device is operated by a user's finger or stylus, a detection signal generated due to the operation is output to the main control unit 501. Next, the main control unit 501 detects an operation position (coordinates) on the display panel 521 based on the received detection signal.
- as shown in FIG. 14, the display panel 521 and the operation panel 522 of the smartphone 500 integrally constitute the display input unit 520, with the operation panel 522 disposed so as to completely cover the display panel 521.
- the operation panel 522 may have a function of detecting a user operation even in an area outside the display panel 521.
- in other words, the operation panel 522 may include a detection area for the portion overlapping the display panel 521 (hereinafter referred to as a display area) and a detection area for the outer edge portion not overlapping the display panel 521 (hereinafter referred to as a non-display area).
- the operation panel 522 may thus include two sensitive regions, the outer edge portion and the inner portion; the width of the outer edge portion is designed as appropriate according to the size of the housing 502 and the like.
- examples of the position detection method employed in the operation panel 522 include a matrix switch method, a resistive film method, a surface acoustic wave method, an infrared method, an electromagnetic induction method, and a capacitance method; any of these may be employed.
- the call unit 530 includes a speaker 531 and a microphone 532; it converts the user's voice input through the microphone 532 into voice data that can be processed by the main control unit 501 and outputs the voice data to the main control unit 501, and it decodes voice data received by the wireless communication unit 510 or the external input / output unit 560 and outputs it from the speaker 531.
- the speaker 531 can be mounted on the same surface as the surface on which the display input unit 520 is provided, and the microphone 532 can be mounted on the side surface of the housing 502.
- the operation unit 540 is a hardware key using a key switch or the like, and receives an instruction from the user.
- the operation unit 540 is mounted on the lower side of the display input unit 520 on the housing 502 of the smartphone 500, and is a push-button switch that is turned on when pressed with a finger or the like and turned off by the restoring force of a spring or the like when the finger is released.
- the storage unit 550 stores the control program and control data of the main control unit 501, address data associating the names and telephone numbers of communication partners, transmitted and received e-mail data, Web data downloaded by Web browsing, and downloaded content data, and temporarily stores streaming data and the like.
- the storage unit 550 includes an internal storage unit 551 built in the smartphone and an external storage unit 552 having a removable external memory slot.
- each of the internal storage unit 551 and the external storage unit 552 constituting the storage unit 550 is realized using a storage medium such as a flash memory type, hard disk type, multimedia card micro type, or card type memory (for example, Micro SD (registered trademark) memory), RAM (Random Access Memory), or ROM (Read Only Memory).
- the external input / output unit 560 serves as an interface to all external devices connected to the smartphone 500, and connects directly or indirectly to other external devices by communication (for example, universal serial bus (USB), IEEE 1394, etc.) or via a network (for example, the Internet, wireless LAN, Bluetooth (registered trademark), RFID (Radio Frequency Identification), infrared communication (Infrared Data Association: IrDA) (registered trademark), UWB (Ultra Wideband) (registered trademark), ZigBee (registered trademark), etc.).
- examples of external devices connected to the smartphone 500 include a wired / wireless headset, a wired / wireless external charger, a wired / wireless data port, a memory card connected via a card socket, a SIM (Subscriber Identity Module) card / UIM (User Identity Module) card, external audio / video equipment connected via an audio / video I / O (Input / Output) terminal, and external audio / video equipment connected wirelessly.
- the external input / output unit 560 can transmit data received from such external devices to each component inside the smartphone 500, and can transmit data inside the smartphone 500 to the external devices.
- the GPS receiving unit 570 receives GPS signals transmitted from GPS satellites ST1 to STn in accordance with an instruction from the main control unit 501, performs positioning calculation processing based on the plurality of received GPS signals, and detects the position of the smartphone 500 consisting of latitude, longitude, and altitude.
- when the GPS receiving unit 570 can acquire position information from the wireless communication unit 510 or the external input / output unit 560 (for example, via a wireless LAN), it can also detect the position using that position information.
- the motion sensor unit 580 includes, for example, a triaxial acceleration sensor and detects the physical movement of the smartphone 500 in accordance with an instruction from the main control unit 501; by detecting the physical movement, the moving direction and acceleration of the smartphone 500 are obtained, and the detection result is output to the main control unit 501.
- the power supply unit 590 supplies power stored in a battery (not shown) to each unit of the smartphone 500 in accordance with an instruction from the main control unit 501.
- the main control unit 501 includes a microprocessor, operates according to a control program and control data stored in the storage unit 550, and controls each unit of the smartphone 500 in an integrated manner. Further, the main control unit 501 has a mobile communication control function for controlling each part of the communication system and an application processing function in order to perform voice communication and data communication through the wireless communication unit 510.
- the application processing function is realized by the main control unit 501 operating according to the application software stored in the storage unit 550.
- examples of the application processing function include an infrared communication function for controlling the external input / output unit 560 to perform data communication with a facing device, an e-mail function for sending and receiving e-mails, a Web browsing function for browsing Web pages, and a function according to the present invention for generating a 3D image from a 2D image.
- the main control unit 501 also has an image processing function such as displaying video on the display input unit 520 based on image data (still image or moving image data) such as received data or downloaded streaming data.
- the image processing function is a function in which the main control unit 501 decodes the image data, performs image processing on the decoding result, and displays an image on the display input unit 520.
- the main control unit 501 executes display control for the display panel 521 and operation detection control for detecting a user operation through the operation unit 540 and the operation panel 522.
- the main control unit 501 displays an icon for starting application software, a software key such as a scroll bar, or a window for creating an e-mail.
- the scroll bar refers to a software key for accepting an instruction to move the display portion of a large image that does not fit in the display area of the display panel 521.
- through execution of the operation detection control, the main control unit 501 detects a user operation through the operation unit 540, accepts an operation on an icon or an input of a character string in the input field of a window through the operation panel 522, and accepts a scroll request for the displayed image through the scroll bar.
- furthermore, through execution of the operation detection control, the main control unit 501 determines whether the operation position on the operation panel 522 is in the portion overlapping the display panel 521 (display area) or in the outer edge portion not overlapping the display panel 521 (non-display area), and has a touch panel control function for controlling the sensitive area of the operation panel 522 and the display position of the software keys.
- the main control unit 501 can also detect a gesture operation on the operation panel 522 and execute a preset function in accordance with the detected gesture operation.
- a gesture operation means not a conventional simple touch operation but an operation that draws a trajectory with a finger or the like, designates a plurality of positions simultaneously, or, by combining these, draws a trajectory for at least one of a plurality of positions.
- the camera unit 541 is a digital camera that performs electronic photography using an imaging element 202 such as a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge-Coupled Device) sensor, and has functions similar to those of the imaging apparatus 100 shown in FIG. 1.
- the camera unit 541 is configured to be able to switch between a manual focus mode and an autofocus mode.
- in the manual focus mode, the photographing lens of the camera unit 541 can be focused by operating the operation unit 540 or a focus icon displayed on the display input unit 520.
- in the manual focus mode, a live view image combined with a split image is displayed on the display panel 521 so that the in-focus state during manual focusing can be confirmed.
- under the control of the main control unit 501, the camera unit 541 converts image data obtained by imaging into compressed image data such as JPEG (Joint Photographic Experts Group) data, and can record the data in the storage unit 550 or output it through the external input / output unit 560 or the wireless communication unit 510.
- the camera unit 541 is mounted on the same surface as the display input unit 520, but the mounting position of the camera unit 541 is not limited to this, and may be mounted on the back surface of the display input unit 520. Alternatively, a plurality of camera units 541 may be mounted. Note that when a plurality of camera units 541 are mounted, the camera unit 541 used for shooting can be switched to shoot alone, or a plurality of camera units 541 can be used simultaneously for shooting.
- the camera unit 541 can be used for various functions of the smartphone 500.
- an image acquired by the camera unit 541 can be displayed on the display panel 521, or an image from the camera unit 541 can be used as one of the operation inputs of the operation panel 522.
- when the GPS receiving unit 570 detects the position, the position can also be detected by referring to an image from the camera unit 541.
- by referring to an image from the camera unit 541, the optical axis direction of the camera unit 541 of the smartphone 500 can be determined without using the triaxial acceleration sensor, or in combination with the triaxial acceleration sensor, and the current usage environment can also be determined.
- the image from the camera unit 541 can be used in the application software.
- position information acquired by the GPS receiver 570, voice information acquired by the microphone 532 (which may be converted into text information by the main control unit or the like), posture information acquired by the motion sensor unit 580, and the like can be added to the image data of a still image or a moving image and recorded in the storage unit 550, or output through the external input / output unit 560 or the wireless communication unit 510.
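As a rough illustration of the recording flow described above, the sketch below bundles optional metadata with captured image data. The function name, field names, and record layout are invented for this example and are not part of the patent:

```python
def bundle_capture(image_bytes, gps=None, audio=None, posture=None):
    """Attach optional GPS, voice, and posture metadata to captured
    image data before it is recorded or output."""
    record = {"image": image_bytes}
    if gps is not None:
        record["gps"] = gps          # position from the GPS receiver 570
    if audio is not None:
        record["audio"] = audio      # voice captured by the microphone 532
    if posture is not None:
        record["posture"] = posture  # attitude from the motion sensor unit 580
    return record

rec = bundle_capture(b"...jpeg data...", gps=(35.68, 139.76), posture={"pitch": 1.5})
print(sorted(rec))  # → ['gps', 'image', 'posture']
```

Fields left as `None` are simply omitted, mirroring the "can be added" wording: each kind of metadata is optional.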
- DESCRIPTION OF SYMBOLS: 100 ... imaging device, 200 ... imaging device main body, 202 ... imaging element, 206 ... digital signal processing unit, 212 ... monitor, 214 ... main CPU, 220 ... operation unit, 214 ... AF detection unit, 300 ... lens apparatus, 320 ... focus lens control unit, 340 ... lens CPU, 500 ... smartphone
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Optics & Photonics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Studio Devices (AREA)
Abstract
Description
Next, contrast AF will be described with reference to FIGS. 4 and 5.
FIG. 6 is a functional block diagram of the focus lens control unit 320. A lens movement command is sent to the focus lens control unit 320 from the main CPU 214 via the lens CPU 340.
FIG. 7 is a block diagram of the main parts of the main CPU 214 and the digital signal processing unit 206 that perform point image restoration processing. As functions for restoration processing, the main CPU 214 includes a lens position input unit 214-2, a first subject distance estimation unit 214-4, an estimation variation acquisition unit 214-6, and a second subject distance estimation unit 214-8.
G(X,Y)=H(X,Y)*F(X,Y)
where * denotes convolution.
G(X,Y)*R(X,Y)=F(X,Y)
This R(X,Y) is called the restoration filter. As the restoration filter, a least-squares filter (Wiener filter) that minimizes the mean square error between the original image and the restored image, a constrained deconvolution filter, a recursive filter, a homomorphic filter, or the like can be used.
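To make the relation G = H * F and its inversion concrete, here is a minimal numerical sketch of frequency-domain restoration with a Wiener-style filter. This is not the patent's implementation; the 1-D signal, the point spread function, and the regularization constant `k` are all made-up values chosen only for illustration:

```python
import numpy as np

def wiener_restore(g, h, k=1e-3):
    """Build a Wiener-style restoration filter R = H* / (|H|^2 + k) from
    the point spread function h and apply it to the observed signal g."""
    H = np.fft.fft(h, n=len(g))
    R = np.conj(H) / (np.abs(H) ** 2 + k)  # restoration filter in frequency domain
    return np.real(np.fft.ifft(np.fft.fft(g) * R))

# Made-up 1-D example: an original signal F, a small PSF H, and the
# observed (degraded) signal G = H * F via circular convolution.
f = np.zeros(64)
f[20], f[40] = 1.0, 0.5
h = np.array([0.6, 0.3, 0.1])
g = np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(h, 64)))

f_hat = wiener_restore(g, h)
print(np.max(np.abs(f_hat - f)) < 0.05)  # restoration is close to the original
```

The small constant `k` stands in for the noise-to-signal term of a true Wiener filter; with `k = 0` the division can blow up wherever H(ω) is near zero.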
FIGS. 8 to 10 are diagrams for explaining how restoration filters are stored and selected by the restoration filter selection unit 206-4 and the restoration filter storage unit 206-6 in the first embodiment of the present invention.
Alternatively, when a subject distance other than INF is estimated, for example a subject distance between 0.33 m and 0.5 m, the second subject distance that takes the maximum estimation variation into account is a subject distance between 0.5 m and 1 m. Therefore, the restoration filter on the infinity side closest to this second subject distance is the restoration filter generated for a subject distance of 1 m.
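The selection rule in this example can be sketched as follows. The set of stored filter distances and the variation margin are illustrative values, not taken from the patent:

```python
# Hypothetical subject distances (meters) for which restoration filters
# were generated; float("inf") stands for INF.
FILTER_DISTANCES = [0.2, 0.25, 0.33, 0.5, 1.0, float("inf")]

def select_filter_distance(estimated_m, max_variation_m):
    """Add the infinity-side maximum estimation variation to the estimate
    (second subject distance), then pick the nearest stored distance on
    the infinity side, i.e. at or beyond that second distance."""
    second_estimate = estimated_m + max_variation_m
    return min(d for d in FILTER_DISTANCES if d >= second_estimate)

# An estimate between 0.33 m and 0.5 m whose margin pushes the second
# subject distance between 0.5 m and 1 m selects the 1 m filter.
print(select_filter_distance(0.4, 0.3))  # → 1.0
```

Choosing the infinity-side neighbor (rather than the nearest overall) ensures the selected filter never corresponds to a distance closer than the worst-case true subject distance.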
FIGS. 11 to 13 are diagrams for explaining how restoration filters are stored and selected in the second embodiment of the present invention.
Claims (9)
- An imaging device comprising:
an imaging unit that captures a subject image formed by a photographing lens and acquires an image representing the subject image;
a first subject distance estimation unit that obtains a subject distance estimated by a focus detection unit at the time of imaging by the imaging unit;
a restoration filter storage unit that stores one or more restoration filters selected from a plurality of restoration filters created based on at least a point spread function of the photographing lens corresponding to a subject distance;
a restoration filter selection unit that selects a restoration filter corresponding to the subject distance, from the restoration filters stored in the restoration filter storage unit, based on the subject distance estimated by the first subject distance estimation unit; and
a restoration processing unit that performs restoration processing on the image acquired by the imaging unit, using the restoration filter selected by the restoration filter selection unit,
wherein the restoration filter storage unit stores restoration filters corresponding to a range between infinity and a subject distance obtained by adding, to the closest-side subject distance estimated by the first subject distance estimation unit, the maximum estimation variation on the infinity side within the range of estimation variation for the closest-side subject distance. - The imaging device according to claim 1, wherein the first subject distance estimation unit estimates the subject distance based on a lens position of a focus lens of the photographing lens.
- The imaging device according to claim 2, wherein the photographing lens is an interchangeable lens, and
the maximum estimation variation on the infinity side within the range of estimation variation for the subject distance estimated by the first subject distance estimation unit is at least one of estimation variation due to individual differences of the interchangeable lens and estimation variation due to temperature characteristics of the interchangeable lens. - The imaging device according to any one of claims 1 to 3, further comprising: an estimation variation acquisition unit that acquires the maximum estimation variation on the infinity side within the range of estimation variation for the subject distance estimated by the first subject distance estimation unit; and
a second subject distance estimation unit that calculates a subject distance by adding the maximum estimation variation on the infinity side acquired by the estimation variation acquisition unit to the subject distance estimated by the first subject distance estimation unit,
wherein the restoration filter selection unit selects, from the one or more restoration filters stored in the restoration filter storage unit, the restoration filter on the infinity side closest to the subject distance calculated by the second subject distance estimation unit. - The imaging device according to any one of claims 1 to 4, wherein the photographing lens is an interchangeable lens in which the restoration filter storage unit is incorporated.
- The imaging device according to claim 4, wherein the photographing lens is an interchangeable lens, and
the estimation variation acquisition unit acquires, from the interchangeable lens, at least one of estimation variation due to individual differences of the interchangeable lens and estimation variation due to temperature characteristics of the interchangeable lens. - The imaging device according to claim 4, wherein the photographing lens is an interchangeable lens, and
the estimation variation acquisition unit includes:
a lens information acquisition unit that acquires lens information of a mounted interchangeable lens from the interchangeable lens; and an estimation variation storage unit that stores estimation variation for each of a plurality of interchangeable lenses, and acquires, from the estimation variation storage unit, the estimation variation corresponding to the lens information acquired by the lens information acquisition unit. - The imaging device according to any one of claims 1 to 7, further comprising a recording unit that records the image obtained by the restoration processing of the restoration processing unit.
- An image processing method comprising:
an image acquisition step of acquiring an image representing a subject image from an imaging unit having a photographing lens;
a first subject distance estimation step of obtaining a subject distance estimated by a focus detection unit at the time of imaging by the imaging unit;
a step of preparing a restoration filter storage unit that stores one or more restoration filters selected from a plurality of restoration filters created based on at least a point spread function of the photographing lens corresponding to a subject distance;
a restoration filter selection step of selecting a restoration filter corresponding to the subject distance, from the restoration filters stored in the restoration filter storage unit, based on the subject distance estimated in the first subject distance estimation step; and
a restoration processing step of performing restoration processing on the image acquired by the imaging unit, using the restoration filter selected in the restoration filter selection step,
wherein the restoration filter storage unit stores restoration filters corresponding to a range between infinity and a subject distance obtained by adding, to the closest-side subject distance estimated in the first subject distance estimation step, the maximum estimation variation on the infinity side within the range of estimation variation for the closest-side subject distance.
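The device and method claims above can be read as one processing pipeline: estimate the subject distance, widen it by the infinity-side maximum estimation variation, select the nearest infinity-side restoration filter, and apply it. The skeleton below is a hypothetical illustration of that flow; the class, its field names, and the toy string-tagging "filters" are all invented for this sketch:

```python
from dataclasses import dataclass

@dataclass
class RestorationPipeline:
    filters: dict          # subject distance (m) -> restoration filter callable
    max_variation: float   # infinity-side maximum estimation variation (m)

    def select(self, estimated_distance):
        # Second subject distance: the estimate plus the variation margin,
        # then the nearest stored distance on the infinity side.
        second = estimated_distance + self.max_variation
        key = min(d for d in self.filters if d >= second)
        return self.filters[key]

    def restore(self, image, estimated_distance):
        # Apply the selected restoration filter to the acquired image.
        return self.select(estimated_distance)(image)

# Toy filters that just tag the "image" with the filter they represent.
pipe = RestorationPipeline(
    filters={1.0: lambda img: img + "+f(1m)", float("inf"): lambda img: img + "+f(INF)"},
    max_variation=0.25,
)
print(pipe.restore("raw", 0.6))  # → raw+f(1m)
```

Because only filters from the closest-plus-margin distance out to infinity are stored, `select` never needs a filter closer than the worst-case estimate, which is the storage-saving point of the claims.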
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201380049879.2A CN104685863B (zh) | 2012-09-27 | 2013-04-23 | Imaging device and image processing method |
DE112013004760.8T DE112013004760B4 (de) | 2012-09-27 | 2013-04-23 | Imaging device and image processing method |
JP2014538218A JP5746795B2 (ja) | 2012-09-27 | 2013-04-23 | Imaging device and image processing method |
US14/661,361 US9363430B2 (en) | 2012-09-27 | 2015-03-18 | Imaging device and image processing method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-213831 | 2012-09-27 | ||
JP2012213831 | 2012-09-27 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/661,361 Continuation US9363430B2 (en) | 2012-09-27 | 2015-03-18 | Imaging device and image processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014050189A1 true WO2014050189A1 (ja) | 2014-04-03 |
Family
ID=50387609
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/061842 WO2014050189A1 (ja) | 2013-04-23 | Imaging device and image processing method |
Country Status (5)
Country | Link |
---|---|
US (1) | US9363430B2 (ja) |
JP (1) | JP5746795B2 (ja) |
CN (1) | CN104685863B (ja) |
DE (1) | DE112013004760B4 (ja) |
WO (1) | WO2014050189A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017041727A (ja) * | 2015-08-19 | 2017-02-23 | Canon Inc. | Image processing device, imaging device, and image processing program |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014050188A1 (ja) * | 2012-09-27 | 2014-04-03 | Fujifilm Corporation | Imaging device and image processing method |
JP5864037B2 (ja) * | 2013-08-02 | 2016-02-17 | Fujifilm Corporation | Image processing device, imaging device, image processing method, and program |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001197354A (ja) * | 2000-01-13 | 2001-07-19 | Minolta Co Ltd | Digital imaging device and image restoration method |
JP2011259314A (ja) * | 2010-06-10 | 2011-12-22 | Fujifilm Corp | Imaging device, image processing device, and program |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010008418A1 (en) | 2000-01-13 | 2001-07-19 | Minolta Co., Ltd. | Image processing apparatus and method |
JP4389371B2 (ja) * | 2000-09-28 | 2009-12-24 | Nikon Corporation | Image restoration device and image restoration method |
JP3468231B2 (ja) * | 2001-07-02 | 2003-11-17 | Minolta Co., Ltd. | Image processing device, image quality control method, program, and recording medium |
US8139130B2 (en) * | 2005-07-28 | 2012-03-20 | Omnivision Technologies, Inc. | Image sensor with improved light sensitivity |
KR100829581B1 (ko) * | 2006-11-28 | 2008-05-14 | 삼성전자주식회사 | 영상 처리 방법, 기록매체 및 장치 |
JP5111306B2 (ja) * | 2008-08-29 | 2013-01-09 | Canon Inc. | Optical apparatus having image blur correction function and control method therefor |
US20110199511A1 (en) * | 2008-10-20 | 2011-08-18 | Camelot Co., Ltd. | Image photographing system and image photographing method |
US8218823B2 (en) * | 2009-08-11 | 2012-07-10 | Eastman Kodak Company | Determining main objects using range information |
JP5387377B2 (ja) * | 2009-12-14 | 2014-01-15 | Sony Corporation | Image processing device, image processing method, and program |
US20120026360A1 (en) * | 2010-02-08 | 2012-02-02 | Panasonic Corporation | Imaging device |
JP5501069B2 (ja) | 2010-03-31 | 2014-05-21 | Canon Inc. | Image processing device, imaging device, image processing method, and program |
JP5454348B2 (ja) * | 2010-05-12 | 2014-03-26 | Sony Corporation | Imaging device and image processing device |
US8774550B2 (en) * | 2010-06-04 | 2014-07-08 | Panasonic Corporation | Picture processing device, picture processing method, integrated circuit, and program |
CN102625043B (zh) * | 2011-01-25 | 2014-12-10 | 佳能株式会社 | 图像处理设备、成像设备和图像处理方法 |
US8928783B2 (en) * | 2011-09-26 | 2015-01-06 | Ricoh Company, Ltd. | Imaging apparatus including switchable edge extraction |
JP6115781B2 (ja) * | 2012-03-29 | 2017-04-19 | Panasonic IP Management Co., Ltd. | Image processing device and image processing method |
JP6039301B2 (ja) * | 2012-08-09 | 2016-12-07 | Canon Inc. | Imaging device, imaging system, imaging device control method, program, and storage medium |
US8724919B2 (en) * | 2012-09-21 | 2014-05-13 | Eastman Kodak Company | Adjusting the sharpness of a digital image |
US8928772B2 (en) * | 2012-09-21 | 2015-01-06 | Eastman Kodak Company | Controlling the sharpness of a digital image |
WO2014050188A1 (ja) * | 2012-09-27 | 2014-04-03 | Fujifilm Corporation | Imaging device and image processing method |
US20150381965A1 (en) * | 2014-06-27 | 2015-12-31 | Qualcomm Incorporated | Systems and methods for depth map extraction using a hybrid algorithm |
- 2013
  - 2013-04-23 DE DE112013004760.8T patent/DE112013004760B4/de not_active Expired - Fee Related
  - 2013-04-23 WO PCT/JP2013/061842 patent/WO2014050189A1/ja active Application Filing
  - 2013-04-23 JP JP2014538218A patent/JP5746795B2/ja active Active
  - 2013-04-23 CN CN201380049879.2A patent/CN104685863B/zh active Active
- 2015
  - 2015-03-18 US US14/661,361 patent/US9363430B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
JP5746795B2 (ja) | 2015-07-08 |
CN104685863B (zh) | 2017-11-03 |
DE112013004760T5 (de) | 2015-08-13 |
JPWO2014050189A1 (ja) | 2016-08-22 |
US9363430B2 (en) | 2016-06-07 |
CN104685863A (zh) | 2015-06-03 |
DE112013004760B4 (de) | 2017-10-19 |
US20150195447A1 (en) | 2015-07-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6033454B2 (ja) | Image processing device, imaging device, image processing method, and image processing program | |
JP6031587B2 (ja) | Imaging device, signal processing method, and signal processing program | |
JP6086975B2 (ja) | Image processing device, imaging device, image processing method, and image processing program | |
JP5901801B2 (ja) | Image processing device, imaging device, program, and image processing method | |
US9800774B2 (en) | Image capture device with restoration processing and image restoration processing method | |
JP6000446B2 (ja) | Image processing device, imaging device, image processing method, and image processing program | |
JP5746794B2 (ja) | Imaging device and image processing method | |
WO2014045740A1 (ja) | Image processing device, imaging device, image processing method, and image processing program | |
JP5746795B2 (ja) | Imaging device and image processing method | |
WO2014171423A1 (ja) | Imaging device, calibration system, calibration method, and program | |
JP6171105B2 (ja) | Imaging device and focusing control method | |
JP6379307B2 (ja) | Imaging device, focusing control method, and focusing control program | |
JP5972485B2 (ja) | Image processing device, imaging device, image processing method, and image processing program | |
JP5901780B2 (ja) | Image processing device, imaging device, image processing method, and image processing program | |
WO2013145887A1 (ja) | Imaging device and imaging method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13842904 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2014538218 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 112013004760 Country of ref document: DE Ref document number: 1120130047608 Country of ref document: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 13842904 Country of ref document: EP Kind code of ref document: A1 |