CN102483508B - Imaging device and imaging method - Google Patents

Imaging device and imaging method

Info

Publication number
CN102483508B
CN102483508B (application CN201080038527.3A)
Authority
CN
China
Prior art keywords
motion
imaging device
subject
imaging
focusing position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201080038527.3A
Other languages
Chinese (zh)
Other versions
CN102483508A (en)
Inventor
伊藤圭
二矢川和也
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ricoh Co Ltd
Original Assignee
Ricoh Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ricoh Co Ltd filed Critical Ricoh Co Ltd
Publication of CN102483508A publication Critical patent/CN102483508A/en
Application granted granted Critical
Publication of CN102483508B publication Critical patent/CN102483508B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/667Camera operation mode switching, e.g. between still and video, sport and normal or high- and low-resolution modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/675Focus control based on electronic image sensor signals comprising setting of focusing regions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/2628Alteration of picture size, shape, position or orientation, e.g. zooming, rotation, rolling, perspective, translation
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2101/00Still video cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/144Movement detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automatic Focus Adjustment (AREA)
  • Studio Devices (AREA)
  • Focusing (AREA)

Abstract

An imaging device is configured to include an imaging lens, an imaging unit which acquires image data based on an optical image of a subject received via the imaging lens, a motion detector which detects a motion of the subject based on image data sequentially obtained from the imaging unit, a focus detector which calculates focus position data based on the image data obtained from the imaging lens when the motion detector detects a motion of the subject, and an in-focus position estimating unit which estimates an in-focus position based on the calculated focus position data.

Description

Imaging device and imaging method
Cross-reference to related applications
This application is based on and claims priority from Japanese Patent Application No. 2009-172558, filed on July 23, 2009, and Japanese Patent Application No. 2010-45936, filed on March 2, 2010, the entire disclosures of which are incorporated herein by reference.
Technical field
The present invention relates to an imaging device having an autofocus function capable of focusing on a subject even when the subject moves fast, and to an imaging method using the imaging device.
Background technology
An imaging device such as a digital camera is generally incorporated with an automatic focus (AF) unit to focus on a subject automatically. For example, a so-called hill-climbing autofocus control method is known for the AF unit (for example, as disclosed in Japanese Examined Patent Publication No. 39-5265 (Reference 1)). In hill-climbing autofocus control, the subject is focused based on an AF evaluation value indicating the degree of focus, calculated by integrating the luminance differences of neighboring pixels contained in a video signal, where the video signal is output from an image sensor according to an optical image of the subject received via an imaging lens.
When the subject is in focus, the edges of the subject image are sharp and clear, and when the subject is out of focus, the edges are blurred. Accordingly, the luminance differences between neighboring pixels of the video signal are larger in the in-focus state than in the out-of-focus state. That is, the AF evaluation value is maximal in the in-focus state.
The AF unit is configured to acquire the video signal of the subject image at predetermined moments while moving the imaging lens, calculate an AF evaluation value from the video signal at each moment, and focus on the subject by automatically moving the imaging lens to the position at which the video signal with the maximal AF evaluation value was acquired. Thus, by hill-climbing autofocus control, the imaging lens is automatically moved to the in-focus position by detecting the maximum of the AF evaluation values calculated at predetermined moments while the lens is moved.
Note that the position of the imaging lens in the in-focus state is referred to as the in-focus position, and the range over which the imaging lens is moved to calculate the AF evaluation values is referred to as the focus search range.
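As a rough illustration (not part of the patent text), the hill-climbing search described above can be sketched as follows; the function names and the toy AF-value model are assumptions introduced only for this example:

```python
def hill_climb_af(lens_positions, af_value_at):
    """Move the lens through the search range, record the AF evaluation
    value at each position, and return the position where it peaks."""
    best_pos, best_val = None, float("-inf")
    for pos in lens_positions:          # sampled while the lens moves
        val = af_value_at(pos)          # AF evaluation value for this frame
        if val > best_val:
            best_pos, best_val = pos, val
    return best_pos                     # taken as the in-focus position

# Toy model: the AF evaluation value peaks near lens position 42.
positions = range(0, 100, 5)
print(hill_climb_af(positions, lambda p: -(p - 42) ** 2))  # → 40
```

With a 5-step sampling interval the search stops at 40, the sampled position closest to the true peak, which is exactly the coarse-versus-fine sampling trade-off discussed below.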
Japanese Patent No. 3851027 (Reference 2) discloses a more accurate, higher-speed hill-climbing autofocus control, comprising a first mode that calculates AF evaluation values at fine intervals and a second mode that samples AF evaluation values at coarse intervals until the imaging lens comes near the in-focus position, calculating at fine intervals once the imaging lens is near the in-focus position, for example. By using the first and second modes selectively, the autofocus operation can be performed with high accuracy and the subject can be brought into focus quickly.
Further, Japanese Laid-Open Patent Publication No. 2008-58559 (Reference 3) discloses an AF control method that refocuses after the subject has once been focused, storing the subject image at the previous in-focus position, comparing the previous and current images to calculate a matching degree, and reducing the focus search range when the matching degree is within a predetermined range, for example.
However, the AF control method disclosed in Reference 2 has a problem: when the in-focus subject moves, performing fine sampling after coarse sampling takes time, so the AF control cannot be performed fast enough to follow the movement. That is, motion of the subject during the autofocus operation causes the autofocus to be repeated, and the subject may fail to be captured in focus. To solve this problem, a new AF control is needed that follows the motion of the subject image obtained via the imaging lens by controlling the autofocus of the imaging lens within a fine range.
The AF control method disclosed in Reference 3 cannot be effective unless the subject is captured at the in-focus position. Since it is not configured to follow the subject over time or as the subject changes, it does not work unless the subject is in focus, and it takes much time to complete the AF operation. In addition, the focus search range is reduced uniformly irrespective of the lens position, so the AF control takes more time as the distance to the subject changes. To solve this problem, a new AF control is needed that can estimate the in-focus position by reducing the focus search range and perform autofocus within a fine search range even when the subject is not in focus, and that can change the focus search range according to whether the imaging lens is on the telephoto side or the wide-angle side for autofocus.
Summary of the invention
An object of the present invention is to provide an imaging device that can estimate the in-focus position by moving the imaging lens over a small search range to perform autofocus when the subject image obtained via the imaging lens moves fast. A further object is to provide an imaging device that improves the estimation accuracy of the in-focus position by changing the focus search range according to the position of the imaging lens when performing autofocus while the subject moves fast.
In one aspect of the present invention, an imaging device is configured to include: an imaging lens; an imaging unit which acquires image data based on an optical image of a subject received via the imaging lens; a motion detector which detects a motion of the subject from image data sequentially obtained from the imaging unit; a focus detector which, when the motion detector detects a motion of the subject, calculates focus position data based on the image data obtained via the imaging lens; and an in-focus position estimating unit which estimates an in-focus position based on the calculated focus position data.
Preferably, the in-focus position estimating unit is configured to set at least one of a drive start position and a drive direction of the imaging lens based on the focus position data, so that the imaging lens approaches the in-focus position.
Preferably, the focus position data is a result of a smoothed difference operation based on the AF evaluation values calculated from the image data.
Preferably, the AF evaluation value is obtained by integrating the luminance differences of neighboring pixels composing the image data.
Preferably, the smoothed difference operation calculates the sum of values obtained by weighted integration of the differences between neighboring AF evaluation values; and the weighting coefficients used in the weighted integration are set so that the larger the difference between AF evaluation values, the larger the weighting coefficient.
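The patent text above does not give the weighting formula itself, but one literal reading, in which each difference between neighboring AF evaluation values is weighted by a coefficient that grows with the magnitude of that difference, could be sketched as follows; the weight `1 + |d|` is purely a placeholder assumption, not the patent's scheme:

```python
def smoothed_difference(af_values):
    """Weighted sum of differences between AF evaluation values at
    neighbouring lens positions. The weighting coefficient grows with
    the magnitude of the difference, per the passage above; the exact
    growth law (here 1 + |d|) is an assumption."""
    total = 0.0
    for a, b in zip(af_values, af_values[1:]):
        d = b - a
        w = 1.0 + abs(d)      # hypothetical: larger difference -> larger weight
        total += w * d
    return total

print(smoothed_difference([10, 12, 15, 14]))  # → 16.0
```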
Preferably, the in-focus position estimating unit is configured to move the imaging lens to the drive start position after estimating the in-focus position.
Preferably, when the in-focus position cannot be estimated, the in-focus position estimating unit is configured to change the drive start position of the imaging lens and calculate the focus position data again.
Preferably, the focus detector includes a drive range changing unit configured to change the drive range of the imaging lens according to a predetermined condition.
Preferably, the predetermined condition is the position of the imaging lens when the focus detector starts operating.
Preferably, the predetermined condition is the shooting mode when the focus detector starts operating.
Preferably, the predetermined condition is the focal length when the focus detector starts operating.
In another aspect of the present invention, an imaging method using an imaging device is provided. The imaging device includes: an imaging lens; an imaging unit which acquires image data based on an optical image of a subject received via the imaging lens; a motion detector which detects a motion of the subject from image data sequentially obtained from the imaging unit; a focus detector which, when the motion detector detects a motion of the subject, calculates focus position data based on the image data obtained via the imaging lens; and an in-focus position estimating unit which estimates an in-focus position based on the calculated focus position data. The method comprises the steps of: detecting a motion of the subject with the motion detector; driving, with the focus detector and according to the motion of the subject detected by the motion detector, the imaging lens in a predetermined direction by a predetermined amount from a predetermined position to acquire image data, and obtaining focus position data based on the acquired image data; and estimating, with the in-focus position estimating unit, the in-focus position of the imaging lens based on the focus position data.
Brief description of the drawings
These and other objects, features and advantages of the present invention will become apparent from the following detailed description with reference to the accompanying drawings.
Fig. 1 is a front view of an example of the imaging device according to the present invention;
Fig. 2 is a top view of an example of the imaging device according to the present invention;
Fig. 3 is a rear view of an example of the imaging device according to the present invention;
Fig. 4 is a functional block diagram of an example of the imaging device according to the present invention;
Fig. 5 shows the autofocus area of the imaging device according to the present invention;
Fig. 6 is a flowchart of the pre-autofocus operation of the imaging device according to the present invention;
Fig. 7 is a timing chart showing the timing at which the imaging device according to the present invention acquires focus position data;
Fig. 8 is a flowchart of the detailed pre-autofocus operation;
Fig. 9 is a flowchart of the detailed pre-autofocus operation;
Figs. 10A, 10B are graphs showing examples of changes in the AF evaluation values obtained in the autofocus operation of the imaging device according to the present invention;
Figs. 11A, 11B are graphs showing examples of changes in the smoothed difference values calculated in the autofocus operation of the imaging device according to the present invention;
Fig. 12 is a flowchart of the detailed pre-autofocus operation;
Fig. 13 is a graph illustrating the in-focus position estimation in the pre-autofocus operation;
Fig. 14 is a graph illustrating the in-focus position estimation in the pre-autofocus operation;
Fig. 15 is a graph illustrating the in-focus position estimation in the pre-autofocus operation;
Fig. 16 is a flowchart of the detailed pre-autofocus operation;
Fig. 17 shows an example of the drive start position and drive range in the pre-autofocus operation;
Fig. 18 shows another example of the drive start position and drive range in the pre-autofocus operation;
Fig. 19 is a flowchart of another example of the pre-autofocus operation of the imaging device according to the present invention;
Fig. 20 is a flowchart of the detailed pre-autofocus operation;
Fig. 21 is a flowchart of the detailed pre-autofocus operation;
Fig. 22 is a flowchart of the detailed pre-autofocus operation;
Fig. 23 is a flowchart of the detailed pre-autofocus operation;
Fig. 24 is a flowchart of another example of the detailed pre-autofocus operation;
Figs. 25A, 25B show examples of the display of the LCD in shooting mode;
Fig. 26 shows an example of the shooting modes and focal lengths of the imaging lens of the imaging device according to the present invention;
Fig. 27 is a flowchart of another example of the detailed pre-autofocus operation;
Fig. 28 is a flowchart showing an example of the detailed motion detection in the pre-autofocus operation;
Fig. 29 is a flowchart showing another example of the detailed motion detection in the pre-autofocus operation;
Fig. 30 is a flowchart showing another example of the detailed motion detection in the pre-autofocus operation; and
Fig. 31 shows an example of the focal lengths and drive amounts of the imaging lens of the imaging device according to the present invention.
Embodiment
Hereinafter, embodiments of the imaging device according to the present invention will be described in detail with reference to the accompanying drawings.
Figs. 1 to 3 show the appearance of an imaging device (for example, a digital camera) according to one embodiment of the present invention from the front, the top, and the back, respectively. In Fig. 1, a strobe light unit 3, an optical finder 4, a remote-control light receiver 6, and a lens barrel unit (imaging lens) 7 including a zoom lens and a focus lens are provided on the front of the camera body CB serving as the housing of the imaging device. A lid of a memory card/battery chamber 2 is provided on a side of the camera body CB.
As shown in Fig. 2, a release switch SW1, a mode dial SW2, and a sub liquid crystal display (LCD) 1 are provided on the top surface of the camera body CB.
In Fig. 3, an optical finder 4, an autofocus light-emitting diode (LED) 8, a strobe LED 9, an LCD 10, a power switch SW13, a wide-angle zoom switch SW3, a telephoto zoom switch SW4, a self-timer set/reset switch SW5, a menu switch SW6, an up/strobe switch SW7, a right switch SW8, a display switch SW9, a down/macro switch SW10, a left/image check switch SW11, an OK switch SW12, and a quick access switch SW13 are provided on the back of the camera body CB.
The functional blocks of the imaging device according to one embodiment of the present invention are described below with reference to Fig. 4. The operation (functions) of the imaging device is controlled by a processor 104, which is a digital signal processing integrated circuit (IC). The processor 104 includes a first charge-coupled device (CCD1) signal processing block 1041, a second CCD (CCD2) signal processing block 1042, a CPU block 1043, a local SRAM (static random access memory) 1044, a USB (universal serial bus) block 1045, a serial block 1046, a JPEG codec block 1047, a resize block 1048, a TV signal display block 1049, and a memory card controller block 10410. These blocks are connected to one another by a bus.
An SDRAM 103 (synchronous dynamic random access memory), a RAM 107, an internal memory 120, and a ROM 108 storing a control program are provided outside the processor 104 and connected to it via a bus. The SDRAM 103 stores the RAW-RGB image data, YUV image data, and JPEG image data of the captured subject image (collectively referred to as image data).
The lens barrel unit 7 includes a zoom optical system 71 with a zoom lens 71a, a focus optical system 72 with a focus lens 72a, an aperture stop unit 73 with an aperture stop 73a, and a mechanical shutter unit 74 with a mechanical shutter 74a. The zoom optical system 71, the focus optical system 72, the aperture stop unit 73, and the mechanical shutter unit 74 are driven by a zoom motor 71b, a focus motor 72b, an aperture stop motor 73b, and a mechanical shutter motor 74b, respectively. These motors are driven by a motor driver 75 controlled by the CPU block 1043 of the processor 104. The zoom motor 71b and the focus motor 72b move the imaging lens.
The zoom lens 71a and the focus lens 72a form the imaging lens, which focuses the subject image onto the imaging surface of a CCD 101. The CCD 101 is an image sensor that converts the subject image into an electrical image signal and outputs the image signal to an F/E (front end) IC 102. The F/E-IC 102 includes a correlated double sampling (CDS) unit 1021, an automatic gain controller (AGC) 1022, and an analog-to-digital (A/D) converter 1023, each performing predetermined processing on the image signal. It also includes a timing generator (TG) 1024, to which vertical drive (VD) and horizontal drive (HD) signals are input from the first CCD signal processing block 1041 of the processor 104. The F/E-IC 102 processes the image signal in synchronization with the VD/HD signals via the TG 1024.
The F/E-IC 102 converts the electrical image signal from the CCD 101 into a digital signal and outputs it to the first CCD signal processing block 1041. The first CCD signal processing block 1041 performs signal processing such as white balance adjustment and gamma adjustment on the digital signal, stores it in the SDRAM 103 as image data, and outputs the VD/HD signals. The CCD 101, the F/E-IC 102, the first CCD signal processing block 1041, and the CPU block 1043 constitute the imaging unit of the imaging device.
The CPU block 1043 of the processor 104 is configured to control audio recording by an audio recording circuit 1151. Audio is converted into an audio recording signal by a microphone 1153, amplified by a microphone amplifier 1152, and recorded in the internal memory 120. The CPU block 1043 also controls the operation of an audio reproducing circuit 1161, which is configured to read audio data from the internal memory 120, amplify it with an audio amplifier 1162, and output it from a speaker 1163. The CPU block 1043 further controls a strobe circuit 114 that causes the strobe light unit 3 to emit light, and controls a distance measuring unit (not shown).
Note that the imaging device according to one embodiment of the present invention is configured to perform the autofocus operation (described later) based on the image data obtained via the imaging lens. Therefore, the distance measuring unit is not always needed to measure the distance to the subject, and the imaging device may omit the distance measuring unit. Alternatively, the distance information obtained by the distance measuring unit can be used by the strobe circuit 114 for strobe light emission control, or supplementarily for focus control based on the captured image data.
The CPU block 1043 is connected to a sub-CPU 109 provided outside the processor 104, and the sub-CPU 109 controls the display on the sub-LCD 1 via an LCD driver 111. The sub-CPU 109 is also connected to the autofocus LED 8, the strobe LED 9, the remote-control light receiver 6, an operation key unit with the switches SW1 to SW13 (Fig. 3), and a buzzer 113.
The USB block 1045 is connected to a USB connector 122, and the serial block 1046 is connected to an RS-232C connector 1232 through a serial driver circuit 1231. The TV signal display block 1049 is connected to the LCD 10 via an LCD driver 117, and to a video jack 119 via a video amplifier 118. The memory card controller block 10410 is connected to the contact point between a memory card slot 121 and a memory card, for electrical connection with a memory card when one is inserted in the slot 121.
When the imaging device is set to a shooting mode with the mode dial SW2, the processor 104 detects the setting of the mode dial SW2 via the sub-CPU 109 and controls the motor driver 75 to move the lens barrel unit 7 to a position where shooting is possible. Further, it supplies power to the CCD 101, the F/E-IC 102, the LCD 10, and so on to start operation. Once powered on, the device starts operating in viewfinder mode.
In viewfinder mode, light from the subject is incident on the CCD 101 via the imaging lens of the lens barrel unit 7, converted into an electric signal, and output to the CDS 1021 as RGB analog signals. The RGB analog signals are then sent to the A/D converter 1023 via the AGC 1022 and converted into RGB digital signals. The digital signals are displayed on the LCD 10 or on a television via the TV signal display block 1049, the video amplifier 118, and the video jack 119.
The RGB digital signals converted by the A/D converter 1023 are converted into image data in YUV format by the second CCD signal processing block 1042 and stored in the SDRAM 103. The second CCD signal processing block 1042 converts the RGB image data into YUV image data with appropriate processing such as filtering. The CPU block 1043 reads the image data from the SDRAM 103 and sends it to the LCD 10 for display. The process from the incidence of light from the subject to the display on the LCD 10 is repeated at intervals of 1/30 second, and the display on the LCD 10 in viewfinder mode is updated every 1/30 second.
The AF operation and the automatic exposure (AE) operation of the imaging device according to one embodiment of the present invention are now described. In the AF operation, an AF evaluation value indicating the degree of focus of at least part of the image data and an AE evaluation value indicating the degree of exposure are calculated from the image data input to the first CCD signal processing block 1041 via the imaging lens. The CPU block 1043 then determines the lens position with the maximal AF evaluation value as the in-focus position and drives the focus motor 72b to move the imaging lens to the in-focus position.
The AF evaluation value is calculated from a specific area of the image data obtained via the imaging lens, referred to here as the AF area. Fig. 5 shows an example of the LCD 10 of the imaging device in viewfinder mode and of the AF area around the center of the LCD 10. At the center of the screen, the AF area is set to a size of 40% of the total number of horizontal pixels and 30% of the total number of vertical pixels composing the image data. The size of the AF area is not limited to the above example; it can be set arbitrarily according to the AF processing time and the AF accuracy. A larger AF area improves autofocus accuracy but increases the AF processing time, while a smaller AF area reduces autofocus accuracy but shortens the AF processing time.
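For illustration only, the centered 40% × 30% AF area of the Fig. 5 example can be computed as below; the function name and the return convention (top-left corner plus size, in pixels) are my own, not the patent's:

```python
def af_area(width, height, h_ratio=0.40, v_ratio=0.30):
    """Centered AF area covering 40% of the horizontal pixels and 30%
    of the vertical pixels, as in the Fig. 5 example."""
    w, h = int(width * h_ratio), int(height * v_ratio)
    x0, y0 = (width - w) // 2, (height - h) // 2
    return x0, y0, w, h   # top-left corner plus size

print(af_area(640, 480))  # → (192, 168, 256, 144)
```

Changing `h_ratio`/`v_ratio` models the accuracy-versus-time trade-off noted above: a larger area means more pixels to integrate per frame.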
The RGB digital signals are divided into multiple areas (for example, 16 horizontal × 16 vertical) to obtain the brightness data of each area. Pixels whose brightness exceeds a predetermined threshold in each area are determined as targets, and the AE evaluation value is calculated by summing their luminances and multiplying by the number of target pixels. An appropriate exposure is calculated from the brightness distribution of each area and used to correct the exposure of the next image frame.
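A minimal sketch of the per-area AE computation as literally described above; the translated text is ambiguous, so treating the value as (sum of target luminances) × (target count) is an assumption, and the threshold and sample data are invented:

```python
def ae_value(block, threshold):
    """AE evaluation for one of the 16x16 areas: pixels brighter than
    the threshold are targets; sum their luminances and multiply by
    the target count (a literal reading of the translated text)."""
    targets = [p for p in block if p > threshold]
    return sum(targets) * len(targets)

print(ae_value([10, 200, 220, 30], threshold=128))  # → 840
```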
The imaging device has multiple shooting modes, and the AF coverage range is set differently for each. For example, the AF range is 1 m to infinity in normal AF mode and 1 cm to infinity in macro AF mode. The AF mode is set with the mode dial SW2.
First embodiment
An imaging method using the imaging device according to the first embodiment of the present invention is described below with reference to Fig. 6. Fig. 6 is a flowchart of the pre-autofocus operation, comprising steps S10, S20, S30, S40, S50, S60, and S70.
According to the first embodiment, the pre-autofocus operation includes in-focus position estimation before the AF operation, so that the in-focus position can be found quickly. The pre-AF operation starts when a motion of the subject is detected from the image data obtained via the imaging lens while the imaging device is operating in viewfinder mode.
In the pre-autofocus operation, the image data used for estimating the in-focus position is acquired while the imaging lens is moved within a range narrower than that of the AF operation. Once the release switch SW1 is pressed, the AF operation determines the in-focus position based on the AF evaluation values calculated from the image data obtained via the imaging lens moving over the whole movable range. The pre-autofocus operation is described in detail below.
First, in the motion detection (motion detector) of step S10, it is determined whether a motion in the subject image is detected using the image data of the AF area (Fig. 5). A motion in the subject image is detected when the difference between image data successively obtained via the imaging lens exceeds a predetermined threshold, regardless of whether the subject or the imaging device has actually moved.
When no motion is detected in the subject image (NO in step S20), the motion detection is repeated. When a motion is detected (YES in step S20), the flow proceeds to the start position determination in step S30, where the drive start position of the imaging lens is determined. In the start position determination, the drive start position, the drive direction, and the focus search range (drive amount) of the imaging lens are set for acquiring the image data from which the focus position data is obtained. The focus search range in the pre-autofocus corresponds to a drive amount smaller than that in the AF operation.
In the focus position data acquisition (focus detector) of step S40, the focus lens 72a is moved from the set drive start position in the set drive direction to obtain the focus position data.
Then, in the in-focus position estimation (in-focus position estimating unit) of step S50, the in-focus position is estimated from the smoothed difference values calculated in the focus position data acquisition of step S40.
In step S60 (AF value storage), an AF value is obtained from the result of the in-focus position estimation and stored in the SDRAM 103 so that the imaging lens can be brought close to the in-focus position in the next pre-autofocus. The AF value indicates information on the drive start position and the drive direction.
In step S70, the imaging lens is moved to the drive start position based on the AF value stored in step S60, completing the pre-autofocus operation.
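The flow of steps S10 to S70 above can be summarized in a skeleton like the following; every callable is a placeholder standing in for the corresponding unit of the patent, not an implementation of it:

```python
def pre_autofocus(detect_motion, determine_start, acquire_focus_data,
                  estimate_in_focus, store_af_value, move_lens):
    """Skeleton of the pre-autofocus flow of Fig. 6 (steps S10-S70)."""
    while not detect_motion():          # S10/S20: repeat until motion
        pass
    start, direction, rng = determine_start()          # S30
    data = acquire_focus_data(start, direction, rng)   # S40
    in_focus = estimate_in_focus(data)                 # S50
    af_value = store_af_value(in_focus)                # S60
    move_lens(af_value)                                # S70: go to drive start
    return in_focus
```

Wiring in trivial stand-ins (e.g. a motion detector that fires on the second frame and an estimator that takes the maximum of the acquired data) exercises the control flow without modeling any optics.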
Next, each step of the pre-autofocus operation is described in detail. First, the motion detection (step S10) is described. The imaging device acquires image data from the subject image at predetermined moments. In response to a single VD signal, the focus lens 72a serving as the imaging lens is driven by a preset amount. With a pulse motor used as the focus motor 72b, for example, a predetermined number of pulses corresponds to the lens drive amount. Drive pulses are supplied in synchronization with the falling edge of the VD signal pulse to drive the focus lens 72a. At the next falling edge of the VD signal pulse, the focus lens 72a is driven again by the predetermined drive amount. The focus lens 72a is thus driven in synchronization with the VD signal (that is, with the frame period).
Fig. 7 is a timing chart of the VD signal, the drive timing of the focus lens synchronized with the VD signal, the discharge pulse (SUB) timing of the electronic shutter, and the exposure timing. As shown in Fig. 7, each time a single VD signal is generated, two pulses for driving the focus lens 72a are generated, and the focus lens 72a is moved by the amount corresponding to the two drive pulses. In addition, triggered by the VD signal, discharge pulses (SUB pulses) are generated at predetermined times, and charge is discharged from the CCD 101 according to the number of SUB pulses to perform the exposure operation. By the exposure operation, the subject image is captured as image data. The number of drive pulses varies with the focal length and the focus lens movement amount (drive range).
In the motion detection of step S10, image data is continuously acquired in synchronization with the VD signal as described above and stored in a buffer memory (not shown) of the SDRAM 103. Motion in the subject image is detected by comparing the stored image data with the current image data, based on the integration of luminance differences. For example, the luminance difference between the image data last stored in the buffer memory and the currently acquired image data is calculated, and the stored image data is then overwritten with the current image data. The difference operation is thus repeated with the image data acquired at the next timing.
The difference operation integrates the luminance differences between neighboring pixels of the image data in the horizontal and vertical directions, compares the result with the result obtained at the previous timing, and combines the horizontal and vertical differences to calculate a motion detection estimate Q. The motion detection estimate Q is calculated at the timing at which the VD signal is generated.
Arithmetic expression (1) for the difference operation in the motion detection is as follows, where H(v) is the integration of the luminance differences of neighboring pixels in the horizontal direction at the current timing:

H(v) = Σ_{i=Hstart}^{m−1} |D(i,v) − D(i+1,v)|    (1)

where D(i,v) denotes the pixel at coordinate (i,v) in the AF region, Hstart is the horizontal start position of the AF region, and m is the horizontal extent of the AF region.
Arithmetic expression (2) is as follows, where V(h) is the integration of the luminance differences of neighboring pixels in the vertical direction at the current timing:

V(h) = Σ_{j=Vstart}^{n−1} |D(h,j) − D(h,j+1)|    (2)

where D(h,j) denotes the pixel at coordinate (h,j) in the AF region, Vstart is the vertical start position of the AF region, and n is the vertical extent of the AF region.
The sum Q(t) of the differences between the results H′(v), V′(h) calculated at the previous timing and the results H(v), V(h) at the current timing is expressed by the following expression (3):

Q(t) = Σ_{v=Vstart}^{n−1} |H(v) − H′(v)| + Σ_{h=Hstart}^{m−1} |V(h) − V′(h)|    (3)
When the calculated motion detection estimate Q(t) is equal to or larger than a predetermined threshold, it is determined that motion in the subject image has been detected ("Yes" in step S20).
The differences calculated by the above expressions may exceed the predetermined threshold owing to changes in the image data under bright or dark conditions or changes caused by camera shake. To avoid erroneous motion detection, the threshold is therefore preferably set to an allowable value.
The imaging device according to the first embodiment of the present invention is configured to determine the motion of the subject by the above motion detection. However, the present invention should not be limited thereto. Other detection methods, such as extraction using histogram differences, can be used. Alternatively, optical flow can be calculated from the differences in the image data, as long as the processing speed is high enough.
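Expressions (1) to (3) can be sketched as follows. This is an illustrative NumPy rendering, not the embodiment's firmware; the function names, the array layout (rows = vertical, columns = horizontal, restricted to the AF region), and the use of `numpy.diff` are assumptions.

```python
import numpy as np

def profiles(img):
    """Row profile H(v) and column profile V(h) of an AF-region luminance array.

    H(v) integrates |D(i,v) - D(i+1,v)| over horizontal neighbours in row v;
    V(h) integrates |D(h,j) - D(h,j+1)| over vertical neighbours in column h.
    """
    img = img.astype(np.int64)
    h = np.abs(np.diff(img, axis=1)).sum(axis=1)  # one value per row v
    v = np.abs(np.diff(img, axis=0)).sum(axis=0)  # one value per column h
    return h, v

def motion_metric(prev, curr):
    """Motion-detection estimate Q(t) of expression (3): combined change of
    the row and column profiles between the previous and current frames."""
    h0, v0 = profiles(prev)
    h1, v1 = profiles(curr)
    return int(np.abs(h1 - h0).sum() + np.abs(v1 - v0).sum())
```

Motion would then be flagged when `motion_metric(prev, curr)` meets or exceeds the detection threshold.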
The drive start position determination in step S30 is described below with reference to Fig. 8. First, in step S301, it is determined whether a pre-autofocus operation has been performed and an AF value is stored in the SDRAM 103. When an AF value is stored ("Yes" in step S301), the AF value is read from the SDRAM 103 in step S302 to set the drive start position of the focus lens 72a based on the AF value.
When no AF value is stored in the predetermined storage area of the SDRAM 103 ("No" in step S301), in step S303 the drive start position of the focus lens 72a is set to a position moved by a predetermined amount toward infinity from the current position, and the lens is driven from the infinity side to the near side. This is because the imaging lens is moved from the infinity side to the near side to obtain focusing position estimation data based on AF evaluation values.
In step S304, the drive direction and the focus search range are set based on the read AF value, and in step S305 the focus lens 72a is moved to the drive start position, which completes the drive start position determination (S30).
The focal position data acquisition in step S40 is described below with reference to Fig. 9. The focal position data are smoothed difference values calculated from AF evaluation values. First, the calculation of the AF evaluation values and the smoothed difference values in the imaging device according to the first embodiment is described.
The smoothed difference value Y[0] for the focus lens 72a at the current position is calculated by the following expression (4):

Y[0] = Σ_{i=1}^{a} (X[i] − X[−i]) × b_i    (4)

where X[0] is the AF evaluation value calculated from the image data obtained with the focus lens 72a at the current position, X[−i] is the AF evaluation value calculated from the image data i frames before the current image data, X[i] is the AF evaluation value calculated from the image data i frames after the current image data, and b_i is the coefficient for the AF evaluation values X[−i] to X[i].
Using the current AF evaluation value X[0] and the three AF evaluation values before and after it, the smoothed difference value is calculated as Y[0] = (X[1] − X[−1]) × 1 + (X[2] − X[−2]) × 2 + (X[3] − X[−3]) × 3. Here, the weighting coefficients (b_i = 1, 2, 3, ...) are set to smaller values for AF evaluation values closer to the current value X[0] (for example, X[1]) and to larger values for those farther from it (for example, X[3]). AF evaluation values less correlated with the current value X[0] are thus weighted with larger coefficients b_i. The specific coefficient values are not limited to the above example.
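Expression (4) with the example coefficients b_i = 1, 2, 3 can be sketched as follows; this is an illustrative rendering, and the function name and list-based interface are assumptions:

```python
def smoothed_difference(x, coeffs=(1, 2, 3)):
    """Smoothed difference value Y[0] per expression (4).

    x: AF evaluation values X[-a] .. X[a] centred on the current value X[0],
    i.e. a sequence of length 2*a + 1.  coeffs are b_1 .. b_a, with larger
    weights for values farther from X[0] (the text's example b_i = 1, 2, 3).
    """
    a = len(coeffs)
    assert len(x) == 2 * a + 1, "need a values on each side of X[0]"
    centre = a  # index of X[0]
    return sum((x[centre + i] - x[centre - i]) * b
               for i, b in enumerate(coeffs, start=1))
```

For monotonically rising AF evaluation values the result is positive; for a symmetric peak centred on X[0] it is zero, which is what the reversal-point search below relies on.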
The relation between the drive of the focus lens 72a and the smoothed difference values is described with reference to Figs. 10A, 10B and Figs. 11A, 11B. In the graphs of Figs. 10A, 10B and Figs. 11A, 11B, the abscissa represents the total drivable range of the focus lens 72a, and the ordinate represents the AF evaluation values calculated from the image data obtained at each lens position.
Figs. 10A and 10B are examples of AF evaluation values of subject images captured at different light amounts, LV 10 and LV 8, respectively. Fig. 10B shows AF evaluation values calculated from image data captured under a darker condition (smaller light amount). At a smaller light amount, the luminance differences between neighboring pixels of the subject image are very small and strongly affected by noise. As a result, the AF evaluation values fluctuate as shown in Fig. 10B and show multiple peaks. From the graph of Fig. 10A with a single peak (maximum), the peak can easily be determined to be the focusing position, whereas from the graph of Fig. 10B with multiple peaks, the focusing position cannot be determined simply.
Fig. 11A shows an example of the change in the smoothed difference values calculated from the AF evaluation values in Fig. 10A, and Fig. 11B shows an example for the AF evaluation values in Fig. 10B. As shown therein, the smoothed difference value changes as the focus lens 72a moves, and its sign reverses upon passing a certain point. The AF evaluation value obtained at this reversal point is maximal, and the lens position corresponding to the maximal AF evaluation value is the focusing position. In other words, the lens position at which the smoothed difference value is zero is the focusing position. Thus, by using the smoothed differences, the focusing position can be determined accurately even when the AF evaluation values fluctuate as shown in Fig. 10B.
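The reversal-point (zero-crossing) search just described can be sketched as follows, assuming the smoothed difference values are sampled at successive lens positions; the function name and the tie-breaking rule at the sign change are assumptions:

```python
def reversal_index(y):
    """Index of the zero crossing (reversal point) of smoothed differences.

    y: smoothed difference values sampled at successive focus-lens positions.
    Returns the index of the value nearest zero at a sign change, or None
    if the sign never reverses within the sampled range.
    """
    for i in range(len(y) - 1):
        if y[i] == 0:
            return i
        if (y[i] > 0) != (y[i + 1] > 0):  # sign flips between i and i+1
            return i if abs(y[i]) <= abs(y[i + 1]) else i + 1
    return None
```

A `None` result corresponds to the case discussed next, where the narrow pre-autofocus search range does not contain the reversal point.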
Referring back to Fig. 9, in step S401 the flow waits for the fall of the VD signal. Once the fall is detected, the focus motor 72b is driven in step S402 at the predetermined pulse rate for driving the focus lens 72a, and in step S403 image data is acquired to calculate the AF evaluation value.
In step S404, it is determined whether the focus lens 72a has reached the preset end position of the focus search range. Steps S401 to S404 are repeated until the focus lens 72a reaches the end position ("No" in step S404). When the focus lens 72a is at the end position ("Yes" in step S404), the smoothed difference values are calculated in step S405 using the acquired AF evaluation values.
The focusing position estimation in step S50 is now described in detail with reference to Fig. 12. First, the estimation of the focusing position is outlined. When the focus search range is set to the total drivable range of the focus lens 72a, the calculated AF evaluation values include the maximum as shown in Fig. 13, and the lens position corresponding to the reversal point of the smoothed difference values as shown in Fig. 14 is determined to be the focusing position.
In pre-autofocus, however, the focus search range is set to a small drive range of the focus lens that does not change the angle of view, so the reversal point of the smoothed difference values calculated from the AF evaluation values cannot always be determined within this small drive range. That is, in step S40, the smoothed difference values are calculated in any one of the regions A, B, C in Fig. 15.
Then, in the focusing position estimation in step S50, it is first determined whether a value close to zero exists among the calculated smoothed difference values (step S501). When one exists ("Yes" in step S502), the flow proceeds to step S503 (boundary value determination).
The boundary value determination in step S503 determines whether the smoothed difference values around the near-zero value monotonically fall or rise relative to it. Specifically, when the smoothed difference value obtained before the near-zero value is smaller than the near-zero value and the smoothed difference value obtained after it is larger, the smoothed difference values are determined to rise monotonically. Conversely, when the smoothed difference value obtained before the near-zero value is larger than the near-zero value and the smoothed difference value obtained after it is smaller, the smoothed difference values are determined to fall monotonically.
When the smoothed difference values are determined to rise or fall monotonically ("Yes" in S503), the focus lens position with the maximal AF evaluation value must be among the positions of the focus lens 72a corresponding to these smoothed difference values (the near-zero value and the values obtained before and after it). That is, these smoothed difference values are considered to lie in the region A in Fig. 15, so that the focusing position of the focus lens can be detected (step S504).
When a smoothed difference value close to zero exists ("Yes" in S502) but the values do not rise or fall monotonically ("No" in S503), it is determined that detection of the focusing position is infeasible, and the focusing position estimation is completed in step S509.
When there is no smoothed difference value close to zero, the focusing position is estimated from the calculated smoothed difference values. Since the focus search range is set to a small region in the pre-autofocus operation as described above, the focusing position cannot be specified from the calculated smoothed difference values alone.
When there is no smoothed difference value close to zero ("No" in step S502), it is determined in step S505 whether the calculated smoothed difference values are all positive. When all the values are positive ("Yes" in step S505), it can be estimated in step S506 that the smoothed difference values calculated in the focal position data acquisition of step S40 belong to the region B in Fig. 15. The calculated values being in the region B means that there is no focusing position within the focus search range over which the focus lens 72a moved, but that it will be found by moving the focus lens 72a further in the same drive direction.
When all the calculated smoothed difference values are found to be negative ("Yes" in step S507), it can be estimated in step S508 that the values obtained in step S40 belong to the region C in Fig. 15. This means that there is no focusing position within the focus search range over which the focus lens 72a moved, but that it will be found by moving the focus lens 72a in the direction opposite to the previous drive direction.
When there is no smoothed difference value close to zero ("No" in S502) and the calculated values are neither all positive ("No" in step S505) nor all negative ("No" in step S507), the estimation of the focusing position is infeasible, and the focusing position estimation ends in step S509. The flow then proceeds to the AF value storage in step S60.
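The region classification of steps S501 to S509 can be sketched as follows; the function name, the `eps` tolerance standing in for "close to zero", and the return convention are assumptions:

```python
def classify_region(y, eps):
    """Classify smoothed difference values into region A, B or C (Fig. 15).

    A: a near-zero value bracketed by a monotonic rise or fall -> the
       focusing position lies inside the searched range.
    B: all values positive -> focusing position lies ahead, same direction.
    C: all values negative -> focusing position lies behind, reverse direction.
    None: estimation infeasible.
    """
    near = [i for i, v in enumerate(y) if abs(v) <= eps]
    for i in near:
        if 0 < i < len(y) - 1:
            before, after = y[i - 1], y[i + 1]
            if before < y[i] < after or before > y[i] > after:
                return "A"  # monotonic through the near-zero value
    if near:
        return None         # near zero but not monotonic (S503 "No")
    if all(v > eps for v in y):
        return "B"
    if all(v < -eps for v in y):
        return "C"
    return None             # mixed signs, no near-zero value
```

The "A"/"B"/"C" results map onto the three branches of the AF value storage described in step S60 below.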
The AF value storage in step S60 is described in detail below with reference to Fig. 16. When the focusing position is estimated in step S50 to be in the region A ("Yes" in step S601), in step S602 the drive start position is set to a position moved by a predetermined amount from the focusing position toward the infinity side and is stored in the SDRAM 103. Then, in step S603, the focus search range is set to twice the range normally set in pre-autofocus and is stored in the SDRAM 103. The stored drive start position and focus search range serve as the AF value to be read from the SDRAM 103 in the next pre-autofocus operation or AF operation. The focus search range can be changed according to conditions such as the focal length and environmental conditions (bright or dark). The drive range can be changed according to these conditions (by a drive range changing unit) to change the focus search region. This makes it possible to perform the next pre-autofocus operation within a limited focus search range including the focusing position, increasing the speed of the AF operation.
When the focusing position is estimated not to be in the region A ("No" in step S601) but to be in the region B ("Yes" in step S604), in step S605 the end position of the focus search range in step S40 is set as the next drive start position of the focus lens 72a and is stored in the SDRAM 103. In step S606, the drive direction is set from the infinity side toward the near side, and the focus search range is set to a predetermined range relative to the drive start position; these are also stored in the SDRAM 103.
As shown in Fig. 17, for the smoothed difference values in the region B, the focusing position does not appear in the region on the infinity side of the pre-autofocus area (focus search region) but in the region on the near side of the end position of the pre-autofocus. Accordingly, the focus lens 72a can be driven from the end position of the focus search range over which the focusing position estimation has been completed. This limits the movement range of the focus lens 72a in the AF operation, increasing the speed of the AF operation.
When the focusing position is estimated to be in the region C ("Yes" in step S607) rather than the region A ("No" in step S601) or the region B ("No" in step S604), in step S608 the start position of the focus search range in step S40 is set as the next drive start position of the focus lens 72a and is stored in the SDRAM 103. In step S609, the drive direction is set from the near side toward the infinity side, and the focus search range is set to a predetermined range relative to the drive start position; these are also stored in the SDRAM 103.
As shown in Fig. 18, for the smoothed difference values in the region C, the focusing position does not appear in the region on the near side of the pre-autofocus area but in the region on the infinity side of the start position of the pre-autofocus. Accordingly, the focus lens 72a can be driven toward the infinity side from the start position of the focus search range over which the focusing position estimation has been completed. This limits the movement range of the focus lens 72a in the AF operation, increasing the speed of the AF operation.
Further, when the focusing position is estimated to be in none of the regions A, B, C in steps S601, S604, S607, it is determined that the estimation of the focusing position is infeasible. In step S610, the next drive start position of the focus lens 72a is set to the infinity end, and in step S611, the focus search range is set to the entire drive range of the focus lens 72a.
As described above, even when the focusing position cannot be detected at once, the imaging device according to the first embodiment can gradually approach the focusing position by repeating the pre-autofocus operation. Accordingly, since the actual drive range (focus search range) of the imaging lens is limited in advance, the focusing position can be determined rapidly in the AF operation, which starts upon a half-press of the release switch SW1 and in which the imaging lens would otherwise be driven over its entire drive range.
Second embodiment
An imaging method using the imaging device according to the second embodiment of the present invention is described below with reference to Fig. 19, a flowchart of the pre-autofocus operation. In the present embodiment, steps identical to those in the first embodiment are given the same numbers, and their description is omitted.
First, in the motion detection of step S10, it is determined whether motion in the subject image is detected using the image data in the AF region (Fig. 5). When no motion in the subject image is detected ("No" in step S20), the pre-autofocus operation is completed. The imaging device is always ready to perform pre-autofocus during the shooting mode, so that it does not perform pre-autofocus while no motion is detected, and restarts pre-autofocus once motion is detected.
When motion in the subject image is detected ("Yes" in step S20), the flow proceeds to the drive start position determination in step S30, in which the drive start position of the imaging lens for the pre-autofocus is determined. In step S40 (focal position data acquisition), the focus lens 72a is moved from the set drive start position over the focus search range to acquire focal position data. Then, in step S50 (focusing position estimation), the focusing position is estimated from the smoothed difference values calculated in the focal position data acquisition of step S40. In step S60 (AF value storage), the AF value is stored in the SDRAM 103. The AF value indicates information on the drive start position and the drive direction of the focus search region.
In step S80, it is determined whether the focus search range of the focal position data acquisition (S40) includes the focusing position. When the focusing position is within the focus search range ("No" in step S80), the imaging lens is moved to the focusing position, and the pre-autofocus operation is completed in step S120. When the focusing position is not within the focus search range ("Yes" in step S80), the AF value stored in step S60 is read from the SDRAM 103, the AF evaluation values are calculated over the focus search range and from the drive start position indicated by the AF value, and focal position data is acquired again in step S90. The re-acquisition of focal position data in step S90 is similar to step S40.
In step S100, the focusing position is determined based on the acquired focal position data. In step S110, the AF value is determined based on the focusing position and stored in the SDRAM 103. Then, in step S120, the imaging lens is moved to the drive start position according to the AF value, which completes the pre-autofocus operation.
The focusing position determination in step S100 is now described in detail with reference to Fig. 20. In step S101, it is determined whether a smoothed difference value close to zero exists among the smoothed difference values calculated in step S90. When a near-zero value is found ("Yes" in step S102), the boundary value determination is performed in step S103.
The boundary value determination in step S103 determines whether the smoothed difference values around the near-zero value monotonically fall or rise relative to it. Specifically, when the smoothed difference value obtained before the near-zero value is smaller than the near-zero value and the smoothed difference value obtained after it is larger, the smoothed difference values are determined to rise monotonically. Conversely, when the smoothed difference value obtained before the near-zero value is larger than the near-zero value and the smoothed difference value obtained after it is smaller, the smoothed difference values are determined to fall monotonically.
When the smoothed difference values are determined to rise or fall monotonically ("Yes" in S103), the existence of the focusing position is determined in step S104, and the determination process is completed. When no smoothed difference value close to zero exists ("No" in S102), or when the values do not rise or fall monotonically ("No" in S103), the absence of the focusing position is determined, and the determination process is completed in step S105.
The AF value storage in step S110 is described below with reference to Fig. 21. First, when the focusing position has been determined in step S100 ("Yes" in step S111), in step S113 the drive start position of the imaging lens is set to a position moved by a predetermined amount from the focusing position, the focus search range is set to twice the predetermined range, and these are stored as the AF value in the predetermined storage area of the SDRAM 103. In the present embodiment, by way of example, the doubled focus search range is set so as to include the focusing position and is stored as the AF value to be used in the next pre-autofocus or AF operation. The focus search range is preferably variable according to the focal length, environmental conditions (dark or bright), and the like, and should not be limited to twice the normal (predetermined) range. Thus, owing to the limited movement range of the focus lens 72a, the next AF operation can be processed at a higher speed.
When the focusing position is not determined in the focusing determination of step S100 ("No" in step S111), the determination of the focusing position is deemed infeasible. Then, the drive start position of the imaging lens is set to the hyperfocal (over-focus) position, and the focus search range is set to the entire drive range of the imaging lens; these are stored as the AF value in the predetermined storage area of the SDRAM 103 (steps S114, S115). The drive start position is set to the hyperfocal position so that the AF operation is not performed when the focusing position cannot be determined in step S111; this is because, when a subject with low contrast is captured, the focusing position may not be determined in the AF operation even after the above pre-autofocus.
Third embodiment
An imaging method using the imaging device according to the third embodiment of the present invention is described below with reference to Fig. 22, a flowchart of the pre-autofocus operation. In the present embodiment, steps identical to those in the first embodiment are given the same numbers, and their description is omitted.
Referring to Fig. 22, in the motion detection of step S10, it is determined whether motion in the subject image is detected. When no motion in the subject image is detected ("No" in step S21), the pre-autofocus operation is completed. The imaging device is always ready to perform pre-autofocus during the shooting mode, so that it does not perform pre-autofocus while no motion is detected, and restarts pre-autofocus once motion is detected.
When motion in the subject image is detected ("Yes" in step S21), the flow proceeds to the drive start position determination in step S31, in which the drive start position of the imaging lens for the pre-autofocus is determined. In step S40 (focal position data acquisition), the focus lens 72a is moved from the set drive start position over the search region to acquire focal position data. Then, in step S50 (focusing position estimation), the focusing position is estimated from the smoothed difference values calculated in the focal position data acquisition of step S40. In step S60, the AF value is obtained from the estimated focusing position and stored in the SDRAM 103. In step S70, the imaging lens is moved to the drive start position based on the stored AF value, which completes the pre-autofocus.
The drive start position determination (step S31), which is a feature of the present embodiment, is described in detail with reference to Fig. 23. First, in step S311, the focus search range is set according to the set shooting mode and the position of the zoom lens 71a constituting the imaging lens.
Then, in step S312, it is determined whether pre-autofocus has been performed, according to the presence or absence of an AF value stored in the SDRAM 103. When pre-autofocus has been performed ("Yes" in step S312), in step S313 the AF value is read from the SDRAM 103 to set the drive start position of the focus lens 72a and the focusing position.
When pre-autofocus has not yet been performed ("No" in step S312), in step S315 the drive start position is set to a position a predetermined amount toward infinity from the current position of the focus lens 72a. This is because, in both the pre-autofocus operation and the AF operation started by a half-press of the release switch SW1, the imaging lens is set to be driven from the infinity side to the near side.
In step S314, the focus lens 72a is moved to the drive start position, which completes the drive start position determination.
Next, the focus search range setting in step S311 is described in detail with reference to Fig. 24. The focus search range is preferably large enough to calculate the AF evaluation values for the smoothed differences, yet narrower than that of the AF operation.
In step S3111, it is determined whether the set shooting mode is the normal mode. In the viewfinder mode of the imaging device according to the present embodiment, the shooting mode is switched to the macro mode by pressing the macro switch SW10. The user can visually check the set shooting mode on the LCD 10. By way of example, Fig. 25A shows the normal mode being set, with the word "NORMAL" displayed at the top right of the LCD 10, and Fig. 25B shows the macro mode being set, with the word "MACRO" displayed.
The difference in the drive resolution of the imaging lens between the macro mode and the normal mode is now described with reference to Fig. 26. Fig. 26 shows an example of the driven range of the imaging lens in the normal mode and the macro mode when the imaging lens is driven at the same pulse rate within the focus search range. On the ordinate representing the focal distance, the focus search range D in the normal mode runs from 0.5 m to infinity, while the focus search range E in the macro mode runs from 7 cm to 12 cm. Within such a narrow focus search range, even a very small motion of the subject can move the subject out of the angle of view. Accordingly, a wider focus search range needs to be set in the macro mode than in the normal mode.
Referring back to Fig. 24, in the normal mode ("Yes" in step S3111), the drive pulse rate within the focus search range is set to 16 pulses in step S3112, and in the macro mode ("No" in step S3111), it is set to 24 pulses in step S3113.
In general, the drive resolution of the focus lens 72a is finer in the macro region than in the normal region. Therefore, when the focus lens is driven within the focus search range in the macro mode at the same pulse rate as in the normal mode, the region over which AF evaluation values can be calculated narrows, making a sufficient focusing determination impossible. To estimate the focusing position reliably, the drive pulse rate is therefore increased in the macro mode so that the smoothed difference values based on the AF evaluation values can be obtained for the focusing position estimation.
In step S3114, it is determined whether the focal length is at the telephoto position, based on, for example, the encoder count of the zoom lens 71a. When the focal length is at the telephoto position ("Yes" in step S3114), the focus search range is set to twice the predetermined range in steps S3115 and S3116. This is done because the drivable range of the focus lens 72a differs between the wide-angle end and the telephoto end: the drivable range is wider when the focus lens 72a is at the telephoto end than at the wide-angle end. Thus, when the focus lens 72a is at the telephoto end, a wider focus search range is set in advance.
Fig. 31 shows an example of the drive amount of the imaging lens versus the focal distance to illustrate the focusing positions at the wide-angle end and the telephoto end. In the imaging device according to the present embodiment, the drive amount of the imaging lens at the telephoto end is twice that at the wide-angle end. The difference in drive amount should not be limited to "twice"; preferably, the coefficient for the focus search range can be changed according to various focal lengths.
When the focal length is not at the telephoto position ("No" in step S3114), the focus search range is set to the predetermined range in step S3116, which completes the focus search range setting in step S311.
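The focus search range setting of step S311 can be sketched as follows, with the pulse counts (16/24) and the doubling at the telephoto end taken from the text; the function name, its arguments, and the abstract `base_range` unit are assumptions:

```python
def set_focus_search(mode, at_tele, base_range):
    """Sketch of step S311: choose the drive pulse rate and search range.

    mode: "normal" or "macro" (macro needs more pulses because the drive
    resolution is finer in the macro region); at_tele: True when the zoom
    lens is at the telephoto end, where the drivable range is wider and the
    search range is doubled.
    """
    pulses = 16 if mode == "normal" else 24
    search_range = base_range * 2 if at_tele else base_range
    return pulses, search_range
```

For example, macro mode at the telephoto end would yield 24 pulses over a doubled range.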
Fourth embodiment
Following reference Figure 27 describes the formation method used according to the imaging device of the disclosure the 4th embodiment.Present embodiment relates to another example that in the 3rd embodiment, focus search scope is arranged.In the present embodiment, give the step identical with those steps in the 3rd embodiment by attached for identical numbering, and its description is omitted.
In step S3111-1, carry out about the current determination whether be in the position be equal to the focal length of 2.5m or larger of condenser lens 72a.The current lens position of condenser lens 72a is converted to focal length by the characteristic based on condenser lens.
When focal length is 2.5m or larger ("Yes" in step S3111-1), once determine that subject is near focal length, then only AF estimated value is obtained for about 2.5m region or distal region, to calculate the level and smooth difference value that focusing position is estimated.In step S3112-1, in focus search scope, driving pulse rate is set to 8 pulses.
When focal length is below 2.5m or 0.5m or more ("Yes" in step S3111-2), in order to widen focus search scope, in step S3112, driving pulse rate is set to 16 pulses.When focal length is 0.5m or less ("No" in step S3111-2), because condenser lens 72a is in nearly scope, therefore in step S3113, driving pulse rate is set to 24 pulses.
In step S3114, such as, counting based on the scrambler of zoom lens 71a carries out the determination whether being in burnt position far away about focal length.The driven scope of condenser lens 72a is different in wide-angle side and coke side far away.During compared to wide-angle side, scope can be driven wider when condenser lens 72a is in coke side far away.In addition, as under macro mode and normal mode, it drives resolution to be different in wide-angle side and coke side far away.The focal length of coke side far away is such as the twice of the focal length of wide-angle side.
Accordingly, when the focal length is on the telephoto side ("Yes" in step S3114), the focus search range is set to twice the preset range in step S3115. When the focal length is not on the telephoto side ("No" in step S3114), the focus search range is set to the preset range in step S3116.
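The range-setting flow of steps S3111-1 through S3116 can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name, parameters, and the unit of `preset_range` are assumptions; only the pulse rates, the distance bands, and the doubling at the telephoto side come from the text above.

```python
def set_focus_search_range(focus_distance_m, at_telephoto, preset_range=1.0):
    """Sketch of the focus search range setup (steps S3111-1 to S3116).

    focus_distance_m: focusing distance derived from the focus lens
    position; at_telephoto: True when the zoom encoder count indicates
    the telephoto side. preset_range is an assumed placeholder unit.
    """
    # Steps S3111-1 / S3111-2: choose the drive pulse rate by distance band.
    if focus_distance_m >= 2.5:       # far region -> narrow search
        pulse_rate = 8                # step S3112-1
    elif focus_distance_m >= 0.5:     # mid region -> widened search
        pulse_rate = 16               # step S3112
    else:                             # close range
        pulse_rate = 24               # step S3113

    # Steps S3114-S3116: double the range on the telephoto side, where
    # the drivable range of the focus lens is wider.
    search_range = 2 * preset_range if at_telephoto else preset_range
    return pulse_rate, search_range
```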
Fifth Embodiment
An imaging method used by an imaging device according to the fifth embodiment of the present invention is described below. The present embodiment relates to another example of the motion detection in the first to fourth embodiments. In the motion detection according to the present embodiment, the threshold used to determine the motion of the subject can be changed according to the zoom lens position and the operation mode of the imaging device.
In the motion detection, image data (hereinafter referred to as a frame) is acquired at every VD signal, and the difference between the previous frame and the current frame is calculated and output as a motion pixel count. The difference calculation is performed only within the AF area shown in Figure 5. The motion pixel count is a positive or negative integer value and indicates the direction of motion of the image. Motion is determined to be detected when the sum of the motion pixel counts of the individual frames exceeds a certain value, i.e., a detection threshold. The detection threshold can be arbitrary, for example, 30 pixels.
Once motion of the subject is detected, the integrated (total) motion pixel count is reset to zero, and the motion pixel counts of subsequent frames are added up again. The integrated motion pixel count is also reset when the motion pixel count becomes zero or when the direction of motion of the image reverses from the previous result.
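The accumulate-and-reset behavior described above can be sketched as follows. This is a minimal illustration under stated assumptions: the class and method names are invented, and the per-frame signed motion pixel count is taken as an input, since the text does not specify how the signed count is computed from the frame difference.

```python
class MotionDetector:
    """Accumulates per-frame signed motion pixel counts and resets on
    zero motion, direction reversal, or a completed detection."""

    def __init__(self, threshold=30):   # detection threshold in pixels
        self.threshold = threshold
        self.total = 0                  # integrated motion pixel count

    def feed(self, motion_pixels):
        """motion_pixels: signed count from differencing the AF area of
        the previous and current frames. Returns True when motion is
        detected (i.e., pre-AF should start)."""
        # Reset when there is no motion or the direction of motion reverses.
        if motion_pixels == 0 or motion_pixels * self.total < 0:
            self.total = 0
            return False
        self.total += motion_pixels
        if abs(self.total) > self.threshold:
            self.total = 0              # reset after a detection
            return True
        return False
```

For example, two consecutive frames each contributing 20 motion pixels exceed the default threshold of 30 and trigger a detection, after which the accumulator starts again from zero.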
How the detection threshold is changed according to the zoom position of the imaging lens is now described. In the viewfinder mode, the zoom magnification is changed with the wide-angle zoom switch SW3 and the telephoto zoom switch SW4; by default, it is set to the wide-angle end. The image on the LCD 10 differs greatly depending on the zoom position, i.e., the wide-angle end or the telephoto end. When the camera is pointed at the same subject from the same distance, the image on the display appears much larger when zoomed to the telephoto end. Accordingly, when the camera is moved by the same amount at the telephoto end, the calculated motion pixel count is larger than at the wide-angle end. Therefore, at the telephoto-end zoom position, the detection threshold is set to a larger value than at the wide-angle end.
The motion detection according to the present embodiment is described in detail with reference to Figure 28. In the present embodiment, the zoom position can be set in 16 steps from the wide-angle end to the telephoto end. Once the imaging device is powered on and in the viewfinder mode, motion detection starts, and the current zoom position is determined in steps S1001 to S1003. When the zoom position is at the wide-angle end, the detection threshold is set to 30 in step S1004. When the zoom position is at the eighth step, the detection threshold is set to 45 in step S1005. When the zoom position is at the telephoto end, the detection threshold is set to 60 in step S1006.
In step S1007, a determination is made as to whether the zoom position has changed from that of the previous frame. When the zoom position has changed ("No" in step S1007), the total motion pixel count is reset to zero in step S1014, and the operation is completed. When there is no change ("Yes" in step S1007), the difference between the pixels of the previous frame and the current frame is calculated in step S1008. When no motion pixels are found ("No" in step S1009), the total motion pixel count is reset to zero in step S1014, and the operation is completed. When the motion pixel count is non-zero, a determination is made in step S1010 as to whether the direction of motion of the image has changed from that of the previous frame. When the direction of motion has changed ("No" in step S1010), the total motion pixel count is reset to zero in step S1014, and the operation is completed.
Meanwhile, when the direction of motion has not changed, the motion pixels are added in step S1011. In step S1012, a determination is made as to whether the total motion pixel count exceeds the previously set detection threshold. When the total exceeds the detection threshold ("Yes" in step S1012), pre-autofocus (pre-AF) is started in step S1013.
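The zoom-dependent threshold selection of steps S1001 through S1006 can be sketched as follows. The text quotes only three values (30 at the wide-angle end, 45 at the eighth step, 60 at the telephoto end); grouping the remaining 16 steps into three bands around those values is an assumption, as are the function name and the zero-based step index.

```python
def zoom_detection_threshold(zoom_step, num_steps=16):
    """Select the motion detection threshold from the zoom position
    (Figure 28, steps S1001-S1006). zoom_step: 0 (wide-angle end) to
    num_steps - 1 (telephoto end)."""
    if zoom_step >= num_steps - 1:      # telephoto end -> step S1006
        return 60
    if zoom_step >= num_steps // 2:     # around the eighth step -> step S1005
        return 45
    return 30                           # wide-angle side -> step S1004
```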
As described above, with the imaging device according to the present embodiment, an optimal detection threshold is set variably for each zoom position, preventing both overly sensitive motion detection, which causes unnecessary power consumption, and overly insensitive motion detection, which delays the pre-autofocus operation and reduces efficiency.
Sixth Embodiment
An imaging method used by an imaging device according to the sixth embodiment of the present invention is described below. The present embodiment relates to another example of the motion detection in the fifth embodiment. In the motion detection, the difference between the previous frame and the current frame is calculated, and the result is output as a motion pixel count. Once motion of the subject exceeding the detection threshold is detected, the integrated (total) motion pixel count is reset to zero, and the motion pixel counts of subsequent frames are added up again. The integrated motion pixel count is also reset when the motion pixel count becomes zero or when the direction of motion of the image reverses from the previous result.
In the viewfinder mode, the imaging device is switched to the macro mode by pressing the macro switch SW10. In the macro mode, a subject can be photographed at a very close range that cannot be photographed in the normal mode. Therefore, even a small movement of the subject increases the motion pixel count of the image data in the AF area (Figure 5). This is the same situation as the telephoto-end zoom position in the fifth embodiment. It is therefore preferable to set a larger detection threshold in the macro mode than in the normal mode.
The motion detection according to the present embodiment is now described with reference to Figure 29. In step S1101, a determination is made as to whether the imaging device is operating in the macro mode. In the macro mode, the detection threshold is set to 60 in step S1102; in modes other than the macro mode, the detection threshold is set to 30 in step S1103.
In step S1104, a determination is made as to whether the shooting mode has changed from that of the previous frame. When the mode has changed ("No" in step S1104), the total motion pixel count is reset to zero in step S1111, and the operation is completed. When there is no change ("Yes" in step S1104), the difference between the pixels of the previous frame and the current frame is calculated in step S1105. When no motion pixels are found ("No" in step S1106), the total motion pixel count is reset to zero in step S1111, and the operation is completed. When the motion pixel count is non-zero, a determination is made in step S1107 as to whether the direction of motion of the image has changed from that of the previous frame. When the direction of motion has changed ("No" in step S1107), the total motion pixel count is reset to zero in step S1111, and the operation is completed.
Meanwhile, when the direction of motion has not changed, the motion pixels are added in step S1108. In step S1109, a determination is made as to whether the total motion pixel count exceeds the previously set detection threshold. When the total exceeds the detection threshold ("Yes" in step S1109), pre-autofocus is started in step S1110.
As described above, with the imaging device according to the present embodiment, an optimal detection threshold is set variably for each shooting mode, preventing both overly sensitive motion detection, which causes unnecessary power consumption, and overly insensitive motion detection, which delays the pre-autofocus operation and reduces efficiency.
Seventh Embodiment
An imaging method used by an imaging device according to the seventh embodiment of the present invention is described below. The present embodiment relates to another example of the motion detection in the fifth embodiment. In the motion detection, the difference between the previous frame and the current frame is calculated, and the result is output as a motion pixel count. Once motion of the subject exceeding the detection threshold is detected, the integrated (total) motion pixel count is reset to zero, and the motion pixel counts of subsequent frames are added up again. The integrated motion pixel count is also reset when the motion pixel count becomes zero or when the direction of motion of the image reverses from the previous result.
In the viewfinder mode, the imaging device can be switched to a specific scene mode selected from multiple scene modes using the mode dial SW2. Selecting a scene mode makes it possible to easily set shooting parameters (aperture value, white balance, and so on) according to the shooting conditions.
For example, when the sports mode is selected with the mode dial SW2, the subject is likely to be moving constantly. It is preferable to set a lower detection threshold so that focus can catch the precise moment the user wants. In the landscape mode, on the other hand, the subject is expected to be far away and to hardly move. Since the expected motion pixel count is small even when the camera moves, it is likewise preferable to set a smaller detection threshold (15 pixels in the present embodiment).
The motion detection according to the present embodiment is described with reference to Figure 30. In step S1201, a determination is made as to whether the imaging device is operating in a scene mode. In modes other than the scene modes, the detection threshold is set to 30 in step S1204. In a scene mode ("Yes" in step S1201), the type of scene mode is determined. When the sports mode is selected ("Yes" in step S1202), the detection threshold is set to 15 in step S1205. When the landscape mode is selected, the detection threshold is set to 15 in step S1206.
In step S1207, a determination is made as to whether the scene mode has changed from that of the previous frame. When the mode has changed ("No" in step S1207), the total motion pixel count is reset to zero in step S1214, and the operation is completed. When there is no change ("Yes" in step S1207), the difference between the pixels of the previous frame and the current frame is calculated in step S1208. When no motion pixels are found ("No" in step S1209), the total motion pixel count is reset to zero in step S1214, and the operation is completed. When the motion pixel count is non-zero, a determination is made in step S1210 as to whether the direction of motion of the image has changed from that of the previous frame. When the direction of motion has changed ("No" in step S1210), the total motion pixel count is reset to zero in step S1214, and the operation is completed.
Meanwhile, when the direction of motion has not changed ("Yes" in step S1210), the motion pixels are added in step S1211. In step S1212, a determination is made as to whether the total motion pixel count exceeds the previously set detection threshold. When the total exceeds the detection threshold ("Yes" in step S1212), pre-autofocus is started in step S1213.
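The scene-mode threshold selection of steps S1201 through S1206 can be sketched as follows. The mode names are illustrative strings (assumptions); only the quoted values, 15 for the sports and landscape modes and 30 otherwise, come from the text.

```python
def scene_detection_threshold(scene_mode):
    """Select the motion detection threshold from the scene mode
    (Figure 30, steps S1201-S1206). scene_mode: an illustrative mode
    name, or None when no scene mode is active."""
    if scene_mode in ("sports", "landscape"):
        return 15      # steps S1205 / S1206: catch fast action, or
                       # react to the small motion of a distant subject
    return 30          # step S1204: default for non-scene modes
```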
As described above, with the imaging device according to the present embodiment, an optimal detection threshold is set variably for each scene mode, preventing both overly sensitive motion detection, which causes unnecessary power consumption, and overly insensitive motion detection, which delays the pre-autofocus operation and reduces efficiency.
In addition, the imaging device according to any one of the above embodiments can estimate the focusing position with a very small focus search range once the motion of the subject is detected, and perform the AF operation quickly by setting the drive start position of the imaging lens for the next focus detection.
Furthermore, the imaging device according to any one of the above embodiments can change the focus search range according to the position of the imaging lens to estimate the focusing position once the motion of the subject is detected, and perform the AF operation quickly by setting the drive start position of the imaging lens for the next focus detection.
The present invention is applicable to an imaging device with a camera function installed in a handheld device, and to an imaging method using such an imaging device.
Although the present invention has been described with respect to illustrative embodiments, it is not limited thereto. It should be appreciated that those skilled in the art can make modifications to the described embodiments without departing from the scope of the invention as defined in the appended claims.

Claims (12)

1. An imaging device, comprising:
an imaging lens; and
an imaging unit that obtains image data based on an optical image of a subject received via the imaging lens;
characterized in that the imaging device further comprises:
a motion detector that detects motion of the subject from the image data successively obtained by the imaging unit;
a focus detector that, when the motion detector detects motion of the subject, calculates focus position data based on the image data obtained via the imaging lens; and
a focusing position estimation unit that estimates a focusing position based on the calculated focus position data;
wherein the motion detector detects the motion of the subject by determining whether a difference between the successively obtained image data exceeds a predetermined threshold.
2. The imaging device as claimed in claim 1, wherein
the focusing position estimation unit sets at least one of a drive start position and a drive direction of the imaging lens based on the focus position data so as to bring the imaging lens close to the focusing position.
3. The imaging device as claimed in claim 1, wherein
the focus position data is a result of a smoothed differential operation based on AF evaluation values calculated from the image data.
4. The imaging device as claimed in claim 3, wherein
the AF evaluation value is obtained by integrating differences in brightness between neighboring pixels constituting the image data.
5. The imaging device as claimed in claim 3, wherein
the smoothed differential operation is a weighted integration of values obtained by calculating differences between neighboring AF evaluation values; and
the weighting coefficient used in the weighted integration is set such that the larger the difference in the AF evaluation values, the larger the weighting coefficient.
6. The imaging device as claimed in claim 2, wherein
the focusing position estimation unit is configured to move the imaging lens to the drive start position after estimating the focusing position.
7. The imaging device as claimed in claim 1, wherein
when the focusing position cannot be estimated, the focusing position estimation unit is configured to change the drive start position of the imaging lens and calculate the focus position data again.
8. The imaging device as claimed in claim 1, wherein
the focus detector comprises a drive range changing unit configured to change a drive range of the imaging lens according to a predetermined condition.
9. The imaging device as claimed in claim 8, wherein
the predetermined condition is a position of the imaging lens at the time the focus detector starts operating.
10. The imaging device as claimed in claim 8, wherein
the predetermined condition is a shooting mode at the time the focus detector starts operating.
11. The imaging device as claimed in claim 8, wherein
the predetermined condition is a focal length at the time the focus detector starts operating.
12. An imaging method used by an imaging device, the imaging device comprising: an imaging lens; and an imaging unit that obtains image data based on an optical image of a subject received via the imaging lens;
characterized in that the imaging device further comprises: a motion detector that detects motion of the subject from the image data successively obtained by the imaging unit; a focus detector that, when the motion detector detects motion of the subject, calculates focus position data based on the image data obtained via the imaging lens; and a focusing position estimation unit that estimates a focusing position based on the calculated focus position data,
the method comprising the steps of:
detecting the motion of the subject with the motion detector;
driving, with the focus detector, the imaging lens by a predetermined amount in a predetermined direction from a predetermined position in response to the motion of the subject detected by the motion detector, to obtain image data, and obtaining focus position data based on the obtained image data; and
estimating, with the focusing position estimation unit, the focusing position of the imaging lens based on the focus position data;
wherein the motion detector detects the motion of the subject by determining whether a difference between the successively obtained image data exceeds a predetermined threshold.
CN201080038527.3A 2009-07-23 2010-07-22 Imaging device and imaging method Expired - Fee Related CN102483508B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2009172558 2009-07-23
JP2009-172558 2009-07-23
JP2010045936A JP5564996B2 (en) 2009-07-23 2010-03-02 Imaging apparatus and imaging method
JP2010-045936 2010-03-02
PCT/JP2010/062737 WO2011010745A1 (en) 2009-07-23 2010-07-22 Imaging device and imaging method

Publications (2)

Publication Number Publication Date
CN102483508A CN102483508A (en) 2012-05-30
CN102483508B true CN102483508B (en) 2015-06-24

Family

ID=43499216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201080038527.3A Expired - Fee Related CN102483508B (en) 2009-07-23 2010-07-22 Imaging device and imaging method

Country Status (5)

Country Link
US (1) US9332173B2 (en)
EP (1) EP2457118B8 (en)
JP (1) JP5564996B2 (en)
CN (1) CN102483508B (en)
WO (1) WO2011010745A1 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9395512B2 (en) * 2010-07-14 2016-07-19 Lg Electronics Inc. Auto focus device and a method therefor
JP5857610B2 (en) 2011-10-13 2016-02-10 株式会社リコー Imaging apparatus and imaging method
US9294667B2 (en) * 2012-03-10 2016-03-22 Digitaloptics Corporation MEMS auto focus miniature camera module with fixed and movable lens groups
US20130293765A1 (en) * 2012-03-10 2013-11-07 Digitaloptics Corporation MEMS Auto Focus Miniature Camera Module with Abutting Registration
JP6095280B2 (en) * 2012-05-30 2017-03-15 キヤノン株式会社 Projection type display device and control method thereof
JP6057608B2 (en) * 2012-08-20 2017-01-11 キヤノン株式会社 Focus adjustment apparatus and control method thereof
CN104104876B (en) * 2014-07-25 2017-09-15 广东欧珀移动通信有限公司 A kind of method of rapid focus, device and mobile device
JP2016225799A (en) * 2015-05-29 2016-12-28 オリンパス株式会社 Imaging apparatus and method for controlling imaging apparatus
US9883097B2 (en) * 2016-06-03 2018-01-30 Google Inc. Optical flow based auto focus
CN106131415B (en) * 2016-07-19 2020-02-11 Oppo广东移动通信有限公司 Two-dimensional code image scanning method and device and mobile terminal
CN107124556B (en) * 2017-05-31 2021-03-02 Oppo广东移动通信有限公司 Focusing method, focusing device, computer readable storage medium and mobile terminal
CN108600638B (en) * 2018-06-22 2020-08-04 中国计量大学 Automatic focusing system and method for camera
CN111147732B (en) * 2018-11-06 2021-07-20 浙江宇视科技有限公司 Focusing curve establishing method and device
CN109813866B (en) * 2019-01-24 2021-08-17 中南大学 Method for measuring matrix potential of unsaturated frozen soil
CN109963083B (en) 2019-04-10 2021-09-24 Oppo广东移动通信有限公司 Image processor, image processing method, photographing device, and electronic apparatus
CN111464742B (en) * 2020-04-09 2022-03-22 中山易美杰智能科技有限公司 Imaging system and method capable of continuously photographing without causing motion blur
CN112217991B (en) * 2020-09-28 2022-02-15 北京环境特性研究所 Image acquisition device, focus adjusting device and focus adjusting method

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3851027B2 (en) 1999-08-27 2006-11-29 株式会社リコー Autofocus device and camera
JP3728241B2 (en) * 2001-12-20 2005-12-21 キヤノン株式会社 Focus adjustment apparatus, imaging apparatus, focusing method, program, and storage medium
JP5051812B2 (en) * 2004-01-14 2012-10-17 株式会社リコー Imaging apparatus, focusing method thereof, and recording medium
JP4618019B2 (en) 2005-06-30 2011-01-26 株式会社日立製作所 Image recording / playback device
JP2006091915A (en) * 2005-12-01 2006-04-06 Konica Minolta Photo Imaging Inc Imaging apparatus
US7725018B2 (en) * 2006-08-01 2010-05-25 Canon Kabushiki Kaisha Focus control apparatus, image sensing apparatus and focus control method
JP4827662B2 (en) 2006-08-31 2011-11-30 Hoya株式会社 Camera with focus adjustment device
JP2008233156A (en) 2007-03-16 2008-10-02 Ricoh Co Ltd Method for controlling imaging apparatus and imaging apparatus
JP2008242226A (en) * 2007-03-28 2008-10-09 Fujifilm Corp Photographing device and focusing control method of photographic lens
JP4979507B2 (en) * 2007-07-05 2012-07-18 株式会社リコー Imaging apparatus and imaging method
JP4483930B2 (en) * 2007-11-05 2010-06-16 ソニー株式会社 Imaging apparatus, control method thereof, and program
JP5043736B2 (en) * 2008-03-28 2012-10-10 キヤノン株式会社 Imaging apparatus and control method thereof

Also Published As

Publication number Publication date
CN102483508A (en) 2012-05-30
JP5564996B2 (en) 2014-08-06
EP2457118B1 (en) 2016-03-30
WO2011010745A1 (en) 2011-01-27
EP2457118A1 (en) 2012-05-30
EP2457118B8 (en) 2016-09-14
EP2457118A4 (en) 2012-05-30
US20120105710A1 (en) 2012-05-03
JP2011043789A (en) 2011-03-03
US9332173B2 (en) 2016-05-03

Similar Documents

Publication Publication Date Title
CN102483508B (en) Imaging device and imaging method
CN100366058C (en) Imaging apparatus, a focusing method, a focus control method
CN101959020B (en) Imaging device and imaging method
CN100581221C (en) Image pickup unit, focusing method, and focusing control method
CN101765862B (en) Image processor, image processing method, digital camera, and imaging apparatus
JP4444927B2 (en) Ranging apparatus and method
EP2698658B1 (en) Image pickup apparatus, semiconductor integrated circuit and image pickup method
US8670064B2 (en) Image capturing apparatus and control method therefor
US8300137B2 (en) Image sensing apparatus providing driving direction of focus lens for attaining in-focus state and control method therefor
CN104519276A (en) Image capturing apparatus and control method thereof
EP2065741B1 (en) Auto-focus apparatus, image- pickup apparatus, and auto- focus method
JP3823921B2 (en) Imaging device
CN103108133A (en) Imaging device
CN109690378B (en) Lens device, camera system and lens driving method
US10187564B2 (en) Focus adjustment apparatus, imaging apparatus, focus adjustment method, and recording medium storing a focus adjustment program thereon
JP2019092119A (en) Imaging device, lens device, and control method thereof
CN103649807A (en) Imaging device
JP4181620B2 (en) Digital camera
JP2006308813A (en) Imaging apparatus with focus bracket function
JP2011053318A (en) Focusing-state detection apparatus, imaging apparatus, and control method therefor
JP2007225897A (en) Focusing position determination device and method
JP2017044821A (en) Image-capturing device and control method therefor, program, and storage medium
WO2013047641A1 (en) Three-dimensional image processing device and three-dimensional image processing method
JP5366693B2 (en) IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND COMPUTER PROGRAM
JP2013210572A (en) Imaging device and control program of the same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150624

Termination date: 20210722

CF01 Termination of patent right due to non-payment of annual fee