CN102087459B - Automatic focusing method - Google Patents

Automatic focusing method

Info

Publication number
CN102087459B
Authority
CN
China
Prior art keywords
spatial frequency, high spatial frequency value, image, zone
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN200910260425.5A
Other languages
Chinese (zh)
Other versions
CN102087459A (en)
Inventor
朱大国
甘龙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Omnivision Technologies Inc
Original Assignee
HAOWEI TECH Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HAOWEI TECH Co Ltd
Priority to CN200910260425.5A
Publication of CN102087459A
Application granted
Publication of CN102087459B
Legal status: Active
Anticipated expiration

Abstract

The invention discloses an automatic focusing method for a digital imaging device. The method comprises the following steps: (1) capturing a plurality of images, each image captured at a different position of the lens of the imaging device; (2) generating a high spatial frequency value for each image, each value representing the high spatial frequency information in that image; (3) determining the image with the largest high spatial frequency value; and (4) adjusting the lens to the position at which the image with the largest high spatial frequency value was captured.

Description

Automatic focusing method
Field of the invention
The present invention relates to an automatic focusing method and system for digital imaging devices.
Background
Digital imaging devices can be categorized as auto-focus or fixed-focus devices. A fixed-focus device usually cannot adjust its lens or change its aperture; instead, it relies on a large depth of field so that targets appear in focus. Although images captured by fixed-focus devices are acceptable in many situations, they are not as sharp as those captured by an auto-focus system.
An image captured at the focal point of the lens is a sharply focused image; here, the focal point refers to the point on the camera's lens axis at which rays converge. However, even if a target is not at the focal point, a sufficiently sharp image of it can still be produced as long as the target lies within the depth of field of the lens. The depth of field is the range of distances from the camera within which a captured target image remains adequately in focus; in other words, it is the region extending to either side of the lens's focal point.
Several methods and related devices for auto-focus digital imaging have been disclosed. In one method, the lens is moved in the direction in which the light intensity at a photosensitive element increases, and the lens stops moving once the maximum intensity is reached. In another method, the auto-focus technique relies on a finite impulse response (FIR) filter to detect edge features and their sharpness. When an image contains widely varying intensity/color values, such techniques are less effective, because the averaging associated with the FIR filter distorts the result.
Other prior-art auto-focus systems measure the distance from the target to the camera and map that distance to some measurable quantity that drives the camera's optical system, thereby achieving correct focus. This auto-focus method runs in parallel with the main imaging path, which is desirable for film cameras.
Although auto-focus digital imaging devices usually outperform fixed-focus imaging devices, prior-art auto-focus systems are typically more complex in hardware, making them more expensive and slower in operation than fixed-focus devices. It is therefore desirable to provide auto-focus at minimal cost and with minimal extra space, using the existing components of the imaging device and adding few or no components dedicated to auto-focus.
Summary of the invention
The purpose of the invention is to provide an automatic focusing method and system for digital imaging devices that can use the existing components of the imaging device, adding few or no components dedicated to auto-focus.
To achieve this goal, one embodiment of the invention provides an automatic focusing method for a digital imaging device, comprising: (1) capturing a plurality of images, each image captured at a different position of the lens of the imaging device; (2) generating a high spatial frequency value for each image, each value representing the amount of high spatial frequency information contained in that image; (3) determining the image with the largest high spatial frequency value; and (4) adjusting the lens to the position at which the image with the largest high spatial frequency value was captured.
Preferably, in the above method, the step of generating a high spatial frequency value for each image comprises:
selecting a plurality of focal zones for each image;
determining a high spatial frequency value for each of the focal zones; and
combining the high spatial frequency values of the focal zones to obtain the high spatial frequency value of the image.
Further, in the above method, the step of determining a high spatial frequency value for each focal zone comprises:
determining a horizontal high spatial frequency value and a vertical high spatial frequency value for each zone; and
combining the horizontal and vertical values to determine the zone's high spatial frequency value.
In the method for the present invention, the high spatial frequency value of determining level can comprise the difference between the brightness value of the pixel of determining that level in this zone is contiguous, determines that vertical high spatial frequency value then can comprise the difference between the brightness value of determining vertical contiguous pixel in this zone.
In the method for the present invention, at least two that can be included as in a plurality of focal zones in conjunction with each the step of high spatial frequency value in a plurality of focal zones are distributed different weights.
In the method for the present invention, the step that produces high spatial frequency value separately for each image can comprise:
Determine the high spatial frequency value high spatial frequency value vertical with at least one of at least one level for each image; And
In conjunction with the high spatial frequency value of at least one level high spatial frequency value vertical with at least one, in order to determine the high spatial frequency value of this image.
Similarly, determining the at least one horizontal high spatial frequency value comprises determining the differences between the brightness values of horizontally adjacent pixels, and determining the at least one vertical high spatial frequency value comprises determining the differences between the brightness values of vertically adjacent pixels.
In another aspect, the invention also provides an automatic focusing system for a digital imaging device, which can implement the above automatic focusing method.
The automatic focusing system and method provided by the invention avoid the use of complex hardware, overcoming the drawbacks of prior-art auto-focus systems, which are expensive and slower in operation than fixed-focus devices. With the method of the invention, auto-focus can be provided at minimal cost and with minimal extra space, using the existing components of the imaging device and adding no components dedicated to auto-focus.
The invention is explained below with reference to the accompanying drawings. It should be appreciated that, for clarity, some elements in the drawings are not drawn to scale.
Brief description of the drawings
Fig. 1 is a block diagram of a camera system according to an embodiment of the invention.
Fig. 2 is a flowchart of an automatic focusing method performed by the camera system of Fig. 1, according to an embodiment of the invention.
Fig. 3 is a detailed view of the focal zone selection step of the method of Fig. 2, according to an embodiment of the invention.
Fig. 4 is a detailed flowchart of the high spatial frequency value calculation step of the method of Fig. 2, according to an embodiment of the invention.
Detailed description of the embodiments
The auto-focus system disclosed in this application can advantageously be implemented in a digital imaging device with few or no dedicated components, and overcomes at least some of the problems commonly associated with prior-art auto-focus systems. For example, the system described here determines the best focal position by measuring the amount of high spatial frequency information at different focal positions. The condition in which the image contains the largest amount of high spatial frequency information is recognized as the best focus condition. High spatial frequency information can be measured, for example, by applying digital filtering to part of the digital image data; the computed energy of the filtered spectrum can then be used as a measure of spatial frequency content.
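To make the filtering idea concrete, the following is a minimal sketch, under assumptions not stated in the patent, of scoring high spatial frequency content by high-pass filtering a brightness image and summing the energy of the result; the Laplacian kernel and the use of scipy.ndimage are illustrative choices only.

```python
import numpy as np
from scipy.ndimage import convolve

# Illustrative 3x3 Laplacian high-pass kernel (an assumption, not the
# patent's formula).
LAPLACIAN = np.array([[0, -1, 0],
                      [-1, 4, -1],
                      [0, -1, 0]], dtype=float)

def high_frequency_energy(brightness):
    """Score an image region by the energy of its high-pass-filtered content."""
    filtered = convolve(brightness.astype(float), LAPLACIAN, mode="nearest")
    return float(np.sum(filtered ** 2))
```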
Fig. 1 shows an example of a camera with an auto-focus system. Fig. 1 depicts a camera system 100, which comprises an auto-focus (AF) lens 110, an image sensor 120, and a digital signal processor 135 with a central processing unit (CPU) 137. The AF lens 110 focuses incident light 140 (indicated by arrows) onto the image sensor 120. The image sensor 120 converts the optical information into digital image data 125 (indicated by an arrow) and sends the data 125 to the digital signal processor 135. The digital signal processor 135 analyzes the digital image data 125 and adjusts the position of the AF lens 110 by sending new position data 130 to the AF lens 110.
Referring now to Fig. 2 together with Fig. 1, Fig. 2 is a flowchart of a processing method 200 for performing auto-focus, executed by the camera system 100 of Fig. 1. Processing method 200 begins with step 205, which resets the following variables to zero: LENS_POS (holding the current lens position), BEST_POS (holding the best lens position found so far), and MAX_HFV (holding the maximum high spatial frequency value found so far). Subsequently, in step 210, the AF lens 110 is driven to the position corresponding to the current value of LENS_POS. In step 215, the image sensor 120 captures an image. Step 220 selects one or more focal zones (discussed in more detail in connection with Fig. 3). The selected focal zones are then processed in step 225, which produces the amount of high spatial frequency information contained in the image, represented by the variable HFV (high spatial frequency value). The method then proceeds to decision step 230, which checks whether HFV is larger than MAX_HFV. If the answer is "no", the method proceeds to step 240. If the answer is "yes", the method proceeds to step 235, which stores HFV and LENS_POS into MAX_HFV and BEST_POS respectively, and then proceeds to step 240. Step 240 advances LENS_POS by one unit; this unit is a design parameter chosen so that the AF lens 110 moves an acceptable distance during method 200. Decision step 245 then checks whether LENS_POS has passed the outermost possible position of the AF lens 110, indicated by the variable MAX_POS. If the answer is "no", the method returns to step 210. If the answer is "yes", the method proceeds to step 250, which drives the AF lens 110 to the best focus position indicated by BEST_POS.
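For illustration only, the lens-sweep loop of Fig. 2 can be sketched in Python as follows; the helpers drive_lens, capture_image and compute_hfv are hypothetical stand-ins for the camera hardware and the HFV computation of step 225, and are not names used by the patent.

```python
def autofocus(drive_lens, capture_image, compute_hfv, max_pos, step=1):
    """Sketch of processing method 200: sweep the lens and keep the best HFV."""
    lens_pos = 0    # LENS_POS (step 205)
    best_pos = 0    # BEST_POS
    max_hfv = 0.0   # MAX_HFV

    while lens_pos <= max_pos:              # step 245: stop once past MAX_POS
        drive_lens(lens_pos)                # step 210
        image = capture_image()             # step 215
        hfv = compute_hfv(image)            # steps 220-225
        if hfv > max_hfv:                   # step 230
            max_hfv, best_pos = hfv, lens_pos   # step 235
        lens_pos += step                    # step 240: advance one unit

    drive_lens(best_pos)                    # step 250: return to best focus
    return best_pos
```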
Referring now to Fig. 3 together with Fig. 2, Fig. 3 shows further details of an exemplary embodiment of focal zone selection step 220. Fig. 3 depicts an image frame 300 and an example selection of five focal zones 310, 320, 330, 340 and 350. Because the focal zones are the only image regions processed by method 200, they effectively define the regions of the image that will be "in focus" when method 200 completes. The camera system 100 can usually select the zones automatically; sometimes the camera user manually selects a different set of focal zones. Note that the number, positions and sizes of the selected focal zones can vary. Examples of focal zone selection include a single zone at the image center, a larger central zone with smaller peripheral zones, or even a single zone covering the entire image.
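As an illustration, focal zones could be represented as pixel rectangles; the layouts below are assumptions sketched from Fig. 3 and from the examples named above, not coordinates given by the patent.

```python
def center_zone(width, height, frac=0.2):
    """A single focal zone covering a central fraction of the frame."""
    w, h = int(width * frac), int(height * frac)
    return ((width - w) // 2, (height - h) // 2, w, h)  # (left, top, w, h)

def five_zone_layout(width, height, frac=0.15):
    """A central zone plus four surrounding zones, loosely after Fig. 3."""
    w, h = int(width * frac), int(height * frac)
    cx, cy = (width - w) // 2, (height - h) // 2
    off_x, off_y = width // 4, height // 4
    return [
        (cx, cy, w, h),                     # central zone
        (cx - off_x, cy - off_y, w, h),     # upper-left zone
        (cx + off_x, cy - off_y, w, h),     # upper-right zone
        (cx - off_x, cy + off_y, w, h),     # lower-left zone
        (cx + off_x, cy + off_y, w, h),     # lower-right zone
    ]
```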
Referring now to Fig. 4 together with Fig. 2 and Fig. 3, Fig. 4 describes further details of HFV calculation step 225. First, step 410 selects which of the focal zones defined in step 220 is to be processed. Subsequently, in step 420, the horizontal HFV value of the zone is calculated, for example as the sum of absolute differences between the brightness values of horizontally adjacent pixels:
HFV_horizontal = Σ_(row, col) | L(row, col) − L(row, col+1) |
where row and col are indices ranging from the first to the last row and column of the zone, and L is the brightness value of the pixel. Depending on the color format adopted by camera system 100, the brightness value L can be taken as the G component (for the RGB and RGB-RAW formats), as the Y component (for the YUV and YCbCr formats), or obtained by any other method that represents pixel brightness.
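A minimal sketch of obtaining the brightness value L, assuming interleaved H x W x C numpy arrays; the array layout and channel positions are assumptions for illustration, not requirements of the patent.

```python
import numpy as np

def brightness(image, color_format="RGB"):
    """Return a 2-D float array of per-pixel brightness values L."""
    if color_format in ("RGB", "RGB-RAW"):
        return image[..., 1].astype(np.float64)   # G component
    if color_format in ("YUV", "YCbCr"):
        return image[..., 0].astype(np.float64)   # Y component
    raise ValueError("unsupported color format: %s" % color_format)
```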
Subsequently, in step 430, the vertical HFV value of the zone is calculated, for example as the sum of absolute differences between the brightness values of vertically adjacent pixels:
HFV_vertical = Σ_(row, col) | L(row, col) − L(row+1, col) |
where row and col again range from the first to the last row and column of the zone, and L is the brightness value of the pixel, which can be obtained by one of the methods described for step 420.
Note that steps 420 and 430 can be performed in any order. In some embodiments, step 420 can run after step 430, or in parallel with it. In addition, the calculation of HFV_vertical and/or HFV_horizontal can "skip" one or more rows or columns; for example, (L(row, col) − L(row+2, col)) can replace (L(row, col) − L(row+1, col)) in the calculation of HFV_vertical, so that skipping rows and/or columns speeds up processing method 200.
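The horizontal and vertical sums of steps 420 and 430, together with the optional row/column skipping just described, could be sketched as follows (a vectorized illustration, with `step` > 1 giving the skipping speed-up; not the patent's literal implementation):

```python
import numpy as np

def hfv_horizontal(L, step=1):
    """Sum of |L(row, col) - L(row, col + step)| over a 2-D brightness array L."""
    return float(np.abs(L[:, :-step] - L[:, step:]).sum())

def hfv_vertical(L, step=1):
    """Sum of |L(row, col) - L(row + step, col)| over a 2-D brightness array L."""
    return float(np.abs(L[:-step, :] - L[step:, :]).sum())
```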
Still referring to Fig. 4, step 440 combines HFV_horizontal and HFV_vertical into a single value representing the high-frequency content of the current focal zone. Many averaging/normalization formulas can be used to merge the two values. For example, the combined value HFV_zone can be calculated using either of the following two expressions:
HFV_zone = (HFV_horizontal + HFV_vertical) * a
Or
[second expression: equation image not reproduced in the source]
where a is a constant.
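As a sketch of step 440 using the first expression above (the scaling constant a is left open by the patent; 0.5 below is an arbitrary illustrative value):

```python
def hfv_zone(hfv_h, hfv_v, a=0.5):
    """Combine the horizontal and vertical values of one focal zone."""
    return (hfv_h + hfv_v) * a
```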
Once HFV_zone has been calculated, processing proceeds to decision step 450, which checks whether the focal zone just processed is the last of the focal zones defined in step 220. If the answer is "no", the method returns to step 410 to process the next focal zone. If the answer is "yes", the method proceeds to step 460. Step 460 calculates the final HFV value from the HFV values of all focal zones. Different averaging formulas can again be used here. For example, the final HFV value can be calculated using either of the following two expressions:
[the two expressions appear as equation images in the source and are not reproduced here]
Wherein, a is constant.Can select to drip, can adopt different weights for different zones:
[equation image not reproduced in the source]
where a is a constant and W_zone is a zone-specific weight. Once HFV_final has been calculated, it is passed to the output of step 225.
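A hedged sketch of step 460, combining the per-zone values into HFV_final with optional zone-specific weights W_zone; the exact averaging formula is given as an equation image in the source, so this weighted sum is an assumed, plausible reading rather than the patent's literal expression:

```python
def hfv_final(zone_values, weights=None, a=1.0):
    """Combine per-zone HFV values, optionally weighting each zone."""
    if weights is None:
        weights = [1.0] * len(zone_values)
    return a * sum(w * v for w, v in zip(weights, zone_values))
```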
Processing method 200 can be executed in the hardware of DSP 135, but its computationally efficient structure also allows it to be executed partly or entirely in software. For example, the software can be executed by the CPU 137 inside DSP 135 or by an external CPU. Executing the method in software significantly reduces hardware complexity and therefore reduces the cost of camera system 100 and of the devices that use it, such as digital still cameras, digital video cameras, cellular phones, and other handheld devices. For example, camera system 100 can process at least 10 images per second to obtain the best focusing performance.
Those skilled in the art will understand that the methods and systems described above can be modified or improved without departing from the scope of the invention. Accordingly, it should be understood that the content of the above description and of the accompanying drawings is intended only to explain the invention and does not limit it in any sense.

Claims (8)

1. An automatic focusing method for a digital imaging device, comprising:
capturing a plurality of images, wherein each image is captured at a different position of the lens of the imaging device;
generating a respective high spatial frequency value for each image, each high spatial frequency value representing the amount of high spatial frequency information contained in that image;
determining the image with the largest high spatial frequency value; and
adjusting the lens so that its position corresponds to the position at which the image with the largest high spatial frequency value was captured;
wherein the step of generating a respective high spatial frequency value for each image comprises:
determining at least one horizontal high spatial frequency value and at least one vertical high spatial frequency value for each image; and
combining the at least one horizontal high spatial frequency value and the at least one vertical high spatial frequency value to determine the high spatial frequency value of the image;
wherein determining the horizontal high spatial frequency value comprises determining the differences between the brightness values of horizontally adjacent pixels; and
determining the vertical high spatial frequency value comprises determining the differences between the brightness values of vertically adjacent pixels.
2. The method of claim 1, wherein the step of generating a respective high spatial frequency value for each image further comprises:
selecting a plurality of focal zones for each image;
determining a respective high spatial frequency value for each of the plurality of focal zones; and
combining the high spatial frequency values of the plurality of focal zones to obtain the high spatial frequency value of the image.
3. The method of claim 2, wherein the step of determining a respective high spatial frequency value for each of the plurality of focal zones comprises:
determining a horizontal high spatial frequency value HFV_horizontal and a vertical high spatial frequency value HFV_vertical for each zone; and
combining the horizontal and vertical high spatial frequency values to determine the zone's high spatial frequency value.
4. The method of claim 3, wherein:
determining the horizontal high spatial frequency value comprises determining the differences between the brightness values of horizontally adjacent pixels in the zone; and
determining the vertical high spatial frequency value comprises determining the differences between the brightness values of vertically adjacent pixels in the zone.
5. The method of claim 4, wherein:
the horizontal high spatial frequency value HFV_horizontal is calculated as the sum of absolute differences between the brightness values of horizontally adjacent pixels:
HFV_horizontal = Σ_(row, col) | L(row, col) − L(row, col+1) |
where row and col range from the first to the last row and column of the zone, and L is the brightness value of the pixel.
6. The method of claim 4, wherein:
the vertical high spatial frequency value HFV_vertical is calculated as the sum of absolute differences between the brightness values of vertically adjacent pixels:
HFV_vertical = Σ_(row, col) | L(row, col) − L(row+1, col) |
where row and col range from the first to the last row and column of the zone, and L is the brightness value of the pixel.
7. The method of claim 3, wherein:
the high spatial frequency value HFV_zone of a zone is calculated using either of the following two expressions:
HFV_zone = (HFV_horizontal + HFV_vertical) * a
or
[second expression: equation image not reproduced in the source]
where a is a constant.
8. The method of claim 2, wherein the step of combining the high spatial frequency values of the plurality of focal zones comprises assigning different weights to at least two of the plurality of focal zones.
CN200910260425.5A 2009-12-04 2009-12-04 Automatic focusing method Active CN102087459B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200910260425.5A CN102087459B (en) 2009-12-04 2009-12-04 Automatic focusing method

Publications (2)

Publication Number Publication Date
CN102087459A CN102087459A (en) 2011-06-08
CN102087459B (en) 2013-07-03

Family

ID=44099315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200910260425.5A Active CN102087459B (en) 2009-12-04 2009-12-04 Automatic focusing method

Country Status (1)

Country Link
CN (1) CN102087459B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104867125B (en) * 2015-06-04 2018-03-02 北京京东尚科信息技术有限公司 Obtain the method and device of image
CN105516668B (en) * 2015-12-14 2018-11-13 浙江宇视科技有限公司 A kind of focus method and device applied to dynamic scene

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1303025A (en) * 2000-12-25 2001-07-11 蒋宏 Space-frequency contrast method as criterion of automatic focussing in optical imaging system
CN1506744A (en) * 2002-12-10 2004-06-23 Automatic focusing device
CN1577039A (en) * 2003-07-28 2005-02-09 佳能株式会社 Focus adjusting system, image capture apparatus and control method thereof
CN1763624A (en) * 2004-10-22 2006-04-26 亚洲光学股份有限公司 Automatic focusing method and automatic focusing apparatus of electronic camera
CN1811516A (en) * 2005-01-25 2006-08-02 佳能株式会社 Camera, control method therefor, program, and storage medium
CN101285989A (en) * 2007-04-12 2008-10-15 索尼株式会社 Auto-focus apparatus, image-pickup apparatus, and auto-focus method
CN101408709A (en) * 2007-10-10 2009-04-15 鸿富锦精密工业(深圳)有限公司 Image viewfinding device and automatic focusing method thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP平10-213736A 1998.08.11

Also Published As

Publication number Publication date
CN102087459A (en) 2011-06-08

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C56 Change in the name or address of the patentee
CP01 Change in the name or title of a patent holder

Address after: Sunnyvale, California, United States

Patentee after: OmniVision Technologies, Inc.

Address before: Sunnyvale, California, United States

Patentee before: Haowei Tech Co., Ltd.