CN106154689A - Automatic focusing method and image capture device using the same - Google Patents
Automatic focusing method and image capture device using the same
- Publication number
- CN106154689A CN201510206247.3A
- Authority
- CN
- China
- Prior art keywords
- state
- photosensitive assembly
- camera lens
- numerical value
- value
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Abstract
The present invention provides an automatic focusing method and an image capture device using the method. The image capture device includes a lens and a photosensitive assembly. The automatic focusing method first changes the relative position of the lens and the photosensitive assembly by a first spacing amount and captures an image at each of a plurality of positions, then calculates sharpness values for these images, applies a suitable mathematical function to the sharpness values to obtain operation values whose curve against position is close to a straight line, uses linear interpolation on that curve to accurately estimate the target relative position of the lens and the photosensitive assembly corresponding to the desired sharpness value, and finally adjusts the relative position of the lens and the photosensitive assembly by a second spacing amount to the target relative position to complete focusing.
Description
Technical field
The present invention relates to an automatic focusing method and an image capture device using the method, and in particular to an automatic focusing method that applies a mathematical function to image sharpness values in order to estimate the target sharpness value and focus accordingly, and to an image capture device using the method.
Background technology
Among known focusing methods there is a so-called two-stage focusing method. In the first stage, a stepper motor moves the focusing lens with a larger spacing amount, an image sharpness value is obtained at each of a plurality of positions, and the maximum of these sharpness values is found. In the second stage, the stepper motor moves the focusing lens with a smaller spacing amount near the position of that maximum, and the best image sharpness value is found among the resulting positions.
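As a rough illustration (the patent gives no pseudocode, and `measure_sharpness` here merely stands in for capturing an image at a lens position and computing its sharpness), the two-stage search above could be sketched as:

```python
def two_stage_focus(measure_sharpness, start, stop, coarse_step, fine_step=1):
    """Two-stage autofocus sketch: a coarse scan over widely spaced lens
    positions, then a fine scan with a smaller spacing around the coarse peak."""
    # Stage 1: measure sharpness at each coarsely spaced position; keep the maximum.
    coarse = list(range(start, stop + 1, coarse_step))
    best_coarse = max(coarse, key=measure_sharpness)
    # Stage 2: re-scan the neighborhood of the coarse peak with the fine step.
    lo = max(start, best_coarse - coarse_step)
    hi = min(stop, best_coarse + coarse_step)
    return max(range(lo, hi + 1, fine_step), key=measure_sharpness)
```

With a unimodal sharpness curve this finds the best position to within the fine step, but at the cost of a second round of image captures, which is why the method is relatively time-consuming.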
Since the two-stage focusing method above is relatively time-consuming, another known automatic focusing method first moves the focusing lens with a larger spacing amount, obtains the image sharpness value at each of a plurality of positions, and then estimates mathematically the position that would yield the best image sharpness value if the focusing lens were moved with another, smaller spacing amount. As shown in Fig. 1, the position of the focusing lens in the lens is changed with the larger spacing amount over the positions P0' to PN' and an image sharpness value is obtained at each position. The three positions P1', P2' and P3' with the best image sharpness values M1', M2' and M3' are then taken, and linear interpolation is applied to find, on the autofocus curve C, the best position P21' at which the best image sharpness value Mf' would be obtained when moving the focusing lens with the smaller spacing amount. As shown in Fig. 1, P11' is the midpoint of P1' and P2', and P22' is the midpoint of P2' and P3'. Let the ratio of the distance from P11' to P21' to the distance from P21' to P22' be Δx'/(S'−Δx'); by linear interpolation it satisfies

Δx'/(S'−Δx') = (M2'−M3')/(M2'−M1')

Since P11' = (P1'+P2')/2 and S' = (P2'−P1'), it follows that:

P21' = (P1'+P2')/2 + (P2'−P1')(M2'−M3')/(2M2'−M1'−M3')
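A minimal sketch of the prior-art estimate above, under the assumption of three equally spaced positions with the middle sharpness value M2' the largest (no logarithm is applied at this stage):

```python
def prior_art_peak(p1, p2, p3, m1, m2, m3):
    """Estimate the best position P21' from the three best sharpness values
    by the midpoint ratio dx'/(S'-dx') = (M2'-M3')/(M2'-M1')."""
    p11 = (p1 + p2) / 2          # midpoint of P1' and P2'
    s = p2 - p1                  # spacing S' between the midpoints P11' and P22'
    dx = s * (m2 - m3) / (2 * m2 - m1 - m3)
    return p11 + dx              # estimated best position P21'
```

When M1' = M3' the estimate reduces to P2' itself, since Δx' = S'/2 by symmetry.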
However, in the method shown in Fig. 1, the autofocus curve C is not a straight line, so the optimum image sharpness value found on curve C by linear interpolation differs from the actual optimum, and the position that produces a sharp image cannot be found correctly.
Summary of the invention
The technical problem to be solved by the present invention is the inability of prior-art automatic focusing methods to accurately find the position that produces a sharp image. The present invention provides an automatic focusing method that can accurately find the position that produces a sharp image.
Another object of the present invention is to provide an image capture device to which the automatic focusing method of the present invention is applicable.
The technical solution adopted by the present invention to solve its technical problem is to provide an embodiment of an automatic focusing method for an image capture device. The image capture device includes a lens and a photosensitive assembly; the relative position of the lens and the photosensitive assembly can be adjusted according to a parameter to complete focusing, and the parameter can be changed according to a first spacing amount and a second spacing amount. The automatic focusing method includes: changing the parameter according to the first spacing amount to adjust the relative position of the lens and the photosensitive assembly and obtaining a plurality of values, the values corresponding to a plurality of states of imaging on the photosensitive assembly through the lens; performing a mathematical operation on the values to respectively obtain a plurality of operation values; estimating, based on the operation values, a change of the parameter according to the second spacing amount to obtain a target relative position of the lens and the photosensitive assembly, the target relative position corresponding to a target state of imaging on the photosensitive assembly through the lens; and adjusting the lens or the photosensitive assembly to the target relative position to obtain the target state.
In the above embodiment, the states include a first state, a second state and a third state in which the imaging on the photosensitive assembly through the lens is relatively clear, and the first state, the second state and the third state correspond respectively to a first value, a second value and a third value among the values.
In the above embodiment, the lens includes a focusing lens, and the parameter is the position of the focusing lens or the position of the photosensitive assembly.
In the above embodiment, the first state, the second state and the third state correspond respectively to a first position, a second position and a third position of the focusing lens or of the photosensitive assembly, and the first position and the third position are adjacent to the second position.
In the above embodiment, the second state is the clearest state of imaging on the photosensitive assembly through the lens among the states.
The above embodiment further includes performing the operation on the values with a mathematical function to respectively obtain the operation values.
In the above embodiment, the mathematical function includes a logarithmic function, and the target relative position is estimated with the following equation:

P21 = (P1 + P2)/2 + (P2 − P1)(logM2 − logM3)/(2logM2 − logM1 − logM3)

where P1 represents the first position, P2 represents the second position, P3 represents the third position, M1 represents the first value, M2 represents the second value, M3 represents the third value, and P21 represents the target relative position.
The above embodiment further includes: adjusting the parameter to a plurality of parameter values, which progress from a minimum to a maximum in steps of the first spacing amount, to adjust the relative position of the lens and the photosensitive assembly; whenever the parameter is adjusted to one of the parameter values, capturing the image formed on the photosensitive assembly through the lens, thereby obtaining a plurality of images; and calculating the values from the images respectively.
In the above embodiment, the values are the sharpness values of the images.
In the above embodiment, the second spacing amount is smaller than the first spacing amount.
An embodiment of the image capture device of the present invention includes: a photosensitive assembly that converts received light into an electric signal; a lens through which light is imaged on the photosensitive assembly; and a processing unit that changes a parameter according to a first spacing amount to adjust the relative position of the lens and the photosensitive assembly and obtains a plurality of values, the values corresponding to a plurality of states of imaging on the photosensitive assembly through the lens, performs an operation on the values to obtain a plurality of operation values, estimates, based on the operation values, a change of the parameter according to a second spacing amount to obtain a target relative position of the lens and the photosensitive assembly, the target relative position corresponding to a target state of imaging on the photosensitive assembly through the lens, and changes the parameter according to the second spacing amount to adjust the lens or the photosensitive assembly to the target relative position to obtain the target state.
In the above embodiment, the states include a first state, a second state and a third state in which the imaging on the photosensitive assembly through the lens is relatively clear, and the first state, the second state and the third state correspond respectively to a first value, a second value and a third value among the values.
In the above embodiment, the lens includes a focusing lens, and the parameter is the position of the focusing lens or the position of the photosensitive assembly.
In the above embodiment, the first state, the second state and the third state correspond respectively to a first position, a second position and a third position of the focusing lens or of the photosensitive assembly, and the first position and the third position are adjacent to the second position.
In the above embodiment, the second state is the clearest state of imaging on the photosensitive assembly through the lens among the states.
In the above embodiment, the processing unit further performs the operation on the values with a mathematical function to respectively obtain the operation values.
In the above embodiment, the mathematical function includes a logarithmic function, and the target relative position is estimated by the processing unit with the following equation:

P21 = (P1 + P2)/2 + (P2 − P1)(logM2 − logM3)/(2logM2 − logM1 − logM3)

where P1 represents the first position, P2 represents the second position, P3 represents the third position, M1 represents the first value, M2 represents the second value, M3 represents the third value, and P21 represents the target relative position.
In the above embodiment, the processing unit adjusts the parameter to a plurality of parameter values, which progress from a minimum to a maximum in steps of the first spacing amount, to adjust the relative position of the lens and the photosensitive assembly; whenever the parameter is adjusted to one of the parameter values, the processing unit captures the image formed on the photosensitive assembly through the lens, thereby obtaining a plurality of images; and the processing unit calculates the values from the images respectively.
In the above embodiment, the values are the sharpness values of the images.
In the above embodiment, the second spacing amount is smaller than the first spacing amount.
Implementing the automatic focusing method of the present invention and the image capture device using the method has the following beneficial effect: the image sharpness values are transformed with a suitable mathematical function into operation values whose curve against position is close to a straight line, so that after applying linear interpolation the target relative position of the lens and the photosensitive assembly corresponding to the best image sharpness value can be estimated accurately.
Brief description of the drawings
So that the above and other objects, features and advantages of the present invention can be more clearly understood, embodiments are described in detail below with reference to the accompanying drawings.
Fig. 1 shows the autofocus curve used by a known automatic focusing method.
Fig. 2A is a block diagram of an embodiment of an image capture device to which the automatic focusing method of the present invention is applicable.
Fig. 2B is a block diagram of another embodiment of an image capture device to which the automatic focusing method of the present invention is applicable.
Fig. 3 is a flow chart of the automatic focusing method of the present invention.
Fig. 4 shows the autofocus curve used by the automatic focusing method of the present invention.
Detailed description of the invention
Referring to Fig. 2A, it shows an embodiment of an image capture device to which the automatic focusing method of the present invention is applicable. The image capture device includes a lens 10, a photosensitive assembly 20, a processing unit 30, a control unit 40 and a driving unit 50; the lens 10 has a focusing lens 12. Light is imaged on the photosensitive assembly 20 through the lens 10, with the focus adjusted by the focusing lens 12, and the photosensitive assembly 20 converts the received light into an electric signal. The processing unit 30 captures the image formed on the photosensitive assembly 20, performs operations on it, and passes the result to the control unit 40, which controls the driving unit 50 to change the position of the focusing lens 12, thereby adjusting the relative position of the focusing lens 12 and the photosensitive assembly 20 so that light is imaged clearly on the photosensitive assembly 20. In this embodiment, the photosensitive assembly 20 is an image sensing assembly, such as a CCD or CMOS sensor.
Referring to Fig. 2B, it shows another embodiment of an image capture device to which the automatic focusing method of the present invention is applicable. The image capture device includes a lens 10, a photosensitive assembly 20, a processing unit 30, a control unit 40 and a driving unit 50; the lens 10 has a focusing lens 12. Light is imaged on the photosensitive assembly 20 through the lens 10, and the photosensitive assembly 20 converts the received light into an electric signal. The processing unit 30 captures the image formed on the photosensitive assembly 20, performs operations on it, and passes the result to the control unit 40, which controls the driving unit 50 according to the operation result to change the position of the photosensitive assembly 20, thereby adjusting the relative position of the focusing lens 12 and the photosensitive assembly 20 so that light is imaged clearly on the photosensitive assembly 20. In this embodiment, the photosensitive assembly 20 is an image sensing assembly, such as a CCD or CMOS sensor.
The automatic focusing method of the present invention can be realized by the image capture device of Fig. 2A or Fig. 2B, but is not limited to these devices. The method completes focusing by changing a parameter to adjust the relative position of the focusing lens 12 and the photosensitive assembly 20 as described above; the parameter can be the position of the focusing lens 12 or the position of the photosensitive assembly 20, and can be changed according to a first spacing amount and a second spacing amount. The position of the focusing lens 12 or of the photosensitive assembly 20 is adjusted by the driving unit 50, which can be a stepper motor; in that case the aforementioned second spacing amount can be the minimum step driven by the stepper motor, and the first spacing amount a multiple of the minimum step.
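For instance (the factor of 8 here is purely illustrative, as the patent says only that the first spacing amount is a multiple of the minimum step), the two spacing amounts might be set up as:

```python
MIN_STEP = 1                   # second spacing amount: the stepper motor's minimum step
FIRST_SPACING = 8 * MIN_STEP   # first spacing amount: a multiple of the minimum step

def coarse_positions(p_min, p_max):
    """Parameter values progressing from the minimum to the maximum
    in steps of the first spacing amount (step S1 of Fig. 3)."""
    return list(range(p_min, p_max + 1, FIRST_SPACING))
```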
Refer to Fig. 3, which shows the flow of the automatic focusing method of the present invention. In step S1, the parameter is changed according to the first spacing amount to a plurality of parameter values, which progress from a minimum to a maximum in steps of the first spacing amount, to adjust the relative position of the lens 10 and the photosensitive assembly 20, and a plurality of values are obtained that correspond to a plurality of states of imaging on the photosensitive assembly 20 through the lens 10. The driving unit 50, i.e. the stepper motor, moves the focusing lens 12 or the photosensitive assembly 20 with the first spacing amount to adjust the relative position of the lens 10 and the photosensitive assembly 20, and the processing unit 30 captures an image at each of the positions obtained with this first spacing amount. As shown in Fig. 4, the processing unit 30 captures an image at each of the positions P0 to PN, where P0 is the minimum position and PN is the maximum position, and then calculates a sharpness value (a value) for each captured image; the calculated sharpness value corresponds to a state of imaging of light on the photosensitive assembly 20 through the lens 10. After the processing unit 30 has calculated the sharpness value of each image, step S2 follows.
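The patent does not fix a particular sharpness measure; one common choice, shown here purely as an assumption, is the sum of squared differences between neighboring pixels of the grayscale image:

```python
def sharpness_value(gray):
    """Sum of squared horizontal and vertical differences of neighboring
    pixels; higher means sharper. `gray` is a list of rows of gray levels."""
    h, w = len(gray), len(gray[0])
    total = 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:  # horizontal neighbor difference
                total += (gray[y][x + 1] - gray[y][x]) ** 2
            if y + 1 < h:  # vertical neighbor difference
                total += (gray[y + 1][x] - gray[y][x]) ** 2
    return total
```

A well-focused image has strong local contrast, so its sharpness value is larger than that of the same scene captured out of focus.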
In step S2, an operation is performed on the values to respectively obtain a plurality of operation values. The processing unit 30 performs a mathematical operation on the calculated sharpness values to obtain the operation values. In this embodiment, the processing unit 30 takes the first position P1, the second position P2 and the third position P3 whose captured images have the best sharpness, and calculates the first sharpness value (first value) M1, the second sharpness value (second value) M2 and the third sharpness value (third value) M3 respectively; the first sharpness value M1, the second sharpness value M2 and the third sharpness value M3 represent respectively the first state, the second state and the third state of light imaged on the photosensitive assembly 20 through the lens 10. The second sharpness value M2 at the second position P2 is the highest among the sharpness values of all positions P0 to PN, and the first position P1 and the third position P3 are adjacent to the second position P2. The first, second and third sharpness values M1, M2 and M3 corresponding to the positions P1, P2 and P3 are then operated on with a mathematical function, in order to reduce the error that arises in the known technique when linear interpolation is applied to the focus curve. Verification with actual calculations shows, as in Fig. 4, that the relation curve D of the operation values after a logarithmic operation against position is closer to a straight line, so applying linear interpolation to curve D yields a fairly small estimation error. Step S3 then follows.
In step S3, based on the operation values, a change of the parameter according to the second spacing amount is estimated to obtain the target relative position of the lens 10 and the photosensitive assembly 20, which corresponds to the target state of imaging on the photosensitive assembly 20 through the lens 10; the second spacing amount is smaller than the first spacing amount. In this embodiment, the target state is the clearest state of imaging on the photosensitive assembly 20 through the lens 10. Since in Fig. 4 the relation curve D of the operation values after the logarithmic operation against position is close to a straight line, curve D is used to calculate the target relative position of imaging. As shown in Fig. 4, P21 is the target relative position, the largest image sharpness value corresponding to the target relative position P21 is Mf, and the logarithmic operation value of Mf is logMf. P11 is the midpoint of the first position P1 and the second position P2, and P22 is the midpoint of the second position P2 and the third position P3. P11 and P22 are a distance S apart; let P11 and P21 be a distance Δx apart, so that P21 and P22 are a distance S−Δx apart. According to the linear relationship of Fig. 4, the ratio of the distance from P11 to P21 to the distance from P21 to P22 equals the ratio of the difference of logM2 and logM3 to the difference of logM2 and logM1, i.e.

Δx/(S−Δx) = (logM2 − logM3)/(logM2 − logM1)   (1)

It is also known that

P11 = (P1 + P2)/2 and S = (P2 − P1)   (2)

From equations (1) and (2), the target relative position P21 is obtained as:

P21 = (P1 + P2)/2 + (P2 − P1)(logM2 − logM3)/(2logM2 − logM1 − logM3)   (3)

After the processing unit 30 calculates the target relative position P21 according to equation (3), step S4 follows.
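Equation (3) can be sketched as follows (base-10 logarithms and the midpoint-ratio form from the description are assumed here; the patent requires only that the mathematical function include a logarithmic function):

```python
from math import log10

def target_relative_position(p1, p2, p3, m1, m2, m3):
    """Estimate the target relative position P21 per equation (3):
    linear interpolation between the midpoints P11 and P22 after the
    sharpness values are transformed with a logarithm."""
    l1, l2, l3 = log10(m1), log10(m2), log10(m3)
    p11 = (p1 + p2) / 2          # midpoint of P1 and P2
    s = p2 - p1                  # distance S between the midpoints P11 and P22
    dx = s * (l2 - l3) / (2 * l2 - l1 - l3)
    return p11 + dx              # target relative position P21
```

When M1 = M3 the two logarithmic differences are equal, Δx = S/2, and the estimated target position is P2 itself, as expected by symmetry.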
In step S4, the lens 10 or the photosensitive assembly 20 is adjusted to the target relative position P21 to obtain the target state. After the processing unit 30 calculates the target relative position P21 according to equation (3), it transfers the data of the target relative position P21 to the control unit 40, and the control unit 40 controls the driving unit 50 according to that data to move the focusing lens 12 or the photosensitive assembly 20 to the target relative position P21, so as to obtain the image with the best sharpness, i.e. to complete the focusing.
The automatic focusing method of the present invention applies a mathematical function to the sharpness values of the image to obtain a relation curve that approximates a straight line, so that linear interpolation can accurately estimate the position of the clearest imaging and complete the autofocus.
Although the present invention is disclosed above by way of embodiments, they are not intended to limit its scope. Those skilled in the art may make minor changes and refinements without departing from the scope of the present invention, whose protection is therefore defined by the appended claims.
Claims (12)
1. An automatic focusing method for an image capture device, characterized in that the image capture device includes a lens and a photosensitive assembly, the relative position of the lens and the photosensitive assembly can be adjusted according to a parameter to complete focusing, and the parameter can be changed according to a first spacing amount and a second spacing amount, the automatic focusing method comprising:
changing the parameter according to the first spacing amount to adjust the relative position of the lens and the photosensitive assembly and obtaining a plurality of values, the values corresponding to a plurality of states of imaging on the photosensitive assembly through the lens;
performing an operation on the values to obtain a plurality of operation values;
estimating, based on the operation values, a change of the parameter according to the second spacing amount to obtain a target relative position of the lens and the photosensitive assembly, the target relative position corresponding to a target state of imaging on the photosensitive assembly through the lens; and
changing the parameter according to the second spacing amount to adjust the lens or the photosensitive assembly to the target relative position to obtain the target state.
2. The automatic focusing method of claim 1, characterized in that the states include a first state, a second state and a third state in which the imaging on the photosensitive assembly through the lens is relatively clear, the second state being the clearest of the states; the lens includes a focusing lens and the parameter is the position of the focusing lens; the first state, the second state and the third state correspond respectively to a first position, a second position and a third position of the focusing lens, the first position and the third position being adjacent to the second position; and the first state, the second state and the third state correspond respectively to a first value, a second value and a third value among the values.
3. The automatic focusing method of claim 1, characterized in that the states include a first state, a second state and a third state in which the imaging on the photosensitive assembly through the lens is relatively clear, the second state being the clearest of the states; the parameter is the position of the photosensitive assembly; the first state, the second state and the third state correspond respectively to a first position, a second position and a third position of the photosensitive assembly, the first position and the third position being adjacent to the second position; and the first state, the second state and the third state correspond respectively to a first value, a second value and a third value among the values.
4. The automatic focusing method of claim 2 or 3, characterized by further comprising performing the operation on the values with a mathematical function to respectively obtain the operation values.
5. The automatic focusing method of claim 4, characterized in that the mathematical function includes a logarithmic function and the target relative position is estimated with the following equation:

P21 = (P1 + P2)/2 + (P2 − P1)(logM2 − logM3)/(2logM2 − logM1 − logM3)

where P1 represents the first position, P2 represents the second position, P3 represents the third position, M1 represents the first value, M2 represents the second value, M3 represents the third value, and P21 represents the target relative position.
6. The automatic focusing method of claim 1, characterized in that the second spacing amount is smaller than the first spacing amount.
7. An image capture device, characterized by comprising:
a photosensitive assembly that converts received light into an electric signal;
a lens through which light is imaged on the photosensitive assembly; and
a processing unit for changing a parameter according to a first spacing amount to adjust the relative position of the lens and the photosensitive assembly and obtaining a plurality of values, the values corresponding to a plurality of states of imaging on the photosensitive assembly through the lens; performing an operation on the values to obtain a plurality of operation values; estimating, based on the operation values, a change of the parameter according to a second spacing amount to obtain a target relative position of the lens and the photosensitive assembly, the target relative position corresponding to a target state of imaging on the photosensitive assembly through the lens; and changing the parameter according to the second spacing amount to adjust the lens or the photosensitive assembly to the target relative position to obtain the target state.
8. The image capture device of claim 7, characterized in that the states include a first state, a second state and a third state in which the imaging on the photosensitive assembly through the lens is relatively clear, the second state being the clearest of the states; the lens includes a focusing lens and the parameter is the position of the focusing lens; the first state, the second state and the third state correspond respectively to a first position, a second position and a third position of the focusing lens, the first position and the third position being adjacent to the second position; and the first state, the second state and the third state correspond respectively to a first value, a second value and a third value among the values.
9. The image capture device of claim 7, characterized in that the states include a first state, a second state and a third state in which the imaging on the photosensitive assembly through the lens is relatively clear, the second state being the clearest of the states; the parameter is the position of the photosensitive assembly; the first state, the second state and the third state correspond respectively to a first position, a second position and a third position of the photosensitive assembly, the first position and the third position being adjacent to the second position; and the first state, the second state and the third state correspond respectively to a first value, a second value and a third value among the values.
10. The image capture device of claim 8 or 9, characterized in that the processing unit performs the operation on the values with a mathematical function to respectively obtain the operation values.
11. The image capture device of claim 10, characterized in that the mathematical function includes a logarithmic function and the target relative position is estimated by the processing unit with the following equation:

P21 = (P1 + P2)/2 + (P2 − P1)(logM2 − logM3)/(2logM2 − logM1 − logM3)

where P1 represents the first position, P2 represents the second position, P3 represents the third position, M1 represents the first value, M2 represents the second value, M3 represents the third value, and P21 represents the target relative position.
12. The image capture device of claim 7, characterized in that the second spacing amount is smaller than the first spacing amount.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510206247.3A CN106154689A (en) | 2015-04-28 | 2015-04-28 | Automatic focusing method and image capture device using the same |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106154689A true CN106154689A (en) | 2016-11-23 |
Family
ID=57346779
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510206247.3A Pending CN106154689A (en) | 2015-04-28 | 2015-04-28 | Atomatic focusing method and use the image capture unit of this Atomatic focusing method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106154689A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108513709A (en) * | 2017-08-25 | 2018-09-07 | 深圳市大疆创新科技有限公司 | Manual focus householder method, device and unmanned vehicle |
CN108668118A (en) * | 2017-03-31 | 2018-10-16 | 中强光电股份有限公司 | Autofocus system, the projector with autofocus system and Atomatic focusing method |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1811576A (en) * | 2005-12-19 | 2006-08-02 | 佛山普立华科技有限公司 | Automatic focusing method for digit imaging system |
CN1949074A (en) * | 2005-10-13 | 2007-04-18 | 致茂电子股份有限公司 | Automatically focusing method and application thereof |
US20090009653A1 (en) * | 2007-07-06 | 2009-01-08 | Hon Hai Precision Industry Co., Ltd. | Auto focus system and auto focus method for use in video cameras |
CN100559255C (en) * | 2006-03-15 | 2009-11-11 | 亚洲光学股份有限公司 | The image capture unit of automatic focusing method and use said method |
CN103048766A (en) * | 2013-01-24 | 2013-04-17 | 华为技术有限公司 | Automatic focusing method and automatic focusing device |
Non-Patent Citations (1)
Title |
---|
小仓磐夫 (Iwao Ogura), *Modern Cameras and Photographic Lenses* (《现代照相机和照相物镜技术》), 31 December 1989 * |
Similar Documents
Publication | Title |
---|---|
US10178321B2 (en) | Machine vision inspection system and method for obtaining an image with an extended depth of field | |
CN109451244B (en) | Automatic focusing method and system based on liquid lens | |
KR101686926B1 (en) | Image blurring method and apparatus, and electronic device | |
US9383199B2 (en) | Imaging apparatus | |
CN109521547B (en) | Variable-step-length automatic focusing method and system | |
US8988592B2 (en) | Image processing apparatus, image processing method, image processing program, and image pickup apparatus acquiring a focusing distance from a plurality of images | |
US8049811B2 (en) | Automatic focusing apparatus and method for digital images using automatic filter switching | |
EP2378760A2 (en) | Four-dimensional polynomial model for depth estimation based on two-picture matching | |
WO2011158498A1 (en) | Image capture device and image capture method | |
WO2018196303A1 (en) | Projector calibration method and apparatus based on multi-directional projection | |
CN105044879B (en) | Automatic focusing system using multiple lenses and method thereof | |
CN109085113B (en) | Automatic focusing method and device for cervical exfoliated cell detection device | |
CN106991650A (en) | A kind of method and apparatus of image deblurring | |
CN101852970B (en) | Automatic focusing method for camera under imaging viewing field scanning state | |
CN103973957A (en) | Binocular 3D camera automatic focusing system and method | |
US11512946B2 (en) | Method and system for automatic focusing for high-resolution structured light 3D imaging | |
JP2021196951A (en) | Image processing apparatus, image processing method, program, method for manufacturing learned model, and image processing system | |
CN106772926A (en) | A kind of automatic focusing method | |
US20150279043A1 (en) | Imaging system with depth estimation mechanism and method of operation thereof | |
JP2020174331A (en) | Image capturing apparatus, image processing apparatus, control method, and program | |
CN106888344A (en) | Camera module and its inclined acquisition methods of image planes and method of adjustment | |
CN106154689A (en) | Atomatic focusing method and use the image capture unit of this Atomatic focusing method | |
JP2012147281A (en) | Image processing apparatus | |
TWI489164B (en) | Method for adjusting focusing point with a 3d object and system thereof | |
JP7191588B2 (en) | Image processing method, image processing device, imaging device, lens device, program, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20161123 |