WO2015010623A1 - Image auto-focusing method and camera using same - Google Patents

Image auto-focusing method and camera using same

Info

Publication number
WO2015010623A1
WO2015010623A1 (PCT/CN2014/082836, CN2014082836W)
Authority
WO
WIPO (PCT)
Prior art keywords
region
focus evaluation
evaluation value
image sub
light source
Prior art date
Application number
PCT/CN2014/082836
Other languages
English (en)
French (fr)
Inventor
陈芳
陈庆议
Original Assignee
浙江宇视科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 浙江宇视科技有限公司
Priority to US14/905,276 (published as US9706104B2)
Publication of WO2015010623A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/71Circuitry for evaluating the brightness variation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source

Definitions

  • TECHNICAL FIELD The present invention relates to the field of video surveillance cameras, and more particularly to a method for calculating a focus evaluation value of an image containing a moving light source and to a camera to which the method is applied. BACKGROUND
  • Autofocus is an important factor in determining image quality and is the first step in obtaining a clear image. Focusing performance depends on the accuracy and effectiveness of the focus evaluation function.
  • A good evaluation function is characterized by good unbiasedness, strong unimodality and good noise immunity.
  • The essence of image blur is the loss of high-frequency components.
  • A focused image contains more information and detail than a defocused image, which is the basis for designing the focus evaluation function.
  • The choice of the focus evaluation function is an important basis for an integrated camera to achieve focus and obtain high-quality images.
  • As shown in Fig. 1, the figure shows a focus evaluation curve (also called a sharpness evaluation curve) at night in the presence of a dynamic light source.
  • The abscissa indicates the relative value of the focus position, and the ordinate indicates the image focus evaluation value (also called the sharpness evaluation value) calculated with the focus evaluation function.
  • The curve shows two peaks. According to the curve, the camera selects the position corresponding to peak 1 as the final focus position to complete focusing, because that position has the highest sharpness evaluation value. However, verification against the actual images shows that the position corresponding to peak 1 is not the position at which the image is clearest; the clearest position is actually position 2 indicated in Fig. 1.
  • The focus evaluation function fails in this example mainly because the dynamic light source in the scene switches from on to off during the autofocus process, so the brightness of the local picture area corresponding to position 2 is lower than that of the corresponding local area of the picture at the peak 1 position. As a result, the focus evaluation value of the picture at position 2 is smaller than that of the picture at the peak 1 position, causing the autofocus algorithm to fail.
  • Figure 1 is a graph of the sharpness evaluation under the dynamic light source scene.
  • Figure 2 is a structural diagram of a camera in an example.
  • Figure 3 is a flow chart of an image autofocus method in an example.
  • Figure 4 is a flow chart of an image autofocus method in another example of the present invention.
  • Figure 5 is a diagram showing example image sub-areas in an example of the present invention.
  • Figure 6 is a comparison of the sharpness evaluation curve obtained with the technique of the present invention and the existing evaluation curve.
  • DETAILED DESCRIPTION OF THE INVENTION To solve the problems mentioned in the prior art, the present invention optimizes the existing autofocus method; the optimized autofocus method can eliminate the influence of a moving light source on the autofocus algorithm, so that the final focus evaluation curve has a strong single peak.
  • In one example, referring to Fig. 2 and Fig. 3, the present invention provides a camera 20 that includes a processor 21, a memory 22, a non-volatile memory 23 and other hardware (such as sensors), which are connected through an internal bus 25.
  • The processor 21 reads the image autofocus logic from the non-volatile memory 23 into the memory 22 and runs it, thereby performing the following steps:
  • Step 31: Divide the image area into a plurality of image sub-areas;
  • Step 32: Determine an image sub-area in which a moving light source exists;
  • Step 33: Correct the focus evaluation value of the image sub-area in which the moving light source exists by using the focus evaluation values of the image sub-areas adjacent to that image sub-area;
  • Step 34: Calculate the focus evaluation value of the entire image according to the focus evaluation value of each image sub-area. (An illustrative sketch of these steps follows.)
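  • As a purely illustrative reading of steps 31 and 34 (a sketch under assumptions, not the patented implementation), the Python fragment below divides a grayscale frame into a grid of sub-areas and computes a per-sub-area focus evaluation value with a squared-gradient measure; the 5x8 grid mirrors Fig. 5, and the choice of evaluation function and the helper names (`squared_gradient`, `sub_area_focus_values`) are assumptions. In step 34 the whole-image value can then be derived from these per-sub-area values (for example by summing them), after the values of the sub-areas flagged in steps 32 and 33 have been corrected.

```python
import numpy as np

def squared_gradient(block):
    # One possible focus evaluation function (squared gradient):
    # sum of squared horizontal and vertical intensity differences.
    b = block.astype(np.float64)
    gx = np.diff(b, axis=1)
    gy = np.diff(b, axis=0)
    return float((gx ** 2).sum() + (gy ** 2).sum())

def sub_area_focus_values(gray, rows=5, cols=8):
    # Step 31: divide the image area into rows x cols sub-areas and
    # evaluate each one; returns a rows x cols array of focus values.
    h, w = gray.shape
    fv = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            block = gray[i * h // rows:(i + 1) * h // rows,
                         j * w // cols:(j + 1) * w // cols]
            fv[i, j] = squared_gradient(block)
    return fv
```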
  • Referring to FIG. 4, there is shown a flow chart of an autofocus method in yet another example of the present invention.
  • This autofocus method can be applied to scenes with moving light sources.
  • Step 41: Divide the image area into a plurality of image sub-areas.
  • Step 42: Calculate the current focus evaluation value FV_cur of each image sub-area by using a focus evaluation function, and calculate the focus evaluation value change rate δ of each image sub-area according to the focus evaluation value FV_pre of the previous frame of the image sub-area and the current focus evaluation value FV_cur, where the focus evaluation value change rate δ = (FV_cur - FV_pre) / FV_pre.
  • The focus evaluation function here can be any of the existing focus evaluation functions, such as the Laplace function, the Brenner function, the Tenengrad function, the Robert function and the squared gradient function. In this step, the focus evaluation value change rate of each image sub-area needs to be calculated.
  • The focus evaluation value change rate is the rate of change of the focus evaluation value of the current frame of the sub-area with respect to the focus evaluation value of the previous frame.
  • The focus evaluation value change rate provides a basis for evaluating whether a moving light source exists in the image sub-area.
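  • A minimal sketch of this change-rate computation, assuming `fv_cur` and `fv_pre` are arrays of per-sub-area focus evaluation values for the current and previous frame (for example produced by a helper such as the `sub_area_focus_values` sketch above):

```python
import numpy as np

def change_rate(fv_cur, fv_pre, eps=1e-9):
    # delta = (FV_cur - FV_pre) / FV_pre for every sub-area.
    # eps guards against a zero previous value; this guard is an assumed
    # implementation detail, not something taken from the text.
    fv_cur = np.asarray(fv_cur, dtype=np.float64)
    fv_pre = np.asarray(fv_pre, dtype=np.float64)
    return (fv_cur - fv_pre) / (fv_pre + eps)
```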
  • Step 43: Determine whether a moving light source exists in each image sub-area according to its focus evaluation value change rate δ. If yes, go to step 44; otherwise, go to step 45.
  • Method 1: If the absolute value of the focus evaluation value change rate δ of an image sub-area is greater than a preset threshold Threshold1, it is considered that a moving light source exists in the image sub-area;
  • Method 2: If the absolute value of the focus evaluation value change rate δ of an image sub-area is greater than a preset threshold Threshold2, and the change trend of the focus evaluation value of the sub-area is opposite to the change trend of the focus evaluation values of the other surrounding areas, it is considered that a moving light source exists in the image sub-area.
  • Here, Threshold1 is greater than Threshold2. Threshold1 and Threshold2 can be set empirically. (A sketch of both decision rules follows.)
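  • The two decision rules might be sketched as follows. `THRESHOLD1` and `THRESHOLD2` are empirical placeholders (the text only requires Threshold1 > Threshold2; the value 3 appears in the Fig. 5 example), and the "opposite trend" of Method 2 is interpreted here as the sign of the sub-area's change rate differing from the sign of the mean change rate of its neighbours, which is one possible reading rather than a definition given by the text.

```python
THRESHOLD1 = 3.0   # assumed empirical value (the Fig. 5 example uses 3)
THRESHOLD2 = 1.5   # assumed empirical value, smaller than THRESHOLD1

def neighbours(i, j, rows, cols):
    # 4-connected neighbouring sub-areas, clipped at the image border.
    cand = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for a, b in cand if 0 <= a < rows and 0 <= b < cols]

def has_moving_light_source(i, j, delta):
    # delta is the rows x cols numpy array of focus evaluation value change rates.
    rows, cols = delta.shape
    d = delta[i, j]
    # Method 1: the change rate alone is large enough.
    if abs(d) > THRESHOLD1:
        return True
    # Method 2: a smaller change rate whose trend is opposite to the
    # surrounding sub-areas (opposite sign of the neighbours' mean rate).
    neigh = neighbours(i, j, rows, cols)
    neigh_mean = sum(delta[a, b] for a, b in neigh) / len(neigh)
    return abs(d) > THRESHOLD2 and d * neigh_mean < 0
```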
  • If it is determined that a moving light source exists in an image sub-area, the current focus evaluation value of that sub-area calculated in step 42 needs to be corrected; otherwise, the focus evaluation value calculated in step 42 is still used as the current focus evaluation value of the image sub-area.
  • Step 44: Correct, in descending order of the focus evaluation value change rate, the current focus evaluation values of the image sub-areas in which a moving light source exists. The corrected focus evaluation value is FV_cur = FV_pre * (1 + δ'), where δ' is the mean of the focus evaluation value change rate of the corrected sub-area and the current focus evaluation value change rates of its adjacent sub-areas. The current focus evaluation value change rates of the adjacent sub-areas can be calculated in the same way as that of the corrected sub-area.
  • The method of correcting the focus evaluation value of an image sub-area containing a moving light source uses the current focus evaluation value change rates of the sub-areas adjacent to that area, and thus eliminates the influence of the dynamic light source on the calculation of the focus evaluation value.
  • However, when moving light sources exist in several adjacent image sub-areas, a sub-area processed later is affected by the corrected focus evaluation value change rate of a sub-area processed earlier.
  • To prevent a sub-area with a small focus evaluation value change rate from being affected to a greater extent by a sub-area with a large change rate, the image sub-areas in which a moving light source exists are first sorted according to the focus evaluation value change rate, and the sorted image sub-areas are then corrected in turn.
  • The sorting rule is as follows: image sub-areas with a larger focus evaluation value change rate come first, and those with a smaller change rate come later. (A sketch of this correction pass follows.)
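  • A sketch of this correction order and of the step 44 formula, under the same assumed data layout as above (`delta` holds the per-sub-area change rates, `fv_pre` the previous-frame values, `flagged` the sub-areas detected as containing a moving light source). The corrected sub-area's change rate is also updated here so that sub-areas corrected later see it, which is the reason processing runs in descending order of change rate; this update step is an inference from the text, not an explicit formula in it.

```python
def correct_flagged_sub_areas(fv_cur, fv_pre, delta, flagged):
    # fv_cur, fv_pre, delta: rows x cols numpy arrays; flagged: list of (i, j).
    rows, cols = delta.shape

    def neighbours(i, j):
        cand = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
        return [(a, b) for a, b in cand if 0 <= a < rows and 0 <= b < cols]

    # Step 44: process flagged sub-areas in descending order of change rate.
    for (i, j) in sorted(flagged, key=lambda p: delta[p], reverse=True):
        neigh = neighbours(i, j)
        # delta' = mean of this sub-area's change rate and its neighbours' rates.
        d_prime = (delta[i, j] + sum(delta[a, b] for a, b in neigh)) / (len(neigh) + 1)
        fv_cur[i, j] = fv_pre[i, j] * (1.0 + d_prime)   # FV_cur = FV_pre * (1 + delta')
        delta[i, j] = d_prime   # later corrections then use the corrected rate
    return fv_cur
```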
  • Step 45: Calculate the focus evaluation value of the entire image according to the current focus evaluation value of each image sub-area.
  • Step 46: Determine whether focusing has finished. If yes, the process ends; otherwise, return to step 42. If the focus evaluation value of an image sub-area has been corrected, the corrected focus evaluation value is used as the current focus evaluation value of that image sub-area when calculating the focus evaluation value of the entire image. How to calculate the focus evaluation value of the entire image from the focus evaluation values of the image sub-areas, and how to judge whether focusing has finished, belong to the prior art and are not described again.
  • A specific embodiment of the present invention is described with reference to the specific example of Fig. 5. First, the image area is divided into 5*8 image sub-areas. These image sub-areas are denoted (i, j).
  • The sub-area in the first row and first column is denoted image sub-area (1, 1), the sub-area in the first row and second column is denoted image sub-area (1, 2), and so on.
  • The current focus evaluation value of each image sub-area (i, j) is calculated with the focus evaluation function, and then the focus evaluation value change rate of the image sub-area (i, j) is calculated.
  • Taking the image sub-area (3, 5) as an example, its current focus evaluation value FV(3,5)_cur is calculated first; suppose the calculated FV(3,5)_cur is 60. The focus evaluation value change rate of the sub-area (3, 5) is then calculated as δ35 = (FV(3,5)_cur - FV(3,5)_pre) / FV(3,5)_pre. Assuming that the focus evaluation value FV(3,5)_pre of the previous frame of this sub-area is 10, the value of δ35 is 5. For the 40 image sub-areas of Fig. 5, 40 focus evaluation values and 40 focus evaluation value change rates are obtained.
  • For the focus evaluation value change rate of each image sub-area, it is judged according to a predetermined rule whether a moving light source exists in that area. For example, suppose the criterion for the presence of a moving light source is that the absolute value of the focus evaluation value change rate is greater than 3; then the δ35 calculated above determines that a moving light source exists in the image sub-area (3, 5). The same method is used to determine whether moving light sources exist in the other image sub-areas of Fig. 5. Suppose it is determined that only the image sub-area (3, 5) contains a moving light source and the remaining sub-areas do not.
  • In this case, the current focus evaluation value of the image sub-area (3, 5) needs to be corrected. Before correcting it, the mean δ35' of the focus evaluation value change rate of the sub-area (3, 5) and the current focus evaluation value change rates of its adjacent sub-areas is calculated. The adjacent sub-areas of the image sub-area (3, 5) may be the four image sub-areas (3, 4), (3, 6), (2, 5) and (4, 5), so δ35' = (δ34 + δ36 + δ25 + δ45 + δ35) / 5,
  • where δ34, δ36, δ25 and δ45 are the focus evaluation value change rates of the image sub-areas (3, 4), (3, 6), (2, 5) and (4, 5), respectively. If the calculated values of δ34, δ36, δ25 and δ45 are 1, 1, 1 and 2, respectively, then δ35' = (1 + 1 + 1 + 2 + 5) / 5 = 2. The calculated δ35' is used to correct the current focus evaluation value of the image sub-area (3, 5): FV(3,5)_cur = FV(3,5)_pre * (1 + δ35') = 10 * (1 + 2) = 30, so the current focus evaluation value of the image sub-area (3, 5) is corrected from 60 to 30.
  • If, in addition to the image sub-area (3, 5), a moving light source also exists in the image sub-area (5, 2), the current focus evaluation value of the image sub-area (5, 2) can be corrected in the same way; the adjacent sub-areas of the image sub-area (5, 2) may be the three sub-areas (5, 1), (5, 3) and (4, 2).
  • However, if a moving light source also exists in the image sub-area (3, 6), then correcting the focus evaluation value of the image sub-area (3, 5) first or correcting that of the image sub-area (3, 6) first will produce different correction results.
  • To ensure that the trend of the corrected focus evaluation values is consistent with the actual situation (for example, if the focus evaluation value of the image sub-area (3, 5) before correction is 60 and the focus evaluation value of the image sub-area (3, 6) before correction is 100, the corrected focus evaluation value of the image sub-area (3, 6) should still be larger than that of the corrected image sub-area (3, 5)), when moving light sources exist in multiple image sub-areas the current focus evaluation values of those image sub-areas are corrected in descending order of the focus evaluation value change rate.
  • For example, suppose that after the calculation in step 42 the focus evaluation value change rate δ35 of the image sub-area (3, 5) is 5 and the focus evaluation value change rate δ36 of the image sub-area (3, 6) is 8; then, according to the method of step 44, the focus evaluation value of the image sub-area (3, 6) is corrected first, and the focus evaluation value of the image sub-area (3, 5) is corrected afterwards.
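  • As a quick numerical check of this worked example (all numbers are the hypothetical values given in the text; the (row, column) labels above are 1-based, so sub-area (3, 5) maps to index [2, 4]), the following snippet reproduces the correction of the sub-area (3, 5) from 60 to 30:

```python
import numpy as np

rows, cols = 5, 8
fv_pre = np.full((rows, cols), 10.0)   # only FV(3,5)_pre = 10 matters here
delta = np.ones((rows, cols))          # a change rate of 1 assumed elsewhere
delta[2, 4] = 5.0                      # sub-area (3,5): (60 - 10) / 10 = 5
delta[2, 3] = 1.0                      # delta_34
delta[2, 5] = 1.0                      # delta_36
delta[1, 4] = 1.0                      # delta_25
delta[3, 4] = 2.0                      # delta_45

i, j = 2, 4
neigh = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
d_prime = (delta[i, j] + sum(delta[a, b] for a, b in neigh)) / 5.0  # (5+1+1+1+2)/5 = 2
fv_corrected = fv_pre[i, j] * (1.0 + d_prime)                       # 10 * (1 + 2) = 30
print(d_prime, fv_corrected)  # 2.0 30.0
```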
  • Referring to Fig. 6, this figure is the experimentally obtained focus evaluation curve for the same scene as Fig. 1, using the method of the present invention. Comparing the curve of Fig. 6 with that of Fig. 1 shows that the corrected curve is significantly improved and satisfies unimodality and unbiasedness.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

A focus evaluation value calculation method applied to a scene with a moving light source: dividing an image area into a number of image sub-areas; determining an image sub-area in which a moving light source exists; correcting the focus evaluation value of the image sub-area in which the moving light source exists by using the focus evaluation values of the image sub-areas adjacent to that image sub-area; and calculating the focus evaluation value of the entire image according to the focus evaluation value of each image sub-area. Auto-focusing in a scene with a moving light source is thereby achieved, with very satisfactory results in practice.

Description

Image auto-focusing method and camera using same
TECHNICAL FIELD The present invention relates to the field of video surveillance cameras, and in particular to a method for calculating a focus evaluation value of an image containing a moving light source and to a camera to which the method is applied. BACKGROUND
Autofocus is an important factor in determining image quality and is the first step in obtaining a clear image. Focusing performance depends on the accuracy and effectiveness of the focus evaluation function. A good evaluation function is characterized by good unbiasedness, strong unimodality and good noise immunity. The essence of image blur is the loss of high-frequency components: a focused image contains more information and detail than a defocused image, which is the basis for designing a focus evaluation function. The choice of the focus evaluation function is therefore an important basis for an integrated camera to achieve focus and obtain high-quality images.
As shown in Fig. 1, the figure shows a focus evaluation curve (also called a sharpness evaluation curve) at night in the presence of a dynamic light source. The abscissa indicates the relative value of the focus position, and the ordinate indicates the image focus evaluation value (also called the sharpness evaluation value) calculated with the focus evaluation function. The curve shows two peaks. According to this curve, the camera selects the position corresponding to peak 1 as the final focus position to complete focusing, because that position has the highest sharpness evaluation value. However, verification against the actual images shows that the position corresponding to peak 1 is not the position at which the image is clearest; the clearest position is actually position 2 marked in Fig. 1. The focus evaluation function fails in this example mainly because, during autofocus, the dynamic light source in the scene switched from on to off, so the brightness of the local picture area corresponding to position 2 is lower than that of the corresponding local area of the picture at peak 1. As a result, the focus evaluation value of the picture at position 2 is smaller than that of the picture at peak 1, causing the autofocus algorithm to fail. BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a sharpness evaluation curve in a dynamic light source scene.
Fig. 2 is a structural diagram of a camera in an example.
Fig. 3 is a flow chart of an image auto-focusing method in an example.
Fig. 4 is a flow chart of an image auto-focusing method in another example of the present invention.
Fig. 5 is a diagram showing example image sub-areas in an example of the present invention.
Fig. 6 is a comparison of the sharpness evaluation curve obtained with the technique of the present invention and the existing evaluation curve. DETAILED DESCRIPTION To solve the problems mentioned in the prior art, the present invention optimizes the existing auto-focusing method; the optimized auto-focusing method can eliminate the influence of a moving light source on the autofocus algorithm, so that the final focus evaluation curve has a strong single peak. In one example, referring to Fig. 2 and Fig. 3, the present invention provides a camera 20 that includes a processor 21, a memory 22, a non-volatile memory 23 and other hardware (such as sensors), which are connected through an internal bus 25. The processor 21 reads the image auto-focusing logic from the non-volatile memory 23 into the memory 22 and runs it, thereby performing the following steps:
Step 31: divide the image area into a number of image sub-areas;
Step 32: determine an image sub-area in which a moving light source exists;
Step 33: correct the focus evaluation value of the image sub-area in which the moving light source exists by using the focus evaluation values of the image sub-areas adjacent to that image sub-area;
Step 34: calculate the focus evaluation value of the entire image according to the focus evaluation value of each image sub-area. The above processing flow is explained below through specific examples.
Referring to Fig. 4, this figure is a flow chart of an auto-focusing method in yet another example of the present invention. The auto-focusing method is applicable to scenes with a moving light source.
Step 41: divide the image area into a number of image sub-areas.
Step 42: calculate the current focus evaluation value FV_cur of each image sub-area with a focus evaluation function, and calculate the focus evaluation value change rate δ of each image sub-area according to the focus evaluation value FV_pre of the previous frame of the image sub-area and the current focus evaluation value FV_cur, where the focus evaluation value change rate δ = (FV_cur - FV_pre) / FV_pre.
The focus evaluation function here can be any of the existing focus evaluation functions, such as the Laplace function, the Brenner function, the Tenengrad function, the Robert function and the squared gradient function. In this step, the focus evaluation value change rate of each image sub-area needs to be calculated. The focus evaluation value change rate is the rate of change of the focus evaluation value of the current frame of the sub-area with respect to the focus evaluation value of the previous frame. This change rate provides a basis for evaluating whether a moving light source exists in the image sub-area.
Step 43: determine, according to the focus evaluation value change rate δ of each image sub-area, whether a moving light source exists in the image sub-area; if yes, go to step 44, otherwise go to step 45. Method 1: if the absolute value of the focus evaluation value change rate δ of an image sub-area is greater than a preset threshold Threshold1, it is considered that a moving light source exists in the image sub-area;
Method 2: if the absolute value of the focus evaluation value change rate δ of an image sub-area is greater than a preset threshold Threshold2, and the change trend of the focus evaluation value of the sub-area is opposite to the change trend of the focus evaluation values of the other surrounding areas, it is considered that a moving light source exists in the image sub-area. Here,
Threshold1 is greater than Threshold2. Threshold1 and Threshold2 can be set empirically.
If it is determined that a moving light source exists in an image sub-area, the current focus evaluation value of that area calculated in step 42 needs to be corrected; otherwise, the focus evaluation value calculated in step 42 is still used as the current focus evaluation value of the image sub-area.
Step 44: correct, in descending order of the focus evaluation value change rate, the current focus evaluation values of the image sub-areas in which a moving light source exists; the corrected focus evaluation value FV_cur = FV_pre * (1 + δ'), where δ' is the mean of the focus evaluation value change rate of the corrected area and the current focus evaluation value change rates of its adjacent sub-areas. The current focus evaluation value change rates of the adjacent sub-areas can be calculated in the same way as that of the corrected area.
The method of correcting the focus evaluation value of an image sub-area with a moving light source uses the current focus evaluation value change rates of the sub-areas adjacent to that area, and thus eliminates the influence of the moving light source on the calculation of the focus evaluation value.
However, when moving light sources exist in several adjacent image sub-areas, an area processed later is affected by the corrected focus evaluation value change rate of an area processed earlier. To prevent an area with a small focus evaluation value change rate from being affected to a greater extent by an area with a large change rate, the image sub-areas in which a moving light source exists are first sorted according to the focus evaluation value change rate, and the sorted image sub-areas are then corrected in turn. The sorting rule is: image sub-areas with a larger focus evaluation value change rate come first, and those with a smaller change rate come later.
Step 45: calculate the focus evaluation value of the entire image according to the current focus evaluation value of each image sub-area.
Step 46: determine whether focusing has finished; if yes, the flow ends, otherwise return to step 42. If the focus evaluation value of an image sub-area has been corrected, the corrected focus evaluation value is used as the current focus evaluation value of that image sub-area when calculating the focus evaluation value of the entire image. How to calculate the focus evaluation value of the entire image from the focus evaluation values of the image sub-areas, and how to judge whether focusing has finished, belong to the prior art and are not described again.
A specific embodiment of the present invention is described with reference to the specific example of Fig. 5.
First, the image area is divided into 5*8 image sub-areas. These image sub-areas are denoted (i, j); for example, the sub-area in the first row and first column is denoted image sub-area (1, 1), the sub-area in the first row and second column is denoted image sub-area (1, 2), and so on.
The current focus evaluation value of each image sub-area (i, j) is calculated with the focus evaluation function, and then the focus evaluation value change rate of the image sub-area (i, j) is calculated. Taking the image sub-area (3, 5) as an example, its current focus evaluation value FV(3,5)_cur is calculated first; suppose the calculated FV(3,5)_cur is 60. The focus evaluation value change rate of the sub-area (3, 5) is then calculated with the formula δ35 = (FV(3,5)_cur - FV(3,5)_pre) / FV(3,5)_pre. Assuming that the focus evaluation value FV(3,5)_pre of the previous frame of this sub-area is 10, the value of δ35 is 5. For the 40 image sub-areas of Fig. 5, 40 focus evaluation values and 40 focus evaluation value change rates are obtained.
For the focus evaluation value change rate of each image sub-area, whether a moving light source exists in that area is judged according to a predetermined rule. Suppose, for example, that the criterion for the presence of a moving light source is that the absolute value of the focus evaluation value change rate is greater than 3; then, from the δ35 calculated above, it is determined that a moving light source exists in the image sub-area (3, 5). The same method is used to determine whether moving light sources exist in the other image sub-areas of Fig. 5. Suppose it is determined that only the image sub-area (3, 5) contains a moving light source and the remaining sub-areas do not. In this case, the current focus evaluation value of the image sub-area (3, 5) needs to be corrected. Before correcting it, the mean δ35' of the focus evaluation value change rate of the sub-area (3, 5) and the current focus evaluation value change rates of its adjacent sub-areas is calculated. The adjacent sub-areas of the image sub-area (3, 5) may be the four image sub-areas (3, 4), (3, 6), (2, 5) and (4, 5), so δ35' = (δ34 + δ36 + δ25 + δ45 + δ35) / 5, where δ34, δ36, δ25 and δ45 are the focus evaluation value change rates of the image sub-areas (3, 4), (3, 6), (2, 5) and (4, 5), respectively. If the calculated values of δ34, δ36, δ25 and δ45 are 1, 1, 1 and 2, respectively, then according to the above formula δ35' = 2. The calculated δ35' is used to correct the current focus evaluation value of the image sub-area (3, 5): FV(3,5)_cur = FV(3,5)_pre * (1 + δ35') = 10 * (1 + 2) = 30. The current focus evaluation value of the image sub-area (3, 5) is therefore corrected from 60 to 30.
If, in addition to the image sub-area (3, 5), a moving light source also exists in the image sub-area (5, 2), the current focus evaluation value of the image sub-area (5, 2) can be corrected in the same way; the adjacent sub-areas of the image sub-area (5, 2) may be the three sub-areas (5, 1), (5, 3) and (4, 2).
However, if a moving light source also exists in the image sub-area (3, 6), then correcting the focus evaluation value of the image sub-area (3, 5) first or correcting that of (3, 6) first will produce different correction results. To ensure that the trend of the corrected focus evaluation values is consistent with the actual situation (for example, if the focus evaluation value of the image sub-area (3, 5) before correction is 60 and that of the image sub-area (3, 6) before correction is 100, the corrected focus evaluation value of the image sub-area (3, 6) should still be larger than that of the corrected image sub-area (3, 5)), when moving light sources exist in multiple image sub-areas the current focus evaluation values of the image sub-areas containing moving light sources are corrected in descending order of the focus evaluation value change rate. For example, after the calculation in step 42, the focus evaluation value change rate δ35 of the image sub-area (3, 5) is 5 and the focus evaluation value change rate δ36 of the image sub-area (3, 6) is 8; then, according to the method of step 44, the focus evaluation value of the image sub-area (3, 6) is corrected first, and the focus evaluation value of the image sub-area (3, 5) is corrected afterwards.
Referring to Fig. 6, this figure is the experimentally obtained focus evaluation curve for the same scene as Fig. 1, using the method of the present invention. Comparing the curve of Fig. 6 with that of Fig. 1 shows that the corrected curve is significantly improved and satisfies unimodality and unbiasedness.
The above are only preferred embodiments of the present invention and are not intended to limit the present invention. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall fall within the scope of protection of the present invention.

Claims

Claims
1. A method for calculating a focus evaluation value of an image with a moving light source, characterized in that the method comprises the following steps:
dividing an image area into a number of image sub-areas;
determining an image sub-area in which a moving light source exists;
correcting the focus evaluation value of the image sub-area in which the moving light source exists by using the focus evaluation values of the image sub-areas adjacent to that image sub-area; and
calculating the focus evaluation value of the entire image according to the focus evaluation value of each image sub-area.
2. The method according to claim 1, characterized in that the step of determining the image sub-area in which a moving light source exists comprises:
calculating the focus evaluation value change rate δ of each image sub-area, where δ = (FV_cur - FV_pre) / FV_pre, FV_cur is the current focus evaluation value of each image sub-area calculated with the focus evaluation function, and FV_pre is the focus evaluation value of the previous frame of each image sub-area calculated with the focus evaluation function; and
determining, according to the focus evaluation value change rate δ of each image sub-area, whether a moving light source exists in the image sub-area.
3. The method according to claim 2, characterized in that determining, according to the focus evaluation value change rate δ of each image sub-area, whether a moving light source exists in the image sub-area comprises:
if the absolute value of the focus evaluation value change rate δ of the image sub-area is greater than a first preset threshold, or if the absolute value of the focus evaluation value change rate δ of the image sub-area is greater than a second preset threshold and the change trend of the focus evaluation value of the image sub-area is opposite to that of the other surrounding areas, determining that a moving light source exists in the image sub-area.
4. The method according to claim 1, characterized in that correcting the focus evaluation value of the image sub-area in which the moving light source exists by using the focus evaluation values of the image sub-areas adjacent to that image sub-area comprises:
correcting the focus evaluation value of the image sub-area in which the moving light source exists according to the formula FV_cur = FV_pre * (1 + δ'), where FV_cur is the corrected focus evaluation value of the image sub-area in which the moving light source exists, FV_pre is the focus evaluation value of the previous frame of that image sub-area, and δ' is the mean of the focus evaluation value change rate of that image sub-area and the current focus evaluation value change rates of its adjacent sub-areas; the focus evaluation value change rate δ of an image sub-area is equal to the ratio of the difference between the current focus evaluation value of the sub-area and the focus evaluation value of the previous frame of that area to the current focus evaluation value of the sub-area.
5. The method according to claim 4, characterized in that, when there are multiple image sub-areas in which a moving light source exists, the current focus evaluation values of the image sub-areas in which a moving light source exists are corrected in descending order of the focus evaluation value change rate.
6. A camera, comprising a processor and a non-volatile memory storing a number of computer instructions, characterized in that, when the computer instructions are executed by the processor, the following processing is performed:
dividing an image area into a number of image sub-areas;
determining an image sub-area in which a moving light source exists;
correcting the focus evaluation value of the image sub-area in which the moving light source exists by using the focus evaluation values of the image sub-areas adjacent to that image sub-area; and
calculating the focus evaluation value of the entire image according to the focus evaluation value of each image sub-area.
7. The camera according to claim 6, characterized in that the step of determining the image sub-area in which a moving light source exists comprises:
calculating the focus evaluation value change rate δ of each image sub-area, where δ = (FV_cur - FV_pre) / FV_pre, FV_cur is the current focus evaluation value of each image sub-area calculated with the focus evaluation function, and FV_pre is the focus evaluation value of the previous frame of each image sub-area calculated with the focus evaluation function; and
determining, according to the focus evaluation value change rate δ of each image sub-area, whether a moving light source exists in the image sub-area.
8. The camera according to claim 7, characterized in that determining, according to the focus evaluation value change rate δ of each image sub-area, whether a moving light source exists in the image sub-area comprises:
if the absolute value of the focus evaluation value change rate δ of the image sub-area is greater than a first preset threshold, or if the absolute value of the focus evaluation value change rate δ of the image sub-area is greater than a second preset threshold and the change trend of the focus evaluation value of the image sub-area is opposite to that of the other surrounding areas, determining that a moving light source exists in the image sub-area.
9. The camera according to claim 6, characterized in that correcting the focus evaluation value of the image sub-area in which the moving light source exists by using the focus evaluation values of the image sub-areas adjacent to that image sub-area comprises:
correcting the focus evaluation value of the image sub-area in which the moving light source exists according to the formula FV_cur = FV_pre * (1 + δ'), where FV_cur is the corrected focus evaluation value of the image sub-area in which the moving light source exists, FV_pre is the focus evaluation value of the previous frame of that image sub-area, and δ' is the mean of the focus evaluation value change rate of that image sub-area and the current focus evaluation value change rates of its adjacent sub-areas; the focus evaluation value change rate δ of an image sub-area is equal to the ratio of the difference between the current focus evaluation value of the sub-area and the focus evaluation value of the previous frame of that area to the current focus evaluation value of the sub-area.
10. The camera according to claim 9, characterized in that, when there are multiple image sub-areas in which a moving light source exists, the current focus evaluation values of the image sub-areas in which a moving light source exists are corrected in descending order of the focus evaluation value change rate.
PCT/CN2014/082836 2013-07-24 2014-07-23 Image auto-focusing method and camera using same WO2015010623A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/905,276 US9706104B2 (en) 2013-07-24 2014-07-23 Image auto-focusing method and camera using same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201310318378.1A CN103354599B (zh) 2013-07-24 2013-07-24 Auto-focusing method and device applied to a moving light source scene
CN201310318378.1 2013-07-24

Publications (1)

Publication Number Publication Date
WO2015010623A1 true WO2015010623A1 (zh) 2015-01-29

Family

ID=49310898

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/082836 WO2015010623A1 (zh) 2013-07-24 2014-07-23 Image auto-focusing method and camera using same

Country Status (3)

Country Link
US (1) US9706104B2 (zh)
CN (1) CN103354599B (zh)
WO (1) WO2015010623A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190316215A1 (en) * 2018-04-16 2019-10-17 Carl R. Nylen Portable Self-Contained Reverse Osmosis System

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103354599B (zh) 2013-07-24 2016-08-10 浙江宇视科技有限公司 Auto-focusing method and device applied to a moving light source scene
CN104079832B (zh) * 2014-06-30 2017-06-06 苏州科达科技股份有限公司 Automatic tracking and focusing method and system for an integrated camera
CN104853087B (zh) * 2015-03-16 2017-12-15 浙江宇视科技有限公司 Method for recognizing and focusing on a point light source scene
CN107578373A (zh) * 2017-05-27 2018-01-12 深圳先进技术研究院 Panoramic image stitching method, terminal device and computer-readable storage medium
CN107197152B (zh) * 2017-06-16 2020-01-14 Oppo广东移动通信有限公司 Focusing method and device, computer-readable storage medium and mobile terminal
EP3422068B1 (en) 2017-06-26 2019-05-01 Axis AB A method of focusing a camera
JP7263080B2 (ja) * 2019-03-29 2023-04-24 キヤノン株式会社 Imaging device and signal processing device
CN113109936B (zh) * 2021-04-08 2022-03-11 西南石油大学 Microscope auto-focusing method and device based on image sharpness evaluation

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010122301A (ja) * 2008-11-17 2010-06-03 Hitachi Ltd Focus control device and focus control method
CN101840055A (zh) * 2010-05-28 2010-09-22 浙江工业大学 Video auto-focusing system based on an embedded media processor
US20130016245A1 (en) * 2011-07-14 2013-01-17 Sanyo Electric Co., Ltd. Imaging apparatus
CN103095983A (zh) * 2011-10-31 2013-05-08 株式会社日立制作所 Image signal processing device
CN103354599A (zh) * 2013-07-24 2013-10-16 浙江宇视科技有限公司 Auto-focusing method and device applied to a moving light source scene

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR0138337B1 (ko) * 1994-07-28 1998-05-15 김광호 Focus control method and device
JP3302003B2 (ja) * 2000-03-08 2002-07-15 三洋電機株式会社 Imaging device having an autofocus function
TWI374664B (en) * 2007-12-05 2012-10-11 Quanta Comp Inc Focusing apparatus and method
US8525923B2 (en) * 2010-08-30 2013-09-03 Samsung Electronics Co., Ltd. Focusing method and apparatus, and recording medium for recording the method
TWI428654B (zh) * 2010-11-23 2014-03-01 Ind Tech Res Inst Auto-focusing module and method thereof
JP5780756B2 (ja) * 2010-12-24 2015-09-16 キヤノン株式会社 Focus adjustment device and method
WO2012133413A1 (ja) * 2011-03-31 2012-10-04 富士フイルム株式会社 Imaging device, method for controlling imaging device, and program
JP5883654B2 (ja) * 2012-01-06 2016-03-15 株式会社 日立産業制御ソリューションズ Image signal processing device
JP2013160919A (ja) * 2012-02-06 2013-08-19 Hitachi Ltd Image signal processing device and image signal processing method
KR20140091959A (ko) * 2013-01-14 2014-07-23 삼성전자주식회사 Focus state display device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010122301A (ja) * 2008-11-17 2010-06-03 Hitachi Ltd Focus control device and focus control method
CN101840055A (zh) * 2010-05-28 2010-09-22 浙江工业大学 Video auto-focusing system based on an embedded media processor
US20130016245A1 (en) * 2011-07-14 2013-01-17 Sanyo Electric Co., Ltd. Imaging apparatus
CN103095983A (zh) * 2011-10-31 2013-05-08 株式会社日立制作所 Image signal processing device
CN103354599A (zh) * 2013-07-24 2013-10-16 浙江宇视科技有限公司 Auto-focusing method and device applied to a moving light source scene

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190316215A1 (en) * 2018-04-16 2019-10-17 Carl R. Nylen Portable Self-Contained Reverse Osmosis System

Also Published As

Publication number Publication date
CN103354599B (zh) 2016-08-10
US9706104B2 (en) 2017-07-11
CN103354599A (zh) 2013-10-16
US20160165124A1 (en) 2016-06-09

Similar Documents

Publication Publication Date Title
WO2015010623A1 (zh) Image auto-focusing method and camera using same
US9172863B2 (en) Video signal processing apparatus
JP4725802B2 (ja) Photographing device, focusing method and focusing program
US9066002B2 (en) System and method for utilizing enhanced scene detection in a depth estimation procedure
CN107258077B (zh) 用于连续自动聚焦(caf)的系统和方法
JP5400171B2 (ja) Imaging device for recognition and control method thereof
TW201704836A (zh) Exposure control system and method thereof
US20170019582A1 (en) Focusing for point light source scene
US9208570B2 (en) System and method for performing depth estimation by utilizing an adaptive kernel
JP2013097082A (ja) Image signal processing device
US20180286020A1 (en) Image pickup apparatus, image processing apparatus, and control method of image pickup apparatus
JP2008276217A (ja) Device and method for automatic focus adjustment of an image sensor
US8736707B2 (en) System and method for utilizing scene detection in a depth estimation procedure
TWI394088B (zh) The adjustment method of the size of the selection of the image object
JP6087714B2 (ja) Imaging device and control method thereof
US9020280B2 (en) System and method for evaluating focus direction under various lighting conditions
KR102025361B1 (ko) Auto focus adjustment system and method
JP2002277725A (ja) Focus control method and imaging device
US10600201B2 (en) Method of determining focus lens position, control program for making computer execute the method, and imaging device
US9615020B2 (en) Auto focus method and apparatus using the same
JP2023071092A5 (zh)
US9001223B2 (en) Method and apparatus for applying camera shake compensation to video content
US11334966B2 (en) Image processing apparatus, image processing method, and non-transitory computer-readable storage medium
KR102163007B1 (ko) Method and device for optimizing prediction mode decision to improve compression speed of low-light video
KR101651624B1 (ko) Auto-focus adjustment device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14828972

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14905276

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14828972

Country of ref document: EP

Kind code of ref document: A1