TW202411711A - Electronic device and method for phase detection autofocus - Google Patents


Info

Publication number
TW202411711A
Authority
TW
Taiwan
Prior art keywords
phase difference
focus position
view
value
defocus
Application number
TW111133031A
Other languages
Chinese (zh)
Inventor
林銘達
Original Assignee
薩摩亞商偉光有限公司
Application filed by 薩摩亞商偉光有限公司
Publication of TW202411711A

Abstract

The present disclosure provides a method for phase detection autofocus and an electronic device using it. The method is applicable to an image capture device and includes the following steps: calculating a plurality of phase difference values and a plurality of defocus values of an in-focus picture of a lens of the image capture device, wherein each phase difference value corresponds to a defocus value; grouping the phase difference values into a foreground group and a background group; determining a foreground focus position from the defocus values corresponding to the phase difference values in the foreground group; determining a background focus position from the defocus values corresponding to the phase difference values in the background group; determining an ideal focus position based on a depth of field of the in-focus picture, the foreground focus position and the background focus position; and moving the lens to the ideal focus position.

Description

Electronic device and phase focusing method

The present disclosure relates to an electronic device, and more particularly to an electronic device having an image capture device, and to a phase focusing method.

With the popularity of smartphones, consumers take photos with their phones ever more often and expect ever-higher photo quality, and focusing is a crucial step in the photography process. Faster focusing and higher focusing accuracy therefore directly improve the user experience and product competitiveness.

To help users quickly capture clear images, the cameras on smartphones are generally equipped with an autofocus function that, as soon as the user activates the camera, actively detects objects within the camera's field of view and automatically moves the lens to focus on them. However, the autofocus function of existing smartphones can focus accurately and quickly only when a single object depth is present in the field of view. When multiple objects at different depths appear in the field of view, the autofocus function cannot determine which object is the main subject and is therefore prone to losing focus.

The above "prior art" description is provided only as background; it is not an admission that the above "prior art" discloses the subject matter of the present disclosure or constitutes prior art to it, and no part of the above "prior art" description should be taken as any part of the present application.

The present disclosure provides a phase focusing method applicable to an image capture device, including the following steps: calculating a plurality of phase difference values and a plurality of defocus values of an in-focus picture of the lens of the image capture device, wherein each phase difference value corresponds to a defocus value; performing a grouping operation on the phase difference values to obtain a near-view group and a far-view group; determining a near-view focus position from the defocus values corresponding to the phase difference values in the near-view group; determining a far-view focus position from the defocus values corresponding to the phase difference values in the far-view group; determining an ideal focus position of the lens of the image capture device based on a depth of field of the in-focus picture, the near-view focus position and the far-view focus position; and moving the lens of the image capture device to the ideal focus position.

The present disclosure provides an electronic device, which includes an image capture device and a processing module. The image capture device includes a lens, a lens control module and an image sensing module; the lens control module controls the movement of the lens, and the image sensing module generates an in-focus picture from the light entering the image capture device through the lens. The processing module is coupled to the lens control module and the image sensing module of the image capture device and is configured to: calculate a plurality of phase difference values and a plurality of defocus values of the in-focus picture of the lens; group the phase difference values to obtain a near-view group and a far-view group; determine a near-view focus position and a far-view focus position from the defocus values corresponding to the phase difference values grouped into the near-view group and the far-view group; and determine an ideal focus position of the lens based on a depth of field of the in-focus picture, the near-view focus position and the far-view focus position, and cause the lens control module to move the lens according to the ideal focus position.

The following describes various embodiments or examples of the present disclosure illustrated in the drawings in more specific terms. It should be understood that these descriptions are illustrative only and are not intended to limit the present disclosure. For those of ordinary skill in the art to which the present disclosure pertains, any changes or modifications to the described embodiments, as well as further applications of the principles described herein, are ordinary and customary variations. Reference numerals may be reused across embodiments, but this does not mean that one or more features of one embodiment necessarily appear in another embodiment, even if the same reference numerals are used.

The terminology used herein is for describing particular exemplary embodiments and is not intended to limit the inventive concepts of the present disclosure. In the present disclosure, the singular forms "a", "an" and "the" also include the plural forms unless the context clearly indicates otherwise. It should be further understood that, in this specification, open-ended terms such as "include" and "comprise" indicate the presence of the stated features, integers, steps, operations, elements or components, but do not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components or groups thereof.

FIG. 1 is a schematic rear view of an electronic device 10 according to the present disclosure, and FIG. 2 is a functional block diagram of the electronic device 10. Referring to FIG. 1 and FIG. 2, the electronic device 10 includes a housing 110, an image capture device 120 and a processing module 130; a lens 122 of the image capture device 120 is exposed from the housing 110, and the processing module 130 is installed in the housing 110 and controls the image capture device 120 to capture images. For example, the electronic device 10 may be any of various types of computer systems that are mobile or portable and perform communication. Specifically, the electronic device 10 may be a smartphone, but the image capture device 120 of the present invention is not limited to use as the rear camera of a smartphone. The image capture device 120 may, as needed, be applied to the front camera of a smartphone or to other devices that require a movable-lens focus; for example, it may be applied to a digital tablet, a digital camera or a smart wearable device. The electronic device 10 described above is merely an illustrative application of the present invention and does not limit the scope of application of the image capture device 120.

The image capture device 120 further includes an image sensing module 124 and a lens control module 126, both disposed in the housing 110 and electrically coupled to the processing module 130. When the image capture device 120 is used to capture a target scene that includes a light source and an object, the light reflected from the object and the light from the source pass through one or more optical lenses (not shown) in the lens 122 and form an image on the image sensing module 124. The photoelectric element (e.g., a photodiode) of each pixel in the image sensing module 124 photoelectrically converts the sensed light into a corresponding electrical signal, which is transmitted to the processing module 130.

In the embodiment of FIG. 3, some pixels 1242 in the image sensing module 124 may be symmetrically half-shielded in pairs by a light shielding portion 1244 to form phase detection pixels 1245A and 1245B, which are dedicated to phase detection. The paired phase detection pixels 1245A and 1245B may be separated by a certain distance, and one or more pixels 1242 not covered by a light shielding portion 1244 may lie between them; in other embodiments, however, the paired phase detection pixels 1245A and 1245B may be arranged adjacent to each other. In some embodiments, each pixel 1242 may receive light from the target scene through a microlens 1246.

For convenience of explanation, the phase detection pixel 1245A on the left side of FIG. 3 is hereinafter called the left phase detection pixel, and the phase detection pixel 1245B on the right side of FIG. 3 the right phase detection pixel. The light shielding portion 1244 shields light arriving from a specific direction. For example, the light shielding portion 1244 of the left phase detection pixel 1245A blocks light from reaching the left side of the pixel 1242, and the light shielding portion 1244 of the right phase detection pixel 1245B blocks light from reaching the right side of the pixel 1242; therefore, the left phase detection pixel 1245A can only receive light from the right, and the right phase detection pixel 1245B can only receive light from the left.

In FIG. 4A, the lens 122 is at the in-focus position F0, and the object 1250 is imaged exactly on the pixels 1242 of the image sensing module 124. The left light L and the right light R of the light reflected from the object 1250 therefore strike the same position after passing through the lens 122, so that the light intensity distribution curve IA sensed by the left phase detection pixel 1245A coincides with the light intensity distribution curve IB sensed by the right phase detection pixel 1245B.

In FIG. 4B, however, the lens 122 is too close to the object 1250, i.e., the lens position F1 is shifted too far forward relative to the in-focus position F0, so the peak of the light intensity distribution curve IA of the right-side light R sensed by the left phase detection pixel 1245A lies to the left of the peak of the light intensity distribution curve IB of the left-side light L sensed by the right phase detection pixel 1245B. In this case, taking rightward as positive, subtracting the peak coordinate of curve IB from the peak coordinate of curve IA yields a negative offset SF1.

Conversely, in FIG. 4C, the lens 122 is too far from the object 1250, i.e., the lens position F2 is shifted too far backward relative to the in-focus position F0, so the peak of the light intensity distribution curve IA of the right-side light R sensed by the left phase detection pixel 1245A lies to the right of the peak of the light intensity distribution curve IB of the left-side light L sensed by the right phase detection pixel 1245B. In this case, taking rightward as positive, subtracting the peak coordinate of curve IB from the peak coordinate of curve IA yields a positive offset SF2.

As can be observed in FIGs. 4A to 4C, when the distance between the lens 122 and the object 1250 changes, the positions of the peaks of the light intensity distribution curves IA and IB also change. From the offsets SF1 and SF2 between the two peaks, the current focus status of the lens 122 can therefore be determined, and both the distance of the lens 122 from the in-focus position F0 and the direction of the deviation, i.e., the defocus values DF1 and DF2, can be derived.
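As a minimal sketch, the sign of the offset between the two peak positions described above could be computed from sampled intensity curves as follows; the function name and the list-based curve representation are illustrative assumptions, not the patent's implementation.

```python
def peak_offset(curve_a, curve_b):
    """Signed offset between the peaks of the left-pixel curve (IA) and the
    right-pixel curve (IB), with rightward taken as positive: the peak index
    of IB is subtracted from the peak index of IA."""
    peak_a = max(range(len(curve_a)), key=curve_a.__getitem__)
    peak_b = max(range(len(curve_b)), key=curve_b.__getitem__)
    return peak_a - peak_b
```

A negative result corresponds to the front-focus case of FIG. 4B (peak of IA left of IB) and a positive result to the back-focus case of FIG. 4C.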

In this embodiment, the processing module 130 can determine whether the image capture device 120 is in the in-focus state (the state shown in FIG. 4A), the front-focus state (the state shown in FIG. 4B) or the back-focus state (the state shown in FIG. 4C) by comparing the phase difference between the light intensity distributions carried in the electrical signals output by the paired phase detection pixels 1245A and 1245B. The lens control module 126 may include a motor which, under the control of the processing module 130, moves the lens 122 along the optical axis to appropriately adjust the focus of the image captured by the image sensing module 124 (i.e., to realize the lens focusing function).

FIG. 5 is a flow chart of the phase focusing method of the present disclosure. Referring to FIG. 2 and FIG. 5, the phase focusing method 300 may include steps S302-S318 and may be applied to the electronic device 10. In some embodiments, the electronic device 10 enters a live preview mode (as shown in FIG. 6) after the user activates the camera function; the image of the target scene is then displayed on the screen 140 of the electronic device 10 so the user can preview the image content to be captured. At the same time, the image sensing module 124 may continuously capture the live in-focus picture 142 and convert it into corresponding electrical signals transmitted to the processing module 130 to execute the autofocus procedure.

During phase focusing, the processing module 130 calculates, from the electrical signals provided by the image sensing module 124, the phase difference value and the defocus value corresponding to each block 144 (shown in FIG. 7) of the in-focus picture 142 (step S302). In this embodiment, the number of paired phase detection pixels 1245A and 1245B is at least equal to the number of blocks 144, so that the processing module 130 can calculate a phase difference value and a defocus value for every block 144 from the electrical signals output by each pair of phase detection pixels 1245A and 1245B. The blocks 144 and their number shown in FIG. 7 are merely illustrative and are not intended to limit the present application.

In this embodiment, the phase difference value may be derived, for example, from the offsets SF1 and SF2 between the peaks of the light intensity distribution curves IA and IB provided by the left phase detection pixel 1245A and the corresponding right phase detection pixel 1245B in FIGs. 4B and 4C, respectively; however, the present application is not limited thereto, and in some other embodiments the processing module 130 may calculate the phase difference according to other methods or definitions. The defocus value may be, for example, the distance DF1 between the lens position F1 and the in-focus position F0 in FIG. 4B, or the distance DF2 between the lens position F2 and the in-focus position F0 in FIG. 4C.

In addition, the accuracy of the phase offset generally depends on the image content. For example, if most objects in the picture lack distinct edges, or the picture is noisy, the peaks of the light intensities sensed by the left phase detection pixel 1245A and the right phase detection pixel 1245B may be so close that no accurate phase difference can be obtained from them. The focusing method of the present application can therefore evaluate the calculated phase difference value of each block 144 against specific parameters and remove unqualified phase difference values, preventing the image capture device 120 from deriving a wrong lens focus position from inaccurate phase differences. For example, in step S303, the processing module 130 may obtain a confidence level value corresponding to each phase difference value as the basis for judging whether that phase difference value is qualified. The confidence level value may relate to how accurately the phase difference value was detected in the in-focus picture when the processing module 130 computed the phase offset; it may, for example, be a reference index proportional to the number of edges in the in-focus picture from which a phase difference is easy to detect, i.e., the reliability or precision of the phase difference value. In this embodiment, the confidence level value may be generated by the image sensing module 124 while sensing the image and provided to the processing module 130, or estimated by the processing module 130 when calculating the phase difference value.

In this embodiment, in addition to referring to the confidence level value, the processing module 130 may optionally perform temporal filtering in step S304 to obtain a focus position average and standard deviation, and may evaluate in step S306 whether the corresponding phase difference value is qualified based on the standard deviation and the confidence level value.

When performing temporal filtering, the processing module 130 may execute step S304 at multiple time points within a given period, estimating one focus position from each defocus value, and then compute the average and standard deviation of the focus positions estimated at these time points over the given period. For example, the processing module 130 estimates each focus position from the defocus value together with the current position of the lens 122 provided by the lens control module 126.

When the lens 122 of the image capture device 120 yields M phase difference values and M defocus values for the in-focus picture, and the given period contains N time points, the processing module 130 performs the temporal filtering operation according to equations (1) to (3):

x_{i,m} = F_i + D_{i,m}    (1)

x̄_m = (1/N) · Σ_{i=1}^{N} x_{i,m}    (2)

std_m = sqrt( (1/N) · Σ_{i=1}^{N} (x_{i,m} − x̄_m)² )    (3)

where D_{i,m} is the m-th defocus value obtained at time point i, F_i is the current position of the lens 122 of the image capture device 120 at time point i, x_{i,m} is the focus position estimated at time point i from the defocus value D_{i,m}, x̄_m is the focus position average produced for the m-th defocus value by the temporal filtering operation over the given period, and std_m is the corresponding standard deviation. M and N are positive integers; in this embodiment, N may be 3 to 5.
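The temporal filtering of step S304 can be sketched as follows for one block m. That each focus position is the current lens position offset by the defocus value is an assumption based on the surrounding text, and all names are hypothetical.

```python
from math import sqrt

def temporal_filter(defocus, lens_pos):
    """Temporal filtering over N time points for one block, per eqs. (1)-(3):
    defocus[i] is D_{i,m} at time point i, lens_pos[i] is F_i."""
    x = [f + d for d, f in zip(defocus, lens_pos)]        # eq. (1): x_{i,m}
    mean = sum(x) / len(x)                                # eq. (2): average
    std = sqrt(sum((v - mean) ** 2 for v in x) / len(x))  # eq. (3): std dev
    return mean, std
```

A stable block, where lens motion and measured defocus cancel, yields a near-zero standard deviation, which is what step S306 rewards.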

To improve focusing accuracy and speed, the processing module 130 may use the confidence level value and the standard deviation to select the qualified phase difference values among all the phase difference values obtained in step S302 (step S306). For example, the processing module 130 may judge a phase difference value whose confidence level value is greater than a first threshold and whose standard deviation is less than a second threshold to be qualified, and judge a phase difference value whose confidence level value is not greater than the first threshold or whose standard deviation is not less than the second threshold to be unqualified; the first and second thresholds may depend on the performance parameters of the image sensing module 124. The processing module 130 discards the unqualified phase difference values (step S308) and does not include them in the grouping. Note in particular that the processing module 130 may skip the temporal filtering of step S304 and screen the phase difference values obtained in step S302 using the confidence level value alone.
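The screening of step S306 might look like the sketch below; `t_conf` and `t_std` stand in for the first and second thresholds, and all names are illustrative assumptions.

```python
def qualified(pd_values, confidences, stds, t_conf, t_std):
    """Keep only phase difference values whose confidence level exceeds the
    first threshold and whose temporally filtered standard deviation stays
    below the second threshold; the rest are discarded (step S308)."""
    return [pd for pd, c, s in zip(pd_values, confidences, stds)
            if c > t_conf and s < t_std]
```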

In step S310, the processing module 130 may group the qualified phase difference values with a clustering algorithm to obtain a near-view group and a far-view group. In some embodiments, the processing module 130 may cluster the qualified phase difference values with the K-means algorithm. FIG. 8 is a flow chart of the grouping operation of the present disclosure. Referring to FIG. 8, in step S3102 the processing module 130 may calculate the average of the qualified phase difference values. Then, in step S3104, the processing module 130 may perform a first grouping of all qualified phase difference values based on the average calculated in step S3102. In some embodiments, the processing module 130 may, for example, compare each qualified phase difference value with the average to decide whether it belongs to the near-view group or the far-view group; for example, in the first grouping, qualified phase difference values less than or equal to the average belong to the near-view group, and those greater than the average belong to the far-view group. The present application is not limited thereto, however; in some other embodiments the processing module 130 may instead randomly select two cluster centers, one for the far-view group and one for the near-view group, and perform the first grouping according to the distance of each qualified phase difference value from these two cluster centers.

After the first grouping is completed, the processing module 130 calculates the average of the qualified phase difference values assigned to the near-view group (hereinafter the near-view phase average) and the average of those assigned to the far-view group (hereinafter the far-view phase average) (step S3106). The processing module 130 then performs a second grouping of all qualified phase difference values, using the near-view phase average and the far-view phase average as the group centers (step S3108).

In the second grouping, the processing module 130 decides whether each qualified phase difference value belongs to the near-view group or the far-view group according to its distance from the near-view phase average and from the far-view phase average. For example, the processing module 130 may calculate a first difference between each qualified phase difference value and the near-view phase average, and a second difference between that value and the far-view phase average. When the first difference of a qualified phase difference value is less than or equal to its second difference, the value is closer to the near-view phase average and can be assigned to the near-view group; when the first difference is greater than the second difference, the value is closer to the far-view phase average and can be assigned to the far-view group. After the second grouping, the dispersion of the phase difference values within the near-view group and within the far-view group is thus further reduced.

When regrouping, the processing module 130 may decide whether to end the grouping operation according to whether any qualified phase difference value has to move from one of the near-view and far-view groups to the other (step S3110). For example, if during the regrouping no qualified phase difference value moves from the near-view group to the far-view group and none moves from the far-view group to the near-view group, the grouping has stabilized and can end; if any qualified phase difference value moves from one group to the other, the grouping may not yet be stable, and steps S3106 to S3108 are repeated until no phase difference value moves between the near-view group and the far-view group.
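The grouping of steps S3102-S3110 can be sketched as a one-dimensional two-group K-means, seeded at the overall mean and iterated until no value changes group. This is an illustrative sketch, not the patent's code, and it assumes both groups remain non-empty.

```python
def split_near_far(pds):
    """Split qualified phase differences into (near, far) groups: first
    grouping at the overall mean (S3104), then reassignment around the two
    group averages (S3106-S3108) until membership is stable (S3110)."""
    mean = sum(pds) / len(pds)
    near = [p for p in pds if p <= mean]   # first grouping
    far = [p for p in pds if p > mean]
    while True:
        c_near = sum(near) / len(near)     # near-view phase average
        c_far = sum(far) / len(far)        # far-view phase average
        new_near = [p for p in pds if abs(p - c_near) <= abs(p - c_far)]
        new_far = [p for p in pds if abs(p - c_near) > abs(p - c_far)]
        if new_near == near:               # no value moved: stop
            return near, far
        near, far = new_near, new_far
```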

Referring again to FIG. 5, after the grouping operation is completed, the processing module 130 determines the near-view focus position from the defocus values corresponding to the qualified phase difference values grouped into the near-view group, and the far-view focus position from the defocus values corresponding to the qualified phase difference values grouped into the far-view group (step S312). In some embodiments, the processing module 130 estimates a first target focus position from the defocus value corresponding to each phase difference value in the near-view group, and after obtaining the first target focus positions for all phase difference values in the near-view group, averages them to obtain a first average as the near-view focus position. Similarly, the processing module 130 estimates a second target focus position from the defocus value corresponding to each phase difference value in the far-view group, and after obtaining the second target focus positions for all phase difference values in the far-view group, averages them to obtain a second average as the far-view focus position.
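The per-group averaging of step S312 can be sketched as follows. The mapping from a defocus value to a target lens position is assumed here to be a plain offset from the current lens position; the actual conversion is sensor- and actuator-specific and is not given in this passage:

```python
def group_focus_position(defocus_values, current_lens_pos):
    """Step S312 for one group: estimate a target focus position from
    each defocus value, then average the targets. The defocus-to-
    position mapping is assumed to be a simple offset from the current
    lens position; a real module applies a sensor-specific conversion."""
    targets = [current_lens_pos + d for d in defocus_values]
    return sum(targets) / len(targets)
```

With the lens currently at position 100.0, `group_focus_position([3.0, 5.0], 100.0)` returns 104.0 as that group's focus position.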

Next, the processing module 130 calculates the hyperfocal distance, the near depth-of-field limit, the far depth-of-field limit, and the depth of field of the image capture device 120. The hyperfocal distance is the closest distance at which the lens 122 can focus while objects at infinity remain acceptably sharp. In general, when the lens is focused at the hyperfocal distance, everything from half the hyperfocal distance out to infinity is imaged sharply on the image sensing module 124, and the depth of field is the range in front of and behind the focus point within which the image sensing module 124 produces acceptably sharp images. The hyperfocal distance (H), near limit (D_N), far limit (D_F), and depth of field (DOF) can be calculated as follows:

H = f²/(Nc) + f (4)

D_N = s(H - f)/(H + s - 2f) (5)

D_F = s(H - f)/(H - s) (6)

DOF = D_F - D_N (7)

In equations (4) to (7), f is the focal length, N is the f-number (aperture value), c is the diameter of the circle of confusion, and s is the object distance.
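The equation images did not survive extraction, so the computation below assumes the standard thin-lens depth-of-field formulas, which match the symbol definitions given for equations (4) through (7):

```python
def depth_of_field(f, N, c, s):
    """Hyperfocal distance and depth-of-field limits per equations
    (4)-(7); all lengths share one unit (metres here), and s < H is
    assumed so that the far limit stays finite."""
    H = f * f / (N * c) + f                # (4) hyperfocal distance
    D_N = s * (H - f) / (H + s - 2 * f)    # (5) near limit
    D_F = s * (H - f) / (H - s)            # (6) far limit
    return H, D_N, D_F, D_F - D_N          # (7) DOF = D_F - D_N

# 50 mm lens at f/2 with a 0.03 mm circle of confusion, focused at 2 m.
H, D_N, D_F, dof = depth_of_field(f=0.05, N=2.0, c=0.00003, s=2.0)
```

For this example the hyperfocal distance is roughly 41.7 m, and the in-focus range straddles the 2 m subject distance, with D_N just under 2 m and D_F just over it.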

After obtaining the depth of field, the processing module 130 can determine the ideal focus position of the lens 122 based on the depth of field, the near-view focus position, and the far-view focus position (step S314). In some embodiments, the processing module 130 determines the ideal focus position by comparing the difference between the far-view and near-view focus positions with the depth of field. Specifically, when the distance between the far-view focus position and the near-view focus position is greater than the depth of field, only objects within the depth-of-field range of the target scene can be imaged sharply; since users usually frame a near object as the subject when shooting, the processing module 130 preferentially selects the near-view focus position as the ideal focus position (step S316). Conversely, when the distance between the far-view focus position and the near-view focus position is equal to or less than the depth of field, all objects in the target scene can be imaged sharply, so the processing module 130 uses the average of the near-view and far-view focus positions as the ideal focus position (step S318).
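The selection rule of steps S314 through S318 reduces to a short comparison. The sketch below uses hypothetical position values in the same (arbitrary) unit as the depth of field:

```python
def ideal_focus(near_pos, far_pos, dof):
    """Steps S314-S318: if the two candidate positions are farther
    apart than the depth of field, favour the near subject; otherwise
    both ends can be sharp, so take the midpoint."""
    if abs(far_pos - near_pos) > dof:
        return near_pos                    # S316: prefer the near view
    return (near_pos + far_pos) / 2.0      # S318: average of the two
```

For instance, `ideal_focus(1.9, 2.1, 0.15)` returns 1.9 (the spread exceeds the depth of field), while `ideal_focus(1.9, 2.0, 0.2)` returns 1.95 (both positions fit within it).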

Finally, in step S320, the processing module 130 sends a control signal to move the lens 122 of the image capture device 120 to the ideal focus position, completing the phase detection autofocus operation.

Although the present disclosure and its advantages have been described in detail, it should be understood that various changes, substitutions, and alterations can be made herein without departing from the technology of the present disclosure as defined by the appended claims. For example, many of the processes discussed above can be implemented with different methodologies, replaced by other processes, or combined with them.

Moreover, the scope of the present application is not intended to be limited to the particular embodiments of the process, machine, manufacture, composition of matter, means, methods, and steps described in the specification. As one of ordinary skill in the art will readily appreciate from the present disclosure, processes, machines, manufacture, compositions of matter, means, methods, or steps presently existing or later to be developed that perform substantially the same function or achieve substantially the same result as the corresponding embodiments described herein may be utilized according to the present disclosure. Accordingly, the appended claims are intended to include within their scope such processes, machines, manufacture, compositions of matter, means, methods, or steps.

10: electronic device
110: housing
120: image capture device
122: lens
124: image sensing module
126: lens control module
130: processing module
140: screen
142: focusing image
144: block
300: phase detection autofocus method
1242: pixel
1244: light-blocking portion
1245A: left phase detection pixel
1245B: right phase detection pixel
1246: microlens
1250: object
L: left-side light
R: right-side light
S302-S320: steps
S3102-S3110: steps
SF1: offset
SF2: offset

A more complete understanding of the present disclosure may be obtained by considering the detailed description and claims in conjunction with the drawings, in which like reference numerals refer to like elements.
FIG. 1 is a schematic rear view of an electronic device according to the present disclosure.
FIG. 2 is a functional block diagram of the electronic device according to the present disclosure.
FIG. 3 is a cross-sectional view of some pixels of the image sensing module according to the present disclosure.
FIG. 4A is a schematic diagram of the image capture device in an in-focus state according to the present disclosure.
FIGS. 4B and 4C are schematic diagrams of the image capture device in defocused states according to the present disclosure.
FIG. 5 is a flowchart of the phase detection autofocus method of the present disclosure.
FIG. 6 is a schematic diagram of a focusing image according to the present disclosure.
FIG. 7 is a schematic diagram of dividing the focusing image into a plurality of blocks according to the present disclosure.
FIG. 8 is a flowchart of the grouping operation according to the present disclosure.

S302-S320: steps

Claims (18)

1. A phase detection autofocus method applicable to an image capture device, the method comprising: calculating a plurality of phase difference values and a plurality of defocus values of a focusing image through a lens of the image capture device, wherein each of the phase difference values corresponds to one of the defocus values; performing a grouping operation on the phase difference values to obtain a near-view group and a far-view group; determining a near-view focus position from the defocus values corresponding to the phase difference values in the near-view group; determining a far-view focus position from the defocus values corresponding to the phase difference values in the far-view group; determining an ideal focus position of the lens of the image capture device based on a depth of field of the focusing image, the near-view focus position, and the far-view focus position; and moving the lens of the image capture device to the ideal focus position.

2. The method of claim 1, wherein: when the distance between the far-view focus position and the near-view focus position is greater than the depth of field, the near-view focus position is selected as the ideal focus position; and when the distance between the far-view focus position and the near-view focus position is equal to or less than the depth of field, an average of the near-view focus position and the far-view focus position is selected as the ideal focus position.

3. The method of claim 1, wherein: determining the near-view focus position from the defocus values corresponding to the phase difference values in the near-view group comprises: estimating a plurality of first target focus positions from the defocus value corresponding to each phase difference value in the near-view group; and calculating a first average of the first target focus positions and using the first average as the near-view focus position; and determining the far-view focus position from the defocus values corresponding to the phase difference values in the far-view group comprises: estimating a plurality of second target focus positions from the defocus value corresponding to each phase difference value in the far-view group; and calculating a second average of the second target focus positions and using the second average as the far-view focus position.

4. The method of claim 1, wherein a confidence level value associated with each of the phase difference values is further obtained when the phase difference values are calculated, and in a screening operation performed before the grouping operation, the confidence level value is used to determine whether the associated phase difference value is qualified.

5. The method of claim 4, further comprising: before performing the grouping operation, discarding unqualified phase difference values whose confidence level values are lower than a first threshold.

6. The method of claim 5, further comprising: at a plurality of time points within a given period, estimating a focus position from each of the defocus values, and then obtaining, from the focus positions estimated at those time points, an average focus position and a standard deviation for the given period; wherein, in the screening operation, the confidence level value and the standard deviation are used to determine whether the phase difference value is qualified.

7. The method of claim 6, further comprising: before performing the grouping operation, discarding unqualified phase difference values whose confidence level values are lower than a first threshold and whose standard deviations are greater than a second threshold.

8. The method of claim 6, wherein the numbers of phase difference values and defocus values obtained from the focusing image are both M, M being a positive integer, and the M phase difference values correspond one-to-one to the M defocus values, the method further comprising performing a time-domain filtering operation:

x_{i,m} = F_i + D_{i,m}

x̄_m = (1/N) Σ_{i=1}^{N} x_{i,m}

std_m = √( (1/N) Σ_{i=1}^{N} (x_{i,m} - x̄_m)² )

wherein N is the number of time points in the given period, D_{i,m} denotes the m-th of the defocus values obtained at time point i among the time points, F_i denotes a current position of the lens of the image capture device at time point i, x_{i,m} denotes a focus position estimated from the defocus value D_{i,m} at time point i, and x̄_m and std_m respectively denote an average focus position and a standard deviation produced for the m-th defocus value over the given period by the time-domain filtering operation.

9. The method of claim 1, wherein the grouping operation comprises: (a) calculating an average of the phase difference values; (b) assigning each of the phase difference values to the near-view group or the far-view group by comparing the value with the average; (c) calculating a near-view phase average and a far-view phase average, the near-view phase average being the average of the phase difference values in the near-view group and the far-view phase average being the average of the phase difference values in the far-view group; (d) assigning each of the phase difference values to the near-view group or the far-view group according to its differences from the near-view phase average and from the far-view phase average; and (e) repeating steps (c) and (d) until the phase difference values no longer move between the near-view group and the far-view group.

10. The method of claim 9, wherein a phase difference value belongs to the near-view group if its difference from the near-view phase average is smaller than its difference from the far-view phase average, and belongs to the far-view group if its difference from the near-view phase average is greater than its difference from the far-view phase average.

11. The method of claim 1, wherein the grouping operation is performed with a K-means algorithm.

12. An electronic device, comprising: an image capture device comprising: a lens; a lens control module configured to control movement of the lens; and an image sensing module configured to generate a focusing image from light entering the image capture device through the lens; and a processing module coupled to the lens control module and the image sensing module of the image capture device and configured to: calculate a plurality of phase difference values and a plurality of defocus values of the focusing image of the lens; group the phase difference values to obtain a near-view group and a far-view group; determine a near-view focus position and a far-view focus position from the defocus values corresponding to the phase difference values grouped into the near-view group and the far-view group; and determine an ideal focus position of the lens based on a depth of field of the focusing image, the near-view focus position, and the far-view focus position, and cause the lens control module to move the lens according to the ideal focus position.

13. The electronic device of claim 12, wherein: the processing module selects the near-view focus position as the ideal focus position when the distance between the far-view focus position and the near-view focus position is greater than the depth of field; and the processing module selects an average of the near-view focus position and the far-view focus position as the ideal focus position when the distance between the far-view focus position and the near-view focus position is equal to or less than the depth of field.

14. The electronic device of claim 12, wherein: the processing module estimates a plurality of first target focus positions from the defocus values corresponding to the phase difference values grouped into the near-view group, and calculates a first average of the first target focus positions as the near-view focus position; and the processing module estimates a plurality of second target focus positions from the defocus values corresponding to the phase difference values grouped into the far-view group, and calculates a second average of the second target focus positions as the far-view focus position.

15. The electronic device of claim 12, wherein the processing module is further configured to: obtain a confidence level value associated with each of the phase difference values when calculating the phase difference values; determine from the confidence level value whether the associated phase difference value is qualified; and discard unqualified phase difference values before grouping the phase difference values.

16. The electronic device of claim 15, wherein the processing module is further configured to: at a plurality of time points within a given period, estimate a focus position from each of the defocus values, and then obtain, from the focus positions estimated at those time points, an average focus position and a standard deviation for the given period; determine from the confidence level value and the standard deviation whether the associated phase difference value is qualified; and discard unqualified phase difference values before grouping the phase difference values.

17. The electronic device of claim 12, wherein the numbers of phase difference values and defocus values obtained from the focusing image of the lens are both M, M being a positive integer, the M phase difference values correspond one-to-one to the M defocus values, and the processing module is further configured to perform a time-domain filtering operation:

x_{i,m} = F_i + D_{i,m}

x̄_m = (1/N) Σ_{i=1}^{N} x_{i,m}

std_m = √( (1/N) Σ_{i=1}^{N} (x_{i,m} - x̄_m)² )

wherein N is the number of time points in the given period, D_{i,m} denotes the m-th of the defocus values obtained at time point i among the time points, F_i denotes a current position of the lens of the image capture device at time point i, x_{i,m} denotes a focus position estimated from the defocus value D_{i,m} at time point i, and x̄_m and std_m respectively denote an average focus position and a standard deviation produced for the m-th defocus value over the given period by the time-domain filtering operation.

18. The electronic device of claim 14, wherein the processing module divides the phase difference values into the near-view group and the far-view group using a K-means algorithm.
TW111133031A 2022-08-31 Electronic device and method for phase detection autofocus TW202411711A (en)

Publications (1)

Publication Number: TW202411711A
Publication Date: 2024-03-16

