TWI399524B - Method and apparatus for extracting scenery depth information - Google Patents

Method and apparatus for extracting scenery depth information

Info

Publication number
TWI399524B
Authority
TW
Taiwan
Prior art keywords
coding element
optical
coding
depth information
axis direction
Prior art date
Application number
TW098105357A
Other languages
Chinese (zh)
Other versions
TW201031895A (en)
Inventor
Chuan Chung Chang
Chir Weei Chang
Yung Lin Chen
Original Assignee
Ind Tech Res Inst
Priority date
Filing date
Publication date
Application filed by Ind Tech Res Inst
Priority to TW098105357A (patent TWI399524B)
Priority to US12/505,481 (patent US8467578B2)
Publication of TW201031895A
Application granted
Publication of TWI399524B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery


Description

Method and apparatus for obtaining scene depth information

The present invention relates to the acquisition of scene depth information, and more particularly to a method and an apparatus for obtaining scene depth information.

How to capture the distance or depth information of different objects in a real scene has long been an important problem in stereoscopic display, ranging systems, and depth-profile display systems. Conventional capture methods and devices can be divided into two types, "active" and "passive". Passive systems exploit the concept of parallax; they mainly use dual-lens or multi-lens arrangements, or multi-aperture imaging systems based on similar concepts. However, such systems require multiple imaging lenses and multiple image sensors to obtain stereoscopic image information. Active systems use an additional, actively emitted signal source (usually a light source): they either determine the distances of different objects from the difference in time of flight after the signal illuminates the photographed objects, or project a specific fringe pattern onto the objects and evaluate the relative distances between them from the degree to which the fringes deform on the object surfaces.

In addition, U.S. Patent No. 5,521,695 discloses a specially designed optical element that causes a single object point to form four image points on the imaging plane, so that the relative distance of the photographed object can be determined from changes in the relative positions of the four image points.

On the other hand, in an imaging system the point spread function (PSF) of the lens changes with the object distance, so object-distance information can be obtained from the characteristics of the point spread function. However, besides varying with object distance, the point spread function of a lens also changes with the field of view at a fixed object distance. Therefore, to obtain the distance differences between photographed objects and their depth information, the back-end image processing must simultaneously account for the variation of the point spread function with both object distance and field of view.
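The dependence of PSF size on object distance can be illustrated with simple geometric optics. The sketch below is an illustration only, not part of the patent: it assumes a thin-lens model with the sensor fixed to focus at 1000 mm, and uses the focal length and F-number of the embodiment described later to compute the geometric blur-circle diameter at the object distances used in the description.

```python
def blur_circle_diameter(f, aperture, sensor_dist, obj_dist):
    """Geometric blur-circle diameter (mm) for a thin lens with the sensor
    fixed at sensor_dist: a point away from the focus distance images to a
    different plane, so its PSF widens with defocus."""
    di = 1.0 / (1.0 / f - 1.0 / obj_dist)   # thin-lens image distance
    return aperture * abs(sensor_dist - di) / di

f = 10.82            # effective focal length (mm), from the embodiment below
aperture = f / 5.0   # F-number 5 -> aperture diameter 2.164 mm
sensor = 1.0 / (1.0 / f - 1.0 / 1000.0)   # assumed: focus set at 1000 mm

for do in (513.0, 790.0, 1000.0):
    # blur shrinks as the object approaches the assumed 1000 mm focus
    print(do, blur_circle_diameter(f, aperture, sensor, do))
```

In this toy model the PSF width is monotone in the defocus, which is why the on-axis PSF can serve as a depth cue at all; the patent's contribution is keeping that cue while suppressing the PSF's field-of-view dependence.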

The present invention provides a method for obtaining depth information and an optical capture apparatus therefor. By means of a device having a coding element, the similarity of the point spread functions under different field-of-view conditions is increased while the point spread functions along the optical axis of the optical capture apparatus remain distinct, so that the depth information of the scene can be obtained from the variation of the on-axis point spread function with object distance.

The present invention proposes an optical capture apparatus for depth information, which comprises an optical element and a coding element. The coding element is placed in the path of the captured scene light passing through the optical element, and serves to modulate the point spread function of the optical element.

The present invention further proposes a method for obtaining depth information, which comprises: acquiring an image with an optical capture apparatus; obtaining at least one piece of point-spread-function information of the optical capture apparatus; and scanning different regions of the image and performing a restoration comparison procedure based on the point-spread-function information.

FIG. 1 is a block diagram of a depth information acquisition apparatus according to an embodiment of the present invention. Incident light passes through the depth information acquisition apparatus 100 and is then received by the sensor. The depth information acquisition apparatus 100 comprises an optical element 101 and a coding element 102. The optical element 101 used for imaging may be a single lens, a lens group, or a reflective imaging lens set. The coding element 102 may be a wavefront phase coding element, a wavefront amplitude coding element, or a mixed wavefront phase-and-amplitude coding element; when the coding element 102 is a wavefront phase coding element, its coding may be axisymmetric. The wavefront coding of the coding element 102 can be expressed in mutually orthogonal coordinate systems. In this embodiment, the wavefront coding of the coding element 102 is expressed by the following equation:

W(x, y) = Σ A_nx·x^(2n) + A_ny·y^(2n) + A_lmxy·x^l·y^m

where x and y are the coordinate positions of the coding element in the x-axis and y-axis directions respectively, n is a positive integer, l + m is an even number, A_nx and A_ny are the coefficients of the n-th term in the x-axis and y-axis directions respectively, and A_lmxy is the coefficient of the xy coupling term.
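As a concrete reading of the equation, the following sketch evaluates a single-order (fixed n, no summation) wavefront code on a pupil grid. The normalization of the pupil coordinates to unit radius and the interpretation of the coefficients as multiples of π are assumptions for illustration, consistent with the embodiment values quoted below.

```python
import numpy as np

def wavefront_code(x, y, n, A_nx, A_ny, A_lmxy=0.0, l=2, m=2):
    """Single-order term of W(x, y) = A_nx*x^(2n) + A_ny*y^(2n) + A_lmxy*x^l*y^m,
    with x, y taken as normalized pupil coordinates (assumed in [-1, 1])."""
    return A_nx * x ** (2 * n) + A_ny * y ** (2 * n) + A_lmxy * x ** l * y ** m

xs = np.linspace(-1.0, 1.0, 5)          # normalized pupil coordinates
X, Y = np.meshgrid(xs, xs)

# Coefficients of the first embodiment below: n = 2, A_2x = A_2y = 22.8*pi, A_lmxy = 0
W = wavefront_code(X, Y, n=2, A_nx=22.8 * np.pi, A_ny=22.8 * np.pi)

print(W[2, 2])      # pupil center: 0.0 (no phase added on axis)
print(W[0, 0])      # pupil corner: 45.6*pi
```

Because only even powers appear and l + m is even, the profile is symmetric about both axes, which matches the axisymmetric coding option named in the text.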

The coding element 102 may be placed at or near the aperture stop of the depth information acquisition apparatus 100, at or near the exit pupil plane, or at or near the entrance pupil plane. The coding element 102 may also be combined with the optical element 101; for example, the coding element 102 may be fabricated on a lens surface of the optical element 101. The combined wavefront coding can be expressed by the equation W'(x, y) = W(x, y) + W0(x, y), where W'(x, y) is the wavefront of the depth information acquisition apparatus 100 with the coding element 102 added, and W0(x, y) is the wavefront of the depth information acquisition apparatus 100 without the coding element 102. Those skilled in the art will understand that W'(x, y), W(x, y), and W0(x, y) can also be expressed with Zernike polynomials. The coding element 102 is placed so that, after the coding element is added, the optical wavefront of the captured scene light passing through the depth information acquisition apparatus 100 is dominated by W(x, y). In addition, the coding element 102 may be a refractive element, a diffractive element, or an element having both of these optical properties.

According to one embodiment of the present invention, the optical element 101 has an effective focal length of 10.82 mm, an F-number of 5, and a full field of view of 10.54 degrees. The diagonal length of the sensor 103 is 2 mm. The phase coding element 102 is coded with the wavefront equation above, with n = 2, A_2x = A_2y = 22.8π, and A_lmxy = 0.

FIG. 2A shows the spot diagrams of the point spread functions at different fields of view when the optical element 101 alone images a scene at an object distance of 1000 mm, for scene light at the wavelengths 656.3 nm, 587.6 nm, and 486.1 nm representing the red, green, and blue bands. FIG. 2B shows the spot diagrams of the point spread functions at different fields of view when the scene depth information acquisition apparatus 100, namely the optical element 101 combined with the phase coding element 102, images a scene at an object distance of 1000 mm.

FIG. 3A shows the spot diagrams of the point spread functions at different fields of view when the optical element 101 alone images a scene at an object distance of 790 mm, for scene light at the wavelengths 656.3 nm, 587.6 nm, and 486.1 nm representing the red, green, and blue bands. FIG. 3B shows the spot diagrams of the point spread functions at different fields of view when the depth information acquisition apparatus 100, namely the optical element 101 combined with the phase coding element 102, images a scene at an object distance of 790 mm.

FIG. 4A shows the spot diagrams of the point spread functions at different fields of view when the optical element 101 alone images a scene at an object distance of 513 mm, for scene light at the wavelengths 656.3 nm, 587.6 nm, and 486.1 nm representing the red, green, and blue bands. FIG. 4B shows the corresponding spot diagrams when the depth information acquisition apparatus 100, namely the optical element 101 combined with the phase coding element 102, images a scene at an object distance of 513 mm.

According to another embodiment of the present invention, the optical element 101 has an effective focal length of 10.82 mm, an F-number of 5, and a full field of view of 10.54 degrees. The diagonal length of the sensor 103 is 2 mm. The phase coding element 102 is coded with the wavefront equation above, with n = 3, A_3x = A_3y = 12.7π, and A_lmxy = 0. For this embodiment, FIG. 5 shows the spot diagrams of the point spread functions at different fields of view (i.e., image heights of 0 mm, 0.7 mm, and 1 mm) when the depth information acquisition apparatus 100 (the optical element 101 combined with the phase coding element 102) images scenes at different object distances (513 mm, 790 mm, and 1000 mm), for scene light at the wavelengths 656.3 nm, 587.6 nm, and 486.1 nm representing the red, green, and blue bands.

According to still another embodiment of the present invention, the optical element 101 has an effective focal length of 10.82 mm, an F-number of 5, and a full field of view of 10.54 degrees. The diagonal length of the sensor 103 is 2 mm. The phase coding element 102 is coded with the wavefront equation above, with n = 2, A_2x = A_2y = 19.1π, and A_22xy = 9.55π. For this embodiment, FIG. 6 shows the spot diagrams of the point spread functions at different fields of view (i.e., image heights of 0 mm, 0.7 mm, and 1 mm) when the depth information acquisition apparatus 100 (the optical element 101 combined with the phase coding element 102) images scenes at different object distances (513 mm, 790 mm, and 1000 mm), for scene light at the wavelengths 656.3 nm, 587.6 nm, and 486.1 nm representing the red, green, and blue bands.

Compared with the point spread function of the optical element 101 alone, FIGS. 2A through 6 show that at the same object distance the shape of the point spread function of the depth information acquisition apparatus 100 of these embodiments varies very little across different fields of view. To further confirm the improvement in point-spread-function similarity, the similarity of the point spread functions is computed using the Hilbert space angle. FIG. 7 compares the similarity of the point spread functions at the same object distance but different fields of view (with an object distance of 790 mm as the comparison baseline). FIG. 8 compares the similarity of the on-axis point spread functions at different object distances. As can be seen from FIG. 7, the similarity of the point spread functions of the depth information acquisition apparatus 100 across different fields of view is improved (for the Hilbert space angle, a smaller value indicates higher similarity). On the other hand, FIG. 8 shows that the on-axis point spread functions of the depth information acquisition apparatus 100 remain distinct at different object distances.
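The Hilbert space angle used for the comparisons in FIGS. 7 and 8 can be sketched as the angle between two PSFs treated as vectors, i.e. the arccosine of their normalized inner product. This is a common reading of the metric; the authors' exact normalization is not given, so treat the following as an illustration.

```python
import numpy as np

def hilbert_space_angle(psf_a, psf_b):
    """Angle between two PSFs viewed as vectors in Hilbert space:
    arccos(<a, b> / (||a||*||b||)); a smaller angle means more similar PSFs."""
    a, b = psf_a.ravel(), psf_b.ravel()
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Toy PSFs: two identical centered spots vs. an asymmetric, shifted one
psf_center = np.outer([1.0, 2.0, 1.0], [1.0, 2.0, 1.0])
psf_same   = np.outer([1.0, 2.0, 1.0], [1.0, 2.0, 1.0])
psf_other  = np.outer([0.0, 1.0, 3.0], [3.0, 1.0, 0.0])

print(hilbert_space_angle(psf_center, psf_same))    # ~0: highly similar
print(hilbert_space_angle(psf_center, psf_other))   # larger: dissimilar
```

Under this definition, the design goal of the embodiments reads as: small angles between PSFs at different fields of view (FIG. 7) but clearly nonzero angles between on-axis PSFs at different object distances (FIG. 8).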

In addition, the phase coding element 102 combined with the optical element 101 can be designed from the wavefront coding equation above according to the waveband of the scene light in which the optical element 101 or the depth information acquisition apparatus 100 actually operates, and is not limited to the 656.3 nm, 587.6 nm, and 486.1 nm scene-light wavelengths.

To enable those of ordinary skill in the art to practice the present invention from the teaching of this embodiment, a method embodiment is presented below in conjunction with the scene depth information acquisition apparatus described above.

FIG. 9 is a flowchart of the depth information acquisition method according to an embodiment of the present invention. In step S901, the point-spread-function information of the acquisition apparatus 100 when imaging a scene at an object distance of 1000 mm is obtained; this information can be obtained by measuring the apparatus 100 or from the design parameters of the apparatus 100. In step S902, this point-spread-function information is stored. Separately, in step S903, images are acquired at object distances of 1000 mm, 980 mm, 900 mm, 790 mm, and 513 mm. Next, in step S904, the stored point-spread-function information is used to perform a scanning and restoration comparison procedure on the images at the different object distances. The restoration comparison procedure may restore the images using a Wiener filter or a direct inverse operation. After image restoration, the mean square error (MSE) of each restored image is computed and compared with a threshold preset by the user. In step S905, the object distance of the image can then be determined from the comparison result, and its depth information thereby obtained. Alternatively, in step S904 the quality of the restored image can be evaluated by how sharply edges are defined, with the object distance and depth information of the image likewise determined in step S905.
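Steps S901 through S905 can be sketched in miniature as follows. This is an illustration under several assumptions the patent does not fix: Gaussian toy PSFs stand in for the device's measured PSFs, the Wiener noise constant k is arbitrary, and the MSE is computed against a known test pattern, as in the experiment of FIGS. 10A through 10E.

```python
import numpy as np

def gaussian_psf(shape, sigma):
    """Centered, unit-sum Gaussian PSF (a toy stand-in for the stored PSFs)."""
    y, x = np.indices(shape)
    g = np.exp(-((x - shape[1] // 2) ** 2 + (y - shape[0] // 2) ** 2)
               / (2.0 * sigma ** 2))
    return g / g.sum()

def blur(image, psf):
    """Circular convolution with a centered PSF via the FFT."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

def wiener_restore(blurred, psf, k=1e-4):
    """Wiener-filter restoration; the noise-to-signal constant k is assumed."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    W = np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(np.fft.fft2(blurred) * W))

rng = np.random.default_rng(0)
chart = rng.random((32, 32))                     # known test pattern

# Toy PSFs that widen as the scene moves away from the 1000 mm focus
psfs = {513: gaussian_psf((32, 32), 3.0),
        790: gaussian_psf((32, 32), 2.0),
        1000: gaussian_psf((32, 32), 1.0)}

stored = psfs[1000]                              # S901/S902: store the 1000 mm PSF
mse = {}
for dist, psf in psfs.items():                   # S903: images at several distances
    captured = blur(chart, psf)
    restored = wiener_restore(captured, stored)  # S904: restore with the stored PSF
    mse[dist] = np.mean((restored - chart) ** 2) # S904: MSE against the pattern

print(min(mse, key=mse.get))                     # S905: lowest MSE -> depth 1000
```

The capture whose true distance matches the stored PSF restores with the lowest error, which is exactly the selection rule applied in step S905.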

According to an embodiment of the present invention, FIG. 10A shows the restored image for an object distance of 1000 mm, FIG. 10B for 980 mm, FIG. 10C for 900 mm, FIG. 10D for 790 mm, and FIG. 10E for 513 mm. Because the scanning and restoration comparison procedure uses the point-spread-function information for an object distance of 1000 mm, the image at 1000 mm restores best among the different object distances, with an MSE of 2.4×10^-7. Therefore, in step S905 the depth information of this image is determined to be 1000 mm.

Prior-art systems must simultaneously account for the influence of both object distance and field of view on the point spread function during back-end image processing. The embodiments of the present invention use the acquisition apparatus 100 with the coding element 102 to increase the similarity of the point spread functions under different field-of-view conditions while preserving the distinctness of the on-axis point spread function, so that the depth information of the scene is obtained from the variation of the on-axis point spread function with object distance. While the technical content and features of the present invention have been disclosed above, those familiar with the art may still make various substitutions and modifications based on the teaching and disclosure of the present invention without departing from its spirit. Therefore, the scope of protection of the present invention should not be limited to what is disclosed in the embodiments, but should include the various substitutions and modifications that do not depart from the present invention, as covered by the following claims.

100: scene depth information acquisition apparatus

101: optical element

102: coding element

S901-S905: steps

FIG. 1 is a block diagram of a scene depth information acquisition apparatus according to an embodiment of the present invention;
FIG. 2A shows the spot diagrams of the point spread functions at different fields of view when an optical element images a scene at an object distance of 1000 mm;
FIG. 2B shows the spot diagrams of the point spread functions at different fields of view when the depth information acquisition apparatus images a scene at an object distance of 1000 mm;
FIG. 3A shows the spot diagrams of the point spread functions at different fields of view when an optical element images a scene at an object distance of 790 mm;
FIG. 3B shows the spot diagrams of the point spread functions at different fields of view when the depth information acquisition apparatus images a scene at an object distance of 790 mm;
FIG. 4A shows the spot diagrams of the point spread functions at different fields of view when an optical element images a scene at an object distance of 513 mm;
FIG. 4B shows the spot diagrams of the point spread functions at different fields of view when the depth information acquisition apparatus images a scene at an object distance of 513 mm;
FIG. 5 shows the spot diagrams of the point spread functions at different image-height positions when the depth information acquisition apparatus (without the xy coupling term) images scenes at different object distances;
FIG. 6 shows the spot diagrams of the point spread functions at different image-height positions when the depth information acquisition apparatus (with the xy coupling term) images scenes at different object distances;
FIG. 7 compares the similarity of the point spread functions at the same object distance but different fields of view;
FIG. 8 compares the similarity of the point spread functions along the optical axis of the depth information acquisition apparatus at different object distances;
FIG. 9 is a flowchart of the depth information acquisition method according to an embodiment of the present invention;
FIG. 10A shows the restored image for an object distance of 1000 mm;
FIG. 10B shows the restored image for an object distance of 980 mm;
FIG. 10C shows the restored image for an object distance of 900 mm;
FIG. 10D shows the restored image for an object distance of 790 mm; and
FIG. 10E shows the restored image for an object distance of 513 mm.


Claims (16)

1. A scene depth information acquisition device, comprising: an optical element; and a coding element, wherein the coding element is placed in the path of the scene light passing through the optical element to modulate the point spread function of the optical element; and wherein the coding of the coding element can be expressed by the equation W(x, y) = Σ A_nx·x^(2n) + A_ny·y^(2n) + A_lmxy·x^l·y^m, where x and y are the coordinate positions of the coding element in the x-axis and y-axis directions respectively, n is a positive integer, l + m is an even number, A_nx and A_ny are the coefficients of the n-th term in the x-axis and y-axis directions respectively, and A_lmxy is the coefficient of the xy coupling term.

2. The device of claim 1, wherein the optical element may be a single lens, a lens group, or a reflective imaging lens set.

3. The device of claim 1, wherein the coding element is a wavefront phase coding element.

4. The device of claim 1, wherein the coding element is a wavefront amplitude coding element.

5. The device of claim 1, wherein the coding element is a mixed wavefront phase-and-amplitude coding element.

6. The device of claim 1, wherein the coding element is an axisymmetric coding element.

7. The device of claim 1, wherein the coding of the coding element can be expressed in mutually orthogonal coordinate systems.

8. The device of claim 1, wherein the coding element may be placed near the aperture of the device, at or near the exit pupil plane, or at or near the entrance pupil plane.

9. The device of claim 1, wherein the coding element may be a refractive element, a diffractive element, or an element having both of these optical properties.

10. The device of claim 1, wherein the coding element and the optical element are combined into one.

11. A method of obtaining scene depth information, comprising: acquiring an image by an optical device, wherein the optical device has a wavefront phase coding element whose coding can be expressed by W(x, y) = Σ A_nx·x^(2n) + A_ny·y^(2n) + A_lmxy·x^l·y^m, where x and y are the coordinate positions of the coding element in the x-axis and y-axis directions respectively, n is a positive integer, l + m is an even number, A_nx and A_ny are the coefficients of the n-th term in the x-axis and y-axis directions respectively, and A_lmxy is the coefficient of the xy coupling term; obtaining at least one piece of point-spread-function information of the optical device; and scanning regions of the acquired image and performing a restoration comparison procedure according to the point-spread-function information.

12. The method of claim 11, further comprising obtaining depth information according to the result of the restoration comparison procedure.

13. The method of claim 11, wherein the point-spread-function information can be obtained by measuring the optical device or from the design parameters of the optical device.

14. The method of claim 11, wherein the restoration comparison procedure may use a Wiener filter or a direct inverse operation.

15. The method of claim 11, wherein the restoration comparison procedure may include a mean square error operation.

16. The method of claim 15, wherein the restoration comparison procedure further comprises comparing the mean-square-error result with a user-set threshold.
TW098105357A 2009-02-20 2009-02-20 Method and apparatus for extracting scenery depth imformation TWI399524B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
TW098105357A TWI399524B (en) 2009-02-20 2009-02-20 Method and apparatus for extracting scenery depth imformation
US12/505,481 US8467578B2 (en) 2009-02-20 2009-07-18 Method and apparatus for extracting scenery depth information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW098105357A TWI399524B (en) 2009-02-20 2009-02-20 Method and apparatus for extracting scenery depth imformation

Publications (2)

Publication Number Publication Date
TW201031895A TW201031895A (en) 2010-09-01
TWI399524B (en) 2013-06-21

Family

ID=42630994

Family Applications (1)

Application Number Title Priority Date Filing Date
TW098105357A TWI399524B (en) 2009-02-20 2009-02-20 Method and apparatus for extracting scenery depth imformation

Country Status (2)

Country Link
US (1) US8467578B2 (en)
TW (1) TWI399524B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI509216B (en) * 2014-12-19 2015-11-21 Apparatus and method for obtaining depth information in a scene

Families Citing this family (15)

Publication number Priority date Publication date Assignee Title
JP5477464B2 (en) * 2010-04-21 2014-04-23 富士通株式会社 Imaging device
US8305485B2 (en) * 2010-04-30 2012-11-06 Eastman Kodak Company Digital camera with coded aperture rangefinder
JP2012003233A (en) * 2010-05-17 2012-01-05 Sony Corp Image processing device, image processing method and program
TWI428567B (en) 2010-09-03 2014-03-01 Pixart Imaging Inc Distance measurement method and system, and processing software thereof
TWI428569B (en) 2010-09-06 2014-03-01 Pixart Imaging Inc Distance measurement method and system, and processing software thereof
US8582820B2 (en) 2010-09-24 2013-11-12 Apple Inc. Coded aperture camera with adaptive image processing
JP5591090B2 (en) * 2010-12-13 2014-09-17 キヤノン株式会社 Image processing apparatus and method
US20130114883A1 (en) * 2011-11-04 2013-05-09 Industrial Technology Research Institute Apparatus for evaluating volume and method thereof
US9098147B2 (en) 2011-12-29 2015-08-04 Industrial Technology Research Institute Ranging apparatus, ranging method, and interactive display system
US9325971B2 (en) * 2013-01-10 2016-04-26 The Regents Of The University Of Colorado, A Body Corporate Engineered point spread function for simultaneous extended depth of field and 3D ranging
CN103268608B (en) * 2013-05-17 2015-12-02 清华大学 Based on depth estimation method and the device of near-infrared laser speckle
JP6578960B2 (en) * 2016-01-21 2019-09-25 Omron Corporation Imaging device, imaging method, imaging program, and recording medium containing the imaging program
US11037320B1 (en) 2016-03-01 2021-06-15 AI Incorporated Method for estimating distance using point measurement and color depth
US10311590B1 (en) 2016-03-01 2019-06-04 AI Incorporated Method for estimating distance using point measurement and color depth
TWI630345B (en) 2017-12-26 2018-07-21 財團法人工業技術研究院 Illumination apparatus

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6873733B2 (en) * 2001-01-19 2005-03-29 The Regents Of The University Of Colorado Combined wavefront coding and amplitude contrast imaging systems
US6940649B2 (en) * 1995-02-03 2005-09-06 The Regents Of The University Of Colorado Wavefront coded imaging systems
TW200841703A (en) * 2007-04-13 2008-10-16 Primax Electronics Co Ltd Image processing method and related partial PSF estimation method thereof

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5521695A (en) * 1993-06-25 1996-05-28 The Regents Of The University Of Colorado Range estimation apparatus and method
JP3275010B2 (en) * 1995-02-03 2002-04-15 The Regents of the University of Colorado Optical system with extended depth of field
US7646549B2 (en) * 2006-12-18 2010-01-12 Xceed Imaging Ltd Imaging system and method for providing extended depth of focus, range extraction and super resolved imaging
CN101241173B (en) 2007-02-07 2011-08-24 南京理工大学 Infrared stereoscopic vision thermal image method and its system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Silvia Cirstea, Amar Aggoun, Malcolm McCormick, "Depth Extraction from 3D-Integral Images Approached as an Inverse Problem", IEEE International Symposium on Industrial Electronics (ISIE 2008), pp. 798-802, Jun. 30-Jul. 2, 2008. *
Zhao Xin, Sun Mingzhu, Yu Bin, Lu Guizhang, Liu Jingtai, Huang Dagang, "Extracting Depth Information from Microscopic Image of Micro Manipulator", Proceedings of the 2004 IEEE International Conference on Robotics and Biomimetics, pp. 629-633, Aug. 22-26, 2004. *


Also Published As

Publication number Publication date
US20100215219A1 (en) 2010-08-26
TW201031895A (en) 2010-09-01
US8467578B2 (en) 2013-06-18

Similar Documents

Publication Publication Date Title
TWI399524B (en) Method and apparatus for extracting scenery depth imformation
JP6260006B2 (en) IMAGING DEVICE, IMAGING SYSTEM USING THE SAME, ELECTRONIC MIRROR SYSTEM, AND RANGING DEVICE
US7826067B2 (en) Method and apparatus for quantitative 3-D imaging
JP6319329B2 (en) Surface attribute estimation using plenoptic camera
US7916309B2 (en) Single-lens, single-aperture, single-sensor 3-D imaging device
KR101639227B1 (en) Three dimensional shape measurment apparatus
JP4807986B2 (en) Image input device
JP2013185832A (en) Information processing apparatus and information processing method
Schöberl et al. Dimensioning of optical birefringent anti-alias filters for digital cameras
WO2015059971A1 (en) Imaging device and phase difference detection method
JP2022128517A (en) ranging camera
CN108805921A (en) Image-taking system and method
US20150062399A1 (en) Imaging apparatus and method for controlling imaging apparatus
KR101706934B1 (en) 3D Measurement Method for Micro-optical Structure Using Digital Holography Data, and Inspection Machine Operated Thereby
CN101846798B (en) Method and device for acquiring scene depth information
US20230316708A1 (en) Signal processing device, signal processing method, and program
JP2013102362A (en) Optical device, image processing method and program
JP7112223B2 (en) optical device
Lu et al. Analysis and design of optical system for the defocused light field cameras
Wang et al. Depth estimation by combining stereo matching and coded aperture
Bae et al. Supporting Information for" Machine-learned Light-field Camera Reads Facial Expression from High Contrast and Illumination Invariant 3D Facial Images"
KR20160017418A (en) 3D camera apparatus, obtaining 3D image information using the same

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees