TW200528945A - 3D visual measurement system using fish-eye cameras as visual detectors and method for constructing same - Google Patents

3D visual measurement system using fish-eye cameras as visual detectors and method for constructing same

Info

Publication number
TW200528945A
TW200528945A TW94101592A
Authority
TW
Taiwan
Prior art keywords
camera
image
eye
projection
eye camera
Prior art date
Application number
TW94101592A
Other languages
Chinese (zh)
Other versions
TWI375136B (en)
Inventor
Chuang-Jan Chang
Original Assignee
Chuang-Jan Chang
Priority date
Filing date
Publication date
Application filed by Chuang-Jan Chang filed Critical Chuang-Jan Chang
Priority to TW94101592A priority Critical patent/TW200528945A/en
Publication of TW200528945A publication Critical patent/TW200528945A/en
Application granted granted Critical
Publication of TWI375136B publication Critical patent/TWI375136B/zh


Abstract

The invention provides a 3D visual measurement system that uses fish-eye cameras as visual detectors, and a method for constructing it. The system can obtain the internal optical parameters that describe the camera's projective geometry, such as the focal constant, the projection function, and the distortion center, as well as the external parameters, namely the unique projection center and the direction of the optical axis in physical space, which serve as references for constructing the machine-vision coordinate system. A 2D fish-eye image can thus be converted into spatial light paths referred to the projection center; these conversions are the decisive core of a feasible computer vision model. Two fish-eye cameras located at different positions in physical space serve as a left eye and a right eye, respectively. The invention can then adopt the measurement-deduction techniques developed for perspective images from conventional pinhole cameras to realize 3D visual measurement. The operative viewing angle is thereby expanded, and high measurement precision is retained even when the viewing angle exceeds 100 degrees, which is impossible in the conventional art. Furthermore, the cost is comparatively low because the invention adopts conventional low-cost lenses.

Description

IX. Description of the Invention (200528945)

【Technical Field of the Invention】

The present invention relates to a three-dimensional (3D) visual measurement system that uses fish-eye cameras as visual sensors, and to a method of constructing such a system. The internal optical parameters of the fish-eye camera are derived and used as the basis for its computer vision model, so that the system retains metric accuracy at viewing angles unattainable by conventional systems while being assembled from inexpensive components.

【Prior Art】

A visual measurement system is a non-contact measurement system that captures images of an object in physical space in order to recover the coordinates of its three-dimensional position. Conventional techniques can only handle images that closely approximate pinhole projection, whereas the measurement system developed here can employ projections that deviate strongly from the pinhole model. Treating a fish-eye camera as a visual sensor involves two major topics: first, the computer vision model of the fish-eye camera, that is, the image-conversion logic that resolves its projection behavior; and second, the determination of that model's position and orientation in physical space. From these two, a 3D visual measurement system can be constructed. Existing machine-vision techniques all develop their camera models from pinhole projection and therefore cannot be applied to fish-eye images. The technical background, achievements, and shortcomings of the two fields are introduced below.

〈Lenses mounted on conventional cameras〉

Conventional lenses are designed with reference to the ideal rectilinear perspective projection, here called pinhole projection, an imaging mechanism common throughout optical physics. Because the deductive algebra of pinhole geometry is complete, it is the oldest and most thoroughly studied camera model in computer vision; other projection types have received far less attention.

〈Current status of fish-eye lenses and wide-angle machine vision〉

A camera fitted with a fish-eye lens captures sharp images of essentially infinite depth of field, over a field of view that can even exceed 180 degrees. The optical projection geometry of a fish-eye lens, however, differs greatly from the pinhole model, chiefly through severe barrel distortion. If fish-eye camera images (hereinafter fish-eye images) are processed with the deductive techniques developed for pinhole projection, the results are inaccurate, so mature computer-vision metrology has had almost no answer for fish-eye images.

To break through the small operating viewing angle of machine vision, omnidirectional imaging (digital imaging that captures close to, or beyond, a hemispherical solid angle of view in a single exposure) has recently flourished. Because no effective computer vision model existed for fish-eye lenses (which omnidirectional imaging also calls dioptric sensors), the field largely abandoned them in favor of the more complex catadioptric sensor. A catadioptric design uses one or more additional, very precisely shaped mirrors (or compound prisms) to reflect a field of view larger than a hemisphere into an ordinary camera, which thereby acquires the image indirectly; the techniques disclosed in Taiwan patents 378454 and 382067 and in US patent 6,118,474 are examples. Such designs, however, suffer optical-alignment problems when the curved mirror is mounted, making the apparatus complex and expensive; the reflective elements degrade the image signal; and placing the mirror in front of the lens creates unavoidable blind spots, an inconvenience on many occasions. To date, no 3D stereoscopic measurement device has been erected on such sensors.

Other conventional wide-angle approaches capture a full panorama by rotating a camera, or by arranging several cameras with overlapping fields of view and stitching their images together, as in US patent 6,25M58 B1. A rotating camera, however, cannot capture all the images around a target at once and cannot work at close range, while the weight of the camera and its rotation mechanism makes the assembly bulky and hard to miniaturize. A multi-camera arrangement raises cost and multiplies failure points, and how to sample and stitch the images taken by the individual cameras poses many further difficulties.

〈Existing techniques for processing fish-eye camera images〉

Fish-eye images suffer none of the drawbacks above, but because digital machine vision could not recover physical-space information from them, their use has been limited, and the existing techniques are makeshift simplifications. One of them postulates that, within a hemispherical field of view, every light path in physical space is projected into the fish-eye image according to a single specific projection function. The geometric position of a feature on the planar image can then be inverted directly into its spatial viewing angle, and this inversion serves as the computational basis for image processing or remapping. Referring to Fig. 1A and Fig. 1B: Fig. 1A illustrates that the imaging system produces a circular imaging region 1 with a resolvable boundary, and Fig. 1B illustrates the spatial light paths from which the feature points and lines of Fig. 1A originate. Both figures are annotated with the corresponding off-axis angle α and around-axis angle β (azimuthal distance) in physical space; the two terms are explained next.

To present the projection geometry in which incident rays converge on the projection center, Fig. 1B arranges a virtual hemisphere to show the solid-angle position of each endpoint relative to the sphere's center (the center being the projection center of the incident light). The hemisphere may be imagined as the northern half of a globe: the direction of around-axis angle β = 0 lies on the prime meridian 13′ (the date line), and the reference for off-axis angle α = 0 is the polar light path C′. On this convention a light path A′ is specified by its two angles, written R[A′] = (α, β). Likewise the light paths of the figure can be defined as R[B′] = (π/2, π/2); R[C′] = (0, b), with b arbitrary; R[D′] = (π/8, 3π/2); R[E′] = (π/4, 3π/2); R[F′] = (3π/8, 3π/2); and a horizontal path at (π/2, π).

Under the hypothesis that the fish-eye imaging model obeys a single specific projection function, the viewing rays and the image plane satisfy the following boundary conditions: the imaging region of the image is a resolvable circle or ellipse; its center is the imaging position of R[C′], which is the distortion center of the image plane (discussed in detail in later paragraphs); and the edge of the imaging region is formed by the horizontal rays incident along the equator (α = π/2). With the principal distance (hereinafter f), the intermediate light paths D′, E′, and F′ can then be related to their positions on the image plane between center and edge. The generality of this monocular scheme is limited, however, because it requires a particular fish-eye lens to be combined with a specific camera body; this restricts the patented method (US patent 5,185,667).

〈Strengths and shortcomings of US patent 5,185,667 in fish-eye image processing〉

However it is arranged, patent 5,185,667 cannot deduce the solid angle of the spatial light path for some fish-eye lenses, because it disregards the following basic optical properties and their possible variations.

First, the projection function of a fish-eye lens is not unique: equidistant projection (r = f·α, hereinafter EDP) is not the only imaging mode a fish-eye lens may have. Fig. 2 plots three typical fish-eye projection curves; the designed projection mechanism of a lens may instead be stereographic projection (r = 2f·tan(α/2)) or orthographic projection (r = f·sin α).

Second, the stereo viewing angle of a mounted lens is not necessarily a hemisphere; it may well be larger or smaller, and the viewing angle cannot be judged from the image alone, because whatever the field of view, the imaging region remains a circle (or ellipse). As can also be seen from Fig. 2, image brightness falls off sharply toward the rim, especially with cheap, simple lenses, so the image boundary is very hard to locate precisely (considering vignetting, a well-defined edge feature may not even exist), and effective image conversion cannot be carried out.

Summing up these points: deducing spatial information directly from the image-forming geometry is intimately tied to the optical components chosen, and in some embodiments it plainly fails. The prior art does not explore how to obtain the internal parameters that rigorously describe the projection, let alone how to determine the external parameters that relate the camera to physical space. Under these limitations the fish-eye lens could not be taken into more precise fields of application.

The present invention investigates these related topics. The methodology developed here for obtaining camera parameters is not subject to the image-presentation requirements of the prior art, and it can accurately obtain the optical parameters of a target fish-eye camera. The fish-eye image can then be remapped, according to the deduced optical parameters, into images possessing metric accuracy, which are further used to develop three-dimensional visual metrology. Erecting a 3D visual measurement system on a camera as strongly nonlinear as a fish-eye camera is the most innovative part of this invention.

〈Current machine-vision imaging technology〉
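The three circular projection functions typically attributed to fish-eye lenses, with the rectilinear pinhole law as reference, can be written down directly. A minimal sketch (the function and model names follow common optics usage and are illustrative, not taken from the patent):

```python
import math

def image_height(alpha, f, model):
    """Radial image height rho for an off-axis angle alpha (radians)
    and focal constant f, under one candidate projection model."""
    if model == "equidistant":      # rho = f * alpha  (EDP)
        return f * alpha
    if model == "stereographic":    # rho = 2f * tan(alpha / 2)
        return 2.0 * f * math.tan(alpha / 2.0)
    if model == "orthographic":     # rho = f * sin(alpha)
        return f * math.sin(alpha)
    if model == "rectilinear":      # rho = f * tan(alpha), pinhole reference
        return f * math.tan(alpha)
    raise ValueError("unknown model: " + model)

# At alpha = 60 degrees the candidate models already disagree sharply,
# which is why the model itself must be identified before measuring:
a = math.radians(60)
for m in ("equidistant", "stereographic", "orthographic"):
    print(m, round(image_height(a, 1.0, m), 4))
```

All three curves agree to first order near alpha = 0, so a lens's model can only be told apart from wide-angle observations, consistent with the emphasis above on the effective range of viewing angles.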

Using cameras as the visual sensors of a robotic device, like human eyes, is known as machine vision (robot vision) or visual servoing. The concept of a robot obtaining workpiece-position information through visual servoing had already taken shape by the late 1970s. It divides broadly into two categories: position-based visual servoing (PBVS) and image-based visual servoing (IBVS). PBVS requires the precise 3D coordinates of the target and the precise position of the camera; its computation is heavy and complex, and it is ill suited to the real-time control of robot vision and image-tracking systems. IBVS processes the two-dimensional image of the target's features to obtain target information, which is fed back as a signal to guide the robot's motion or to steer the camera, achieving robot positioning, image tracking, and similar goals. IBVS is therefore comparatively easy to realize, but it demands that the image signals produced be accurate and robust, and at present only pinhole-projection imaging is mature enough to supply them.

Arranging two (or more) pinhole cameras at different positions in physical space to form a 3D visual measurement system is a well-known technique. With two pinhole cameras, the difference in camera positions makes the image of one workpiece point shift between the two image planes; this displacement is the disparity (parallax). If the relative orientation of the two camera bodies mounted on the robot device can be determined, trigonometric equations yield the spatial position of the point: given the corresponding imaging positions of an object point in the two cameras, and taking the two cameras' projection centers as reference points, the 3D position follows from triangular geometry. The technique is discussed in many computer-vision textbooks.

〈Computer model of the pinhole camera〉

Pinhole projection is the ideal model of a machine-vision sensor: the more closely a lens realizes this projection mechanism, the higher the accuracy the system can reach. By the nature of optical components, a lens of large aperture and small viewing angle comes closest to the ideal, which is why high-precision systems work with narrow fields of view; within specialized domains this is of great benefit. Very importantly, the measured object needs no electrical or mechanical signal connection with the equipment: its geometry reaches the computer system purely as image information. Such a setup is simple, and productivity and reliability rise accordingly.

Tsai's calibration methodology is used to correct images into conformity with the pinhole model and to determine the camera's parameters, two important themes of machine vision; with it, cameras other than premium ones can still achieve good effective resolution, and even inexpensive cameras can be handled. A Tsai-style camera model is described by parameters such as the following:

f: effective focal length;

κ1: radial distortion coefficient;

(u0, v0): coordinate position of the lens distortion center (in the digital coordinate system of the computer display);

sx: horizontal scale factor (because of the sampling-rate ratio between camera and digitizer, a square physical object would otherwise be displayed as a rectangular image);

Tx, Ty, Tz and the rotation angles: the positional and orientational relationship between the camera and the origin of external space.

The internal parameters describe the projection geometry of the camera, while the external parameters give the camera's pose, the datum of the machine-vision coordinate system; from these the projection light paths in space follow. The parameters used in Tsai's methodology are not immutable: some models follow Tsai's description, and much of the literature uses more, or slightly different, parameters. It suffices to note that the parameters of the Tsai model are not adequate for a fish-eye lens; no linear-perspective parameter set of this kind can be fitted to one.

The object of the present invention is a method of resolving the camera parameters of lenses equipped with highly nonlinear perspective-projection mechanisms, and of erecting a three-dimensional visual measurement system on such cameras, so that the working angle of view of a 3D vision system can be expanded without loss of the system's metric accuracy. The technical content is based on, and develops further, the inventor's Republic of China patent applications (including 92109160 and 92109159); part of the content concerning the 3D visual measurement system was earlier disclosed in Mr. Chuang-Jan Chang's National Taiwan University engineering thesis. The invention follows generally accepted optical hypotheses to derive the fish-eye machine-vision model and verifies it by measurement, guided by the multi-collimator (a World War II-era precision mechanical assembly that uses multiple mechanical structures to generate several groups of determinate stereo light paths, then used to identify the projection geometry of the lenses of very large aerial cameras) to relate a physical target and the image it presents, the two sides of the visual model. The deduction process is summarized as follows:

1. Exploiting the property that a fish-eye image attenuates monotonically from the distortion center outward, a set of measurements taken while the camera and target are actively controlled as a whole determines the pixel aspect ratio, and the attitude at which central symmetry appears fixes the orientation of the camera's optical axis, indirectly determining most of the external parameters, namely the camera's direction and position.

2. For obtaining the focal constant and the projection center, two solution algorithms are proposed.
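For contrast with the fish-eye case, the pinhole-model parameter list discussed above can be exercised in a short sketch. The one-term division-based distortion used here is a simplification for illustration, not Tsai's exact formulation:

```python
def pinhole_project(X, Y, Z, f, k1, u0, v0, sx):
    """Project a camera-frame point with a pinhole-plus-radial-distortion
    intrinsic model in the spirit of the Tsai parameter list (simplified)."""
    x, y = f * X / Z, f * Y / Z                          # ideal rectilinear image point
    r2 = x * x + y * y
    xd, yd = x / (1.0 + k1 * r2), y / (1.0 + k1 * r2)    # one-term radial distortion
    return sx * xd + u0, yd + v0                         # scale/offset to pixel coords

# With k1 = 0 the model degenerates to a pure pinhole camera:
print(pinhole_project(0.1, 0.0, 1.0, 800.0, 0.0, 320.0, 240.0, 1.0))  # -> (400.0, 240.0)
```

No parameter set of this linear-perspective form can follow a fish-eye lens, whose image height stays finite as the off-axis angle approaches 90 degrees.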

These algorithms determine the projection center's position on the optical axis, identify the projection model, the effective focal constant, and the effective range of viewing angles.
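The identification of the projection model and the focal constant can be illustrated with a least-squares sketch: fit each candidate circular function for its focal constant and keep the best fit. The candidate set and selection rule here are illustrative only and are not the patent's algorithms:

```python
import math

def identify_model(samples):
    """Given measured (alpha, rho) pairs from known light paths, fit each
    candidate circular function for its focal constant f by least squares
    and return the best-fitting (model name, f)."""
    candidates = {
        "equidistant":   lambda a: a,
        "stereographic": lambda a: 2.0 * math.tan(a / 2.0),
        "orthographic":  lambda a: math.sin(a),
    }
    best = None
    for name, g in candidates.items():
        gs = [g(a) for a, _ in samples]
        # rho ~ f * g(alpha): closed-form least-squares estimate of f.
        f = sum(gi * r for gi, (_, r) in zip(gs, samples)) / sum(gi * gi for gi in gs)
        err = sum((r - f * gi) ** 2 for gi, (_, r) in zip(gs, samples))
        if best is None or err < best[2]:
            best = (name, f, err)
    return best[0], best[1]

# Synthetic data from an ideal equidistant lens with f = 2.0:
data = [(math.radians(d), 2.0 * math.radians(d)) for d in (10, 30, 50, 70, 90)]
print(identify_model(data))  # -> ('equidistant', 2.0)
```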

Si it映攝機制可以通用於各類鏡頭,包含‘ 人類視===投雜奴倾㈣龜,例如模擬 特,ιΓΐίΙ學具有操作敏感度和桶狀變形度成正比的 利的解'失真Α的鏡頭—一侧艮有 13 1 .的參數,足夠描述魚眼相機的視覺 2 與it夠成功地重摺魚眼影像為各種不同袼式的 〜,虽^包含重摺(remapping)成針孔影像。 200528945 維視景I㈤ 裝延伸置架設—三 相機模型“ 知道單一魚眼 眼」標度量輸「左眼」和「'右 平台的方向後,_度方位 :f相機的方位來整流兩相機:‘= 統。此糸在達到⑽。的工作 相關應用大幅提f 非*低’且能大視角工作,因而 懂,和其他目的、特徵、和優點能更明顯易 ^。下文特舉—祕貫施例,並配合所關式,作詳細說明如 【實施方式】 本發明中所揭露的技術係以中主 mi i8中日。揭露於張創然先生的台灣大^^&quot;程= 前面大::&quot;一部份綱 I 機^卜部光學參數的演算法),投雜說一:『在 “二1犯圍」内,魚眼相機的投射模型符合一典型的「圓 :數發,出兩種演算法(命名為:1.「ε-演算法」 Ί绎相機的光學參數;投射假說二:『一視 ϊίΐ ϊί應「呈影點」——映射』的特性,·據此本發 Ϊ it更通用性的演算法、可以應用在非特定「圓形函數」 、又射权3L。第一部分是揭露三維視影度量系統的架設方法及 14 200528945 ί裝置元件’這部分必須以相機光學參數已經正確獲得為前 (sight-ray) (Pinhole model)的線性透視投射==—;;·般热知之針孔模式 相較於其他參考線性投影設計的鏡其投射行為。 度較大的優點,但影像有嚴重見直野f度触 iio^tiQni; r魚目嫩象參考針孔模式的變形量Ur平 面上王中心對稱(此中心點稱為失直 义❿里〜像十 且變形量沿著徑射方向增大。〜中Wpnncipal P〇int) 相機的成像投射機制可描述為:於 -T«^F0V) 動發光及反射光)會邏輯上(注意:不 ,,實體投影中心點)匯聚於空間中:唯二 (或稱為投影中心,或viewp〇in1:,簡之饴= it二路則環形對稱於相機的光學軸。此幾何光學 視覺技藝人士所熟知的基礎理論;但於電腦 ϊΐϊΐίΐ 1綱縣肋錢當的解龍術,而僅止 投射模型(即是針孔投射模型)為基礎發展相 ,=視覺錢。這種關是來自於魚眼影像相對巨量的負徑 向失Ϊ(即桶狀失真)尚未有可用的解析方式。 、 彳文杈射函數數學模式的觀點,參考光學軸空間對稱的幾何 「㈣圖形可以在相機内映射出中心對稱的影像。例如安 ~第3圖」所示的具有中心對稱圖案220 (physical central-sy麵etlT pattern,以下簡稱pcp)的平面圖靶^ f相機視野中’然後調整圖靶22與相機間的相對方位,使在 衫像平面23上得到一中心對稱影像23〇 ( imaged 200528945 central-symmetry pattern,以下簡稱 icp),如「第 ΐϊ (=為習知的魚眼投射模型)。或且為或當得到對‘影 像化,也表示此時光學軸21同時正交通過影像中心235圄 靶中心225,並且前基點(front cardinal p〇int,簡稱為f θ 242 與後基點(back cardinal p〇int,簡稱為術) 兩機點也都在光學軸21上。由於圖靶22上的為心 ^放在已知的絕對方位,故可做為參考座標位置來 中光學轴21的方位。因此如何操控pcp 22〇 立 以得到- ICP 230是本發明的—個核心的程=钱6〇的方位 苓照「第4圖」在光學軸21上的前基點242是建 Ξ ίΓ的參考點;後基點243是建構相機内部光準 點。n魚眼鏡頭的邏輯等效圖示,這兩點“ 要圖」中的PCP挪,可視為仿真習知的多準 祕排列的光學佈置。長久以來 ,數;其_多只㈣_、__源么生面= 位±蚊轉巾的絲。^此闕空間絕對 使達到影躲清晰陳況,此啦位 裝設的點光源模擬來自無限遠/已直^在辦的 可被精確地量測,故可以經由直tmt之影像點的位置 軸角對像高的投射剖面(promf)里/、'j數據仔到一個透鏡的偏 就操作原理,多準直^ 空間對稱投射光路的影像元件、,則任何具有光學轴 投射模式。因此,有能力鐵置的 的圓形函數投射。(補充說明:「2图所口、义:個封閉 16 200528945 的投射機制。作县,夕 實驗室中實現7^二ί ΐ11的精密弧形機械結構报難在-般 幾何排列^容^提出以平關形來間接仿真多準直“ 體實施^ 礎^計之的-具 參照「第^ ft發明方法的理論基礎。請再次 22,並洛亍t 、、、且成儀③糸統的二度空間的平面圖乾 圖;圖中ΐί用在機严減空間產生的投射光路= 機。如果相機的^$4入^^平面3效表示魚眼相 係(補充說明.· 二一了 λ圓形投射函數關The Si it 
imaging mechanism can be universally used for various types of lenses, including 'human vision === throwing turtles, such as analog special, ιΓΐίΙ learns that the operation sensitivity is directly proportional to the barrel deformation' distortion Α Lens-13 1. Parameters on one side, enough to describe the vision of the fisheye camera 2 and it successfully refold the fisheye image into a variety of different styles, although ^ includes remapping into pinholes image. 200528945 Dimensional View I㈤ Extending and Mounting—Three-camera model "Knows a single fisheye eye" after measuring "Left Eye" and "'Right platform's direction, _degree orientation: f camera orientation to rectify two cameras:' = 。. This is reaching ⑽. Work-related applications have greatly improved f non * low 'and can work from a wide perspective, so understand, and other purposes, characteristics, and advantages can be more obvious ^. Special exemplified below-secret execution For example, and in accordance with the relevant formula, the detailed description is as follows: [Embodiment] The technology disclosed in the present invention is based on the Chinese host mi i8 in China and Japan. It was disclosed in Mr. Zhang Chuangran ’s Taiwan University ^^ &quot; 程 =: Large :: & quot Part of the algorithm of the optical parameters of the outline I machine ^ Bu Department), the miscellaneous say one: "In the" two 1 crime ", the projection model of the fisheye camera conforms to a typical" circle: number of shots, two kinds of calculations Method (named: 1. "ε-Algorithm" to interpret the optical parameters of the camera; projection hypothesis two: "one view ϊ ΐ ϊ 应 should be a" shadow point "-mapping" characteristics, according to this issue Ϊ it more Versatile algorithms can be applied to non-specific `` round functions '' 3L. 
The first part is to reveal the method of setting up a three-dimensional video measurement system and 14 200528945 ί device components' This part must be based on the linear optical projection of the camera's optical parameters have been correctly obtained (sight-ray) (Pinhole model) == —; ; The pinhole mode of general heat knows its projection behavior compared to other mirrors designed with reference to linear projection. It has the advantage of a larger degree, but the image has a severe sight of Naono f degree iio ^ tiQni; r fisheye tender elephant reference pinhole The deformation amount of the pattern is symmetric on the center of the king plane (this center point is called unstraighteline ❿ ~ like ten and the deformation amount increases along the radial direction. ~ 中 Wpnncipal P〇int) The imaging projection mechanism of the camera can be described as : At -T «^ F0V) dynamic light and reflected light) will logically (note: no, the center point of the physical projection) converge in the space: Wei Er (also called the projection center, or viewp〇in1 :, Jian Zhi 饴 = It two-way is circularly symmetrical to the optical axis of the camera. This basic theory is well known to those skilled in the art of geometric optics and vision; however, the computer is only a projection method (ie pinhole projection) Model) based Fundamental development phase, = visual money. This kind of relationship comes from the relatively large amount of negative radial loss (that is, barrel distortion) of fisheye images. There is no analytical method available yet. Refer to the geometrical symmetry of the optical axis. The “㈣ graphic can map a centrally symmetric image in the camera. For example, Ann ~ Figure 3” has a centrally symmetric pattern 220 (physical central-sy plane etlT pattern, hereinafter referred to as pcp). 
Plan view target ^ f camera field of vision 'and then adjust the relative orientation between the map target 22 and the camera to obtain a centrally symmetric image 23 on the shirt image plane 23 (imaged 200528945 central-symmetry pattern, hereinafter referred to as icp), such as "第ΐϊ (= for the conventional fisheye projection model). Or if or when the image is obtained, it also means that the optical axis 21 passes through the image center 235 and the target center 225 at the same time at the same time, and the front cardinal point (referred to as f θ 242 and the back cardinal point (back cardinal point, abbreviated as surgery) Both machine points are also on the optical axis 21. Since the target on the target 22 is placed at a known absolute orientation, it can be used as a reference coordinate position to center the optical axis 21. Orientation. So how to control the PCP 22 ° to get-ICP 230 is the invention-a core process = Qian 60. Orientation according to the "Picture 4" on the optical axis 21 is the base 242 The reference point; the rear base point 243 is the construction of the camera's internal light point. The logical equivalent of the fisheye lens. The PCP in these two points “required pictures” can be regarded as a simulated optical arrangement with multiple semi-secret arrangements. For a long time, the number; its _ more than ㈣ _, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ __ _ _ _ ___ ___ ___ ___ ______________________________? ^ This 阙 space absolutely makes it possible to achieve clear shadows. The point light source installed here simulates infinite. Far / already can be accurately measured, so it can be measured by straight tmt image points. Set the angle of the projected profile (promf) of the axial angle to the height of the lens, and the operation principle of the data from a lens to a lens. Multi-collimated ^ spatially symmetric projection optical path of the image element, any optical axis projection mode. 
Therefore, the imaging can be described by a closed circular projection function. (Supplementary note: a multi-collimation mechanism of laboratory-grade precision is difficult to realize with a general mechanical arrangement, so the invention puts forward the indirect simulation of multi-collimation with a planar target as the implementation basis of the system.) With reference to the theoretical basis of the invention's method, refer again to the target 22 and to the two-dimensional plan of the system: the projection light paths of the camera in space are reduced, through the circular projection function, to the image plane of the fisheye camera.

〈Collimating mechanism〉 All incident rays first converge at the front cardinal point (FCP) before entering the camera interior, and are then refracted divergently from a back cardinal point (BCP) 243 according to the projection function so as to form the image on the image plane 23. FCP 242 and BCP 243 are the two cardinal points that describe the projection behavior of the fisheye lens 24, and together they define the projection spaces inside and outside the fisheye camera. (Supplementary note: to the inventor's knowledge, apart from the present invention no literature to date discusses how to introduce these two cardinal points into a computer vision system.) When analyzing the projection mechanism of the fisheye camera, FCP 242 serves as the reference for sight rays and BCP 243 as the reference for the image plane 23. The distance between these two nodes is not a parameter of the camera and may be set to any value, so the present invention merges FCP 242 with BCP 243.
A point 302 on the small sphere maps to a point 231 on the map 23. In this way, a two-dimensional plane of finite area can display the surface geography of the small sphere. The fisheye imaging mechanism can thus be interpreted as a "pinhole projection onto a hemispherical image" in series with a certain kind of "map projection". Taking an EDP fisheye lens as an example, an incident ray δ(α, β) (defined by the segment 21-241) no longer travels in a straight line after passing the viewpoint: the projection function produces a nonlinear refraction along the light path (defined by the segment 241-231). The ray first projects linearly, as through a pinhole lens, to the point 302 on the sphere surface, and the map-drawing logic then forms the image at the point 231. Such a decomposition is convenient for computer processing. 〈Mechanism by which the target generates collimated incident light〉 The figure illustrates the fisheye camera's imaging geometry in two dimensions, together with (2) the point-source structure of multiple collimated beams emulated by a properly placed PCP 220, and (3) the reference small sphere 30 and the large sphere defined by the circles on the PCP 220. Taking the outermost concentric circular track of the PCP 220, the radius of the large sphere 40 is defined by the line segment from a feature point 221 to the VP; this large sphere 40 is bounded by the outermost circle of the plan target 22. Any circular track on the PCP 220 can be described individually by this mechanism. 〈The target generates a cone light path〉 The sight ray projected from any object point 221 on the PCP 220 essentially passes orthogonally through the surface of the sphere 30 at an incidence point 301 and converges toward the sphere center (that is, VP 241). In this way, every concentric circle of the PCP 220 constructs, in the space outside the camera, a cone of light whose base is its intersection with the PCP 220 and whose common vertex is at the VP; the stereo light path in the figure originates from the outermost concentric circular track. Inside the camera, the sight ray is refracted through the VP onto the image plane 23 according to the projection function. As in "Figure 4", the feature points 221 and 301, the VP, and the image point 231 describe one complete light path. 〈Method of obtaining the ICP〉 Only when the optical axis 21 is orthogonally aligned with the target does the image present concentric symmetry; the geometric center of symmetry of the ICP 230 is the distortion center. Therefore, the target 22 and the test camera are adjusted appropriately until the symmetry of the formed image meets the set accuracy. The feature coordinates of the image point mapped from the target center 225 then give the distortion center 235 (principal point), taken as the origin C′(0, 0). Since the straight line passing through the distortion center 235 and the pattern center 225 is the optical axis, the above procedure realizes the function of tracking the orientation of the optical axis 21; this is part of the process of determining the camera's external parameters.
Therefore, the present invention merges FCP 242 and BCP 243 into a single VP 241, as shown in "Figure 5A", so that the circular-function logic of the imaging is represented uniformly. "Figure 5A" can be regarded as the light-path projection diagram of the meridional plane containing the optical axis 21 in the stereo model of "Figure 4". The figure shows that α′ is derived backward from the image height ρ and the focal constant f; the logical relation between α and α′ is determined by the native projection mode of the lens under test. (Supplementary note: this is much like a refraction phenomenon.) 〈Coordinate systems〉 To describe the implementation of the system, the referenced coordinate systems are first defined. (1) The platform coordinate system W(X, Y, Z) is defined by the three mutually orthogonal rigid base axes of the positioning platform (the X′ base axis 51, the Y′ base axis 52 and the Z′ base axis 53); it is the controllable and observable absolute reference of the test, and the unit of its three components is the meter. (2) The target coordinate system T(X, Y, Z) takes the center of the arrangement on the target 22 as origin, with the target normal as the reference direction of the Z axis; the target 22 is movable, and its displacement is expressed in the absolute coordinates W(X, Y, Z). (3) The camera coordinate system E(α, β, h) borrows the latitude-longitude convention of geodesy to label directions, with h as the height of a point above the equatorial plane 31. With VP 241 (or the equatorial plane 31) as reference, the lens 24 divides the projection space inside and outside the camera into northern and southern hemispheres; α is positive in the northern hemisphere and negative in the southern hemisphere, and its unit is the degree.
(4) The light-path coordinate system δ(α, β) describes a unit sight-ray vector, one endpoint of which is fixed at the VP; relative to E(α, β, h), its α and β are the same while h may take any value. (5) The image-plane coordinate system C′(x, y) takes the distortion center 235 as origin and expresses the image plane 23 both in rectangular coordinates (x, y) and in polar coordinates (ρ, β); the units of x, y and ρ are lengths, β is an angle, and this system basically cannot be observed directly. (6) The pixel coordinate system I(u, v) is the coordinate system of the image presented on the display screen of the computer system, with u and v in pixels; the distortion center is imaged on the screen at I(uc, vc). Basically, the camera's mapping onto the image plane 23, P′(ρ′, β′), has its analogue in the I(u, v) system, and any feature point can also be expressed in the pixel system as C(u, v) with the point (uc, vc) as origin. "Figure 5B" marks the directions and positions of the two coordinate systems E(α, β, h) and T(X, Y, Z) after the calibration system is constructed. The operating objective of the experimental system developed for the invention is to make the Z axis of the target coordinate system T(X, Y, Z) coincide with the optical axis 21 of the camera system E(α, β, h). 〈Two-stage projection fisheye image visual model〉 The inventor found that the map-projection logic used in cartography (a branch of earth science) is the same as the imaging logic of common fisheye lenses, so the mature terminology of that discipline is borrowed here to help describe the image conversion method of the fisheye visual model of the present invention. A particular map-drawing logic is already embedded in cartography: because the actual size of the earth and that of a map are incomparable,
the surface of the earth is first mapped, with the earth's center as reference, onto a small sphere 30 (in cartographic terms, a globe), and the spherical surface of the small sphere 30 is then mapped onto a drawing surface of finite area. In "Figure 5B", a point 302 on the small sphere is a point mapped from the earth's surface, and it maps in turn to a point 231 on the map 23. In this way a two-dimensional plane of finite area can display the surface geography of the small sphere. The fisheye imaging mechanism can be interpreted as a "pinhole projection with a hemispherical image" in series with a certain "map projection". Taking the EDP fisheye lens as an example again, an incident ray δ(α, β) (defined by the segment 21-241) no longer travels straight after passing the viewpoint: the projection function produces a nonlinear refraction along the light path (defined by the segment 241-231). The ray first projects linearly, as through a pinhole lens, to the point 302 on the sphere surface, and the map-drawing logic then forms the image at the point 231. This decomposition is convenient for computer processing. 〈Mechanism by which the target generates collimated incident light〉 The figure illustrates the fisheye camera's imaging geometry in two dimensions, together with (2) the point-source structure of multiple collimated beams emulated by a properly placed PCP 220, and (3) the reference small sphere 30 and the large sphere defined by the circles on the PCP 220. Taking the outermost concentric circular track of the PCP 220, the radius of the large sphere 40 is defined by the line segment from a feature point 221 to the VP; this large sphere 40 is bounded by the outermost circle of the plan target 22. Any circular track on the PCP 220 can be described individually by this mechanism.
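The two-stage interpretation above can be sketched in code, assuming an equidistant (EDP) lens: stage one projects a sight ray through the VP linearly onto the image sphere (the point 302), and stage two applies an azimuthal-equidistant map projection onto the image plane (the point 231). The sphere radius and focal constant below are illustrative assumptions, not parameters from the patent.

```python
import math

def sight_ray_to_sphere(alpha_deg: float, beta_deg: float, radius: float = 1.0):
    """Stage 1: a sight ray through the VP, with off-axis angle alpha and
    azimuth beta, pierces the image sphere at one point (cf. point 302)."""
    a, b = math.radians(alpha_deg), math.radians(beta_deg)
    return (radius * math.sin(a) * math.cos(b),
            radius * math.sin(a) * math.sin(b),
            radius * math.cos(a))

def sphere_to_image_edp(alpha_deg: float, beta_deg: float, f_mm: float):
    """Stage 2: azimuthal-equidistant map projection; under EDP the image
    height is rho = f * alpha while the azimuth beta is preserved
    (cf. point 231 on the image plane)."""
    rho = f_mm * math.radians(alpha_deg)
    b = math.radians(beta_deg)
    return (rho * math.cos(b), rho * math.sin(b))

# One ray, 30 degrees off-axis at azimuth 45 degrees:
print(sight_ray_to_sphere(30, 45))
print(sphere_to_image_edp(30, 45, f_mm=1.8))
```

Swapping stage two for a different map projection (stereographic, orthographic) yields the other circular projection modes without touching stage one, which is the appeal of the two-stage decomposition.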
〈The target generates a cone light path〉 The sight ray projected from any object point 221 on the PCP 220 essentially passes orthogonally through the surface of the sphere 30 at an incidence point 301 and converges toward the sphere center (that is, VP 241). In this way, every concentric circle of the PCP 220 constructs, in the space outside the camera, a cone of light whose base is its intersection with the PCP 220 and whose common vertex is at the VP; the stereo light path shown in the figure originates from the outermost concentric circular track. Inside the camera, the sight ray is refracted through the VP onto the image plane 23 according to the projection function. As in "Figure 4", the feature points 221 and 301, the VP, and the image point 231 describe one complete light path. 〈Method of obtaining the ICP〉 Only when the optical axis 21 is orthogonally aligned with the target does the image present concentric symmetry; the geometric center of symmetry of the ICP 230 is the distortion center. Therefore, the target 22 and the test camera are adjusted appropriately until the symmetry of the formed image meets the set accuracy. The feature coordinates of the image point mapped from the target center 225 then give the distortion center 235 (principal point), taken as the origin C′(0, 0). Since the straight line passing through the distortion center 235 and the pattern center 225 is the optical axis, the above procedure realizes the function of tracking the orientation of the optical axis 21; this is part of the process of determining the camera's external parameters.

Concentric circles about a common center, with suitable angular divisions and a suitable number of rings, constitute one embodiment of the PCP 220. Of course, other calibration objects that are symmetric about and surround a center are equally feasible in principle, but they are less convenient to process. The following concrete embodiment shows how, with the PCP 220 as reference, the size and position of the small sphere 30 representing the camera, the direction of the optical axis 21 and the distortion center 235 can be located. In the actual experiments of the invention, the PCP 220 was designed as shown in "Figure 6"; it can be printed by a laser printer on A3-size paper as a concrete embodiment of the target 22. Considering that the degree of distortion of a fisheye lens increases sharply outward along the radius, the radius differences between the concentric circles of the PCP 220 are designed to widen gradually from the inside out, to reflect this optical behavior of the fisheye lens. To decide the radii of the concentric circles of the PCP 220, a target image such as that of "Figure 3" can first be used to obtain the object-to-image correspondence at an appropriate measurement reference position, so as to adjust the PCP 220's

physical circular track widths, so that the system can clearly display the image of the middle region and of the peripheral image range at the same time. A PCP designed on this principle is shown in "Figure 6". In addition, the alternating black and white rings make the edges of the concentric circles clearly distinguishable, which benefits the subsequent image-processing work. As in "Figure 7", the fabricated target 22 is mounted on an adjustable platform, and the target 22 and the camera 60 are brought as close together as possible, so that the PCP 220 spans the entire FOV of the fisheye lens 24 and the mapped image covers large angles. This arrangement samples image information over a large viewing angle, which best distinguishes the specific projection mode that the fisheye lens 24 follows, because different projection modes differ most at large angles. 〈Camera and lens〉 The camera body is a commercial unit, and the mounted lens is a Korean fisheye lens; it is a very simple camera whose image coordinates are read in pixels.
These four difference values, or the four sum values, or both together, can be referenced to judge whether the relative orientation of the target 22 and the camera is appropriate, and the orientation is fine-tuned accordingly until the symmetry is best. 〈Some processing techniques for fisheye images〉 The invention develops a method of extracting the edges of the concentric circular tracks according to the particular character of fisheye image signals. Edge extraction is an essential operation in image processing, and the energy distribution of the signal calls for different processing strategies. According to experimental analysis, the radiometric response of a fisheye image varies greatly across the frame; the invention therefore devises an edge-recognition algorithm for extracting the edges of the image tracks. "Figure 8B" shows the video signals and processing results of the two inclined sampling-line portions of the region in "Figure 8A". Four groups of video signals are displayed from top to bottom in the sampling direction, each extending radially from the center point toward the image edge, and in each group two irregular, gradually attenuating (near) square-wave curves can be identified. The solid-line radiometric response attenuates severely and progressively, so that in the peripheral region of the image it is difficult to identify the positions of the feature signal points. The invention develops an unsharp-mask processing procedure: first, a histogram-equalizing process raises the signal strength of the peripheral image, the result being the dashed signal curve in the figure; a non-causal low-pass filter is then applied to the equalized profile to generate a dynamic threshold, and the crossing points of the two curves are the edge feature positions of the image tracks. The detected results are marked at the bottom of each signal group as a square waveform. 〈The second symmetry index for identifying the ICP〉 The second symmetry index proposed by the invention judges the symmetry of the fisheye image by expressing the image points P′(ρ, β) in a rectangular (Cartesian) coordinate system C(ρ, β), that is, with ρ as the vertical axis and β as the horizontal axis; the image of "Figure 8A" converted in this way is shown in "Figure 9", and the straightness of the white lines serves as the second symmetry evaluation index. Experiments show that the sensitivity of this method is quite high: as soon as the orientation of the target varies slightly, the straight lines in the figure immediately become curves of obvious curvature. Whether observed directly by eye or evaluated mathematically by computer, both symmetry indices are very well suited to judging the symmetry of the ICP 230, and this reasoning also applies to other circularly symmetric targets.
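The eight-direction check just described can be sketched as follows. The direction labels follow the distance sums NN, SS, EE, WW, NE, SW, NW and SE, and the sample distances are hypothetical pixel values for a ring sitting slightly off-centre.

```python
def symmetry_indices(d):
    """d maps each of the eight radial directions to the summed edge distance
    (pixels) measured from the assumed distortion centre along that direction."""
    diffs = (d["NN"] - d["SS"], d["EE"] - d["WW"],
             d["NE"] - d["SW"], d["NW"] - d["SE"])
    sums = (d["NN"] + d["SS"], d["EE"] + d["WW"],
            d["NE"] + d["SW"], d["NW"] + d["SE"])
    return diffs, sums

# Hypothetical sample: the ring centre sits 2 px east of the assumed origin,
# so the east/west pair is unbalanced while north/south still cancels.
sample = {"NN": 100, "SS": 100, "EE": 102, "WW": 98,
          "NE": 101, "SW": 99, "NW": 99, "SE": 101}
diffs, sums = symmetry_indices(sample)
print(diffs)  # a non-zero entry flags residual misalignment
print(sums)
```

When the ICP is perfectly centred, all four differences vanish and each opposite-direction sum reaches its maximum, which is what the fine-tuning loop drives toward.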
To simplify the description, the absolute coordinate system W(X, Y, Z) is given by the three base axes of a platform, and the direction of the body of the test camera 60 is set along the positive Z direction of the absolute coordinate system; the orientation of the target coordinate system T(X, Y, Z) can be moved as a whole relative to it. 〈Alignment of the camera and target coordinate systems〉 At the beginning of assembling the system, the camera's optical axis must be brought orthogonal to the target center by adjusting three displacement variables and three rotation variables. First, the camera mount 70 is moved to a suitable position and the direction of the camera 60 is aligned to the target as orthogonally as judgment by the naked eye allows. Then, observing the image shown on the screen and its symmetry indices through a computer program, the absolute positions of the target 22 on the X′ base axis 51 and the Y′ base axis 52 are fine-tuned, together with the universal head 71 at the base, until the symmetry is best. With such a hardware setup, ideally, when the feature coordinates are symmetric the optical axis 21 should coincide with the Z′ base axis 53. It would be more ideal still if the universal head 71 could be controlled by a software program. 〈Method of judging image symmetry〉 The criterion is the alignment of the two coordinate systems E(α, β, h) and T(X, Y, Z); any method that follows the spirit of judging image symmetry
should not be excluded from the protection scope of the present invention. 〈The first symmetry index for identifying the ICP〉 Take the ICP 230 image obtained with the target of "Figure 6" as the working image. With the image as reference plane and the distortion center 235 (in practice, the image point of the target center) as the computational origin, samples are taken along eight radial directions (south, north, east, west, northeast, southwest, northwest and southeast), from the image center outward to the edges of the concentric circular image tracks. As shown in the figure, black-to-white edge samples and white-to-black edge samples are marked along each radial extension. The distances from the detected edges to the circle center are added per direction, giving eight "distance sums": SS, NN, EE, WW, NE, SW, NW and SE. If the ICP 230 reaches the ideal symmetry, then, with the distortion center as origin, the feature distances along opposite radial directions should cancel: the four differences diff_1 = NN - SS, diff_2 = EE - WW, diff_3 = NE - SW and diff_4 = NW - SE should all approach zero; equivalently, the four sums of the distances along opposite directions, sum_1 = NN + SS, sum_2 = EE + WW, sum_3 = NE + SW and sum_4 = NW + SE, should each reach a maximum. These four difference values, or the four sums, or both at the same time, can be referenced to judge whether the relative orientation of the target 22 and the camera is appropriate, and the orientation is fine-tuned accordingly until the symmetry is best. 〈Some processing techniques for fisheye images〉 The invention develops a method of extracting the edges of the concentric tracks according to the particular character of fisheye image signals. Edge extraction is an essential operation in image processing, and the energy distribution of the signal calls for different processing strategies.
According to the experimental analysis, the radiometric response of a fisheye image varies greatly across the frame; the invention therefore devises an edge-recognition algorithm for extracting the edges of the image tracks. "Figure 8B" shows the video signals and the processing results of the two inclined-line portions of the sampling-line region of "Figure 8A". Four groups of video signals are displayed from top to bottom in the sampling direction, each extending radially from the center point toward the image edge. In each signal group two irregular, gradually attenuating (near) square-wave curves can be identified. The solid-line radiometric response attenuates severely and progressively, so that in the peripheral region of the image it is difficult to identify the positions of the feature signal points. The invention develops an unsharp-mask processing procedure: first, a histogram-equalizing process raises the signal strength of the peripheral image, the result being the signal curve drawn with a dashed line in the figure.
A non-causal low-pass filter is then applied to the equalized profile curve to generate a dynamic threshold; the crossing points of the two curves, as shown in the figure, are the edge feature positions of the image tracks. The detected results are marked at the bottom of each signal group as a square waveform. 〈The second symmetry index for identifying the ICP〉 The second symmetry index proposed by the invention is used to judge the symmetry of the fisheye image. The method expresses the image points P′(ρ, β) in a rectangular (Cartesian) coordinate system C(ρ, β), that is, with ρ as the vertical axis and β as the horizontal axis; the image of "Figure 8A" converted in this way is shown in "Figure 9", and the straightness of the white lines serves as the second symmetry evaluation index. Experiments show that the sensitivity of this method is quite high: as soon as the orientation of the target varies slightly, the straight lines in the figure immediately become curves of obvious curvature. Whether observed directly by eye or evaluated mathematically by computer, both symmetry indices are very well suited to judging the symmetry of the ICP 230, and this reasoning also applies to other circularly symmetric targets.
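A simplified one-dimensional sketch of the edge-extraction chain above: a synthetic, radially fading square wave stands in for one sampled line; histogram equalization lifts the attenuated periphery, and a zero-phase (non-causal) moving-average low-pass of the equalized profile supplies the dynamic threshold whose crossings mark candidate ring edges. The profile shape and window size are assumptions for illustration, not the patent's exact filter.

```python
import numpy as np

def equalize(profile):
    """Histogram-equalize a 1-D intensity profile (0..255) to lift the
    attenuated signal in the image periphery."""
    hist = np.bincount(profile, minlength=256)
    cdf = np.cumsum(hist).astype(float)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
    return (cdf[profile] * 255).astype(np.uint8)

def dynamic_threshold(profile, win=15):
    """Zero-phase moving-average low-pass of the profile; where the profile
    crosses this local mean is taken as a ring-edge candidate."""
    kernel = np.ones(win) / win
    smooth = np.convolve(profile, kernel, mode="same")
    signs = np.sign(profile.astype(float) - smooth)
    # indices where the profile crosses its local mean: candidate edges
    return np.nonzero(np.diff(signs) != 0)[0]

# Synthetic fading square wave standing in for one radial sample line.
x = np.arange(400)
raw = ((np.sin(x / 12.0) > 0).astype(float)
       * np.exp(-x / 300.0) * 255).astype(np.uint8)
edges = dynamic_threshold(equalize(raw))
print(len(edges), "edge candidates")
```

Equalizing before thresholding is what keeps the faint outer rings detectable; applying the local-mean threshold to the raw profile alone would miss crossings where the response has decayed.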

〈Determining the camera's orientation (camera posing)〉 After the tests with the first or the second symmetry index, once the ICP is confirmed, the optical axis 21 has been orthogonally aligned with the center of the PCP 220 pattern and with the Z axis. (Supplementary note: because the concentric-circle pattern is placed at a known absolute orientation, this is a major contribution of the invention: the absolute orientation of the optical axis 21 is obtained from the absolute orientation of the PCP 220.) Extending one step further, the absolute spatial coordinate position of VP 241 can be deduced, settling the camera-posing problem.
〈Inferring the best VP position under a specific projection mode〉 The invention takes the radii of the concentric circles of the PCP 220 and of the ICP 230, (ri, ρi), as numerical constraints and uses a trial-and-error method to test, along the optical axis, assumed positions of the VP and values of the focal constant f. Referring again to "Figure 5A", the VP of the fisheye lens 24 is some point on the optical axis 21; its determination proceeds in detail as follows. With the distortion center as origin of the image plane, the radii ri of the concentric circles of the PCP 220 (measured quantities) and the corresponding image heights ρi form conjugate coordinate pairs (ri, ρi), ordered by ring. Assume the distance from the VP 241 on the optical axis to the coordinate point of the target center 225 is D. The off-axis angle constructed by each concentric circle of the PCP 220 is then αi = tan⁻¹(ri/D), and the image plane yields each corresponding image height ρi (the principal distance). The projection formula ρ = f·α serves as the test function: dividing each ρi by αi gives the corresponding fi value. If the test camera conforms perfectly to the EDP mode, the fi calculated from every concentric circle should all equal a constant. One can therefore vary the value of D, or change the assumed projection function, until a satisfactory fit is obtained. The above takes the EDP form as example; the same method can be implemented for different projection functions. Let the origin of the coordinate system be placed at the VP, E(0, 0, 0), with the optical axis 21 (E(0, β, h); β, h arbitrary) coincident with the Z axis T(0, 0, z) of the defined physical coordinates. The distance from the VP to the PCP 220 is denoted D0, and each concentric circle of the PCP 220 corresponds to an image height ρi.
Since both ρi and αi depend on the magnitude of D0, the EDP hypothesis takes the mathematical form ρi(D0) = f·αi(D0), where i = 1…N and N is the total number of image tracks obtainable on the ICP 230. Linking the outermost circle with any one of the others and rearranging gives the following equation:

ρi(D0)/ρN(D0) - αi(D0)/αN(D0) = 0 ........ (1)

In fact, D0 is an unknown value, but the VP 241 is certain to fall on the optical axis; taking a dynamic point (0, 0, z) on the Z axis, an error expression is obtained as follows:

ei(z) = ρi(D0)/ρN(D0) - αi(z)/αN(z) ........ (2)

式中ai相依於測試的z,即是(ai(z&gt;tan-i(ri/z) &gt; 而pi的值在做實驗時已經確定了尺寸(即其恆相依於加了,’ 並不隨著假設z值而變化)。因此、只要量測得到至少二組' 共耗座標對(ri,pi) (conjugated coordinates,代表 互對應之物體點221與影像點231的資訊),即可得到ei、相 ΐϊ'ΓΓΪΪ法搜尋光學軸21上的每—點,根據式⑵1,= (z)為最小值之處,此時vp 241的位置即可 仁被畺測相機的投射函數並不清楚,是若 口 3 S3的^厄座標對(ri,pl)所計算得到的 式的判斷貢獻^角=圍:y顧慮母一影像執跡對相機投射槿 對待各執跡的貢獻,其為:现 夕、P1 ⑽)1卜1(DG))/pN(DG)——__________ /、中P〇(DG) = ◦,可視為是失直中 〜(3) 在光學軸21上戽妁^f7r心5的+控。因此, 用的誤#函數為找^中心241的配適過程中,實作例子應 ε(ζ) = δ_(β/(2)χ,。)) — 躺—點使得ε⑴最小、或是趨^G—時'(4) 此^魚眼相機的VP 24i。式⑷的數學形式^立^ 27Where ai depends on the test z, that is (ai (z &tan; tan-i (ri / z) &gt;) and the value of pi has been determined when the experiment is performed (that is, its constant phase depends on the addition, 'does not (It varies with the assumption of z value). Therefore, as long as at least two sets of 'commonly consumed coordinate pairs (ri, pi) (conjugated coordinates, information representing the corresponding object points 221 and image points 231) are obtained, we can obtain ei, phase 'ΓΓ' method searches every point on the optical axis 21, according to formula (1), where = (z) is the minimum value, at this time the position of vp 241 can be measured by the camera's projection function. It is not clear. , Is the judgment contribution of the ^ E coordinate of Ruokou 3 S3 to the formula calculated by (ri, pl) ^ angle = circle: y Concerns the contribution of the mother-one image track to the camera's projection of each track, which is: Now, P1 ⑽) 1 bu 1 (DG)) / pN (DG) ——__________ /, P〇 (DG) = ◦, can be regarded as misalignment ~ (3) On the optical axis 21 戽 妁 ^ f7r heart 5 + control. Therefore, in the process of adapting the erroneous # function to find the center 241, the implementation example should be ε (ζ) = δ_ (β / (2) χ ,.)) — lying — the point to make ε⑴ minimum, or ^ G— 时 '(4) This ^ fisheye camera's VP 24i. Mathematical form of formula 立 立 ^ 27

、事實上,由得到的fi(D)數據演算的統計標準偏差,更可 以利用來估异假没之投射模式的準確性,也就是說,可以用下 列式子做為與設定之投射模式配適程度的指標,稱之為「σ-演箅法」:Ν 200528945 、饭。又上的推導結果,若是假設前提改為其他可能的投射模 式,、例如:SGP (p = 2fxtari(a/2))或 OGP (p = fxsin(a)),、 ^式(1)主(4)必須根據SGP或〇(;p的投射函數再推導一次。無 淪如何,以上述觀念所做的推斷稱為「ε-演算法j。 、至於焦距常數f,根據量測到的pi(D)及其相對的ai(D) 為基礎,^彳用下式計算之: _W(w。)--------------------------------- 、其中,fi(D〇) = pi(DO)/ ai(DO)。同理,若是假設前提改 ,SGP,則 fi(D〇)等於 1/2*pi(D〇)/tan(ai(D〇)/2),·或是設 定投^函數為 〇Gp,則 fi(D〇)等於 pi(D〇)/ sin(ai(D〇))。若 鏡頭全符合設定的投射模式、量測無誤差,則卯值將很準 石,’那麼f(D〇)應等於任一 /i⑽),這也就是鏡頭的焦距常數 vu-υ____________________⑹ 為進一步驗證實驗結果的可靠性(包括:光學轴21方位 與=適之相機投射模式),請再次參照「第7圖」,本發明更以 初:t準直光學軸21後_ 22的絕触標位置為基準,將圖革巴 22沿著正z方向移動兩次,各增加5歷的位移;在這兩次位 移中,相機60的方位與圖靶22在χ,基軸51與γ,基軸52座 標位置皆保持不變。包含第—次實驗棚,這三次實驗分別以In fact, the statistical standard deviation calculated from the obtained fi (D) data can be used to estimate the accuracy of the projection mode of alienation. That is, the following formula can be used to match the set projection mode. An appropriate level of index is called "σ-encoding method": N 200528945, rice. The result of the above derivation, if it is assumed that the premise is changed to other possible projection modes, such as: SGP (p = 2fxtari (a / 2)) or OGP (p = fxsin (a)), ^ (1) the main ( 4) It must be deduced again based on the projection function of SGP or 〇 (; p. How to make the inference based on the above concept is called "ε-algorithm j." As for the focal length constant f, according to the measured pi ( D) and its relative ai (D) as the basis, ^ 彳 is calculated using the following formula: _W (w.) ------------------------ ---------, where fi (D〇) = pi (DO) / ai (DO). Similarly, if the premise is changed, SGP, then fi (D〇) equals 1/2 * pi (D〇) / tan (ai (D〇) / 2), or set the cast function to 0 Gp, then fi (D〇) is equal to pi (D〇) / sin (ai (D〇)). 
If the lens fully conformed to the set projection mode and the measurement were free of error, the fi values would be very accurate, and f(D0) would equal any single fi(D0); this is the focal constant of the lens:

f(D0) = fi(D0), i = 1…N ........ (6)

To further verify the reliability of the experimental results (including the orientation of the optical axis 21 and the fitted camera projection mode), refer again to "Figure 7". Taking the absolute coordinate position of the target 22 after the initial alignment of the optical axis 21 as the datum, the invention moved the target 22 twice along the positive Z direction, each time adding a displacement of 5 mm; during these two displacements, the orientation of the camera 60 and the coordinate positions of the target 22 on the X′ base axis 51 and the Y′ base axis 52 were all kept unchanged. Including the first experiment, the three experiments are

denoted Test1, Test2 and Test3, respectively. 〈Parameters and results of the experimental tests〉

Table 1 lists the values obtained by fitting the three sets of experimental data to the EDP, OGP and SGP projection models, derived separately with the ε-algorithm and the σ-algorithm. Its rows give the deduced target distance D under each model (D(EDP), D(OGP) and D(SGP)), the focal constant f, and the resulting ε value / σ value; its columns correspond to Test1, Test2 and Test3. (Note: except for ε and σ, which have no units, the units of all values are mm.)
The comparison table} refers to the amount of 量 displacement at the far left. The actual result can be approximated by the degree of Di presented as an indicator, because regardless of the £ __domain ^ algorithm, Hua 'can very faithfully reflect the increment of each experiment. —The amount of displacement; but the D values calculated by the two algorithms differ by about G. 5 deleted. Applying the inferred focal length constant (182mm / 185mm) is also closer to the specification i. The difference in Γΐΐh 78m ′ may be due to the mistake of manually assembling the lens ... 2. =, And the experimental results are quite different from the known absolute displacement and focus. Finally-the relatively small ε values and readings show that the present invention discloses that these two algorithms have considerable accuracy and feasibility. _? Yan Zhao II, Figure 10 ", taking Testl as an example, the horizontal axis shows the ε curve and σ curve obtained with the hypothetical target 22 distance as the main variable. It is found that in six kinds of f ϊίί7 (three projection modes x two algorithms), both the ε curve and the σ curve are-obvious minimum values. The existence of this single minimum value represents the location of νρ24]. This also proves that Feasibility of the method of the invention. However, the same lens can obtain different νρ 241 positions and different focal lengths at different reference projection functions. This table uses the mono-tilt to infer the exact original function of 剌 剌. In addition-mentioning, it is difficult to use only-a specific circular projection function to fully describe the projection behavior of a lens. The "ε-algorithm" and "σ-algorithm" disclosed in the above-mentioned four embodiments are based on the imaging logic of a known projection function as the target lens to interpret the corresponding camera. 29 200528945 The methodology is feasible. 
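The model-fitting idea above can be illustrated with a minimal sketch (an illustration only, not the patent's actual ε- or σ-algorithm, whose formulas are not reproduced here): each candidate circular projection model predicts an image radius from the off-axis angle, and the focal constant is chosen to minimize the residual against the measured radii. Data generated from an equidistant lens is fitted far better by the equidistant model (EDP) than by the stereographic (SGP) or orthographic (OGP) models, mirroring the Table 1 outcome.

```python
import math

# Candidate circular projection models: image radius r(theta) for an
# off-axis angle theta (radians) and a focal constant f (mm).
PROJECTIONS = {
    "EDP": lambda th, f: f * th,                        # equidistant
    "SGP": lambda th, f: 2.0 * f * math.tan(th / 2.0),  # stereographic
    "OGP": lambda th, f: f * math.sin(th),              # orthographic
}

def fit_focal_constant(thetas, radii, model, f_lo=0.5, f_hi=5.0, steps=4500):
    """Grid-search the focal constant minimizing the RMS residual between
    the model's predicted image radii and the measured ones."""
    proj = PROJECTIONS[model]
    best_f, best_rms = f_lo, float("inf")
    for i in range(steps + 1):
        f = f_lo + (f_hi - f_lo) * i / steps
        rms = math.sqrt(
            sum((proj(t, f) - r) ** 2 for t, r in zip(thetas, radii)) / len(radii)
        )
        if rms < best_rms:
            best_f, best_rms = f, rms
    return best_f, best_rms

# Synthetic measurements from an ideal equidistant lens with f = 1.82 mm:
f_true = 1.82
thetas = [math.radians(a) for a in range(5, 90, 5)]
radii = [f_true * t for t in thetas]

for model in ("EDP", "SGP", "OGP"):
    f_hat, rms = fit_focal_constant(thetas, radii, model)
    print(model, round(f_hat, 3), round(rms, 4))
```

As in the experiments, each model still produces *some* best-fit focal constant, which is why a single fit cannot by itself reveal the lens's true projection function; only the size of the residual separates the models.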
This is how the invention obtains the camera's internal and external optical parameters as the reference fisheye-image computer-vision model.

〈Visual model II: a general camera-parameter deduction method〉

The invention further develops a general camera-parameter deduction methodology. It requires no hypothesis of a closed-form projection function: the camera's optical projection is deduced directly from the mapping between the absolute spatial positions of the calibration points and their image positions, yielding the camera parameters — a more general camera-parameter algorithm.

The method is developed from the known lens-projection phenomenon, with two settings: all object points lying on the same specific view line in the camera's field of view invariably map to the same specific image position on the image plane; and all view lines converge at a unique projection center in space (the viewpoint, abbreviated VP), from which the projection function maps each view line onto the image plane. This projection mechanism is shown equivalently in Figure 11. The characteristic points W333[r] 333, W323[q] 323 and W313[p] 313 on the same view line 80 produce a single image message on the image plane 23: when the target sits at the positions p, q and r, the images of the three calibration points 313, 323 and 333 coincide. Conversely, if at least two distinct object points map to the same image position, their absolute coordinates in physical space determine the projected view line; the intersection of view line 80 with the optical axis 21 is the FCP 242. In this way all view lines of physical space can be found, giving the camera's projection-imaging mechanism. The implementation of the method that uses this mapping to represent the fisheye camera's projection logic — without assuming the image's projection function — has been described above.
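The single-viewpoint mechanism just described — every view line passes through one projection center, and object points on the same view line share one image position — can be sketched as follows, assuming an ideal equidistant lens (the function name and the focal constant 1.82 mm are illustrative, not taken from the patent's implementation):

```python
import math

def fisheye_project(point, vp=(0.0, 0.0, 0.0), f=1.82):
    """Project a 3D point (camera frame, optical axis = +Z through the
    projection center vp) onto the image plane of an ideal equidistant
    fisheye: image radius r = f * theta, azimuth preserved."""
    x, y, z = (point[i] - vp[i] for i in range(3))
    theta = math.atan2(math.hypot(x, y), z)  # off-axis angle of the view line
    phi = math.atan2(y, x)                   # azimuth, unchanged by projection
    r = f * theta
    return (r * math.cos(phi), r * math.sin(phi))

# Two object points on the same view line land on the same image point:
near = fisheye_project((1.0, 0.5, 2.0))
far = fisheye_project((2.0, 1.0, 4.0))  # same direction from the VP, twice as far
print(near, far)
```

This is the property the general deduction method exploits in reverse: observing which physically distinct calibration points share an image position identifies a view line, and hence the projection mechanism, without assuming the radial function r(θ).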

j明在前部分公開相機參數求法時,以參考pcp的絕對 ΐΐί軸及投影中心於實體空間的位置,·也就是以— Η 、、土 :延伸疋位相機的方位。由於PCP是方向及位置 ίϋΐ動由系統來決定的。故如果在兩已知方位來求取兩具 二眼才=的視覺模型;則可以順利間接得到兩相機的相對方位 差,此為三維度量的重要參考基準。 200528945 c、、眼衫轉翻針孔影像的雜&gt; 為盆=ίί=ΐΐ國發明專利2_針對魚眼影像轉換 釣*的,、施例子之—;而專利2_2更能 另外在於义二^體,入(偏轴角環轴角)為光轴的針孔影像; 之。$ 々博士淪文中有更多的例子。詳細内容請參考 〈決定魚眼相機方位的技術〉 精確制PCPS絕對方向及位置必須有精密的六 本實做例子是騎證發财法學的可行性故以較 間早的方絲f施,但可__機布置並秘制於此。 〈具有相機整流布置的三維電腦視影度量系統〉 ,相«設成為影像平面在柄—平面及像素座標系統 的軸線之-(如水平軸線)落於同_線上的程序叫做相機整流 (re^tificat1Qn);是三維視覺度量佈置的—制。經此調整 可以簡化二維方位的代數運算;缺點是於—般針孔相機(由於 視野角度較小)將降低能夠操作的視野角度。「第丄2圖」顯 示經過整流後的針孔三維視覺度量系統;物件MN落在相機的 31 200528945 孔視^度量;另—物件M,N,因為不在針 型,單獨相機地/的視月針電腦視覺模 圖」所顯示Γ—、錢里U的貫施例示意圖請參考「第1 3 2校正圖靶延伸實體線段整流像素軸線〉 垂直ί Ϊί ίί、ϊ相機的水平軸線可以經由在PCP 220延伸一 的方法是於直線J像通=!」中:。,::=線的方向 影料合在在像素平面的^軸=後场轉相機,使直線 而因可以合併在執行相機參數測量階段一齊實施。铁 機間的相對方位,而有影像整流需求時有此‘。 k種布置舰未_其他布置方柄實施。谓此而要 〈架設魚眼相機三維視影度量系統〉 目p f之三維視影度量系統實施例,包含兩部相機(厂卢 機參數階段所使用的相機)以及一嶋, 。==_機與該左眼相機岐於相機架上維持一不 3 jft移,且可以自由調整方向。先針對右眼相機以 作日士 ft、光學袖21跟「第14圖」圖輕22維持正交;摔 作日守,相機和深度圖靶22可以依據需要調整。 釭 影度圖」,本發明提出之架設魚眼相機三維視 ^深度圖革巴22為㈣圖革巴’演繹右眼相機之圓對稱影像 hp ^i步驟931) ’ *考共㈣實難影像伟資料,渾繹 右眼相機之·視覺觀(步驟932);旋轉並軸深度圖= 32 200528945 學轴(步驟934);紀錄共輛圖= 右眼相機之電腦視覺模型,並⑽右眼相 置,以讓其平台座if方向及平台z袖的位 行,並當成參考面====== 大概位置(步驟937) · 厂置木5又左眼相機的 =:(=’:的^ 延伸線(步驟939);再次紀錄共輥實體伸 覺f型(步驟以二= 私,^寅繹得到的值分別定義視參點距離(步襲^千 〈魚眼視影度量學的三維演算法〉 ^統架設完成的左右軸機的實黯果請 第二表」。表列參考等距離投射、立體圖 = 投射為根據的模型;來執行Sigma及邱奶加測 ^ $說明各攔位的意義。第二到四列表示以不同的ΐ種;:^; 试相機的參數。z(ep)表示以Epsil〇 ^來二 ,座標離_的距離,單位是麵,Err(ep)為最 差:此值沒有單位’❿凡㈣則為演算得到的焦丄勺 =’刪。後面三攔則是抑脈測試的結果,其中 早位疋mm,其餘兩個跟Epsilon測試的單位 =幻 0.92表示在電腦螢幕顯現的是一個長‘ 機時的圖械標原點為基準。 根據相機製造商提供的規格鏡頭的焦距是丨· δ_。參考測 33 200528945 試結果,相機明顯的是比較像是屬於等距離投射。故以此組參 數為計算三維視影的基準。對於立體圖投射、正交圖投射的演 算結果就不採用了。 經由這些已經得到的左右眼的Γ魚眼影像電腦視覺模 型」、相機於外界視覺參考點(投影中心)的位置與相機的方 向’及能夠將影像轉換成邏輯上符合於針孔成像的線框性透視 影像。因而既有的電腦視覺技術可以直接引用這些經轉換成的 線框性透視(又名針孔)影像來執行三維視影度量。 Z(ep) Err(ep) FL(ep) Z(sig) Err(sig) FL(sig) 等距離投射 40 0.005 1.784 40 0.08 1.784 立體圖投射 60.5 0.008 2.577 55.2 0.12 2.425 正交圖投射 31.4 0.008 [430 32.6 0.13 1.464 0.92When J Ming disclosed the camera parameter calculation in the previous part, the absolute ΐΐί axis of the 
pcp and the position of the projection center in the physical space were referenced, that is, the position of the camera was extended with — 、, 土:. Because PCP is the direction and location, it is determined by the system. Therefore, if two vision models are obtained from two known positions, the relative position difference between the two cameras can be obtained indirectly, which is an important reference for three-dimensional measurement. 200528945 c. Miscellaneous pinhole images of eye-shirts &gt; Pots = ίί = ΐΐ 国 Invention Patent 2_ For fisheye image conversion fishing *, and the examples are; and Patent 2_2 can be more in Yi Er ^ Body, input (off-axis angle, ring axis angle) is the pinhole image of the optical axis; $ Dr. 々 has more examples in the text. For details, please refer to "Techniques for determining the orientation of the fisheye camera". The precise direction and position of the precise PCPS must be precise. Can be arranged and controlled here. "Three-dimensional computer vision measurement system with camera rectification arrangement", phase «set to be the image plane on the handle-plane and the axis of the pixel coordinate system-(such as the horizontal axis) on the same line is called camera rectification (re ^ tificat1Qn); is a three-dimensional visual measurement layout-system. This adjustment can simplify the algebraic calculation of two-dimensional orientation; the disadvantage is that a general pinhole camera (because the angle of view is small) will reduce the angle of view that can be operated. 
"Figure 2" shows the pinhole three-dimensional visual measurement system after rectification; the object MN falls on the camera's 31 200528945 hole measurement; the other is the object M, N, because it is not in the needle type, it is a separate camera / view moon Needle computer vision model "Γ—, Qianli U's example of implementation, please refer to the" 1 2 3 correction chart target extension solid line segment rectifier pixel axis "vertical Ϊ ί, ϊ camera's horizontal axis can be passed through the PCP The method of 220 extension one is in the straight J image pass =! ":. :: = line direction The shadows are combined at the ^ axis of the pixel plane = backfield to turn the camera, so that straight lines can be combined and implemented in the camera parameter measurement stage. The relative orientation between the irons, and this is needed when image rectification is required. The k types of layout ships are not implemented with other layout square handles. In this case, "Erecting a fisheye camera three-dimensional video measurement system" is an embodiment of the three-dimensional video measurement system of the project pf, which includes two cameras (cameras used in the parameter stage of the factory) and a stack of cameras. == _ The camera and the left-eye camera maintain a 3 jft shift on the camera mount, and the direction can be adjusted freely. First aim at the right-eye camera as a Japanese ft, the optical sleeve 21 and the light weight 22 of the "Figure 14" remain orthogonal; if you fall to the day guard, the camera and depth map target 22 can be adjusted as needed. 
"Shadow map", a three-dimensional view of a fish-eye camera set up by the present invention ^ Depth map Geba 22 is a rendering of the right-eye camera's circularly symmetric image hp ^ i step 931) ' Great data, interpreting the visual view of the right-eye camera (step 932); Rotating parallel axis depth map = 32 200528945 academic axis (step 934); Recording a total car map = computer vision model of the right-eye camera, and right-eye camera Set it so that its platform seat if direction and platform z sleeve position, and use it as a reference plane ====== Approximate position (step 937) · Factory-set wood 5 and left-eye camera =: (= ': ^ Extension line (step 939); again record the co-roller entity extensibility f-type (step 2 = private, ^ yinyi's values respectively define the distance between the viewing parameters (walking ^ 1000 <3D of fisheye vision measurement Algorithm> ^ Please refer to the second table for the actual results of the left and right axis machines that have been erected. Please refer to the equidistant projection, the stereogram = the model based on the projection; to perform the Sigma and Qiu milk test ^ $ Description of each block Meaning. The second to fourth columns indicate different types of camera ;: ^; parameters of the test camera. Z (ep) indicates that Epsil is used as the second coordinate. The distance from _, the unit is the surface, Err (ep) is the worst: this value has no unit '❿fan㈣ is the calculated Jiao spoon == delete. The next three blocks are the results of the pulse suppression test, which early The unit is 疋 mm, and the other two units tested with Epsilon = Magic 0.92 means that the origin of the icon on the computer screen is a long time. The focal point of the lens is based on the specifications provided by the camera manufacturer. .Reference test 33 200528945 test results, the camera is apparently more like equidistant projection. Therefore, this set of parameters is used as the basis for calculating 3D video. 
The computation results for stereographic projection and orthographic projection are therefore not adopted.

With the left- and right-eye fisheye-image computer-vision models now obtained — the cameras' positions at the external visual reference points (projection centers), the cameras' directions, and the ability to convert an image into a wireframe perspective image logically consistent with pinhole imaging — existing computer-vision techniques can directly take these converted wireframe perspective (i.e. pinhole) images to perform three-dimensional visual measurement.

Table 2:

                            Z(ep)   Err(ep)   FL(ep)   Z(sig)   Err(sig)   FL(sig)
  Equidistant projection     40      0.005     1.784    40       0.08       1.784
  Stereographic projection   60.5    0.008     2.577    55.2     0.12       2.425
  Orthographic projection    31.4    0.008     1.430    32.6     0.13       1.464

  (image aspect ratio: 0.92)
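The conversion into a pinhole-consistent view can be sketched as an inverse mapping: for each pixel of the desired perspective image, compute the off-axis angle of its view line, then look up where that view line landed on the fisheye image. A minimal sketch assuming an equidistant lens (the function name and numeric constants are illustrative; interpolation and resampling details are omitted):

```python
import math

def perspective_to_fisheye(u, v, f_pin, f_fish, cx, cy):
    """For an output perspective pixel (u, v), measured from the principal
    point, return the source coordinate on the fisheye image, measured from
    its distortion center (cx, cy), assuming an equidistant lens r = f*theta."""
    r_pin = math.hypot(u, v)
    if r_pin == 0.0:
        return (cx, cy)               # the optical axis maps to the distortion center
    theta = math.atan2(r_pin, f_pin)  # off-axis angle of the shared view line
    r_fish = f_fish * theta           # equidistant mapping on the fisheye side
    s = r_fish / r_pin                # azimuth is preserved, so scale radially
    return (cx + s * u, cy + s * v)

# A pixel 45 degrees off-axis (u = f_pin) lands at fisheye radius f_fish * pi/4:
src = perspective_to_fisheye(500.0, 0.0, f_pin=500.0, f_fish=300.0, cx=640.0, cy=480.0)
print(src)
```

Sampling the fisheye image at `src` for every output pixel yields the wireframe perspective image on which standard pinhole-based stereo techniques can then operate.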

[Table: visual-model parameters (distortion center and related quantities) — only residue survives in the source.]

〈Physical measurement〉

Table 3 lists, for each target object in the first measurement group, its actual size against its measured size (units: mm):

  Target object                         Actual size
  Glass door, height                    2730
  Glass window, width (upper left)      1000
  Glass window, width (lower right)     1000
  Glass window, height                  530
  Bookcase, height                      —
  Bookcase, length                      980
  Lamp, length                          —

  [Measured-size column illegible in the source print: 1162^02, 1011^98, ^95?23, …]

Various combinations may all serve as embodiments without departing from the spirit of this invention.
In summary, the method disclosed by this invention for obtaining a camera's optical parameters has the following advantages:
1. The function of tracing the optical axis 21 can be concretely realized, yielding the absolute coordinate positions among the parameters.
2. The invention can exactly derive the computation logic needed to convert fisheye images, such as the focal constant, so that images presented from this conversion are preserved faithfully.
3. The method finds the optical center of the single projection center (VP) in the model, so that metrology from fisheye images becomes feasible.

The invention has been disclosed above by a preferred embodiment, which is not meant to limit it; those skilled in the art may make modifications without departing from its spirit and scope, and the scope of protection shall be as defined by the appended claims.

【Brief description of the drawings】
Figure 1 shows the image-analysis diagram of the spatial-projection correction method and its correspondences;
Figure 2 shows projection-function curves of various typical known fisheye lenses;
Figure 3 shows a concentric-circle target designed according to the spirit of the invention, as an embodiment illustration;
Figure 4 is a schematic of the stereo projection light paths formed by the fisheye cameras;
Figure 5A illustrates using the centrally symmetric pattern (PCP) to simulate multiple collimated incident light paths, explaining the fisheye lens's projection behavior by means of a small ball (taking equidistant projection as the example);
Figure 5B shows the stereo light paths between the small ball and the image-plane portion of Figure 5A;
Figure 6 shows the design of the centrally symmetric pattern applied in the actual experiments of the invention;
Figure 7 is a schematic of the device that adjusts the relative orientation between the fisheye camera and the target in a concrete realization of the invention;
Figure 8A shows the image of Figure 6 mapped onto the image plane in an actual experiment;
Figure 8B plots the signal-intensity variation of the Figure 8A image along four directions: northeast, southwest, northwest and southeast;
Figure 9 shows the Figure 8A image re-expanded in polar coordinates with the distortion center as the origin;
Figure 10 plots, for the actual tests, the approach curves of the projection center under the different projection functions;
Figure 11 is a schematic of an embodiment of the invention's general camera-parameter algorithm, showing the light paths by which distinct object points image onto the same image point when the target moves to different absolute positions;
Figure 12 shows a rectified pinhole-image three-dimensional visual measurement system;
Figure 13 shows the fisheye three-dimensional visual measurement system;
Figure 14 is a schematic of the centrally symmetric pattern (PCP) of Figure 6 further extended with auxiliary line segments; and
Figure 15 is a flowchart of the steps of erecting the fisheye-image visual measurement system.

【Description of the main element symbols】
1 imaging region; long axis; short axis
13 prime meridian
13′, 13″ E… of the prime meridian
21 optical axis
22 target
220 centrally symmetric pattern
221 object point
225 pattern center
23 image plane
230 centrally symmetric image
231 image point
235 distortion center
24 lens
241 projection center (VP)
242 front base point (FCP)
243 back base point (BCP)
30 small ball
301 incident point
302 normalized image point
31 equatorial plane
38 center correction point
313, 323, 333 correction points
40 large ball
50 adjustment platform
51 X′ base axis
52 Y′ base axis
53 Z′ base axis
60 camera
70 camera stand
71 gimbal head
80 view line
— image point

Claims (1)

Scope of the patent application:

1. A fisheye-camera three-dimensional visual measurement system, comprising: a right-eye camera fitted with a nonlinear projection lens, the internal and external optical parameters of the camera being known; a left-eye camera fitted with a nonlinear projection lens, the internal and external optical parameters of the camera being known; and a camera rack for fixing the relative orientation of the right-eye camera and the left-eye camera; wherein the right-eye camera and the left-eye camera calibrate their internal and external optical parameters with a target bearing a centrally symmetric pattern — the right-eye camera's internal and external optical parameters are calibrated first, the target is then moved and, taking the right-eye camera's platform coordinate position as the system reference point, the left-eye camera's internal and external optical parameters and the relative orientation and related parameters of the two cameras are calibrated — whereupon three-dimensional measurement can be performed.

2. The fisheye-camera three-dimensional visual measurement system of claim 1, wherein the nonlinear projection lens is a fisheye lens.

3. The fisheye-camera three-dimensional visual measurement system of claim 1, wherein the target is a concentric-circle target with auxiliary horizontal and vertical line segments.

4. The fisheye-camera three-dimensional visual measurement system of claim 1, wherein a gimbal head is provided so that the direction of the camera can be changed.

5. A method of erecting a fisheye-camera three-dimensional visual measurement system, comprising: using a target bearing a centrally symmetric pattern to derive a right-eye camera's circularly symmetric image; deriving the right-eye camera's computer-vision model from the conjugate object and image data; rotating and moving the target so that its center is orthogonal to the right-eye camera; recording the conjugate object and image data, deriving the right-eye camera's computer-vision model, and taking the right-eye camera's platform coordinates as the system reference point; keeping the direction of the target and the position of the platform Z-axis, so that motion along the platform coordinates keeps the image plane parallel to the plane of the target, which serves as a reference plane; erecting a left-eye camera at an approximate position in a suitable place; moving horizontally and vertically, obtaining its circularly symmetric image, recording the conjugate object and image data, and deriving the left-eye camera's computer-vision model; recording the conjugate object and image data once more and deriving the left-eye camera's computer-vision model; and, from the recorded platform-coordinate positions of the target in the conjugate object and image data, computing the camera's displacement in the target plane, the derived values respectively defining the viewpoint distances.

6. The method of erecting a fisheye-camera three-dimensional visual measurement system of claim 5, wherein after the step of deriving the right-eye camera's computer-vision model, the method further comprises: rotating and moving the target so that the images of the auxiliary horizontal and vertical line segments exactly coincide with the horizontal and vertical extension lines of the ICR.

7. The method of obtaining a camera's optical parameters of claim 5, wherein after the step of deriving the right-eye camera's computer-vision model, the method further comprises: adjusting the direction of the left-eye camera so that the images of the target's auxiliary horizontal and vertical line segments exactly coincide with the horizontal and vertical extension lines of the ICL.

8. A method of three-dimensional visual measurement, in which two or more nonlinear cameras of different orientations are arranged in a specific pattern to capture images, with parallax, of the object under measurement, for deriving its three-dimensional spatial position.

9. The method of three-dimensional visual measurement of claim 8, wherein the specific pattern of arrangement is a rectified system in which the optical axes of the two or more cameras of different orientations are mutually parallel.

10. The method of three-dimensional visual measurement of claim 8, wherein the nonlinear camera is a fisheye camera fitted with an equidistant-projection lens.

11. The method of three-dimensional visual measurement of claim 8, wherein the nonlinear camera is a fisheye camera fitted with a stereographic-projection lens.

12. The method of three-dimensional visual measurement of claim 8, wherein the nonlinear camera is a fisheye camera fitted with an orthographic-projection lens.
TW94101592A 2004-01-20 2005-01-19 3D visual measurement system using fish-eye cameras as visual detectors and method for constructing same TW200528945A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW94101592A TW200528945A (en) 2004-01-20 2005-01-19 3D visual measurement system using fish-eye cameras as visual detectors and method for constructing same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW93101557 2004-01-20
TW94101592A TW200528945A (en) 2004-01-20 2005-01-19 3D visual measurement system using fish-eye cameras as visual detectors and method for constructing same

Publications (2)

Publication Number Publication Date
TW200528945A true TW200528945A (en) 2005-09-01
TWI375136B TWI375136B (en) 2012-10-21

Family

ID=48093266

Family Applications (1)

Application Number Title Priority Date Filing Date
TW94101592A TW200528945A (en) 2004-01-20 2005-01-19 3D visual measurement system using fish-eye cameras as visual detectors and method for constructing same

Country Status (1)

Country Link
TW (1) TW200528945A (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI555378B (en) * 2015-10-28 2016-10-21 輿圖行動股份有限公司 An image calibration, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof
TWI555379B (en) * 2015-11-06 2016-10-21 輿圖行動股份有限公司 An image calibrating, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof
CN108632504A (en) * 2017-03-15 2018-10-09 致伸科技股份有限公司 multi-lens optical device
TWI661392B (en) * 2017-12-27 2019-06-01 聚星電子股份有限公司 Image stitching method and device

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9367752B2 (en) 2012-06-28 2016-06-14 Nec Corporation Camera position posture evaluating device, camera position posture evaluating method, and camera position posture evaluating program
CN108470360A (en) * 2017-02-23 2018-08-31 钰立微电子股份有限公司 The image device and its correlation technique of depth map are generated using on-plane surface projected image
TWI660328B (en) * 2017-02-23 2019-05-21 鈺立微電子股份有限公司 Image device utilizing non-planar projection images to generate a depth map and related method thereof
CN108470360B (en) * 2017-02-23 2022-06-17 钰立微电子股份有限公司 Image device for generating depth map by using non-plane projection image and related method thereof
TWI606421B (en) * 2017-03-13 2017-11-21 國立交通大學 Method and device for fisheye camera automatic calibration
TWI646506B (en) * 2017-10-24 2019-01-01 華晶科技股份有限公司 Method and image pick-up apparatus for calculating coordinates of object being captured using fisheye images
US10762658B2 (en) 2017-10-24 2020-09-01 Altek Corporation Method and image pick-up apparatus for calculating coordinates of object being captured using fisheye images
TWI766206B (en) * 2018-11-13 2022-06-01 創惟科技股份有限公司 Method for correcting distortion image and apparatus thereof
US11741584B2 (en) 2018-11-13 2023-08-29 Genesys Logic, Inc. Method for correcting an image and device thereof
US11195297B2 (en) 2019-08-29 2021-12-07 China-Germany(Zhuhai)Artificial Intelligence Institute Co., Ltd Method and system for visual localization based on dual dome cameras
CN113873223A (en) * 2021-09-03 2021-12-31 大连中科创达软件有限公司 Camera definition determining method, device, equipment and storage medium
CN113873223B (en) * 2021-09-03 2023-07-21 大连中科创达软件有限公司 Method, device, equipment and storage medium for determining definition of camera

Also Published As

Publication number Publication date
TWI375136B (en) 2012-10-21

Similar Documents

Publication Publication Date Title
TW200528945A (en) 3D visual measurement system using fish-eye cameras as visual detectors and method for constructing same
CN105678742B (en) A kind of underwater camera scaling method
CN106595528B (en) A kind of micro- binocular stereo vision measurement method of telecentricity based on digital speckle
TWI397317B (en) Method for providing output image in either cylindrical mode or perspective mode
TWI555379B (en) An image calibrating, composing and depth rebuilding method of a panoramic fish-eye camera and a system thereof
CN110047104A (en) Object detection and tracking, head-mounted display apparatus and storage medium
ES2402229T3 (en) Image fusion method and device
CN106500596B (en) The measurement method of structure light panorama measuring system
CN109146965A (en) Information processing unit and computer program
CN206961066U (en) A kind of virtual reality interactive device
CN103871045B (en) Display system and method
CN106980368A (en) A kind of view-based access control model calculating and the virtual reality interactive device of Inertial Measurement Unit
CN106101689A (en) Utilize the method that mobile phone monocular cam carries out augmented reality to virtual reality glasses
CN104253989B (en) Full multi-view image display device
TW201101812A (en) Derivation of 3D information from single camera and movement sensors
CN108171758A (en) Polyphaser scaling method based on minimum time principle and transparent glass scaling board
CN106840112A (en) A kind of space geometry measuring method of utilization free space eye gaze point measurement
CN101408422A (en) Traffic accident on-site mapper based on binocular tridimensional all-directional vision
KR20180007349A (en) Head-mounted eye tracking device and method that provides drift-free eye tracking through a lens system
TW202145778A (en) Projection method of projection system
CN104200476B (en) The method that camera intrinsic parameter is solved using the circular motion in bimirror device
CN111811462A (en) Large-component portable visual ranging system and method in extreme environment
CN108269234A (en) A kind of lens of panoramic camera Attitude estimation method and panorama camera
TW565736B (en) Method for determining the optical parameters of a camera
CN109493378A (en) A kind of measuring for verticality method combined based on monocular vision with binocular vision