TWI379068B - Google Patents


Info

Publication number
TWI379068B
Authority
TW
Taiwan
Prior art keywords
light
edge
transparent body
edge position
line sensor
Prior art date
Application number
TW097135902A
Other languages
Chinese (zh)
Other versions
TW200921040A (en)
Inventor
Yoshihiko Okayama
Original Assignee
Azbil Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Azbil Corp
Publication of TW200921040A
Application granted
Publication of TWI379068B


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M11/00: Testing of optical apparatus; Testing structures by optical methods not otherwise provided for
    • G01M11/02: Testing optical properties
    • G01M11/0242: Testing optical properties by measuring geometrical properties or aberrations
    • G01M11/0271: Testing optical properties by measuring geometrical properties or aberrations by using interferometric methods
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00: Optical elements other than lenses
    • G02B5/18: Diffraction gratings
    • G02B5/1876: Diffractive Fresnel lenses; Zone plates; Kinoforms
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G06T7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/01: Arrangements or apparatus for facilitating the optical investigation
    • G01N2021/0181: Memory or computer-assisted visual determination

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Description

IX. Description of the Invention

[Technical Field]

The present invention relates to detecting the edge position of a transparent body such as a thin film or a sheet of glass, and for example to an edge detection device well suited to use in position control of such a transparent body.

[Prior Art]

When an object lies in the optical path of monochromatic light, typified by laser light, Fresnel diffraction occurs at the edge of the object. Edge detection devices have therefore been developed that use a line sensor to capture the light-quantity distribution produced by this Fresnel diffraction and analyze it to locate the edge of the object.

In this description, "light quantity" means the intensity of the monochromatic light received at the light-receiving surface of the line sensor, expressed as a ratio; it does not necessarily coincide with the usual definition based on human vision.

That is, as shown in Fig. 8, when an object 3 is placed so that it blocks part of the path of the monochromatic parallel light emitted from the light-projecting unit 2 toward the line sensor 1, which consists of a row of pixels, the output of the line sensor 1 changes sharply across the edge position of the object 3, as shown in Fig. 9. In particular, the light-quantity distribution on the line sensor 1 is influenced by Fresnel diffraction near the edge position and varies there in a characteristic way.

Accordingly, if the output of each pixel of the line sensor 1 is normalized, the position at which the light quantity is 25% of the full-light level can be measured as the edge position of the object 3 along the pixel-array direction of the line sensor 1. Moreover, when the Fresnel-diffraction light distribution is analyzed in this way, the edge position can be measured accurately even when the object 3 is a transparent body such as a transparent film or glass (see, for example, Japanese Patent Application Laid-Open No. 2004-177335). This is because the light passing through the interior of the transparent body differs in phase from the light diffracted on the free-space side, so the two interfere at the light-receiving surface and a pronounced drop in light quantity appears at the edge. The drop is largest when the phase difference between the light passing through the transparent body and the diffracted light on the free-space side is 180°.

When the object 3 is transparent, however, three states arise: the object absent (the full-light state of the line sensor 1, Fig. 10(a)); the object covering about half of the optical path (the edge-detection state, Fig. 10(b)); and the object covering the whole optical path (the full-shading state of the line sensor 1, Fig. 10(c)). As the sensor outputs (light-quantity distributions) in these figures show, telling the full-light state apart from the full-shading state is extremely difficult. In both of those states there is no Fresnel diffraction from an edge of the object 3 within the path, so the 25% position cannot be found from the sensor output; as the detection characteristic of Fig. 11 shows, even the full-shading state produced by the object 3 is detected as the full-light state. Consequently, when the position of the object 3 is being adjusted while its edge position is detected, it cannot be known which of the two states the sensor is in, and hence in which direction the position of the object 3 should be corrected.

Recently, noting that the light quantity in the full-shading state is somewhat lower than in the full-light state, a method has been proposed to resolve this: the light quantities of the pixels of the line sensor 1 are summed (the total light quantity), and when this total is smaller than the total in the full-light state, the whole of the line sensor 1 is judged to be covered by the transparent object 3, i.e. to be in the full-shading state (see, for example, Japanese Patent Application Laid-Open No. 2007-647335).

In that method, as shown in Fig. 12, the search for the edge-detection pixel starts from the free-space side (the left in the figure); when a pixel at or below the threshold is found, and the object 3 lies within the measurement range of the line sensor 1, the edge position E is found normally even if there is a soiled part D inside the object 3. However, when the line sensor 1 is entirely covered by the object 3, the judgment works only if the interior of the object is clean; the total light quantity after the search then shows that the sensor is covered. If the transparent object 3 covering the whole sensor contains a soiled part D, then, as shown in Fig. 13, that soiled part, i.e. the pixels at or below the threshold, is detected and output as an edge position. This is the problem to be solved.

[Summary of the Invention]

An object of the present invention is, besides detecting the edge position of a transparent body correctly, to determine reliably whether the line sensor is in the full-light state or the full-shading state and, further, to avoid misdetecting a soiled part inside a transparent body that covers the whole line sensor as an edge position, thereby providing an edge detection device ideally suited, for example, to position control of such a transparent body.

To achieve this, an edge detection device according to the invention comprises: a line sensor in which a plurality of pixels are arrayed at a fixed interval; a light source that irradiates the line sensor with monochromatic light; edge-position analysis means that detects, from the light-quantity distribution at the edge of a transparent body placed in the optical path of the monochromatic light, the edge position of the transparent body along the pixel-array direction of the line sensor; and full-shading judgment means that, at the moment the edge-position analysis means detects the edge position of the transparent body, obtains the total received light of the pixels on the free-space side up to the pixel corresponding to that edge position, and judges that the sensor is in the full-shading state caused by the transparent body when the difference between this total and the previously stored total received light, in the full-light state, of the pixels up to the pixel corresponding to that edge position exceeds a predetermined threshold.

An edge detection device according to another aspect of the invention comprises the same line sensor, light source, and edge-position analysis means, together with full-shading judgment means that measures the sum or the average of the received light of all the pixels of the line sensor as the total received light, compares it with the previously stored total received light in the full-light state, and judges that the sensor is in the full-shading state caused by the transparent body when the total received light at the moment the edge position is detected falls below a preset ratio (for example 90% of the full-light quantity, set so as to catch a 10% variation in light quantity).

With these edge detection devices, the light quantities of the pixels of the line sensor are summed while the edge position is being searched for from the free-space side, and the result is compared with the received light in the full-light state; from this comparison it is easy to decide whether the device is in the full-light state, with no transparent body in the optical path, or in the full-shading state, with the line sensor covered by the transparent body. In particular, because the presence or absence of a transparent body in the optical path can be judged simply from the output of the line sensor, the industrial benefit is very large when, for example, the edge detection device is installed on a manufacturing or inspection line for transparent bodies in order to position them.

Preferably, the full-shading judgment means is designed to compare the received-light level of the few pixels, among the pixels of the line sensor, on the free-space side where the edge-position search starts with the received-light level of those same pixels in the full-light state, and to judge that the sensor is in the full-shading state caused by the transparent body when the former is lower than the latter. This makes it easy to decide, during the search for the edge position from the free-space side, whether the whole line sensor is covered by the transparent body.

Preferably, too, the edge-position analysis means reads the received light of the pixels of the line sensor in order, starting from the pixel side of the free space that is in the full-light state, and locates the pixel position at which the received light has dropped by a predetermined ratio from the full-light level. Concretely, allowing for the object being transparent, the position where the light quantity becomes, say, 75% or 50% is measured, and the edge position of the transparent body (the 25% light position) is detected from that pixel position and the drop ratio; the edge position is thus obtained indirectly. When the edge position is measured in this way it is desirable to perform the detection by searching the received light from the end of the line sensor on the light-entry side, unaffected by light transmitted through the transparent body, and treating the part where the received light falls, i.e. the dip in the light-quantity distribution, as the presence of an edge. Then, even if the interior of the transparent body is soiled, the soiling is judged solely from the attenuation of the total received light of the line sensor, so the possibility of mistaking it for an edge position is avoided.

Furthermore, when the transparency of the transparent body is high, it is desirable that the line sensor and the light source be positioned so as to form an optical path inclined to the surface of the transparent body. The inclination enlarges the surface reflection and so attenuates the transmitted light, and it also increases the phase difference between the light passing through the interior of the transparent body and the light on the free-space side, deepening the interference-induced drop in light quantity at the edge; the edge position of the transparent body can thereby be found more accurately.

More desirably, the edge-position analysis means detects the edge position of the transparent body along the pixel-array direction of the line sensor from the light-quantity distribution of the Fresnel diffraction at the edge of the transparent body placed in the optical path of the monochromatic light, approximates the variation of the received light over the pixels of the line sensor produced by the Fresnel diffraction with an approximation-curve function, and uses that function to resolve the position, along the pixel-array direction, at which the light quantity takes a predetermined value as the edge position of the transparent body. In this way, even if the interior of the transparent body is soiled, misdetection of the soiling as an edge position is avoided.

[Embodiments]

An edge detection device according to an embodiment of the invention is described below with reference to the drawings.

Fig. 1 outlines the main parts of the edge detection device of this embodiment. As shown in Fig. 1, the device comprises: a line sensor 1 in which a plurality of pixels are arrayed at a fixed interval; a light source 2 facing the line sensor 1 and irradiating it with monochromatic parallel light; and a microcomputer 4 provided with edge-position analysis means 4a for analyzing the output signal (light quantity) from the line sensor 1 and detecting the edge position of the object 3 along the pixel-array direction of the line sensor 1. The light source 2 consists mainly of a laser element 2a and a projection lens 2b that converts the laser light emitted from the laser element 2a into parallel light and irradiates the line sensor 1 with it; the optical path of the monochromatic parallel light between the line sensor 1 and the light source 2 serves as the detection region for the edge of the object 3.

In this example the microcomputer 4 also has full-shading judgment means 4b, which, at the moment the edge-position analysis means 4a detects the edge position of the object 3, a transparent body, obtains the total received light up to the pixel of the line sensor 1 corresponding to that edge position in order to judge whether the line sensor 1 is covered.

The edge-position analysis means 4a of the microcomputer 4 includes, as described in Patent Document 1 above, normalization means that normalizes the output signal of the line sensor 1 pixel by pixel; it analyzes the normalized output signals (light quantities) of the pixels and measures the position where the light quantity is 25% as the edge position of the object 3 along the pixel-array direction of the line sensor 1. More specifically, the edge-position analysis means 4a examines the normalized output signals (light quantities) of the pixels l1, l2, ..., ln and finds, for example, the two pixels lk and lk+1 (k = 1 to n-1) between which the light quantity crosses 25%. The difference between the light quantities of these pixels depends on the light-quantity distribution produced by the Fresnel diffraction; the variation of the light quantity (the distribution) is approximated with an approximation-curve function such as a hyperbolic function, and that function is then used to obtain the position along the pixel-array direction at which the light quantity is 25% as the edge position of the object 3.

The full-shading judgment means 4b, for its part, is provided with means that, in advance, for example when the edge detection device starts up, with no object 3 inserted in the optical path, obtains the total received light from the output signal of the line sensor 1 measured in the full-light state and stores it as an initial value; this total is obtained as the sum of the output signals (light quantities) of the pixels l1, l2, ..., ln of the line sensor 1.

In operation (during edge detection), at the moment the edge-position analysis means 4a detects the edge position of the object 3, the full-shading judgment means 4b obtains the sum of the output signals (light quantities) of the pixels up to the pixel of the line sensor 1 corresponding to that edge position, giving the total received light. When this total is smaller than the total received light, derived from the stored initial value, of the pixels up to the one corresponding to that edge position, the line sensor 1 is judged to be in the full-shading state, covered by the object 3.

For example, as shown in Fig. 2, when the edge-position analysis means 4a detects the edge position E of the object 3 and the light quantity of each pixel of the line sensor 1 up to the 20th pixel, corresponding to the edge position E, is about 1.0, the total received light (the region enclosed by the solid line in Fig. 2) hardly differs from the full-light total up to the 20th pixel, so the full-shading judgment means 4b accepts that location as the edge position. When, on the other hand, the line sensor 1 is covered by the object 3, as shown in Fig. 3, and the edge-position analysis means 4a detects what appears to be an edge (for example a soiled part D) at the 48th pixel, the total received light up to the 48th pixel (the region enclosed by the solid line in Fig. 3) is about 10% smaller than the full-light total up to the 48th pixel; the full-shading judgment means 4b therefore does not treat that location D as an edge position but judges that the whole line sensor 1 is in the full-shading state, covered by the object 3, and outputs 0 mm.

In other words, besides the full-shading state and the full-light state now yielding different outputs, misdetection of soiling inside the transparent object 3 as an edge position is avoided. Here, when the detection characteristic is defined so that the edge position (edge-detection position) is at its maximum in the full-light state and decreases as the object 3 advances into the optical path, the edge position can be held at its minimum in the full-shading state, as shown in Fig. 4. In short, the previous failure, in which the edge position jumped to its maximum on entering the full-shading state because that state could not be distinguished from the full-light state, is suppressed.

As a result, an edge position corresponding correctly to how far the object 3 has entered the optical path is obtained, so the position of the object 3 can, for example, be shifted along the pixel-array direction of the line sensor 1 in accordance with that edge position to adjust it. In particular, even when the edge position cannot be measured, it is easy to judge in which direction the object 3 should be moved so that its edge comes into the optical path and can be detected; this is effective and useful for adjusting the position of the object 3.

Incidentally, when the object 3 is a transparent body, as in this embodiment, the edge is detected by using the interference produced when the monochromatic light from the light source 2 that has passed through the object 3 overlaps the diffracted monochromatic light on the free-space side. The monochromatic light from the light source 2 therefore may not be fully blocked, and the Fresnel-diffraction light distribution arising at the edge of the object 3 can be masked by the light transmitted through the object 3, making the 25% light position difficult to detect. In particular, when the transparency of the object 3 is high, detecting the edge position from the 25% light position becomes difficult.

In such a case the edge-position analysis means 4a may instead find, for example, the position where the light quantity is 75%, as shown in Fig. 5. Concretely, the edge-position analysis means 4a examines the normalized output signals of the pixels l1, l2, ..., ln and finds the two pixels lg and lg+1 (g = 1 to n-1) between which the light quantity crosses 75%; the difference between the light quantities of these pixels likewise depends on the Fresnel-diffraction light distribution described above, and the variation of the light quantity is approximated with an approximation-curve function such as a hyperbolic function, which is then used to obtain the edge position of the object 3. In other words, the 75% light position is, as shown in Fig. 5, a position offset by exactly Δx from the 25% edge position, the offset being determined by the wavelength λ of the monochromatic light, the distance z between the line sensor 1 and the object 3, and so on. Even without finding the 25% position directly, then, the edge position of the object 3 can be obtained indirectly by applying the offset correction Δx to the 75% light position found as described.

Moreover, in detecting the edge position as above, it is desirable to search the received light from the end of the line sensor on the light-entry side, unaffected by the light transmitted through the object 3, and to carry out the edge-position detection by treating the part where the received light falls, i.e. the dip in the light-quantity distribution, as the presence of an edge. Then, even if the interior of the object 3 is soiled, the soiling is judged solely from the attenuation of the total received light of the line sensor 1, so mistaking it for an edge position is avoided. That is, soiling inside the object 3 is not misdetected as an edge position, and whether the state is the full-light state or the full-shading state can still be determined; so when, for example, the position of the object 3 is being adjusted while its edge position is detected, the direction in which the position of the object 3 should be corrected can be determined correctly. The edge position is detected correctly without being affected by soiling or the like adhering to the surface of the object 3.

Although the full-shading judgment means 4b described above determines the full-shading state from the total received light of the line sensor 1, in the full-shading state the output signals of the pixels l1, l2, ..., ln of the line sensor 1 exhibit scatter, as shown in Fig. 10(c); this scatter, too, may be examined to decide whether the sensor is in the full-shading state. Since the pixel output signals also scatter as the line sensor 1 ages, however, it is advisable to check the output characteristics of the line sensor 1 periodically before judging the scatter of the pixel output signals (light quantities); and since the degree of scatter also changes with the type of the object 3, it is best to take this into account as well.

The full-shading judgment means 4b may also be designed to compare the received-light level of the few pixels at the end of the line sensor 1 on the light-entry side with the received-light level of those pixels in the full-light state, and to judge that the sensor is in the full-shading state caused by the object 3 when the level at the end of the line sensor 1 is the lower. In other words, depending on whether the output signals of the few pixels at the light-entry end of the line sensor 1 are near 1.0, for example within 0.9 to 1.1, it can be known whether the whole line sensor 1 is in the full-shading state, covered by the object 3.

When the transparency of the object 3 is high, however, it is conceivable that, even if the full-shading judgment means 4b examines the sum of the received light over the line sensor 1 (the total received light) as described, a change of 10% or more in the received light may not occur. To avoid this, the optical path formed between the line sensor 1 and the light source 2 may be set obliquely to the surface of the object 3, as illustrated conceptually in Fig. 6. The edge position measured by the edge-position analysis means 4a is then corrected, by the amount corresponding to the inclination of the optical path to the surface of the object 3, according to the inclination angle θ, whereby the edge position of the object 3 is detected correctly. That is, even when the transparency of the object 3 is high, inclining the optical path to the surface of the object 3 increases the reflection at that surface and reduces the amount of light that passes through the object 3 and reaches the line sensor 1. As a result, compared with the case where the optical path is set at a right angle to the surface of the object 3, in the inclined setting the received light is lower and the scatter of the received light over the pixels l1, l2, ..., ln is larger, as the contrasted line-sensor outputs in Figs. 7(a) and 7(b) show. If the optical path is set obliquely to the surface of the object 3, then, the surface reflection is enlarged even when the transparency of the object 3 is high, so the presence of an object 3 obscuring the optical path is detected reliably.

The invention is not limited to the embodiment above. For example, although the embodiment describes the edge-position analysis means 4a as detecting the edge position of the object 3 along the pixel-array direction of the line sensor 1 from the light-quantity distribution of the Fresnel diffraction at the edge of the object 3 placed in the optical path of the monochromatic light, the invention is not limited to this. Likewise, although the embodiment describes the edge-position analysis means 4a as analyzing the Fresnel-diffraction light distribution with a hyperbolic quadratic function, other approximation-curve functions may of course be used. Further, although only the sum or average of the received light of the pixels has been mentioned as the information on the total received light of the line sensor 1, the judgment condition for the full-shading state may be set with disturbance factors such as the transparency of the transparent body 3 at the edge-detection position and external light taken into account.

[Brief Description of the Drawings]

Fig. 1 is an outline of the main parts of an edge detection device according to an embodiment of the invention.
Fig. 2 shows an example of the change in line-sensor output when the edge detection device detects the edge position of a transparent object.
Fig. 3 shows an example of the change in line-sensor output when the edge detection device judges that the line sensor is covered by a transparent object.
Fig. 4 is a graph of the edge-position detection characteristic of the edge detection device.
Fig. 5 illustrates another method of edge-position detection.
Fig. 6 illustrates a method of edge-position detection when the transparency of the transparent object is high.
Figs. 7(a) and 7(b) contrast the change in received light when the optical path is normal to the surface of the transparent object with the change when it is inclined.
Fig. 8 is an outline of a previous edge detection device.
Fig. 9 shows an example of line-sensor output used to explain the edge-detection principle.
Figs. 10(a) to 10(c) show examples of line-sensor output when the transparent object is not in the optical path (a), when the line sensor is half covered by the object (b), and when the whole line sensor is covered by the object (c).
Fig. 11 is a graph of the edge-detection characteristic of the previous edge detection device.
Fig. 12 shows an example of line-sensor output when the previous edge detection device detects the edge position of a transparent object.
Fig. 13 shows an example of line-sensor output when the previous edge detection device misjudges a soiled part as the edge position of the object.

[Description of Reference Numerals]

1: line sensor; 2: light source; 3: object to be detected; 4: microcomputer; 2a: laser element; 2b: projection lens; 4a: edge-position analysis means; 4b: full-shading judgment means; E: edge position
In particular, the effect of the Fresnel diffraction around the line sensor arrangement is related to the detection of the position of the transparent edge, such as the ideal edge detection device described above. [Prior Art] If an object exists in the edge position of the object in the form of laser light, it will be generated. Using the line sensor to find the above and solve the new light quantity distribution, the edge detection device can be detected. , has been developed. In addition, in the present invention, the light-receiving surface of the so-called detector is monochromatic before being received by the light, and is not necessarily identical to the human-based view, that is, as shown in FIG. The portion 2 is irradiated with a single color to cover the detected position, and the edge position of the output 3 of the line sensor 1 is marked as a boundary, and the light quantity distribution is greatly changed. And there is a certain tendency to change. Therefore, if the output of each pixel of the line sensor 1 is normalized, the amount of light can be 25% of the position when the light is completely incident, and the detected direction of the pixels in the direction of the line sensing 1379068 The edge position of the object 3 is measured and measured. Further, when the light distribution of the franie diffraction is analyzed in this way, even if the object to be detected 3 is a transparent body such as a transparent film or glass, the edge position can be accurately measured (for example, refer to 曰本特2 0 0 4 -1 773 3 Bulletin 5). This is because "the light passing through the inside of the transparent body is different from the phase of the diffracted light on the free space side. Therefore, interference occurs on the light receiving surface, and a large amount of light falls in the edge portion. 
This drop in the amount of light is the largest when the phase difference between the light inside the transparent body and the diffracted light on the free space side is 1 80°. However, when the object 3 to be detected is a transparent body, when the object 3 to be detected does not exist (the all-light state of the line sensor 1 shown in FIG. 1 (a)), it is detected. When the object 3 covers a half of the optical path (the edge detection state shown in FIG. 1 (b)), and the detected object 3 covers all of the optical path (the line shown in FIG. 1 〇 (c)) The full shading state of the sensor 1) 'from the output of these line sensors ( (light distribution), it can be seen that the difference between the full light state and the full light state is very difficult, that is, 'in the full light state In the full shading state, since there is no freinet diffraction generated at the edge of the object 3 to be detected, it is impossible to detect the amount of light from the output of the line sensor 1 to become the aforementioned 25% position, as shown in FIG. The detection characteristic of the edge detecting device shown in FIG. 1 can be detected that even if it is in the full shading state caused by the detected object 3, it will be detected as a full light entering state. For example, detecting on one side is detected. The edge position of the object 3 is 1379068, and the object to be detected 3 is adjusted. It is said that the position of the object to be detected 3 is the closest to the full light-shielding state and the full light-shielding state, and the light quantity is reduced in the full light-shielding state, and in order to solve the above-mentioned bad situation, the pixel is obtained. The addition of the amount of light (the total amount of light), the total amount of the light of the volume is less than the total amount of light in the all-into-light state, and the entire line sensor 1 is covered by the transparent body. 
This mode has been proposed in 2007. -6473 Gazette No. 35). In the above manner, as shown in FIG. 12, when the pixel of the pixel for detecting the edge position is started to be searched for on the left side, 'if the object 3 to be detected is in the line bar', for example, even if the object 3 is detected. The edge position E can still be found normally. However, if the entire line sensor 1 is covered, the total amount of light after the internal search of the object 3 is not dirty can be judged as being covered by the line sense object 3, but when the line is covered The body is also dirty inside the object to be detected 3 as shown in Fig. 13. The dirty part d, that is, the threshold detection, will be referred to as the edge position. In the case of J, it is not known: one, so it is not [correction, etc., is a more complete light state than the line sensor 1 of the line sensor 1 in all cases, it is judged as the detected object The object 3 (for example, referring to the Japanese special free space side (Fig., when the measurement portion of the sensor 1 below the threshold is found to have a dirty portion D, and the object to be contaminated by the object 3 is detected, the system 1 is based on the whole When the transparent portion D of the entire sensor 1 is present, the following pixels will be emitted. There is such a problem. 1379068. [Invention] The object of the present invention is to accurately detect the edge of the transparent body. The position can also be surely determined whether the line sensor is in the all-into-light state or in the full-light-shielding state, and when the inside of the transparent body covering the entire line sensor 1 has a dirty portion, it can be avoided. The dirty portion is erroneously detected as an edge position, for example, an edge detecting device that is ideally suited for position control of the upper transparent body is provided. 
To achieve the above object, the edge detecting device of the present invention is characterized by In order to have: a line sensor, the plurality of pixels are arranged at a predetermined interval: and the light source is directed to the line sensor to illuminate the monochromatic light; and the edge position analysis means is based on being placed in the pre-recorded monochrome The light quantity distribution on the edge of the transparent body in the light path detects the edge position of the front transparent body in the pixel arrangement direction of the front line sensor; and the full shading state determining means is based on the edge position analysis means When the edge position of the front transparent body is detected, the total light receiving amount of the pixel on the free space side corresponding to the edge position of the front transparent body in the front line sensor is obtained, and the total light receiving amount and the pre-memory are obtained. The difference in the total amount of received light corresponding to the pixel at the front edge position in the all-light incident state of the line sensor is determined to be the total light-shielding state caused by the front transparent body when the predetermined value is exceeded. 
An edge detecting device according to another aspect of the present invention is characterized by comprising: a line sensor, wherein the plurality of pixels are arranged at a predetermined interval: and a light source The line sensor illuminates the monochromatic light; and the edge position resolution means detects the pixel of the pre-recorded line sensor according to the light quantity distribution on the edge of the transparent 1379068 body placed in the optical path of the pre-recorded monochromatic light The edge position of the front transparent body in the arrangement direction; and the full light-shielding state determining means determine the corresponding position in the front line sensor when the edge position analysis means detects the edge position of the front transparent body The total amount of light received by the pixel on the free space side at the edge position of the transparent body is measured as the total or the average amount of received light on the complex pixels of the front line sensor, and is measured as a pre-memory. The total light-receiving amount in the all-light-in state is compared, and the total light-receiving amount when the edge position of the front transparent body is measured is lower than a predetermined ratio (for example, 90% of which is set to detect a 10% change in the light amount) The amount of light is judged to be the total shading state caused by the front transparent body. According to these edge detecting devices, in the stage of searching for the edge position from the free space side, the amount of light of each pixel of the line sensor is added simultaneously with the amount of light received and the amount of light received when the light is completely incident. 
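The second-aspect judgment, comparing the summed light of all pixels against a stored full-light total with a preset ratio such as 90%, can be sketched as follows (a minimal Python illustration; the function name, the 64-pixel sensor, and the exact threshold handling are assumptions, not taken from the patent):

```python
def is_fully_shaded(pixels, full_light_total, ratio=0.90):
    """Second-aspect check: the summed received light of all pixels is
    compared with the stored full-light total; a total below the preset
    ratio (90% here, chosen so a 10% drop is caught) is judged to be the
    full-shading state caused by the transparent body."""
    total = sum(pixels)
    return total < ratio * full_light_total

# A sensor fully covered by a transparent body passes slightly less light
# on every pixel, so the total drops even though no edge dip is visible.
full_light = [1.0] * 64
covered = [0.88] * 64
assert not is_fully_shaded(full_light, sum(full_light))
assert is_fully_shaded(covered, sum(full_light))
```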
For comparison, it is therefore possible to easily determine whether it is in the all-light state in which no transparent body exists in the optical path based on the comparison result above or in the full light-shielding state in which the line sensor is covered by the transparent body. In particular, since it is possible to easily determine whether or not there is a transparent body in the optical path according to the output of the line sensor, for example, when an edge detecting device is mounted in the manufacturing of the transparent body and the inspection line to perform positioning of the transparent body, the industrial The advantage is very large. The ideal system is designed such that the pre-recorded full shading state judging means is designed such that the light receiving level of the pixels on the free space side of the front end of the front edge transparent body among the plurality of pixels constituting the front line sensor is 8 - 1379068 Comparison of the light-receiving levels in the full-light state of a number of pixels' When the light-receiving level of a pixel on the free-space side of the edge position detection characteristic of the front-end transparent body is lower than the full-light state of the number of pixels When the light receiving level is ', it is judged to be the full shading state caused by the front transparent body. Thereby, in the stage of finding the edge position from the free space side, it is possible to easily determine whether or not the entire line sensor is covered by the transparent body. 
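The variant that inspects only the first few pixels on the free-space side can be sketched like this (hypothetical names; the 0.9 to 1.1 window is the one the description gives for normalized outputs near 1.0, while the choice of four pixels is an assumption):

```python
def entry_pixels_at_full_light(pixels, n_check=4, low=0.9, high=1.1):
    """Compare the received-light level of the first few pixels on the
    free-space (light-entry) side with their full-light level: normalized
    outputs near 1.0 mean the path there is unobstructed, while lower
    levels suggest the whole sensor is covered by the transparent body."""
    return all(low <= p <= high for p in pixels[:n_check])

assert entry_pixels_at_full_light([1.0, 0.99, 1.01, 0.98, 0.5])
assert not entry_pixels_at_full_light([0.85, 0.86, 0.85, 0.84])
```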
Further, ideally, the front edge position analyzing means can be designed such that the amount of light received on each pixel of the front line sensor is sequentially read from the pixel side of the free space in the all-into-light state, and The amount of received light is a position at which the pixel of the predetermined ratio is reduced from the fully light-receiving state, and specifically, the object to be detected is a transparent body, for example, a light amount position in which the amount of light is changed to 75% or 50% is measured. According to the ratio of the pixel position to the amount of light received by the above, the edge position of the front transparent body (the position of 25% of the light amount) can be detected, whereby the edge position of the transparent body can be indirectly obtained. When the edge position is measured in this way, it is not affected by the light that has passed through the transparent body, and the amount of light received from the end portion of the line sensor on the light incident side is searched, and the so-called light amount distribution is reduced. The part that falls down is regarded as having an edge position, and it is preferable to perform the detection processing of the edge position. In this way, even if the inside of the transparent body is dirty, the contamination can be judged based only on the attenuation of the total amount of received light of the line sensor, so that the possibility of erroneously detecting the contamination as an edge position can be avoided. Further, when the transparency of the transparent body is high, the ideal form is that the front line sensor and the light source are positioned to form an optical path inclined to the surface of the front transparent body by -9 - 1379068. 
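The sequential read from the free-space side can be sketched as a simple scan for the first pixel whose normalized output falls below a chosen drop level such as 75% (a minimal sketch; the names and the sample profile are assumptions):

```python
def find_dip_index(pixels, threshold=0.75):
    """Read normalized pixel outputs in order from the free-space
    (full-light) side and return the index of the first pixel whose light
    has fallen below the given fraction of the full-light level, or None.
    The edge itself (the 25% position) is then recovered from this index
    and the known drop ratio, so light transmitted through the transparent
    body does not disturb the search."""
    for i, p in enumerate(pixels):
        if p < threshold:
            return i
    return None

profile = [1.0, 1.0, 0.98, 0.9, 0.7, 0.4, 0.2, 0.1]
assert find_dip_index(profile) == 4          # first pixel under 75%
assert find_dip_index([1.0] * 8) is None     # no dip: full-light state
```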
This is because the reflected light is attenuated by tilting, and the transmitted light is attenuated, and the phase difference between the light passing through the inside of the transparent body and the light on the free space side is increased by tilting, thereby causing interference. The amount of light at the edge portion is further increased, whereby the edge position of the transparent body can be more accurately obtained. ^ It is more desirable to configure the pre-recording edge position analysis means to detect the pre-recorded line sensation according to the light quantity distribution of the Frey Φ ray diffraction on the edge of the transparent body placed in the optical path of the pre-recorded monochromatic light. The edge position of the front transparent body in the pixel arrangement direction of the detector, and the change of the amount of received light on each pixel of the pre-recorded line sensor generated by the franui diffraction is approximated by using an approximate curve function. By approximating the curve function, the position of the pixel arrangement direction on the front line sensor is the position of the predetermined amount of light, and is analyzed as the edge position of the front transparent body. By this, even if the inside of the transparent body is dirty, it is possible to avoid erroneously detecting the stain as an edge position. [Embodiment] Hereinafter, an edge detecting device according to an embodiment of the present invention will be described with reference to the drawings. 1 is a schematic diagram of a main part of an edge detecting device according to the embodiment; as shown in FIG. 
1, a line sensor 1 in which a plurality of pixels are arranged at a predetermined interval; The line sensor 1 is provided with a light source 2 for illuminating the line sensor 1 with monochromatic parallel light; and the microcomputer 4 is provided with an edge position analyzing means 4a' for analyzing the input from the line sensor 1 - 10 1379068 The signal number (light quantity) detects the edge position of the detected object 3 in the pixel arrangement direction of the line sensor 1; the above-mentioned light source 2 is mainly composed of the laser element 2a and the The laser light emitted from the element 2a is converted into parallel light and is irradiated to the light-receiving lens 2b of the line sensor 1. The light path of the upper parallel light between the line sensor 1 and the light source 2 is recorded. The function is to detect the detection area used for the edge of the detected object 3. In this example, the microcomputer 4 is provided with a full shading state determining means 4b for determining the line sensor at the time when the edge position detecting means 4a detects the edge position of the object 3 to be detected belonging to the transparent body. The total received light amount corresponding to the pixel at the edge position of the object 3 to be detected in 1 is determined to determine whether or not the line sensor 1 is in a covered state. Here, the edge position analyzing means 4a of the microcomputer 4 is a normalizing means for normalizing the output signal of the line sensor 1 for each element as described in the above-mentioned Patent Document 1, and each of which has been normalized The output signal (light amount) represented by the amount of light received by the pixel is analyzed, and the position where the amount of light is 25 % is measured as the edge position of the detected object 3 in the pixel arrangement direction of the line sensor 1 . 
More specifically, the edge position analyzing means 4a is for investigating the output signals (light amounts) of the pixels 11, 12 to In which have been normalized, for example, obtaining two pixels lk before and after the light amount is 25%, lk +l(k=l~η" )' The difference in the amount of light of these pixels lk, lk+l depends on the distribution of the light amount generated by the diffraction of Inverine, and the change in the amount of light (the amount of light) is hyperbolic. Approximate curve function of the function, etc. to approximate, and then use the approximate curve function (light quantity distribution) to position the light quantity -11 - 1379068 to 25% in the pixel arrangement direction as the edge of the detected object 3 Find the position. On the other hand, the pre-recorded full-shading state determining means 4b is provided in a state in which the object to be detected 3 is not inserted in the optical path before the edge detecting means is activated, etc., based on the measured The output signal of the line sensor 1 in the all-into-light state is used to determine the total amount of received light, and it is used as an initial means to record the billions; the total amount of received light is determined by the line sensor. The sum of the output signals (light quantities) of the complex pixels 11, 12 to In of 1 is obtained. In the full shading state determining means 4b, when the edge position analyzing means 4a detects the edge position of the detected object 3, the line sensor 1 is obtained during the operation (edge detection). 
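A minimal sketch of locating the 25% position between the straddling pixels lk and lk+1 follows; the patent fits the Fresnel distribution with an approximation curve such as a hyperbolic function, and plain linear interpolation merely stands in for that fit here (the 10 µm pixel pitch and all names are assumptions):

```python
def edge_position_25(pixels, pitch_um=10.0, target=0.25):
    """Locate the two adjacent pixels whose normalized outputs straddle
    the target light quantity (25% of full light) and interpolate the
    sub-pixel edge position between them. Linear interpolation is used
    as a stand-in for the approximation-curve fit described in the
    patent; pitch_um is an assumed pixel pitch."""
    for k in range(len(pixels) - 1):
        a, b = pixels[k], pixels[k + 1]
        if a >= target >= b:  # light quantity falls through the 25% level
            frac = (a - target) / (a - b)
            return (k + frac) * pitch_um
    return None  # no crossing: full-light or full-shading state

# Normalized profile falling across the 25% level between pixels 4 and 5.
profile = [1.0, 0.95, 0.8, 0.55, 0.35, 0.15, 0.05]
pos = edge_position_25(profile)
assert pos is not None and 44.9 < pos < 45.1
```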
The sum of the output signals (light quantities) corresponding to the pixels 11, 12 to In of the edge position of the object to be detected 3, to obtain the total light receiving amount, when the total light receiving amount is less than 'remembered as described above When the total light-receiving amount of the pixels 11 and 12 to In corresponding to the edge position of the upper edge obtained based on the total amount of received light of the initial flaw is determined, it is determined that the line sensor 1 is the object to be detected 3 The full shading state covered. For example, as shown in FIG. 2, when the edge position analyzing means 4a detects the edge position E of the detected object 3, the edge position E of the line sensor 1 corresponding to the detected object 3 When the amount of light of each of the 20th pixels is about 1.0, the total amount of light received (the area surrounded by the solid line in Fig. 2 is the total amount of light received from the second pixel when the light is completely incident) There is almost no difference, so in the full shading state determining means -12-1379068 4b, the portion is determined as the edge position. Further, when the line sensor 1 is covered by the object 3 to be detected, As shown in FIG. 3, the edge position analyzing means 4a receives the total amount of light received by the 48th pixel at the time when the 48th pixel detects a portion which seems to be the edge position (for example, the dirty portion D) (solid line in FIG. 3) In the area surrounded by the 48th pixel, the total light-receiving amount is about 1%. Therefore, in the total light-shielding state determining means 4b, the portion D is not regarded as the edge position, but is determined as In the line sensor] the whole object is the object to be detected 3, the full shading state covered, and the output 〇mm. 
That is, since the full shading state and the full light-incident state yield different output results, even if the interior of the object to be detected 3, which is a transparent body, is dirty, falsely detecting that dirt as an edge position can be avoided. Here, the detection characteristic is such that the edge position (edge detection position) is at its maximum in the full light-incident state and decreases as the amount by which the object to be detected 3 enters the optical path gradually increases; as shown in Fig. 4, when the full shading state is reached, the edge position is held at its minimum. This suppresses the previous defect in which, because the full shading state could not be distinguished from the full light-incident state, the edge position jumped abruptly to its maximum. As a result, the edge position corresponding to the amount by which the object to be detected 3 has entered the optical path can be obtained correctly. Therefore, for example, the position of the object to be detected 3 can be adjusted in the pixel arrangement direction of the line sensor 1 in accordance with the detected edge position. In particular, even when the edge position cannot be detected, it can easily be determined in which direction the object to be detected 3 should be moved so that its edge enters the optical path and the edge position can be detected; this makes the position adjustment of the object to be detected 3 effective and useful. Incidentally, when the object to be detected 3 is a transparent body as in this embodiment, the edge is detected by utilizing the interference between the monochromatic light from the light source 2 that has passed through the object to be detected 3 and the diffracted light of the monochromatic light on the free space side. Consequently, the monochromatic light from the light source 2 cannot be completely blocked, the diffracted light generated at the edge of the object to be detected 3 may be buried in the light transmitted through the object, and the aforementioned 25% light amount position can become difficult to detect. In particular, when the transparency of the object to be detected 3 is high, it is difficult to detect the edge position from the 25% light amount position. In this case, therefore, the edge position analyzing means 4a may instead obtain the position at which the light amount becomes 75%, for example as shown in Fig. 5. Specifically, the edge position analyzing means 4a examines the normalized output signals of the pixels 1_1, 1_2, …, 1_n and obtains the two pixels 1_g and 1_g+1 (g = 1 to n−1) immediately before and after the point at which the light amount crosses 75%. The difference in the light amounts of these pixels 1_g and 1_g+1 also depends on the light amount distribution produced by the Fresnel diffraction described above, so the change in light amount (light amount distribution) is likewise approximated by an approximation curve function such as a hyperbolic function, and the position in the pixel arrangement direction at which the light amount becomes 75% is then obtained from this approximation curve function (light amount distribution). This 75% light amount position is offset by exactly Δx from the edge position given by the 25% light amount position, where Δx is determined by the wavelength λ of the monochromatic light and the distance z between the line sensor 1 and the object to be detected 3.
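The 75% variant can be sketched in the same way. A hedged illustration: the text says only that the offset Δx is determined by the wavelength λ and the distance z, without giving a formula, so Δx is taken here as a known input; the function names are invented, linear interpolation again stands in for the hyperbolic approximation, and the sign of the shift assumes pixels are indexed from the free-space side (so the 75% crossing precedes the 25% edge position).

```python
import numpy as np

def crossing(light, threshold):
    """Sub-pixel position where the normalized light amount crosses
    `threshold` (linear interpolation in place of the hyperbolic fit)."""
    for g in range(len(light) - 1):
        if light[g] >= threshold > light[g + 1]:
            return g + (light[g] - threshold) / (light[g] - light[g + 1])
    return None

def edge_via_75(light, delta_x):
    """Edge position estimated from the 75% crossing, shifted by delta_x.
    delta_x is determined (per the text) by λ and the sensor-to-object
    distance z; it is treated here as a pre-computed constant."""
    p75 = crossing(light, 0.75)
    return None if p75 is None else p75 + delta_x

# Sigmoid-like Fresnel falloff centered at pixel 20.
light = 1.0 / (1.0 + np.exp((np.arange(64) - 20.0) / 1.5))
delta_x = crossing(light, 0.25) - crossing(light, 0.75)  # ideal offset here
edge = edge_via_75(light, delta_x)
```

With the ideal offset, the corrected 75% result coincides with the 25% edge position, which is the point of the indirect method.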
Therefore, even when the 25% light amount position cannot be detected as described above, the edge position of the object to be detected 3 can still be obtained indirectly by applying the correction for the offset Δx to the 75% light amount position obtained as described above. In addition, when detecting the edge position, it is desirable that the edge position detection processing search the received light amounts in order from the end portion of the line sensor 1 on the light-incident side and treat the first portion at which the received light amount drops as the edge position. In this way, even if the interior of the object to be detected 3 is dirty and a dirty portion is at first mistakenly detected as an edge position, it can still be judged from the determination described above that the sensor is in the full shading state or the like; thus, for example, even while the position of the object to be detected 3 is being adjusted on the basis of the detected edge position, the position of the object to be detected 3 can still be determined correctly. That is, the edge position can be detected correctly without being affected by dirt adhering to the object to be detected 3. Further, although the full shading state determining means 4b described above determines the full shading state by focusing on the total received light amount of the line sensor 1 (including when the 75% light amount position of Fig. 5 is used), in the full shading state, as shown in Fig. 10(c), a variation arises in the output signals of the pixels 1_1, 1_2, …, 1_n of the line sensor 1, so this variation may also be examined to determine whether the sensor is in the full shading state. However, since the output signals of the pixels 1_1, 1_2, …, 1_n also vary with the aging of the line sensor 1, it is preferable to check the output characteristics of the line sensor 1 periodically and then make the above determinations on the basis of the differences in the output signals (light amounts) of the pixels 1_1, 1_2, …, 1_n. Moreover, since the degree of this variation also changes depending on the type of the object to be detected 3, it is preferable to take this into consideration when determining whether the sensor is in the full shading state. Alternatively, the full shading state determining means 4b may be designed to compare the received light level of the pixels on the end side of the line sensor 1 on the light-incident side with the received light level of those pixels in the full light-incident state, and to judge that the full shading state caused by the object to be detected 3 exists when the received light level of the pixels on the end side of the line sensor 1 is lower than their received light level in the full light-incident state. In other words, by determining whether or not the output signal of the pixels on the light-incident end side of the line sensor 1 is near 1.0, for example in the range 0.9 to 1.1, it is also possible to know whether the entire line sensor 1 is in the full shading state in which it is covered by the object to be detected 3.
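The two full-shading criteria above (shortfall of the total received light against the stored full-light-incident baseline, and darkening of the light-incident end pixels) can be combined in a small check. This is an illustrative sketch: the function name, the 5% shortfall threshold, the three-pixel window, and the 0.9 level are all assumed values, not figures from the patent.

```python
import numpy as np

def is_full_shading(light, edge_pixel, baseline_cumsum,
                    shortfall=0.05, end_pixels=3, end_level=0.9):
    """Judge the full shading state two ways:
    1. the total received light up to the detected edge pixel falls short
       of the stored full-light-incident total by more than `shortfall`;
    2. the first pixels on the light-incident end read well below the
       ~1.0 normalized level expected when light enters freely."""
    total = light[:edge_pixel + 1].sum()
    expected = baseline_cumsum[edge_pixel]
    below_baseline = (expected - total) / expected > shortfall
    end_dark = light[:end_pixels].mean() < end_level
    return bool(below_baseline or end_dark)

# Baseline stored at startup, with no object in the optical path.
baseline_cumsum = np.ones(64).cumsum()

covered = np.full(64, 0.7)   # whole sensor behind the transparent body
covered[48] = 0.4            # a dirty portion that mimics an edge
shaded = is_full_shading(covered, 48, baseline_cumsum)

normal = np.ones(64)
normal[20:] = 0.0            # genuine edge at pixel 20, full light before it
ok = is_full_shading(normal, 19, baseline_cumsum)
```

Here `shaded` comes out `True` (the dirty portion at pixel 48 is rejected as an edge) while `ok` comes out `False` (the genuine edge passes).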
However, when the transparency of the object to be detected 3 is high, even if the full shading state determining means 4b examines the sum of the received light amounts over the line sensor 1 (the total received light amount) as described above, a change in the received light amount of more than 1% may not occur. To avoid this problem, the optical path formed between the line sensor 1 and the light source 2 may be set so as to be inclined to the surface of the object to be detected 3, for example as shown in Fig. 6. The edge position measured by the edge position analyzing means 4a is then corrected by the inclination angle θ of the optical path with respect to the surface of the object to be detected 3, whereby the edge position of the object to be detected 3 can be detected correctly. That is, even if the transparency of the object to be detected 3 is high, inclining the optical path with respect to the surface of the object to be detected 3 increases the reflection at that surface and reduces the amount of light that passes through the object to be detected 3 and reaches the line sensor 1. As a result, as shown in Figs. 7(a) and (b), compared with the case where the optical path is set at a right angle to the surface of the object to be detected 3, the received light amount of each of the pixels 1_1, 1_2, …, 1_n of the line sensor 1 drops considerably when the optical path is inclined. Accordingly, if the optical path is inclined to the surface of the object to be detected 3, the surface reflection can be increased even when the transparency of the object to be detected 3 is high, so the presence of the object to be detected 3 in the optical path can be detected reliably. The present invention is, of course, not limited to the above-described embodiment.
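The text states that the edge position measured under the inclined optical path is corrected by the angle θ, but does not spell out the geometry. One plausible correction, under the assumption that a displacement d along the tilted object surface projects onto the sensor (perpendicular to the optical path) as d·sin(θ), where θ is the angle between the path and the surface, is:

```python
import math

def correct_for_tilt(measured_position, theta_deg):
    """Recover the edge displacement along the object surface from the
    position measured on the line sensor, assuming (this geometry is not
    given in the text) the sensor sees the projection d * sin(theta) of a
    surface displacement d; theta is the angle between the optical path
    and the object surface."""
    return measured_position / math.sin(math.radians(theta_deg))

# At theta = 90 deg (path perpendicular to the surface) no correction occurs;
# at shallower angles the measured position under-reads and is scaled up.
perp = correct_for_tilt(5.0, 90.0)
tilted = correct_for_tilt(5.0, 30.0)
```

The 90° case reduces to the uncorrected value, which is a quick sanity check on the assumed geometry.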
For example, in the embodiment described above, the edge position analyzing means 4a detects the edge position of the object to be detected 3 in the pixel arrangement direction of the line sensor 1 on the basis of the light amount distribution of the Fresnel diffraction produced at the edge of the object to be detected 3 placed in the optical path of the monochromatic light, but the invention is not limited thereto. Further, although the above embodiment has been described with the edge position analyzing means 4a approximating the light amount distribution of the Fresnel diffraction by a hyperbolic function, other approximation curve functions may of course be employed. Moreover, as the information on the total received light amount of the line sensor 1, the average of the received light amounts of the plurality of pixels may be used instead of the sum, and the determination condition for the full shading state can be set in consideration of disturbance factors such as the transparency of the transparent body 3 serving as the edge detection target or external light.

BRIEF DESCRIPTION OF THE DRAWINGS

Fig. 1 is a schematic structural view of the essential part of an edge detecting device according to an embodiment of the present invention.
Fig. 2 is a view showing an example of the change in the output of the line sensor when the edge position of the object to be detected, which is a transparent body, is detected by the edge detecting device.
Fig. 3 is a view showing an example of the change in the output of the line sensor when the line sensor is covered by the object to be detected, which is a transparent body, in the edge detecting device.
Fig. 4 is a diagram showing the edge position detection characteristics of the edge detecting device.
Fig. 5 is an illustration of another method of edge position detection.
Fig. 6 is a view showing a method of edge position detection when the transparency of the object to be detected, which is a transparent body, is high.
Fig. 7 is a comparison diagram showing the change in the received light amount when the optical path is at a right angle to the surface of the object to be detected, which is a transparent body (a), and the change in the received light amount when the optical path is inclined (b).
Fig. 8 is a schematic block diagram of a previous edge detecting device.
Fig. 9 is a diagram showing an example of the output of the line sensor for explaining the edge detecting principle of the edge detecting device.
Fig. 10 shows examples of the change in the output of the line sensor in the state in which the object to be detected, a transparent body, is not in the optical path (a), in the state in which the line sensor is half covered by the object to be detected (b), and in the state in which the entire line sensor is covered by the object to be detected (c).
Fig. 11 is an illustration of the edge detection characteristics of previous edge detecting devices.
Fig. 12 is a diagram showing an example of the change in the output of the line sensor when the edge position of the object to be detected, a transparent body, is detected by a previous edge detecting device.
Fig. 13 is a diagram showing an example of the change in the output of the line sensor when a dirty portion is mistakenly judged to be the edge position of the object to be detected by a previous edge detecting device.

[Main component symbol description]
1 : Line sensor
2 : Light source
3 : Object to be detected
4 : Microcomputer
2a : Laser element
2b : Light-projecting lens
4a : Edge position analyzing means
4b : Full shading state determining means
E : Edge position

-20--20-

Claims (1)

1379068
Patent Application No. 097135902 — Amended claims, amended September 4, 2012 (September 4, ROC year 101)

X. Claims

1. An edge detection device, comprising:
a line sensor in which a plurality of pixels are arranged at predetermined intervals;
a light source that irradiates monochromatic light toward the line sensor;
edge position analyzing means for detecting, on the basis of the light amount distribution at the edge of a transparent body placed in the optical path of the monochromatic light, the edge position of the transparent body in the pixel arrangement direction of the line sensor; and
full shading state determining means for obtaining, at the time when the edge position analyzing means detects the edge position of the transparent body, the total received light amount of the pixels on the free space side of the line sensor corresponding to the edge position of the transparent body, and for determining that a full shading state caused by the transparent body exists when the difference between this total received light amount and the previously stored total received light amount, up to the pixel corresponding to the edge position, in the full light-incident state of the line sensor exceeds a predetermined threshold.

2. An edge detection device, comprising:
a line sensor in which a plurality of pixels are arranged at predetermined intervals;
a light source that irradiates monochromatic light toward the line sensor;
edge position analyzing means for detecting, on the basis of the light amount distribution at the edge of a transparent body placed in the optical path of the monochromatic light, the edge position of the transparent body in the pixel arrangement direction of the line sensor; and
full shading state determining means for obtaining, at the time when the edge position analyzing means detects the edge position of the transparent body, the total received light amount of the pixels on the free space side of the line sensor corresponding to the edge position of the transparent body, measuring as the total received light amount the sum or the average of the received light amounts of the plurality of pixels constituting the line sensor, comparing it with the previously stored total received light amount in the full light-incident state, and determining that a full shading state caused by the transparent body exists when the total received light amount at the time the edge position of the transparent body is detected is lower than a preset ratio.

3. The edge detection device according to claim 1 or claim 2, wherein the full shading state determining means compares the received light level of the several pixels on the free space side at the side from which edge position detection of the transparent body starts, among the plurality of pixels constituting the line sensor, with the received light level of those pixels in the full light-incident state, and determines that a full shading state caused by the transparent body exists when the received light level of the several pixels at the detection start side is lower than their received light level in the full light-incident state.

4. The edge detection device according to claim 1 or claim 2, wherein the edge position analyzing means reads the received light amount of each pixel of the line sensor in order from the pixel side of the free space that is in the full light-incident state, measures the position of the pixel at which the received light amount has dropped by a predetermined ratio from the full received light state, and detects the edge position of the transparent body on the basis of that pixel position and the drop ratio of the received light amount.

5. The edge detection device according to claim 1 or claim 2, wherein the line sensor and the light source form an optical path that is inclined to the surface of the transparent body.

6. The edge detection device according to claim 1 or claim 2, wherein the edge position analyzing means detects the edge position of the transparent body in the pixel arrangement direction of the line sensor on the basis of the light amount distribution of the Fresnel diffraction at the edge of the transparent body placed in the optical path of the monochromatic light, approximates the change in the received light amount at each pixel of the line sensor produced by the Fresnel diffraction by using an approximation curve function, and, using the approximation curve function, analyzes the position in the pixel arrangement direction on the line sensor at which the light amount becomes a predetermined light amount as the edge position of the transparent body.
TW097135902A 2007-09-28 2008-09-18 Edge detection device TW200921040A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007253661A JP4868597B2 (en) 2007-09-28 2007-09-28 Edge detection device

Publications (2)

Publication Number Publication Date
TW200921040A TW200921040A (en) 2009-05-16
TWI379068B true TWI379068B (en) 2012-12-11

Family

ID=40517008

Family Applications (1)

Application Number Title Priority Date Filing Date
TW097135902A TW200921040A (en) 2007-09-28 2008-09-18 Edge detection device

Country Status (4)

Country Link
JP (1) JP4868597B2 (en)
KR (1) KR101009598B1 (en)
CN (1) CN101398292B (en)
TW (1) TW200921040A (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI464385B (en) * 2012-03-22 2014-12-11 Hiti Digital Inc Detecting device and method for detecting a transparent grating structure
TWI481852B (en) * 2012-03-22 2015-04-21 Hiti Digital Inc Detecting device and method for detecting an edge of transparent material
CN103884277A (en) * 2014-03-10 2014-06-25 杭州电子科技大学 Edge detection device for non-transparent media
JP6329091B2 (en) * 2015-02-19 2018-05-23 アズビル株式会社 Edge detection device
CN108548501A (en) * 2018-05-31 2018-09-18 广州贝晓德传动配套有限公司 Edge of materials position detecting device
CN111768422A (en) * 2020-01-16 2020-10-13 北京沃东天骏信息技术有限公司 Edge detection processing method, device, equipment and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3858994B2 (en) * 2002-11-28 2006-12-20 株式会社山武 Position detection method and apparatus
JP2004226372A (en) * 2003-01-27 2004-08-12 Yamatake Corp Position detection method and apparatus
JP4775946B2 (en) * 2005-08-30 2011-09-21 株式会社山武 Edge detection device

Also Published As

Publication number Publication date
CN101398292B (en) 2010-12-15
TW200921040A (en) 2009-05-16
KR101009598B1 (en) 2011-01-20
KR20090033097A (en) 2009-04-01
CN101398292A (en) 2009-04-01
JP2009085679A (en) 2009-04-23
JP4868597B2 (en) 2012-02-01

Similar Documents

Publication Publication Date Title
TWI379068B (en)
US6781705B2 (en) Distance determination
RU2395171C1 (en) Document size detector
US7684053B2 (en) Optical displacement sensor and distance measuring apparatus
KR0153792B1 (en) Alignment method
JP4775946B2 (en) Edge detection device
JP4445989B2 (en) Edge detection device and edge detection method
JP2012189390A (en) Hair detector
JP2003004743A (en) Chromatographic quantitative measurement apparatus
US9958319B2 (en) Method and device for determining a critical angle of an excitation light beam
JP4864734B2 (en) Optical displacement sensor and displacement measuring apparatus using the same
JP4901246B2 (en) Spectral luminance distribution estimation system and method
US8988746B2 (en) Image scanning device with improved dew condensation detection and correction
TWI739337B (en) Paper discriminating device, white reference data adjustment method, program, and calibration method
JP3495212B2 (en) Seam detection device for ERW pipe
CN117242331A (en) Biological sample measuring device
JP2006275783A (en) Angle measuring device
JP2834618B2 (en) Document size detection sensor of document reading device
JP4218880B2 (en) Edge sensor diagnostic method and diagnostic apparatus
JP5274031B2 (en) Analysis method and analyzer
JP6329091B2 (en) Edge detection device
JP3631069B2 (en) Disk surface defect inspection system
JP2006268482A (en) Coin discriminating device
JPH0694630A (en) Inspecting apparatus for extraneous substance
JP5532792B2 (en) Surface inspection apparatus and surface inspection method