TW200911195A - Eye recognition/tracing method applied in personnel supervision, and apparatus - Google Patents

Eye recognition/tracing method applied in personnel supervision, and apparatus

Info

Publication number
TW200911195A
TW200911195A TW096133980A TW96133980A
Authority
TW
Taiwan
Prior art keywords
eye
tracking
image
item
area
Prior art date
Application number
TW096133980A
Other languages
Chinese (zh)
Other versions
TWI411425B (en)
Inventor
Wang-Xuan Li
Ming-Cong Weng
Original Assignee
Automotive Res & Testing Ct
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Automotive Res & Testing Ct filed Critical Automotive Res & Testing Ct
Priority to TW96133980A priority Critical patent/TWI411425B/en
Publication of TW200911195A publication Critical patent/TW200911195A/en
Application granted granted Critical
Publication of TWI411425B publication Critical patent/TWI411425B/en

Landscapes

  • Image Analysis (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present invention discloses an eye recognition/tracking method applied in personnel monitoring, and an apparatus therefor. The apparatus includes an infrared illuminator, a camera, a filter, and a processing unit. The method comprises a pre-processing flow, a detection-mode flow, and a tracking-mode flow. The pre-processing flow includes an image-capturing step and an image-sequencing step, which select one frame from the video obtained under the infrared light source. The detection-mode flow consists of a candidate-region search step, a candidate-region analysis step, and an eye-region confirmation step, which locate the eye region in the frame and analyze it. The tracking-mode flow is composed of an eye-region tracking step and a tracking-judgment step, which evaluate whether the eye region needs to be re-located. Through these flows and steps, the method locates the eyes simply and reliably.
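The three flows interact as a simple mode switch: detection runs until an eye region is confirmed, after which tracking repeats until it fails and detection restarts. A minimal sketch of that control loop follows; `detect` and `track` are illustrative stand-ins for the detection-mode and tracking-mode flows, not functions named in the patent:

```python
# Sketch of the detection/tracking mode switch described in the abstract.
# detect() and track() stand in for the patent's detection-mode and
# tracking-mode flows; both names are illustrative.

def supervise(frames, detect, track):
    """Process frames, switching between detection and tracking modes."""
    eye_region = None
    states = []                      # record which mode handled each frame
    for frame in frames:
        if eye_region is None:       # detection mode: search the whole frame
            eye_region = detect(frame)
            states.append("detect")
        else:                        # tracking mode: follow the known region
            eye_region = track(frame, eye_region)  # None on tracking failure
            states.append("track")
    return states

# Toy drivers: a positive frame value means the eye is visible there.
detect = lambda f: f if f > 0 else None
track = lambda f, r: f if f > 0 else None

print(supervise([0, 3, 4, 0, 5], detect, track))
# → ['detect', 'detect', 'track', 'track', 'detect']
```

The fourth frame loses the eye, so the fifth frame falls back to detection, matching the tracking-judgment step's restart rule.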

Description

200911195

IX. Description of the Invention

[Technical Field of the Invention]

The present invention relates to an eye recognition and tracking method and apparatus applied to personnel monitoring, and more particularly to an image-processing method and apparatus for detecting and tracking the eyes.

[Prior Art]

A processing unit compares the features and details of consecutive images and, through preset procedures and analyses, extracts information from those images as one link in an automated control flow. Because the processing and judgment of human facial images has a wide range of applications, it has become a popular and important technology. Facial-image processing is mainly applied to tracking and monitoring a driver's eyes to prevent the hazards of drowsy driving, but it is not limited to this: other kinds of personnel monitoring and inspection can likewise apply the technique and gain the benefits of automation, which shows the importance and practicality of image tracking.

ROC Patent Publication No. 436436, a method and apparatus for continuously monitoring a vehicle driver, performs motion analysis on two consecutive video frames to detect the driver's horizontal head movement and keep the face centered in the corresponding view frame, then analyzes the vertical displacement of the centered face in two further consecutive frames to find the eye positions.

ROC Patent Publication No. 531402, an eye tracking method, locates the eye region from accumulated iris features.

ROC Patent Certificate No. I225222, a face-stabilization algorithm for real-time image streams, receives facial image data in the YCbCr color space and combines it with motion analysis to judge the possible eye positions.

ROC Patent Publication No. 200719247, an eye-control system for a display device, compares the captured facial image with preset face data in a database to identify the eye positions.

ROC Patent Publication No. 200719871, a real-time face-detection method suited to complex backgrounds, receives facial image data in the YCbCr color space and combines it with support-vector analysis to judge the possible eye positions.

From the above techniques it can be seen that, for the human face, the eyes have become the principal recognition target. The conventional recognition methods nevertheless remain cumbersome: some must capture full facial images or the iris, and some must work in the YCbCr color space. None of the prior art provides a simple and fast way to locate the eyes, and only with expensive hardware can the conventional techniques be applied, so their convenience and applicability are poor.

In addition, the tracking approaches of the various conventional techniques are easily limited by the weather, the ambient brightness, and items worn by the user (such as glasses or sunglasses), so effective tracking performance and accuracy are hard to achieve. The prior art is therefore of limited practical value and still needs improvement.

[Summary of the Invention]

The main object of the present invention is to provide an eye recognition and tracking method and apparatus applied to personnel monitoring that remedies the shortcomings of the conventional approaches: overly cumbersome recognition and tracking, high hardware requirements, inconvenient use, and susceptibility to weather and to items worn by the user.

To achieve the foregoing object, the apparatus comprises an infrared illuminator, a camera, a filter, and a processing unit, and the method comprises a pre-processing flow, a detection-mode flow, and a tracking-mode flow, wherein:

The pre-processing flow includes an image-capturing step, in which infrared images are captured under the infrared light source as the video source, and an image-sequencing step, in which the captured video is decomposed into a number of individual frames for subsequent processing.

The detection-mode flow includes a candidate-region search step, in which the captured frame is compared with a template to pick out candidate regions that may be eyes; a candidate-region analysis step, in which grouping and geometric relationships are used to confirm the eye region among the candidates; and an eye-region confirmation step, in which the region is compared with a real-eye template to improve accuracy.

The tracking-mode flow includes an eye-region tracking step, in which the eye region is tracked by object tracking, and a tracking-judgment step: if the selected eye region cannot be tracked, the method restarts from the image-capturing step; if it can be tracked, the method restarts from the image-capturing step but skips the candidate-region search and candidate-region analysis steps.

By the method described above, infrared imaging serves as the video source, so the system can be used in all weather and is protected from interference by glasses or sunglasses worn by the user. The detection-mode flow processes a single frame, which saves hardware resources and speeds up processing, and the tracking-mode flow skips the detection steps whenever tracking succeeds, further lightening the load on the processing unit. The invention is therefore a practical and progressive creation worth promoting to industry and the public.

[Embodiments]

The present invention relates to an eye recognition and tracking method and apparatus applied to personnel monitoring. The apparatus comprises an infrared illuminator, a camera, a filter, and a processing unit. The infrared illuminator emits an infrared light source, and the camera, which is fitted with the filter and connected to the processing unit, blocks light other than infrared and captures infrared images as video for the processing unit. The processing unit is divided into an image-processing unit and an image-recognition unit, so that the video can be decomposed into single frames for the subsequent recognition and tracking flows.

Referring to Fig. 1, the eye recognition and tracking method applied to personnel monitoring comprises A. a pre-processing flow, B. a detection-mode flow, and C. a tracking-mode flow; the purpose and detailed steps of each flow are described below.

A. Pre-processing flow: the pre-processing flow is carried out with the infrared illuminator and the filtered camera. The illuminator provides the infrared light source, and the camera captures the reflected infrared light to produce infrared images. The flow includes a. an image-capturing step and b. an image-sequencing step.

a. Image-capturing step: with the infrared light source of the illuminator, the filtered camera captures infrared images. The infrared illuminator overcomes the accuracy problems caused by changes in ambient brightness and by occlusion from glasses or sunglasses, and the filter removes light other than infrared, improving the fidelity of the image. The illuminator and camera may be installed anywhere; in a vehicle, for example, they may be placed near the air vents, the dashboard, the rear-view mirror, the sun visor, or the pillars, but are not limited to these positions.

b. Image-sequencing step: this step decomposes the video coming from the infrared images into a number of individual frames and selects one of them as the target of the subsequent steps, saving system requirements and resources. The image-sequencing step is handled by the image-processing unit of the processing unit.

B. Detection-mode flow: the detection-mode flow processes the frame obtained in the image-sequencing step through the image-recognition unit of the processing unit, using c. a candidate-region search step, d. a candidate-region analysis step, and e. an eye-region confirmation step to confirm the correct position of the eye region in the frame.

c. Candidate-region search step: the processing unit performs pattern recognition (Pattern Recognition), comparing the selected frame against a simulated template to make a preliminary selection of candidate regions that are likely to be eyes. The construction of the simulated template, shown in Figs. 2 and 3, is based on the features of the pupil 11 and iris 12 of a human eye 10. As Fig. 2 shows for a normal eye 10, the pupil 11 appears dark, while the iris 12 appears lighter than the pupil. The template built from this feature, shown in Fig. 3, takes the dark area as its center, establishing a center cell 13 surrounded by lighter surrounding cells 14; the center cell 13 and the surrounding cells 14 are arranged roughly in a cross. In performing template recognition, the processing unit therefore uses the pattern of the center cell 13 and surrounding cells 14 to search the frame for every region that satisfies the dark-center/light-surround condition, and lists each selected region as a candidate region. The arrangement of the center cell 13 and the surrounding cells 14 is not limited to a cross or to five cells.
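The dark-center/light-surround test of the cross template can be sketched as a local scan over a grayscale image. The neighbor offset and contrast margin below are illustrative values; the patent does not specify them:

```python
# Scan a grayscale image for pixels that are darker than their four
# cross-arranged neighbours by a margin -- the dark-centre / light-surround
# template of Fig. 3. The offset and margin parameters are illustrative.

def find_candidates(img, offset=2, margin=40):
    """Return (row, col) positions matching the cross template."""
    h, w = len(img), len(img[0])
    hits = []
    for r in range(offset, h - offset):
        for c in range(offset, w - offset):
            centre = img[r][c]
            ring = [img[r - offset][c], img[r + offset][c],
                    img[r][c - offset], img[r][c + offset]]
            # centre (pupil) must be darker than every surrounding cell (iris)
            if all(n - centre >= margin for n in ring):
                hits.append((r, c))
    return hits

# 7x7 toy image: bright background with one dark "pupil" at (3, 3)
img = [[200] * 7 for _ in range(7)]
img[3][3] = 30
print(find_candidates(img))
# → [(3, 3)]
```

Every hit becomes a candidate region for the analysis step that follows; a real implementation would test a full ring of cells rather than four points, as the patent notes the arrangement is not limited to five cells.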

d. Candidate-region analysis step: the previous step preliminarily listed several candidate regions that may be eyes; this step further analyzes the likelihood that each candidate region is an eye and picks out the region most likely to be one. It mainly uses a connected-component labeling (Connected Component Labeling) algorithm together with an analysis of grouping and geometric relationships. The principle of the connected-component algorithm here is to screen the candidate regions selected above by grouping and geometric conditions such as size, shape, and relative position, and to filter out the candidate region with the highest likelihood. Taking Figs. 4 to 6 as an example, Fig. 4 shows a normal face 20 that includes eyes 21 and a nose 22. The previous step preliminarily selects the dark-center/light-surround regions most likely to be eyes, which, as shown in Fig. 5, are a first candidate region 211 and a second candidate region 221. Analyzing grouping and geometric relationships with the connected-component algorithm, the size of the first candidate region 211 better matches the size of an eye, so its likelihood is higher, while the second candidate region 221 is smaller and less likely; the shapes of both regions 211 and 221 roughly match an eye; and the relative position of the first candidate region 211 is plausible, so its likelihood is higher, while the relative position of the second candidate region 221 is too narrow and its likelihood is lower. Through the analysis of these conditions, the first candidate region 211 is determined to be the eye region 21 and is localized as shown in Fig. 6.
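The connected-component labeling with size-based screening described above can be sketched as follows. The breadth-first labeling is standard; the expected-area score is an illustrative stand-in for the patent's size/shape/position conditions:

```python
# Label 4-connected components in a binary mask, then pick the component
# whose pixel count best matches an expected eye-region size.
# The expected-area value is illustrative, not taken from the patent.

from collections import deque

def label_components(mask):
    """Return a list of components, each a list of (row, col) pixels."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    comps = []
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not seen[r][c]:
                comp, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                comps.append(comp)
    return comps

def best_eye_candidate(mask, expect_area=8):
    """Pick the component whose pixel count is closest to expect_area."""
    comps = label_components(mask)
    return min(comps, key=lambda c: abs(len(c) - expect_area))

mask = [
    [0, 1, 1, 0, 0, 0],
    [0, 1, 1, 0, 1, 0],   # left blob: 8 pixels; right blob: 2 pixels
    [0, 1, 1, 0, 1, 0],
    [0, 1, 1, 0, 0, 0],
]
print(len(best_eye_candidate(mask)))   # the 8-pixel blob wins
```

A fuller version would also score aspect ratio and relative position, mirroring the comparison of regions 211 and 221 in Figs. 5 and 6.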
e. Eye-region confirmation step: this step mainly performs a further confirmation and may be omitted to speed up processing. The confirmation step compares the region with a real-eye template to improve accuracy. The real-eye template is a template made from the eyes of ordinary people; it need not be limited to the eyes of the particular user, so accuracy is improved while applicability is preserved.

C. Tracking-mode flow: once the detection-mode flow has used the processing unit to analyze and confirm the eye region in the selected frame, the method enters the tracking-mode flow, which is handled by the image-recognition unit of the processing unit to track the eye region continuously. Until the tracking-mode flow can no longer track the eye region successfully, there is no need to re-enter the detection-mode flow; that is, once the eye region has been successfully confirmed, the method proceeds directly to the tracking-mode flow after the pre-processing flow has selected a frame, saving system resources. The flow includes f. an eye-region tracking step and g. a tracking-judgment step.

f. Eye-region tracking step: once the previous flow has confirmed the eye region in the captured frame, this step uses the processing unit to track it by object tracking (Object Tracking). Many object-tracking methods exist; this embodiment mainly uses the CamShift tracking algorithm, but is not limited to it. The tracking algorithm proceeds as follows:

I. Compute the pixel-distribution information of the whole image.
II. Select an initial region (in this embodiment, the eye region), take a block slightly larger than this region as the search window (Search Window), and compute the pixel distribution inside it.
III. Perform a MeanShift computation to find the center of the search window, redefine the center point, and update the zero-order moment (Zero-Moment) information.
IV. Redefine the center of the search window according to the result of III, resize the window according to the zero-order moment, and then repeat III.

The eye region is tracked by this algorithm, and after each tracking pass the next step is performed.

g. Tracking-judgment step: this step judges whether the tracking of the previous step succeeded. If tracking fails, the eye region has shifted considerably, or the person may have left or closed their eyes, so the method returns to the A. pre-processing flow to re-establish the eye region; if the eye region persistently cannot be re-established, a conventional warning mechanism may give a warning that the person has left or closed their eyes. If tracking succeeds, the method returns to the A. pre-processing flow, skips the B. detection-mode flow, and continues tracking.

The above serves only to explain preferred embodiments of the invention and is not intended to limit the invention in any way; any modification of the invention made in the same creative spirit is intended to fall within its scope. In summary, the eye recognition and tracking method and apparatus applied to personnel monitoring of the present invention, in structural design, ease of use, and cost effectiveness, fully meets the needs of industrial development, and the disclosed structure is an unprecedented innovation; the invention therefore possesses novelty, is more effective than the conventional structures and therefore possesses inventive step, and fully complies with the requirements for an invention patent application under the Patent Act. A patent application is accordingly filed, with a request for early examination and approval.

[Brief Description of the Drawings]

Fig. 1 is a flow chart of the steps of the present invention.
Figs. 2 and 3 are schematic illustrations of the simulated template of the present invention.
Figs. 4 to 6 are schematic illustrations of candidate-region selection and eye-region localization of the present invention.

[Description of Main Reference Numerals]

10 eye; 11 pupil; 12 iris; 13 center cell; 14 surrounding cells; 20 face; 21 eye; 211 first candidate region; 22 nose; 221 second candidate region.
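The CamShift loop of steps I to IV can be sketched with a toy mean-shift pass: the window center moves to the centroid of the pixel weights inside it, and the window is resized from the zero-order moment. The window-sizing rule and weight map below are illustrative, not values from the patent:

```python
# Sketch of the CamShift loop from steps I-IV: a mean-shift pass moves the
# search window to the centroid of the pixel weights inside it, and the
# window is rescaled from the zeroth moment. The sizing rule is illustrative.

def camshift_step(weights, cx, cy, half):
    """One mean-shift pass: return new centre and window half-size."""
    h, w = len(weights), len(weights[0])
    m00 = m10 = m01 = 0.0
    for y in range(max(0, cy - half), min(h, cy + half + 1)):
        for x in range(max(0, cx - half), min(w, cx + half + 1)):
            wgt = weights[y][x]
            m00 += wgt                 # zeroth moment: total mass in window
            m10 += wgt * x
            m01 += wgt * y
    if m00 == 0:
        return None                    # nothing to track: report failure
    nx, ny = round(m10 / m00), round(m01 / m00)
    nhalf = max(1, round(m00 ** 0.5))  # window size from the zeroth moment
    return nx, ny, nhalf

def camshift(weights, cx, cy, half, iters=5):
    """Repeat the pass (step IV) until the iteration budget runs out."""
    for _ in range(iters):
        step = camshift_step(weights, cx, cy, half)
        if step is None:
            return None                # triggers the tracking-judgment restart
        cx, cy, half = step
    return cx, cy

# 9x9 weight map with all mass at (6, 6); start the window at (2, 2)
grid = [[0.0] * 9 for _ in range(9)]
grid[6][6] = 4.0
print(camshift(grid, 2, 2, 4))
# → (6, 6)
```

A `None` return corresponds to the tracking-judgment step's failure branch, where the method falls back to the pre-processing flow; in practice the weight map would be a back-projected histogram of the confirmed eye region.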

Claims (1)

200911195

X. Scope of the Patent Application:

1. An eye recognition and tracking method applied to personnel monitoring, comprising the following steps:
an image-capturing step: capturing video with an infrared light source;
an image-sequencing step: decomposing the captured video into a number of individual frames for subsequent processing;
a candidate-region search step: comparing the captured frame with a template to pick out candidate regions that may be eyes;
a candidate-region analysis step: confirming the eye region among the candidate regions by grouping and geometric relationships;
an eye-region tracking step: tracking the eye region by object tracking; and
a tracking-judgment step: if the selected eye region cannot be tracked, restarting from the image-capturing step; if the selected eye region can be tracked, skipping the candidate-region search step and the candidate-region analysis step and continuing to track the frames read in; whereby the above steps effectively achieve eye tracking with improved accuracy.

2. The eye recognition and tracking method applied to personnel monitoring according to claim 1, wherein the steps are divided by their attributes into a pre-processing flow, a detection-mode flow, and a tracking-mode flow; the pre-processing flow comprises the image-capturing step and the image-sequencing step; the detection-mode flow comprises the candidate-region search step and the candidate-region analysis step; and the tracking-mode flow comprises the eye-region tracking step and the tracking-judgment step.

3. The eye recognition and tracking method applied to personnel monitoring according to claim 2, wherein the detection-mode flow further comprises, after the candidate-region analysis step, an eye-region confirmation step that compares the region with a real-eye template to improve accuracy.

4. The eye recognition and tracking method applied to personnel monitoring according to claim 1, 2, or 3, wherein the candidate-region analysis step is processed by a connected-component labeling algorithm using grouping and geometric relationships, the grouping and geometric relationships including at least one of size, shape, and relative position.

5. The eye recognition and tracking method applied to personnel monitoring according to claim 1, 2, or 3, wherein the image-capturing step is carried out by an infrared illuminator together with a camera fitted with a filter.

6. The eye recognition and tracking method applied to personnel monitoring according to claim 1, 2, or 3, wherein the image-sequencing step is carried out by an image-processing unit.

7. The eye recognition and tracking method applied to personnel monitoring according to claim 1, 2, or 3, wherein the candidate-region search step, the candidate-region analysis step, the eye-region tracking step, and the tracking-judgment step are carried out by an image-recognition unit.

8. The eye recognition and tracking method applied to personnel monitoring according to claim 3, wherein the eye-region confirmation step is carried out by an image-recognition unit.

9. An eye recognition and tracking apparatus applied to personnel monitoring, comprising an infrared illuminator, a camera, and a processing unit, wherein the infrared illuminator provides an infrared light source, and the camera is fitted with a filter and connected to the processing unit to capture infrared images for the processing unit to recognize and track, thereby achieving eye tracking with improved accuracy.

10. The eye recognition and tracking apparatus applied to personnel monitoring according to claim 9, wherein the processing unit comprises an image-processing unit and an image-recognition unit.
TW96133980A 2007-09-12 2007-09-12 Used in the monitoring of the eye to identify the method TWI411425B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW96133980A TWI411425B (en) 2007-09-12 2007-09-12 Used in the monitoring of the eye to identify the method


Publications (2)

Publication Number Publication Date
TW200911195A true TW200911195A (en) 2009-03-16
TWI411425B TWI411425B (en) 2013-10-11

Family

ID=44724618

Family Applications (1)

Application Number Title Priority Date Filing Date
TW96133980A TWI411425B (en) 2007-09-12 2007-09-12 Used in the monitoring of the eye to identify the method

Country Status (1)

Country Link
TW (1) TWI411425B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI469061B (en) * 2012-12-19 2015-01-11 Nat Univ Chung Hsing Applies to eye recognition methods and identification systems

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI288628B (en) * 2005-01-07 2007-10-21 Univ Nat Central Method and apparatus of communication through eye movements and analysis method of communication through eye movements


Also Published As

Publication number Publication date
TWI411425B (en) 2013-10-11

Similar Documents

Publication Publication Date Title
CN102802502B (en) For the system and method for the point of fixation of tracing observation person
US11650659B2 (en) User input processing with eye tracking
JP6322986B2 (en) Image processing apparatus, image processing method, and image processing program
JP3579218B2 (en) Information display device and information collection device
Mehrubeoglu et al. Real-time eye tracking using a smart camera
US10254831B2 (en) System and method for detecting a gaze of a viewer
CN115412743A (en) Apparatus, system, and method for automatically delaying a video presentation
WO2006023647A1 (en) Systeme and method for monitoring training environment
JP5225870B2 (en) Emotion analyzer
JP2010123019A (en) Device and method for recognizing motion
CN108537103B (en) Living body face detection method and device based on pupil axis measurement
CN105765608A (en) Method and apparatus for eye detection from glints
Stanescu et al. Model-free authoring by demonstration of assembly instructions in augmented reality
CN104604219B (en) Image processing apparatus and image processing method
Colombo et al. Robust tracking and remapping of eye appearance with passive computer vision
TW200911195A (en) Eye recognition/tracing method applied in personnel supervision, and apparatus
JP2021077333A (en) Line-of-sight detection method, line-of-sight detection device, and control program
CN101393602B (en) Be applied to the eye identifying method for tracing of personnel control
Das et al. I Cannot See Students Focusing on My Presentation; Are They Following Me? Continuous Monitoring of Student Engagement through “Stungage”
Uhm et al. Improving the robustness of gaze tracking under unconstrained illumination conditions
CN112435347A (en) E-book reading system and method for enhancing reality
US20130282344A1 (en) Systems and methods for simulating accessory display on a subject
JP2021082114A (en) Information processing method, information processing device and control program
Sahay et al. An Efficient Point of Gaze Estimator for Low-Resolution Imaging Systems Using Extracted Ocular Features Based Neural Architecture
Ferhat et al. Eye-tracking with webcam-based setups: Implementation of a real-time system and an analysis of factors affecting performance