TW594591B - Image-based protruded part recognizing and tracking method - Google Patents

Image-based protruded part recognizing and tracking method

Info

Publication number
TW594591B
Authority
TW
Taiwan
Prior art keywords
image
pixel
distance
center
gravity
Prior art date
Application number
TW91132497A
Other languages
Chinese (zh)
Other versions
TW200407794A (en)
Inventor
Der-Yun Yang
Shyue-Wu Wang
Original Assignee
Inst Information Industry
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Inst Information Industry filed Critical Inst Information Industry
Priority to TW91132497A priority Critical patent/TW594591B/en
Publication of TW200407794A publication Critical patent/TW200407794A/en
Application granted granted Critical
Publication of TW594591B publication Critical patent/TW594591B/en

Landscapes

  • Image Analysis (AREA)

Abstract

There is provided an image-based protruded part recognizing and tracking method, which comprises: converting an image to be represented in the HSV image format; recognizing a feature region in the image according to a hue range and capturing the region contour corresponding to the feature region; and, for every pixel on the region contour, determining whether the distance from the pixel to the center of gravity of the feature region is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity. If the distance from the pixel to the center of gravity is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity, the pixel is determined to be a protruded part of the feature region.
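The hue-range recognition step above labels a pixel as part of the feature region when its hue lies between a predefined minimum and maximum. A minimal Python sketch of that rule, assuming hue is expressed in degrees and using illustrative skin-tone bounds (the patent leaves the actual Hue_Min and Hue_Max values to a predefined skin-tone range):

```python
# Illustrative hue bounds in degrees; not taken from the patent.
HUE_MIN, HUE_MAX = 0.0, 50.0

def classify_pixel(hue):
    """Foreground when HUE_MIN <= hue <= HUE_MAX, background otherwise."""
    return "foreground" if HUE_MIN <= hue <= HUE_MAX else "background"

def segment(hue_image):
    """Apply the per-pixel rule to a 2-D grid of hue values."""
    return [[classify_pixel(h) for h in row] for row in hue_image]

# Two skin-like hues (20, 45 degrees) and two non-skin hues (120, 300).
mask = segment([[20.0, 120.0], [45.0, 300.0]])
```

The region contour and its center of gravity are then computed from the foreground pixels.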

Description

594591 V. Description of the Invention (1)

The present invention relates to an image recognition and tracking method, and more particularly to an image-based protruded part recognizing and tracking method that, based on image hue and on the characteristics of the protruded parts, effectively recognizes and tracks the protruded parts in an image.

For the developer of an application, besides the functionality of the application itself, the interaction interface between the program and the user is a key consideration. The conventional interaction interface between a program and its user consists of a keyboard and a mouse, through which the user issues commands to the application to perform the related operations and computations.

In addition, with continuing innovation and research, a new generation of human-machine interfaces based on the human body has been under development. Because the technology is still immature, this new generation of hand-based interfaces is restricted to particular operating environments; for example, the hand must be held against a reflective plane to reduce environmental interference, and such a restriction lowers the convenience of use. Moreover, when a hand-based interface is used, a background model must be established in advance for subsequent segmentation, which adds further inconvenience.

Furthermore, if the skeleton is used as the feature for gesture recognition, the result is easily affected by noise; if the contour is used for recognition instead, it is rather difficult to completely segment the contour of the hand from that of the body.

In view of this, the main object of the present invention is to provide a method that, based on image hue and on the characteristics of the protruded parts in a feature region, effectively recognizes and tracks the protruded parts in an image. To achieve the above object, the present invention provides the following image-based protruded part recognizing and tracking method.

According to the embodiment of the present invention, in the image-based protruded part recognizing and tracking method, an image is first converted to be represented in the hue, saturation, value (HSV) image format. The feature region in the image is recognized according to a hue range, and the region contour corresponding to the feature region is captured. Then, for every pixel on the region contour, it is determined whether the distance from the pixel to the center of gravity of the feature region is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity. If it is, the pixel is determined to be a protruded part of the feature region.

In addition, the present invention further receives another image and performs the protruded part recognition described above on it, thereby recognizing the protruded part in this image. Furthermore, if the difference between the two images in the angle between the horizontal line and the line connecting the protruded part and the center of gravity is smaller than a predetermined angle threshold, and the difference between the two images in the distance from the protruded part to the center of gravity is larger than a predetermined distance threshold, a trigger message is sent to trigger a specific application or function command.

Embodiment

Fig. 1 is a flowchart showing the operation of the image-based protruded part recognizing and tracking method according to the embodiment of the present invention. Note that this embodiment takes the recognition and tracking of the protruded part of a hand region (the fingertip) as its example, but is not limited thereto.

First, in step S10, an image is received, and in step S11 the image is converted to be represented in the hue, saturation, value (HSV) image format. Note that if the received image is already represented in the HSV image format, step S11 may be omitted.

Next, in step S12, the feature region in the image is recognized according to a hue range, and in step S13 the region contour corresponding to the feature region is captured. Since this embodiment recognizes the fingertip of a hand region, the feature region is the hand region and the hue range is a skin-tone hue range. Let Pixel(i,j) denote the hue information of the pixel at coordinates (i,j) in the image; the feature region is then segmented using the predefined maximum hue value Hue_Max and minimum hue value Hue_Min (the hue range) as follows:

Pixel(i,j) = Foreground, if Hue_Min ≤ Pixel(i,j) ≤ Hue_Max
Pixel(i,j) = Background, otherwise

When the hue information of a pixel falls within the hue range, the pixel belongs to the feature region; otherwise, it does not.

Thereafter, in step S14, for a pixel on the region contour, it is determined whether the distance from the pixel to the center of gravity of the feature region is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity. If the distance from the pixel to the center of gravity is not larger than the distance from the preceding pixel or the succeeding pixel to the center of gravity (No in step S14), the determination of step S16 is performed (described later). If the distance from the pixel to the center of gravity is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity (Yes in step S14), then in step S15 the pixel is determined to be a protruded part of the feature region, namely the fingertip.

Fig. 2 is a schematic diagram illustrating the detection of a protruded part. In this example, Pixel_Center (P_CC) represents the center of gravity of the feature region, Pixel_Contour(i) (P_C(i)) represents the i-th contour point (pixel) on the region contour, and Distance(i) = ||Pixel_Contour(i) − Pixel_Center|| represents the distance from the i-th contour point to the center of gravity. The formula for determining whether a contour point is a relative extremum (protruded part) is as follows:

Pixel_Contour(i) = ExtremaPoint, if Distance(i−1) < Distance(i) & Distance(i+1) < Distance(i)

where ExtremaPoint denotes a feature point (protruded part). When the distance from a contour point to the center of gravity is larger than the distances from its neighboring contour points to the center of gravity, the contour point is determined to be a feature point.

Thereafter, in step S16, it is determined whether the protruded part determination has been performed for all pixels on the region contour. If it has not (No in step S16), then in step S17 the process jumps to the next pixel on the region contour and returns to step S14, continuing to determine whether the distance from that pixel to the center of gravity of the feature region is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity. Once all pixels on the region contour have been examined and the protruded part is to be tracked, another image is received, and the feature region in this image can be recognized within a predetermined region range according to the hue range (the skin-tone hue range).
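The relative-extremum test described above, which marks a contour pixel whose distance to the center of gravity exceeds that of both neighbors, can be sketched in Python as follows. The Euclidean distance and the simple vertex-average center of gravity are assumptions made here for illustration; the patent defines Distance(i) but does not specify how the center of gravity is computed:

```python
import math

def centroid(points):
    """Average of the contour points, used here as the center of gravity."""
    n = len(points)
    return sum(x for x, _ in points) / n, sum(y for _, y in points) / n

def find_extrema(contour):
    """Return contour points whose distance to the centroid is strictly
    larger than that of both neighbors (the contour is closed, so the
    indices wrap around)."""
    cx, cy = centroid(contour)
    dist = [math.hypot(x - cx, y - cy) for x, y in contour]
    n = len(contour)
    return [contour[i] for i in range(n)
            if dist[i - 1] < dist[i] and dist[(i + 1) % n] < dist[i]]

# A diamond with one stretched vertex: only the far tip (10, 0) sticks out.
tips = find_extrema([(10, 0), (0, 3), (-1, 0), (0, -3)])
```

On a real hand contour, each such extremum corresponds to a fingertip candidate.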

The predetermined region range corresponds to the region containing all the protruded parts of the feature region in the preceding image, together with the positions displaced from this region by a predetermined displacement. Note that the purpose of recognizing the feature region within the predetermined region range is to avoid searching the entire image, which would consume most of the computation; apart from the initial image, which must be scanned in full, once the feature is found, the subsequent images can recognize the feature according to the predetermined region range.

Fig. 3a shows the feature region range 30 detected in a first image and the feature points a, b, c, d, e and f. In the figure, the feature region range 30 contains all the feature regions 31 and has height H and width W. Since the hand moves only within a limited range between two consecutive images, a nearby block range can be defined for the search. As shown in Fig. 3b, given a predetermined displacement d around the feature region range 30, the scan region 32 has height (H + 2d) and width (W + 2d).

Note that before the recognition of an image, either the entire image may be converted to be represented in the hue, saturation, value (HSV) image format, or only the predefined region range of the image may be so converted.

As for the trigger mechanism, after the feature points are recognized, the change of the feature point positions is taken as the feature movement to determine whether to trigger. Fig. 4 is a schematic diagram illustrating the recognition of the trigger mechanism.
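The (H + 2d) × (W + 2d) scan region described above can be computed from the bounding box of the previous frame's feature region range. Clamping to the frame boundary is a practical assumption added here; the patent does not discuss image borders:

```python
def scan_region(x, y, w, h, d, frame_w, frame_h):
    """Grow the (x, y, w, h) bounding box by the displacement d on every
    side and clamp it to the frame, giving the region searched in the
    next image."""
    left = max(0, x - d)
    top = max(0, y - d)
    right = min(frame_w, x + w + d)
    bottom = min(frame_h, y + h + d)
    return left, top, right - left, bottom - top

# A 100x60 feature region range at (50, 40) with d = 10 in a 640x480
# frame: the scan region grows to 120x80.
region = scan_region(50, 40, 100, 60, 10, 640, 480)
```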

Let Mag(i) denote the distance from the feature point to the center of gravity, and let Theta(i) denote the angle between the horizontal line and the line connecting the feature point and the center of gravity. If the position change of the feature point satisfies the following condition, the trigger mechanism is activated (similar to the action of pressing the left mouse button):

Trigger = true, if (abs(Theta(i) − Theta(i−1)) < AngleThreshold) & ((Mag(i) − Mag(i−1)) > MagThreshold)

where AngleThreshold is the predetermined angle threshold and MagThreshold is the predetermined distance threshold. In other words, if the difference between the angle between the horizontal line and the line connecting the feature point (protruded part) and the center of gravity in the second image and the corresponding angle in the first image is smaller than the predetermined angle threshold, and the difference between the distance from the protruded part to the center of gravity in the second image and that distance in the first image is larger than the predetermined distance threshold, the trigger mechanism is activated and a trigger message is sent to trigger a specific application or function command.

Note that the condition for activating the trigger mechanism may be designed or modified according to different applications and is not limited to the rule described here. In addition, since a feature region may contain several feature points, the feature points may be filtered further, so that different feature points correspond to different trigger contents and trigger conditions.

Thus, by the image-based protruded part recognizing and tracking method provided by the present invention, the protruded parts in an image can be effectively recognized and tracked on the basis of image hue and of the characteristics of the protruded parts in the feature region.
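The trigger condition can be sketched directly from the formula above; the threshold values used here are illustrative, as the patent leaves AngleThreshold and MagThreshold to the application:

```python
ANGLE_THRESHOLD = 10.0  # degrees, illustrative value
MAG_THRESHOLD = 15.0    # pixels, illustrative value

def should_trigger(theta_prev, theta_cur, mag_prev, mag_cur):
    """Trigger when the feature point keeps roughly the same direction
    (|Theta(i) - Theta(i-1)| < AngleThreshold) while moving away from
    the center of gravity (Mag(i) - Mag(i-1) > MagThreshold)."""
    return (abs(theta_cur - theta_prev) < ANGLE_THRESHOLD
            and (mag_cur - mag_prev) > MAG_THRESHOLD)

# Fingertip keeps its direction (42 -> 45 degrees) while extending by
# 20 pixels: the trigger fires, much like a mouse-button press.
fired = should_trigger(42.0, 45.0, 100.0, 120.0)
```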

594591594591

一此外,本發明具有下述優點:第一,使用色骑資訊 打分^ i可以減少受光線影響的程度,另外,因為將切 直接設定在欲搜尋之特徵區域(本實施例為手部)的色調 部區域内,即使在非特定環境下也可以不經過建立背景 型的步驟直接將特徵區域分割出來;第二,在本實施例 中,以=指指尖為特徵點除了具有明確易於辨識的優點 外,在定位與追蹤階段的處理較以輪廓或骨幹為基礎的 式簡單與節省系統資源。 十雖然本發明已以較佳實施例揭露如上,然其並非用 限定本發明,任何熟悉此項技藝者,在不脫離本發明之 神和範圍内,當可做些許更動與濁飾,因此笨-發明之保 範圍當視後附之申請專利範圍所界定者為準。 進 割 分 模 方 以 精 護In addition, the present invention has the following advantages: first, the use of color riding information to score ^ i can reduce the degree of influence by light, and because the cut is directly set in the feature area to be searched (in this embodiment, the hand) In the hue region, even in a non-specific environment, the feature region can be directly segmented without going through the steps of establishing a background type. Second, in this embodiment, the feature points with = fingertips in addition to having clearly identifiable In addition to the advantages, the processing in the positioning and tracking phase is simpler and saves system resources than the contour or backbone based approach. Although the present invention has been disclosed in the preferred embodiment as above, it is not intended to limit the present invention. Anyone skilled in the art can make some changes and turbid decorations without departing from the spirit and scope of the present invention. -The scope of protection of the invention shall be determined by the scope of the attached patent application. Cut into the mold side for precise protection

Brief Description of the Drawings

To make the above objects, features and advantages of the present invention clearer and easier to understand, an embodiment is described in detail below with reference to the accompanying drawings:

Fig. 1 is a flowchart showing the operation of the image-based protruded part recognizing and tracking method according to the embodiment of the present invention.
Fig. 2 is a schematic diagram illustrating the detection of a protruded part.
Fig. 3a shows the feature region range in the first image.
Fig. 3b shows the image region range for detecting the feature region in the second image.
Fig. 4 is a schematic diagram illustrating the recognition of the trigger mechanism.

Description of the Symbols
S10, S11, …, S17: operation steps;
a, b, …, f: feature points (protruded parts);
30: feature region range;
31: feature region;
32: scan region.


Claims (12)

1. An image-based protruded part recognizing and tracking method, comprising the following steps: receiving a first image; converting the first image to be represented in a hue, saturation, value (HSV) image format; recognizing a first feature region in the first image according to a hue range; capturing a first region contour corresponding to the first feature region; and, for every pixel on the first region contour, determining whether the distance from the pixel to the center of gravity of the first feature region is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity, and, if the distance from the pixel to the center of gravity is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity, determining that the pixel is a protruded part of the first feature region.

2. The image-based protruded part recognizing and tracking method as claimed in claim 1, further comprising the following steps: receiving a second image; recognizing a second feature region within a predetermined region range of the second image according to the hue range, wherein the predetermined region range corresponds to the region containing the protruded parts of the first feature region in the first image and the positions displaced from this region by a predetermined displacement; capturing a second region contour corresponding to the second feature region; and, for every pixel on the second region contour, determining whether the distance from the pixel to the center of gravity of the second feature region is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity, and, if the distance from the pixel to the center of gravity is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity, determining that the pixel is a protruded part of the second feature region.

3. The image-based protruded part recognizing and tracking method as claimed in claim 2, further comprising converting the second image to be represented in the hue, saturation, value (HSV) image format.

4. The image-based protruded part recognizing and tracking method as claimed in claim 2, further comprising converting the predetermined region range of the second image to be represented in the hue, saturation, value (HSV) image format.

5. The image-based protruded part recognizing and tracking method as claimed in claim 2, further comprising: if the difference between the angle between the horizontal line and the line connecting the protruded part and the center of gravity in the second image and the corresponding angle in the first image is smaller than a predetermined angle threshold, and the difference between the distance from the protruded part to the center of gravity in the second image and that distance in the first image is larger than a predetermined distance threshold, sending a trigger message.

6. The image-based protruded part recognizing and tracking method as claimed in claim 1, wherein the hue range is a skin-tone hue range.

7. The image-based protruded part recognizing and tracking method as claimed in claim 1, wherein the first feature region is a hand region.

8. The image-based protruded part recognizing and tracking method as claimed in claim 1, wherein the protruded part is a fingertip portion.

9. An image-based protruded part recognizing and tracking method, comprising the following steps: receiving a first image represented in a hue, saturation, value (HSV) image format; recognizing a hand region in the first image; capturing a hand contour of the hand region; and, for every pixel on the hand contour, determining whether the distance from the pixel to the center of gravity of the hand region is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity, and, if the distance from the pixel to the center of gravity is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity, determining that the pixel is a protruded part of the hand region.

10. The image-based protruded part recognizing and tracking method as claimed in claim 9, further comprising the following steps: receiving a second image represented in the hue, saturation, value (HSV) image format; recognizing a hand region within a predetermined region range of the second image, wherein the predetermined region range corresponds to the region containing the protruded parts of the hand region in the first image and the positions displaced from this region by a predetermined displacement; capturing the hand contour of the hand region; and, for every pixel on the hand contour, determining whether the distance from the pixel to the center of gravity of the hand region is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity, and, if the distance from the pixel to the center of gravity is larger than the distances from the preceding pixel and the succeeding pixel to the center of gravity, determining that the pixel is a protruded part of the hand region.

11. The image-based protruded part recognizing and tracking method as claimed in claim 10, further comprising: if the difference between the angle between the horizontal line and the line connecting the protruded part and the center of gravity in the second image and the corresponding angle in the first image is smaller than a predetermined angle threshold, and the difference between the distance from the protruded part to the center of gravity in the second image and that distance in the first image is larger than a predetermined distance threshold, sending a trigger message.

12. The image-based protruded part recognizing and tracking method as claimed in claim 10, wherein the protruded part is a fingertip portion.
TW91132497A 2002-11-04 2002-11-04 Image-based protruded part recognizing and tracking method TW594591B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW91132497A TW594591B (en) 2002-11-04 2002-11-04 Image-based protruded part recognizing and tracking method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW91132497A TW594591B (en) 2002-11-04 2002-11-04 Image-based protruded part recognizing and tracking method

Publications (2)

Publication Number Publication Date
TW200407794A TW200407794A (en) 2004-05-16
TW594591B true TW594591B (en) 2004-06-21

Family

ID=34075615

Family Applications (1)

Application Number Title Priority Date Filing Date
TW91132497A TW594591B (en) 2002-11-04 2002-11-04 Image-based protruded part recognizing and tracking method

Country Status (1)

Country Link
TW (1) TW594591B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8699748B2 (en) 2010-10-13 2014-04-15 Industrial Technology Research Institute Tracking system and method for regions of interest and computer program product thereof

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7508994B2 (en) * 2005-12-05 2009-03-24 Eastman Kodak Company Method for detecting streaks in digital images
TWI408610B (en) * 2009-12-30 2013-09-11 Ind Tech Res Inst Methods and systems for gesture recognition, and computer program products thereof
CN113807364A (en) * 2021-09-08 2021-12-17 国网内蒙古东部电力有限公司兴安供电公司 A method and system for defect detection of power equipment based on three-light fusion imaging
CN114943703B (en) * 2022-05-24 2023-09-05 闫雪 Multi-component P-map region analysis system


Also Published As

Publication number Publication date
TW200407794A (en) 2004-05-16

Similar Documents

Publication Publication Date Title
JP5755664B2 (en) Image feature detection based on the application of multiple feature detectors
CN103135883B (en) Control the method and system of window
US10108860B2 (en) Systems and methods for generating composite images of long documents using mobile video data
JP5685837B2 (en) Gesture recognition device, gesture recognition method and program
JP6007497B2 (en) Image projection apparatus, image projection control apparatus, and program
US9298365B2 (en) Storage medium, information processing apparatus and character recognition method
JP2000105829A (en) Face part detection method and apparatus
WO2009114967A1 (en) Motion scan-based image processing method and device
EP2628134A1 (en) Text-based 3d augmented reality
CN106125932A (en) A method, device, and mobile terminal for identifying target objects in augmented reality
WO2021121302A1 (en) Video collection control method, electronic device, and computer-readable storage medium
JP2015041279A (en) Image processing apparatus, image processing method, and image processing program
CN105830091A (en) Systems and methods for generating composite images of long documents using mobile video data
JP2019109624A (en) Information processing apparatus, program, and information processing method
TW594591B (en) Image-based protruded part recognizing and tracking method
WO2014079058A1 (en) Method and system for processing video image
JP6455186B2 (en) Fingertip position estimation device, fingertip position estimation method, and program
JP2016099643A (en) Image processing device, image processing method, and image processing program
JP5971108B2 (en) Image processing apparatus, image processing method, and image processing program
JP7009904B2 (en) Terminal devices, information processing systems, information processing methods and programs
WO2022111461A1 (en) Recognition method and apparatus, and electronic device
CN103870809B (en) The detection method and device of vehicle
JPWO2021192206A5 (en) Image recognition system, image recognition method and image recognition program
JP2010061409A (en) Image processing program and image processing system
JP2001331804A (en) Image region detecting apparatus and method

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees