KR20160124361A - Hand Feature Extraction Algorithm using Curvature Analysis For Recognition of Various Hand Feature - Google Patents

Hand Feature Extraction Algorithm using Curvature Analysis For Recognition of Various Hand Feature

Info

Publication number
KR20160124361A
KR20160124361A
Authority
KR
South Korea
Prior art keywords
hand
fingers
point
finger
points
Prior art date
Application number
KR1020150054476A
Other languages
Korean (ko)
Other versions
KR101761234B1 (en)
Inventor
조진수
윤홍찬
Original Assignee
가천대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 가천대학교 산학협력단 filed Critical 가천대학교 산학협력단
Priority to KR1020150054476A priority Critical patent/KR101761234B1/en
Publication of KR20160124361A publication Critical patent/KR20160124361A/en
Application granted granted Critical
Publication of KR101761234B1 publication Critical patent/KR101761234B1/en

Links

Images

Classifications

    • G06K9/00355
    • G06K9/00389

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention relates to a curvature-analysis-based hand feature extraction method for recognizing various hand gestures. The method comprises: (a) a preprocessing step of extracting, from an input image, candidate regions presumed to be a hand and locating the hand region among those candidates; and (b) a feature extraction step of extracting, from the image of the hand region, the hand outline, the boundary points between the fingers, the outline feature points, and the center point of the hand, and, based on these, the number of extended fingers and the attachment state of the fingers. A hand feature extraction algorithm based on curvature analysis, which can recognize not only the number of fingers but also whether fingers are attached to one another, is used to extract the features needed for hand gesture recognition. The algorithm detects the hand region in the input image through a color-model-based skin color range filter and labeling, and extracts the number of extended fingers and the attachment of the fingers using the outline, the feature points, and the curvature information derived from them, thereby recognizing various hand gestures. Experimental results show that the recognition rate and frame rate are similar to those of existing algorithms, while the number of gestures that can be defined with the extracted features is about four times that of existing algorithms.

Description

[0001] The present invention relates to a hand feature extraction method based on curvature analysis for recognizing various hand gestures.

More particularly, the present invention relates to a curvature-analysis-based hand feature extraction method that can recognize not only the number of extended fingers but also whether fingers are attached to one another. The method detects the hand region in the input image through a color-model-based skin color range filter and labeling, extracts the number of extended fingers and the attachment of the fingers using the outline, the feature points, and the curvature information derived from them, and thereby recognizes various hand gestures.

Over the past decade, various interaction methods have been introduced and developed to control computers in place of the keyboard or mouse. In the field of the NUI (Natural User Interface), which aims at natural interaction with the computer through the user's own body, gesture recognition technology that analyzes information acquired from cameras and classifies it according to purpose is very convenient and intuitive, and has therefore been studied continuously.

Among gesture recognition technologies, the hand gesture recognition process is divided into two stages: a step of extracting hand features such as the fingers, the palm, and the hand center, and a gesture recognition step of classifying the extracted hand features into target actions. The features obtained in the hand feature extraction step serve as the classification criteria in the subsequent gesture recognition step and play a key role in determining the variety of recognizable gestures. Existing image-based algorithms extract hand features using skin color, the hand center, and outline information. This approach involves fast operations and can therefore be applied not only to general systems but also to small systems such as mobile devices. However, the features extracted by these existing algorithms are limited to the number of extended fingers, which makes it difficult to support the diverse target actions required for interaction with the computer in the gesture recognition step.

Therefore, there is a need for an algorithm that can detect the attachment state of the fingers together with the number of extended fingers, and thereby increase the variety of recognizable hand gestures.

An object of the present invention, devised to solve the problems of the prior art, is to provide a curvature-analysis-based hand feature extraction method for recognizing various hand gestures, which uses a hand feature extraction algorithm that can recognize not only the number of fingers but also the attachment of the fingers. The hand region is detected in the input image through a color-model-based skin color range filter and labeling; the hand outline, the boundary points between the fingers, the outline feature points, and the hand center point are extracted from the hand region; and, based on these, the number of extended fingers and the attachment state of the fingers are extracted so that various hand gestures can be recognized.

In order to achieve the object of the present invention, a curvature-analysis-based hand feature extraction method for recognizing various hand gestures comprises: (a) a preprocessing step of extracting, from the input image, candidate regions presumed to be a hand and locating the hand region among the candidates; and (b) a feature extraction step of extracting, from the image of the hand region, the hand outline, the boundary points between the fingers, the outline feature points, and the center point of the hand, and, based on these, extracting the number of extended fingers and the attachment state of the fingers,

wherein a hand feature extraction algorithm based on curvature analysis, which can recognize not only the number of fingers but also the attachment of the fingers, is used to extract the features required for hand gesture recognition. The hand feature extraction algorithm detects the hand region through a color-model-based skin color range filter and labeling, and recognizes various hand gestures by extracting the number of extended fingers and the attachment state of the fingers using the outline, the feature points, and the curvature information extracted from the input image.

The curvature-analysis-based hand feature extraction method for various hand gesture recognition according to the present invention uses a hand feature extraction algorithm that recognizes not only the number of fingers but also the attachment of the fingers in order to extract the features needed for recognizing various hand gestures. The algorithm detects the hand region in the input image through a color-model-based skin color range filter and labeling, extracts the hand outline, the boundary points between the fingers, the outline feature points, and the hand center point, and from these extracts the number of extended fingers and the attachment state of the fingers, thereby recognizing various hand gestures. Experimental results show that the recognition rate and frame rate are similar to those of existing algorithms, while the number of gestures that can be defined with the extracted features is about four times that of existing algorithms.

FIG. 1 is a diagram illustrating the entire process of the hand feature extraction algorithm;
FIG. 2 is a diagram showing (a) a current input image, (b) an initial input image, and (c) a difference image;
FIG. 3 is a diagram showing (a) the concave decision condition and (b) the extracted boundary points between the fingers;
FIG. 4 is a diagram illustrating the Douglas-Peucker algorithm, which approximates a curve of consecutive points with a straight line by reducing the number of points, used to extract the outline feature points from the outline information obtained in the previous step;
FIG. 5 is a diagram showing finger end point extraction (a) when no fingers are attached and (b) when fingers are attached;
FIG. 6 is a diagram showing attached finger recognition: (a), (b);
FIG. 7 is a diagram showing the attached finger recognition procedure of the proposed algorithm;
FIG. 8 is a diagram showing the case where no fingers are attached: (a) original image, (b) existing algorithm, (c) proposed algorithm;
FIG. 9 is a diagram showing the case where attached fingers are present: (a) original image, (b) existing algorithm, (c) proposed algorithm; and
FIG. 10 is a diagram showing an example of feature extraction using the proposed algorithm.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

II. Curvature Analysis Based Hand Feature Extraction Algorithm

FIG. 1 is a diagram illustrating an entire process of a hand feature extraction algorithm.

The entire process of the hand feature extraction algorithm proposed in the present invention is divided into two stages: first, a preprocessing step of extracting the hand image needed for feature extraction from the input image; second, a feature extraction step of extracting the number of extended fingers and the finger attachment state from the hand image.

Hereinafter, the preprocessing step and feature extraction step using a curvature analysis based hand feature extraction algorithm will be described in detail.

1. Pre-processing step

The preprocessing step finds the hand region among the many objects in the input image. To do so, it extracts candidate regions presumed to be the hand and then extracts the hand region from among those candidates.

FIG. 2 is a diagram showing (a) a current input image, (b) an initial input image, and (c) a difference image.

1.1 Hand candidate extraction

In the first step of hand candidate extraction, a difference image between the current input image and the initial input image is computed (FIG. 2) in order to extract objects that include the moving hand. In the second step, in order to extract the skin-colored hand candidates among the moving objects, a binarized hand candidate image is extracted from the difference image through a skin color range filter based on Cb and Cr, excluding Y, the luminance component of the YCbCr color model.

The binarized hand candidate image extracted through the YCbCr color model is then subjected to a closing operation (dilation followed by erosion in morphological processing) in order to fill in parts of the hand lost to noise or to holes caused by shadows.
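A minimal Python/OpenCV sketch of this preprocessing stage is shown below, assuming typical Cb/Cr skin-color bounds and a small structuring element; these values are illustrative assumptions rather than parameters specified in the disclosure.

import cv2
import numpy as np

def extract_hand_candidates(initial_bgr, current_bgr,
                            cb_range=(77, 127), cr_range=(133, 173),
                            kernel_size=5):
    # 1) Difference image between the current frame and the initial frame,
    #    so that only moving objects such as the hand remain.
    diff = cv2.absdiff(current_bgr, initial_bgr)
    moving = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, moving = cv2.threshold(moving, 25, 255, cv2.THRESH_BINARY)

    # 2) Skin color range filter on Cb and Cr only (Y, the luminance, is ignored).
    ycrcb = cv2.cvtColor(current_bgr, cv2.COLOR_BGR2YCrCb)
    _, cr, cb = cv2.split(ycrcb)
    skin = ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1])).astype(np.uint8) * 255

    # Keep pixels that are both moving and skin colored.
    candidate = cv2.bitwise_and(moving, skin)

    # 3) Closing (dilation followed by erosion) to fill holes inside the hand
    #    caused by noise or shadows.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (kernel_size, kernel_size))
    return cv2.morphologyEx(candidate, cv2.MORPH_CLOSE, kernel)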

1.2 Hand region extraction

The hand candidate image processed by the skin color filter contains various noise besides the hand region. Because the hand falls within the skin color filter range, it occupies a large area in the image. The present invention therefore extracts each group of connected pixels using a connected component labeling algorithm, which has an object detection effect in the image, and removes unnecessary noise by keeping only the largest group, thereby obtaining the binarized hand region image.
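The labeling step can be sketched as follows, assuming OpenCV's connected component analysis stands in for the labeling algorithm; only the largest skin-colored group is kept as the hand region.

import cv2
import numpy as np

def largest_component(binary_mask):
    # Label the connected groups of pixels; label 0 is the background.
    num, labels, stats, _ = cv2.connectedComponentsWithStats(binary_mask, connectivity=8)
    if num <= 1:
        return np.zeros_like(binary_mask)
    # Keep only the group with the maximum area (the hand), removing the remaining noise.
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return np.where(labels == largest, 255, 0).astype(np.uint8)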

2. Feature Extraction Step

The feature extraction step extracts the characteristics of the hand from the image of the hand region. It extracts the hand outline, the boundary points between the fingers, the outline feature points, and the center point of the hand, and based on these extracts the number of extended fingers and the attachment state of the fingers.

2.1 Curvature information and boundary point extraction between fingers

The first stage of the feature extraction step obtains the boundary points between the fingers, which play a key role in recognizing finger attachment by using the features that appear when fingers are attached. The curvature information of the hand outline is extracted, and the boundary points between the fingers are detected based on this information.

First, in order to extract the curvature information of the hand outline, the curvature center point and the outline used in the curvature determination must be obtained. First, the Snake (Active Contour Model) algorithm is applied to the binarized hand image to obtain the outline representing the hand shape. Second, the part of the hand shape that tends most strongly toward a straight line is the wrist line; the boundaries between the fingers also contain straight segments, but because of the finger joints these segments are not long. Based on this characteristic, the outline segment with the longest length and no change in flexion is recognized as the wrist line region. The midpoint between the two end points of the segment determined to be the wrist line region is taken as the wrist center point (P wrist center) and designated as the curvature center used for the curvature determination. The curvature information of the hand is then extracted using this curvature center and the hand outline. Three points P n-k, P n, and P n+k, spaced at the interval K (a threshold value), are taken successively from the points constituting the outline, and the concavity of the curvature at P n, concave(P n), is determined through the condition of Equation (1). The concave decision condition compares the distances from the three points to the curvature center point and the angle formed by the points.

Figure pat00001

Here, P random contour denotes an arbitrary point of the hand outline, P wrist center is the wrist center point between the two end points of the segment determined to be the wrist line region, and concave(P n) indicates whether the curvature at P n is concave (1) or not (0).

Viewed from the wrist center, the hand shape is convex in the fingertip regions and concave in the regions between the fingers. Many concave points appear in the point region between spread fingers, but only two of them are used: the boundary points between the fingers (boundary point(P n)) are defined as the point at which the concave curvature feature disappears and the point at which it begins to appear. The boundary points between the fingers are extracted through the condition of Equation (2) (FIG. 3).

FIG. 3 is a diagram showing (a) the concave decision condition and (b) the extracted boundary points between the fingers.

Figure pat00002
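Because Equations (1) and (2) are published only as images, the sketch below uses one plausible reading of the concave decision: a point P n is marked concave when it lies closer to the wrist center than both of its K-spaced neighbors and the angle formed at P n is below a threshold, and the boundary points between the fingers are taken where the concave feature appears or disappears. The distance test, angle test, and threshold value are assumptions.

import numpy as np

def concave(contour, n, k, wrist_center, angle_thresh_deg=150.0):
    # Plausible reading of Equation (1): compare the distances of P_{n-k}, P_n, P_{n+k}
    # to the wrist center and the angle formed at P_n.
    pts = np.asarray(contour, dtype=float)
    c = np.asarray(wrist_center, dtype=float)
    p_prev, p_n, p_next = pts[(n - k) % len(pts)], pts[n % len(pts)], pts[(n + k) % len(pts)]

    closer = (np.linalg.norm(p_n - c) < np.linalg.norm(p_prev - c) and
              np.linalg.norm(p_n - c) < np.linalg.norm(p_next - c))

    v1, v2 = p_prev - p_n, p_next - p_n
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return 1 if (closer and angle < angle_thresh_deg) else 0

def finger_boundary_points(contour, k, wrist_center):
    # Reading of Equation (2): boundary points between fingers are the outline points
    # at which the concave feature begins to appear and at which it disappears.
    n_pts = len(contour)
    flags = [concave(contour, n, k, wrist_center) for n in range(n_pts)]
    return [contour[n] for n in range(n_pts) if flags[n] != flags[n - 1]]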

2.2 Finger end point recognition and number of extended fingers

FIG. 4 is a diagram illustrating the Douglas-Peucker algorithm, which approximates a curve of consecutive points with a straight line by reducing the number of points; it is used to extract the outline feature points from the outline information obtained in the previous step so that the hand feature information can be represented by simple information.

The Douglas-Peucker algorithm sets a critical range (tolerance) for the straight-line approximation. The two end points of the curve are first connected by a straight line, and the point farthest from that line is found. If its distance exceeds the critical range, that point is kept and the curve is divided there; the same procedure is then applied to each sub-section. If no point in a section lies outside the critical range, the intermediate points of that section are removed. This is repeated until no remaining point lies outside the critical range.

In order to represent the hand feature information simply, the outline feature points are extracted using the outline information obtained in the previous step. For this, the Douglas-Peucker algorithm [Algorithms for the reduction of the number of points required to represent a digitized line or its caricature], an outline approximation algorithm shown in FIG. 4, is applied. The Douglas-Peucker algorithm approximates a curve of consecutive points with a straight line by reducing the number of points. Among the extracted outline feature points, the points having acute angles correspond to the extended finger end points, the points between non-attached fingers, and the outer hand points (wrist points having acute angles).
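A short sketch of this approximation step, assuming OpenCV's built-in Douglas-Peucker implementation (cv2.approxPolyDP) and a tolerance chosen relative to the contour perimeter, neither of which is prescribed by the disclosure:

import cv2

def outline_feature_points(contour, epsilon_ratio=0.01):
    # Approximate the hand outline with the Douglas-Peucker algorithm; the surviving
    # vertices are the outline feature points. epsilon_ratio is an assumed tolerance.
    perimeter = cv2.arcLength(contour, True)
    approx = cv2.approxPolyDP(contour, epsilon_ratio * perimeter, True)
    return approx.reshape(-1, 2)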

The Convex Hull algorithm [Efficient convex hull algorithms for pattern recognition applications] is applied to separate the group of fingertip points and outer hand points from the points between the fingers among the extracted feature points. This algorithm extracts the polygon connecting the outermost points of a point set. Since the vertices of this polygon are the outermost points of the hand, the points between the fingers, which are not outermost points, can be excluded, so that only the fingertip points and the outer hand points are detected.
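One way to sketch this filtering step is shown below: the convex hull of the approximated feature points is computed, and only the feature points that are hull vertices (fingertip and outer hand candidates) are kept.

import cv2
import numpy as np

def hull_points(feature_points):
    # Vertices of the convex hull are the outermost points of the hand, so the points
    # lying between the fingers are dropped and only fingertip and wrist points remain.
    pts = np.asarray(feature_points, dtype=np.int32).reshape(-1, 1, 2)
    hull_idx = cv2.convexHull(pts, returnPoints=False).flatten()
    return pts[hull_idx].reshape(-1, 2)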

In order to distinguish the fingertip points from the outer hand (wrist) points, an area boundary line is needed that separates the finger area, where the fingertip points are distributed, from the outer wrist area, where the wrist points are distributed. The area boundary line is the line through the hand center point normal to the direction in which the fingers extend. The hand center point, which serves as the reference of the area boundary, is obtained through a distance transformation of the digital image. The direction in which the fingers extend is given by the average slope of the straight lines between the acute-angled outline feature points and the boundary points between the fingers. The straight line passing through the hand center point and normal to this direction is the area boundary line separating the finger area from the rest; through this boundary line, the area containing the finger end points can be distinguished from the area containing the other hand points (wrist points having acute angles). Finally, using the fact that an extended finger end point is farther from the hand center than the average distance between the hand center and the boundary points between the fingers, the number of extended fingers is obtained as the number of points in the finger area whose distance from the hand center exceeds this average distance (FIG. 5).

Fig. 5 is a diagram showing (a) finger end point extraction when there is no finger attached, and (b) finger end point extraction when a finger is attached.
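Two of the steps described above can be sketched as follows, under stated assumptions: the hand center is taken as the maximum of the distance transform of the hand mask, and extended fingertips are taken as hull feature points that lie on the finger side of the area boundary and are farther from the center than the average center-to-boundary-point distance. The finger_direction vector is assumed to be supplied (in the description it is derived from the average slope between the acute-angled feature points and the boundary points between the fingers).

import cv2
import numpy as np

def hand_center(hand_mask):
    # Hand center point: location of the distance transform maximum inside the hand mask.
    dist = cv2.distanceTransform(hand_mask, cv2.DIST_L2, 5)
    _, _, _, max_loc = cv2.minMaxLoc(dist)
    return np.array(max_loc, dtype=float)

def count_extended_fingers(hull_feature_points, boundary_points, center, finger_direction):
    # Average distance from the hand center to the boundary points between the fingers.
    boundary_points = np.asarray(boundary_points, dtype=float)
    avg_boundary_dist = np.mean(np.linalg.norm(boundary_points - center, axis=1))

    count = 0
    for p in np.asarray(hull_feature_points, dtype=float):
        on_finger_side = np.dot(p - center, finger_direction) > 0   # finger area side
        if on_finger_side and np.linalg.norm(p - center) > avg_boundary_dist:
            count += 1                                              # extended finger end point
    return count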

2.3 Finger recognition

FIG. 6 is a diagram showing attached finger recognition: (a), (b).

Since the outline feature points are extracted where the slope of the outline changes greatly, the magnitude of the slope change decreases when fingers are attached, and it is difficult to accurately extract the end point of an attached finger, as shown in FIG. 5B. Conventional algorithms that cannot recognize such attached fingers limit the number of gestures that can be defined with the hand, because different gestures are misrecognized as the same gesture in the hand gesture classification step.

To overcome this problem, the present invention recognizes the attachment of fingers through the presence and number of boundary points between the fingers, extracted from the curvature information, within a certain area held at the end point of an extended finger.

On both sides of an extended fingertip feature point there is either a point between the fingers or a wrist point. A boundary point between the fingers always exists around an attached finger, and it falls within the range between the fingertip and these two neighboring points. The number of attached fingers can therefore be estimated from the number of boundary points between the fingers contained in the range from the extended finger end point to the two neighboring points.

FIG. 7 is a diagram showing an attachment procedure of the proposed algorithm.

K = finger end point input;
J = threshold according to resolution;
area width = distance(K finger left point, K finger right point);
area vertical length = distance(K finger end point, hand center point) / J;
position the horizontal center point of the area at the K finger end point;
for (i = 0; i < number of boundary points between fingers; i++) {
    if (the i-th boundary point between fingers is within the area)
        number of boundary points between fingers in area++;
}
number of fingers attached to finger K = number of boundary points between fingers in area / 2;
return number of fingers attached to finger K;

The horizontal extent of the finger attachment area is the distance between the two points on either side of the extended finger end point, taken along the line normal to the finger direction at the fingertip and centered on the fingertip. For the vertical extent, a threshold J (dependent on the image resolution) is applied to the distance from the fingertip to the hand center, and a length of 1/J of that distance is designated as the vertical length; the horizontal center of the area is aligned with the fingertip point.

Two boundary points between fingers exist at each place where fingers are attached. Accordingly, when the number of boundary points between fingers extracted in one finger end area is 2N (N being a natural number of 1 or more), the number of attached fingers can be calculated as N (FIGS. 6 and 7).
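The procedure of FIG. 7 can be sketched as follows; the rectangular search area around each extended fingertip (horizontal length equal to the distance between the fingertip's two neighboring points, vertical length equal to the fingertip-to-center distance divided by J) follows the description above, while the default value of the resolution-dependent threshold J is an assumption.

import numpy as np

def attached_fingers_at(fingertip, left_point, right_point, center, boundary_points, J=8):
    fingertip = np.asarray(fingertip, dtype=float)
    center = np.asarray(center, dtype=float)
    width = np.linalg.norm(np.asarray(left_point, float) - np.asarray(right_point, float))
    height = np.linalg.norm(fingertip - center) / J

    # Axis of the search area: direction from the hand center to the fingertip,
    # with the area centered horizontally on the fingertip.
    axis = (fingertip - center) / (np.linalg.norm(fingertip - center) + 1e-9)
    normal = np.array([-axis[1], axis[0]])

    inside = 0
    for b in np.asarray(boundary_points, dtype=float):
        d = b - fingertip
        if abs(np.dot(d, normal)) <= width / 2 and abs(np.dot(d, axis)) <= height / 2:
            inside += 1          # boundary point between fingers falls inside the area

    # Each pair of boundary points corresponds to one attached finger (2N points -> N fingers).
    return inside // 2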

III. Experiment

The proposed algorithm was implemented and compared with existing algorithms on an Intel(R) Core(TM) i5 @ 2.60 GHz CPU with 4 GB of memory, under Windows 7 64-bit, using Visual C++ and a Logitech C905 webcam. The existing algorithm [4], which extracts the number of extended fingers, was compared with the proposed algorithm, which extracts the number of extended fingers and recognizes attached fingers.

FIG. 8 is a diagram showing the case where no fingers are attached: (a) original image, (b) existing algorithm, (c) proposed algorithm.

FIG. 9 is a diagram showing the case where attached fingers are present: (a) original image, (b) existing algorithm, (c) proposed algorithm.

FIG. 10 is a diagram showing an example of feature extraction using a proposed algorithm.

When no fingers are attached in the hand region extracted from the image (FIG. 8), the existing algorithm and the proposed algorithm extract the same number of extended fingers. However, when attached fingers are present (FIG. 9), the existing algorithm recognizes the attached fingers as a single finger, loses the information about the attachment, and extracts an incorrect number of extended fingers. With the algorithm proposed in the present invention, the information about the attached fingers that is lost by the existing algorithm can be recovered through the curvature information at the attached finger points. The attachment state of the fingers and the number of extended fingers can therefore be extracted accurately, and the diversity problem in the gesture classification of the existing algorithm can be solved.

Because the existing algorithm obtains finger information only at points where the slope of the outline changes greatly, it extracts the extended finger end points, where the slope change is large, but at the end points of attached fingers the slope change is reduced and only one of the finger end points is extracted. The proposed algorithm, in contrast, extracts the end points of the extended fingers while also obtaining information about the attached fingers through the curvature information of the outline, thereby complementing the shortcomings of the existing algorithm (FIG. 10).

For the comparative evaluation, the number of cases that can be defined in the gesture recognition step based on the features extracted by the existing algorithm and by the algorithm proposed in the present invention, the frame rate, and the recognition rate were compared (Table 1, Table 2, Table 3).

Figure pat00003

The difference in recognition rate between the existing algorithm and the proposed algorithm is small, and because the curvature analysis is included in the processing, the difference in frame rate is two frames per second. However, since the existing algorithm determines only the number of extended fingers, its total number of cases equals the number of extended-finger counts. When the proposed algorithm is used, both the number of extended fingers and the attachment state of the fingers can be recognized, unlike the existing algorithm, which extracts only the number of extended fingers; therefore, as the number of extended fingers increases, the number of definable cases grows beyond that of the existing algorithm. It was thus confirmed that the total number of cases usable in gesture classification becomes 18 when the proposed hand feature extraction algorithm, which extracts the extended fingers and the attachment state, is used.

Table 1 shows the number of cases that can be defined by applying the existing algorithm.

Number of attachment points \ Number of extended fingers |  1  |  2  |  3  |  4  |  5
0                                                         |  1  |  1  |  1  |  1  |  1
Total: 5

Table 2 shows the number of cases that can be defined by applying the proposed hand feature extraction algorithm.

Number of attachment points \ Number of extended fingers |  1  |  2  |  3  |  4  |  5
0                                                         |  1  |  1  |  1  |  1  |  1
1                                                         |  -  |  1  |  1  |  2  |  2
2                                                         |  -  |  -  |  1  |  1  |  2
3                                                         |  -  |  -  |  -  |  1  |  1
4                                                         |  -  |  -  |  -  |  -  |  1
Total: 18

Table 3 shows the recognition rate and frame rate of the existing algorithm and the proposed hand feature extraction algorithm.

Performance \ Algorithm      | Proposed algorithm | Existing algorithm
Recognition rate (%)         |         92         |         93
Frame rate (frames/second)   |         22         |         24

In this invention, we designed and implemented a hand feature extraction algorithm that can classify more various hand gestures by solving the problems of existing algorithms that do not recognize the attached fingers.

The proposed hand feature extraction algorithm detects the hand region in the input image through a color-model-based skin color range filter and labeling, and extracts the number of extended fingers and the attachment state of the fingers using the outline, the feature points, and the curvature information derived from them, thereby recognizing hand gestures. The proposed algorithm is similar to existing algorithms in recognition rate and frame rate, but in gesture classification it can define about four times as many cases as the existing algorithm. In addition, it can be used with an ordinary camera and can therefore be widely applied to small systems such as mobile devices. Future research will investigate algorithms that can define an even larger number of cases by extracting positional information for each finger end point obtained through the proposed hand feature extraction algorithm.

As described above, the method of the present invention can be implemented as a program and recorded on a computer-readable recording medium (CD-ROM, RAM, ROM, memory card, hard disk, magneto-optical disk, storage device, etc.).

While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that these embodiments are provided by way of illustration and example only and do not limit the present invention; the present invention may be variously modified or changed without departing from its scope.

P random contour : an arbitrary point of the hand outline
P wrist center : center point of the wrist
P n-k, P n, P n+k : three points at the interval K (threshold value) among the points constituting the hand outline
concave(P n) : whether the curvature at P n is concave (1) or not (0)
boundary point(P n) : boundary point between the fingers

Claims (7)

A curvature-analysis-based hand feature extraction method for recognizing various hand gestures, comprising:
(a) a preprocessing step of extracting, from the input image, candidate regions presumed to be a hand region and locating the hand region among the candidate regions; and
(b) a feature extraction step of extracting, from the image of the hand region, the hand outline, the boundary points between the fingers, the outline feature points, and the center point of the hand, and, based on these, extracting the number of extended fingers and the attachment state of the fingers,
wherein a hand feature extraction algorithm based on curvature analysis, which can recognize not only the number of fingers but also the attachment of the fingers, is used for the feature extraction for hand gesture recognition, and wherein the hand feature extraction algorithm detects the hand region through a color-model-based skin color range filter and labeling and recognizes various hand gestures by extracting the number of extended fingers and the attachment state of the fingers using the outline, the feature points, and the curvature information extracted from them.
The method according to claim 1,
The preprocessing step includes a hand candidate group extraction and a hand region extraction process,
(a1) a hand candidate group extraction step in which, in a first step, a difference image between the current input image and the initial input image is obtained in order to extract objects including the moving hand, and in a second step, in order to extract the skin-colored hand candidates among the moving objects, a binarized hand candidate image is extracted from the difference image through a skin color range filter based on Cb and Cr, excluding Y, the luminance component of the YCbCr color model, and the binarized hand candidate image extracted through the YCbCr color model is subjected to a closing operation (dilation followed by erosion in morphological processing) in order to fill in parts of the hand lost to noise or to holes caused by shadows; and
(a2) a hand region extraction step in which, since the hand candidate image processed by the skin color filter contains various noise besides the hand region and the hand, falling within the skin color filter range, occupies a large area in the image, each group of connected pixels is extracted using a connected component labeling algorithm, which has an object detection effect in the image, and unnecessary noise is removed by keeping only the largest group, thereby obtaining the binarized hand region image;
A method of hand feature extraction based on curvature analysis for various hand gesture recognition.
The method according to claim 1,
wherein the feature extraction step comprises: i) extracting curvature information and boundary points between the fingers; ii) recognizing the finger end points and the number of extended fingers; and iii) recognizing the attachment of the fingers: a hand feature extraction method based on curvature analysis for various hand gesture recognition.
The method of claim 3,
wherein, in the extraction of the curvature information and the boundary points between the fingers,
in the first stage of the feature extraction step, the curvature information of the hand is extracted in order to obtain the boundary points between the fingers, which are used to recognize the attachment of the fingers through the features that appear when fingers are attached,
and wherein, to extract the curvature information of the hand, the curvature center point and the outline used for the curvature determination are obtained as follows: i) an Active Contour Model (Snake) algorithm is applied to the binarized hand image to obtain the outline representing the hand shape; and ii) based on the characteristic of the hand that the outline segment with the longest length and no change in flexion is the wrist line region, the midpoint between the two end points of the segment determined to be the wrist line region is taken as the wrist center point and designated as the curvature center used for the curvature determination; the curvature information of the hand is then extracted using the curvature center point and the hand outline, three points (P n-k, P n, P n+k) spaced at the interval K (threshold value) are taken successively from the points constituting the outline, the concavity of the curvature, concave(P n), is determined through the condition of Equation (1), and the concave decision condition compares the distances from the three points to the curvature center point and the angle formed by the points,
-
Figure pat00004
- Equation (1)
wherein, in the shape of the hand viewed from the wrist center, the fingertip regions are convex and the point regions between the fingers are concave, and although many concave points appear in the point region between spread fingers, the boundary points between the fingers (boundary point(P n)) are defined as the point at which the concave curvature feature disappears and the point at which it begins to appear, and the boundary points between the fingers are extracted through the condition of Equation (2),
-
Figure pat00005
- Equation (2)
A hand feature extraction method based on curvature analysis for various hand gesture recognition.
The method of claim 3,
wherein, in the recognition of the finger end points and the number of extended fingers,
in order to represent the hand feature information simply, outline feature points are extracted using the outline information obtained in the previous step by applying the Douglas-Peucker algorithm [Algorithms for the reduction of the number of points required to represent a digitized line or its caricature] shown in FIG. 4, which approximates a curve of consecutive points with a straight line by reducing the number of points, and among the extracted outline feature points the points having acute angles correspond to the extended finger end points, the points between non-attached fingers, and the outer hand points (wrist points having acute angles),
the Convex Hull algorithm [Efficient convex hull algorithms for pattern recognition applications] is applied to separate the group of fingertip points and outer hand points from the points between the fingers among the extracted feature points, and since the vertices of the polygon connecting the outermost points are the outermost points of the hand, the points between the fingers that are not outermost points are excluded so that only the fingertip points and the outer hand points are detected, and
in order to distinguish the fingertip points from the outer hand points, an area boundary line is used that separates the finger area, where the fingertip points are distributed, from the wrist area, where the wrist points are distributed; the hand center point, which serves as the reference of the area boundary, is obtained through a distance transformation of the digital image; the direction in which the fingers extend is given by the average slope of the straight lines between the acute-angled outline feature points and the boundary points between the fingers; the straight line passing through the hand center point and normal to this direction is the area boundary line separating the finger area from the rest, through which the area containing the finger end points is distinguished from the area containing the other hand points (wrist points having acute angles); and, using the fact that an extended finger end point is farther from the hand center than the average distance between the hand center and the boundary points between the fingers, the number of extended fingers is extracted as the number of points in the finger area whose distance from the hand center exceeds this average distance: a hand feature extraction method based on curvature analysis for various hand gesture recognition.
The method of claim 3,
wherein, since the outline feature points are extracted where the slope of the outline changes greatly and the magnitude of the slope change decreases when fingers are attached (FIG. 5B), making it difficult to accurately extract the end points of attached fingers, the attachment of the fingers is recognized through the presence and number of the boundary points between the fingers, extracted through the curvature information, within a certain area held at the end point of an extended finger,
and wherein, on both sides of an extended fingertip feature point there is either a point between the fingers or a wrist point, a boundary point between the fingers always exists around an attached finger and falls within the range between the fingertip and these two neighboring points, and the number of attached fingers is estimated from the number of boundary points between the fingers contained in the range from the extended finger end point to the two neighboring points.
The method according to claim 6,
wherein the attachment of the fingers is recognized by the following attachment recognition procedure of the hand feature extraction algorithm:
K = finger end point input;
J = threshold according to resolution;
area width = distance(K finger left point, K finger right point);
area vertical length = distance(K finger end point, hand center point) / J;
position the horizontal center point of the area at the K finger end point;
for (i = 0; i < number of boundary points between fingers; i++) {
    if (the i-th boundary point between fingers is within the area)
        number of boundary points between fingers in area++;
}
number of fingers attached to finger K = number of boundary points between fingers in area / 2;
return number of fingers attached to finger K;
wherein the horizontal extent of the finger judgment area is the distance between the two points on either side of the extended finger end point, taken along the line normal to the finger direction at the finger end point and centered on the finger end point, and the vertical extent is obtained by applying the resolution-dependent threshold J to the distance from the finger end point to the hand center and designating a length of 1/J of that distance as the vertical length, the horizontal center of the area being aligned with the finger end point,
and wherein two boundary points between fingers exist at each place where fingers are attached to each other, so that when the number of boundary points between fingers extracted in one finger end area is 2N (N being a natural number equal to or greater than 1), the number of attached fingers is calculated as N: a hand feature extraction method based on curvature analysis for various hand gesture recognition.
KR1020150054476A 2015-04-17 2015-04-17 Hand Feature Extraction Algorithm using Curvature Analysis For Recognition of Various Hand Feature KR101761234B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150054476A KR101761234B1 (en) 2015-04-17 2015-04-17 Hand Feature Extraction Algorithm using Curvature Analysis For Recognition of Various Hand Feature

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150054476A KR101761234B1 (en) 2015-04-17 2015-04-17 Hand Feature Extraction Algorithm using Curvature Analysis For Recognition of Various Hand Feature

Publications (2)

Publication Number Publication Date
KR20160124361A true KR20160124361A (en) 2016-10-27
KR101761234B1 KR101761234B1 (en) 2017-07-26

Family

ID=57247359

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150054476A KR101761234B1 (en) 2015-04-17 2015-04-17 Hand Feature Extraction Algorithm using Curvature Analysis For Recognition of Various Hand Feature

Country Status (1)

Country Link
KR (1) KR101761234B1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101992837B1 (en) * 2017-09-13 2019-09-30 주식회사 매크론 Method and apparatus for generating mixed reality contents

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113039550A (en) * 2018-10-10 2021-06-25 深圳市道通智能航空技术股份有限公司 Gesture recognition method, VR (virtual reality) visual angle control method and VR system
KR102107182B1 (en) * 2018-10-23 2020-05-06 전남대학교 산학협력단 Hand Gesture Recognition System and Method
CN111435429A (en) * 2019-01-15 2020-07-21 北京伟景智能科技有限公司 Gesture recognition method and system based on binocular stereo data dynamic cognition
CN111435429B (en) * 2019-01-15 2024-03-01 北京伟景智能科技有限公司 Gesture recognition method and system based on binocular stereo data dynamic cognition
CN110281247A (en) * 2019-06-10 2019-09-27 旗瀚科技有限公司 A kind of man-machine interactive system and method for disabled aiding robot of supporting parents
KR20210155600A (en) * 2020-06-16 2021-12-23 한국기술교육대학교 산학협력단 Method and system for operating virtual training content using user-defined gesture model
KR20240037067A (en) 2022-09-14 2024-03-21 엘디시스템 주식회사 Device for recognizing gesture based on artificial intelligence using general camera and method thereof
CN116627260A (en) * 2023-07-24 2023-08-22 成都赛力斯科技有限公司 Method and device for idle operation, computer equipment and storage medium
CN117422721A (en) * 2023-12-19 2024-01-19 天河超级计算淮海分中心 Intelligent labeling method based on lower limb CT image
CN117422721B (en) * 2023-12-19 2024-03-08 天河超级计算淮海分中心 Intelligent labeling method based on lower limb CT image

Also Published As

Publication number Publication date
KR101761234B1 (en) 2017-07-26

Similar Documents

Publication Publication Date Title
KR101761234B1 (en) Hand Feature Extraction Algorithm using Curvature Analysis For Recognition of Various Hand Feature
Mukherjee et al. Fingertip detection and tracking for recognition of air-writing in videos
US9916012B2 (en) Image processing apparatus, image processing method, and program
Zaki et al. Sign language recognition using a combination of new vision based features
Keskin et al. Real time hand tracking and 3d gesture recognition for interactive interfaces using hmm
US20160171293A1 (en) Gesture tracking and classification
CN111797709B (en) Real-time dynamic gesture track recognition method based on regression detection
Nalepa et al. Fast and accurate hand shape classification
Bilal et al. A hybrid method using haar-like and skin-color algorithm for hand posture detection, recognition and tracking
JP4745207B2 (en) Facial feature point detection apparatus and method
Ahmed et al. Appearance-based arabic sign language recognition using hidden markov models
Chang et al. Spatio-temporal hough forest for efficient detection–localisation–recognition of fingerwriting in egocentric camera
Ribeiro et al. Hand Image Segmentation in Video Sequence by GMM: a comparative analysis
Ding et al. Recognition of hand-gestures using improved local binary pattern
Rahim et al. Hand gesture recognition based on optimal segmentation in human-computer interaction
Raheja et al. Hand gesture pointing location detection
Aksaç et al. Real-time multi-objective hand posture/gesture recognition by using distance classifiers and finite state machine for virtual mouse operations
Hu et al. Multi-scale topological features for hand posture representation and analysis
Pun et al. Real-time hand gesture recognition using motion tracking
Czupryna et al. Real-time vision pointer interface
Wagner et al. Framework for a portable gesture interface
Bakheet A fuzzy framework for real-time gesture spotting and recognition
Chen et al. A fingertips detection method based on the combination of centroid and Harris corner algorithm
Mohan et al. Background and skin colour independent hand region extraction and static gesture recognition
Umadevi et al. Development of an Efficient Hand Gesture Recognition system for human computer interaction

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
GRNT Written decision to grant