KR101745601B1 - method for finger counting by using image processing and apparatus adopting the method - Google Patents

method for finger counting by using image processing and apparatus adopting the method Download PDF

Info

Publication number
KR101745601B1
KR101745601B1 (application KR1020150118872A)
Authority
KR
South Korea
Prior art keywords
image
hand
palm
contour
user
Prior art date
Application number
KR1020150118872A
Other languages
Korean (ko)
Other versions
KR20170023565A (en)
Inventor
이의철
김한솔
Original Assignee
상명대학교산학협력단
Priority date
Filing date
Publication date
Application filed by 상명대학교산학협력단 filed Critical 상명대학교산학협력단
Priority to KR1020150118872A priority Critical patent/KR101745601B1/en
Publication of KR20170023565A publication Critical patent/KR20170023565A/en
Application granted granted Critical
Publication of KR101745601B1 publication Critical patent/KR101745601B1/en

Classifications

    • G06K9/00375
    • G06K9/32
    • G06K9/6204

Landscapes

  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

A method and apparatus for recognizing the number of fingers by image processing are described. The method comprises: capturing an image of at least a user's hand using a camera; detecting the user's palm area from the image; detecting a palm contour from the palm area; determining the user's fingertip points from the palm contour; and recognizing the number of the user's fingers using the fingertip points.

Description

FIELD OF THE INVENTION [0001] The present invention relates to a method and apparatus for recognizing the number of fingers.

The present invention relates to a method and apparatus for recognizing the number of fingers in a low-resolution depth image, and more particularly, to a method for recognizing the number of fingers using low-resolution depth image information and an apparatus applying the method.

In general, computer control and information exchange require a monitor, a keyboard, and a mouse as essential basic devices. Such interface equipment is therefore indispensable for using a computer; without it, control and information exchange are impossible.

Recently, research on user interface methods based on hand gestures has been proceeding. Such methods include data gloves, gesture recognition through optical markers, and so on. Since these conventional methods require additional equipment attached to the human body, it is difficult to implement a Natural User Interface (NUI), that is, an interface realized through the human body alone.

Prior art documents: KR1020120121589, KR1020130044845, KR1020110053410

The present invention provides a method for recognizing the number of fingers through a low-resolution depth image and an apparatus for applying the method.

According to an aspect of the present invention, there is provided a method for recognizing the number of fingers, comprising:

capturing an image of at least a user's hand using a camera;

detecting the user's palm area from the image;

detecting a palm contour from the palm area;

determining the user's fingertip points from the palm contour; and

recognizing the number of the user's fingers using the fingertip points.

According to a specific embodiment of the invention, the method comprises:

detecting a skeleton of the user from the image captured by the image-capturing device;

localizing the range of the image to the hand region using three-dimensional position information of the hand based on the detected skeleton information;

detecting only the palm region within the hand candidate region;

applying an average filter or a median filter to remove jagged irregularities from the detected palm image;

binarizing the palm area to which the average or median filter has been applied;

detecting contours from the re-binarized palm area;

calculating the curvature of the palm contour along the detected contour;

determining fingertip points using the calculated curvature; and

recognizing the number of fingers from the number of determined fingertip points.

According to a specific embodiment of the present invention, to detect the palm area, the range of the image may first be localized to the hand region, and the palm region may then be detected from the localized image.

According to another embodiment of the present invention, the palm area is localized using optical three-dimensional position information of the user's hand region.

According to another embodiment of the present invention, the image includes the user's hand and an upper body part, and the optical three-dimensional position information is obtained from the skeleton information detected from the image.

According to another specific embodiment of the present invention, between the step of detecting the palm area and the step of detecting the contour, the method may further comprise:

binarizing the palm area;

applying an average filter or a median filter to the binarized palm area; and

binarizing the filtered palm area a second time.

According to another embodiment of the present invention, the step of localizing the hand region in the image may determine the hand region by skin color contrasted with the background color in the image, or by depth information in an image captured by an infrared camera.

According to another exemplary embodiment of the present invention, the step of obtaining the palm contour may extract the contour using the difference between the twice-binarized image and the image obtained by eroding it.

According to another specific embodiment of the present invention, determining the fingertip points comprises following the contour counterclockwise starting from the contour pixel with the smallest y value, and deciding which of the 8 neighboring pixels around the current pixel (x, y), namely (x-1, y-1), (x, y-1), (x+1, y-1), (x-1, y), (x+1, y), (x-1, y+1), (x, y+1), and (x+1, y+1), is the next contour pixel.

An apparatus for recognizing the number of fingers by image processing according to the present invention performs the above method and comprises: a visible-light or infrared camera for capturing the image; and a computer system for processing the image and recognizing the number of the user's fingers.

The method of the present invention recognizes the number of a user's fingers in a low-resolution depth image. In particular, when building an NUI without special equipment, the recognized finger count can be used, for example, as a mouse-manipulation input. In addition, since no initial user calibration is required for tracking the user's hand, user inconvenience is reduced.

FIG. 1 is a flowchart of the finger-count recognition method according to the present invention.
FIG. 2 is a flowchart of the finger-count recognition method according to a specific embodiment of the present invention.
FIG. 3 shows a detected hand-region depth image in the finger-count recognition method according to the present invention.
FIG. 4 shows the palm area detected by binarizing the hand-candidate depth image.
FIG. 5 shows the result of applying an average filter to the binarized image of FIG. 4.
FIG. 6 shows the result of re-binarizing the filtered image of FIG. 5.
FIG. 7 shows the contour detected from the twice-binarized image after preprocessing.
FIG. 8 shows the 8 neighboring pixels used to find the next contour pixel of the detected hand region.
FIG. 9 shows fingertip points determined by the curvature computed along the contour of the detected hand region.
FIG. 10 shows an embodiment of the finger-count recognition apparatus according to the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS Hereinafter, a method of recognizing the number of fingers according to the present invention and an apparatus applying the same will be described with reference to the accompanying drawings.

While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the invention is not intended to be limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

The method of recognizing the number of fingers according to the present invention is performed by a computer equipped with a display and a video device. The term "computer" in the present invention extends to all devices based on a computer system and is not limited by the type of computer.

Prior to describing the finger-count recognition method according to the present invention, an apparatus for performing the method will be described.

Referring to FIG. 10, the finger-count recognition apparatus according to the present invention is based on a computer and includes a monitor 11 presenting a GUI (Graphical User Interface) 11a, a keyboard 13, a mouse 14, and a video camera 20 facing the subject 1.

The user 1 stands facing the monitor 11 and the camera 20, and the camera 20 captures the upper body of the user 1, including the hand, in real time. The user 1 extends a hand toward the camera 20 for finger-count recognition. The GUI 11a, driven by the computer main body, displays all visible information, such as the recognized number of fingers.

Such a computer may be replaced by a special-purpose electronic device based on a computer system. As shown in FIG. 10, if the user 1 makes a finger shape in front of the camera 20, for example the shape of "4", the image-processing and finger-motion analysis system of the main body analyzes it and displays the result on the display 11.

The finger-count recognition method according to the present invention proceeds as shown in FIG. 1.

In the first step S11,

The user's or subject's hand is captured as live video. Not only the hand but also the subject's upper body may be captured together. It is desirable to keep the upper body and the hand a certain distance apart so that they can be separated by image depth.

In the second step S12,

The palm area is separated from the captured image, as shown in FIG. 4. The palm-area image is extracted using the depth information of the image, in particular the three-dimensional position information of the hand region.
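This depth-band segmentation can be sketched in a few lines of Python. The function name, the threshold band, and the plain nested-list depth map below are illustrative assumptions, not the patent's actual code:

```python
def segment_palm(depth, lo, hi):
    """Binarize a depth map: pixels whose depth lies in the assumed
    palm band [lo, hi] become 1 (palm), everything else becomes 0."""
    return [[1 if lo <= d <= hi else 0 for d in row] for row in depth]

# Toy depth map in millimetres: background far away, palm near 600 mm.
depth = [[900, 900, 900],
         [900, 610, 605],
         [900, 600, 615]]
mask = segment_palm(depth, 550, 650)  # -> [[0,0,0],[0,1,1],[0,1,1]]
```

A real implementation would center the band on the skeleton-derived hand depth rather than on fixed constants.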

In the third step S13,

The contour of the entire palm, as shown in FIG. 7, is detected from the palm-area image obtained in the previous step S12. The contour includes the outlines of all extended fingers; if the user makes a fist, no finger outlines appear.

In the fourth step S14,

Fingertip points are extracted from the palm contour obtained in the previous step. Since a finger protrudes from the palm in one direction, this feature can be used to separate each finger region and extract its end point. The number of fingertip points obtained here is the finger-count recognition result of the present invention.

Hereinafter, with reference to FIG. 2, the flow of an embodiment of the finger-count recognition method according to the present invention is described step by step.

In the first step S21,

First, the upper body of a subject making a specific finger shape is captured, and an upper-body skeleton model is detected from the captured image. To acquire the skeleton model successfully, the captured image must include at least the subject's upper body, and the skeleton is detected continuously in real time.

In the second step S22,

Using the three-dimensional position of the hand derived from the skeleton information, the range of the image is localized from the entire upper-body region to a narrower region containing the palm. In this localization step, one or both hands can be localized based on the detected skeleton.

In the third step S23,

Only the palm region is detected within the hand candidate region.

In this process, the palm can be determined by binarizing the hand-candidate image with respect to the depth value corresponding to the palm region, thereby removing the wrist portion that lies farther from the camera than the palm.

Here, the hand region can be determined and detected using grayscale or color-coded depth obtained through an infrared (depth) camera.

Alternatively, the hand region can be determined and detected by skin color contrasted with the background color in an image obtained through a visible-light camera.

In the fourth step S24,

An average filter or a median filter is applied to remove jagged irregularities from the binarized palm image. For this purpose, the binarized image is blurred by applying, for example, an average filter of size 7 x 7.

In the fifth step S25,

The palm area to which the average filter has been applied is binarized again. Re-binarizing the blurred palm image with respect to a chosen threshold removes the irregularities and yields a smooth outline image.
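Steps S24 and S25 together (box blur, then a second threshold) can be sketched as follows. The 3 x 3 kernel on a toy image stands in for the 7 x 7 filter on a real one, and all names and values are illustrative assumptions:

```python
def mean_filter(img, k):
    """k x k average (box) filter with zero padding outside the image."""
    h, w = len(img), len(img[0])
    r = k // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            s = 0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    if 0 <= y + dy < h and 0 <= x + dx < w:
                        s += img[y + dy][x + dx]
            out[y][x] = s / (k * k)
    return out

def rebinarize(img, t):
    """Second binarization: threshold the blurred image at t."""
    return [[1 if v >= t else 0 for v in row] for row in img]

# A 3 x 3 block with a one-pixel spur at (row 0, col 2); blurring and
# re-thresholding removes the spur while keeping the block's core.
img = [[0, 0, 1, 0, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 1, 1, 1, 0],
       [0, 0, 0, 0, 0]]
smooth = rebinarize(mean_filter(img, 3), 0.5)
```

The blur spreads the spur's mass below the threshold while the block interior stays above it, which is exactly the smoothing effect the two steps aim for.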

In the sixth step S26,

The contour is detected from the twice-binarized palm area. The contour can be obtained by subtracting an eroded copy of the re-binarized palm image from the re-binarized palm image itself.
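The erosion-and-subtract contour extraction admits a compact sketch; a 3 x 3 structuring element and the function names are assumptions:

```python
def erode(img):
    """One 3 x 3 binary erosion: a pixel stays 1 only if it and all
    8 of its neighbours are 1; border pixels always become 0."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if all(img[y + dy][x + dx] for dy in (-1, 0, 1) for dx in (-1, 0, 1)):
                out[y][x] = 1
    return out

def contour(img):
    """Contour = binary image minus its erosion (the peeled-off rim)."""
    er = erode(img)
    return [[img[y][x] - er[y][x] for x in range(len(img[0]))]
            for y in range(len(img))]

square = [[1] * 5 for _ in range(5)]
ring = contour(square)  # 1-pixel-wide outline of the filled square
```

On the 5 x 5 filled square, subtraction leaves only the 16 rim pixels, i.e. a one-pixel-wide outline.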

In the seventh step S27,

The curvature of the palm contour is calculated along the detected contour. The curvature between three points spaced at a fixed pixel interval can be computed while following the contour counterclockwise, starting from the pixel with the smallest y value in image coordinates.

In the eighth step S28,

Each contour point is tested using the calculated curvature to decide whether it is a fingertip point. A point whose curvature, calculated along the outline, is smaller than a predetermined threshold can be determined to be a fingertip point.

In the ninth step S29,

The number of determined fingertip points is recognized as the number of fingers. That is, the fingertip points are counted and the count is displayed on the display.
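Steps S27 to S29 (curvature along the contour, thresholding, counting) can be sketched together. The angle threshold, the span k, and the demonstration contour are assumptions, and a real implementation would also merge adjacent detections belonging to the same fingertip:

```python
import math

def angle_deg(p_prev, p, p_next):
    """Angle at p between p_prev and p_next, in degrees; a sharp
    fingertip produces a small angle."""
    v1 = (p_prev[0] - p[0], p_prev[1] - p[1])
    v2 = (p_next[0] - p[0], p_next[1] - p[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / n))))

def count_sharp_points(contour_pts, k, thresh):
    """Count contour points whose angle, measured over a span of k
    points on each side, is below thresh (tip candidates)."""
    n = len(contour_pts)
    return sum(
        1 for i in range(n)
        if angle_deg(contour_pts[(i - k) % n],
                     contour_pts[i],
                     contour_pts[(i + k) % n]) < thresh)

# On a rectangular contour, exactly the four 90-degree corners fall
# under a 100-degree threshold; straight-edge points measure 180.
rect = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0),
        (4, 1), (3, 1), (2, 1), (1, 1), (0, 1)]
tips = count_sharp_points(rect, 1, 100.0)  # -> 4
```

On a real palm contour, the angle is sampled at a larger interval k so that a fingertip spans enough pixels to register as a sharp angle.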

Hereinafter, the above-described steps are explained in more detail with reference to FIGS. 3 to 9, with an example provided for each step to aid understanding.

FIG. 3 shows a detected hand-region depth image in the finger-count recognition method according to the present invention.


FIG. 4 shows the palm area detected by binarization of the hand-candidate depth image. In obtaining the image of FIG. 4, the palm can be determined by binarizing the image with respect to the depth value corresponding to the palm, thereby removing the wrist portion that lies farther from the camera than the palm.


FIG. 5 shows the result of applying an average filter to the first binarized image. In this step, the binarized image is blurred by applying an average filter of, for example, size 7 x 7 in order to remove irregularities along the outer boundary of the palm.


FIG. 6 shows the result of re-binarizing the filtered image. In this second binarization step, the blurred palm image from the previous stage is re-binarized with respect to a chosen threshold, removing the prominent irregularities of the outer boundary, so that a palm image with a smooth outline is obtained.


FIG. 7 shows the contour detected from the binarized image after the preprocessing described above. In the contour detection step, contours of various finger shapes as shown in FIG. 7 can be detected by subtracting the eroded re-binarized palm image from the re-binarized palm image.


FIG. 8 shows the 8 neighboring pixels used to find the next contour pixel of the detected hand region. In the curvature calculation, the curvature between three adjacent points is computed along the contour illustrated in FIG. 7, counterclockwise, starting from the pixel with the smallest y value, at a fixed pixel interval.

At this time, to prevent abnormal tracking (retracing the hand-region contour clockwise from the starting pixel with the smallest y value), the tracker decides which of the 8 pixels neighboring the current pixel (x, y), namely (x-1, y-1), (x, y-1), (x+1, y-1), (x-1, y), (x+1, y), (x-1, y+1), (x, y+1), and (x+1, y+1), is the next contour pixel. Equation 1 below gives the curvature between three adjacent points at a fixed pixel interval.
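One way to realize this neighbour test in code; the scan order below is an assumption (the translated list is garbled, and only the set of 8 offsets is given by the patent):

```python
# Fixed scan order over the 8 neighbours of the current pixel (x, y).
NEIGHBOURS = [(-1, -1), (0, -1), (1, -1), (1, 0),
              (1, 1), (0, 1), (-1, 1), (-1, 0)]

def next_contour_pixel(img, x, y, prev):
    """Return the first in-bounds neighbouring contour pixel that is
    not the pixel we just came from, following the fixed scan order."""
    for dx, dy in NEIGHBOURS:
        nx, ny = x + dx, y + dy
        if 0 <= ny < len(img) and 0 <= nx < len(img[0]):
            if img[ny][nx] and (nx, ny) != prev:
                return (nx, ny)
    return None

# A 3 x 3 ring: from the top-left corner the tracker steps right first.
ring = [[1, 1, 1],
        [1, 0, 1],
        [1, 1, 1]]
step = next_contour_pixel(ring, 0, 0, None)  # -> (1, 0)
```

Excluding the previous pixel is what prevents the tracker from immediately backtracking along the contour.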

[Equation 1: image]

Referring to Equation 1 above, the three points P1, P2, and P3 lie on the palm contour, spaced apart by a fixed pixel interval around the palm. θ therefore denotes the angle between three adjacent points sampled at that interval.
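Equation 1 itself survives only as an image. Based on the surrounding description (three contour points at a fixed pixel interval, with θ the angle between them), a plausible reconstruction is:

```latex
\theta_i \;=\; \arccos\!\left(
  \frac{(P_{i-k}-P_i)\cdot(P_{i+k}-P_i)}
       {\lVert P_{i-k}-P_i\rVert \,\lVert P_{i+k}-P_i\rVert}
\right),
\qquad P_i = (x_i,\, y_i)
```

where P_{i-k}, P_i, and P_{i+k} are contour points spaced k pixels apart along the contour; a fingertip yields a small θ_i.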

FIG. 9 shows the result of determining fingertip points according to the curvature of the fingers, after calculating the curvature along the outline of the hand region by the method described above.


As shown in FIG. 9, in determining the fingertip points, a point can be determined to be a fingertip when the curvature calculated along the outline is smaller than a predetermined threshold.

As described above, the present invention recognizes and outputs the number of a user's fingers using only image capture and image processing, without any additional device burdening the user. The invention can serve as a variety of interface devices; for example, in a computer system it can act as a mouse-operation device, clicking or scrolling according to the number of fingers. In addition, since no initial user calibration is required for hand tracking, user inconvenience is reduced.

These and other embodiments of the present invention have been described and shown in the accompanying drawings. However, these embodiments are only a subset of the possible embodiments; various other modifications will occur to those of ordinary skill in the art.

1: User
11: Monitor
11a: Display area (GUI area)
13: Keyboard
14: Mouse
20: Camera

Claims (15)

1. A method for recognizing the number of fingers, comprising:
capturing an image of at least a user's hand using a camera;
detecting the user's palm area from the image;
first binarizing the palm area;
applying an average filter or a median filter to the binarized palm area;
binarizing the filtered palm area a second time;
detecting a palm contour from the palm area;
determining the user's fingertip points from the palm contour; and
recognizing the number of the user's fingers using the fingertip points.

2. The method according to claim 1, wherein the range of the image is localized to the hand region and the palm region is detected from the localized image.

3. The method of claim 2, wherein the palm area is localized using optical three-dimensional position information of the user's hand.

4. The method of claim 3, wherein the image includes the user's hand and upper body, and the optical three-dimensional position information is obtained from skeleton information detected from the image.

5. The method according to any one of claims 2 to 4, wherein localizing the hand region in the image determines the hand region by skin color contrasted with the background color, or by depth information in an image captured by an infrared camera.

6. (Deleted)

7. The method according to claim 1, wherein the step of obtaining the palm contour comprises extracting the contour using the difference between the twice-binarized image and the image obtained by eroding it.

8. The method according to any one of claims 1 to 4, wherein determining the fingertip points comprises deciding which of the 8 neighboring pixels around the current contour pixel (x, y), namely (x-1, y-1), (x, y-1), (x+1, y-1), (x-1, y), (x+1, y), (x-1, y+1), (x, y+1), and (x+1, y+1), is the next contour pixel.

9. The method of claim 5, wherein determining the fingertip points comprises following the contour clockwise from an arbitrary contour pixel as a starting point and deciding which of the 8 neighboring pixels around the current pixel (x, y) is the next contour pixel.

10. (Deleted)

11. The method of claim 7, wherein determining the fingertip points comprises following the contour clockwise from the contour pixel with the smallest y value as a starting point and deciding which of the 8 neighboring pixels around the current pixel (x, y) is the next contour pixel.

12. An apparatus for performing the method of any one of claims 1 to 4, comprising:
a camera for capturing the image; and
a computer system for processing the image and recognizing the number of fingers.

13. (Deleted)

14. The apparatus of claim 12, wherein the contour is extracted, in the step of obtaining the palm contour, using the difference between the twice-binarized image and the image obtained by eroding it.

15. The apparatus of claim 12, wherein, in determining the fingertip points, the contour is followed counterclockwise from the contour pixel with the smallest y value and it is decided which of the 8 neighboring pixels around the current pixel (x, y) is the next contour pixel.

KR1020150118872A 2015-08-24 2015-08-24 method for finger counting by using image processing and apparatus adopting the method KR101745601B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150118872A KR101745601B1 (en) 2015-08-24 2015-08-24 method for finger counting by using image processing and apparatus adopting the method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150118872A KR101745601B1 (en) 2015-08-24 2015-08-24 method for finger counting by using image processing and apparatus adopting the method

Publications (2)

Publication Number Publication Date
KR20170023565A KR20170023565A (en) 2017-03-06
KR101745601B1 true KR101745601B1 (en) 2017-06-09

Family

Family ID: 58398913

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150118872A KR101745601B1 (en) 2015-08-24 2015-08-24 method for finger counting by using image processing and apparatus adopting the method

Country Status (1)

Country Link
KR (1) KR101745601B1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102096532B1 (en) * 2018-06-05 2020-04-02 국방과학연구소 Edge Enhancement Method and Apparatus based on Curvelet Transform for Object Recognition at Sonar Image
CN110941367A (en) * 2018-09-25 2020-03-31 福州瑞芯微电子股份有限公司 Identification method based on double photographing and terminal
KR102052449B1 (en) * 2019-01-14 2019-12-05 전남대학교산학협력단 System for virtual mouse and method therefor
CN112052747A (en) * 2020-08-11 2020-12-08 深圳市欧森隆健康科技有限公司 Palm recognition method, health report generation method, health detection system and electronic equipment
CN112507924B (en) * 2020-12-16 2024-04-09 深圳荆虹科技有限公司 3D gesture recognition method, device and system

Citations (2)

Publication number Priority date Publication date Assignee Title
JP2012128763A (en) * 2010-12-17 2012-07-05 Omron Corp Image processing apparatus, method, and program
KR101526426B1 (en) * 2013-12-31 2015-06-05 현대자동차 주식회사 Gesture recognize apparatus and method

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
KR101090187B1 (en) 2011-04-12 2011-12-20 (주) 대영개발 Safety cover for hinge
KR101327892B1 (en) 2011-04-27 2013-11-11 한국화학연구원 Method for preparing lactide from alkyl lactate
KR101359354B1 (en) 2011-10-25 2014-02-10 전자부품연구원 Apparatus for guiding board status information

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
JP2012128763A (en) * 2010-12-17 2012-07-05 Omron Corp Image processing apparatus, method, and program
KR101526426B1 (en) * 2013-12-31 2015-06-05 현대자동차 주식회사 Gesture recognize apparatus and method

Also Published As

Publication number Publication date
KR20170023565A (en) 2017-03-06


Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant