CN107885327B - Fingertip detection method based on Kinect depth information - Google Patents


Info

Publication number
CN107885327B
CN107885327B
Authority
CN
China
Prior art keywords
hand
point
curvature
points
kinect
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711021681.XA
Other languages
Chinese (zh)
Other versions
CN107885327A (en)
Inventor
权巍
张超
韩成
薛耀红
李华
胡汉平
陈纯毅
蒋振刚
杨华民
冯欣
王蒙蒙
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changchun University of Science and Technology
Original Assignee
Changchun University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changchun University of Science and Technology filed Critical Changchun University of Science and Technology
Priority to CN201711021681.XA
Publication of CN107885327A
Application granted
Publication of CN107885327B
Legal status: Active

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/107 Static hand or arm
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01 Indexing scheme relating to G06F3/01
    • G06F2203/012 Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a fingertip detection method based on Kinect depth information, in which the Kinect device is connected to a computer through a cable. The method comprises the following specific steps. Step 1, hand extraction and palm coordinate acquisition. Step 2, fingertip positioning, comprising image preprocessing and hand contour extraction: joint bilateral filtering is applied to the extracted hand region, and the Douglas-Peucker algorithm is used to approximate the specified point set, find the polygonal fitting curve of the contour, and draw the fitting curve of the hand. Step 3, the convexHull() function is used to analyze the retrieved contour and obtain the convex hull points of the hand. Step 4, the curvature of the obtained convex hull points is calculated, and, based on the difference between the curvature at the wrist and the curvature at the fingertips, a suitable threshold is set to remove the convex hull points at the wrist. The method completes the gesture recognition task accurately and in real time, improves the real-time performance and accuracy of Kinect gesture recognition, and improves the natural gesture interaction experience.

Description

Fingertip detection method based on Kinect depth information
Technical Field
The invention relates to a fingertip detection method based on depth information, and belongs to the technical field of human-computer interaction.
Background
With the development of human-computer interaction technology, natural interaction has become its main development direction. In recent years, natural and intuitive interaction performed directly by the human hand, without wearable auxiliary equipment, has become a research hotspot in the field.
The key to natural human-hand interaction is accurate gesture recognition. Currently, gesture recognition methods that do not rely on wearable aids are mainly based on either color cameras or depth sensors. Gesture recognition based on a color camera is easily affected by complex lighting and background conditions; when the background is very complex, the recognition accuracy becomes extremely low. Therefore, more and more researchers use depth sensors, which capture information about objects in three-dimensional space, to improve the accuracy of hand detection and tracking. The advent of the Kinect provided a new direction for gesture recognition, and many scholars have begun to use it; however, most existing methods involve complicated steps, complex algorithms, and poor real-time performance.
Disclosure of Invention
The invention aims to provide a fingertip detection method based on Kinect depth information which can complete gesture recognition tasks accurately and in real time, improve the real-time performance and accuracy of Kinect gesture recognition, and improve the natural gesture interaction experience.
the technical scheme of the invention is realized as follows: a fingertip detection method based on Kinect depth information is characterized in that a device Kinect is connected with a computer through a cable; the method is characterized by comprising the following specific steps:
Step 1, hand extraction and palm coordinate acquisition: human hand information is collected with the Kinect, the palm-center coordinates are obtained through the NITE function library, the approximate depth range of the hand is calculated from the depth of the palm-center point, a search region and a depth threshold are set, and an n × n depth binary mask (n rows by n columns) is multiplied element-wise with the palm-centered region, separating the hand image from the background.
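A minimal Python sketch (NumPy) of step 1, assuming the palm-center pixel (cx, cy) and the depth frame have already been obtained through the NITE hand tracker; the window size n and the depth band are illustrative assumptions, not values prescribed by the invention.

```python
import numpy as np

def extract_hand(depth, cx, cy, n=200, band=80):
    """Cut an n x n search window around the palm center and keep only
    pixels whose depth lies within +/- band mm of the palm depth."""
    half = n // 2
    window = depth[cy - half:cy + half, cx - half:cx + half]
    palm_depth = int(depth[cy, cx])
    # Depth binary mask: 1 where a pixel falls inside the hand's depth band.
    mask = ((window > palm_depth - band) &
            (window < palm_depth + band)).astype(window.dtype)
    # Element-wise multiplication separates the hand from the background.
    return window * mask, mask
```

The sketch assumes the palm is far enough from the image border that the window needs no clipping.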
Step 2, fingertip positioning: comprising image preprocessing and hand contour extraction.
1) Joint bilateral filtering is applied to the extracted hand region. A depth image and a color image of the same frame are captured by the Kinect, and a Gaussian kernel function is used to compute the spatial distance weight of the depth image and the gray-level weight of the color image, respectively. The spatial distance weight is multiplied by the gray-level weight to obtain the joint filtering weight, and the fast Gauss transform replaces the Gaussian kernel function to generate the joint bilateral filter. Finally, the filter output is convolved with the extracted hand region image, achieving edge-preserving smoothing of the fingertip region.
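Since the accelerated filter is involved to write from scratch, the sketch below stands in with the ready-made joint bilateral filter from opencv-contrib (cv2.ximgproc); the fast Gauss transform acceleration described in the text is not reproduced, and the parameter values are assumptions for demonstration only.

```python
import cv2

def smooth_hand_region(depth_region, color_region,
                       d=9, sigma_color=25.0, sigma_space=7.0):
    # Gray levels of the color frame act as the guide (range) image.
    guide = cv2.cvtColor(color_region, cv2.COLOR_BGR2GRAY).astype('float32')
    src = depth_region.astype('float32')
    # sigma_space controls the spatial distance weight, sigma_color the
    # gray-level weight; the filter combines them multiplicatively.
    return cv2.ximgproc.jointBilateralFilter(guide, src, d,
                                             sigma_color, sigma_space)
```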
2) A suitable threshold is selected to binarize the filtered image into a binary single-channel image, after which the findContours() function is applied to retrieve the contours C_i of all feature points in the binarized image. Each contour is a polygon composed of a pixel sequence, and the resulting hand feature point contour set C is represented as follows:
C = findContours(H_t(x, y)) = {C_i}   (1)
The retrieved contours are drawn with the drawContours() function, as in the sketch below.
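A sketch of the binarization and contour retrieval just described, using the OpenCV functions named in the text; the threshold value 10 and the retrieval flags are assumed placeholders for the "suitable threshold".

```python
import cv2
import numpy as np

def hand_contours(filtered):
    img8 = cv2.convertScaleAbs(filtered)              # single-channel 8-bit
    _, binary = cv2.threshold(img8, 10, 255, cv2.THRESH_BINARY)   # H_t(x, y)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)         # {C_i}
    canvas = np.zeros((*binary.shape, 3), np.uint8)
    cv2.drawContours(canvas, contours, -1, (0, 255, 0), 1)
    return contours, canvas
```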
3) The Douglas-Peucker algorithm is used to approximate the specified point set, find the polygonal fitting curve of the contour, and draw the fitting curve of the hand. A straight line is virtually connected between the first and last points of each curve, the distances from all points on the curve to that line are computed, and the maximum distance d_max is found. d_max is then compared with a tolerance D: if d_max < D, all intermediate points on the curve are discarded; if d_max ≥ D, the coordinate point corresponding to d_max is retained, the curve is divided into two parts at that point, and the same procedure is applied recursively to both parts.
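OpenCV's approxPolyDP() implements exactly this Douglas-Peucker recursion, so the polygonal fit can be obtained in one call; choosing the tolerance D as a fraction of the contour's arc length is an assumption, not a value given in the text.

```python
import cv2

def fit_polygon(contour, frac=0.01):
    D = frac * cv2.arcLength(contour, True)    # tolerance D
    return cv2.approxPolyDP(contour, D, True)  # closed polygonal fit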
Step 3, the contour C_max with the maximum area is selected from the retrieved contours, and the convexHull() function is used to analyze it and obtain the convex hull points of the hand.
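A short sketch of step 3, under the assumption that `contours` comes from the retrieval step above: the maximum-area contour C_max is selected and its convex hull computed.

```python
import cv2

def hand_hull_points(contours):
    c_max = max(contours, key=cv2.contourArea)       # C_max
    hull = cv2.convexHull(c_max)                     # convex hull points
    return c_max, hull
```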
Step 4, the curvature of the obtained convex hull points is calculated, and, based on the difference between the curvature at the wrist and the curvature at the fingertips, a suitable threshold is set to remove the convex hull points at the wrist.
1) Take a point p on the hand contour, together with the M-th point q before p and the M-th point r after p along the contour. The curvature at p can then be represented by the cosine of the angle α between the vectors pq and pr, and its value can be expressed as follows:
cos α = (pq · pr) / (|pq||pr|)   (2)
where pq and pr denote the vectors from p to q and from p to r, respectively.
2) A threshold L is used to judge candidate fingertip points: the contour points whose curvature is greater than or equal to L are screened out as fingertip points. A code sketch of this curvature test follows.
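Formula (2) rendered as code: the cosine of the angle at contour index i, taken between the vectors to the M-th points before and after it. The values of M and the fingertip threshold L below are tunable assumptions.

```python
import numpy as np

def cos_angle(contour, i, M):
    pts = contour.reshape(-1, 2).astype(np.float64)
    p = pts[i]
    q = pts[(i - M) % len(pts)]   # M-th point before p
    r = pts[(i + M) % len(pts)]   # M-th point after p
    pq, pr = q - p, r - p
    return np.dot(pq, pr) / (np.linalg.norm(pq) * np.linalg.norm(pr))

def fingertip_candidates(contour, hull_indices, M=15, L=0.5):
    # Sharp fingertips give a cosine near 1; flat wrist points do not.
    return [int(i) for i in hull_indices
            if cos_angle(contour, int(i), M) >= L]
```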
Step 5: the contour curves on the two sides of a finger are approximately a pair of parallel lines. For each point p obtained in step 4, 10 pixels are taken forward and 10 pixels backward along the contour, the included angle at p is obtained according to formula (2), and a suitable threshold range is applied, eliminating the case where a bent finger is misjudged as a fingertip point.
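Step 5 as a sketch: the same cosine test of formula (2), evaluated with the fixed 10-pixel neighborhood; the acceptance interval [lo, hi] is an assumed tuning range standing in for the "suitable threshold range".

```python
def is_true_fingertip(contour, i, lo=0.3, hi=0.98):
    # Reject points whose local angle falls outside the assumed range,
    # which filters out knuckle points of bent fingers.
    return lo <= cos_angle(contour, i, M=10) <= hi
```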
The fingertip detection based on the Kinect depth information can be completed through the steps.
Compared with fingertip detection methods at the present stage, the method has the following advantages:
1. The fingertip detection method simplifies the process of hand extraction.
2. The method acquires the hand contour fitting curve more accurately.
3. The invention filters out non-fingertip points effectively.
The main purpose is to locate fingertips accurately and in real time. Compared with human-computer interaction realized with a traditional camera, accurate fingertip positioning improves the accuracy of interaction between the hand or fingertips and virtual objects, and improves the natural human-computer interaction experience.
Drawings
FIG. 1 is a schematic view of an initial hand gesture and apparatus.
Fig. 2 is a flowchart of the steps of the fingertip detection method based on depth images according to the present invention.
Detailed Description
The embodiments of the present invention will be described in detail below with reference to the accompanying drawings and specific examples.
Step S11, the Kinect is placed on the desktop, with the palm perpendicular to the desktop and facing the Kinect at a distance of about one meter, as shown in FIG. 1. The palm is pushed forward and then retracted, which triggers the gesture tracking function of the NITE function library.
Step S12 is to acquire the coordinates of the palm using the NITE function library, and then calculate the approximate depth range of the hand from the palm point depth.
Step S13, a search area and a depth threshold are set using the depth range obtained in step S12, and the hand image is separated from the background by multiplying an n × n depth binary mask element-wise with the palm-centered region.
Step S21, first, a depth image and a color image of the same frame are captured by the Kinect, and then the spatial distance weight of the depth image and the gray-level weight of the color image are computed with a Gaussian kernel function. The spatial distance weight is multiplied by the gray-level weight to obtain the joint filtering weight, and the fast Gauss transform replaces the Gaussian kernel function to generate the joint bilateral filter.
Step S22, the hand region image extracted in S13 is convolved with the filtering result of the filter designed in S21, yielding the joint bilaterally filtered image.
Step S23, a suitable threshold is selected for the image obtained in step S22, and binarization is carried out to generate a binary single-channel image.
Step S24, the findContours() function is applied to retrieve the contours C_i of all feature points in the binarized image; each contour is a polygon composed of a pixel sequence, as in expression (1):
C = findContours(H_t(x, y)) = {C_i}   (1)
Step S25, the contours retrieved in step S24 are drawn with the drawContours() function.
Step S26, a straight line is virtually connected between the first and last points of each curve drawn in step S25, the distances from all points to the line are computed, and the maximum distance d_max is found. d_max is compared with the tolerance D: if d_max < D, all intermediate points on the curve are discarded; if d_max ≥ D, the point corresponding to d_max is retained, the curve is divided into two parts at that point, and the method is applied recursively to both parts. The polygonal fitting curve of the contour is thereby found and the fitting curve of the hand is drawn.
Step S3, contour analysis is performed on the hand fitting curve obtained in step S26 using the convexHull() function to obtain the convex hull points of the hand.
Step S4, the curvature of the obtained convex hull points is calculated, and, according to the difference between the curvature at the wrist and the curvature at the fingertips, a suitable threshold is set to remove the convex hull points at the wrist.
Step S41, a point p is taken on the hand contour, together with the M-th point q before p and the M-th point r after p along the contour. The curvature of the point p can then be represented by the cosine of the angle α between the vectors pq and pr, and its value can be expressed as follows:
cos α = (pq · pr) / (|pq||pr|)   (2)
Step S42, a threshold L is adopted to judge fingertip points: the pixel points P with curvature greater than or equal to L are screened out, and the convex hull points at the wrist are removed through this curvature test.
Step S5: the contour curves on the two sides of a finger are approximately a pair of parallel lines. For each point P meeting the condition in step S42, 10 pixels are taken forward and 10 pixels backward along the contour, the included angle at P is obtained according to formula (2), and a suitable threshold range is adopted, eliminating the case where a bent finger is misjudged as a fingertip point. An end-to-end sketch consolidating steps S11 to S5 is given at the end of this section.
Although illustrative embodiments of the present invention have been described above to facilitate understanding by those skilled in the art, it should be understood that the invention is not limited to the scope of these embodiments. Various changes will be apparent to those skilled in the art as long as they remain within the spirit and scope of the invention as defined by the appended claims, and all inventions that utilize the inventive concept are protected.
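A hedged end-to-end sketch wiring steps S11 through S5 together, reusing the helper functions sketched in the disclosure section above; it assumes the palm center (cx, cy) has already been delivered by the NITE hand tracker, since Kinect frame capture itself is outside the scope of OpenCV.

```python
import cv2

def detect_fingertips(depth_frame, color_frame, cx, cy, n=200):
    half = n // 2
    hand, _ = extract_hand(depth_frame, cx, cy, n)                 # S12-S13
    color_win = color_frame[cy - half:cy + half, cx - half:cx + half]
    smooth = smooth_hand_region(hand, color_win)                   # S21-S22
    contours, _ = hand_contours(smooth)                            # S23-S25
    if not contours:
        return []
    c_max, _ = hand_hull_points(contours)                          # S26, S3
    idx = cv2.convexHull(c_max, returnPoints=False).ravel()
    tips = fingertip_candidates(c_max, idx)                        # S41-S42
    return [i for i in tips if is_true_fingertip(c_max, i)]        # S5
```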

Claims (1)

1. A fingertip detection method based on Kinect depth information, wherein a Kinect device is connected with a computer through a cable, characterized by comprising the following specific steps:
step 1, extracting the hand and acquiring palm coordinates: acquiring human hand information by using the Kinect, acquiring the palm-center coordinates by using the NITE function library, calculating an approximate depth range of the hand from the depth of the palm-center point, setting a search region and a depth threshold, multiplying an n × n depth binary mask (n rows by n columns) element-wise with the palm-centered region, and thereby separating the hand image from the background;
step 2, fingertip positioning: the method comprises the steps of image preprocessing and hand contour acquisition;
1) performing joint bilateral filtering on the extracted hand region: firstly, acquiring a depth image and a color image of the same frame through the Kinect, and then calculating the spatial distance weight of the depth image and the gray-level weight of the color image respectively by using a Gaussian kernel function; multiplying the spatial distance weight by the gray-level weight to obtain the joint filtering weight, and replacing the Gaussian kernel function with the fast Gauss transform to generate the joint bilateral filter; finally, convolving the filter output with the extracted hand region image, so as to achieve edge-preserving smoothing of the fingertip region;
2) selecting a proper threshold to binarize the filtered image into a binary single-channel image H_t(x, y), wherein x and y are pixel coordinates, and then applying the findContours() function to retrieve the contours C_i of all feature points in the binarized image, each contour being a polygon composed of a pixel sequence; the obtained hand feature point contour set C is represented as follows:
C = findContours(H_t(x, y)) = {C_i}   (1)
drawing the retrieved contours by using the drawContours() function;
3) approximating the hand feature point contour set C by using the Douglas-Peucker algorithm, finding the polygonal fitting curve of the contour and drawing the fitting curve of the hand: virtually connecting a straight line between the first and last points of each curve, calculating the distances from all points on the curve to the straight line, and finding the maximum distance d_max; comparing d_max with a tolerance D: if d_max < D, discarding all intermediate points on the curve; if d_max ≥ D, retaining the coordinate point corresponding to d_max, dividing the curve into two parts at that point, and applying the method recursively to the two parts;
step 3, retrieving the contour C_max with the maximum area value obtained in the above step, and analyzing it by using the convexHull() function to obtain the convex hull points of the hand;
step 4, calculating the curvature of the obtained convex hull points, and setting a proper threshold to remove the convex hull points at the wrist according to the difference between the curvature at the wrist and the curvature at the fingertips;
1) taking a point p on the hand contour, together with the M-th point q before p and the M-th point r after p along the contour, the curvature of the point p being represented by the cosine of the angle α between the vectors pq and pr; the curvature value at p can be expressed as follows:
cos α = (pq · pr) / (|pq||pr|)   (2)
2) judging fingertip points by adopting a threshold L, and screening out the points with curvature greater than or equal to L as fingertip points;
step 5, the contour curves on the two sides of a finger being approximately a group of parallel lines: for each fingertip point p screened out by curvature in step 4, taking 10 pixels forward and 10 pixels backward, obtaining the included angle at p according to formula (2), and adopting a proper threshold range, thereby eliminating the case where a bent finger is misjudged as a fingertip point;
the fingertip detection based on the Kinect depth information can be completed through the steps.
CN201711021681.XA 2017-10-27 2017-10-27 Fingertip detection method based on Kinect depth information Active CN107885327B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711021681.XA CN107885327B (en) 2017-10-27 2017-10-27 Fingertip detection method based on Kinect depth information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711021681.XA CN107885327B (en) 2017-10-27 2017-10-27 Fingertip detection method based on Kinect depth information

Publications (2)

Publication Number Publication Date
CN107885327A CN107885327A (en) 2018-04-06
CN107885327B (en) 2020-11-13

Family

ID=61782663

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711021681.XA Active CN107885327B (en) 2017-10-27 2017-10-27 Fingertip detection method based on Kinect depth information

Country Status (1)

Country Link
CN (1) CN107885327B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108805141A (en) * 2018-05-29 2018-11-13 电子科技大学 A kind of human skeleton artis location estimation method
CN109308741B (en) * 2018-08-08 2023-04-07 长春理工大学 Meta 2-based natural interaction handicraft creative design system
CN109344718B (en) * 2018-09-03 2021-02-09 先临三维科技股份有限公司 Fingertip identification method, device, storage medium and processor
CN110874179B (en) * 2018-09-03 2021-09-14 京东方科技集团股份有限公司 Fingertip detection method, fingertip detection device, and medium
CN109344793B (en) * 2018-10-19 2021-03-16 北京百度网讯科技有限公司 Method, apparatus, device and computer readable storage medium for recognizing handwriting in the air
CN109740497B (en) * 2018-12-27 2022-07-26 河海大学 Fingertip detection method based on least square curve fitting
CN111402368B (en) * 2019-01-03 2023-04-11 福建天泉教育科技有限公司 Correction method for drawing graph and terminal
CN110232321B (en) * 2019-05-10 2021-07-06 奥比中光科技集团股份有限公司 Method and device for detecting fingertip clicking position, terminal and computer storage medium
CN111354029B (en) * 2020-02-26 2023-05-05 深圳市瑞立视多媒体科技有限公司 Gesture depth determination method, device, equipment and storage medium
CN111950514B (en) * 2020-08-26 2022-05-03 重庆邮电大学 Depth camera-based aerial handwriting recognition system and method
CN113947683B (en) * 2021-10-15 2022-07-08 兰州交通大学 Fingertip point detection method and system and fingertip point motion track identification method and system
CN115908573B (en) * 2023-02-20 2023-06-02 季华实验室 Rubber glove opening positioning method, system, electronic equipment and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226387A (en) * 2013-04-07 2013-07-31 South China University of Technology Video fingertip positioning method based on Kinect
CN104899600A (en) * 2015-05-28 2015-09-09 Beijing University of Technology Depth map based hand feature point detection method
CN106295531A (en) * 2016-08-01 2017-01-04 Leshi Holding (Beijing) Co., Ltd. Gesture recognition method and device, and virtual reality terminal
CN106355598A (en) * 2016-09-14 2017-01-25 Nantong University Automatic wrist and finger joint range-of-motion measurement method
CN106503626A (en) * 2016-09-29 2017-03-15 Nanjing University of Information Science and Technology Gesture recognition method based on depth image and finger contour matching
WO2017134059A1 (en) * 2016-02-05 2017-08-10 Delphi Technologies, Inc. System and method for detecting hand gestures in a 3d space

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016091457A (en) * 2014-11-10 2016-05-23 富士通株式会社 Input device, fingertip-position detection method, and computer program for fingertip-position detection

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103226387A (en) * 2013-04-07 2013-07-31 South China University of Technology Video fingertip positioning method based on Kinect
CN104899600A (en) * 2015-05-28 2015-09-09 Beijing University of Technology Depth map based hand feature point detection method
WO2017134059A1 (en) * 2016-02-05 2017-08-10 Delphi Technologies, Inc. System and method for detecting hand gestures in a 3d space
CN106295531A (en) * 2016-08-01 2017-01-04 Leshi Holding (Beijing) Co., Ltd. Gesture recognition method and device, and virtual reality terminal
CN106355598A (en) * 2016-09-14 2017-01-25 Nantong University Automatic wrist and finger joint range-of-motion measurement method
CN106503626A (en) * 2016-09-29 2017-03-15 Nanjing University of Information Science and Technology Gesture recognition method based on depth image and finger contour matching

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Li Zhixian et al., "A fingertip detection and tracking algorithm based on Kinect depth images," Jiangsu Agricultural Sciences, 2015, vol. 43, no. 5, pp. 416-419 *
Li Hua et al., "An automatic portrait matting algorithm based on Kinect depth maps," Journal of Changchun University of Science and Technology (Natural Science Edition), 2016, vol. 39, no. 6, pp. 81-84 *

Also Published As

Publication number Publication date
CN107885327A (en) 2018-04-06

Similar Documents

Publication Publication Date Title
CN107885327B (en) Fingertip detection method based on Kinect depth information
US9916012B2 (en) Image processing apparatus, image processing method, and program
CN107563494B (en) First-view-angle fingertip detection method based on convolutional neural network and heat map
Shenoy et al. Real-time Indian sign language (ISL) recognition
EP3090382B1 (en) Real-time 3d gesture recognition and tracking system for mobile devices
US20130050076A1 (en) Method of recognizing a control command based on finger motion and mobile device using the same
CN103971102A (en) Static gesture recognition method based on finger contour and decision-making trees
CN106503651B (en) A kind of extracting method and system of images of gestures
CN103218605A (en) Quick eye locating method based on integral projection and edge detection
CN107357414B (en) Click action recognition method and device
Weiyao et al. Human action recognition using multilevel depth motion maps
CN106503619B (en) Gesture recognition method based on BP neural network
CN112381045A (en) Lightweight human body posture recognition method for mobile terminal equipment of Internet of things
Vivek Veeriah et al. Robust hand gesture recognition algorithm for simple mouse control
CN112307984A (en) Safety helmet detection method and device based on neural network
Pradhan et al. A hand gesture recognition using feature extraction
CN108876776B (en) Classification model generation method, fundus image classification method and device
Yashas et al. Hand gesture recognition: a survey
Sokhib et al. A combined method of skin- and depth-based hand gesture recognition
CN103077381A (en) Monocular dynamic hand gesture recognition method on basis of fractional Fourier transformation
Choi et al. RGB-D camera-based hand shape recognition for human-robot interaction
CN109359543B (en) Portrait retrieval method and device based on skeletonization
KR101167784B1 (en) A method for recognizing pointers and a method for recognizing control commands, based on finger motions on the back of the portable information terminal
CN111651038A (en) Gesture recognition control method based on ToF and control system thereof
CN112084840A (en) Finger vein identification method based on three-dimensional NMI

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant