CN109446950B - Static gesture recognition method based on thermal imager camera - Google Patents


Info

Publication number
CN109446950B
CN109446950B CN201811200740.4A
Authority
CN
China
Prior art keywords
gesture
image
fingertips
sigmoid
gesture recognition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811200740.4A
Other languages
Chinese (zh)
Other versions
CN109446950A (en)
Inventor
余丹
仲雪飞
张�雄
樊兆雯
金展翌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southeast University
Original Assignee
Southeast University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southeast University filed Critical Southeast University
Priority to CN201811200740.4A priority Critical patent/CN109446950B/en
Publication of CN109446950A publication Critical patent/CN109446950A/en
Application granted granted Critical
Publication of CN109446950B publication Critical patent/CN109446950B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/28Recognition of hand or arm movements, e.g. recognition of deaf sign language

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a static gesture recognition method based on a thermal imager camera, comprising the following steps: converting the thermal image temperature data acquired by the thermal imager camera into image pixel values through a logarithmic conversion; preprocessing the image; extracting features; and recognizing the gesture. Because the thermal image temperature data are converted into image pixel values before processing and recognition, the method can effectively and accurately segment the hand region in environments with no light, dim light, complex colored backgrounds, and the like, can classify and recognize gestures accurately, outputs the correct expected results, and the algorithm is robust.

Description

Static gesture recognition method based on thermal imager camera
Technical Field
The invention belongs to the technical field of image processing and gesture recognition, and relates to a static gesture recognition method based on a thermal imager camera.
Background
In today's society, keyboard- and mouse-based human-computer interaction still dominates. These methods require additional peripherals, and are therefore ill-suited to mobile robots or portable intelligent devices. Human-computer interaction based on gesture recognition relies mainly on direct manipulation, shifting interaction technology from a machine-centered mode to a human-centered one that better matches natural interaction habits. Gesture recognition is therefore increasingly developed and applied in engineering.
However, a conventional camera can only perform gesture recognition in an environment with sufficient light; once the light is insufficient or absent, the recognition error becomes large or recognition fails entirely, which limits the application environments of gesture recognition. Moreover, in complex environments, for example backgrounds with many colors or with colors similar to skin, an ordinary RGB camera segments the hand very poorly, which makes the final gesture recognition result inaccurate and error-prone.
Disclosure of Invention
To solve these problems, the invention discloses a static gesture recognition method based on a thermal imager camera that achieves convenient, accurate, and efficient gesture recognition in a variety of environments. With this method, gesture recognition need not be unnecessarily constrained by the external environment: the intensity of the ambient light does not influence the recognition result, and reflected light or similar colors do not influence the segmentation result, solving the problem that an ordinary RGB camera cannot be used in many everyday scenes.
In order to achieve the purpose, the invention provides the following technical scheme:
a static gesture recognition method based on a thermal imager camera comprises the following steps:
step 1, acquiring data
Converting the thermal image temperature data acquired by the thermal imager camera into image pixel values through a logarithmic conversion, i.e., mapping the temperature data nonlinearly to image pixel values so that the data are displayed as a temperature image, while widening the range of image pixel values corresponding to temperatures near that of the palm;
step 2, image preprocessing
Detecting edges with the Canny edge detection algorithm, filling the resulting edge map with a flood-fill method, and finally performing gesture segmentation by image binarization to obtain the hand shape;
step 3, feature extraction
Extracting the following features from the image processed in step 2: the center-of-gravity moment, the convex hull and its convexity defects, the number of fingertips, and the finger-heel (finger-base) positions;
step 4, gesture recognition
Combining, matching, and screening the three features of center-of-gravity moment, number of fingertips, and finger-heel positions to classify and recognize the gestures.
Further, the logarithm used in the logarithmic conversion is the inverse of the sigmoid function:
sigmoid_y = ln(sigmoid_x / (1 - sigmoid_x)) (1)
The conversion steps are as follows:
initially setting the displayed temperature interval of the image: temperatures above 37 °C take gray value 255 and temperatures below 13 °C take gray value 0;
letting the maximum temperature maxtemp of the interval correspond to the sigmoid-function abscissa x = 0.7 and the minimum temperature mintemp correspond to x = 0.9, from which the linear formula is calculated as:
sigmoid_x=0.0004*tempreture+1.58 (2)
taking an interval just above and below the mean of the whole thermal image, computing the corresponding sigmoid_x values by formula (2), and computing the interval's maximum sigmoid_y_max and minimum sigmoid_y_min by formula (1);
computing sigmoid_x for the whole thermal image by formula (2), substituting it into formula (1) to obtain sigmoid_y, and finally computing the displayed gray value according to formula (3):
gray = 255 × (sigmoid_y_max - sigmoid_y) / (sigmoid_y_max - sigmoid_y_min) (3)
Further, the center-of-gravity moment in step 3 is obtained by formula (4):
M_pq = Σ_x Σ_y x^p y^q I(x, y),  x_c = M_10 / M_00,  y_c = M_01 / M_00 (4)
Further, the finger-heel positions in step 3 are obtained as follows: take the bounding quadrilateral of each finger of the gesture to obtain the positions of its four vertices, select the two lower vertices, and take the midpoint of the line connecting them as the position of the corresponding finger heel.
Further, before the convexity-defect detection in step 3, the contour of the binary image is extracted, and the resulting contour is approximated by a polygon.
Further, the step 4 comprises the following steps:
step 4-1, preliminarily classifying the gestures by the number of fingertips;
step 4-2, reclassifying by the angles formed by the fingers and by the distance information between the finger heels.
Further, the classification in step 4-1 includes: the gesture categories with 0, 1, 2, 3, 4, and 5 fingertips, respectively.
Further, the step 4-2 is classified again under the categories obtained in the step 4-1, and the classified categories include:
when the number of fingertips is 1: gestures whose included angle is smaller than 80° and gestures whose included angle is larger than 80°;
when the number of fingertips is 2: gestures in which the lines from the two finger heels to the center of gravity form an included angle of 0-50°; gestures in which the angle is 50-80° and the distance between the two finger heels is less than 1.5 times the thumb length; and gestures in which the angle is 80-180° and the distance is more than 1.5 times the thumb length;
when the number of fingertips is 3: gestures in which the difference between the middle finger's distances to the other two fingers is smaller than a preset distance, and gestures in which it is larger than or equal to that distance.
Compared with the prior art, the invention has the following advantages and beneficial effects:
because the thermal image temperature data is converted into image pixel values and then processed and recognized, the static gesture recognition method based on the thermal imager camera provided by the invention can effectively and accurately segment the hand region in various environments without light, dim light, complex color background and the like, can accurately classify and recognize gestures, can output correct expected results, and has good algorithm robustness.
Drawings
Fig. 1 is a diagram illustrating 10 gesture classifications provided by the embodiment of the present invention.
Fig. 2 is a flowchart of a static gesture recognition method based on a thermal imager camera provided by the invention.
FIG. 3 shows the effect of the logarithmic conversion of thermal image data into a temperature image in step 1; the left side is before conversion and the right side after conversion.
FIG. 4 shows the effect of Canny edge detection in step 2.
FIG. 5 shows the result of flood filling in step 2.
FIG. 6 shows the fingertip positions and the center of gravity extracted in step 3.
FIG. 7 shows the result recognized by the method of the invention, taking gesture 6 as an example.
Detailed Description
The technical solutions provided by the invention are described in detail below with reference to specific examples. It should be understood that the following embodiments merely illustrate the invention and do not limit its scope.
This embodiment classifies and recognizes 10 gestures; the gesture categories are shown in Fig. 1. The static gesture recognition method based on a thermal imager camera is described below with reference to this embodiment. The overall flow is shown in Fig. 2, and the method comprises the following steps:
step 1, acquiring data
The thermal image temperature data acquired by the thermal imager camera are converted into image pixel values through a logarithmic conversion, i.e., the temperature data are mapped nonlinearly to image pixel values so that the data are displayed as a temperature image, while the range of image pixel values corresponding to temperatures near that of the palm is widened, which facilitates subsequent image processing.
The logarithmic transformation is as follows. The logarithm used is the inverse of the sigmoid function:
sigmoid_y = ln(sigmoid_x / (1 - sigmoid_x)) (1)
The method comprises the following steps:
initially set the displayed temperature interval of the image (13-37 °C): temperatures above 37 °C take gray value 255 and temperatures below 13 °C take gray value 0;
let the maximum temperature maxtemp of the interval correspond to the sigmoid-function abscissa x = 0.7 and the minimum temperature mintemp correspond to x = 0.9 (a linear correspondence); the linear formula is calculated as:
sigmoid_x=0.0004*tempreture+1.58 (2)
take an interval of 50 raw-data counts above and 50 below the mean of the whole thermal image (which is close to the palm temperature, i.e., about 29 °C), compute the corresponding sigmoid_x values by formula (2), and compute the interval's maximum sigmoid_y_max and minimum sigmoid_y_min by formula (1);
compute sigmoid_x for the whole thermal image by formula (2), substitute it into formula (1) to obtain sigmoid_y, and finally compute the displayed gray value according to formula (3):
gray = 255 × (sigmoid_y_max - sigmoid_y) / (sigmoid_y_max - sigmoid_y_min) (3)
Images before and after the conversion are shown in fig. 3.
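The conversion of step 1 can be sketched in Python. This is a minimal sketch, not the patented implementation: the inverse-sigmoid (logit) form of formula (1) and the min-max gray stretch of formula (3) are assumed from the text, the linear map of formula (2) is derived here directly from the stated endpoints (37 °C → x = 0.7, 13 °C → x = 0.9) rather than the printed constants, and the `band` parameter (in °C) is a hypothetical stand-in for the ±50 raw-count interval of the embodiment.

```python
import math

def logit(x):
    # Inverse of the sigmoid function, assumed form of formula (1):
    # sigmoid_y = ln(sigmoid_x / (1 - sigmoid_x))
    return math.log(x / (1.0 - x))

def temp_to_x(t, mintemp=13.0, maxtemp=37.0):
    # Linear map in the spirit of formula (2), derived from the stated
    # endpoints maxtemp -> x = 0.7 and mintemp -> x = 0.9
    # (x falls as temperature rises)
    slope = (0.9 - 0.7) / (mintemp - maxtemp)
    return 0.7 + slope * (t - maxtemp)

def temp_to_gray(temps, band=2.0, mintemp=13.0, maxtemp=37.0):
    # Map a list of temperatures (deg C) to 8-bit gray values.  The stretch
    # interval is centred on the scene mean (near palm temperature), per
    # step 1; 'band' stands in for the +/-50 raw-count interval.
    mean = sum(temps) / len(temps)
    cold = min(max(mean - band, mintemp), maxtemp)   # band edges, clamped
    hot = max(min(mean + band, maxtemp), mintemp)
    y_max = logit(temp_to_x(cold))   # y falls as temperature rises,
    y_min = logit(temp_to_x(hot))    # so the cold edge gives y_max
    grays = []
    for t in temps:
        if t >= maxtemp:             # hotter than the interval -> white
            grays.append(255)
        elif t <= mintemp:           # colder than the interval -> black
            grays.append(0)
        else:
            y = logit(temp_to_x(t))
            # assumed min-max stretch of formula (3), oriented hot = bright
            g = 255.0 * (y_max - y) / (y_max - y_min)
            grays.append(int(min(255, max(0, round(g)))))
    return grays
```

The narrow stretch band around the palm temperature is what "widens" the gray levels near the hand: temperatures within a couple of degrees of the mean span the full 0-255 range, while everything else saturates to black or white.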
Step 2, image preprocessing
Edges are detected with the Canny edge detection algorithm (the effect is shown in fig. 4), the resulting edge map is filled with a flood-fill method (the filled result is shown in fig. 5), and finally gesture segmentation is performed by image binarization to obtain the hand shape.
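The hole-filling step of the preprocessing can be illustrated with a minimal 4-connected flood fill in pure Python. In practice OpenCV's `cv2.floodFill` would be used on the Canny edge map; this stand-alone sketch only shows the idea of filling the region enclosed by the detected edges:

```python
from collections import deque

def flood_fill(grid, seed, fill=1):
    # Minimal 4-connected flood fill: starting from 'seed' (x, y), paint
    # every reachable zero cell with 'fill'.  Non-zero cells (the Canny
    # edges) act as barriers, so only the enclosed region is filled.
    h, w = len(grid), len(grid[0])
    q = deque([seed])
    while q:
        x, y = q.popleft()
        if 0 <= x < w and 0 <= y < h and grid[y][x] == 0:
            grid[y][x] = fill
            q.extend([(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)])
    return grid
```

Seeding inside the hand contour fills the hand interior while leaving the background untouched, after which a simple threshold yields the binary hand mask.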
Step 3, feature extraction
The following features are extracted from the image processed in step 2: the center-of-gravity moment, the convex hull and its convexity defects, the number of fingertips, and the finger-heel positions. The center-of-gravity moment is obtained by formula (4):
M_pq = Σ_x Σ_y x^p y^q I(x, y),  x_c = M_10 / M_00,  y_c = M_01 / M_00 (4)
The contour of the binary image is extracted and approximated by a polygon, and on this basis convex-hull and convexity-defect detection is performed with the OpenCV functions convexHull and convexityDefects. The positions and number of the fingertips are obtained by analyzing the convex hull and its convexity defects. Both the fingertips and the center of gravity are plotted in fig. 6. The finger-heel positions are obtained as follows: take the bounding quadrilateral of each finger of the gesture to obtain the positions of its four vertices, select the two lower vertices, and take the midpoint of the line connecting them as the position of the corresponding finger heel.
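Two of the features above reduce to a few lines each. The sketch below assumes the standard raw-moment form for formula (4) and image coordinates with y growing downward; the finger's bounding quadrilateral is taken as given (OpenCV would supply it, e.g. from a rotated bounding rectangle):

```python
def centroid(mask):
    # Center of gravity of a binary hand mask via raw image moments,
    # assumed form of formula (4):
    #   M_pq = sum_x sum_y x^p y^q I(x, y),  x_c = M10/M00,  y_c = M01/M00
    m00 = m10 = m01 = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                m00 += 1
                m10 += x
                m01 += y
    return (m10 / m00, m01 / m00)

def heel_position(quad):
    # Finger-heel point from the four vertices of a finger's bounding
    # quadrilateral, per step 3: midpoint of the two lowest vertices
    # (largest y in image coordinates).
    (x1, y1), (x2, y2) = sorted(quad, key=lambda p: p[1])[-2:]
    return ((x1 + x2) / 2.0, (y1 + y2) / 2.0)
```

Both functions operate on plain lists, so they can be checked on tiny synthetic masks before being wired to real contour output.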
Step 4, gesture recognition
The center-of-gravity moment, the number of fingertips, and the finger-heel positions are combined, matched, and screened to classify and recognize the 10 gestures.
Classification proceeds in two stages: a preliminary classification by the number of fingertips, followed by a reclassification using the angles formed by the fingers and the distances between finger heels. Specifically, the ten gestures fall into six types by fingertip count: gesture 7 with 0 fingertips; gestures 1 and 10 with 1 fingertip; gestures 2, 6, and 8 with 2 fingertips; gestures 3 and 9 with 3 fingertips; gesture 4 with 4 fingertips; and gesture 5 with 5 fingertips. Gestures 1 and 10 are distinguished by the angle between the horizontal and the line from the finger heel to the center of gravity: an angle smaller than 80° gives gesture 10, and an angle larger than 80° gives gesture 1. When there are 2 fingertips, an included angle of 0-50° between the lines from the two finger heels to the center of gravity gives gesture 2; an angle of 50-80° with a heel-to-heel distance of less than 1.5 times the thumb length gives gesture 8; and an angle of 80-180° with a distance of more than 1.5 times the thumb length gives gesture 6. When there are 3 fingertips, the distances from the middle finger to the other two fingers are computed; if their difference is smaller than a small preset distance (the value should be given in advance), the result is gesture 3, otherwise gesture 9.
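The two-stage decision rules above can be sketched as a small pure-Python classifier. The thresholds are taken from the text; `diff_thresh` is a hypothetical stand-in for the preset distance, and the angle/distance features are assumed to have been computed beforehand from the heel and centroid positions:

```python
def classify_gesture(n_tips, angle=None, heel_gap=None, thumb_len=None,
                     mid_diff=None, diff_thresh=15.0):
    # Stage 1: classify by fingertip count; stage 2: refine with the
    # angle and distance features.  Returns the gesture number 1..10,
    # or None if the inputs do not match any rule.
    if n_tips == 0:
        return 7
    if n_tips == 5:
        return 5
    if n_tips == 4:
        return 4
    if n_tips == 1:
        # angle between horizontal and the heel-to-centroid line
        return 10 if angle < 80 else 1
    if n_tips == 2:
        # angle between the two heel-to-centroid lines
        if angle <= 50:
            return 2
        if angle <= 80 and heel_gap < 1.5 * thumb_len:
            return 8
        return 6
    if n_tips == 3:
        # difference between the middle finger's distances to the outer two
        return 3 if mid_diff < diff_thresh else 9
    return None
```

Because each rule only consumes two or three scalar features, the whole classifier is trivially testable against the decision table in the text.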
The recognition result of gesture 6, for example, is shown in fig. 7.
The technical means disclosed in the invention are not limited to those disclosed in the above embodiments, and also include technical solutions formed by any combination of the above technical features. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the invention, and such improvements and modifications are also considered to be within the scope of the invention.

Claims (7)

1. A static gesture recognition method based on a thermal imager camera is characterized by comprising the following steps:
step 1, acquiring data
Converting the thermal image temperature data acquired by the thermal imager camera into image pixel values through a logarithmic conversion, i.e., mapping the temperature data nonlinearly to image pixel values so that the data are displayed as a temperature image, while widening the range of image pixel values corresponding to temperatures near that of the palm;
wherein in the logarithmic conversion the logarithm used is the inverse of the sigmoid function:
sigmoid_y = ln(sigmoid_x / (1 - sigmoid_x)) (1)
The conversion steps are as follows:
initially setting the displayed temperature interval of the image: temperatures above 37 °C take gray value 255 and temperatures below 13 °C take gray value 0;
letting the maximum temperature maxtemp of the interval correspond to the sigmoid-function abscissa x = 0.7 and the minimum temperature mintemp correspond to x = 0.9, from which the linear formula is calculated as:
sigmoid_x=0.0004*tempreture+1.58 (2)
taking an interval just above and below the mean of the whole thermal image, computing the corresponding sigmoid_x values by formula (2), and computing the interval's maximum sigmoid_y_max and minimum sigmoid_y_min by formula (1);
computing sigmoid_x for the whole thermal image by formula (2), substituting it into formula (1) to obtain sigmoid_y, and finally computing the displayed gray value according to formula (3):
gray = 255 × (sigmoid_y_max - sigmoid_y) / (sigmoid_y_max - sigmoid_y_min) (3);
step 2, image preprocessing
Detecting edges with the Canny edge detection algorithm, filling the resulting edge map with a flood-fill method, and finally performing gesture segmentation by image binarization to obtain the hand shape;
step 3, feature extraction
Extracting the following features from the image processed in step 2: the center-of-gravity moment, the convex hull and its convexity defects, the number of fingertips, and the finger-heel positions, wherein the positions and number of the fingertips are obtained by analyzing the convex hull and its convexity defects;
step 4, gesture recognition
Combining, matching, and screening the three features of center-of-gravity moment, number of fingertips, and finger-heel positions to classify and recognize the gestures.
2. The thermal imager camera-based static gesture recognition method of claim 1, wherein the center-of-gravity moment of step 3 is obtained by formula (4):
M_pq = Σ_x Σ_y x^p y^q I(x, y),  x_c = M_10 / M_00,  y_c = M_01 / M_00 (4).
3. The thermal imager camera-based static gesture recognition method of claim 1, wherein the finger-heel positions in step 3 are obtained as follows: take the bounding quadrilateral of each finger of the gesture to obtain the positions of its four vertices, select the two lower vertices, and take the midpoint of the line connecting them as the position of the corresponding finger heel.
4. The thermal imager camera-based static gesture recognition method of claim 1, wherein in step 3, before the convexity-defect detection, the contour of the binary image is extracted, and the resulting contour is approximated by a polygon.
5. The thermal imager camera-based static gesture recognition method according to claim 1, wherein said step 4 comprises the steps of:
step 4-1, preliminarily classifying the gestures by the number of fingertips;
step 4-2, reclassifying by the angles formed by the fingers and by the distance information between the finger heels.
6. The thermal imager camera-based static gesture recognition method of claim 5, wherein the classifying in step 4-1 comprises: the gesture category with the number of fingertips being 0, the gesture category with the number of fingertips being 1, the gesture category with the number of fingertips being 2, the gesture category with the number of fingertips being 3, the gesture category with the number of fingertips being 4, and the gesture category with the number of fingertips being 5.
7. The thermal imager camera-based static gesture recognition method of claim 5, wherein the step 4-2 is reclassified under the categories obtained in the step 4-1, the distinguished categories comprising:
when the number of fingertips is 1: gestures whose included angle is smaller than 80° and gestures whose included angle is larger than 80°;
when the number of fingertips is 2: gestures in which the lines from the two finger heels to the center of gravity form an included angle of 0-50°; gestures in which the angle is 50-80° and the distance between the two finger heels is less than 1.5 times the thumb length; and gestures in which the angle is 80-180° and the distance is more than 1.5 times the thumb length;
when the number of fingertips is 3: gestures in which the difference between the middle finger's distances to the other two fingers is smaller than a preset distance, and gestures in which it is larger than or equal to that distance.
CN201811200740.4A 2018-10-16 2018-10-16 Static gesture recognition method based on thermal imager camera Active CN109446950B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811200740.4A CN109446950B (en) 2018-10-16 2018-10-16 Static gesture recognition method based on thermal imager camera


Publications (2)

Publication Number Publication Date
CN109446950A CN109446950A (en) 2019-03-08
CN109446950B true CN109446950B (en) 2022-02-15

Family

ID=65544948

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811200740.4A Active CN109446950B (en) 2018-10-16 2018-10-16 Static gesture recognition method based on thermal imager camera

Country Status (1)

Country Link
CN (1) CN109446950B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111626364B (en) * 2020-05-28 2023-09-01 中国联合网络通信集团有限公司 Gesture image classification method, gesture image classification device, computer equipment and storage medium

Citations (3)

Publication number Priority date Publication date Assignee Title
CN104751141A (en) * 2015-03-30 2015-07-01 东南大学 ELM gesture recognition algorithm based on feature image full pixel gray values
CN106814859A (en) * 2017-03-20 2017-06-09 肖赫 A kind of man-machine interaction method of infrared gesture identification
CN107220584A (en) * 2017-03-25 2017-09-29 南宁市广千信息技术有限公司 Infrared gesture recognition system


Non-Patent Citations (1)

Title
Kim S et al., "Tracking and classification of in-air hand gesture based on thermal", Sensors, 2017-01-17, full text *


Similar Documents

Publication Publication Date Title
CN109344701B (en) Kinect-based dynamic gesture recognition method
CN108562589B (en) Method for detecting surface defects of magnetic circuit material
CN109684959B (en) Video gesture recognition method and device based on skin color detection and deep learning
Nai et al. Fast hand posture classification using depth features extracted from random line segments
CN103971102A (en) Static gesture recognition method based on finger contour and decision-making trees
CN110796033B (en) Static gesture recognition method based on bounding box model
CN112634125B (en) Automatic face replacement method based on off-line face database
CN110569782A (en) Target detection method based on deep learning
CN110956099B (en) Dynamic gesture instruction identification method
Rahim et al. Hand gesture recognition based on optimal segmentation in human-computer interaction
Lai et al. Transparent object detection using regions with convolutional neural network
CN108274476B (en) Method for grabbing ball by humanoid robot
CN112906550A (en) Static gesture recognition method based on watershed transformation
CN109446950B (en) Static gesture recognition method based on thermal imager camera
Pradhan et al. A hand gesture recognition using feature extraction
Chen et al. An integrated color and hand gesture recognition approach for an autonomous mobile robot
Chowdhury et al. Scene text detection using sparse stroke information and MLP
Kaur et al. 2-D geometric shape recognition using canny edge detection technique
Simion et al. Finger detection based on hand contour and colour information
KR101357581B1 (en) A Method of Detecting Human Skin Region Utilizing Depth Information
Hagg et al. On recognizing transparent objects in domestic environments using fusion of multiple sensor modalities
CN108255298B (en) Infrared gesture recognition method and device in projection interaction system
US20150199033A1 (en) Method for simulating a graphics tablet based on pen shadow cues
Zarkasi et al. Weightless Neural Networks Face Recognition Learning Process for Binary Facial Pattern
Aminian et al. Face detection using color segmentation and RHT

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant