CN108268819A - A motion gesture detection and recognition method based on skin color detection - Google Patents

A motion gesture detection and recognition method based on skin color detection

Info

Publication number
CN108268819A
CN108268819A
Authority
CN
China
Prior art keywords
gesture
hand
detection
skin color
region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611262542.1A
Other languages
Chinese (zh)
Inventor
钟鸿飞
覃争鸣
杨旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rich Intelligent Science And Technology Ltd Is Reflected In Guangzhou
Original Assignee
Rich Intelligent Science And Technology Ltd Is Reflected In Guangzhou
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rich Intelligent Science And Technology Ltd Is Reflected In Guangzhou
Priority to CN201611262542.1A
Publication of CN108268819A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20: Movements or behaviour, e.g. gesture recognition
    • G06V40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G06F18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411: Classification techniques based on the proximity to a decision surface, e.g. support vector machines
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30196: Human being; Person

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The present invention proposes a motion gesture detection and recognition method based on skin color detection, comprising the following steps. S1, shake detection: detect and locate the position of the hand to find the candidate hand region. S2, hand skin color modelling: apply skin color modelling to the hand region found by shake detection, distinguish the hand from the background, and establish a skin similarity model. S3, gesture tracking: track the gesture with the mean shift algorithm, following and identifying the gesture with a search window. S4, gesture region segmentation: further segment the tracked hand region with the hand skin color model to obtain a binary image of the gesture. S5, gesture recognition: recognize the gesture motion with a support vector machine algorithm.

Description

A motion gesture detection and recognition method based on skin color detection
Technical field
The present invention relates to the field of human-computer interaction, and in particular to a motion gesture detection and recognition method based on skin color detection.
Background technology
To help disabled and elderly people keep in contact and communication with the outside world, improve their ability to live independently, and lighten the burden on families and society, many scientists around the world have begun to explore and develop novel modes of human-computer interaction. Interaction technology in this sense covers both the interaction between a person and an actuator (such as a robot) and the interaction between the actuator and its environment. The significance of the former is that a person can carry out the planning and decision-making that the actuator cannot manage under unknown or uncertain conditions; the significance of the latter is that a robot can complete job tasks in adverse or remote environments that people cannot reach.
Traditional human-computer interaction devices mainly include the keyboard, mouse, handwriting pad, touch screen and game controller, all of which realize human-computer interaction through the hand movements of the user. Gesture interaction supports more natural interaction modes and provides a human-centred rather than device-centred interaction technique, so that users can concentrate on the things and content at hand rather than on the equipment.
Common gesture interaction techniques fall into two categories: those based on data glove sensors and those based on computer vision.
Gesture interaction based on data glove sensors requires the user to wear hardware such as a data glove or position sensors; the sensors collect finger states, motion trajectories and other information for processing, so that the computer can recognize gesture motions and realize various interactive controls. The advantages of this approach are accurate and robust recognition, relatively simple algorithms, little and fast data, and precise capture of the three-dimensional motion of the hand, completely free of the interference that troubles vision systems, such as changing ambient illumination and complex backgrounds. The disadvantages are that the equipment is cumbersome to wear, costly and inconvenient to operate, and that gesture motion is constrained to some extent, making it difficult to deploy at scale in actual production.
Gesture interaction based on computer vision processes and recognizes the gesture image sequence collected by a camera through machine vision, and thereby interacts with the computer. This method acquires gesture information with a camera, then segments the hand with a skin color model to accomplish gesture detection and recognition, and finally tracks the motion gesture with the inter-frame difference method. Its effectiveness depends on the accuracy of the skin color model; however, people's skin colors differ, and a general, efficient skin color model is hard to obtain. In addition, when the hand moves at an uneven speed, tracking the gesture with the inter-frame difference method suffers interruptions and loses the tracked gesture.
Summary of the invention
The purpose of the present invention is to overcome the deficiencies of the prior art, and in particular to solve two problems of the existing gesture interaction technology based on computer vision: it is difficult to establish a general, efficient skin color model for skin tones that differ from person to person, and motion tracking with the inter-frame difference method suffers interruptions.
To solve the above technical problems, the present invention proposes a motion gesture detection and recognition method based on skin color detection, whose key steps include:
S1, shake detection: detect and locate the position of the hand to find the candidate hand region;
S2, hand skin color modelling: apply skin color modelling to the hand region found by shake detection, distinguish the hand from the background, and establish a skin similarity model;
S3, gesture tracking: track the gesture with the mean shift algorithm, following and identifying the gesture with a search window;
S4, gesture region segmentation: further segment the tracked hand region with the hand skin color model to obtain a binary image of the gesture;
S5, gesture recognition: recognize the gesture motion with a support vector machine algorithm.
Compared with the prior art, the present invention has the following advantageous effects:
The present scheme collects acceleration data with an acceleration sensor and computes tilt angles to eliminate the interference of head jitter or movement on the extraction of hand coordinates, finds the gesture contour with connected regions, filters out non-gesture regions by searching for fingertip points and the palm-center point, and finally tracks the motion gesture with a simplified mean shift process, which reduces the interference caused by body jitter while avoiding tracking interruptions.
Description of the drawings
Fig. 1 is a flowchart of one embodiment of the motion gesture detection and recognition method based on skin color detection of the present invention.
Fig. 2 is a schematic diagram of the acceleration tilt angles of the embodiment of the present invention.
Fig. 3 compares the principles of four-connected and eight-connected regions in the embodiment of the present invention.
Fig. 4 is a schematic diagram of the fingertip point set of the embodiment of the present invention.
Fig. 5 is the program flowchart of the motion gesture detection of the embodiment of the present invention.
Specific embodiment
The present invention is explained below in further detail, in conjunction with the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described here only serve to explain the present invention and do not limit it.
Referring to Fig. 1, the key steps of the motion gesture detection and recognition method based on skin color detection of the embodiment of the present invention include:
S1, shake detection: detect and locate the position of the hand to find the candidate hand region.
Referring to Fig. 2, when the hand shakes, the average brightness of the pixels in the region it passes through fluctuates continuously and violently. No other region of the image shows this variation. The system therefore only needs to find, within a sequence of consecutive input frames, the regions whose change is largest to obtain the approximate position where the hand shakes.
First, the system divides the image of each frame into several sub-blocks of size m × n. For each sub-block (i, j) of frame t, its degree of variation over 10 consecutive frames is computed as S(i, j, t):
S(i, j, t) = Σn=0..9 w(n) · |I(i, j, t−n) − I(i, j, t−n−1)| (1)
Here I(i, j, t) denotes the average brightness of sub-block (i, j) in frame t, and w(n), n = 0, …, 9 are the weights. Referring to Fig. 3, I(i, j, t) is defined as:
I(i, j, t) = (1 / (m × n)) · Σ(p,q)∈(i,j) luminance(p, q, t)
where luminance(p, q, t) denotes the brightness of pixel (p, q) in frame t. To reflect the influence of time when accumulating each block's degree of variation over 10 consecutive frames, each frame is assigned a weight w(n), n = 0, …, 9, that increases with time.
After the above computation, the block with the largest S(t) is the region that changed most violently within the most recent 10 frames.
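The shake-detection scoring above can be sketched in code. This is an illustrative reconstruction, not the patent's implementation: the function name `shake_scores`, the default block size, and the linearly increasing weights w(n) are assumptions (the patent does not specify w(n)), and frames are assumed to be grayscale numpy arrays.

```python
import numpy as np

def shake_scores(frames, m=16, n=16, weights=None):
    """Score how strongly each m x n sub-block changes across a clip.

    frames : sequence of 2-D grayscale images, all the same shape.
    Returns S, where S[i, j] is the time-weighted accumulated change of
    the mean brightness I(i, j, t) of sub-block (i, j).
    """
    frames = np.asarray(frames, dtype=np.float64)
    t, h, w = frames.shape
    if weights is None:
        # Assumed w(n): grows with time, so recent changes count more.
        weights = np.arange(1, t, dtype=np.float64)
    gi, gj = h // m, w // n
    # I(i, j, t): mean brightness of sub-block (i, j) in frame t.
    blocks = frames[:, :gi * m, :gj * n].reshape(t, gi, m, gj, n).mean(axis=(2, 4))
    # |I(i, j, t) - I(i, j, t-1)| between consecutive frames.
    diffs = np.abs(np.diff(blocks, axis=0))
    # Weighted accumulation over the clip -> S(i, j).
    return np.tensordot(weights, diffs, axes=(0, 0))
```

The sub-block with the largest score, `np.unravel_index(np.argmax(S), S.shape)`, is the candidate shake region handed to the subsequent skin color check.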
The above process is based entirely on motion. A relatively simple skin color decision rule is then used to find the candidate hand region according to skin color: if a considerable number of pixels around a block resemble empirical skin color, the surroundings of that block are judged to be the hand region.
S2, hand skin color modelling: apply skin color modelling to the hand region found by shake detection, distinguish the hand from the background, and establish a skin similarity model.
A rectangular skin color model is used, defined by linear equations that fix the maximum and minimum values of Cb and Cr. The rectangular model can be expressed with four straight lines L1, L2, L3, L4 as follows:
L1: Cb×T1+T2 < Cr (2)
L2: Cb×T3+T4 < Cr (3)
L3: Cb×T5+T6 > Cr (4)
L4: Cb×T7+T8 > Cr (5)
where T1 = −1.22265625, T2 = 267.3330078125, T3 = 0.875, T4 = 29.375, T5 = −1.3330078125, T6 = 316.3330078125, T7 = 0.064453125, T8 = 170.612903225. The parameters of the above linear segmentation model are obtained by offline training on a limited set of images, and the offline-trained parameters are used to detect skin color under different application scenarios. When the chrominance of an image pixel falls within the rectangle, the pixel is taken as human skin. Fig. 4 shows the skin color detection result of this embodiment.
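A minimal sketch of the four-line rectangle test, using the thresholds T1–T8 quoted above (reading the second line as Cb×T3+T4 < Cr, consistent with the parameter list); the function name `is_skin` and the use of numpy are assumptions:

```python
import numpy as np

# Thresholds quoted in the description.
T1, T2 = -1.22265625, 267.3330078125
T3, T4 = 0.875, 29.375
T5, T6 = -1.3330078125, 316.3330078125
T7, T8 = 0.064453125, 170.612903225

def is_skin(cb, cr):
    """True where (Cb, Cr) falls inside the four bounding lines L1-L4."""
    cb = np.asarray(cb, dtype=np.float64)
    cr = np.asarray(cr, dtype=np.float64)
    return ((cb * T1 + T2 < cr) &   # L1: lower bound
            (cb * T3 + T4 < cr) &   # L2: lower bound
            (cb * T5 + T6 > cr) &   # L3: upper bound
            (cb * T7 + T8 > cr))    # L4: upper bound
```

Applied elementwise to the Cb and Cr planes of a YCbCr image, this yields the binary skin mask used for segmentation.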
S3, gesture tracking: track the gesture with the mean shift algorithm, following and identifying the gesture with a search window.
An initial search window is input, namely the region located by the shake detector. Then:
(1) Compute the skin color probability map of the search window.
(2) Compute the zeroth-order moment M00 and the first-order moments M10, M01 of the skin color probability:
M00 = Σx Σy p(x, y), M10 = Σx Σy x·p(x, y), M01 = Σx Σy y·p(x, y)
(3) Compute the position (xc, yc) of the high-probability skin color centroid within the search window:
xc = M10 / M00, yc = M01 / M00
(4) Compute the size of the high-probability skin color region in the search window.
(5) Adjust the centre and size of the search window according to the size of the high-probability skin color region.
(6) Repeat steps (1)–(5) until the change of the search window's centre and size in one adjustment falls below some threshold. At that point, the position of the high-probability skin color centroid is the position of the tracked object (the hand).
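Steps (1)–(6) can be sketched as a plain mean shift loop over a precomputed skin color probability map. This is a hedged illustration, not the patent's code: the window size is kept fixed here (the patent also adapts the size), and the names `mean_shift_track`, `max_iter` and `eps` are invented.

```python
import numpy as np

def mean_shift_track(prob, window, max_iter=20, eps=1.0):
    """One mean shift search on a skin color probability map.

    prob   : 2-D array, prob[y, x] = skin probability of pixel (x, y).
    window : (x, y, w, h) initial search window from the shake detector.
    Returns the converged window, centred on the high-probability centroid.
    """
    x, y, w, h = window
    for _ in range(max_iter):
        roi = prob[y:y + h, x:x + w]
        m00 = roi.sum()                        # zeroth-order moment M00
        if m00 == 0:
            break
        ys, xs = np.mgrid[0:roi.shape[0], 0:roi.shape[1]]
        m10 = (xs * roi).sum()                 # first-order moment M10
        m01 = (ys * roi).sum()                 # first-order moment M01
        xc, yc = m10 / m00, m01 / m00          # centroid inside the ROI
        nx = int(round(x + xc - w / 2))        # recentre the window
        ny = int(round(y + yc - h / 2))
        nx = min(max(nx, 0), prob.shape[1] - w)
        ny = min(max(ny, 0), prob.shape[0] - h)
        if abs(nx - x) < eps and abs(ny - y) < eps:
            break                              # converged
        x, y = nx, ny
    return x, y, w, h
```

With the window initialized to overlap the skin blob, the loop drifts the window onto the blob's centroid within a few iterations.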
S4, gesture region segmentation: further segment the tracked hand region with the hand skin color model to obtain a binary image of the gesture. The skin color model of step S2 is applied again to judge each pixel in the tracked hand region: if the pixel value lies within the skin color model it is judged to belong to the hand region, otherwise to the background. After this judgement, noise causes some hand-region pixels to be mistaken for background, so morphological dilation and erosion operations are needed to achieve a complete segmentation of the hand region. The process is:
(1) The erosion operator is Θ; the erosion of a set A by a structuring element B is defined as:
A Θ B = { z | (B)z ⊆ A }
(2) The dilation operator is ⊕; the dilation of a set A by a structuring element B is defined as:
A ⊕ B = { z | (B̂)z ∩ A ≠ ∅ }
Using the dilation–erosion gradient operator, i.e. subtracting the eroded image from the dilated image, the edges in the image are obtained. Since these edges are not single-pixel-wide connected curves, they must further be thinned with a region skeleton extraction algorithm.
1) Let A be the image, let S(A) denote its skeleton, and let B be the structuring element; then:
S(A) = ∪k=0..K Sk(A)
where K is the number of erosions of A before it becomes the empty set, i.e.:
K = max{ k | A Θ kB ≠ ∅ }
Sk(A) is called the k-th skeleton subset and can be written as:
Sk(A) = (A Θ kB) − (A Θ kB) ∘ B
where ∘ denotes the morphological opening.
Here A Θ kB denotes eroding A with B k successive times. The final result of this processing is the binary image of the gesture.
S5, gesture recognition: recognize the gesture motion with a support vector machine algorithm.
All gesture features form an n-dimensional feature vector x with class label w. By defining a separating hyperplane, the discriminant function of the two classes is obtained from w·x + b = 0. To maximize the margin, two parallel hyperplanes w·x + b = 1 and w·x + b = −1 are defined passing through the support vectors, with no training patterns between them. Every training pattern xi must then satisfy the following inequality:
wi(w·xi + b) ≥ 1 (15)
The distance between these two hyperplanes is 2/||w||, so maximizing the margin means minimizing ||w||. Stating this minimization problem with the Lagrange principle simplifies the optimization, and the discriminant function finally obtained is:
f(x) = sgn(Σi αi wi (xi·x) + b) (16)
Using the kernel trick, the method is generalized to problems that are not linearly separable: the dot product in the linear support vector classifier is replaced with a non-linear kernel function:
k(xi,xj)=Φ (xi)·Φ(xj) (17)
The resulting discriminant function is:
f(x) = sgn(Σi αi wi k(xi, x) + b) (18)
In practical operation, this embodiment uses 150 groups of gesture images as training samples and trains a gesture classifier with the support vector machine method, where the non-linear kernel is the sigmoid function k(xi, xj) = tanh(γ(xi·xj) + c), and then classifies 50 groups of test gesture images.
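The kernel discriminant function of equation (18) with the sigmoid kernel can be sketched as below. This is an illustration under stated assumptions: the support vectors, multipliers αi, labels wi, and the kernel parameters γ and c are hypothetical values, not the classifier trained on the 150 gesture images.

```python
import numpy as np

def sigmoid_kernel(xi, xj, gamma=0.5, c=0.0):
    """Sigmoid kernel k(xi, xj) = tanh(gamma * (xi . xj) + c)."""
    return np.tanh(gamma * np.dot(xi, xj) + c)

def svm_decision(x, support_vectors, labels, alphas, b, kernel=sigmoid_kernel):
    """Kernel SVM discriminant: sgn(sum_i alpha_i * w_i * k(x_i, x) + b)."""
    s = sum(a * w * kernel(sv, x)
            for a, w, sv in zip(alphas, labels, support_vectors))
    return 1 if s + b >= 0 else -1

# Hypothetical support vectors for two gesture classes (+1 / -1).
svs = [np.array([1.0, 1.0]), np.array([-1.0, -1.0])]
labels = [1, -1]
alphas = [1.0, 1.0]
bias = 0.0
```

A new feature vector is classified by `svm_decision(x, svs, labels, alphas, bias)`; in practice the αi and b would come from solving the margin-maximization problem of equations (15)–(16).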

Claims (1)

1. A motion gesture detection and recognition method based on skin color detection, characterized by comprising the following steps:
S1, shake detection: detect and locate the position of the hand to find the candidate hand region;
S2, hand skin color modelling: apply skin color modelling to the hand region found by shake detection, distinguish the hand from the background, and establish a skin similarity model;
S3, gesture tracking: track the gesture with the mean shift algorithm, following and identifying the gesture with a search window;
S4, gesture region segmentation: further segment the tracked hand region with the hand skin color model to obtain a binary image of the gesture;
S5, gesture recognition: recognize the gesture motion with a support vector machine algorithm.
CN201611262542.1A 2016-12-31 2016-12-31 A motion gesture detection and recognition method based on skin color detection Pending CN108268819A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611262542.1A CN108268819A (en) 2016-12-31 2016-12-31 A motion gesture detection and recognition method based on skin color detection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611262542.1A CN108268819A (en) 2016-12-31 2016-12-31 A motion gesture detection and recognition method based on skin color detection

Publications (1)

Publication Number Publication Date
CN108268819A true CN108268819A (en) 2018-07-10

Family

ID=62755226

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611262542.1A Pending CN108268819A (en) 2016-12-31 2016-12-31 A motion gesture detection and recognition method based on skin color detection

Country Status (1)

Country Link
CN (1) CN108268819A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110163055A (en) * 2018-08-10 2019-08-23 腾讯科技(深圳)有限公司 Gesture identification method, device and computer equipment


Similar Documents

Publication Publication Date Title
Zhou et al. A novel finger and hand pose estimation technique for real-time hand gesture recognition
Qi et al. Computer vision-based hand gesture recognition for human-robot interaction: a review
Hasan et al. RETRACTED ARTICLE: Static hand gesture recognition using neural networks
Shenoy et al. Real-time Indian sign language (ISL) recognition
CN110221699B (en) Eye movement behavior identification method of front-facing camera video source
Abiyev et al. Head mouse control system for people with disabilities
CN108595008B (en) Human-computer interaction method based on eye movement control
Sonkusare et al. A review on hand gesture recognition system
Yang et al. Hand gesture recognition: An overview
CN110688965A (en) IPT (inductive power transfer) simulation training gesture recognition method based on binocular vision
Kalsh et al. Sign language recognition system
Suresh et al. Real-time hand gesture recognition using deep learning
CN108614988A (en) A kind of motion gesture automatic recognition system under complex background
KR102052449B1 (en) System for virtual mouse and method therefor
Elakkiya et al. Intelligent system for human computer interface using hand gesture recognition
Sokhib et al. A combined method of skin-and depth-based hand gesture recognition.
Półrola et al. Real-time hand pose estimation using classifiers
CN108255285A (en) It is a kind of based on the motion gesture detection method that detection is put between the palm
Khan et al. Computer vision based mouse control using object detection and marker motion tracking
CN108268125A (en) A kind of motion gesture detection and tracking based on computer vision
CN108268819A (en) A kind of motion gesture detection and recognition methods based on Face Detection
Dutta et al. A hand gesture-operated system for rehabilitation using an end-to-end detection framework
Desai Segmentation and recognition of fingers using Microsoft Kinect
Raza et al. An integrative approach to robust hand detection using CPM-YOLOv3 and RGBD camera in real time
Ghodichor et al. Virtual mouse using hand gesture and color detection

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20180710