CN106055106A - Leap Motion-based dominant point detection and recognition method - Google Patents

Leap Motion-based dominant point detection and recognition method Download PDF

Info

Publication number
CN106055106A
CN106055106A (application CN201610391403.2A); granted as CN106055106B
Authority
CN
China
Prior art keywords
delta
max
vertex
phi
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610391403.2A
Other languages
Chinese (zh)
Other versions
CN106055106B (en)
Inventor
刘宏哲
袁家政
张雪鉴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Union University
Original Assignee
Beijing Union University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Union University filed Critical Beijing Union University
Priority to CN201610391403.2A priority Critical patent/CN106055106B/en
Publication of CN106055106A publication Critical patent/CN106055106A/en
Application granted granted Critical
Publication of CN106055106B publication Critical patent/CN106055106B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a Leap Motion-based dominant point detection and recognition method, belonging to the field of human-computer interaction in computer systems. The method first acquires gesture vertices and builds a gesture library to validate an improved non-parameter-controlled dominant point detection algorithm. All points between the first point and the last point are given, and the first and last points are connected to form a line segment; among the given points, the point farthest from this segment is found, and if its distance exceeds ε the point is kept, otherwise it is rejected. This procedure is repeated, finally yielding the optimized polyline. The parameter ε is obtained adaptively by the improved non-parameter-controlled dominant point detection algorithm. The algorithm is applied to the Leap Motion motion-sensing controller and can be extended to more gestures. The method has good adaptivity and accuracy for gesture recognition, recognizes gestures more accurately, and has broad applications in human-computer interaction.

Description

A Leap Motion-based dominant point detection and recognition method
Technical field
A Leap Motion-based dominant point detection and recognition method, belonging to the field of human-computer interaction in computer systems.
Background technology
As human-computer interaction technology is continually applied in new ways, virtual reality technology has greatly enhanced the presentation and interactivity of museum exhibits. Among these technologies the Leap Motion controller is relatively widely used, but the gestures available are limited to those in the Leap Motion SDK. To solve this problem, gesture vertices need to be better optimized, and dominant point detection can be used to obtain more accurate vertices.
Dominant point detection has been studied by many scholars in recent years, and many such algorithms exist; by method they can be divided into dynamic programming, splitting methods, combined methods, digital-line methods, breakpoint compression, and curvature-based methods. All of these algorithms use a preset parameter ε to control which points are kept, and are therefore called parameter-controlled dominant point detection algorithms: a point's fate is decided by its offset from the line segment joining the first and last points. However, parameter-controlled dominant point detection requires the parameter to be set in advance, and a parameter fixed for segments of different lengths has no adaptivity.
Summary of the invention
In view of the problems above, the present invention proposes a non-parameter-controlled dominant point detection algorithm, with the aim of applying the invention to the virtual-antique gesture navigation of museums, accurately recognizing the operator's gestures, and improving human-computer interaction.
To achieve these goals, the invention adopts the following technical scheme:
One: acquire gesture vertices. Leap Motion is used to obtain hand images; the five fingers of the hand, or any single finger, are treated as vertices and their data are recorded.
Two: build a gesture library to validate the improved non-parameter-controlled dominant point detection algorithm. Vertical polyline, horizontal polyline, and square gestures are built.
Three: use the improved non-parameter-controlled dominant point detection algorithm to select and optimize the gesture vertices. First, all points between the first point and the last point are given, and the first and last points are connected to form a line segment. Then, among the given points, the point farthest from this segment is found; if its distance exceeds ε the point is kept, otherwise it is discarded. This procedure is repeated, finally yielding the optimized polyline. The parameter ε is obtained adaptively by the improved non-parameter-controlled dominant point detection algorithm.
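The splitting step described above is the classic Ramer-Douglas-Peucker (RDP) scheme that the patent builds on. A minimal sketch in Python (an illustration, not the patent's exact implementation; the function names are ours, and ε is still passed in by hand here, since the adaptive choice of ε comes later):

```python
import math

def point_segment_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    chord_len = math.hypot(dx, dy)
    if chord_len == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dy * (px - ax) - dx * (py - ay)) / chord_len

def rdp(points, eps):
    """Classic parameter-controlled RDP: keep an interior point iff its
    distance to the first-last chord exceeds eps, then recurse on both halves."""
    if len(points) < 3:
        return list(points)
    dists = [point_segment_distance(p, points[0], points[-1])
             for p in points[1:-1]]
    d_max = max(dists)
    i_max = dists.index(d_max) + 1
    if d_max <= eps:
        return [points[0], points[-1]]  # all interior points rejected
    left = rdp(points[:i_max + 1], eps)
    right = rdp(points[i_max:], eps)
    return left + right[1:]  # drop the duplicated split point
```

A sharp corner such as `(3, 5)` in a near-flat polyline survives, while small jitter below ε is smoothed away.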
Four: connect the optimized vertices and match them against the gestures in the gesture library to complete recognition.
Compared with existing detection methods, the present invention has the following advantages: (1) the non-parameter-controlled dominant point detection algorithm has better resolution for complex gestures; (2) because operators differ from one another, the algorithm can recognize gestures at more scales, and the recognition results are accurate; (3) the algorithm is robust to the noise produced by the device at different gesture scales, as shown in the drawings.
Brief description of the drawings
Fig. 1: system flow chart.
Fig. 2: vertical polyline gesture recognition schematic.
Fig. 3: horizontal polyline gesture recognition schematic.
Fig. 4: square gesture recognition schematic.
Fig. 5: denoising schematic of the improved non-parameter-controlled dominant point detection algorithm.
Detailed description of the invention
The invention is described further below in conjunction with the accompanying drawings:
1. Obtain vertices
The hand image of each frame is obtained by the Leap Motion motion-sensing controller, which accurately identifies the orientation of the fingers and the normal vector of the hand, so that every detail of the hand is presented. We take the mean position of the fingertips in each frame as the gesture vertex.
2. Vertex processing
The dynamic gesture first undergoes a vertex-processing step. Leap Motion records every tracked position as a gesture vertex, but the operator must first move the hand into the Leap Motion recognition region before making a gesture, so the vertices produced during this movement would corrupt the computed parameter value ε and need to be discarded. Empirically, the first two vertices are discarded. After removing these first two gesture vertices, the parameter computation is more reliable. The remaining vertices are denoted {P1, P2, … Pk} (k ∈ N+).
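The vertex-processing rule above, dropping the first two recorded vertices, can be sketched as follows (illustrative only; the function name is ours):

```python
def preprocess_vertices(raw_vertices):
    """Drop the first two vertices, which per the patent's empirical rule are
    produced while the hand moves into the Leap Motion tracking region."""
    return list(raw_vertices[2:])
```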
3. Parameter calculation
Connect P1 and Pk to obtain line segment l, compute the distance s = |P1Pk| and the slope m, and compute ε. The concrete method is as follows:
The key problem the Leap Motion-based non-parametric RDP detection algorithm must solve is choosing a suitable parameter value ε automatically. We adaptively control the parameter according to the length and slope of each line segment and the angle between segments. Following standard image processing, vertex coordinates are rounded to the integer grid:
$$x' = \mathrm{round}(x);\qquad y' = \mathrm{round}(y) \quad (1)$$
x and y are the horizontal and vertical coordinates of a vertex; rounding them gives P'(x', y'), which approximates the true vertex P(x, y).
$$x', y' \in \mathbb{Z} \quad (2)$$
Z is the set of integers.
$$x' = x + \Delta x;\qquad y' = y + \Delta y \quad (3)$$
$$-0.5 \le \Delta x \le 0.5,\qquad -0.5 \le \Delta y \le 0.5 \quad (4)$$
Δxk and Δyk are the offsets of the horizontal and vertical coordinates of vertex Pk. Define the slope of P1P2 as m and the slope of P1'P2' as m':
$$m = \tan\varphi = \frac{y_2 - y_1}{x_2 - x_1} \quad (5)$$
$$m' = \frac{y_2' - y_1'}{x_2' - x_1'} = \left(m + \frac{\Delta y_2 - \Delta y_1}{x_2 - x_1}\right)\bigg/\left(1 + \frac{\Delta x_2 - \Delta x_1}{x_2 - x_1}\right) \quad (6)$$
The angular difference between the direction computed from the original points and the direction computed from the approximated points estimates the error:
$$\partial\varphi = \left|\tan^{-1} m - \tan^{-1} m'\right| = \left|\tan^{-1}\frac{m - m'}{1 + mm'}\right| \quad (7)$$
Substituting (6) into (7) gives:
$$\partial\varphi = \left|\tan^{-1}\frac{m(\Delta x_2 - \Delta x_1) - (\Delta y_2 - \Delta y_1)}{(1 + m^2)(x_2 - x_1) + (\Delta x_2 - \Delta x_1) + m(\Delta y_2 - \Delta y_1)}\right| \quad (8)$$
Define s and t as follows:
$$s = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2} \quad (9)$$
$$t = \frac{(\Delta x_2 - \Delta x_1)(x_2 - x_1)}{s^2} + \frac{(\Delta y_2 - \Delta y_1)(y_2 - y_1)}{s^2} \quad (10)$$
Substituting into (8) gives:
$$\partial\varphi = \left|\tan^{-1}\left(\frac{x_2 - x_1}{s^2}\,(1 + t)^{-1}\left(m(\Delta x_2 - \Delta x_1) - (\Delta y_2 - \Delta y_1)\right)\right)\right| \quad (11)$$
From formula (4), the maximum of Δx + Δy is 1.
From formula (9), |(x2 − x1)/s| and |(y2 − y1)/s| are both at most 1, and s is always greater than √2, so |t| < 1 and (1 + t)⁻¹ can be expanded as a geometric series (n a non-negative integer):
$$\partial\varphi = \left|\tan^{-1}\left(\frac{x_2 - x_1}{s^2}\left(m(\Delta x_2 - \Delta x_1) - (\Delta y_2 - \Delta y_1)\right)\sum_{n=0}^{\infty}(-t)^n\right)\right| \quad (12)$$
When |Δx2 − Δx1| = |Δy2 − Δy1| = 1, t attains its maximum; by formula (5), the maximum of t is:
$$t_{\max} = \frac{1}{s}\left(\left|\cos\varphi\right| + \left|\sin\varphi\right|\right) \quad (13)$$
Thus the maximum of ∂φ is obtained:
$$\partial\varphi_{\max} = \max\left(\tan^{-1}\left(\frac{1}{s}\left|\sin\varphi \pm \cos\varphi\right|\,\left|\sum_{n=0}^{\infty}(-t_{\max})^n\right|\right)\right) \quad (14)$$
Because t_max ≤ 1, formula (14) can be written as:
$$\partial\varphi_{\max} = \max\left(\tan^{-1}\left(\frac{1}{s}\left|\sin\varphi \pm \cos\varphi\right|\left(1 - t_{\max} + t_{\max}^2\right)\right)\right) + O(t_{\max}^3) \quad (15)$$
∂φ_max is the upper bound on the error growth rate. During each iteration of the RDP algorithm, formula (14) is used to obtain the threshold ε.
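Equations (9), (13), and (15) above give a computable per-segment error bound. A hedged sketch (the function name is ours; the text does not fully spell out how ε is derived from ∂φ_max, so the code only computes the bound itself):

```python
import math

def adaptive_angle_bound(p1, p2):
    """Upper bound on the angular error dphi_max introduced by rounding the
    chord endpoints to integer coordinates, following equations (9), (13),
    and (15): s is the chord length, t_max = (|cos phi| + |sin phi|)/s, and
    the geometric series for (1+t)^-1 is truncated after the quadratic term."""
    (x1, y1), (x2, y2) = p1, p2
    s = math.hypot(x2 - x1, y2 - y1)                 # eq. (9), chord length
    phi = math.atan2(y2 - y1, x2 - x1)               # chord inclination
    t_max = (abs(math.cos(phi)) + abs(math.sin(phi))) / s   # eq. (13)
    series = 1.0 - t_max + t_max ** 2                # truncated sum of (-t_max)^n
    # eq. (15): take the larger of the sin+cos / sin-cos branches
    branch = max(abs(math.sin(phi) + math.cos(phi)),
                 abs(math.sin(phi) - math.cos(phi)))
    return math.atan((branch / s) * series)
```

As the derivation predicts, the bound shrinks as the chord grows longer, which is what makes the threshold scale-adaptive.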
4. Parameter processing
To improve the robustness of the non-parametric RDP detection algorithm against the noise produced by Leap Motion, we add a parameter-selection step to each iteration of the algorithm: empirically, if the newly computed parameter ε is less than one third of the previous parameter ε, it is discarded. We found that this processing gives good robustness to the noise produced by Leap Motion; as shown by the red circle in Fig. 5, the circled part is what must be discarded during detection, because it does not belong to the gesture.
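The one-third rejection rule can be expressed as a small helper (illustrative; the name is ours, and falling back to the previous threshold when the new one is rejected is our reading of "discarded"):

```python
def accept_epsilon(eps_new, eps_prev):
    """Noise-rejection rule: keep the previous threshold when the new one
    collapses below one third of it, per the patent's empirical rule."""
    if eps_prev is not None and eps_new < eps_prev / 3.0:
        return eps_prev
    return eps_new
```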
5. Calculate the maximum vertex distance
After drawing segment l through P1Pk and computing ε, we compute the distance to segment l from every vertex between P1 and Pk, and find the vertex Pmax farthest from l, at maximum distance dmax. If dmax ≤ ε, Pmax is discarded; otherwise Pmax is kept, P1Pmax is connected to obtain segment l(1,max), and PmaxPk is connected to obtain segment l(max,k).
Steps 3 (parameter calculation), 4 (parameter processing), and 5 (maximum vertex distance) are then iterated until all vertices {P1, P2, … Pk} have been processed, finally completing the detection of all vertices.
6. Match against the gesture library
The optimized vertices are connected in order to generate the gesture graph, which is matched against the gestures in the gesture library. Experiments show that gestures detected with the Leap Motion-based non-parametric RDP detection algorithm are more robust than those detected with the standard RDP algorithm.
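The patent does not spell out its matching procedure; one plausible sketch compares the optimized polyline with library templates via quantized segment directions (the function names, the library layout, and the 8-way quantization are all our assumptions):

```python
import math

def match_gesture(vertices, gesture_library):
    """Illustrative matcher: encode a polyline as a sequence of 8-way
    quantized segment directions and return the first library gesture whose
    code sequence matches exactly, or None when nothing matches."""
    def encode(pts):
        codes = []
        for (x1, y1), (x2, y2) in zip(pts, pts[1:]):
            ang = math.atan2(y2 - y1, x2 - x1)
            codes.append(round(ang / (math.pi / 4)) % 8)  # 0..7 directions
        return codes
    target = encode(vertices)
    for name, template in gesture_library.items():
        if encode(template) == target:
            return name
    return None
```

Because only directions are compared, the match is scale-invariant, which fits the multi-scale claim of the method.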
Advantages: the RDP algorithm gives good gesture recognition results, but in a high-traffic scene such as a museum, every operator has different operating habits and draws gestures of different sizes. The RDP algorithm is a fixed-parameter detection method without multi-scale gesture recognition, so it is not suitable for busy museum scenes. The proposed Leap Motion-based non-parametric RDP detection algorithm has better resolution for complex gestures; compared with parametric RDP detection it is multi-scale and adapts better to different operators' gestures; and at different gesture scales it remains robust to the noise produced by hand jitter, so it places low demands on how precisely operators perform gestures and is friendlier to operate.

Claims (1)

1. A Leap Motion-based dominant point detection and recognition method, characterized in that it comprises the following steps:
1) Obtain vertices
The hand image of each frame is obtained by the Leap Motion motion-sensing controller, and the mean position of the fingertips in each frame is taken as the gesture vertex;
2) Vertex processing
The dynamic gesture first undergoes vertex processing; after removing the first two gesture vertices, the remaining vertices are denoted {P1, P2, … Pk};
3) Parameter calculation
Connect P1Pk to obtain the first line segment, compute the distance s = |P1Pk| and the slope m, and compute ε; specifically:
$$x' = \mathrm{round}(x);\qquad y' = \mathrm{round}(y) \quad (1)$$
x and y are the horizontal and vertical coordinates of a vertex; formula (1) rounds them, and the rounded vertex P'(x', y') approximates the true vertex P(x, y);
$$x', y' \in \mathbb{Z} \quad (2)$$
Z is the set of integers;
$$x' = x + \Delta x;\qquad y' = y + \Delta y \quad (3)$$
$$-0.5 \le \Delta x \le 0.5,\qquad -0.5 \le \Delta y \le 0.5 \quad (4)$$
Δxk and Δyk are the offsets of the horizontal and vertical coordinates of vertex Pk; define the slope of P1P2 as m and the slope of P1'P2' as m';
$$m = \tan\varphi = \frac{y_2 - y_1}{x_2 - x_1} \quad (5)$$
$$m' = \frac{y_2' - y_1'}{x_2' - x_1'} = \left(m + \frac{\Delta y_2 - \Delta y_1}{x_2 - x_1}\right)\bigg/\left(1 + \frac{\Delta x_2 - \Delta x_1}{x_2 - x_1}\right) \quad (6)$$
The angular difference between the direction computed from the original points and the direction computed from the approximated points estimates the error:
$$\partial\varphi = \left|\tan^{-1} m - \tan^{-1} m'\right| = \left|\tan^{-1}\frac{m - m'}{1 + mm'}\right| \quad (7)$$
Substituting (6) into (7) gives:
$$\partial\varphi = \left|\tan^{-1}\frac{\left(1 + \frac{\Delta x_2 - \Delta x_1}{x_2 - x_1}\right)m - \left(m + \frac{\Delta y_2 - \Delta y_1}{x_2 - x_1}\right)}{\left(1 + \frac{\Delta x_2 - \Delta x_1}{x_2 - x_1}\right) + m\left(m + \frac{\Delta y_2 - \Delta y_1}{x_2 - x_1}\right)}\right| = \left|\tan^{-1}\frac{m(\Delta x_2 - \Delta x_1) - (\Delta y_2 - \Delta y_1)}{(1 + m^2)(x_2 - x_1) + (\Delta x_2 - \Delta x_1) + m(\Delta y_2 - \Delta y_1)}\right| \quad (8)$$
Define s and t as follows:
$$s = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2} \quad (9)$$
$$t = \frac{(\Delta x_2 - \Delta x_1)(x_2 - x_1)}{s^2} + \frac{(\Delta y_2 - \Delta y_1)(y_2 - y_1)}{s^2} \quad (10)$$
Substituting into (8) gives:
$$\partial\varphi = \left|\tan^{-1}\left(\frac{x_2 - x_1}{s^2}\,(1 + t)^{-1}\left(m(\Delta x_2 - \Delta x_1) - (\Delta y_2 - \Delta y_1)\right)\right)\right| \quad (11)$$
From formula (4), the maximum of Δx + Δy is 1;
From formula (9), |(x2 − x1)/s| and |(y2 − y1)/s| are both at most 1, and s is always greater than √2, so |t| < 1 and (1 + t)⁻¹ expands as a geometric series, giving:
$$\partial\varphi = \left|\tan^{-1}\left(\frac{x_2 - x_1}{s^2}\left(m(\Delta x_2 - \Delta x_1) - (\Delta y_2 - \Delta y_1)\right)\sum_{n=0}^{\infty}(-t)^n\right)\right| \quad (12)$$
When |Δx2 − Δx1| = |Δy2 − Δy1| = 1, t attains its maximum; by formula (5), the maximum of t is obtained:
$$t_{\max} = \frac{1}{s}\left(\left|\cos\varphi\right| + \left|\sin\varphi\right|\right) \quad (13)$$
Thus the maximum of ∂φ is obtained:
$$\partial\varphi_{\max} = \max\left(\tan^{-1}\left(\frac{1}{s}\left|\sin\varphi \pm \cos\varphi\right|\,\left|\sum_{n=0}^{\infty}(-t_{\max})^n\right|\right)\right) \quad (14)$$
Because t_max ≤ 1, formula (14) is written as:
$$\partial\varphi_{\max} = \max\left(\tan^{-1}\left(\frac{1}{s}\left|\sin\varphi \pm \cos\varphi\right|\left(1 - t_{\max} + t_{\max}^2\right)\right)\right) + O(t_{\max}^3) \quad (15)$$
∂φ_max is the upper bound on the error growth rate; during each iteration of the RDP algorithm, it is used to obtain ε;
4) Parameter processing
Each iteration adds a parameter-selection step: if the newly computed parameter ε is less than one third of the previous parameter ε, it is discarded;
5) Calculate the maximum vertex distance
After drawing the first line segment through P1Pk and computing ε, compute the distance from every vertex between P1 and Pk to the first line segment, and find the vertex Pmax farthest from it, at maximum distance dmax; if dmax ≤ ε, Pmax is discarded; otherwise Pmax is kept, and P1Pmax is connected to obtain segment l(1,max) and PmaxPk to obtain segment l(max,k);
Repeat steps 3) to 5) until all vertices {P1, P2, … Pk} have been processed, completing the detection of all vertices.
CN201610391403.2A 2016-06-04 2016-06-04 A Leap Motion-based dominant point detection and recognition method Active CN106055106B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610391403.2A CN106055106B (en) 2016-06-04 2016-06-04 A Leap Motion-based dominant point detection and recognition method


Publications (2)

Publication Number Publication Date
CN106055106A true CN106055106A (en) 2016-10-26
CN106055106B CN106055106B (en) 2018-11-13

Family

ID=57170144

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610391403.2A Active CN106055106B (en) A Leap Motion-based dominant point detection and recognition method

Country Status (1)

Country Link
CN (1) CN106055106B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110119216A1 (en) * 2009-11-16 2011-05-19 Microsoft Corporation Natural input trainer for gestural instruction
CN102592113A (en) * 2011-12-23 2012-07-18 哈尔滨工业大学深圳研究生院 Rapid identification method for static gestures based on apparent characteristics
CN103211622A (en) * 2013-05-06 2013-07-24 深圳市开立科技有限公司 Multi-imaging-mode ultrasonic image display method and multi-imaging-mode ultrasonic image display system
CN103226387A (en) * 2013-04-07 2013-07-31 华南理工大学 Video fingertip positioning method based on Kinect


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张雪鉴, 黄先开, 刘宏哲: "Research on Leap Motion-based somatosensory interaction technology in museums" (基于Leap Motion的体感交互技术在博物馆中的研究), Computer Science (《计算机科学》) *

Also Published As

Publication number Publication date
CN106055106B (en) 2018-11-13

Similar Documents

Publication Publication Date Title
CN109325437B (en) Image processing method, device and system
JP4625074B2 (en) Sign-based human-machine interaction
US20170024893A1 (en) Scene analysis for improved eye tracking
CN109242903A (en) Generation method, device, equipment and the storage medium of three-dimensional data
US6788809B1 (en) System and method for gesture recognition in three dimensions using stereo imaging and color vision
US9727776B2 (en) Object orientation estimation
CN103105924B (en) Man-machine interaction method and device
TW201616450A (en) System and method for selecting point clouds using a brush tool
CN111414837A (en) Gesture recognition method and device, computer equipment and storage medium
JP6141108B2 (en) Information processing apparatus and method
CN112017212B (en) Training and tracking method and system of face key point tracking model
CN106778574A (en) For the detection method and device of facial image
WO2022021156A1 (en) Method and apparatus for robot to grab three-dimensional object
TW201342258A (en) Method for face recognition
CN111429481B (en) Target tracking method, device and terminal based on adaptive expression
Kerdvibulvech Hand tracking by extending distance transform and hand model in real-time
WO2021196013A1 (en) Word recognition method and device, and storage medium
WO2022228391A1 (en) Terminal device positioning method and related device therefor
CN107346207B (en) Dynamic gesture segmentation recognition method based on hidden Markov model
WO2018131163A1 (en) Information processing device, database generation device, method, and program, and storage medium
CN108564063B (en) Palm positioning method and system based on depth information
US11238620B2 (en) Implicit structured light decoding method, computer equipment and readable storage medium
Jo et al. Tracking and interaction based on hybrid sensing for virtual environments
CN106055106A (en) Leap Motion-based advantage point detection and identification method
JP2017033556A (en) Image processing method and electronic apparatus

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant