CN104808788B - Method for manipulating a user interface with non-contact gestures - Google Patents

Method for manipulating a user interface with non-contact gestures

Info

Publication number
CN104808788B
CN104808788B · CN201510119652.1A
Authority
CN
China
Prior art keywords
joint
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510119652.1A
Other languages
Chinese (zh)
Other versions
CN104808788A (en)
Inventor
于乃功
王锦
郭明
阮晓钢
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Technology
Original Assignee
Beijing University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Technology filed Critical Beijing University of Technology
Priority to CN201510119652.1A priority Critical patent/CN104808788B/en
Publication of CN104808788A publication Critical patent/CN104808788A/en
Application granted granted Critical
Publication of CN104808788B publication Critical patent/CN104808788B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Abstract

The present invention relates to a Kinect-based method for manipulating user interfaces such as computers, touch screens, and projections with non-contact gestures. The method mainly comprises the following steps: the motion of the operator's two arms is captured by Kinect; gestures are identified with a gesture recognition algorithm; and the identified gestures are used to realize interaction between the person and user interfaces such as computers, touch screens, and projections. The invention covers six gestures in total (click, right-click, double-click, scroll forward and backward, zoom, and drag), which essentially cover the needs of interacting with computers, touch screens, projections, and the like. A new recognition method is presented that improves not only the accuracy of gesture recognition but also its speed. Combined with the six gestures, it makes human-computer interaction more convenient and free.

Description

Method for manipulating a user interface with non-contact gestures
Technical field
The present invention relates to a non-contact gesture interaction method, and specifically to a Kinect-based method in which gestures interact contactlessly with user interfaces such as computers, touch screens, and projections.
Background technology
Since the beginning of the 21st century, the hardware performance and popularity of electronic devices such as mobile phones and computers have steadily improved, and touch screens have become widespread. Touch operation frees people from the constraints of the keyboard and mouse; directly manipulating the objects shown on the screen better matches human cognition and habits, and it has become the trend in recent human-computer interaction. However, as screens of more and more kinds and specifications appear, the inconvenience and limitations of touch operation are gradually revealed. A small touch screen merely replaces the mouse and keyboard in another form and still does not free the user from the hardware. A wall-mounted touch screen can reach 60 or even 100 inches; the user must walk up to it to operate it, and then cannot see the objects on the whole screen, while the large area an object occupies on the screen makes basic operations such as dragging, zooming, and selecting so clumsy that they cannot be completed. In some scenarios the user is not allowed to touch the device at all: a surgeon, for example, must not touch any object that may carry germs, and in order to operate electronic equipment has to disinfect the hands repeatedly after the operation ends. In the field of public services, touch-screen applications have likewise run into bottlenecks that are hard to resolve.
With the rapid development of human-computer interaction technology, people want interaction to become more natural and direct. Traditional interaction modes based on hardware devices such as the keyboard and mouse clearly no longer meet this requirement, so research on interaction technologies that better match human communication habits has become very active in recent years. Human-computer interaction has gradually shifted from being computer-centered to being human-centered, and vision-based human-computer interaction research follows exactly this trend.
Research shows that more than 80% of human information is acquired through vision, and gestures have a strong visual effect: they are vivid and intuitive. Vision-based gesture recognition has gradually been applied in many fields, such as sign-language recognition, letter and Chinese-character recognition, slide presentation, lighting control, game control, TV remote control, web browsing, and finger painting.
Summary of the invention
In order to solve the above technical problems, the invention adopts the following technical scheme: a Kinect-based non-contact gesture control method, in which Kinect captures the motion of the user's two-arm skeletal joints, gestures are identified by a gesture recognition algorithm, and the identified gestures are finally used to control user interfaces such as computers, touch screens, and projections.
The gesture recognition method adopted by the invention comprises the following steps:
Step 1: capture the motion of the operator's two-arm skeletal joints with the camera.
If the controlled object is a large touch screen, the Kinect camera is mounted above the center of the screen; if the controlled object is a projection or a computer, the Kinect is placed beside it at a position convenient for operation. The motion of the operator's two-arm skeletal joints is then captured by the Kinect camera.
Step 2: judge the validity of the captured images. The specific judgment method A is as follows:
The camera captures the three-dimensional coordinates of the operator's skeletal joint points. When a frame contains more than 14 of the operator's skeletal joint points, and these points include the three-dimensional coordinates of 9 joints, namely the operator's mid-shoulder, left and right shoulders, left and right elbows, left and right wrists, and left and right hands, the frame data are regarded as valid; otherwise they are regarded as invalid and discarded;
The Kinect camera can capture the three-dimensional coordinates of at most 20 skeletal joint points of the person in front of it, and each coordinate carries one of three state values: null value, actual value, or estimate;
Each arm in a valid image is then judged for whether it is a valid operating arm, specifically: an arm is valid if the absolute value m of the difference between the y-axis coordinates of the shoulder joint and the corresponding wrist joint is less than d; otherwise the operation is invalid. The coordinates of the 9 skeletal joint points are written as follows: mid-shoulder Sc(xsc, ysc, zsc) = (0, 0, 0), left shoulder Sl(xsl, ysl, zsl), right shoulder Sr(xsr, ysr, zsr), left elbow El(xel, yel, zel), right elbow Er(xer, yer, zer), left wrist Wl(xwl, ywl, zwl), right wrist Wr(xwr, ywr, zwr), left hand Hl(xhl, yhl, zhl), right hand Hr(xhr, yhr, zhr).
where m = |y_{si} - y_{wi}|,

d = 0.9\left(\sqrt{(x_{si}-x_{ei})^2+(y_{si}-y_{ei})^2+(z_{si}-z_{ei})^2}+\sqrt{(x_{ei}-x_{wi})^2+(y_{ei}-y_{wi})^2+(z_{ei}-z_{wi})^2}\right),

and i takes r or l;
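By way of illustration, the following minimal Python sketch checks the effective-arm condition m < d for one arm. It is an assumption, not part of the patent: the `joints` dictionary layout and joint names are illustrative.

```python
import numpy as np

def arm_is_valid(joints, side):
    """Check the effective-arm condition m < d for side "l" or "r".

    `joints` maps names such as "shoulder_r" to 3-D coordinates in the
    body frame (mid-shoulder at the origin)."""
    shoulder = np.asarray(joints["shoulder_" + side], dtype=float)
    elbow = np.asarray(joints["elbow_" + side], dtype=float)
    wrist = np.asarray(joints["wrist_" + side], dtype=float)

    m = abs(shoulder[1] - wrist[1])  # |y_si - y_wi|
    # d = 0.9 * (upper-arm length + forearm length), as in the formula above
    d = 0.9 * (np.linalg.norm(shoulder - elbow) + np.linalg.norm(elbow - wrist))
    return m < d
```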
Step 3: perform gesture recognition on the captured images that contain a valid operating arm. The specific recognition method A is as follows:
3.1) Convert the three-dimensional joint coordinates in the image into three-dimensional data in the body coordinate system, where the body coordinate system takes the mid-shoulder joint as the origin (0, 0, 0), the plane of the body is the X-Y plane, the vertical upward direction is the positive Y axis, the direction the body faces is the negative Z axis, and X-Y-Z obeys the right-hand rule;
When acquiring images with the Kinect camera, the three-dimensional data in the Kinect space coordinate system must therefore be converted into three-dimensional data in the body coordinate system, as shown in Fig. 4.
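A minimal sketch of this conversion, assuming it amounts to translating the Kinect coordinates so that the mid-shoulder becomes the origin (the patent fixes Sc = (0, 0, 0)); any axis flips needed to make +Y point up and the facing direction -Z depend on the sensor mounting and are left as a parameter here:

```python
import numpy as np

def to_body_frame(joints_kinect, axis_signs=(1.0, 1.0, 1.0)):
    """Translate all joints so the mid-shoulder is the body-frame origin.

    `joints_kinect` maps joint names to 3-D points in the Kinect frame;
    `axis_signs` holds per-axis flips (an assumption, mounting-dependent)."""
    origin = np.asarray(joints_kinect["shoulder_center"], dtype=float)
    signs = np.asarray(axis_signs, dtype=float)
    return {name: (np.asarray(p, dtype=float) - origin) * signs
            for name, p in joints_kinect.items()}
```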
3.2) Extract the three-dimensional body-frame coordinates of the 9 joints (mid-shoulder, left and right shoulders, left and right elbows, left and right wrists, and left and right hands) over the period t. For these 9 joints (mid-shoulder, left shoulder, right shoulder, left elbow, right elbow, left wrist, right wrist, left hand, right hand), compute: the angle α between the vector U from the elbow joint to the corresponding shoulder joint and the vector V from the elbow joint to the corresponding wrist joint; the angle β between the vector V from the elbow joint to the corresponding wrist joint and the positive X axis; and the angle ω between the vector W from the wrist joint to the corresponding hand joint and the vector V from the elbow joint to the corresponding wrist joint. Gesture control instructions are then recognized from α, β, and ω;
The corresponding formulas for the right arm are as follows (the formulas for the left arm are obtained in the same way, simply by substituting the left-arm coordinates into the formulas):

\alpha = \arccos\dfrac{(x_{wr}-x_{er})(x_{sr}-x_{er})+(y_{wr}-y_{er})(y_{sr}-y_{er})+(z_{wr}-z_{er})(z_{sr}-z_{er})}{\sqrt{(x_{wr}-x_{er})^2+(y_{wr}-y_{er})^2+(z_{wr}-z_{er})^2}\,\sqrt{(x_{sr}-x_{er})^2+(y_{sr}-y_{er})^2+(z_{sr}-z_{er})^2}}

\beta = \arccos\dfrac{z_{wr}-z_{er}}{\sqrt{(x_{wr}-x_{er})^2+(y_{wr}-y_{er})^2+(z_{wr}-z_{er})^2}}

\omega = \arccos\dfrac{(x_{wr}-x_{er})(x_{hr}-x_{wr})+(y_{wr}-y_{er})(y_{hr}-y_{wr})+(z_{wr}-z_{er})(z_{hr}-z_{wr})}{\sqrt{(x_{wr}-x_{er})^2+(y_{wr}-y_{er})^2+(z_{wr}-z_{er})^2}\,\sqrt{(x_{hr}-x_{wr})^2+(y_{hr}-y_{wr})^2+(z_{hr}-z_{wr})^2}}
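A minimal sketch of this angle computation, assuming the four joints of one arm are given as 3-D points in the body frame; β follows the patent's formula above (the z component of V over its norm):

```python
import numpy as np

def arm_angles(shoulder, elbow, wrist, hand):
    """Return (alpha, beta, omega) in degrees for one arm."""
    U = np.asarray(shoulder, dtype=float) - np.asarray(elbow, dtype=float)
    V = np.asarray(wrist, dtype=float) - np.asarray(elbow, dtype=float)
    W = np.asarray(hand, dtype=float) - np.asarray(wrist, dtype=float)

    def angle(a, b):
        c = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

    alpha = angle(V, U)  # elbow angle between U and V
    # beta per the patent's formula: z component of V over its norm
    beta = np.degrees(np.arccos(np.clip(V[2] / np.linalg.norm(V), -1.0, 1.0)))
    omega = angle(V, W)  # wrist angle between V and W
    return alpha, beta, omega
```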
The method of recognizing gesture control instructions from α, β, and ω is as follows:
When a single arm is valid:

if α first increases and then decreases within time t, it is a double-click gesture;

if α first decreases and then increases within time t, it is a right-click gesture;

if ω first increases and then decreases within time t, it is a click gesture;

if ω increases within time t and then holds steady, it is a select gesture, and when ω decreases within another time t it is a release gesture; the whole process from select to release is a drag gesture;

if β first increases and then decreases within time t while α does not change within time t, it is a scroll-backward gesture; if β first decreases and then increases within time t while α does not change within time t, it is a scroll-forward gesture.

When both arms are valid:

if the right arm's angle β first increases and then decreases within time t while the left arm's angle β first decreases and then increases within time t, it is a zoom-out gesture; if the right arm's angle β first decreases and then increases within time t while the left arm's angle β first increases and then decreases within time t, it is a zoom-in gesture. A sketch of the single-arm rules is given below.
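By way of illustration only, the following Python sketch applies the single-arm rules to α, β, ω values sampled over one gesture window t (plain lists of floats); the rise/fall test and the tolerance `tol` are assumptions, since the patent does not specify how a trend is detected:

```python
def classify_single_arm(alpha, beta, omega, tol=5.0):
    """Classify one gesture window from sampled angle series (degrees)."""
    def rises_then_falls(s):
        k = s.index(max(s))
        return 0 < k < len(s) - 1 and max(s) - s[0] > tol and max(s) - s[-1] > tol

    def falls_then_rises(s):
        k = s.index(min(s))
        return 0 < k < len(s) - 1 and s[0] - min(s) > tol and s[-1] - min(s) > tol

    def unchanged(s):
        return max(s) - min(s) <= tol

    if rises_then_falls(alpha):
        return "double-click"
    if falls_then_rises(alpha):
        return "right-click"
    if rises_then_falls(omega):
        return "click"
    if unchanged(alpha) and rises_then_falls(beta):
        return "scroll-backward"
    if unchanged(alpha) and falls_then_rises(beta):
        return "scroll-forward"
    return None  # select/release/drag need state kept across windows
```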
Step 4: map the recognized gesture control instructions to the external input operations in the controlled device's interface program, thereby realizing control of the user interface.
The gesture recognition method A described above may instead be replaced by gesture recognition method B, which is as follows:
Extract the three-dimensional body-frame coordinates of the 9 joints (mid-shoulder, left and right shoulders, left and right elbows, left and right wrists, and left and right hands) over time t. Take the body-frame coordinates of the 5 joints mid-shoulder, left shoulder, left elbow, left wrist, and left hand over the period t as group M, and the body-frame coordinates of the 5 joints mid-shoulder, right shoulder, right elbow, right wrist, and right hand over the period t as group N; the two groups record how the x, y, z coordinates of each joint change within time t, and together they constitute the test template. The reference template, stored in advance, records how the body-frame three-dimensional coordinates of each of the 9 skeletal joints (mid-shoulder, left shoulder, left elbow, left wrist, left hand, right shoulder, right elbow, right wrist, right hand) change over time t. The dynamic time warping (DTW) algorithm is then used to compute the similarity between the test template and the reference template; if the computed value exceeds a given threshold, the gesture match succeeds; otherwise it fails;
The reference template covers the following cases:
When a single arm is valid:

if the arm's mid-shoulder, shoulder, elbow, and wrist joint coordinates do not change over time t1, the y-axis coordinate of the hand joint first decreases and then increases, the z-axis coordinate of the hand joint first increases (or decreases) with time, and the range of change is less than b centimetres, it is a click gesture;

if the arm's mid-shoulder, shoulder, and elbow joint coordinates do not change over time t1, the y-axis coordinates of the hand and wrist joints first decrease and then increase, the z-axis coordinates of the hand and wrist joints first increase and then decrease, and the range of change is less than a centimetres, it is a double-click gesture;

if the arm's mid-shoulder, shoulder, and elbow joint coordinates do not change over time t1, the y-axis coordinates of the hand and wrist joints first increase and then decrease, the z-axis coordinates of the hand and wrist joints first decrease and then increase, and the range of change is less than a centimetres, it is a right-click gesture;

if the arm's mid-shoulder, shoulder, and elbow joint coordinates do not change over time t1, the x-axis coordinates of the hand and wrist joints first decrease and then increase, the z-axis coordinates of the hand and wrist joints first decrease and then increase, and the range of change is less than a centimetres, it is a scroll-backward gesture; if the arm's mid-shoulder, shoulder, and elbow joint coordinates do not change over time t, the x-axis coordinates of the hand and wrist joints first increase and then decrease, the z-axis coordinates of the hand and wrist joints first decrease and then increase, and the range of change is less than a centimetres, it is a scroll-forward gesture;

if the arm's mid-shoulder, shoulder, elbow, and wrist joint coordinates do not change over time t1, the y-axis coordinate of the hand joint decreases with time while its z-axis coordinate increases with time, and the range of change is less than b centimetres, it is a select gesture; when the y-axis coordinate of the hand joint later increases with time while its z-axis coordinate decreases with time, with a range of change less than b centimetres, it is a release gesture; the whole process from select to release is a drag gesture;
When both arms are valid:

if the mid-shoulder, shoulder, and elbow joint coordinates of both arms do not change over time t1, the x-axis coordinates of the right hand and right wrist first decrease and then increase, the x-axis coordinates of the left hand and left wrist first increase and then decrease, the z-axis coordinates of the hands and wrists first decrease and then increase, and the range of change is less than a centimetres, it is a zoom-out gesture;

if the mid-shoulder, shoulder, and elbow joint coordinates of both arms do not change over time t1, the x-axis coordinates of the right hand and right wrist first increase and then decrease, the x-axis coordinates of the left hand and left wrist first decrease and then increase, the z-axis coordinates of the hands and wrists first decrease and then increase, and the range of change is less than a centimetres, it is a zoom-in gesture;
The ranges of a and b above are 10 < a < 30 and 5 < b < 10.
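For illustration, a minimal DTW sketch under the assumption that the test and reference templates are sequences of per-frame joint-coordinate vectors; the patent does not fix the frame distance, so Euclidean distance is assumed here. A lower cumulative cost means a higher similarity, so method B would convert the cost into a similarity score and compare it against the chosen threshold:

```python
import numpy as np

def dtw_cost(test, ref):
    """Cumulative DTW alignment cost between two frame sequences."""
    n, m = len(test), len(ref)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(np.asarray(test[i - 1], dtype=float)
                                  - np.asarray(ref[j - 1], dtype=float))
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```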
Method A may also be used in combination with method B, specifically: method A is applied first, and recognition stops if it succeeds; if no gesture is recognized, method B is applied, and recognition stops if it succeeds; if still no gesture is recognized, gesture recognition starts over.
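A minimal sketch of this A-then-B fallback; `extract_angle_series` and `match_templates_with_dtw` are hypothetical helpers standing in for the angle-rule path and the DTW path sketched above:

```python
def recognize(frame_window):
    """Try method A (angle rules) first, then fall back to method B (DTW)."""
    alpha, beta, omega = extract_angle_series(frame_window)  # hypothetical helper
    gesture = classify_single_arm(alpha, beta, omega)        # method A
    if gesture is not None:
        return gesture
    return match_templates_with_dtw(frame_window)            # method B (may be None)
```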
As a preferred scheme, the operator in step 1 stands or sits 1.8 to 3.0 metres directly in front of the Kinect camera; the optimal distance is about 2 metres.
As a preferred scheme, the coordinate system of the Kinect's three-dimensional space is as follows: the Z axis is parallel to the optical axis of the Kinect camera, the direction parallel to the horizontal is defined as the X axis according to the right-hand rule, and the vertical direction is the Y axis, as shown in Fig. 3;
The gesture motion of a single arm does not distinguish between the left and right hand, and can complete contactless interaction with user interfaces such as computers, touch screens, and projections.
The frame rate of the Kinect camera is 30 frames per second, and the execution time t of each gesture ranges from 0.5 s to 1.5 s; at 30 frames per second, a gesture therefore spans roughly 15 to 45 frames.
Beneficial effect:
(1) The invention is used to control user interfaces such as computers, touch screens, and projections: the screen can be operated without being touched, the interaction mode is more intuitive and natural, and the user experience is enhanced. The invention does not require the user to wear any marker.

(2) The degrees of freedom of the human hand and arm joints are combined with the gesture design; that is, the constraint relations among the 3 degrees of freedom of the shoulder joint, the 1 degree of freedom of the elbow joint, and the 3 degrees of freedom of the wrist joint are combined with the 6 gestures of click, double-click, right-click, zoom, scroll forward and backward, and drag, truly making the human body the controller. The three degrees of freedom of the shoulder joint are flexion/extension, abduction/adduction, and internal/external rotation; the elbow joint has one degree of freedom, flexion/extension; the three degrees of freedom of the wrist are flexion/extension, abduction/adduction, and pronation/supination. These 7 degrees of freedom fully meet the requirements of the arm as a controller.

(3) In conventional gesture interaction systems the operator must make large movements to be recognized, which easily fatigues the user during interaction. The present design considers the energy expenditure of the 7 degrees of freedom of the human arm during motion, concentrates the moving parts of the gestures on the forearm, and uses an optimized algorithm, so unnecessary shoulder movement can be reduced during recognition, which is of great significance in practical applications. Referring to the motion limits of the hand and arm joints makes the gesture design more complete.

(4) The invention does not involve recognition of the finger joints, so the palm may be opened or closed arbitrarily while performing any gesture.

(5) Only the coordinates of the 9 joints (mid-shoulder, left and right shoulders, left and right elbows, left and right wrists, and left and right hands) in valid image frames need to be tracked and processed, which avoids occlusion problems during tracking while improving processing speed and recognition accuracy.

(6) The dynamic time warping (DTW) algorithm is combined with the motion constraints peculiar to human limbs to perform gesture recognition, which improves recognition efficiency and enhances the practicality of the system.
Brief description of the drawings
Fig. 1 shows the 20 human joints that the Kinect camera used by the invention can capture;

Fig. 2 is the flow chart of the technical scheme of the invention;

Fig. 3 is a schematic diagram of the three-dimensional coordinate system of the Kinect camera of the invention;

Fig. 4 is a schematic diagram of the joint position relations in the body coordinate system of the invention;

Fig. 5 shows the hand and arm joint motion-limit data referenced by the invention.
Embodiment
In order to make the purpose and technical scheme of the invention clearer, the invention is described in detail below with reference to the drawings and embodiments. The specific embodiments stated here are only used to explain the invention and are not intended to limit it.
The technical problem to be solved by the invention is to provide a new method in which non-contact gestures control the user interface, replacing traditional finger touch, mouse, and keyboard; specifically, a Kinect-based method for controlling user interfaces such as computers, touch screens, and projections with non-contact gestures.
For the above technical problem, the following technical scheme is adopted: Kinect captures the motion of the user's two arms, gestures are identified by a specific gesture recognition algorithm, and the identified gestures are used to interact with user interfaces such as computers, touch screens, and projections. Specifically, the Kinect device is connected to a computer on which the Kinect driver is installed; the Kinect camera passes the captured depth data and skeleton data to the application program; the application program extracts the valid arm joint coordinate data of the operator, recognizes the 6 gestures with the specific gesture recognition algorithm, and sends the corresponding gesture message commands to the controlled device as feedback, thereby achieving the purpose of manipulating the user interface.
Fig. 2 gives an embodiment of the overall technical scheme of the method for manipulating a user interface with non-contact gestures. The specific gesture recognition method comprises the following steps:
Step 1: the Kinect camera is mounted directly above the large touch screen at a height of 1.8 metres, and the operator stands 2 to 2.5 metres directly in front of the Kinect;
Step 2: judge the validity of the captured images. The specific judgment method is as follows:
a. Judge whether a captured image is a valid image: the Kinect camera can capture the three-dimensional coordinates of at most 20 skeletal joint points of the person in front of it, each coordinate carrying one of three state values: null value, actual value, or estimate. When the number of true three-dimensional coordinate values of the operator's skeletal joint points in a frame exceeds 14, and these points include the three-dimensional coordinates of the operator's 9 joints (mid-shoulder, left and right shoulders, left and right elbows, left and right wrists, and left and right hands), the frame data are regarded as valid; otherwise the invalid frame data are discarded;
b. Judge whether each arm in a valid image is a valid operating arm: an arm is valid if the absolute value m of the difference between the y-axis coordinates of the shoulder joint and the corresponding wrist joint is less than d; otherwise the operation is invalid, where the coordinates of the 9 skeletal joint points are written as follows: mid-shoulder Sc(xsc, ysc, zsc) = (0, 0, 0), left shoulder Sl(xsl, ysl, zsl), right shoulder Sr(xsr, ysr, zsr), left elbow El(xel, yel, zel), right elbow Er(xer, yer, zer), left wrist Wl(xwl, ywl, zwl), right wrist Wr(xwr, ywr, zwr), left hand Hl(xhl, yhl, zhl), right hand Hr(xhr, yhr, zhr).
where i takes r or l, m = |y_{si} - y_{wi}|, and d is as defined in step 2 above (0.9 times the sum of the upper-arm and forearm lengths).
Step 3: convert the three-dimensional data of valid images from the Kinect space coordinate system into three-dimensional data in the body coordinate system; extract the three-dimensional coordinates of the 9 joints (mid-shoulder, left and right shoulders, left and right elbows, left and right wrists, and left and right hands) over the period t; and for each frame compute, for these 9 joints (mid-shoulder, left shoulder, right shoulder, left elbow, right elbow, left wrist, right wrist, left hand, right hand): the angle α between the vector U from the elbow joint to the corresponding shoulder joint and the vector V from the elbow joint to the corresponding wrist joint; the angle β between the vector V from the elbow joint to the corresponding wrist joint and the positive X axis; and the angle ω between the vector W from the wrist joint to the corresponding hand joint and the vector V from the elbow joint to the corresponding wrist joint, as shown in Fig. 4. The right-arm formulas for α, β, and ω are as given in step 3.2 above.
The angle α ranges from 0° to 180°, the angle β ranges from 0° to 180°, and the angle ω ranges from 0° to 90°.
If only one arm is valid, that is, the left arm is valid or the right arm is valid, single-arm gesture recognition is carried out according to the defined α, β, ω gesture recognition rules;

If both arms are valid, two-arm gesture recognition is carried out according to the defined α, β, ω gesture recognition rules.
Gesture recognition can also be performed by template matching: the coordinate data of the 5 joints mid-shoulder, left shoulder, left elbow, left wrist, and left hand captured within 3 s are stored in array M, and the coordinate data of the 5 joints mid-shoulder, right shoulder, right elbow, right wrist, and right hand captured within 3 s are stored in array N; the two arrays record how the x, y, z coordinates of each joint change over the 3 s, forming the test template. The reference template, stored in advance, records how the body-frame three-dimensional coordinates of each of the 9 skeletal joints (mid-shoulder, left shoulder, left elbow, left wrist, left hand, right shoulder, right elbow, right wrist, right hand) change over 1 s. The dynamic time warping (DTW) algorithm then computes the similarity between the test template and the reference template; if the computed value exceeds a given threshold, the gesture match succeeds; otherwise it fails. The algorithm is based on the idea of dynamic programming and can match time-series templates of unequal length: in this embodiment the test template spans 3 s while the reference template spans only 1 s, so the test template is divided by time into three 1-s segments, each matched against the reference template. When the similarity between the variation trend of the joint coordinates computed over consecutive valid frames and a gesture in the reference template exceeds 90%, that gesture is recognized.
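A minimal sketch of this segmentation, assuming 30 fps so that the 3 s test template holds 90 frames and the 1 s reference template 30 frames; `dtw_cost` is the sketch given earlier, and returning the best (lowest-cost) segment is an illustrative choice, not specified by the patent:

```python
def best_segment_cost(test_frames, ref_frames, seg_len=30):
    """Cut the test template into 1 s segments and DTW-match each one."""
    costs = [dtw_cost(test_frames[k:k + seg_len], ref_frames)
             for k in range(0, len(test_frames) - seg_len + 1, seg_len)]
    return min(costs)
```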
Alternatively, method A is applied first, and recognition stops if it succeeds; if no gesture is recognized, method B is applied, and recognition stops if it succeeds; if still no gesture is recognized, gesture recognition starts over.
Step 4: map the recognized gesture control instructions one-to-one to operations such as finger touch, mouse, and keyboard in the controlled device's interface program, realizing control of user interfaces such as computers, touch screens, and projections.
The invention applies the user gestures captured by Kinect to contactless interaction with user interfaces such as computers, touch screens, and projections; the operation mode is more intuitive and convenient and can replace traditional interaction modes such as finger touch, mouse, and keyboard. Meanwhile, the 6 designed gestures are easy to learn and use and essentially cover the needs of user-interface operation; the gesture recognition algorithm is simple and practical, its processing response is fast, and it satisfies real-time human-computer interaction.
In summary, the method of the invention enables the user to control user interfaces such as computers, touch screens, and projections effectively with bare hands, interacting without any contact; the method is simple, direct, natural, convenient, and efficient.
The above is only a preferred embodiment of the invention, but the protection scope of the invention is not limited thereto; any equivalent substitution or change that a person skilled in the art can readily make within the scope disclosed by the invention, according to the technical scheme and inventive concept of the invention, falls within the protection scope of the invention.

Claims (4)

1. A method for manipulating a user interface with non-contact gestures, characterized by comprising the following steps:
1) capturing the motion of the operator's two-arm skeletal joints with a camera;
2) judging the validity of the captured images, the specific judgment method being as follows:
first judging whether a captured image is a valid image, specifically: the camera captures the three-dimensional coordinates of the operator's skeletal joint points; when a frame contains more than 14 of the operator's skeletal joint points, and these points include the three-dimensional coordinates of the operator's 9 joints (mid-shoulder, left and right shoulders, left and right elbows, left and right wrists, and left and right hands), the frame data are regarded as valid; otherwise they are regarded as invalid and discarded;
then judging, for each arm in a valid image, whether it is a valid operating arm, specifically: an arm is valid if the absolute value m of the difference between the y-axis coordinates of the shoulder joint and the corresponding wrist joint is less than d; otherwise the operation is invalid, where the coordinates of the 9 skeletal joint points are written as follows: mid-shoulder Sc(xsc, ysc, zsc) = (0, 0, 0), left shoulder Sl(xsl, ysl, zsl), right shoulder Sr(xsr, ysr, zsr), left elbow El(xel, yel, zel), right elbow Er(xer, yer, zer), left wrist Wl(xwl, ywl, zwl), right wrist Wr(xwr, ywr, zwr), left hand Hl(xhl, yhl, zhl), right hand Hr(xhr, yhr, zhr);
wherein m = |y_{si} - y_{wi}|,

d = 0.9\left(\sqrt{(x_{si}-x_{ei})^2+(y_{si}-y_{ei})^2+(z_{si}-z_{ei})^2}+\sqrt{(x_{ei}-x_{wi})^2+(y_{ei}-y_{wi})^2+(z_{ei}-z_{wi})^2}\right),

and i takes r or l;
3) performing gesture recognition on the captured images that contain a valid operating arm, the specific recognition method A being as follows:
3.1) converting the three-dimensional joint coordinates in the image into three-dimensional data in the body coordinate system, where the body coordinate system takes the mid-shoulder joint as the origin (0, 0, 0), the plane of the body is the X-Y plane, the vertical upward direction is the positive Y axis, the direction the body faces is the negative Z axis, and X-Y-Z obeys the right-hand rule;
3.2) extracting the three-dimensional body-frame coordinates of the 9 joints (mid-shoulder, left and right shoulders, left and right elbows, left and right wrists, and left and right hands) over the period t; computing, for these 9 joints (mid-shoulder, left shoulder, right shoulder, left elbow, right elbow, left wrist, right wrist, left hand, right hand), the angle α between the vector U from the elbow joint to the corresponding shoulder joint and the vector V from the elbow joint to the corresponding wrist joint, the angle β between the vector V from the elbow joint to the corresponding wrist joint and the positive X axis, and the angle ω between the vector W from the wrist joint to the corresponding hand joint and the vector V from the elbow joint to the corresponding wrist joint; and recognizing gesture control instructions from α, β, and ω; the formulas for the right hand being as follows, and the formulas for the left hand being obtained in the same way, simply by substituting the left-hand coordinates into the formulas:
\alpha = \arccos\dfrac{(x_{wr}-x_{er})(x_{sr}-x_{er})+(y_{wr}-y_{er})(y_{sr}-y_{er})+(z_{wr}-z_{er})(z_{sr}-z_{er})}{\sqrt{(x_{wr}-x_{er})^2+(y_{wr}-y_{er})^2+(z_{wr}-z_{er})^2}\,\sqrt{(x_{sr}-x_{er})^2+(y_{sr}-y_{er})^2+(z_{sr}-z_{er})^2}}

\beta = \arccos\dfrac{z_{wr}-z_{er}}{\sqrt{(x_{wr}-x_{er})^2+(y_{wr}-y_{er})^2+(z_{wr}-z_{er})^2}}

\omega = \arccos\dfrac{(x_{wr}-x_{er})(x_{hr}-x_{wr})+(y_{wr}-y_{er})(y_{hr}-y_{wr})+(z_{wr}-z_{er})(z_{hr}-z_{wr})}{\sqrt{(x_{wr}-x_{er})^2+(y_{wr}-y_{er})^2+(z_{wr}-z_{er})^2}\,\sqrt{(x_{hr}-x_{wr})^2+(y_{hr}-y_{wr})^2+(z_{hr}-z_{wr})^2}}
4) mapping the recognized gesture control instructions to the external input operations in the controlled device's interface program, thereby realizing control of the user interface.
2. The method for manipulating a user interface with non-contact gestures according to claim 1, characterized in that:

the method of recognizing gesture control instructions from α, β, and ω in step 3.2) is as follows:
when a single arm is valid:

if α first increases and then decreases within time t, it is a double-click gesture;

if α first decreases and then increases within time t, it is a right-click gesture;

if ω first increases and then decreases within time t, it is a click gesture;

if ω increases within time t and then holds steady, it is a select gesture, and when ω decreases within another time t it is a release gesture; the whole process from select to release is a drag gesture;

if β first increases and then decreases within time t while α does not change within time t, it is a scroll-backward gesture; if β first decreases and then increases within time t while α does not change within time t, it is a scroll-forward gesture;

when both arms are valid:

if the right arm's angle β first increases and then decreases within time t while the left arm's angle β first decreases and then increases within time t, it is a zoom-out gesture; if the right arm's angle β first decreases and then increases within time t while the left arm's angle β first increases and then decreases within time t, it is a zoom-in gesture.
3. The method for manipulating a user interface with non-contact gestures according to claim 1, characterized in that: the gesture recognition method A in step 3) may be replaced by gesture recognition method B, which is as follows:

extracting the three-dimensional body-frame coordinates of the 9 joints (mid-shoulder, left and right shoulders, left and right elbows, left and right wrists, and left and right hands) over time t; taking the body-frame coordinates of the 5 joints mid-shoulder, left shoulder, left elbow, left wrist, and left hand over the period t as group M, and the body-frame coordinates of the 5 joints mid-shoulder, right shoulder, right elbow, right wrist, and right hand over the period t as group N; the two groups record how the x, y, z coordinates of each joint change within time t and together constitute the test template; the reference template, stored in advance, records how the body-frame three-dimensional coordinates of each of the 9 skeletal joints (mid-shoulder, left shoulder, left elbow, left wrist, left hand, right shoulder, right elbow, right wrist, right hand) change over time t; then using the dynamic time warping DTW algorithm to compute the similarity between the test template and the reference template; if the computed value exceeds a given threshold, the gesture match succeeds; otherwise it fails;
the reference template covering the following cases:
when a single arm is valid:

if the arm's mid-shoulder, shoulder, elbow, and wrist joint coordinates do not change over time t1, the y-axis coordinate of the hand joint first decreases and then increases, the z-axis coordinate of the hand joint first increases (or decreases) with time, and the range of change is less than b centimetres, it is a click gesture;

if the arm's mid-shoulder, shoulder, and elbow joint coordinates do not change over time t1, the y-axis coordinates of the hand and wrist joints first decrease and then increase, the z-axis coordinates of the hand and wrist joints first increase and then decrease, and the range of change is less than a centimetres, it is a double-click gesture;

if the arm's mid-shoulder, shoulder, and elbow joint coordinates do not change over time t1, the y-axis coordinates of the hand and wrist joints first increase and then decrease, the z-axis coordinates of the hand and wrist joints first decrease and then increase, and the range of change is less than a centimetres, it is a right-click gesture;

if the arm's mid-shoulder, shoulder, and elbow joint coordinates do not change over time t1, the x-axis coordinates of the hand and wrist joints first decrease and then increase, the z-axis coordinates of the hand and wrist joints first decrease and then increase, and the range of change is less than a centimetres, it is a scroll-backward gesture; if the arm's mid-shoulder, shoulder, and elbow joint coordinates do not change over time t, the x-axis coordinates of the hand and wrist joints first increase and then decrease, the z-axis coordinates of the hand and wrist joints first decrease and then increase, and the range of change is less than a centimetres, it is a scroll-forward gesture;

if the arm's mid-shoulder, shoulder, elbow, and wrist joint coordinates do not change over time t1, the y-axis coordinate of the hand joint decreases with time while its z-axis coordinate increases with time, and the range of change is less than b centimetres, it is a select gesture; when the y-axis coordinate of the hand joint later increases with time while its z-axis coordinate decreases with time, with a range of change less than b centimetres, it is a release gesture; the whole process from select to release is a drag gesture;
when both arms are valid:

if the mid-shoulder, shoulder, and elbow joint coordinates of both arms do not change over time t1, the x-axis coordinates of the right hand and right wrist first decrease and then increase, the x-axis coordinates of the left hand and left wrist first increase and then decrease, the z-axis coordinates of the left and right hands and wrists first decrease and then increase, and the range of change is less than a centimetres, it is a zoom-out gesture;

if the mid-shoulder, shoulder, and elbow joint coordinates of both arms do not change over time t1, the x-axis coordinates of the right hand and right wrist first increase and then decrease, the x-axis coordinates of the left hand and left wrist first decrease and then increase, the z-axis coordinates of the left and right hands and wrists first decrease and then increase, and the range of change is less than a centimetres, it is a zoom-in gesture;

the ranges of a and b above being 10 < a < 30 and 5 < b < 10.
4. The method for manipulating a user interface with non-contact gestures according to claim 3, characterized in that: the gesture recognition method A in step 3) may also be used in combination with method B, specifically: method A is applied first, and recognition stops if it succeeds; if no gesture is recognized, method B is applied, and recognition stops if it succeeds; if still no gesture is recognized, gesture recognition starts over.
CN201510119652.1A 2015-03-18 2015-03-18 Method for manipulating a user interface with non-contact gestures Active CN104808788B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510119652.1A CN104808788B (en) 2015-03-18 2015-03-18 Method for manipulating a user interface with non-contact gestures

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510119652.1A CN104808788B (en) 2015-03-18 2015-03-18 Method for manipulating a user interface with non-contact gestures

Publications (2)

Publication Number Publication Date
CN104808788A CN104808788A (en) 2015-07-29
CN104808788B true CN104808788B (en) 2017-09-01

Family

ID=53693692

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510119652.1A Active CN104808788B (en) 2015-03-18 2015-03-18 Method for manipulating a user interface with non-contact gestures

Country Status (1)

Country Link
CN (1) CN104808788B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105302310B (en) * 2015-11-12 2018-08-31 姚焕根 A kind of gesture identifying device, system and method
CN107436679B (en) * 2016-05-27 2020-08-07 富泰华工业(深圳)有限公司 Gesture control system and method
CN106406518B (en) * 2016-08-26 2019-01-18 清华大学 Gesture control device and gesture identification method
CN107273869B (en) * 2017-06-29 2020-04-24 联想(北京)有限公司 Gesture recognition control method and electronic equipment
CN107678537A (en) * 2017-09-04 2018-02-09 全球能源互联网研究院有限公司 Assembly manipulation, the method and apparatus of simulation assembling are identified in augmented reality environment
CN107967061A (en) * 2017-12-21 2018-04-27 北京华捷艾米科技有限公司 Man-machine interaction method and device
CN108089708A (en) * 2017-12-22 2018-05-29 西安交通大学 A kind of hanging gesture interaction method for improving gesture fatigue
CN108153421A (en) * 2017-12-25 2018-06-12 深圳Tcl新技术有限公司 Body feeling interaction method, apparatus and computer readable storage medium
CN111045511A (en) * 2018-10-15 2020-04-21 华为技术有限公司 Gesture-based control method and terminal equipment
CN109833608B (en) * 2018-12-29 2021-06-22 南京华捷艾米软件科技有限公司 Dance action teaching and assisting method and system based on 3D motion sensing camera

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103038727A (en) * 2010-06-29 2013-04-10 微软公司 Skeletal joint recognition and tracking system
CN103180803A (en) * 2012-10-30 2013-06-26 华为技术有限公司 Interface switching method and apparatus
CN103386683A (en) * 2013-07-31 2013-11-13 哈尔滨工程大学 Kinect-based motion sensing-control method for manipulator
CN103472920A (en) * 2013-09-13 2013-12-25 通号通信信息集团有限公司 Action-recognition-based medical image control method and system
CN103760976A (en) * 2014-01-09 2014-04-30 华南理工大学 Kinect based gesture recognition smart home control method and Kinect based gesture recognition smart home control system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103038727A (en) * 2010-06-29 2013-04-10 微软公司 Skeletal joint recognition and tracking system
CN103180803A (en) * 2012-10-30 2013-06-26 华为技术有限公司 Interface switching method and apparatus
CN103386683A (en) * 2013-07-31 2013-11-13 哈尔滨工程大学 Kinect-based motion sensing-control method for manipulator
CN103472920A (en) * 2013-09-13 2013-12-25 通号通信信息集团有限公司 Action-recognition-based medical image control method and system
CN103760976A (en) * 2014-01-09 2014-04-30 华南理工大学 Kinect based gesture recognition smart home control method and Kinect based gesture recognition smart home control system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
3D Human Motion Analysis and Action Recognition Methods; Cai Meiling; China Doctoral Dissertations Full-text Database, Information Science and Technology; 2014-02-28 (No. 2); I138-43 *

Also Published As

Publication number Publication date
CN104808788A (en) 2015-07-29

Similar Documents

Publication Publication Date Title
CN104808788B (en) Method for manipulating a user interface with non-contact gestures
CN104123007B (en) Multidimensional weighted 3D recognition method for dynamic gestures
Argyros et al. Vision-based interpretation of hand gestures for remote control of a computer mouse
Yang et al. Gesture interaction in virtual reality
US9857868B2 (en) Method and system for ergonomic touch-free interface
CN102402289B (en) Mouse recognition method for gesture based on machine vision
CN103472916A (en) Man-machine interaction method based on human body gesture recognition
CN103150020A (en) Three-dimensional finger control operation method and system
CN102567703B (en) Hand motion identification information processing method based on classification characteristic
CN106598227A (en) Hand gesture identification method based on Leap Motion and Kinect
CN104571823B (en) A kind of contactless visual human's machine interaction method based on intelligent television
CN102622225B (en) Multipoint touch application program development method supporting user defined gestures
CN103135883A (en) Method and system for control of window
CN102306053B (en) Virtual touch screen-based man-machine interaction method and device and electronic equipment
CN108052202B (en) 3D interaction method and device, computer equipment and storage medium
CN103426000B (en) A kind of static gesture Fingertip Detection
CN101847057A (en) Method for touchpad to acquire input information
CN109800676A (en) Gesture identification method and system based on depth information
CN109145802A (en) More manpower gesture man-machine interaction methods and device based on Kinect
Hu et al. FingerTrak: Continuous 3D hand pose tracking by deep learning hand silhouettes captured by miniature thermal cameras on wrist
Yang et al. 3D character recognition using binocular camera for medical assist
Chaudhary Finger-stylus for non touch-enable systems
CN202749066U (en) Non-contact object-showing interactive system
CN107220634A (en) Based on the gesture identification method for improving D P algorithms and multi-template matching
CN103699214A (en) Three-dimensional tracking and interacting method based on three-dimensional natural gestures

Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
EXSB Decision made by sipo to initiate substantive examination
GR01 Patent grant