CN107368820A - Fine-grained gesture recognition method, apparatus and device - Google Patents

Fine-grained gesture recognition method, apparatus and device

Info

Publication number
CN107368820A
CN107368820A (application CN201710656434.0A)
Authority
CN
China
Prior art keywords
feature
displacement characteristic
gesture
cluster
fine-grained
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710656434.0A
Other languages
Chinese (zh)
Other versions
CN107368820B (en)
Inventor
姬晓鹏
程俊
潘亮亮
张丰
方琎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Tencent Technology Shenzhen Co Ltd
Shenzhen Institute of Advanced Technology of CAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd and Shenzhen Institute of Advanced Technology of CAS
Priority to CN201710656434.0A priority Critical patent/CN107368820B/en
Publication of CN107368820A publication Critical patent/CN107368820A/en
Application granted granted Critical
Publication of CN107368820B publication Critical patent/CN107368820B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20: Movements or behaviour, e.g. gesture recognition
    • G06V40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/23: Clustering techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Social Psychology (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Psychiatry (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A fine-grained gesture recognition method includes: extracting local features of hand joint positions, the local features including relative position features and/or temporal displacement features; performing cluster computation according to the relative position features and/or temporal displacement features, to obtain the cluster features corresponding to them; and training a cluster feature model according to the correspondence between the cluster features and gesture categories, then performing gesture recognition with the trained cluster feature model. The method converts variable-length dynamic gestures into fixed-length features, which helps a classifier measure similarity between gesture types, and facilitates both the judgement of fine finger motion processes and the detection of large-amplitude motions.

Description

Fine-grained gesture recognition method, apparatus and device
Technical field
The invention belongs to the field of gesture recognition, and in particular relates to a fine-grained gesture recognition method, apparatus and device.
Background
In human-computer interaction systems such as smart televisions, wearable mobile terminals, personal computers and virtual reality devices, online gesture recognition is frequently used as a means of interactive input.
According to the data acquisition method used, current gesture recognition approaches at home and abroad can be divided into two kinds: wearable-device-based and vision-based. Specifically:
Wearable-device-based gesture recognition mainly uses sensors such as accelerometers and gyroscopes to obtain the motion trajectory of the gesture in three-dimensional space. Its advantage is that multiple sensors can be fitted to obtain accurate relative positions of the hand joints and the spatial motion trajectory, so recognition accuracy is high. However, this approach requires wearing complicated devices, such as data gloves and position trackers; the wearing is cumbersome and impairs the naturalness of the human-computer interaction system to some extent.
Vision-based gesture recognition solves the naturalness problem of human-computer interaction well: image data of the hand region is acquired by a visible-light camera, and segmentation, feature extraction and classification of the hand target region are then performed. However, existing vision-based gesture recognition methods can only handle a single type of static gesture (e.g. digit recognition from a single image) or dynamic gesture (e.g. sliding the palm up and down to turn pages). For the recognition of variable-length gesture sequences, existing methods mostly use the dynamic time warping algorithm to measure the similarity of gesture motion trajectories. That algorithm can handle differences in hand trajectories under large-amplitude motion, but its computational complexity is high, and it cannot recognize fine-grained, diverse finger motions.
Summary of the invention
In view of this, embodiments of the present invention provide a fine-grained gesture recognition method, apparatus and device, to solve the problem that gesture recognition methods in the prior art cannot recognize fine-grained, diverse finger motions because of their high computational complexity.
A first aspect of the embodiments of the present invention provides a fine-grained gesture recognition method, which includes:
extracting local features of hand joint positions, the local features including relative position features and/or temporal displacement features;
performing cluster computation according to the relative position features and/or temporal displacement features, to obtain the cluster features corresponding to the relative position features and/or temporal displacement features;
training a cluster feature model according to the correspondence between the cluster features and gesture categories, and performing gesture recognition according to the trained cluster feature model.
With reference to the first aspect, in a first possible implementation of the first aspect, the step of extracting the relative position features in the local features of hand joint positions includes:
acquiring T frames of dynamic gesture images, and determining the positions of the hand nodes in each frame of dynamic gesture image;
calculating, according to the positions of the root nodes and corresponding child nodes among the hand nodes, the relative position features of the child nodes relative to the root nodes.
With reference to the first possible implementation of the first aspect, in a second possible implementation of the first aspect, the step of calculating the relative position features of the child nodes relative to the root nodes according to the positions of the root nodes and corresponding child nodes includes:
obtaining the relative position feature of the t′-th frame among the T frames of dynamic gesture images,

$$r^{h}_{t',u} = p^{h}_{t',c(u)} - p^{h}_{t',r(u)},$$

where $p^{h}_{t',r(u)}$ denotes the position of a root node, $p^{h}_{t',c(u)}$ denotes the position of the child node corresponding to that root node, $u \in \{i \mid 1 \le i \le N\}$, $h = 1$ denotes the left hand, $h = 2$ denotes the right hand, $1 \le t' \le T$, and $N$ is the number of nodes.
With reference to the first aspect, in a third possible implementation of the first aspect, the step of extracting the temporal displacement features in the local features of hand joint positions includes:
acquiring T frames of dynamic gesture images, and determining the displacement feature reference points in each frame of dynamic gesture image;
determining the temporal displacement features corresponding to the dynamic gesture images according to the positions of the displacement feature reference points in every two adjacent frames.
With reference to the third possible implementation of the first aspect, in a fourth possible implementation of the first aspect, the step of determining the temporal displacement features corresponding to the dynamic gesture images according to the positions of the displacement feature reference points in every two adjacent frames includes:
obtaining the temporal displacement feature of the t″-th frame,

$$d^{h}_{t'',v} = q^{h}_{t'',v} - q^{h}_{t''-1,v},$$

where $q^{h}_{t'',v}$ is a displacement feature reference point of the t″-th frame, $q^{h}_{t''-1,v}$ is the corresponding displacement feature reference point of the (t″−1)-th frame, $1 < t'' \le T$, $v \in \{i \mid 1 \le i \le N\}$, $M$ is the number of displacement feature reference points with $1 \le v \le M$, $h = 1$ denotes the left hand, $h = 2$ denotes the right hand, and $N$ is the number of nodes.
With reference to the first aspect, in a fifth possible implementation of the first aspect, the step of performing cluster computation according to the relative position features and/or temporal displacement features to obtain the cluster features corresponding to the relative position features and/or temporal displacement features includes:
expressing the relative position features and/or temporal displacement features as a set of unified local features;
selecting a predetermined number of clusters from the set of unified local features;
performing a transform computation on each cluster to obtain the cluster features corresponding to the relative position features and/or temporal displacement features.
A second aspect of the embodiments of the present invention provides a fine-grained gesture recognition apparatus, which includes:
a local feature extraction unit, configured to extract local features of hand joint positions, the local features including relative position features and/or temporal displacement features;
a cluster feature computation unit, configured to perform cluster computation according to the relative position features and/or temporal displacement features, to obtain the corresponding cluster features;
a training and recognition unit, configured to train a cluster feature model according to the correspondence between the cluster features and gesture categories, and perform gesture recognition according to the trained cluster feature model.
With reference to the second aspect, in a first possible implementation of the second aspect, the local feature extraction unit includes:
a first image acquisition subunit, configured to acquire T frames of dynamic gesture images and determine the positions of the hand nodes in each frame;
a first computation subunit, configured to calculate, according to the positions of the root nodes and corresponding child nodes among the hand nodes, the relative position features of the child nodes relative to the root nodes;
and/or
a second image acquisition subunit, configured to acquire T frames of dynamic gesture images and determine the displacement feature reference points in each frame;
a second computation subunit, configured to determine the temporal displacement features corresponding to the dynamic gesture images according to the positions of the displacement feature reference points in every two adjacent frames.
A third aspect of the embodiments of the present invention provides a fine-grained gesture recognition device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the computer program, implements the steps of the fine-grained gesture recognition method according to any one of the implementations of the first aspect.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the fine-grained gesture recognition method according to any one of the implementations of the first aspect.
Compared with the prior art, the embodiments of the present invention have the following beneficial effects: by extracting local features of the hand joints and training on the cluster features extracted from those local features, variable-length dynamic gestures can be converted into fixed-length features, which helps a classifier measure similarity between gesture types; and describing the hand joint positions with relative position features and/or temporal displacement features facilitates the judgement of fine finger motion processes and/or the detection of large-amplitude motions.
Brief description of the drawings
In order to explain the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic flowchart of the fine-grained gesture recognition method provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of a dynamic gesture provided by an embodiment of the present invention;
Fig. 3 is a schematic diagram of hand joint positions provided by an embodiment of the present invention;
Fig. 4 is a schematic diagram of the correspondence between root nodes and child nodes provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of displacement feature reference nodes provided by an embodiment of the present invention;
Fig. 6 is a schematic flowchart of performing cluster computation to obtain cluster features provided by an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of the gesture recognition apparatus provided by an embodiment of the present invention;
Fig. 8 is a schematic diagram of the fine-grained gesture recognition device provided by an embodiment of the present invention.
Detailed description of the embodiments
In the following description, specific details such as particular system structures and technologies are set forth for the purpose of illustration rather than limitation, so as to provide a thorough understanding of the embodiments of the present invention. However, it will be apparent to those skilled in the art that the present invention may also be practiced in other embodiments without these specific details. In other cases, detailed descriptions of well-known systems, apparatuses, circuits and methods are omitted, lest unnecessary detail obscure the description of the present invention.
In order to explain the technical solutions of the present invention, specific embodiments are described below.
As shown in Fig. 1, the fine-grained gesture recognition method described in an embodiment of the present invention includes:
In step S101, local features of hand joint positions are extracted, the local features including relative position features and/or temporal displacement features.
Specifically, the extraction of the local features is based on acquired multi-frame images containing the user's hand. Fig. 2 shows a schematic diagram of T frames of an "OK" gesture: from the 1st frame to the T-th frame there are multiple images, and as the frames change over time, the position of the gesture in the images changes. The position change includes both the change of the relative positions of the fingers and the change of the overall position of the palm; the former can be described by the relative position features and the latter by the temporal displacement features.
In order to obtain the relative position features and/or temporal displacement features, the hand nodes need to be extracted in advance; the changes of the node positions reflect the relative position features and the temporal displacement features. As an optional implementation of the present application, as shown in Figs. 1 and 2, the extracted hand nodes include the joint positions, fingertip positions and palm-center position of the hand: the thumb includes 2 joint nodes and 1 fingertip node, each of the other four fingers includes 3 joint nodes and 1 fingertip node, the wrist includes 1 joint node, and the palm center provides 1 node, giving 22 feature reference points in total. To distinguish the left hand from the right hand, the nodes of the right hand and the left hand are denoted R1, R2, ..., R22 and L1, L2, ..., L22 respectively.
The step of extracting the relative position features in the local features of hand joint positions includes:
acquiring T frames of dynamic gesture images, and determining the positions of the hand nodes in each frame of dynamic gesture image;
calculating, according to the positions of the root nodes and corresponding child nodes among the hand nodes, the relative position features of the child nodes relative to the root nodes.
For a dynamic gesture $G = \{G_t \mid 1 \le t \le T\}$ containing T frames (T is a natural number and T ≥ 2), such as the right-hand "OK" gesture shown in Fig. 2, the gesture data of each frame contains the positions of the hand nodes in the world coordinate system estimated by a depth camera:

$$P^{h}_{t} = \{ p^{h}_{t,i} \mid 1 \le i \le N \},$$

where h = 1 denotes the left hand and h = 2 denotes the right hand, and i ∈ {1, 2, 3, …, N} denotes the i-th node of the corresponding hand. As shown in Fig. 2, each hand contains 22 nodes, so N may take the value 22.
Given the node positions $P^{h}_{t'} = \{ p^{h}_{t',i} \mid 1 \le i \le N \}$ of the t′-th frame, where 1 ≤ t′ ≤ T: as shown in Fig. 4, select the N−2 groups of root nodes $p^{h}_{t',r(u)}$ and corresponding child nodes $p^{h}_{t',c(u)}$ corresponding to the nodes of Fig. 3 (N is 22 in Fig. 3), and compute for each pair the relative position vector of the child node with respect to the root node, i.e.

$$r^{h}_{t',u} = p^{h}_{t',c(u)} - p^{h}_{t',r(u)}.$$

The relative position feature of the t′-th frame can then be expressed as:

$$R^{h}_{t'} = \{ r^{h}_{t',u} \mid 1 \le u \le N-2 \}.$$

The root nodes and child nodes are selected, as shown in Fig. 3, according to the positions of the joints where the nodes lie: two nodes separated by one joint are chosen as a root node and a child node respectively.
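The per-frame relative position computation described above can be sketched as follows. This is a minimal illustration under assumed inputs: the toy frame and the root/child pairing used here are made up for demonstration, whereas the patent pairs nodes separated by one joint as shown in Figs. 3 and 4.

```python
import numpy as np

def relative_position_feature(nodes, pairs):
    """Relative position vectors r = p_child - p_root for one frame.

    nodes: (N, 3) array of hand-node positions in world coordinates.
    pairs: list of (root_index, child_index) tuples; the patent pairs
           nodes separated by one joint, yielding N-2 pairs for N nodes.
    Returns a (len(pairs), 3) array of relative position vectors.
    """
    nodes = np.asarray(nodes, dtype=float)
    roots = nodes[[r for r, _ in pairs]]
    children = nodes[[c for _, c in pairs]]
    return children - roots

# Toy example: 4 collinear nodes; the pairing skips one joint (assumed).
frame = np.array([[0.0, 0, 0], [1, 0, 0], [2, 0, 0], [3, 0, 0]])
pairs = [(0, 2), (1, 3)]
feat = relative_position_feature(frame, pairs)
print(feat)  # each row is a child position minus its root position
```

Stacking these vectors over all pairs gives the per-frame feature $R^{h}_{t'}$; the feature is translation-invariant since a common offset of all nodes cancels in the subtraction.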
In this step, the step of extracting the temporal displacement features in the local features of hand joint positions includes:
acquiring T frames of dynamic gesture images, and determining the displacement feature reference points in each frame of dynamic gesture image;
determining the temporal displacement features corresponding to the dynamic gesture images according to the positions of the displacement feature reference points in every two adjacent frames.
As displacement feature reference points, the nodes of the palm part may be selected; this choice determines the overall motion of the hand more accurately, and avoids the error in overall motion direction and magnitude that selecting finger nodes would cause. As shown in Fig. 5, M of the nodes (M is 7 in Fig. 2) may be chosen as displacement feature reference points. For the given node positions $P^{h}_{t''} = \{ p^{h}_{t'',i} \mid 1 \le i \le N \}$ of the t″-th frame, where 1 < t″ ≤ T, N is the number of nodes and M is the number of selected displacement feature reference points, the selected displacement feature reference points are $Q^{h}_{t''} = \{ q^{h}_{t'',v} \mid 1 \le v \le M \}$. The temporal motion vector between the current frame $q^{h}_{t'',v}$ and the previous frame $q^{h}_{t''-1,v}$ is computed as

$$d^{h}_{t'',v} = q^{h}_{t'',v} - q^{h}_{t''-1,v}.$$

The temporal displacement feature of the t″-th frame can then be expressed as:

$$D^{h}_{t''} = \{ d^{h}_{t'',v} \mid 1 \le v \le M \}.$$
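The frame-to-frame displacement above reduces to a first difference along the time axis; a minimal numpy sketch, with a synthetic trajectory standing in for the tracked palm reference points:

```python
import numpy as np

def temporal_displacement_feature(ref_points):
    """Frame-to-frame motion vectors d_t = q_t - q_{t-1}.

    ref_points: (T, M, 3) array -- positions of the M palm reference
    points over T frames (M = 7 in the embodiment above).
    Returns a (T-1, M, 3) array: one set of displacement vectors per
    pair of adjacent frames.
    """
    q = np.asarray(ref_points, dtype=float)
    return q[1:] - q[:-1]

# Toy example: 3 frames of 2 reference points translating +1 in x per frame.
traj = np.array([[[0.0, 0, 0], [1, 0, 0]],
                 [[1, 0, 0], [2, 0, 0]],
                 [[2, 0, 0], [3, 0, 0]]])
disp = temporal_displacement_feature(traj)
print(disp.shape)  # (2, 2, 3)
```

Each slice `disp[t]` is the temporal displacement feature $D^{h}_{t''}$ of one frame; a steady translation of the whole hand shows up as a constant displacement across all reference points.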
In step S102, cluster computation is performed according to the relative position features and/or temporal displacement features, to obtain the cluster features corresponding to the relative position features and/or temporal displacement features.
The process of clustering the extracted local features, as shown in Fig. 6, may include:
In step S601, the relative position features and/or temporal displacement features are expressed as a set of unified local features.
The relative position features $R^{h}_{t'}$ and temporal displacement features $D^{h}_{t''}$ extracted in step S101 can be expressed in a unified local feature form $X_s = \{ x_{s,\tau} \}$, where one may set 1 ≤ s ≤ 4 (indexing the combinations of hand and feature type) and 1 < τ ≤ T. When only relative position features, or only temporal displacement features, are included, the corresponding restricted unified local feature form is used, e.g. $\{ x_{s,\tau} \mid 1 \le s \le 2 \}$.
In step S602, a predetermined number of clusters are selected from the set of unified local features.
In the present invention, for the s-th feature set $X_s$, the k-means++ algorithm may be used to pick K initial cluster center points $\{\mu_{s,k} \mid 1 \le k \le K\}$, and the error sum of squares is used as the clustering criterion, yielding K clusters $C_s = \{C_{s,k} \mid 1 \le k \le K\}$, i.e.:

$$\min \sum_{k=1}^{K} \sum_{x \in C_{s,k}} \lVert x - \mu_{s,k} \rVert^2,$$

where $\mu_{s,k}$ is the (updated) center point of each cluster, and K is a natural number greater than 2.
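A sketch of this clustering step using scikit-learn's `KMeans` with k-means++ initialization, which matches the seeding and error-sum-of-squares criterion described above; the feature set here is synthetic stand-in data, not real local features:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Stand-in for the unified local feature set X_s: 3-D local feature
# vectors drawn around three synthetic modes (real features would be the
# relative-position / displacement vectors pooled over all frames).
X = np.vstack([rng.normal(loc=c, scale=0.1, size=(50, 3))
               for c in ([0, 0, 0], [5, 0, 0], [0, 5, 0])])

K = 3  # predetermined number of clusters (K > 2, as in the text)
km = KMeans(n_clusters=K, init="k-means++", n_init=10, random_state=0).fit(X)

centers = km.cluster_centers_   # the mu_{s,k}
inertia = km.inertia_           # error sum of squares over all clusters
print(centers.shape)            # (3, 3)
```

`km.labels_` gives the assignment of each local feature to a cluster $C_{s,k}$, which is what the transform computation of step S603 operates on.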
In step S603, a transform computation is performed on each cluster, obtaining the cluster features corresponding to the relative position features and/or temporal displacement features.
For each cluster $C_{s,k}$, a principal component analysis (PCA) operation is performed with all components retained, yielding the transformed cluster $C'_{s,k}$.
The accumulated sum of the differences between the points of $C'_{s,k}$ and the transformed cluster center $\mu'_{s,k}$ is then computed, giving the compact cluster feature representation:

$$\nu_{s,k} = \sum_{x' \in C'_{s,k}} \left( x' - \mu'_{s,k} \right).$$
The cluster features of the K cluster center points are combined to form the compact representation of the s-th feature category:

$$V_s = \{\nu_{s,k} \mid 1 \le k \le K\}.$$

By repeating steps S602 and S603, the local feature sets $X_s$ can be expressed as a fixed-length feature representation independent of the temporal sequence length τ, i.e.:

$$V = \{V_s \mid 1 \le s \le 4\}.$$
In step S103, a cluster feature model is trained according to the correspondence between the cluster features and gesture categories, and gesture recognition is performed according to the trained cluster feature model.
After the fixed-length feature representation independent of the sequence length τ is generated, model training and testing can be carried out with a support vector machine or another training model; once a trained model is obtained, gestures can be judged and recognized by it.
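Assuming each gesture sample has already been reduced to a fixed-length vector by the pipeline above, training and recognition with a support vector machine can be sketched with scikit-learn. The features and labels below are synthetic, and the RBF kernel is an assumption; the text only names "a support vector machine or another training model":

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
# Stand-in data: each row is a flattened fixed-length representation V
# for one gesture sample; labels are gesture categories. Real features
# would come from the clustering pipeline described above.
X_train = np.vstack([rng.normal(loc=0.0, size=(20, 12)),
                     rng.normal(loc=3.0, size=(20, 12))])
y_train = np.array([0] * 20 + [1] * 20)  # two gesture classes

clf = SVC(kernel="rbf").fit(X_train, y_train)

# Recognition: classify a new fixed-length gesture feature vector.
probe = rng.normal(loc=3.0, size=(1, 12))
predicted_class = int(clf.predict(probe)[0])
print(predicted_class)
```

Because all gestures map to vectors of the same length, any off-the-shelf classifier with a standard similarity measure can be substituted here without re-deriving the feature pipeline.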
It should be understood that the sequence numbers of the steps in the above embodiment do not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and does not constitute any limitation on the implementation of the embodiments of the present invention.
Fig. 7 is a schematic structural diagram of a fine-grained gesture recognition apparatus provided by an embodiment of the present invention. As shown in Fig. 7, the fine-grained gesture recognition apparatus includes:
a local feature extraction unit 701, configured to extract local features of hand joint positions, the local features including relative position features and/or temporal displacement features;
a cluster feature computation unit 702, configured to perform cluster computation according to the relative position features and/or temporal displacement features, to obtain the corresponding cluster features;
a training and recognition unit 703, configured to train a cluster feature model according to the correspondence between the cluster features and gesture categories, and perform gesture recognition according to the trained cluster feature model.
Preferably, the local feature extraction unit includes:
a first image acquisition subunit, configured to acquire T frames of dynamic gesture images and determine the positions of the hand nodes in each frame;
a first computation subunit, configured to calculate, according to the positions of the root nodes and corresponding child nodes among the hand nodes, the relative position features of the child nodes relative to the root nodes;
and/or
a second image acquisition subunit, configured to acquire T frames of dynamic gesture images and determine the displacement feature reference points in each frame;
a second computation subunit, configured to determine the temporal displacement features corresponding to the dynamic gesture images according to the positions of the displacement feature reference points in every two adjacent frames.
Fig. 8 is a schematic diagram of a fine-grained gesture recognition device provided by an embodiment of the present invention. As shown in Fig. 8, the fine-grained gesture recognition device 8 of this embodiment includes a processor 80, a memory 81, and a computer program 82 stored in the memory 81 and executable on the processor 80, such as a fine-grained gesture recognition program. When executing the computer program 82, the processor 80 implements the steps in each of the above fine-grained gesture recognition method embodiments, such as steps 101 to 103 shown in Fig. 1; alternatively, when executing the computer program 82, the processor 80 implements the functions of the modules/units in each of the above apparatus embodiments, such as the functions of modules 701 to 703 shown in Fig. 7.
Exemplarily, the computer program 82 may be divided into one or more modules/units, which are stored in the memory 81 and executed by the processor 80 to carry out the present invention. The one or more modules/units may be a series of computer program instruction segments capable of completing specific functions, the instruction segments being used to describe the execution process of the computer program 82 in the fine-grained gesture recognition device 8. For example, the computer program 82 may be divided into a local feature extraction unit, a cluster feature computation unit and a training and recognition unit, whose specific functions are as follows:
a local feature extraction unit, configured to extract local features of hand joint positions, the local features including relative position features and/or temporal displacement features;
a cluster feature computation unit, configured to perform cluster computation according to the relative position features and/or temporal displacement features, to obtain the corresponding cluster features;
a training and recognition unit, configured to train a cluster feature model according to the correspondence between the cluster features and gesture categories, and perform gesture recognition according to the trained cluster feature model.
The fine-grained gesture recognition device 8 may be a computing device such as a desktop computer, a notebook, a palmtop computer or a cloud server. The fine-grained gesture recognition device may include, but is not limited to, the processor 80 and the memory 81. Those skilled in the art will appreciate that Fig. 8 is only an example of the fine-grained gesture recognition device 8 and does not constitute a limitation on it; the device may include more or fewer components than shown, or combine certain components, or have different components; for example, the fine-grained gesture recognition device may also include input/output devices, network access devices, buses, and so on.
The processor 80 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or any conventional processor.
The memory 81 may be an internal storage unit of the fine-grained gesture recognition device 8, such as its hard disk or internal memory. The memory 81 may also be an external storage device of the fine-grained gesture recognition device 8, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card or a flash card fitted to it. Further, the memory 81 may include both an internal storage unit of the fine-grained gesture recognition device 8 and an external storage device. The memory 81 is used to store the computer program and the other programs and data required by the fine-grained gesture recognition device; it may also be used to temporarily store data that has been or will be output.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, only the division of the above functional units and modules is illustrated by example. In practical applications, the above functions may be allocated to different functional units and modules as needed, i.e. the internal structure of the apparatus may be divided into different functional units or modules to complete all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other, and do not limit the protection scope of the present application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which will not be repeated here.
In the above embodiments, the description of each embodiment has its own emphasis. For parts not described or recorded in detail in a certain embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware, or by a combination of computer software and electronic hardware. Whether these functions are performed in hardware or software depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementations should not be considered beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed device/terminal equipment and method may be implemented in other ways. For example, the device/terminal equipment embodiments described above are merely illustrative. For instance, the division of the modules or units is only a division of logical functions, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place, or may be distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated module/unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the present invention may implement all or part of the flows of the methods in the above embodiments by instructing relevant hardware through a computer program. The computer program may be stored in a computer-readable storage medium, and when executed by a processor, the computer program can implement the steps of each of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electric carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, according to legislation and patent practice, computer-readable media do not include electric carrier signals and telecommunication signals.
The above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that modifications may still be made to the technical solutions described in the foregoing embodiments, or equivalent replacements may be made to some of the technical features therein; and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall all fall within the protection scope of the present invention.

Claims (10)

  1. A refined gesture recognition method, characterized in that the refined gesture recognition method comprises:
    extracting local features of hand joint positions, the local features comprising relative position features and/or temporal displacement features;
    performing cluster calculation according to the relative position features and/or the temporal displacement features, to obtain cluster features corresponding to the relative position features and/or the temporal displacement features;
    performing cluster feature model training according to the correspondence between the cluster features and gesture categories, and performing gesture recognition according to the trained cluster feature model.
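As an illustrative, non-limiting sketch of the first two claimed steps (local feature extraction and cluster feature computation), the following Python fragment shows one possible realization; the array layout, the root-node convention (node 0), and the histogram encoding of cluster assignments are assumptions made for illustration only, not part of the claim:

```python
import numpy as np

def extract_local_features(joints):
    """joints: (T, N, 3) array of hand-joint positions over T frames.
    Returns relative position features (child node minus root node,
    root assumed at index 0) and temporal displacement features
    (frame-to-frame motion of each node)."""
    relative = joints[:, 1:, :] - joints[:, :1, :]        # (T, N-1, 3)
    displacement = joints[1:] - joints[:-1]               # (T-1, N, 3)
    return (relative.reshape(len(joints), -1),
            displacement.reshape(len(joints) - 1, -1))

def cluster_features(local_feats, centers):
    """Encode a set of local features as a normalized histogram of
    nearest cluster centers (a bag-of-features style cluster feature)."""
    d = np.linalg.norm(local_feats[:, None, :] - centers[None, :, :], axis=2)
    hist = np.bincount(d.argmin(axis=1), minlength=len(centers))
    return hist / hist.sum()
```

A classifier would then be trained on such cluster-feature vectors against gesture category labels, completing the third claimed step.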
  2. The refined gesture recognition method according to claim 1, characterized in that the step of extracting the relative position features in the local features of hand joint positions comprises:
    acquiring T frames of dynamic gesture images, and determining the position of each node of the hand in each frame of dynamic gesture image;
    calculating, according to the positions of a root node and corresponding child nodes included in the nodes of the hand, the relative position features of the child nodes relative to the root node.
  3. The refined gesture recognition method according to claim 2, characterized in that the step of calculating, according to the positions of the root node and the corresponding child nodes included in the nodes of the hand, the relative position features of the child nodes relative to the root node comprises:
    obtaining the relative position feature of the t'-th frame among the T frames of dynamic gesture images, the relative position feature being determined from the position of the root node and the position of the child node corresponding to the root node, wherein u ∈ {i | 1 ≤ i ≤ N}, h = 1 denotes the left hand, h = 2 denotes the right hand, 1 ≤ t' ≤ T, and N is the number of nodes.
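A minimal sketch of the per-frame relative position feature of claim 3, assuming the joints of one hand arrive as an (N, 3) array with the root node at index 0 (a hypothetical convention; the claim does not fix the root index):

```python
import numpy as np

def relative_position_feature(frame_joints, root=0):
    """frame_joints: (N, 3) positions of the N hand nodes in frame t'.
    Returns each child node's position minus the root node's position,
    i.e. the relative position features of this hand in this frame."""
    children = np.delete(frame_joints, root, axis=0)  # all non-root nodes
    return children - frame_joints[root]              # (N-1, 3) offsets
```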
  4. The refined gesture recognition method according to claim 1, characterized in that the step of extracting the temporal displacement features in the local features of hand joint positions comprises:
    acquiring T frames of dynamic gesture images, and determining displacement feature reference points in each frame of dynamic gesture image;
    determining, according to the positions of the displacement feature reference points in every two adjacent frames of images, the temporal displacement features corresponding to the dynamic gesture images.
  5. The refined gesture recognition method according to claim 4, characterized in that the step of determining, according to the positions of the displacement feature reference points in every two adjacent frames of images, the temporal displacement features corresponding to the dynamic gesture images comprises:
    obtaining the temporal displacement feature of the t''-th frame, the temporal displacement feature being determined from the displacement feature reference point of the t''-th frame and the displacement feature reference point of the (t''-1)-th frame, wherein 1 < t'' ≤ T, v ∈ {i | 1 ≤ i ≤ N}, M is the number of displacement feature reference points and 1 ≤ v ≤ M, h = 1 denotes the left hand, h = 2 denotes the right hand, and N is the number of nodes.
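The frame-to-frame displacement of claim 5 can be sketched as follows, assuming the M displacement feature reference points of one hand are tracked as a (T, M, 3) array (an assumed layout, for illustration only):

```python
import numpy as np

def temporal_displacement(ref_points):
    """ref_points: (T, M, 3) positions of M displacement feature reference
    points over T frames. The feature of frame t'' (1 < t'' <= T) is the
    reference point position minus its position in frame t'' - 1."""
    return ref_points[1:] - ref_points[:-1]  # (T-1, M, 3) displacements
```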
  6. The refined gesture recognition method according to claim 1, characterized in that the step of performing cluster calculation according to the relative position features and/or the temporal displacement features, to obtain the cluster features corresponding to the relative position features and/or the temporal displacement features, comprises:
    expressing the relative position features and/or the temporal displacement features as a set of unified local features;
    selecting a predetermined number of cluster sets from the set of unified local features;
    performing a transformation calculation on each cluster set, to obtain the cluster features corresponding to the relative position features and/or the temporal displacement features.
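One plausible reading of claim 6 uses k-means to form the predetermined number of cluster sets and a VLAD-style residual sum as the "transformation calculation"; both choices are assumptions for illustration, since the claim does not name a specific clustering algorithm or transform:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Partition the unified local features X (n, d) into k cluster sets."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(0)  # recenter on members
    return centers

def vlad_encode(X, centers):
    """Transform each cluster set into a residual-sum vector and
    concatenate, yielding one normalized cluster feature."""
    labels = np.linalg.norm(X[:, None] - centers[None], axis=2).argmin(1)
    enc = np.zeros_like(centers)
    for j in range(len(centers)):
        if (labels == j).any():
            enc[j] = (X[labels == j] - centers[j]).sum(0)
    v = enc.ravel()
    return v / (np.linalg.norm(v) + 1e-12)
```

The resulting fixed-length vector is what a cluster feature model could then be trained on, per the correspondence with gesture categories in claim 1.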
  7. A refined gesture recognition device, characterized in that the refined gesture recognition device comprises:
    a local feature extraction unit, configured to extract local features of hand joint positions, the local features comprising relative position features and/or temporal displacement features;
    a cluster feature calculation unit, configured to perform cluster calculation according to the relative position features and/or the temporal displacement features, to obtain cluster features corresponding to the relative position features and/or the temporal displacement features;
    a training and recognition unit, configured to perform cluster feature model training according to the correspondence between the cluster features and gesture categories, and to perform gesture recognition according to the trained cluster feature model.
  8. The refined gesture recognition device according to claim 7, characterized in that the local feature extraction unit comprises:
    a first image acquisition subunit, configured to acquire T frames of dynamic gesture images, and determine the position of each node of the hand in each frame of dynamic gesture image;
    a first calculation subunit, configured to calculate, according to the positions of a root node and corresponding child nodes included in the nodes of the hand, the relative position features of the child nodes relative to the root node;
    and/or
    a second image acquisition subunit, configured to acquire T frames of dynamic gesture images, and determine displacement feature reference points in each frame of dynamic gesture image;
    a second calculation subunit, configured to determine, according to the positions of the displacement feature reference points in every two adjacent frames of images, the temporal displacement features corresponding to the dynamic gesture images.
  9. A refined gesture recognition apparatus, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that when the processor executes the computer program, the steps of the refined gesture recognition method according to any one of claims 1 to 6 are implemented.
  10. A computer-readable storage medium storing a computer program, characterized in that when the computer program is executed by a processor, the steps of the refined gesture recognition method according to any one of claims 1 to 6 are implemented.
CN201710656434.0A 2017-08-03 2017-08-03 Refined gesture recognition method, device and equipment Active CN107368820B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710656434.0A CN107368820B (en) 2017-08-03 2017-08-03 Refined gesture recognition method, device and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710656434.0A CN107368820B (en) 2017-08-03 2017-08-03 Refined gesture recognition method, device and equipment

Publications (2)

Publication Number Publication Date
CN107368820A true CN107368820A (en) 2017-11-21
CN107368820B CN107368820B (en) 2023-04-18

Family

ID=60309287

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710656434.0A Active CN107368820B (en) 2017-08-03 2017-08-03 Refined gesture recognition method, device and equipment

Country Status (1)

Country Link
CN (1) CN107368820B (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108346168A (en) * 2018-02-12 2018-07-31 腾讯科技(深圳)有限公司 A kind of images of gestures generation method, device and storage medium
CN108921101A (en) * 2018-07-04 2018-11-30 百度在线网络技术(北京)有限公司 Processing method, equipment and readable storage medium storing program for executing based on gesture identification control instruction
CN109117766A (en) * 2018-07-30 2019-01-01 上海斐讯数据通信技术有限公司 A kind of dynamic gesture identification method and system
CN109117771A (en) * 2018-08-01 2019-01-01 四川电科维云信息技术有限公司 Incident of violence detection system and method in a kind of image based on anchor node
CN109992093A (en) * 2017-12-29 2019-07-09 博世汽车部件(苏州)有限公司 A kind of gesture comparative approach and gesture comparison system
CN110163130A (en) * 2019-05-08 2019-08-23 清华大学 A kind of random forest grader and classification method of the feature pre-align for gesture identification
CN111222486A (en) * 2020-01-15 2020-06-02 腾讯科技(深圳)有限公司 Training method, device and equipment for hand gesture recognition model and storage medium
US20210326657A1 (en) * 2020-04-21 2021-10-21 Pegatron Corporation Image recognition method and device thereof and ai model training method and device thereof

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100071965A1 (en) * 2008-09-23 2010-03-25 Panasonic Corporation System and method for grab and drop gesture recognition
US20100188519A1 (en) * 2009-01-29 2010-07-29 Keisuke Yamaoka Information Processing Device and Method, Program, and Recording Medium
CN101976330A (en) * 2010-09-26 2011-02-16 中国科学院深圳先进技术研究院 Gesture recognition method and system
US20110158476A1 (en) * 2009-12-24 2011-06-30 National Taiwan University Of Science And Technology Robot and method for recognizing human faces and gestures thereof
CN103246891A (en) * 2013-05-28 2013-08-14 重庆邮电大学 Chinese sign language recognition method based on kinect
US20140201126A1 (en) * 2012-09-15 2014-07-17 Lotfi A. Zadeh Methods and Systems for Applications for Z-numbers
CN104598915A (en) * 2014-01-24 2015-05-06 深圳奥比中光科技有限公司 Gesture recognition method and gesture recognition device
CN106886751A (en) * 2017-01-09 2017-06-23 深圳数字电视国家工程实验室股份有限公司 A kind of gesture identification method and system
CN106937531A (en) * 2014-06-14 2017-07-07 奇跃公司 Method and system for producing virtual and augmented reality

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100071965A1 (en) * 2008-09-23 2010-03-25 Panasonic Corporation System and method for grab and drop gesture recognition
US20100188519A1 (en) * 2009-01-29 2010-07-29 Keisuke Yamaoka Information Processing Device and Method, Program, and Recording Medium
US20110158476A1 (en) * 2009-12-24 2011-06-30 National Taiwan University Of Science And Technology Robot and method for recognizing human faces and gestures thereof
CN101976330A (en) * 2010-09-26 2011-02-16 中国科学院深圳先进技术研究院 Gesture recognition method and system
US20140201126A1 (en) * 2012-09-15 2014-07-17 Lotfi A. Zadeh Methods and Systems for Applications for Z-numbers
CN103246891A (en) * 2013-05-28 2013-08-14 重庆邮电大学 Chinese sign language recognition method based on kinect
CN104598915A (en) * 2014-01-24 2015-05-06 深圳奥比中光科技有限公司 Gesture recognition method and gesture recognition device
CN106937531A (en) * 2014-06-14 2017-07-07 奇跃公司 Method and system for producing virtual and augmented reality
CN106886751A (en) * 2017-01-09 2017-06-23 深圳数字电视国家工程实验室股份有限公司 A kind of gesture identification method and system

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YUNGHER DON等: "Improving fine motor function after brain injury using gesture recognition biofeedback", 《DISABILITY AND REHABILITATION: ASSISTIVE TECHNOLOGY》 *
CAO Jie et al.: "Dynamic gesture recognition method based on RGB-D information", Application Research of Computers *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109992093A (en) * 2017-12-29 2019-07-09 博世汽车部件(苏州)有限公司 A kind of gesture comparative approach and gesture comparison system
CN109992093B (en) * 2017-12-29 2024-05-03 博世汽车部件(苏州)有限公司 Gesture comparison method and gesture comparison system
CN108346168A (en) * 2018-02-12 2018-07-31 腾讯科技(深圳)有限公司 A kind of images of gestures generation method, device and storage medium
CN108346168B (en) * 2018-02-12 2019-08-13 腾讯科技(深圳)有限公司 A kind of images of gestures generation method, device and storage medium
US11061479B2 (en) 2018-07-04 2021-07-13 Baidu Online Network Technology (Beijing) Co., Ltd. Method, device and readable storage medium for processing control instruction based on gesture recognition
CN108921101A (en) * 2018-07-04 2018-11-30 百度在线网络技术(北京)有限公司 Processing method, equipment and readable storage medium storing program for executing based on gesture identification control instruction
CN109117766A (en) * 2018-07-30 2019-01-01 上海斐讯数据通信技术有限公司 A kind of dynamic gesture identification method and system
CN109117771A (en) * 2018-08-01 2019-01-01 四川电科维云信息技术有限公司 Incident of violence detection system and method in a kind of image based on anchor node
CN109117771B (en) * 2018-08-01 2022-05-27 四川电科维云信息技术有限公司 System and method for detecting violence events in image based on anchor nodes
CN110163130B (en) * 2019-05-08 2021-05-28 清华大学 Feature pre-alignment random forest classification system and method for gesture recognition
CN110163130A (en) * 2019-05-08 2019-08-23 清华大学 A kind of random forest grader and classification method of the feature pre-align for gesture identification
CN111222486A (en) * 2020-01-15 2020-06-02 腾讯科技(深圳)有限公司 Training method, device and equipment for hand gesture recognition model and storage medium
CN111222486B (en) * 2020-01-15 2022-11-04 腾讯科技(深圳)有限公司 Training method, device and equipment for hand gesture recognition model and storage medium
US20210326657A1 (en) * 2020-04-21 2021-10-21 Pegatron Corporation Image recognition method and device thereof and ai model training method and device thereof

Also Published As

Publication number Publication date
CN107368820B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN107368820A (en) One kind becomes more meticulous gesture identification method, device and equipment
Cornea et al. Computing hierarchical curve-skeletons of 3D objects
Zhu et al. A cuboid CNN model with an attention mechanism for skeleton-based action recognition
CN107820593A (en) A kind of virtual reality exchange method, apparatus and system
WO2021120834A1 (en) Biometrics-based gesture recognition method and apparatus, computer device, and medium
Liang et al. Model-based hand pose estimation via spatial-temporal hand parsing and 3D fingertip localization
CN110363077A (en) Sign Language Recognition Method, device, computer installation and storage medium
CN103793683B (en) Gesture recognition method and electronic device
CN108229496A (en) The detection method and device of dress ornament key point, electronic equipment, storage medium and program
Liu et al. Kinect-based hand gesture recognition using trajectory information, hand motion dynamics and neural networks
CN107688824A (en) Picture match method and terminal device
Geng et al. Gated path selection network for semantic segmentation
CN107958230A (en) Facial expression recognizing method and device
CN108229559A (en) Dress ornament detection method, device, electronic equipment, program and medium
CN104685540A (en) Image semantic segmentation method and apparatus
De Smedt et al. 3d hand gesture recognition by analysing set-of-joints trajectories
CN110489424A (en) A kind of method, apparatus, storage medium and the electronic equipment of tabular information extraction
Hirzer et al. Smart hypothesis generation for efficient and robust room layout estimation
Zhang Application of intelligent virtual reality technology in college art creation and design teaching
Yoo et al. Fast and accurate 3D hand pose estimation via recurrent neural network for capturing hand articulations
CN104573737B (en) The method and device of positioning feature point
CN110046340A (en) The training method and device of textual classification model
Mousas et al. Efficient hand-over motion reconstruction
Zhang et al. Accurate 3D hand pose estimation network utilizing joints information
Li Badminton motion capture with visual image detection of picking robotics

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant