CN106843472B - Gesture recognition method and device, virtual reality equipment and programmable equipment - Google Patents


Info

Publication number
CN106843472B
CN106843472B (application CN201611239018.2A)
Authority
CN
China
Prior art keywords
gesture
outline
real
standard
standard gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611239018.2A
Other languages
Chinese (zh)
Other versions
CN106843472A (en)
Inventor
李文凤
张超
张绍谦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Techology Co Ltd filed Critical Goertek Techology Co Ltd
Priority to CN201611239018.2A
Publication of CN106843472A
Application granted
Publication of CN106843472B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture-based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V 40/107: Static hand or arm
    • G06V 40/113: Recognition of static hand signs
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V 40/107: Static hand or arm
    • G06V 40/117: Biometrics derived from hands

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a gesture recognition method and apparatus, a virtual reality device, and a programmable device. The method comprises the following steps: processing a captured real gesture picture to obtain a real gesture outline image; extracting a feature vector from the real gesture outline image; comparing the feature vector of the real gesture outline image with the feature vector of each standard gesture outline image one by one, and determining the standard gesture outline image closest to the real gesture outline image as the similar standard gesture outline image; and calculating the difference between the real gesture outline image and the similar standard gesture outline image: if the difference is within a certain range, taking the standard gesture corresponding to the similar standard gesture outline image as the recognized gesture; otherwise, adjusting the degree-of-freedom parameters of the standard gesture corresponding to the similar standard gesture outline image so that the difference between the adjusted standard gesture outline image and the real gesture outline image falls within the certain range, and taking the adjusted standard gesture as the recognized gesture.

Description

Gesture recognition method and device, virtual reality equipment and programmable equipment
Technical Field
The present invention relates to gesture recognition technologies, and in particular, to a gesture recognition method, a gesture recognition apparatus, a virtual reality device, and a programmable device.
Background
With the development of machine vision, virtual reality technology has come into use in many fields. In virtual reality applications, a human-computer interaction channel must be provided so that the user can take part in designing and transforming the virtual scene. Such interaction can be realized with gesture recognition technology. Because gestures in virtual reality are complex and varied, accurately recognizing the bending, twisting, and movement of the user's fingers has typically required several photographs taken from different angles.
The PSO (Particle Swarm Optimization) algorithm is a common algorithm for solving optimization problems. PCA (Principal Component Analysis) is a commonly used data analysis method, often employed to reduce the dimensionality of high-dimensional data: through a linear transformation it converts the original data into a set of linearly independent components, extracting the data's principal features. "Image difference extraction" means judging the difference between two pictures with a specific algorithm; in the prior art, this difference can be calculated by an exclusive-or operation.
Disclosure of Invention
The invention aims to provide a novel gesture recognition technical scheme.
According to a first aspect of the present invention, there is provided a gesture recognition method comprising the following steps:
processing a captured real gesture picture to obtain a real gesture outline image; extracting a feature vector of the real gesture outline image from the real gesture outline image;
comparing the feature vector of the real gesture outline image with the feature vector of each standard gesture outline image one by one, and determining the standard gesture outline image closest to the real gesture outline image as a similar standard gesture outline image;
and calculating the difference between the real gesture outline image and the similar standard gesture outline image; if the difference is within a certain range, taking the standard gesture corresponding to the similar standard gesture outline image as the recognized gesture; otherwise, adjusting the degree-of-freedom parameters of the standard gesture corresponding to the similar standard gesture outline image so that the difference between the adjusted standard gesture outline image and the real gesture outline image is within the certain range, and taking the adjusted standard gesture as the recognized gesture.
Optionally, the standard gesture outline image is obtained by performing binarization processing on a standard gesture image, and the real gesture outline image is obtained by performing binarization processing on the real gesture picture.
Optionally, each standard gesture image is generated by adjusting the degree-of-freedom parameters of the hand joints.
Optionally, PCA dimension reduction processing is performed on the standard gesture outline image to obtain its feature vector, and PCA dimension reduction processing is performed on the real gesture outline image to obtain its feature vector.
Optionally, the degree-of-freedom parameters of the standard gesture corresponding to the similar standard gesture outline image are adjusted through a PSO algorithm, so that the difference between the adjusted standard gesture outline image and the real gesture outline image is within the certain range.
Optionally, the difference between the real gesture outline image and the similar standard gesture outline image is calculated through an exclusive-or operation.
According to a second aspect of the invention, there is provided a virtual reality device comprising a memory and a processor, the memory storing instructions for controlling the processor to operate so as to perform the gesture recognition method described above.
According to a third aspect of the invention, there is provided a programmable device comprising a memory and a processor, the memory storing instructions for controlling the processor to operate so as to perform the gesture recognition method described above.
According to a fourth aspect of the present invention, there is provided a gesture recognition apparatus, comprising:
a contour extraction module, configured to process a captured real gesture picture to obtain a real gesture outline image;
a feature vector extraction module, configured to extract a feature vector of the real gesture outline image from the real gesture outline image;
a feature vector comparison module, configured to compare the feature vector of the real gesture outline image with the feature vector of each standard gesture outline image one by one and determine the standard gesture outline image closest to the real gesture outline image as a similar standard gesture outline image;
and a recognition module, configured to calculate the difference between the real gesture outline image and the similar standard gesture outline image; if the difference is within a certain range, to take the standard gesture corresponding to the similar standard gesture outline image as the recognized gesture; otherwise, to adjust the degree-of-freedom parameters of the standard gesture corresponding to the similar standard gesture outline image so that the difference between the adjusted standard gesture outline image and the real gesture outline image is within the certain range, and to take the adjusted standard gesture as the recognized gesture.
Optionally, the feature vector extraction module is configured to perform PCA dimension reduction on the real gesture profile to obtain a feature vector of the real gesture profile.
Optionally, the recognition module is configured to adjust, by using a PSO algorithm, a degree of freedom parameter of a standard gesture corresponding to the similar standard gesture profile, so that a difference between the adjusted standard gesture profile and the real gesture profile is within the certain range.
According to the invention, a similar standard gesture outline image is determined by comparing the feature vector of the real gesture outline image with the feature vectors of the standard gesture outline images. If the difference between the real gesture outline image and the similar standard gesture outline image is within a certain range, the standard gesture corresponding to the similar standard gesture outline image is taken as the recognized gesture; otherwise, the degree-of-freedom parameters of that standard gesture are adjusted so that the difference between the adjusted standard gesture outline image and the real gesture outline image falls within the certain range, and the adjusted standard gesture is taken as the recognized gesture. The invention thereby enables a user's gesture to be recognized from a single photograph.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required to be used in the embodiments will be briefly described below. It is appreciated that the following drawings depict only certain embodiments of the invention and are therefore not to be considered limiting of its scope. For a person skilled in the art, it is possible to derive other relevant figures from these figures without inventive effort.
Fig. 1 is a schematic flow chart illustrating a gesture recognition method according to an embodiment of the present invention;
FIG. 2 is a flow chart illustrating a gesture recognition method according to another embodiment of the present invention;
FIG. 3 is a block diagram of a gesture recognition apparatus provided by an embodiment of the present invention;
fig. 4 is a block diagram illustrating a hardware configuration of a virtual reality device according to an embodiment of the present invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Referring to fig. 1, a gesture recognition method provided by an embodiment of the present invention is described, which includes the following steps:
101. A captured real gesture picture is processed to obtain a real gesture outline image, and a feature vector of the real gesture outline image is extracted from it.
In one embodiment, the real gesture outline image can be obtained by performing binarization processing on the real gesture picture, and its feature vector can be obtained by performing PCA dimension reduction processing on the real gesture outline image.
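The binarization and PCA dimension reduction described above can be sketched as follows. This is a minimal illustration using NumPy's SVD rather than any particular library the patent assumes; the threshold value and the number of retained components are arbitrary choices for the sketch.

```python
import numpy as np

def binarize(image, threshold=128):
    """Reduce a grayscale hand picture to a binary outline mask (0/1)."""
    return (image >= threshold).astype(np.uint8)

def fit_pca(samples, n_components):
    """Fit PCA on a matrix of flattened outline images (one row per image).
    Returns the top principal axes and the sample mean."""
    mean = samples.mean(axis=0)
    # SVD of the centered data; rows of vt are the principal axes.
    _, _, vt = np.linalg.svd(samples - mean, full_matrices=False)
    return vt[:n_components], mean

def pca_feature(outline, components, mean):
    """Project one outline image onto the principal axes, yielding
    its low-dimensional feature vector."""
    return components @ (outline.ravel().astype(float) - mean)
```

Fitting is done once on the library of standard outline images; at recognition time only the projection `pca_feature` runs on the real outline image, which is what reduces the computation.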
102. The feature vector of the real gesture outline image is compared one by one with the feature vector of each standard gesture outline image, and the standard gesture outline image closest to the real gesture outline image is determined as the similar standard gesture outline image.
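The one-by-one comparison of step 102 amounts to a nearest-neighbour search in feature space. A sketch follows; the Euclidean metric is an assumption, as the patent does not name a distance measure.

```python
import numpy as np

def closest_standard_outline(real_vec, standard_vecs):
    """Return the index of the standard outline image whose feature
    vector is nearest to the real gesture's feature vector."""
    dists = np.linalg.norm(standard_vecs - real_vec, axis=1)
    return int(np.argmin(dists))
```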
In one embodiment, the standard gesture images can be generated in advance by adjusting the degree-of-freedom parameters of the hand joints; for example, multiple standard gesture images can be generated with the LibHand library by setting the degree-of-freedom parameters of multiple hand joints. Binarization processing is performed on each standard gesture image to obtain a standard gesture outline image, and PCA dimension reduction processing can be performed on each standard gesture outline image to obtain its feature vector.
103. The difference between the real gesture outline image and the similar standard gesture outline image is calculated. If the difference is within a certain range, the standard gesture corresponding to the similar standard gesture outline image is taken as the recognized gesture; otherwise, the degree-of-freedom parameters of that standard gesture are adjusted so that the difference between the adjusted standard gesture outline image and the real gesture outline image is within the certain range, and the adjusted standard gesture is taken as the recognized gesture. A relevant device object is then controlled to perform the operation mapped to the recognized gesture.
In one embodiment, the certain range may be an empirical value, which is set by a technician according to the actual application requirement.
In one embodiment, the difference between the real gesture outline image and the similar standard gesture outline image may be calculated by an exclusive-or operation.
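The exclusive-or difference can be sketched as a pixel-wise XOR of the two binary outline images, with the fraction of mismatched pixels serving as the difference value; normalizing by image size is an assumption made for the sketch, not something the patent specifies.

```python
import numpy as np

def outline_difference(a, b):
    """Pixel-wise XOR of two binary outline images; returns the
    fraction of pixels where the outlines disagree (0.0 .. 1.0)."""
    mismatch = np.bitwise_xor(a, b)
    return mismatch.sum() / mismatch.size
```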
In one embodiment, the degree-of-freedom parameters of the standard gesture corresponding to the similar standard gesture outline image may be adjusted by the PSO algorithm, so that the outline image of the adjusted standard gesture differs from the real gesture outline image by an amount within the certain range. For example, the LibHand library provides a parameter interface for 18 joint points of the hand, each with three degrees of freedom. Each degree-of-freedom parameter of the standard gesture corresponding to the similar standard gesture outline image is used as a parameter input of the PSO algorithm, which adjusts the parameters within their value ranges based on fitness-function feedback until the outline image corresponding to the adjusted standard gesture differs from the real gesture outline image by an amount within the certain range.
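A minimal PSO sketch of this adjustment step follows, under stated assumptions: `render` is a hypothetical stand-in for rendering an outline image from a degree-of-freedom parameter vector (e.g. via LibHand, whose actual interface is not reproduced here), the fitness is the XOR mismatch fraction, and the inertia/acceleration coefficients and bounds are arbitrary sketch values.

```python
import numpy as np

def pso_adjust(render, real_outline, n_params, n_particles=20,
               iters=50, bounds=(-1.0, 1.0), seed=0):
    """Adjust degree-of-freedom parameters by particle swarm
    optimization so the rendered outline approaches the real one.
    Fitness = fraction of pixels where the two outlines disagree."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds

    def fitness(p):
        mism = np.bitwise_xor(render(p), real_outline)
        return mism.sum() / mism.size

    pos = rng.uniform(lo, hi, (n_particles, n_params))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_f = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, n_params))
        # Inertia + cognitive + social terms (standard PSO update).
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        f = np.array([fitness(p) for p in pos])
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()

    return gbest, float(pbest_f.min())
```

On each iteration every particle's candidate parameters are rendered and compared against the real outline image, and the personal and global bests steer the swarm toward parameters whose outline falls within the acceptance range.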
Referring to fig. 2, a gesture recognition method according to another embodiment of the present invention is described, which includes the following steps:
201. The degree-of-freedom parameters of the hand joints are adjusted to generate a plurality of standard gesture images; then step 202 is executed.
202. Binarization processing is performed on each standard gesture image to obtain a standard gesture outline image, and PCA dimension reduction processing is performed on the standard gesture outline image to obtain its feature vector; then step 203 is executed.
For different gestures, the external contour is an intuitive distinguishing feature; the feature vectors of the standard gesture outline images of all standard gestures form a sample library.
203. Binarization processing is performed on the captured real gesture picture to obtain a real gesture outline image, and PCA dimension reduction processing is performed on the real gesture outline image to obtain its feature vector; then step 204 is executed.
204. The feature vector of the real gesture outline image is compared one by one with the feature vector of each standard gesture outline image, and the standard gesture outline image closest to the real gesture outline image is determined as the similar standard gesture outline image; then step 205 is executed.
In step 204, the real gesture outline image is matched one by one against the standard gesture outline images in the sample library, and the standard gesture outline image whose feature vector differs least from that of the real gesture outline image is taken as the similar standard gesture outline image.
205. The difference between the real gesture outline image and the similar standard gesture outline image is calculated by an exclusive-or operation; then step 206 is executed.
206. It is judged whether the difference between the real gesture outline image and the similar standard gesture outline image is within a certain range; if so, step 208 is executed, otherwise step 207 is executed.
207. The degree-of-freedom parameters of the standard gesture corresponding to the similar standard gesture outline image are adjusted, and then step 205 is executed again.
208. The standard gesture corresponding to the similar standard gesture outline image is taken as the recognized gesture, and a relevant device object is controlled to perform the operation mapped to the recognized gesture.
Steps 205-207 narrow the difference between the similar standard gesture outline image and the real gesture outline image to within the certain range by adjusting the degree-of-freedom parameters of the corresponding standard gesture, and the adjusted standard gesture is taken as the recognized gesture.
It is obvious to those skilled in the art that the foregoing gesture recognition method can be implemented by hardware, software, or a combination of hardware and software. Based on the same inventive concept, a gesture recognition apparatus provided by an embodiment of the present invention is described with reference to fig. 3 to perform the aforementioned gesture recognition method.
The gesture recognition device comprises a contour extraction module 1, a feature vector extraction module 2, a feature vector comparison module 3 and a recognition module 4.
The contour extraction module 1 is configured to process a captured real gesture picture to obtain a real gesture outline image.
The feature vector extraction module 2 is configured to extract a feature vector of the real gesture outline image from the real gesture outline image.
The feature vector comparison module 3 is configured to compare the feature vector of the real gesture outline image with the feature vector of each standard gesture outline image one by one and determine the standard gesture outline image closest to the real gesture outline image as a similar standard gesture outline image.
The recognition module 4 is configured to calculate the difference between the real gesture outline image and the similar standard gesture outline image; if the difference is within a certain range, to take the standard gesture corresponding to the similar standard gesture outline image as the recognized gesture; otherwise, to adjust the degree-of-freedom parameters of that standard gesture so that the difference between the adjusted standard gesture outline image and the real gesture outline image is within the certain range, and to take the adjusted standard gesture as the recognized gesture.
Optionally, the contour extraction module 1 is configured to perform binarization processing on the real gesture picture to obtain the real gesture outline image.
Optionally, the feature vector extraction module 2 is configured to perform PCA dimension reduction processing on the real gesture outline image to obtain its feature vector.
Optionally, the recognition module 4 is configured to calculate the difference between the real gesture outline image and the similar standard gesture outline image through an exclusive-or operation.
Optionally, the recognition module 4 is configured to adjust, through the PSO algorithm, the degree-of-freedom parameters of the standard gesture corresponding to the similar standard gesture outline image, so that the difference between the adjusted standard gesture outline image and the real gesture outline image is within the certain range.
Optionally, the gesture recognition apparatus further includes a standard gesture image generation module, configured to adjust the degree-of-freedom parameters of the hand joints to generate each standard gesture image.
Optionally, the gesture recognition apparatus further includes another contour extraction module, configured to perform binarization processing on each standard gesture image to obtain a standard gesture outline image.
Optionally, the gesture recognition apparatus further includes another feature vector extraction module, configured to perform PCA dimension reduction processing on the standard gesture outline image to obtain its feature vector.
Fig. 4 is a block diagram showing an example of a hardware configuration of a virtual reality device that can be used to implement an embodiment of the present invention. The virtual reality apparatus 300 includes a processor 3010, a memory 3020, an interface device 3030, a communication device 3040, a display device 3050, an input device 3060, a speaker 3070, a microphone 3080, and the like.
The memory 3020 is configured to store instructions for controlling the processor 3010 to operate so as to perform the gesture recognition method described above.
The processor 3010 may be, for example, a central processing unit CPU, a microprocessor MCU, or the like. The memory 3020 includes, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, and the like. The interface device 3030 includes, for example, a USB interface, a headphone interface, and the like. The communication device 3040 can perform wired or wireless communication, for example. The display device 3050 is, for example, a liquid crystal display panel, a touch panel, or the like. The input device 3060 may include, for example, a touch screen, a keyboard, and the like. A user can input/output voice information through the speaker 3070 and the microphone 3080.
The virtual reality device 300 shown in fig. 4 is merely illustrative and is in no way intended to limit the invention, its application, or uses. It will be appreciated by those skilled in the art that although a plurality of devices are shown in fig. 4, the present invention may relate to only some of the devices therein. Those skilled in the art can design instructions according to the disclosed aspects, and how the instructions control the operation of the processor is well known in the art, and therefore, will not be described in detail herein.
The invention also provides a programmable device comprising a memory and a processor, wherein the memory is used for storing instructions for controlling the processor to operate so as to perform the gesture recognition method described above.
According to the gesture recognition scheme provided by the invention, a similar standard gesture outline image is determined by comparing the feature vector of the real gesture outline image with the feature vectors of the standard gesture outline images. If the difference between the real gesture outline image and the similar standard gesture outline image is within a certain range, the standard gesture corresponding to the similar standard gesture outline image is taken as the recognized gesture; otherwise, the degree-of-freedom parameters of that standard gesture are adjusted so that the difference between the adjusted standard gesture outline image and the real gesture outline image falls within the certain range, and the adjusted standard gesture is taken as the recognized gesture. The invention thereby enables a user's gesture to be recognized from a single photograph.
Optionally, in the gesture recognition scheme provided by the embodiments of the present invention, PCA dimension reduction processing is used to reduce the dimensionality of the outline images and extract their feature vectors, which reduces the amount of computation.
Optionally, in the gesture recognition scheme provided by the embodiments of the present invention, the standard gesture corresponding to the similar standard gesture outline image is optimized through the PSO algorithm, achieving high efficiency and accuracy.
It should be noted that, in the present specification, the embodiments are all described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other. It will be apparent to those skilled in the art that the above embodiments may be used alone or in combination with each other as desired. In addition, for the device embodiment, since it corresponds to the method embodiment, the description is relatively simple, and for relevant points, refer to the description of the corresponding parts of the method embodiment. The above-described apparatus embodiments are merely illustrative, in that modules illustrated as separate components may or may not be physically separate.
In addition, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based devices that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The computer program product provided in the embodiment of the present invention includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute the method described in the foregoing method embodiment, and specific implementation may refer to the method embodiment, which is not described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
It is noted that, herein, relational terms such as "first" and "second" are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual relationship or order between such entities or actions. The terms "comprises," "comprising," and any variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only a preferred embodiment of the present invention and is not intended to limit it; various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within its protection scope. It should be noted that like reference numerals and letters denote like items in the figures; once an item is defined in one figure, it need not be further defined or explained in subsequent figures.
Although some specific embodiments of the present invention have been described in detail by way of examples, it should be understood by those skilled in the art that the above examples are for illustrative purposes only and are not intended to limit the scope of the present invention. It will be appreciated by those skilled in the art that modifications may be made to the above embodiments without departing from the scope of the invention. The scope of the invention is defined by the appended claims.

Claims (10)

1. A gesture recognition method, characterized by comprising the following steps:
processing a captured real gesture picture to obtain a real gesture outline image, and extracting a feature vector of the real gesture outline image from the real gesture outline image, wherein the feature vector of the real gesture outline image is obtained by performing principal component analysis (PCA) dimensionality reduction on the real gesture outline image;
comparing the feature vector of the real gesture outline image with the feature vector of each standard gesture outline image one by one, and determining the standard gesture outline image closest to the real gesture outline image as a similar standard gesture outline image; and
calculating the difference between the real gesture outline image and the similar standard gesture outline image; if the difference is within a predetermined range, taking the standard gesture corresponding to the similar standard gesture outline image as the recognized gesture; otherwise, adjusting the degree-of-freedom parameters of the standard gesture corresponding to the similar standard gesture outline image so that the difference between the adjusted standard gesture outline image and the real gesture outline image is within the predetermined range, and taking the adjusted standard gesture as the recognized gesture.
2. The method according to claim 1, wherein the standard gesture outline image is obtained by binarizing a standard gesture image, and the real gesture outline image is obtained by binarizing the captured real gesture picture.
3. The method according to claim 2, wherein each standard gesture image is generated by adjusting the degree-of-freedom parameters of the hand joints.
4. The method according to claim 1, wherein the feature vector of each standard gesture outline image is obtained by performing PCA dimensionality reduction on the standard gesture outline image.
5. The method according to claim 1, wherein the degree-of-freedom parameters of the standard gesture corresponding to the similar standard gesture outline image are adjusted by a particle swarm optimization (PSO) algorithm, so that the difference between the adjusted standard gesture outline image and the real gesture outline image is within the predetermined range.
6. The method according to claim 1, wherein the difference between the real gesture outline image and the similar standard gesture outline image is calculated by an exclusive-or (XOR) operation.
7. A virtual reality device, comprising a memory and a processor, wherein the memory stores instructions for controlling the processor to operate so as to perform the gesture recognition method according to any one of claims 1 to 6.
8. A programmable device, comprising a memory and a processor, wherein the memory is configured to store instructions for controlling the processor to operate so as to perform the gesture recognition method according to any one of claims 1 to 6.
9. A gesture recognition apparatus, characterized by comprising:
a contour extraction module, configured to process a captured real gesture picture to obtain a real gesture outline image;
a feature vector extraction module, configured to extract a feature vector of the real gesture outline image from the real gesture outline image, specifically by performing principal component analysis (PCA) dimensionality reduction on the real gesture outline image to obtain the feature vector;
a feature vector comparison module, configured to compare the feature vector of the real gesture outline image with the feature vector of each standard gesture outline image one by one, and to determine the standard gesture outline image closest to the real gesture outline image as a similar standard gesture outline image; and
a recognition module, configured to calculate the difference between the real gesture outline image and the similar standard gesture outline image; if the difference is within a predetermined range, to take the standard gesture corresponding to the similar standard gesture outline image as the recognized gesture; otherwise, to adjust the degree-of-freedom parameters of the standard gesture corresponding to the similar standard gesture outline image so that the difference between the adjusted standard gesture outline image and the real gesture outline image is within the predetermined range, and to take the adjusted standard gesture as the recognized gesture.
10. The apparatus according to claim 9, wherein the recognition module is configured to adjust the degree-of-freedom parameters of the standard gesture corresponding to the similar standard gesture outline image by a particle swarm optimization (PSO) algorithm, so that the difference between the adjusted standard gesture outline image and the real gesture outline image is within the predetermined range.
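For readers who want a concrete picture of the pipeline recited in claim 1, its first two steps (binarize the captured picture into an outline image, extract a PCA feature vector, and find the closest standard gesture) can be sketched as follows. This is a minimal illustration under assumed details, not the patented implementation: the threshold value, image size, number of principal components, and the Euclidean nearest-neighbour comparison are all choices made here for the sketch.

```python
import numpy as np

def binarize(gray, threshold=128):
    """Binarize a grayscale gesture picture into a 0/1 outline image
    (cf. claim 2; the threshold is an assumed parameter)."""
    return (gray >= threshold).astype(np.uint8)

def pca_basis(outline_images, n_components=8):
    """Fit a PCA basis on a library of flattened standard gesture
    outline images: mean vector plus top right-singular vectors."""
    X = np.stack([img.ravel().astype(float) for img in outline_images])
    mean = X.mean(axis=0)
    # SVD of the centered data gives the principal directions.
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]

def pca_project(outline_image, mean, basis):
    """Dimension-reduced feature vector of one outline image."""
    return basis @ (outline_image.ravel().astype(float) - mean)

def closest_standard(real_feat, standard_feats):
    """One-by-one comparison in the reduced space: index of the
    standard gesture whose feature vector is closest (Euclidean)."""
    dists = [np.linalg.norm(real_feat - f) for f in standard_feats]
    return int(np.argmin(dists))
```

Fitting the basis once over the standard gesture outline images and projecting each captured frame keeps the one-by-one comparison cheap, since it runs on short feature vectors rather than on full images.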
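Claim 6 measures the residual difference between the real and the similar standard outline images with an exclusive-or operation. A sketch of that measure follows; note that aggregating the XOR result as a pixel count, and comparing it against a threshold, are this sketch's assumptions, since the claim does not fix either detail.

```python
import numpy as np

def xor_difference(outline_a, outline_b):
    """Number of pixels where two binary outline images disagree:
    the nonzero count of their element-wise exclusive-or."""
    return int(np.count_nonzero(np.logical_xor(outline_a, outline_b)))

def is_recognized(real_outline, standard_outline, max_diff):
    """Accept the standard gesture when the XOR difference falls
    within the allowed range (max_diff is a hypothetical threshold)."""
    return xor_difference(real_outline, standard_outline) <= max_diff
```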
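Claims 5 and 10 refine the failure path with a particle swarm optimization (PSO) search over the hand-joint degree-of-freedom parameters. The sketch below shows only the PSO loop itself: `render_outline` is a hypothetical one-parameter stand-in for posing a real articulated hand model, and the swarm size, inertia weight, and acceleration coefficients are conventional PSO values, not taken from the patent.

```python
import numpy as np

rng = np.random.default_rng(0)

def render_outline(dof, size=16):
    """Hypothetical stand-in for rendering a posed hand model: fills
    the top round(dof[0]) rows of a binary image. A real system would
    render the hand model under its joint degree-of-freedom values."""
    img = np.zeros((size, size), np.uint8)
    rows = int(np.clip(round(float(dof[0])), 0, size))
    img[:rows] = 1
    return img

def pso_adjust(target_outline, n_particles=12, iters=40, bounds=(0.0, 16.0)):
    """Minimal PSO over a single DOF parameter, minimizing the XOR
    pixel difference between the rendered and the target outline."""
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, 1))
    vel = np.zeros_like(pos)

    def cost(p):
        return int(np.count_nonzero(render_outline(p) != target_outline))

    pbest = pos.copy()
    pbest_cost = np.array([cost(p) for p in pos])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # Inertia plus pulls toward personal and global bests.
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        costs = np.array([cost(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    return gbest, cost(gbest)
```

Each particle carries a candidate DOF vector; the personal and global bests pull the swarm toward parameters whose rendered outline minimizes the XOR difference to the captured outline, which is exactly the "adjust until the difference is within the range" step of claim 1.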
CN201611239018.2A 2016-12-28 2016-12-28 Gesture recognition method and device, virtual reality equipment and programmable equipment Active CN106843472B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611239018.2A CN106843472B (en) 2016-12-28 2016-12-28 Gesture recognition method and device, virtual reality equipment and programmable equipment

Publications (2)

Publication Number Publication Date
CN106843472A CN106843472A (en) 2017-06-13
CN106843472B true CN106843472B (en) 2020-01-03

Family

ID=59114401

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611239018.2A Active CN106843472B (en) 2016-12-28 2016-12-28 Gesture recognition method and device, virtual reality equipment and programmable equipment

Country Status (1)

Country Link
CN (1) CN106843472B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108985191A (en) * 2018-06-28 2018-12-11 广东技术师范学院 A kind of contour extraction method based on mobile device gesture identification

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102122350A (en) * 2011-02-24 2011-07-13 浙江工业大学 Skeletonization and template matching-based traffic police gesture identification method
CN103116895A (en) * 2013-03-06 2013-05-22 清华大学 Method and device of gesture tracking calculation based on three-dimensional model
CN104102904A (en) * 2014-07-14 2014-10-15 济南大学 Static gesture identification method
JP2015001804A (en) * 2013-06-14 2015-01-05 国立大学法人埼玉大学 Hand gesture tracking system
CN106022227A (en) * 2016-05-11 2016-10-12 苏州大学 Gesture identification method and apparatus



Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201016
Address after: 261031 North of Yuqing Street, east of Dongming Road, High-tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)
Patentee after: GoerTek Optical Technology Co.,Ltd.
Address before: Room 308, North House Street investment service center, Laoshan District, Qingdao, Shandong, 266104
Patentee before: GOERTEK TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20221216
Address after: 266104 No. 500, Songling Road, Laoshan District, Qingdao, Shandong
Patentee after: GOERTEK TECHNOLOGY Co.,Ltd.
Address before: 261031 North of Yuqing Street, east of Dongming Road, High-tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)
Patentee before: GoerTek Optical Technology Co.,Ltd.