CN105204629A - 3D (3-dimensional) gesture recognition method - Google Patents

3D (3-dimensional) gesture recognition method

Info

Publication number: CN105204629A
Application number: CN201510552869.1A
Authority: CN (China)
Prior art keywords: gesture, data, coordinate, result, physical hardware
Legal status: Granted
Other languages: Chinese (zh)
Other versions: CN105204629B (en)
Inventors: 何杰, 杨天虎, 杨伟茂, 孙国辉
Current Assignee: Living Network Science And Technology Ltd On Chengdu
Original Assignee: Living Network Science And Technology Ltd On Chengdu
Filing date: 2015-09-02
Publication date: 2015-12-30
Application filed by Living Network Science And Technology Ltd On Chengdu
Priority to CN201510552869.1A
Current legal status: Expired - Fee Related

Landscapes

  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a 3D (3-dimensional) gesture recognition method. The method comprises the following steps: S1, physical hardware acquires 3D coordinate data of a user gesture in real time; S2, the physical hardware pre-processes the acquired 3D coordinate data to form feedback data; S3, data processing software performs recognition processing on the feedback data; S4, the system outputs the data recognition processing result. The method effectively solves the problems that video-based gesture recognition must process a large quantity of data, that the process is complicated, and that the software processing efficiency is low, because effective action judgement is performed directly on 3D space coordinates, making processing more efficient. When a gesture action occurs, it is first pre-judged in the physical hardware, the possible results of the action are then submitted to the software for processing, and finally the software processing result is confirmed against the gesture result pre-judged by the hardware, further increasing the gesture action recognition rate.

Description

A 3D gesture recognition method
Technical field
The invention belongs to the field of software technology, and specifically relates to the design of a 3D gesture recognition method.
Background art
With the rapid development of science and technology, research into natural and harmonious human-computer interaction (HCI, Human-Computer Interaction) has become very active. Human-computer interaction is gradually shifting from being computer-centered to being human-centered, toward multimodal interaction techniques in which a person's interaction intent is expressed naturally as a multidimensional information vector, just as people simultaneously use gestures, speech and eye expression when interacting with the outside world. For users to interact well with a virtual environment, gesture recognition is a very important research topic.
Computer vision has long been striving toward human-level intelligence in order to understand scenes better. Gesture recognition has long been studied with 2D vision, but with the emergence of 3D sensor technology its applications are becoming increasingly widespread and diverse. Because a 2D visual scene is inherently limited, a gesture recognition system must draw on other information containing more useful cues to obtain better results. Even when that information includes some motion tracking, it remains difficult to obtain reliable gesture results from 2D data alone.
One 3D gesture recognition approach is video-based gesture recognition, which analyzes the gesture from information captured in video in order to recognize it. The method has the advantages of simple and inexpensive equipment, but it has the drawbacks that a large amount of data must be processed during recognition, that the process is complicated, and that the software processing efficiency is low.
Summary of the invention
The object of the invention is to propose a 3D gesture recognition method in order to solve the prior-art problems that the gesture recognition process is complicated and its efficiency is low.
The technical solution of the present invention is a 3D gesture recognition method comprising the following steps:
S1, physical hardware collects 3D coordinate data of the user's gesture in real time;
S2, the physical hardware pre-processes the collected 3D coordinate data to form feedback data;
S3, data processing software performs recognition processing on the feedback data;
S4, the system outputs the data recognition processing result.
Further, step S2 is specifically:
The physical hardware compares the collected coordinate trajectory one by one against the standard coordinate trajectory of each gesture configured in the system, takes the gesture corresponding to one or more similar standard coordinate trajectories as a preliminary recognition result, and finally packs the collected 3D coordinate data together with the preliminary recognition result to form the feedback data.
Further, step S3 comprises the following sub-steps:
S31, the data processing software filters out, from the feedback data, 3D coordinate data produced without a gesture;
S32, the data processing software filters out, from the feedback data, 3D coordinate data that exceeds the effective collection thresholds set by the system;
S33, the data processing software performs gesture judgement on the filtered 3D coordinate data according to the judgement priority set by the system, and takes the gesture that meets all judgement conditions and has the highest priority as the recognition processing result;
S34, the data processing software judges whether the recognition processing result agrees with the preliminary recognition result in the feedback data; if so, step S4 is entered, otherwise the method returns to step S1.
The beneficial effects of the invention are:
(1) The invention can effectively make up for some shortcomings of 2D visual gestures in 3D space.
(2) The invention effectively solves the problems that video-based gesture recognition must process a large amount of data, that the process is complicated and that the software processing efficiency is low; effective action judgement is performed directly on 3D space coordinates, making processing more efficient.
(3) When a gesture action occurs, the invention first pre-judges it in the physical hardware, then hands the possible results of the action to the software for processing, and finally confirms the software processing result against the gesture result pre-judged by the hardware, further increasing the recognition rate of gesture actions.
(4) When a gesture action may satisfy the conditions of several gesture judgements, the invention sets a gesture judgement priority based on the pre-judged result of the physical hardware and judges the possibly qualifying gestures in turn, thereby improving the efficiency of judgement and allowing user-friendly configuration.
Brief description of the drawings
Fig. 1 is a flow chart of the 3D gesture recognition method provided by the invention.
Fig. 2 is a flow chart of the sub-steps of step S3 of the present invention.
Detailed description of the embodiments
Embodiments of the invention are further described below in conjunction with the accompanying drawings.
The invention provides a 3D gesture recognition method which, as shown in Fig. 1, comprises the following steps:
S1, physical hardware collects 3D coordinate data of the user's gesture in real time.
Here, the physical hardware refers to a hardware system that is equipped with sensors (such as infrared sensors or ultrasonic sensors) capable of collecting 3D coordinate data of the user's gesture and that can perform simple processing on the data.
S2, the physical hardware pre-processes the collected 3D coordinate data to form feedback data.
The physical hardware compares the collected coordinate trajectory one by one against the standard coordinate trajectory of each gesture configured in the system, takes the gesture corresponding to one or more similar standard coordinate trajectories as a preliminary recognition result, and finally packs the collected 3D coordinate data together with the preliminary recognition result to form the feedback data.
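For illustration only, the following Python sketch shows one way the hardware-side pre-processing of step S2 could be organized. The patent does not specify how trajectory similarity is measured or how the feedback data is encoded; the resampling step, the mean point-to-point Euclidean distance and the similarity_threshold value used here are assumptions.

    import math

    def resample(traj, n=32):
        # Resample a trajectory (list of (x, y, z) points) to n roughly evenly spaced points.
        if len(traj) < 2:
            return list(traj) * n
        step = (len(traj) - 1) / (n - 1)
        return [traj[round(i * step)] for i in range(n)]

    def trajectory_distance(a, b, n=32):
        # Mean Euclidean distance between two trajectories after resampling.
        ra, rb = resample(a, n), resample(b, n)
        return sum(math.dist(p, q) for p, q in zip(ra, rb)) / n

    def preprocess(collected, standard_trajectories, similarity_threshold=20.0):
        # Compare the collected coordinate trajectory against each standard gesture
        # trajectory and pack the raw data with the preliminary recognition result.
        preliminary = [name for name, std in standard_trajectories.items()
                       if trajectory_distance(collected, std) <= similarity_threshold]
        return {"coords": collected, "preliminary": preliminary}

In step S3 the software then uses the "preliminary" field both to order the gesture judgement and to confirm the final result.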
S3, the data processing software performs recognition processing on the feedback data.
As shown in Fig. 2, this step comprises the following sub-steps:
S31, the data processing software filters out, from the feedback data, 3D coordinate data produced without a gesture.
Here, the data processing software refers to software in the computer system that can filter the 3D coordinate data and perform logical verification and recognition processing on it.
3D coordinate data produced without a gesture refers to 3D coordinate data whose coordinate values remain unchanged throughout the decision time threshold Δt set by the system.
S32, the data processing software filters out, from the feedback data, 3D coordinate data that exceeds the effective collection thresholds set by the system.
Here, suppose the effective collection range of the physical hardware for the 3D coordinate data of the user's gesture is (0-Xmax, 0-Ymax, 0-Zmax). The system can then set corresponding effective collection thresholds Xm, Ym, Zm for the 3D coordinate data (Xm ≤ Xmax, Ym ≤ Ymax, Zm ≤ Zmax). When 3D coordinate data in the feedback data exceeds the effective collection thresholds, the system judges it to be invalid data, and the data processing software filters it out.
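A minimal sketch of the two software-side filters of steps S31 and S32 follows, assuming the samples collected within Δt are available as a list of (x, y, z) tuples; the stillness tolerance eps is an assumption, since the patent only states that the coordinate values remain unchanged.

    def filter_no_gesture(samples, eps=1.0):
        # S31: discard a window whose coordinates stay essentially unchanged over the
        # decision time threshold Δt, i.e. no gesture was produced.
        if not samples:
            return []
        xs, ys, zs = zip(*samples)
        unchanged = (max(xs) - min(xs) < eps and
                     max(ys) - min(ys) < eps and
                     max(zs) - min(zs) < eps)
        return [] if unchanged else list(samples)

    def filter_out_of_range(samples, x_m, y_m, z_m):
        # S32: discard samples whose coordinates exceed the effective collection
        # thresholds Xm, Ym, Zm set by the system.
        return [(x, y, z) for (x, y, z) in samples
                if 0 <= x <= x_m and 0 <= y <= y_m and 0 <= z <= z_m]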
S33, the data processing software performs gesture judgement on the filtered 3D coordinate data according to the judgement priority set by the system, and takes the gesture that meets all judgement conditions and has the highest priority as the recognition processing result.
In the embodiment of the present invention, three gestures are defined (a sketch of the corresponding condition checks follows the list):
(1) Contactless slide gesture:
Within the decision time threshold Δt set by the system, the X-axis coordinate of the gesture satisfies 0 < X < Xm; the Y-axis coordinate satisfies 0 < Y < Ym and changes monotonically from small to large or from large to small; and the Z-axis coordinate always stays within a certain interval, specifically 0 < Za < Z < Zb < Zm with Zb - Za ≤ Zt (where Zt is a preset effective width of the Z-axis coordinate during sliding).
(2) Contact click gesture:
Within the decision time threshold Δt set by the system, the X-axis coordinate of the gesture satisfies 0 < X < Xm, the Y-axis coordinate satisfies 0 < Y < Ym, and the Z-axis coordinate reaches Z = 0 at some point.
(3) Contactless rotation gesture:
Within the decision time threshold Δt set by the system, the Z-axis coordinate always stays within a certain interval, specifically 0 < Za < Z < Zb < Zm with Zb - Za ≤ Zt (where Zt is a preset effective width of the Z-axis coordinate during sliding), and the X-axis and Y-axis coordinates jointly satisfy R1² ≤ X² + Y² ≤ R2² (where R1 and R2 are the inner and outer radii of the gesture recognition circle, R1 < R2).
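The condition checks for the three gestures defined above can be sketched as follows, again assuming the filtered samples within Δt are a list of (x, y, z) tuples; the parameter names mirror the symbols Xm, Ym, Zm, Zt, R1 and R2 used in the text.

    def z_in_band(zs, z_m, z_t):
        # Z stays in an interval 0 < Za < Z < Zb < Zm whose width Zb - Za is at most Zt.
        return 0 < min(zs) and max(zs) < z_m and (max(zs) - min(zs)) <= z_t

    def is_slide(samples, x_m, y_m, z_m, z_t):
        # (1) Contactless slide: X and Y stay in range, Y changes monotonically,
        # Z stays within a narrow band.
        xs, ys, zs = zip(*samples)
        in_plane = all(0 < x < x_m for x in xs) and all(0 < y < y_m for y in ys)
        monotonic = list(ys) == sorted(ys) or list(ys) == sorted(ys, reverse=True)
        return in_plane and monotonic and z_in_band(zs, z_m, z_t)

    def is_click(samples, x_m, y_m):
        # (2) Contact click: X and Y stay in range and Z reaches 0 at some point.
        xs, ys, zs = zip(*samples)
        return (all(0 < x < x_m for x in xs) and all(0 < y < y_m for y in ys)
                and any(z == 0 for z in zs))

    def is_rotate(samples, z_m, z_t, r1, r2):
        # (3) Contactless rotation: Z stays within a narrow band and every (X, Y)
        # point lies in the ring R1^2 <= X^2 + Y^2 <= R2^2.
        xs, ys, zs = zip(*samples)
        return (z_in_band(zs, z_m, z_t) and
                all(r1 ** 2 <= x * x + y * y <= r2 ** 2 for x, y in zip(xs, ys)))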
Here, the system can set the corresponding gesture judgement priority according to the preliminary recognition result in the feedback data.
For example, when the preliminary recognition result in the feedback data is "contactless slide gesture" or "contactless rotation gesture", and the similarity to the "contactless slide gesture" is greater, the system can set the gesture judgement priority as: contactless slide gesture > contactless rotation gesture > contact click gesture.
Suppose:
the status flag for verifying the contactless slide gesture is Sd, where Sd = 1 indicates that this action occurred and Sd = 0 indicates that it did not;
the status flag for verifying the contact click gesture is St, where St = 1 indicates that this action occurred and St = 0 indicates that it did not;
the status flag for verifying the contactless rotation gesture is Sr, where Sr = 1 indicates that this action occurred and Sr = 0 indicates that it did not.
Then the gesture judgement flow is:
Judge whether Sd = 1 holds; if so, take the contactless slide gesture as the recognition processing result;
otherwise judge whether Sr = 1 holds; if so, take the contactless rotation gesture as the recognition processing result;
otherwise judge whether St = 1 holds; if so, take the contact click gesture as the recognition processing result, otherwise set the recognition processing result to empty.
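Under the same assumptions, the priority-ordered judgement flow for this example (slide > rotation > click), together with the step-S34 confirmation against the hardware's preliminary result, could look like the following sketch; the gesture names are assumed to match those produced by the hardware pre-processing, and the parameter dictionary p is an illustrative convention, not part of the patent.

    def judge(samples, preliminary, p):
        # S33: evaluate the status flags Sd, Sr, St and pick the highest-priority gesture.
        sd = is_slide(samples, p["x_m"], p["y_m"], p["z_m"], p["z_t"])
        sr = is_rotate(samples, p["z_m"], p["z_t"], p["r1"], p["r2"])
        st = is_click(samples, p["x_m"], p["y_m"])
        if sd:
            result = "contactless slide"
        elif sr:
            result = "contactless rotation"
        elif st:
            result = "contact click"
        else:
            result = None
        # S34: confirm against the preliminary result pre-judged by the physical hardware;
        # on a mismatch the system returns to step S1 and re-collects data.
        return result if result is not None and result in preliminary else None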
S34, the data processing software judges whether the recognition processing result agrees with the preliminary recognition result in the feedback data. If so, step S4 is entered; otherwise the gesture could not be recognized, possibly because of environmental interference or because the gesture itself was non-standard, and the method returns to step S1 to re-collect data.
S4, the system outputs the data recognition processing result.
Here, the system refers to the computer system comprising the above-mentioned physical hardware and data processing software.
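Putting the sketches above together, one illustrative software-side pass over a window of feedback data (steps S31 to S34) might read as follows; the dummy trajectories, the parameter values and the gesture name are assumptions chosen only so the example runs end to end.

    def recognize(feedback, p):
        # S31/S32: filter the raw window, then S33/S34: judge and confirm.
        samples = filter_no_gesture(feedback["coords"])
        samples = filter_out_of_range(samples, p["x_m"], p["y_m"], p["z_m"])
        if not samples:
            return None                      # nothing valid left: return to step S1
        return judge(samples, feedback["preliminary"], p)

    standard_trajectories = {"contactless slide": [(50, 10 + 4 * i, 40) for i in range(32)]}
    collected_traj = [(52, 12 + 4 * i, 41) for i in range(32)]
    params = {"x_m": 300, "y_m": 200, "z_m": 150, "z_t": 30, "r1": 20, "r2": 80}

    feedback = preprocess(collected_traj, standard_trajectories)   # step S2 (hardware side)
    print(recognize(feedback, params))                              # step S4: "contactless slide"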
Those of ordinary skill in the art will appreciate that the embodiments described here are intended to help the reader understand the principles of the present invention, and it should be understood that the protection scope of the present invention is not limited to these specific statements and embodiments. Those of ordinary skill in the art can make various other specific variations and combinations that do not depart from the essence of the present invention according to the technical teachings disclosed by the present invention, and these variations and combinations still fall within the protection scope of the present invention.

Claims (3)

1. A 3D gesture recognition method, characterized in that it comprises the following steps:
S1, physical hardware collects 3D coordinate data of the user's gesture in real time;
S2, the physical hardware pre-processes the collected 3D coordinate data to form feedback data;
S3, data processing software performs recognition processing on the feedback data;
S4, the system outputs the data recognition processing result.
2. The 3D gesture recognition method according to claim 1, characterized in that said step S2 is specifically:
the physical hardware compares the collected coordinate trajectory one by one against the standard coordinate trajectory of each gesture configured in the system, takes the gesture corresponding to one or more similar standard coordinate trajectories as a preliminary recognition result, and finally packs the collected 3D coordinate data together with the preliminary recognition result to form the feedback data.
3. The 3D gesture recognition method according to claim 2, characterized in that said step S3 comprises the following sub-steps:
S31, the data processing software filters out, from the feedback data, 3D coordinate data produced without a gesture;
S32, the data processing software filters out, from the feedback data, 3D coordinate data that exceeds the effective collection thresholds set by the system;
S33, the data processing software performs gesture judgement on the filtered 3D coordinate data according to the judgement priority set by the system, and takes the gesture that meets all judgement conditions and has the highest priority as the recognition processing result;
S34, the data processing software judges whether the recognition processing result agrees with the preliminary recognition result in the feedback data; if so, step S4 is entered, otherwise the method returns to step S1.
CN201510552869.1A 2015-09-02 2015-09-02 3D gesture recognition method Expired - Fee Related CN105204629B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510552869.1A CN105204629B (en) 2015-09-02 2015-09-02 3D gesture recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510552869.1A CN105204629B (en) 2015-09-02 2015-09-02 3D gesture recognition method

Publications (2)

Publication Number Publication Date
CN105204629A 2015-12-30
CN105204629B CN105204629B (en) 2018-11-13

Family

ID=54952363

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510552869.1A Expired - Fee Related CN105204629B (en) 2015-09-02 2015-09-02 3D gesture recognition method

Country Status (1)

Country Link
CN (1) CN105204629B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150234572A1 (en) * 2012-10-16 2015-08-20 Mitsubishi Electric Corporation Information display device and display information operation method
CN102945362A (en) * 2012-10-18 2013-02-27 中国科学院计算技术研究所 Isomerous data fusion based coordinated gesture recognition method and system of sensor
CN103118227A (en) * 2012-11-16 2013-05-22 佳都新太科技股份有限公司 Method, device and system of pan tilt zoom (PTZ) control of video camera based on kinect

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107831995A (en) * 2017-09-28 2018-03-23 努比亚技术有限公司 A kind of terminal operation control method, terminal and computer-readable recording medium
CN108874139A (en) * 2018-06-20 2018-11-23 浙江工业大学 The target exchange method and system of visual focus and hand exercise tracking coordinated drive
CN108874139B (en) * 2018-06-20 2021-02-02 浙江工业大学 Target interaction method and system cooperatively driven by visual focus and hand motion tracking

Also Published As

Publication number Publication date
CN105204629B (en) 2018-11-13

Similar Documents

Publication Publication Date Title
Yun et al. A hand gesture recognition method based on multi-feature fusion and template matching
CN104899600B (en) A kind of hand-characteristic point detecting method based on depth map
US8896522B2 (en) User-centric three-dimensional interactive control environment
CN111694428B (en) Gesture and track remote control robot system based on Kinect
CN101976330B (en) Gesture recognition method and system
CN102402289B (en) Mouse recognition method for gesture based on machine vision
Chen et al. Real-time hand tracking on depth images
CN102867173A (en) Human face recognition method and system thereof
CN104809387A (en) Video image gesture recognition based non-contact unlocking method and device
CN109271840A (en) A kind of video gesture classification method
CN106406516A (en) Local real-time movement trajectory characteristic extraction and identification method for smartphone
CN107220634B (en) Based on the gesture identification method for improving D-P algorithm and multi-template matching
CN105204629A (en) 3D (3-dimensional) gesture recognition method
CN103903279A (en) Parallel tracking system and method based on bionic binocular vision onboard platform
Nisa et al. A critical review of object detection using convolution neural network
CN111626135A (en) Three-dimensional gesture recognition system based on depth map
Tang et al. Hand tracking and pose recognition via depth and color information
Naik et al. Development of an Automated Hand Gesture Software to Control Volume for Computer
Wang et al. Real-time visual static hand gesture recognition system and its FPGA-based hardware implementation
Kim et al. Visual multi-touch air interface for barehanded users by skeleton models of hand regions
Jiang et al. A dynamic gesture recognition method based on computer vision
CN205247404U (en) System is judged to gesture based on far infrared
CN103558948A (en) Man-machine interaction method applied to virtual optical keyboard
Chen et al. A fingertips detection method based on the combination of centroid and Harris corner algorithm
Liu et al. Research on human action recognition based on global and local mixed features

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
Granted publication date: 20181113
Termination date: 20210902