CN105929939A - Remote gesture control terminal - Google Patents

Remote gesture control terminal

Info

Publication number
CN105929939A
CN105929939A (application CN201610209786.7A)
Authority
CN
China
Prior art keywords
gesture
infrared
image
module
infrared image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610209786.7A
Other languages
Chinese (zh)
Inventor
Zhou Lin (周琳)
Chen Linrui (陈林瑞)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Dongding Lizhi Information Technology Co Ltd
Original Assignee
Sichuan Dongding Lizhi Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Dongding Lizhi Information Technology Co Ltd filed Critical Sichuan Dongding Lizhi Information Technology Co Ltd
Priority to CN201610209786.7A priority Critical patent/CN105929939A/en
Publication of CN105929939A publication Critical patent/CN105929939A/en
Pending legal-status Critical Current

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20: Movements or behaviour, e.g. gesture recognition
    • G06V40/28: Recognition of hand or arm movements, e.g. recognition of deaf sign language

Abstract

The invention discloses a remote gesture control terminal. The terminal has the following advantages and beneficial effects: it enables gesture control of an intelligent device from a relatively long distance and can accurately acquire gesture control information; and because the system uses a curvature-change trace analysis method, the recognition rate of gesture commands is greatly improved.

Description

A remote gesture control terminal
Technical Field
The present invention relates to a remote gesture control terminal.
Background Art
With the proliferation of intelligent electronic devices and the development of human-computer interaction technology, gesture recognition is being applied in more and more intelligent electronic devices. In human-computer interaction, people are trying to move away from computer-centred interaction towards human-centred interaction, shifting from the user adapting to the working mode of the computer to the computer adapting to the natural needs of the user.
As one of the most intuitive means of communication, gestures are widely used in daily life. As an important medium of communication between people, gestures carry vivid and rich information. Gesture recognition, as a branch of human-computer interaction, is therefore natural, expressive and direct, and has increasingly become a focus of research.
In the prior art, the most mature form of gesture recognition uses image recognition technology to recognise interactive commands. This image-based method generally requires a high-performance camera; a typical application is Microsoft's Xbox Kinect product. The method captures static or dynamic images with the camera, analyses them with computer vision algorithms and performs pattern matching, thereby interpreting the meaning of the gesture and realising gesture recognition. However, the technology needs a high-performance camera and a high-performance processor to run complex image analysis algorithms, making it costly and difficult to miniaturise; moreover, the recognised action must face the camera. These drawbacks prevent the technology from being widely applied.
The main factors affecting the dynamic gesture recognition rate in the prior art are: hands are individual, so the same gesture differs greatly between different people; and a hand is an object in three-dimensional space, so the direction of a gesture is hard to determine from a two-dimensional image. Current gesture recognition methods mainly include geometric-feature methods, artificial neural network (ANN) recognition and hidden Markov model (HMM) recognition. ANN and HMM are algorithmically complex, offer poor real-time performance and are ill-suited to dynamic gesture recognition, while geometric-feature recognition suffers from an insufficient recognition rate.
Summary of the Invention
The present invention provides a remote gesture control terminal that enables gesture control of an intelligent device from a relatively long distance and can accurately acquire gesture control information.
To achieve this goal, the remote gesture control terminal provided by the present invention includes:
a video acquisition module, for acquiring a first infrared image from a first infrared camera and a second infrared image from a second infrared camera;
a depth acquisition module, for taking an object that appears identically in the images contained in the first infrared image and the second infrared image and calculating, by binocular imaging, the distance from the object to the cameras as the depth of the object;
a hand separation module, for detecting a hand figure in the images of the first infrared image and the second infrared image according to the depth;
a gesture acquisition module, for acquiring the gesture data of what the hand separation module has confirmed as a gesture graphic;
a first wireless data transceiver module, for sending the gesture data of the remote gesture control terminal to a remote gesture control server, and for receiving the relevant instruction data sent by the remote gesture control server;
an instruction acquisition module, for obtaining from the first wireless data transceiver module the execution instruction corresponding to the gesture; and
an execution module, for executing the execution instruction.
Preferably, the depth acquisition module is specifically configured to:
take the same object in images of the first infrared image and the second infrared image that carry the same time stamp and calculate, by binocular imaging, the distance from the object to the cameras as the depth of the object.
Suppose the two cameras observe the same feature point P(x_c, y_c, z_c) of a spatial object at the same instant, and the images of P on the first infrared image and the second infrared image have image coordinates p_left = (X_left, Y_left) and p_right = (X_right, Y_right) respectively.
If the image planes of the two cameras lie in the same plane, the Y image coordinates of the feature point P are identical, i.e. Y_left = Y_right = Y, and from the triangle geometry:

$$X_{left} = \frac{f x_c}{z_c}, \qquad X_{right} = \frac{f (x_c - B)}{z_c}, \qquad Y = \frac{f y_c}{z_c} \qquad (1\text{-}1)$$

The disparity is Disparity = X_left - X_right, from which the three-dimensional coordinates of the feature point P in the camera coordinate system are:

$$x_c = \frac{B \cdot X_{left}}{\mathrm{Disparity}}, \qquad y_c = \frac{B \cdot Y}{\mathrm{Disparity}}, \qquad z_c = \frac{B \cdot f}{\mathrm{Disparity}} \qquad (1\text{-}2)$$

where B is the baseline distance, equal to the distance between the projection centres of the two cameras, and f is the camera focal length.
Therefore, as long as a corresponding match point can be found in the second infrared image for any point of the first infrared image, the three-dimensional coordinates of that point can be determined. The method is a purely point-to-point computation: every point in the image plane that has a corresponding match point can take part in the above computation and thereby yield its three-dimensional coordinates.
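As a concrete illustration, equations (1-1) and (1-2) reduce to the short routine below. The function `triangulate` and its argument names are illustrative stand-ins, not part of the patent; only the relations between the baseline B, the focal length f, the image coordinates and the disparity come from the text.

```python
def triangulate(x_left, x_right, y, B, f):
    """Recover camera-frame 3D coordinates of a matched feature point.

    x_left, x_right: horizontal image coordinates of the point in the
                     first and second infrared images (same units as f).
    y:               shared vertical image coordinate (Y_left == Y_right).
    B:               baseline, the distance between the projection centres.
    f:               camera focal length.
    """
    disparity = x_left - x_right          # Disparity = X_left - X_right
    if disparity == 0:
        raise ValueError("zero disparity: point is at infinity")
    xc = B * x_left / disparity           # eq. (1-2)
    yc = B * y / disparity
    zc = B * f / disparity                # the depth used by the depth module
    return xc, yc, zc
```

Running the relations forwards with eq. (1-1) and backwards with `triangulate` recovers the original point, which is a quick consistency check on the reconstruction.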
Preferably, the hand separation module is specifically configured to:
judge as a gesture graphic any object in the images of the first infrared image and the second infrared image whose depth lies within a preset distance range and whose motion intensity lies within a preset intensity range.
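The separation rule reduces to two range tests. The sketch below is a minimal illustration under assumed thresholds; the patent gives no numeric values for the preset ranges, and the `motion_intensity` measure is a placeholder.

```python
# Assumed thresholds, for illustration only; the patent leaves both
# "preset distance range" and "preset intensity range" unspecified.
DEPTH_RANGE = (0.3, 2.0)     # metres
MOTION_RANGE = (0.05, 1.5)   # arbitrary motion-intensity units

def is_gesture(depth, motion_intensity,
               depth_range=DEPTH_RANGE, motion_range=MOTION_RANGE):
    """Accept an object as a gesture graphic only if BOTH its depth and
    its motion intensity fall inside the preset ranges."""
    in_depth = depth_range[0] <= depth <= depth_range[1]
    in_motion = motion_range[0] <= motion_intensity <= motion_range[1]
    return in_depth and in_motion
```

The conjunction matters: a static hand-shaped object (motion outside the range) or a moving object at the wrong depth is rejected.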
Preferably, the instruction acquisition module is specifically configured to:
obtain the control instruction identified by the control instruction matching module.
Preferably, the video acquisition module has four infrared sensors for collecting the infrared light signals reflected from the direction of the gesture operation; according to a synchronised clock signal, the collected infrared light signals are converted by analogue-to-digital conversion into four channels of digital sampled signals and sent to the depth acquisition module. After the hand separation module judges them to be a gesture graphic, the gesture acquisition module obtains the gesture data carrying the four channels of digital sampled signals, which are then sent to the remote gesture control server through the first wireless data transceiver module.
Preferably, the control instruction is obtained in the following manner:
the control server smooths the four channels of digital sampled signals;
according to the smoothed four channels of digital sampled signals, with time as the x-axis and the intensity of the infrared light signal collected by each infrared sensor as the y-axis, the variation trend of the infrared light signal intensity collected by each of the four infrared sensors is fitted to four curves;
the curvatures of the four curves output by the curve generation module are calculated over different time segments;
from the curvatures of the four curves in the different time segments, four change-rate curves are obtained describing how the curvatures of the four curves change over time;
the gesture of the user is judged according to the four change-rate curves;
the characteristic curve templates corresponding to different gestures are stored;
the peak value in each of the four change-rate curves, the time at which the peak value appears and the curvature variation before and after the peak value appears are extracted; and
the gesture of the user is identified according to the peak values in the four change-rate curves, the times at which the peak values appear and the curvature variations before and after the peak values appear, and the corresponding control instruction is given.
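The curvature-to-feature chain above can be illustrated for a single sensor channel. This is a hedged sketch: the discrete curvature formula, the sampling step `dt` and the feature layout returned by `peak_features` are assumptions, chosen only to show the three stages the text names (curvature per segment, its change rate over time, and the peak features used for matching).

```python
def curvature(samples, dt=1.0):
    """Discrete curvature k = |y''| / (1 + y'^2)**1.5 at interior samples,
    using central differences; `samples` is one fitted intensity curve."""
    ks = []
    for i in range(1, len(samples) - 1):
        d1 = (samples[i + 1] - samples[i - 1]) / (2 * dt)
        d2 = (samples[i + 1] - 2 * samples[i] + samples[i - 1]) / (dt * dt)
        ks.append(abs(d2) / (1 + d1 * d1) ** 1.5)
    return ks

def change_rate(ks, dt=1.0):
    """Rate of change of the curvature between consecutive time segments."""
    return [(b - a) / dt for a, b in zip(ks, ks[1:])]

def peak_features(rates):
    """Peak value, time index of the peak, and the variation just before
    and just after it, as the matching step requires."""
    i = max(range(len(rates)), key=lambda j: rates[j])
    before = rates[i] - rates[i - 1] if i > 0 else 0.0
    after = rates[i + 1] - rates[i] if i + 1 < len(rates) else 0.0
    return rates[i], i, before, after
```

In the described system this would run once per sensor, yielding four feature tuples that are then compared against the stored characteristic curve templates.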
The present invention has the following advantages and beneficial effects: the terminal enables gesture control of an intelligent device from a relatively long distance and can accurately acquire gesture control information; and because the system uses a curvature-change trace analysis algorithm, the recognition rate of gesture instructions is greatly improved.
Brief Description of the Drawings
Fig. 1 shows the block diagram of a gesture-based remote control system comprising a remote gesture control terminal of the present invention.
Fig. 2 shows the flow chart of a remote gesture control method using the gesture control terminal of the present invention.
Detailed description of the invention
Fig. 1 shows a gesture-based remote control system of the present invention. The system includes a remote gesture control terminal 1 and a remote gesture control server 2.
The remote gesture control terminal 1 of the present invention includes:
a video acquisition module 11, for acquiring a first infrared image from a first infrared camera and a second infrared image from a second infrared camera;
a depth acquisition module 12, for taking an object that appears identically in the images contained in the first infrared image and the second infrared image and calculating, by binocular imaging, the distance from the object to the cameras as the depth of the object;
a hand separation module 13, for detecting a hand figure in the images of the first infrared image and the second infrared image according to the depth;
a gesture acquisition module 14, for acquiring the gesture data of what the hand separation module has confirmed as a gesture graphic;
a first wireless data transceiver module 15, for sending the gesture data of the remote gesture control terminal to the remote gesture control server, and for receiving the relevant instruction data sent by the remote gesture control server;
an instruction acquisition module 16, for obtaining from the first wireless data transceiver module 15 the execution instruction corresponding to the gesture; and
an execution module 17, for executing the execution instruction.
The remote gesture control server 2 includes a second wireless data transceiver module 21, a data cache module 22, a curve generation module 23, a learning module 24 and a control instruction matching module 25:
the second wireless data transceiver module 21, for receiving the remote gesture data sent by the first wireless data transceiver module of the terminal;
the data cache module 22, for receiving the gesture signal data and sending it, on a first-in-first-out basis, to the control instruction matching module 25 and the curve generation module 23;
the curve generation module 23, for fitting curves according to the variation trend of the gesture data;
the learning module 24, for learning in advance and storing the characteristic curve templates corresponding to different gestures; and
the control instruction matching module 25, for identifying the gesture of the user from the fitted curves and the characteristic curve templates, outputting the corresponding control instruction according to the gesture of the user, and sending the control instruction to the remote gesture control terminal 1 via the second wireless data transceiver module.
Preferably, the depth acquisition module 12 is specifically configured to:
take the same object in images of the first infrared image and the second infrared image that carry the same time stamp and calculate, by binocular imaging, the distance from the object to the cameras as the depth of the object.
Suppose the two cameras observe the same feature point P(x_c, y_c, z_c) of a spatial object at the same instant, and the images of P on the first infrared image and the second infrared image have image coordinates p_left = (X_left, Y_left) and p_right = (X_right, Y_right) respectively.
If the image planes of the two cameras lie in the same plane, the Y image coordinates of the feature point P are identical, i.e. Y_left = Y_right = Y, and from the triangle geometry:

$$X_{left} = \frac{f x_c}{z_c}, \qquad X_{right} = \frac{f (x_c - B)}{z_c}, \qquad Y = \frac{f y_c}{z_c} \qquad (1\text{-}1)$$

The disparity is Disparity = X_left - X_right, from which the three-dimensional coordinates of the feature point P in the camera coordinate system are:

$$x_c = \frac{B \cdot X_{left}}{\mathrm{Disparity}}, \qquad y_c = \frac{B \cdot Y}{\mathrm{Disparity}}, \qquad z_c = \frac{B \cdot f}{\mathrm{Disparity}} \qquad (1\text{-}2)$$

where B is the baseline distance, equal to the distance between the projection centres of the two cameras, and f is the camera focal length.
Therefore, as long as a corresponding match point can be found in the second infrared image for any point of the first infrared image, the three-dimensional coordinates of that point can be determined. The method is a purely point-to-point computation: every point in the image plane that has a corresponding match point can take part in the above computation and thereby yield its three-dimensional coordinates.
Preferably, the hand separation module 13 is specifically configured to:
judge as a gesture graphic any object in the images of the first infrared image and the second infrared image whose depth lies within a preset distance range and whose motion intensity lies within a preset intensity range.
Preferably, the instruction acquisition module 16 is specifically configured to:
obtain the control instruction identified by the control instruction matching module.
Preferably, the video acquisition module 11 has four infrared sensors for collecting the infrared light signals reflected from the direction of the gesture operation; according to a synchronised clock signal, the collected infrared light signals are converted by analogue-to-digital conversion into four channels of digital sampled signals and sent to the depth acquisition module 12. After the hand separation module 13 judges them to be a gesture graphic, the gesture acquisition module 14 obtains the gesture data carrying the four channels of digital sampled signals, which are then sent to the remote gesture control server through the first wireless data transceiver module.
Preferably, the curve generation module 23 includes a pre-processing unit and a curve fitting unit connected in sequence. The pre-processing unit is also connected with the data cache module 22, and the curve fitting unit is also connected with the learning module and the control instruction matching module, wherein:
the pre-processing unit is for smoothing the four channels of digital sampled signals; and
the curve fitting unit is for fitting, according to the smoothed four channels of digital sampled signals, with time as the x-axis and the intensity of the infrared light signal collected by each infrared sensor as the y-axis, the variation trend of the infrared light signal intensity collected by each of the four infrared sensors into four curves.
Preferably, the learning module 24 includes a curvature calculation unit, a change-rate curve generating unit, a gesture judging unit and a storage unit connected in sequence. The curvature calculation unit is connected with the curve fitting unit, and the storage unit is connected with the control instruction matching module, wherein:
the curvature calculation unit is for calculating the curvatures, over different time segments, of the four curves output by the curve generation module;
the change-rate curve generating unit is for obtaining, from the curvatures of the four curves in the different time segments, four change-rate curves describing how the curvatures of the four curves change over time;
the gesture judging unit is for judging the gesture of the user according to the four change-rate curves; and
the storage unit is for storing, according to the judgment results of the gesture judging unit, the characteristic curve templates corresponding to different gestures.
Preferably, the learning module 24 further includes a trigger unit connected between the change-rate curve generating unit and the gesture judging unit;
the trigger unit is for judging whether any of the curvature values in the four change-rate curves at the current moment reaches a preset curvature range and, if so, triggering the gesture judging unit to identify the gesture of the user.
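A minimal reading of the trigger rule, with an assumed numeric threshold standing in for the unspecified preset curvature range:

```python
CURVATURE_TRIGGER = 0.8   # assumed preset threshold, not given in the patent

def should_trigger(current_values, threshold=CURVATURE_TRIGGER):
    """Fire gesture identification only when at least one of the four
    change-rate curves reaches the preset range at the current moment.

    current_values: the latest sample of each of the four change-rate curves.
    """
    return any(abs(v) >= threshold for v in current_values)
```

Gating identification this way keeps the (more expensive) template matching idle while the scene is static.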
Preferably, the gesture judging unit includes a feature extraction unit and a gesture identification unit connected in sequence. The feature extraction unit is also connected with the change-rate curve generating unit, and the gesture identification unit is also connected with the control instruction matching module, wherein:
the feature extraction unit is for extracting the peak value in each of the four change-rate curves, the time at which the peak value appears and the curvature variation before and after the peak value appears; and
the gesture identification unit is for identifying the gesture of the user according to the peak values in the four change-rate curves, the times at which the peak values appear and the curvature variations before and after the peak values appear.
Preferably, the second wireless data transceiver module 21 includes a receiver, a distributor, a flow controller and a rate controller. The receiver receives the remote gesture control terminal data sent by the first wireless data transceiver module; the distributor distributes the data received by the receiver to the data cache module and passes the reception rate of the remote gesture control terminal data to an adaptive controller module; and the flow controller sends the adaptation value produced by the adaptive controller module to the rate controller, so as to control the reception rate of the remote gesture control terminal data.
Fig. 2 shows a remote gesture control method of the present invention. The method specifically includes the following steps:
S1. the video acquisition module acquires infrared images, and the depth acquisition module obtains the depth of the infrared images from the different infrared cameras;
S2. an infrared gesture graphic is separated from the infrared images, and the gesture acquisition module acquires the data of the infrared gesture graphic and sends it to the remote gesture control server;
S3. the curve generation module generates the corresponding data curves according to the gesture data; before this, the learning module has learned in advance and stored the characteristic curve templates corresponding to different gestures;
S4. the control instruction matching module identifies the gesture of the user from the data curves and the characteristic curve templates, and outputs the corresponding control instruction according to the gesture of the user; and
S5. the control instruction is sent to the remote gesture control terminal, which executes the instruction.
Preferably, step S1 specifically includes:
taking the same object in images with the same time stamp from the different infrared cameras and calculating, by binocular imaging, the distance from the object to the cameras as the depth of the object.
Suppose the two cameras observe the same feature point P(x_c, y_c, z_c) of a spatial object at the same instant, and the images of P on the first infrared image and the second infrared image have image coordinates p_left = (X_left, Y_left) and p_right = (X_right, Y_right) respectively.
If the image planes of the two cameras lie in the same plane, the Y image coordinates of the feature point P are identical, i.e. Y_left = Y_right = Y, and from the triangle geometry:

$$X_{left} = \frac{f x_c}{z_c}, \qquad X_{right} = \frac{f (x_c - B)}{z_c}, \qquad Y = \frac{f y_c}{z_c} \qquad (1\text{-}1)$$

The disparity is Disparity = X_left - X_right, from which the three-dimensional coordinates of the feature point P in the camera coordinate system are:

$$x_c = \frac{B \cdot X_{left}}{\mathrm{Disparity}}, \qquad y_c = \frac{B \cdot Y}{\mathrm{Disparity}}, \qquad z_c = \frac{B \cdot f}{\mathrm{Disparity}} \qquad (1\text{-}2)$$

where B is the baseline distance, equal to the distance between the projection centres of the two cameras, and f is the camera focal length.
Therefore, as long as a corresponding match point can be found in the second infrared image for any point of the first infrared image, the three-dimensional coordinates of that point can be determined. The method is a purely point-to-point computation: every point in the image plane that has a corresponding match point can take part in the above computation and thereby yield its three-dimensional coordinates.
Preferably, in step S2, an object whose depth lies within a preset distance range and whose motion intensity lies within a preset intensity range is judged to be a gesture graphic.
Preferably, in step S1, the video acquisition module has four infrared sensors for collecting the infrared light signals reflected from the direction of the gesture operation; according to a synchronised clock signal, the collected infrared light signals are converted by analogue-to-digital conversion into four channels of digital sampled signals and sent to the depth acquisition module.
Preferably, in step S2, after the hand separation module judges them to be a gesture graphic, the gesture acquisition module obtains the gesture data carrying the four channels of digital sampled signals, which are then sent to the remote gesture control server through the first wireless data transceiver module.
Preferably, in step S3, fitting the variation trend of the infrared light signal intensity collected by the four infrared sensors into four curves according to the four channels of digital sampled signals output by the data cache module specifically includes:
smoothing the four channels of digital sampled signals output by the data cache module; and
fitting, according to the smoothed four channels of digital sampled signals, with time as the x-axis and the intensity of the infrared light signal collected by each infrared sensor as the y-axis, the variation trend of the infrared light signal intensity collected by each of the four infrared sensors into four curves.
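The smoothing sub-step above can be sketched as a simple moving average over one channel of digital samples; the window length is an assumed parameter, and any comparable smoothing filter would serve. "Fitting to a curve" then amounts to plotting the smoothed intensities against time.

```python
def smooth(samples, window=3):
    """Centred moving average with edge shrinking; `window` is an assumed
    odd length, not a value specified by the patent."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out
```

In the described server this would run once per sensor channel before the curvature calculation, suppressing sampling noise that would otherwise dominate the second derivative.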
Preferably, learning in advance and storing the characteristic curve templates corresponding to different gestures specifically includes:
fitting the variation trend of the infrared light signal intensity collected by the four infrared sensors into four curves, and calculating the curvatures of the four curves over different time segments;
obtaining, from the curvatures of the four curves in the different time segments, four change-rate curves describing how the curvatures of the four curves change over time; and
judging the gesture of the user according to the four change-rate curves, and storing the characteristic curve templates corresponding to different gestures according to the judgment results.
Preferably, before judging the gesture of the user according to the four change-rate curves and storing the characteristic curve templates corresponding to different gestures according to the judgment results, the method further includes:
judging whether any of the curvature values in the four change-rate curves at the current moment reaches a preset curvature range and, if so, entering the step of judging the gesture of the user according to the four change-rate curves.
Preferably, judging the gesture of the user according to the four change-rate curves specifically includes:
extracting the peak value in each of the four change-rate curves, the time at which the peak value appears and the curvature variation before and after the peak value appears; and
identifying the gesture of the user according to the peak values in the four change-rate curves, the times at which the peak values appear and the curvature variations before and after the peak values appear.
Although the invention has been described above with reference to the limited embodiments and the accompanying drawings, a person of ordinary skill in the art can make various modifications and variations from the above description. For example, carrying out the described methods in an order different from that explained, and/or combining or assembling the elements of the described systems, structures, devices and circuits in a form different from that explained, or replacing them with other elements or equivalents, can also achieve suitable effects. For a person of ordinary skill in the technical field of the invention, any equivalent substitution or obvious modification made without departing from the inventive concept, with identical performance or use, shall be considered to fall within the protection scope of the present invention.

Claims (6)

1. A remote gesture control terminal, including:
a video acquisition module, for acquiring a first infrared image from a first infrared camera and a second infrared image from a second infrared camera;
a depth acquisition module, for taking an object that appears identically in the images contained in the first infrared image and the second infrared image and calculating, by binocular imaging, the distance from the object to the cameras as the depth of the object;
a hand separation module, for detecting a hand figure in the images of the first infrared image and the second infrared image according to the depth;
a gesture acquisition module, for acquiring the gesture data of what the hand separation module has confirmed as a gesture graphic;
a first wireless data transceiver module, for sending the gesture data of the remote gesture control terminal to a remote gesture control server, and for receiving the relevant instruction data sent by the remote gesture control server;
an instruction acquisition module, for obtaining from the first wireless data transceiver module the execution instruction corresponding to the gesture; and
an execution module, for executing the execution instruction.
2. The terminal as claimed in claim 1, characterised in that the depth acquisition module is specifically configured to:
take the same object in images of the first infrared image and the second infrared image that carry the same time stamp and calculate, by binocular imaging, the distance from the object to the cameras as the depth of the object;
wherein, supposing the two cameras observe the same feature point P(x_c, y_c, z_c) of a spatial object at the same instant, the images of P on the first infrared image and the second infrared image have image coordinates p_left = (X_left, Y_left) and p_right = (X_right, Y_right) respectively;
if the image planes of the two cameras lie in the same plane, the Y image coordinates of the feature point P are identical, i.e. Y_left = Y_right = Y, and from the triangle geometry:

$$X_{left} = \frac{f x_c}{z_c}, \qquad X_{right} = \frac{f (x_c - B)}{z_c}, \qquad Y = \frac{f y_c}{z_c} \qquad (1\text{-}1)$$

the disparity is Disparity = X_left - X_right, from which the three-dimensional coordinates of the feature point P in the camera coordinate system are:

$$x_c = \frac{B \cdot X_{left}}{\mathrm{Disparity}}, \qquad y_c = \frac{B \cdot Y}{\mathrm{Disparity}}, \qquad z_c = \frac{B \cdot f}{\mathrm{Disparity}} \qquad (1\text{-}2)$$

where B is the baseline distance, equal to the distance between the projection centres of the two cameras, and f is the camera focal length;
so that, as long as a corresponding match point can be found in the second infrared image for any point of the first infrared image, the three-dimensional coordinates of that point can be determined; the method is a purely point-to-point computation, and every point in the image plane that has a corresponding match point can take part in the above computation to obtain its three-dimensional coordinates.
3. terminal as claimed in claim 2, it is characterised in that described hands separation module, specifically for:
By the degree of depth described in the image from described first Infrared image and described second Infrared image at predeterminable range In the range of, and the object that motion intense degree is in the range of default severe degree is judged as gesture graph.
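The depth-and-motion gating described in claim 3 amounts to a two-range test. A minimal sketch, assuming illustrative threshold values (the patent does not specify the preset ranges):

```python
def is_gesture_object(depth, motion_intensity,
                      depth_range=(0.2, 1.5),      # preset distance range, metres (assumed)
                      motion_range=(0.3, 5.0)):    # preset intensity range (assumed units)
    """Return True when an object's depth lies within the preset distance
    range AND its motion intensity lies within the preset intensity range,
    i.e. when the object is judged to be a gesture graphic."""
    return (depth_range[0] <= depth <= depth_range[1]
            and motion_range[0] <= motion_intensity <= motion_range[1])
```

An object 0.5 m away moving at moderate intensity passes the test, while a distant background object at 3 m is rejected regardless of its motion.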
4. The terminal according to claim 3, wherein the instruction acquisition module is specifically configured to:
obtain the control instruction identified by the control instruction fitting module.
5. The terminal according to claim 4, wherein the video acquisition module has four infrared sensors for collecting the infrared signals reflected from the direction of the gesture operation, and, according to a synchronized clock signal, converts the collected infrared signals through analog-to-digital conversion into four channels of digital sampled signals sent to the depth acquisition module; after the hand separation module determines a gesture graphic, the gesture acquisition module obtains the gesture data carrying the four channels of digital sampled signals and sends it through the first wireless data transceiver module to the remote gesture control server.
6. The terminal according to claim 1, wherein the control instruction is obtained in the following manner:
the control server smooths the four channels of digital sampled signals;
according to the smoothed four channels of digital sampled signals, with time as the x-axis and the intensity of the infrared signal collected by each infrared sensor as the y-axis, the variation trends of the infrared signal intensities collected by the four infrared sensors are fitted into four curves;
the curvatures of the four curves output by the curve generation module in different time periods are calculated;
according to the curvatures of the four curves in the different time periods, four change-rate curves of the curvature of the four curves over time are obtained;
the user's gesture is judged according to the four change-rate curves;
characteristic curve templates corresponding to different gestures are stored;
the peak value in the four change-rate curves, the time at which the peak value appears, and the curvature variation before and after the peak value appears are extracted;
the user's gesture is identified according to the peak value in the four change-rate curves, the time at which the peak value appears, and the curvature variation before and after the peak value appears, and a corresponding control instruction is given.
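Two of the steps in claim 6 — smoothing a sampled channel and extracting the peak features from a change-rate curve — can be sketched as below. This is an illustrative outline, not the patent's implementation; the window size and the before/after difference definition are assumptions.

```python
def moving_average(signal, window=3):
    """Smooth one channel of digital sampled signals with a simple
    centered moving average (window size is an assumed parameter)."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def peak_features(change_rate):
    """From one change-rate curve, extract the peak value, the time
    (index) at which it appears, and the variation immediately before
    and after the peak, for matching against stored gesture templates."""
    t = max(range(len(change_rate)), key=lambda i: change_rate[i])
    peak = change_rate[t]
    before = peak - change_rate[t - 1] if t > 0 else 0.0
    after = peak - change_rate[t + 1] if t < len(change_rate) - 1 else 0.0
    return peak, t, before, after
```

For the curve [0, 1, 4, 2, 0], the peak value 4 appears at time index 2, with variations of 3 before and 2 after the peak; these features would then be compared against the stored characteristic curve templates.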
CN201610209786.7A 2016-04-06 2016-04-06 Remote gesture control terminal Pending CN105929939A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610209786.7A CN105929939A (en) 2016-04-06 2016-04-06 Remote gesture control terminal


Publications (1)

Publication Number Publication Date
CN105929939A true CN105929939A (en) 2016-09-07

Family

ID=56840433

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610209786.7A Pending CN105929939A (en) 2016-04-06 2016-04-06 Remote gesture control terminal

Country Status (1)

Country Link
CN (1) CN105929939A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106484108A (en) * 2016-09-30 2017-03-08 天津大学 Chinese characters recognition method based on double vision point gesture identification
CN106873776A (en) * 2017-01-16 2017-06-20 广东美的制冷设备有限公司 Recognition methods, identifying device and air-conditioner
CN107749070A (en) * 2017-10-13 2018-03-02 京东方科技集团股份有限公司 The acquisition methods and acquisition device of depth information, gesture identification equipment
CN108647564A (en) * 2018-03-28 2018-10-12 安徽工程大学 A kind of gesture recognition system and method based on casement window device
CN111374388A (en) * 2020-03-19 2020-07-07 云南电网有限责任公司玉溪供电局 AR safety helmet

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110095873A1 (en) * 2009-10-26 2011-04-28 At&T Intellectual Property I, L.P. Gesture-initiated remote control programming
CN102426482A (en) * 2011-11-08 2012-04-25 北京新岸线网络技术有限公司 Non-contact computer cursor control method and system for same
CN103176667A (en) * 2013-02-27 2013-06-26 广东工业大学 Projection screen touch terminal device based on Android system
CN103345301A (en) * 2013-06-18 2013-10-09 华为技术有限公司 Depth information acquisition method and device
CN105353876A (en) * 2015-11-09 2016-02-24 深圳市云腾智慧科技有限公司 Multi-point light sensation based spatial gesture identification control system and method
CN105430025A (en) * 2016-01-19 2016-03-23 成都银事达信息技术有限公司 Remote intelligent internet teaching system




Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20160907