CN109993043A - Gesture recognition and transmission method - Google Patents

Gesture recognition and transmission method

Info

Publication number
CN109993043A
Authority
CN
China
Prior art keywords
gesture
terminal
gestures
images
meaning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201810278289.1A
Other languages
Chinese (zh)
Inventor
蔡哲宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Great Zheng New Material Science And Technology Ltd
Original Assignee
Guangzhou Great Zheng New Material Science And Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Great Zheng New Material Science And Technology Ltd filed Critical Guangzhou Great Zheng New Material Science And Technology Ltd
Publication of CN109993043A
Legal status: Withdrawn (current)

Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06F ELECTRIC DIGITAL DATA PROCESSING > G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity > G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals > G06F21/31 User authentication > G06F21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06F ELECTRIC DIGITAL DATA PROCESSING > G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements > G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer > G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING > G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data > G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING > G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data > G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands > G06V40/107 Static hand or arm > G06V40/113 Recognition of static hand signs
    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING > G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data > G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS > G10 MUSICAL INSTRUMENTS; ACOUSTICS > G10L SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING > G10L15/00 Speech recognition > G10L15/28 Constructional details of speech recognition systems > G10L15/30 Distributed recognition, e.g. in client-server systems, for mobile phones or network applications
    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING > G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data > G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands > G06V40/107 Static hand or arm > G06V40/117 Biometrics derived from hands

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Security & Cryptography (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Acoustics & Sound (AREA)
  • Computer Hardware Design (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

At home, the user first uploads his or her gestures through a gesture receiving terminal and defines the meaning represented by each gesture through an input terminal. A central processing unit records the series of gesture images and the meanings they represent in a gesture pre-storage module. When the user goes out, the customized series of gesture images and their meanings can be transmitted through an information transmitting terminal to the processor of a gesture execution terminal; through the analysis performed by the processor, the gesture execution terminal understands the meaning expressed by the user's gesture and executes the user's command. By storing a user's series of gesture images and their meanings in advance and, when the user arrives at a public place equipped with a public gesture execution terminal, transmitting the corresponding gestures to that terminal through the information transmitting terminal, the invention enables the gesture execution terminal to understand the meaning of the user's gestures and perform the corresponding operations.

Description

Gesture recognition and transmission method
Technical field
The invention belongs to the technical field of terminal applications and relates to a gesture recognition and transmission method.
Background art
As gesture control continues to develop, people can control computers, mobile phones, vehicles and other equipment directly by gesture. However, different people understand the same gesture differently, and some people, for congenital or acquired reasons, have palm structures that differ from those of most people. When such users make gestures in public places, the same gesture may carry different meanings for different people, and it is impossible to decide which interpretation is correct, so a machine has difficulty determining the meaning of a person's gesture and performing the corresponding operation.
Summary of the invention
The technical problem to be solved by the invention is to provide a gesture recognition and transmission method suitable for public places.
A gesture recognition and transmission method comprises the following steps (an illustrative sketch of the flow is given after the list):
S1. A first gesture receiving terminal receives a series of gesture images from the user and uploads the series of gesture images to a central processing unit;
S2. An input terminal receives the user's instruction defining the meaning represented by the series of gesture images and uploads it to the central processing unit;
S3. The central processing unit records the series of gesture images together with their corresponding meanings in a gesture pre-storage module;
S4. Through an information transmitting terminal, the user transmits the pre-stored series of gesture images and their corresponding meanings to a gesture execution terminal, and the gesture execution terminal records them in a processor matched with the gesture execution terminal;
S5. A second gesture receiving terminal connected to the gesture execution terminal receives a series of gesture images from the user and uploads the series to the processor;
S6. The processor retrieves the pre-stored series of gesture images and their corresponding meanings and analyzes the meaning of the series of gesture images received by the second gesture receiving terminal;
S7. The gesture execution terminal executes the relevant command according to the meaning of the gesture.
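For orientation only, the following is a minimal sketch of the S1 to S7 data flow described above, reduced to dictionary lookups. It is not the claimed method; all names (HomeUnit, ExecutionTerminal, record_gesture and so on) are hypothetical and the recognition step is simplified to a lookup.

```python
# Illustrative sketch of the S1-S7 flow; all names are hypothetical, not from the patent.
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class HomeUnit:
    """Central processing unit with a gesture pre-storage module (steps S1-S3)."""
    pre_store: Dict[str, str] = field(default_factory=dict)  # gesture id -> meaning

    def record_gesture(self, gesture_id: str, meaning: str) -> None:
        # S1/S2: gesture images arrive from the first gesture receiving terminal,
        # and the meaning is defined through the input terminal.
        self.pre_store[gesture_id] = meaning  # S3: store image and meaning together

    def transmit(self) -> Dict[str, str]:
        # S4: hand the pre-stored gestures and meanings to a gesture execution terminal.
        return dict(self.pre_store)


@dataclass
class ExecutionTerminal:
    """Gesture execution terminal with its own processor (steps S4-S7)."""
    received_store: Dict[str, str] = field(default_factory=dict)

    def load(self, pre_store: Dict[str, str]) -> None:
        self.received_store.update(pre_store)  # S4: record in the processor

    def handle_gesture(self, gesture_id: str) -> Optional[str]:
        # S5/S6: the second gesture receiving terminal captures a gesture and the
        # processor looks up its pre-stored meaning.
        meaning = self.received_store.get(gesture_id)
        if meaning is not None:
            print(f"S7: executing command for meaning '{meaning}'")
        return meaning


home = HomeUnit()
home.record_gesture("open_palm", "I want to drink water")
terminal = ExecutionTerminal()
terminal.load(home.transmit())
terminal.handle_gesture("open_palm")
```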
The first gesture receiving terminal can be a camera, a night-vision scanner or a laser scanner.
The input terminal can be a device such as a computer, a touch screen or a mobile phone.
The central processing unit includes the gesture pre-storage module for storing the series of gesture images and their meanings.
The information transmitting terminal can be a wireless communication device such as a wireless network card, an antenna or a Bluetooth module. When the information transmitting terminal is a wireless communication device, it can be integrated with the automatic retrieval terminal.
The central processing unit is in signal connection with the first gesture receiving terminal, the input terminal and the information transmitting terminal.
The gesture execution terminal can be a smart home appliance, an intelligent robot, a smart restaurant or any other device or terminal capable of processing and analyzing gestures and performing the corresponding operations. The gesture execution terminal is equipped with an information receiving terminal for receiving information; the information receiving terminal can be a communication device such as a wireless network card, an antenna or a Bluetooth module.
The second gesture receiving terminal can be a camera, a night-vision scanner or a laser scanner.
The processor is in signal connection with the second gesture receiving terminal, the gesture execution terminal and the information receiving terminal.
First, at home, the user uploads his or her gestures through the gesture receiving terminal and defines the meaning represented by each gesture through the input terminal. The input terminal can display various preset meanings, such as "go", "I want to eat" and "I want to drink water", so the user only needs to make a selection. The central processing unit records the series of gesture images and the meanings they represent in the gesture pre-storage module. When the user goes out, the customized series of gesture images and their meanings can be transmitted through the information transmitting terminal to the processor of the gesture execution terminal. The gesture execution terminal is also provided with the second gesture receiving terminal for receiving the user's gestures; through the analysis performed by the processor, the gesture execution terminal understands the meaning the user's gesture is intended to express and can execute the user's command.
By storing a user's series of gesture images and their meanings in advance and, when the user arrives at a public place equipped with a public gesture execution terminal, transmitting the corresponding gestures to that terminal through the information transmitting terminal, the invention enables the gesture execution terminal to understand the meaning of the user's gestures and perform the corresponding operations.
The gesture recognition and transmission method further includes: in step S1, a palm recording module of the central processing unit analyzes the series of gesture images obtained from the first gesture receiving terminal and obtains palm information; in step S4, the information transmitting terminal also sends the palm information to the gesture execution terminal; in step S6, the processor analyzes the palm information and retrieves the associated pre-stored series of gesture images and their meanings.
The central processing unit includes the palm recording module, whose role is to analyze the series of gesture images obtained from the first gesture receiving terminal, obtain the palm information and store it in the central processing unit. When the user goes to a public place, the pre-stored palm information is uploaded to the gesture execution terminal, so that the gesture execution terminal can determine from the palm information which user is making a gesture. This avoids the situation in which too many people issue gesture commands in public and the gesture execution terminal cannot determine which user issued a given command.
The palm information includes the shape, palm print and fingerprints of each palm.
If users were distinguished only by palm shape, the precision required of the second gesture receiving terminal would be lower, but the accuracy would be insufficient; therefore the second gesture receiving terminal also captures the user's palm print and fingerprints for further verification.
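The role of the palm information can be pictured as follows. This is a simplified, hypothetical sketch: the string comparisons stand in for real palm-shape, palm-print and fingerprint matching, which the patent does not specify in code.

```python
# Hypothetical illustration of identifying the gesture maker from palm information.
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class PalmInfo:
    shape: str        # coarse palm-shape descriptor
    palm_print: str   # palm-print template (placeholder representation)
    fingerprint: str  # fingerprint template (placeholder representation)


def identify_user(observed: PalmInfo, enrolled: Dict[str, PalmInfo]) -> Optional[str]:
    """Return the user whose enrolled palm information matches the observation."""
    # Palm shape alone is a cheap filter but not accurate enough on its own, so the
    # palm print and fingerprint are checked as the further verification step.
    candidates = [uid for uid, p in enrolled.items() if p.shape == observed.shape]
    for uid in candidates:
        p = enrolled[uid]
        if p.palm_print == observed.palm_print and p.fingerprint == observed.fingerprint:
            return uid
    return None
```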
The gesture recognition and transmission method further includes: the central processing unit automatically records information about gesture execution terminals. In step S4, an automatic retrieval terminal automatically searches for nearby gesture execution terminals; if a gesture execution terminal has previously received a transmission, the central processing unit directly instructs the information transmitting terminal to transmit the pre-stored series of gesture images and their meanings to that gesture execution terminal; otherwise, no transmission is performed. The automatic retrieval terminal is a wireless network card, an antenna or a Bluetooth module; it is in signal connection with the central processing unit, and the automatic retrieval terminal and the information transmitting terminal can be combined into one device.
Providing the automatic retrieval terminal avoids requiring every user to upload the series of gesture images and their meanings manually, which would degrade the user experience. The automatic retrieval terminal automatically searches for the networks of nearby gesture execution terminals; when the user has previously logged into the network of a gesture execution terminal, the automatic retrieval terminal attempts to connect to it, and the central processing unit then transmits the information to it through the information transmitting terminal.
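A minimal sketch of this automatic retrieval behaviour follows, assuming a hypothetical list of previously connected terminal identifiers; a real implementation would discover terminals by Wi-Fi or Bluetooth scanning rather than by the simple membership test shown here.

```python
# Hypothetical sketch: only terminals that were transmitted to before are served again.
from typing import Dict, Iterable, Set


def auto_upload(nearby_terminals: Iterable[str],
                known_terminals: Set[str],
                pre_store: Dict[str, str]) -> Dict[str, Dict[str, str]]:
    """Upload the pre-stored gestures to every nearby, previously known terminal."""
    uploads: Dict[str, Dict[str, str]] = {}
    for terminal_id in nearby_terminals:        # result of scanning nearby networks
        if terminal_id in known_terminals:      # terminal that was transmitted to before
            uploads[terminal_id] = dict(pre_store)  # send gesture images and meanings
        # unknown terminals are skipped: the user would have to upload manually first
    return uploads
```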
In step S2, the user can choose different meanings represented by the same gesture in different settings; in step S3, the central processing unit records the gesture and the meanings it represents in gesture pre-storage modules of different types according to the settings selected by the user. The types include catering, lodging, entertainment and household.
In step S4, when the automatic retrieval terminal finds a gesture execution terminal to which a transmission was previously made, the central processing unit determines the type of the gesture execution terminal from the terminal's information and then retrieves the corresponding gestures and their meanings from the gesture pre-storage module of the same type.
Although gestures can vary widely, an individual user normally relies on only a few habitual gestures. To let the user express many meanings with just a handful of gestures, the method classifies gestures by setting, for example catering, lodging, entertainment and household, and the user can define different meanings for the same gesture in different settings. When the central processing unit exchanges signals with a gesture execution terminal, it also analyzes the type of the gesture execution terminal and uploads the series of gesture images of the corresponding type to that terminal.
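The per-setting classification can be pictured as a two-level lookup, as in the sketch below; the type names follow the text, while the structure and function names are purely illustrative.

```python
# Illustrative two-level lookup: (terminal type, gesture) -> meaning.
from typing import Dict, Optional

# The same gesture carries a different meaning depending on the terminal's type.
pre_store_by_type: Dict[str, Dict[str, str]] = {
    "catering":  {"thumb_up": "I am thirsty"},
    "lodging":   {"thumb_up": "Please clean the room"},
    "household": {"thumb_up": "Turn on the light"},
}


def meaning_for(terminal_type: str, gesture_id: str) -> Optional[str]:
    """Pick the meaning from the gesture pre-storage module matching the terminal type."""
    return pre_store_by_type.get(terminal_type, {}).get(gesture_id)


print(meaning_for("catering", "thumb_up"))   # I am thirsty
print(meaning_for("household", "thumb_up"))  # Turn on the light
```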
The gesture recognition and transmission method further includes: after step S6 and before step S7, an alarm device announces the meaning of the gesture by voice or on-screen text. The alarm device is a screen or a loudspeaker.
To avoid the processor misrecognizing the user's gesture, the method also provides an alarm device that announces, by voice or text, the processor's interpretation of the user's gesture, so that the user immediately knows the interpretation and can judge whether it is what he or she intended; if not, the user can stop the operation of the gesture execution terminal by gesture or by other means.
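A minimal sketch of this confirm-before-execute step, inserted between S6 and S7: the recognized meaning is announced and the command runs only if the user does not cancel. The callback-based structure is an assumption made for illustration.

```python
# Hypothetical confirm-before-execute step between steps S6 and S7.
from typing import Callable


def announce_and_execute(meaning: str,
                         announce: Callable[[str], None],
                         user_cancelled: Callable[[], bool],
                         execute: Callable[[str], None]) -> None:
    """Announce the recognized meaning, then execute unless the user cancels."""
    announce(f"Recognized gesture meaning: {meaning}")  # screen text or speech
    if user_cancelled():                                # user may stop by gesture, etc.
        return
    execute(meaning)                                    # S7: carry out the command


announce_and_execute("I am thirsty", print, lambda: False,
                     lambda m: print(f"Executing: {m}"))
```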
Compared with the prior art, the invention has the following beneficial effects:
(1) The invention allows the user's pre-stored gestures to be uploaded to the relevant devices, so that equipment in public places can clearly understand and judge the meaning of the user's gestures. This makes it easier for users to operate by gesture in public places and provides a better user experience.
(2) The invention automatically connects, over the network or by similar means, to devices that were connected before and uploads the gestures, reducing the effort required from the user and speeding up the whole gesture-upload process.
(3) The invention also identifies the person making a gesture through palm print and fingerprint recognition, avoiding the situation in crowded places where the gesture execution terminal cannot directly tell who made a gesture and therefore misjudges its meaning.
(4) Gestures are classified by the setting in which they are used, so that the same gesture can represent different meanings in different settings; moreover, by obtaining the information of the gesture execution terminal, the setting in which a gesture is used is determined automatically, without requiring the user to select it manually.
Description of the drawings
Fig. 1 is a flow diagram of the invention.
Specific embodiments
The present invention will be further described below with reference to specific embodiments. It should be understood that these embodiments are intended only to illustrate the invention and not to limit its scope. In addition, it should be understood that, after reading the teachings of the present invention, those skilled in the art can make various changes or modifications to the invention, and such equivalent forms likewise fall within the scope defined by the claims appended to this application.
As shown in Fig. 1, a gesture recognition and transmission method comprises:
S1. A first gesture receiving terminal receives a series of gesture images from the user and uploads the series of gesture images to a central processing unit;
S2. An input terminal receives the user's instruction defining the meaning represented by the series of gesture images and uploads it to the central processing unit;
S3. The central processing unit records the series of gesture images together with their corresponding meanings in a gesture pre-storage module;
S4. Through an information transmitting terminal, the user transmits the pre-stored series of gesture images and their corresponding meanings to a gesture execution terminal, and the gesture execution terminal records them in a processor matched with the gesture execution terminal;
S5. A second gesture receiving terminal connected to the gesture execution terminal receives a series of gesture images from the user and uploads the series to the processor;
S6. The processor retrieves the pre-stored series of gesture images and their corresponding meanings and analyzes the meaning of the gestures received by the second gesture receiving terminal;
S7. The gesture execution terminal executes the relevant command according to the meaning of the series of gesture images.
The first gesture receiving terminal can be a camera, a night-vision scanner or a laser scanner, preferably a rotatable camera.
After the series of gesture images obtained by the first gesture receiving terminal reaches the central processing unit, the central processing unit performs feature extraction on the series of gesture images, obtaining colour features, texture features, shape features and spatial relationship features, and records the order of the images in the series. The central processing unit records the colour features, texture features, shape features, spatial relationships, image order, fingerprints and palm print, and stores them in the gesture pre-storage module as the pre-stored gesture images.
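The feature extraction described here can be sketched with elementary image statistics. In the illustration below, the histogram-based colour feature, the mean-gradient "texture" and the silhouette statistics are simplifications chosen for readability; they are not the specific features claimed.

```python
# Illustrative feature extraction over a series of gesture images (NumPy only).
from typing import Dict, List
import numpy as np


def extract_features(image: np.ndarray) -> Dict[str, np.ndarray]:
    """Compute simplified colour, texture and shape descriptors for one image."""
    gray = image.mean(axis=2) if image.ndim == 3 else image.astype(float)
    colour_hist, _ = np.histogram(image, bins=16, range=(0, 255), density=True)
    texture = np.abs(np.diff(gray, axis=1)).mean()      # crude texture measure
    mask = gray > gray.mean()                            # rough hand silhouette
    ys, xs = np.nonzero(mask)
    shape = (np.array([mask.mean(), ys.std(), xs.std()])
             if ys.size else np.zeros(3))                # area ratio and spread
    return {"colour": colour_hist, "texture": np.array([texture]), "shape": shape}


def pre_store_series(images: List[np.ndarray]) -> List[Dict[str, np.ndarray]]:
    """Keep per-image features in order, so the sequence itself is preserved."""
    return [extract_features(img) for img in images]
```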
The input terminal can be a device such as a computer, a tablet computer, a touch screen or a mobile phone.
The central processing unit includes the gesture pre-storage module for storing gestures and their meanings. The central processing unit is in signal connection with the first gesture receiving terminal and the input terminal.
The information transmitting terminal can be a device capable of transmitting information, such as a wireless network card, an antenna or a Bluetooth module.
The central processing unit is in signal connection with the first gesture receiving terminal, the input terminal and the information transmitting terminal. The central processing unit includes a gesture processing module for performing feature extraction on the gesture images and recording the image features.
The gesture execution terminal can be a smart home appliance, an intelligent robot, a smart restaurant or any other device or terminal capable of processing and analyzing a series of gesture images and performing the corresponding operations. The gesture execution terminal is equipped with an information receiving terminal for receiving information; the information receiving terminal can be a communication device such as a wireless network card or a Bluetooth module.
The second gesture receiving terminal can be a camera, a night-vision scanner or a laser scanner.
The processor is in signal connection with the second gesture receiving terminal, the gesture execution terminal and the information receiving terminal.
When the processor receives a series of gesture images from the second gesture receiving terminal, it performs feature extraction on the image data, obtaining colour features, texture features, shape features and spatial relationship features, and records the colour features, texture features, shape features, spatial relationships, image order, fingerprints and palm print. It first determines from the fingerprints and palm print which user the series of gesture images belongs to, then retrieves that user's pre-stored gesture images and their meanings, compares the colour features, texture features, shape features and image order of the series received by the second gesture receiving terminal with the pre-stored series of gesture images, and determines the meaning of the received series of gesture images.
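Continuing the sketch above, the comparison step might then look as follows; nearest-neighbour matching over summed per-frame feature distances is used purely to illustrate "comparing with the pre-stored gesture images", and the threshold value is arbitrary.

```python
# Illustrative matching of a received gesture series against pre-stored series.
from typing import Dict, List, Optional, Tuple
import numpy as np

FeatureSeries = List[Dict[str, np.ndarray]]


def series_distance(a: FeatureSeries, b: FeatureSeries) -> float:
    """Sum of per-frame feature distances; the image order is respected implicitly."""
    if len(a) != len(b):
        return float("inf")                  # different sequence lengths: no match
    total = 0.0
    for fa, fb in zip(a, b):
        for key in ("colour", "texture", "shape"):
            total += float(np.linalg.norm(fa[key] - fb[key]))
    return total


def recognize(received: FeatureSeries,
              user_store: Dict[str, Tuple[FeatureSeries, str]],
              threshold: float = 10.0) -> Optional[str]:
    """Return the meaning of the closest pre-stored series for the identified user."""
    best = min(user_store.values(),
               key=lambda item: series_distance(received, item[0]),
               default=None)
    if best is None or series_distance(received, best[0]) > threshold:
        return None
    return best[1]
```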
First, at home, the user uploads his or her gestures through the gesture receiving terminal and defines the meaning represented by each gesture through the input terminal. The input terminal can display various preset meanings, such as "go", "I want to eat" and "I want to drink water", so the user only needs to make a selection. The central processing unit records the series of gesture images and the meanings they represent in the gesture pre-storage module. When the user goes out, the customized series of gesture images and their meanings can be transmitted through the information transmitting terminal to the processor of the gesture execution terminal. The gesture execution terminal is also provided with the second gesture receiving terminal for receiving the user's series of gesture images; through the analysis performed by the processor, the gesture execution terminal understands the meaning the user's gesture is intended to express and can execute the user's command.
The gesture recognition and transmission method further includes: after step S1 and before step S2, the palm recording module of the central processing unit analyzes the gesture images, distinguishes the palm information and records it; in step S4, the user also sends the pre-stored palm information to the gesture execution terminal through the information transmitting terminal; in step S6, the processor analyzes the palm information and retrieves the associated pre-stored gestures and their meanings.
The central processing unit includes the palm recording module, whose role is to analyze the series of gesture images obtained from the first gesture receiving terminal, obtain the palm information and store it in the central processing unit. When the user goes to a public place, the pre-stored palm information is uploaded to the gesture execution terminal, so that the gesture execution terminal can determine from the palm information which user is making a gesture. This avoids the situation in which too many people issue gesture commands in public and the gesture execution terminal cannot identify which user issued a given command.
The palm information includes the shape, palm print and fingerprints of each palm.
The gesture recognition and transmission method further includes: the central processing unit automatically records information about gesture execution terminals. In step S4, the automatic retrieval terminal automatically searches for nearby gesture execution terminals; if a gesture execution terminal has previously received a transmission, the central processing unit directly instructs the information transmitting terminal to transmit the pre-stored gesture images and their meanings to it; otherwise, no transmission is performed. The automatic retrieval terminal is a wireless network card, an antenna or a Bluetooth module and is in signal connection with the central processing unit.
In step S2, the user can choose different meanings represented by the same gesture in different settings; in step S3, the central processing unit records the gesture and the meanings it represents in gesture pre-storage modules of different types according to the settings selected by the user. The types include catering, lodging, entertainment and household.
In step S4, when the automatic retrieval terminal finds a gesture execution terminal to which a transmission was previously made, the central processing unit determines the type of the gesture execution terminal from the terminal's information and then retrieves the corresponding gestures and their meanings from the gesture pre-storage module of the same type.
The user can define different meanings for the same gesture in different settings. When the central processing unit exchanges signals with a gesture execution terminal, it also analyzes the type of the gesture execution terminal and uploads the gestures of the corresponding type to that terminal.
The gesture recognition and transmission method further includes: between step S6 and step S7, the alarm device announces the meaning of the gesture by voice or on-screen text. The alarm device is a screen or a loudspeaker.
Embodiment 1
First, the user records the corresponding gesture with a camera, and the camera transmits the series of gesture images to the central processing unit. The user then opens the software on a mobile phone or computer and, in the software, selects "I am thirsty" as the meaning represented by the gesture and assigns it to the catering type. The central processing unit stores the series of gesture images and their corresponding meaning in the gesture pre-storage module. When the user goes out, the pre-stored series of gesture images in the gesture pre-storage module is carried on a mobile device such as a mobile phone. On arriving at a restaurant, the relevant pre-stored series of gesture images and their meanings are uploaded directly through the phone's antenna to the restaurant's communication equipment, and the communication equipment in turn transmits them to the processor. When the user makes the gesture in the designated area or in front of an intelligent robot, the restaurant's camera or the camera carried by the intelligent robot captures the user's gesture and sends it to the restaurant's main processor. From the series of gesture images captured by the camera, the processor identifies the source of the gesture by analyzing the palm print and fingerprints, retrieves the relevant user's series of gesture images, finds the meaning represented by the gesture within the catering type, and then sends a command to the gesture execution terminal so that, according to the meaning of the gesture, it brings the user a glass of water and presents the drinks menu.
Embodiment 2
This embodiment is largely the same as Embodiment 1, the difference being that the user has visited the restaurant before and has already transmitted the relevant series of gesture images to it. When the user comes near the restaurant, the wireless network card continuously scans the surrounding networks to see whether there is a previously joined network that it can enter. When the user is within a suitable distance of the restaurant, the wireless network card connects to the restaurant's network and uploads the relevant series of gesture images and their meanings to the restaurant's communication equipment, which in turn transmits the information to the processor.
Obviously, the above embodiments are merely examples given to illustrate the present invention clearly and are not intended to limit its embodiments. For those of ordinary skill in the art, other variations or changes in different forms may be made on the basis of the above description. It is neither necessary nor possible to enumerate all embodiments here. Any modifications, equivalent replacements and improvements made within the spirit and principle of the invention shall fall within the protection scope of the claims of the present invention.

Claims (8)

1. A gesture recognition and transmission method, characterized by comprising:
S1. A first gesture receiving terminal receives gesture images of the user and uploads the gesture images to a central processing unit;
S2. An input terminal receives the user's instruction defining the meaning represented by the gesture images and uploads it to the central processing unit;
S3. The central processing unit records the gesture images together with their corresponding meanings in a gesture pre-storage module;
S4. Through an information transmitting terminal, the user transmits the pre-stored gesture images and their corresponding meanings to a gesture execution terminal, and the gesture execution terminal records the pre-stored gesture images and their corresponding meanings in a processor matched with the gesture execution terminal;
S5. A second gesture receiving terminal connected to the gesture execution terminal receives gesture images of the user and uploads the gesture images to the processor;
S6. The processor retrieves the pre-stored gesture images and their corresponding meanings and analyzes the meaning of the gesture images received by the second gesture receiving terminal;
S7. According to the meaning of the gesture images, the processor sends a command to the gesture execution terminal, and the gesture execution terminal executes the relevant command according to the meaning of the gesture images.
2. The gesture recognition and transmission method according to claim 1, characterized in that, in step S1, a palm recording module of the central processing unit analyzes the gesture images obtained from the first gesture receiving terminal, obtains the palm information and records it; in step S4, the information transmitting terminal also sends the palm information to the gesture execution terminal; and in step S6, the processor analyzes the palm information and retrieves the associated pre-stored gesture images and the meanings of the pre-stored gesture images.
3. The gesture recognition and transmission method according to claim 2, characterized in that the palm information includes the shape, palm print and fingerprints of the palm.
4. The gesture recognition and transmission method according to claim 1, characterized in that the method further includes that the central processing unit automatically records information about gesture execution terminals; in step S4, an automatic retrieval terminal automatically searches for nearby gesture execution terminals; if a gesture execution terminal has previously received a transmission of the pre-stored gesture images, the central processing unit directly instructs the information transmitting terminal to transmit the pre-stored gesture images and the meanings of the pre-stored gesture images to that gesture execution terminal; otherwise, no transmission is performed.
5. The gesture recognition and transmission method according to any one of claims 1 to 4, characterized in that, in step S2, the user can choose different meanings represented by the same gesture image in different settings, and in step S3, the central processing unit records the gesture images and the meanings they represent in gesture pre-storage modules of different types according to the settings selected by the user.
6. The gesture recognition and transmission method according to claim 5, characterized in that, in step S4, when the automatic retrieval terminal finds a gesture execution terminal to which a transmission was previously made, the central processing unit determines the type of the gesture execution terminal from the information of the gesture execution terminal and then retrieves the corresponding pre-stored gestures and their meanings from the gesture pre-storage module of the same type.
7. The gesture recognition and transmission method according to claim 5, characterized in that the types include catering, lodging, entertainment and household.
8. The gesture recognition and transmission method according to claim 1, characterized in that, after step S6 and before step S7, the method further includes an alarm device announcing the meaning of the gesture by voice or on-screen text.
CN201810278289.1A 2017-12-30 2018-03-30 Gesture recognition and transmission method Withdrawn CN109993043A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2017219108591 2017-12-30
CN201721910859 2017-12-30

Publications (1)

Publication Number Publication Date
CN109993043A 2019-07-09

Family

ID=67128928

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201810278289.1A Withdrawn CN109993043A (en) 2017-12-30 2018-03-30 A kind of gesture identification and transmission method
CN201810279492.0A Withdrawn CN109992102A (en) 2017-12-30 2018-03-30 A kind of gesture identification and transmitting device and its system

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810279492.0A Withdrawn CN109992102A (en) 2017-12-30 2018-03-30 A kind of gesture identification and transmitting device and its system

Country Status (1)

Country Link
CN (2) CN109993043A (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105224066A (en) * 2014-06-03 2016-01-06 北京创思博德科技有限公司 A kind of gesture identification method based on high in the clouds process
CN104834907A (en) * 2015-05-06 2015-08-12 江苏惠通集团有限责任公司 Gesture recognition method, apparatus, device and operation method based on gesture recognition
CN106227341A (en) * 2016-07-20 2016-12-14 南京邮电大学 Unmanned plane gesture interaction method based on degree of depth study and system

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6072494A (en) * 1997-10-15 2000-06-06 Electric Planet, Inc. Method and apparatus for real-time gesture recognition
CN103136508A (en) * 2011-12-05 2013-06-05 联想(北京)有限公司 Gesture identification method and electronic equipment
CN102857808A (en) * 2012-09-29 2013-01-02 上海广电电子科技有限公司 Intelligent mobile Internet device (MID), intelligent television, as well as gesture control system and method
CN103874031A (en) * 2012-12-17 2014-06-18 腾讯科技(深圳)有限公司 Data transmission method and equipment
CN105334958A (en) * 2015-09-11 2016-02-17 南京西西弗信息科技有限公司 Gesture recognition system and realization method
CN105353634A (en) * 2015-11-30 2016-02-24 北京地平线机器人技术研发有限公司 Household appliance and method for controlling operation by gesture recognition
CN105471924A (en) * 2016-01-29 2016-04-06 驰众信息技术(上海)有限公司 Wireless electronic Bluetooth identity recognition system
CN105700541A (en) * 2016-03-18 2016-06-22 普宙飞行器科技(深圳)有限公司 Control method of unmanned aerial vehicle, unmanned aerial vehicle system, and unmanned aerial vehicle
CN106529249A (en) * 2016-10-31 2017-03-22 宇龙计算机通信科技(深圳)有限公司 An information interaction method and virtual reality glasses

Also Published As

Publication number Publication date
CN109992102A (en) 2019-07-09

Similar Documents

Publication Publication Date Title
KR101803081B1 (en) Robot for store management
CN105095873B (en) Photo be shared method, apparatus
US9244888B2 (en) Inferring placement of mobile electronic devices
CN105117007B (en) Show control method, device and the intelligent pad of equipment
CN109074819A (en) Preferred control method based on operation-sound multi-mode command and the electronic equipment using it
CN104104910B (en) It is a kind of to carry out two-way live shared terminal and method with intelligent monitoring
CN106209800A (en) Equipment Authority sharing method and apparatus
CN109917979A (en) A kind of searching method and mobile terminal
CN105471924B (en) Electronics bluetooth wireless identity identifying system
CN108390921A (en) The system and method for providing sensing data to electronic equipment
CN109446775A (en) A kind of acoustic-controlled method and electronic equipment
CN109189986A (en) Information recommendation method, device, electronic equipment and readable storage medium storing program for executing
CN104837049A (en) User terminal apparatus, display apparatus, and control methods thereof
EP2114041A1 (en) Context-sensitive data handling
CN109409244A (en) A kind of object puts the output method and mobile terminal of scheme
CN107861669A (en) The switching method and mobile terminal of a kind of custom system
CN106341382A (en) Multi-device screen sharing system between conference devices
CN105516944A (en) Short message canceling method and device
CN109639569A (en) A kind of social communication method and terminal
CN105897600A (en) Method and device for controlling router load balance
CN108123999A (en) A kind of information push method and mobile terminal
CN110413169A (en) A kind of information displaying method, device and medium
CN110188252A (en) A kind of searching method and terminal
CN106791563A (en) Information transferring method, local device, opposite equip. and system
CN109828668A (en) A kind of display control method and electronic equipment

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20190709)