CN107479686A - Gesture recognition system, method, and apparatus - Google Patents
Gesture recognition system, method, and apparatus
- Publication number
- CN107479686A CN107479686A CN201610404848.XA CN201610404848A CN107479686A CN 107479686 A CN107479686 A CN 107479686A CN 201610404848 A CN201610404848 A CN 201610404848A CN 107479686 A CN107479686 A CN 107479686A
- Authority
- CN
- China
- Prior art keywords
- gesture
- terminal
- wearable device
- word content
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention relates to the field of wireless communication technology, and in particular to a gesture recognition system, method, and apparatus, intended to solve the poor practicality of sign language recognition devices in the prior art. The system includes a wearable device and a terminal, the wearable device being connected to the terminal. The wearable device detects the user's gesture at preset intervals, obtains hand motion parameter information, and sends the hand motion parameter information to the terminal. The terminal receives the hand motion parameter information sent by the wearable device, determines the user's gesture according to it, finds the text content corresponding to the gesture in a database of preset gesture-to-text mappings, and outputs the text content found. Because this technical scheme simplifies the implementation of the sign language recognition device, it ensures normal communication between deaf-mute and hearing people.
Description
Technical field
The present invention relates to the field of wireless communication technology, and in particular to a gesture recognition system, method, and apparatus.
Background technology
Sign language is the language used by deaf-mute people to communicate. It conveys meaning through hand movements, forming syllables or words according to changes in hand shape. However, because sign language is complex, it is clearly impractical to expect most hearing people to master it well enough to converse with deaf-mute people, which creates a communication barrier between them.
In the prior art, sign language recognition devices help deaf-mute people communicate with hearing people, but existing sign language recognition devices are bulky, complicated, and easily damaged, and their practicality is poor.
Summary of the invention
The present invention provides a gesture recognition system, method, and apparatus to solve the poor practicality of sign language recognition devices in the prior art.
An embodiment of the invention provides a gesture recognition system including a wearable device and a terminal, the wearable device being connected to the terminal. The wearable device detects the user's gesture at preset intervals, obtains hand motion parameter information, and sends the hand motion parameter information to the terminal. The terminal receives the hand motion parameter information sent by the wearable device, determines the user's gesture according to it, finds the text content corresponding to the gesture in a database of preset gesture-to-text mappings, and outputs the text content found.
Optionally, the wearable device detects the user's gesture at preset intervals after sending a start flag to the terminal, the start flag indicating to the terminal the moment the gesture begins; it also sends an end flag to the terminal, the end flag indicating to the terminal the moment the gesture ends.
Optionally, the terminal receives the start flag, the hand motion parameter information, and the end flag, and, after receiving the end flag, determines the user's gesture according to the hand motion parameter information received between the start flag and the end flag.
Optionally, the terminal displays the text content found, and/or plays the text content found in the form of voice.
An embodiment of the invention provides a gesture recognition method, including:
a wearable device detects the user's gesture at preset intervals and obtains hand motion parameter information; and the wearable device sends the hand motion parameter information to the terminal connected to the wearable device.
Optionally, the wearable device detects the user's gesture at preset intervals after sending a start flag to the terminal, the start flag indicating to the terminal the moment the gesture begins; it also sends an end flag to the terminal, the end flag indicating to the terminal the moment the gesture ends.
An embodiment of the invention provides a gesture recognition method, including:
a terminal, after receiving the hand motion information sent by the wearable device connected to the terminal, finds the text content corresponding to the gesture in a database of preset gesture-to-text mappings according to the hand motion information, and outputs the text content found.
Optionally, the terminal receives the start flag and the end flag sent by the wearable device, the start flag indicating to the terminal the moment the gesture begins and the end flag the moment the gesture ends, and determines the user's gesture according to the hand motion parameter information received between the start flag and the end flag.
Optionally, the terminal displays the text content found, and/or plays the text content found in the form of voice.
An embodiment of the invention provides a gesture recognition device including a processing unit, a detection unit, and a transceiver unit. The processing unit detects the user's gesture at preset intervals through the detection unit and obtains hand motion parameter information; the transceiver unit sends the hand motion parameter information to the terminal connected to the device.
Optionally, the processing unit detects the user's gesture at preset intervals through the detection unit after the transceiver unit sends a start flag to the terminal, the start flag indicating to the terminal the moment the gesture begins; the transceiver unit is also configured to send an end flag to the terminal, the end flag indicating to the terminal the moment the gesture ends.
An embodiment of the invention provides a gesture recognition terminal including a transceiver unit, a processing unit, and an output unit. The transceiver unit receives the hand motion information sent by the wearable device connected to the terminal; the processing unit finds, according to the hand motion information, the text content corresponding to the gesture in a database of preset gesture-to-text mappings;
the output unit outputs the text content found.
Optionally, the transceiver unit also receives the start flag and the end flag sent by the wearable device, the start flag indicating to the terminal the moment the gesture begins and the end flag the moment the gesture ends;
the processing unit determines the user's gesture according to the hand motion parameter information received between the start flag and the end flag.
Optionally, the output unit is specifically configured to display the text content found, and/or play the text content found in the form of voice.
In embodiments of the present invention, the wearable device obtains hand motion parameter information after detecting a gesture and sends the hand motion parameter information to the terminal; the terminal determines the user's gesture according to the received hand motion information, finds the text content corresponding to the gesture in a database of preset gesture-to-text mappings, and outputs the text content found. A person who does not understand sign language can therefore understand the meaning of the signs through the text content output on the terminal. This simplifies the implementation of the sign language recognition device and ensures normal communication between deaf-mute and hearing people.
Brief description of the drawings
Fig. 1 is a schematic diagram of the gesture recognition system of an embodiment of the present invention;
Fig. 2 is a flow diagram of the gesture recognition method of an embodiment of the present invention;
Fig. 3 is a flow diagram of the gesture recognition method of an embodiment of the present invention;
Fig. 4 is a structural diagram of the gesture recognition device of an embodiment of the present invention;
Fig. 5 is a hardware structure diagram of the gesture recognition device of an embodiment of the present invention;
Fig. 6 is a structural diagram of the gesture recognition terminal of an embodiment of the present invention;
Fig. 7 is a hardware structure diagram of the gesture recognition terminal of an embodiment of the present invention.
Detailed description of the embodiments
To make the purpose, technical scheme, and advantages of the application clearer, the application is described in further detail below in conjunction with the accompanying drawings.
As shown in Fig. 1, the gesture recognition system of an embodiment of the present invention includes a wearable device 100 and a terminal 110, the wearable device 100 being connected to the terminal 110. Specifically, the wearable device 100 detects the user's gesture at preset intervals, obtains hand motion parameter information, and sends the hand motion parameter information to the terminal 110; the terminal 110 receives the hand motion parameter information sent by the wearable device, determines the user's gesture according to it, finds the text content corresponding to the gesture in a database of preset gesture-to-text mappings, and outputs the text content found.
It should be noted that, in embodiments of the present invention, to enable the wearable device to correctly detect the motion of the fingers and palm, the wearable device can be designed in the shape of a glove and worn on both hands. Specifically, the wearable device includes multiple sensors for sensing the motion of the hand.
The terminal of the embodiment of the present invention can be an electronic device, such as a mobile phone, tablet computer, or notebook computer, that can connect to the wearable device and output text content.
It should be noted that the wearable device and the terminal can be connected wirelessly, for example over a WiFi (Wireless Fidelity) network or Bluetooth, or through a wired connection.
It should be noted that the wearable device detects the user's gesture at preset intervals, where the preset interval can be configured according to the hardware of the wearable device. Specifically, after detecting the user's gesture for a period of time, the device waits out the preset interval and then continues detecting, repeating this process until the gesture ends. The duration of each detection can also be set according to actual needs.
To enable the terminal to distinguish a complete sign language sentence, optionally, before the user starts a gesture, the user presses a start button on the wearable device, and the wearable device sends a start flag to the terminal, the start flag indicating to the terminal the moment the gesture begins; after sending the start flag, the device detects the user's gesture at each preset interval. After finishing the gesture, the user presses an end button on the wearable device, and the wearable device sends an end flag to the terminal, the end flag indicating to the terminal the moment the gesture ends. After receiving the end flag, the terminal determines the user's gesture according to the hand motion parameter information received between the start flag and the end flag.
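The start-flag / samples / end-flag exchange described above can be framed as a simple message sequence. The sketch below uses JSON messages over an in-memory queue purely for illustration; the message names and the transport are assumptions, not the patent's wire format:

```python
import json
from collections import deque

link = deque()  # stands in for the WiFi/Bluetooth/wired link

def wearable_send(msg_type, payload=None):
    link.append(json.dumps({"type": msg_type, "payload": payload}))

# Wearable side: frame one gesture with start/end markers.
wearable_send("start")                         # user pressed the start button
wearable_send("motion", {"palm_accel": 0.5})   # one sample per preset interval
wearable_send("motion", {"palm_accel": 0.7})
wearable_send("end")                           # user pressed the end button

# Terminal side: keep only the samples between start flag and end flag.
samples, recording = [], False
while link:
    msg = json.loads(link.popleft())
    if msg["type"] == "start":
        recording, samples = True, []
    elif msg["type"] == "end":
        recording = False
    elif msg["type"] == "motion" and recording:
        samples.append(msg["payload"])

assert len(samples) == 2
```

The flags let the terminal segment a continuous stream of sensor data into one gesture per button press.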
The hand motion parameter information includes motion parameter information for each finger and for the palm; the motion parameter information includes information such as the direction of movement, acceleration, angular velocity, and displacement of the finger or palm.
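The parameter set above can be modeled as one record per finger and one for the palm. A minimal sketch in Python; all field names and units are illustrative assumptions, not the patent's own data format:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class MotionParams:
    """Motion parameters for one body part (a finger or the palm)."""
    direction: List[float]   # unit vector of movement, e.g. [x, y, z]
    acceleration: float      # m/s^2
    angular_velocity: float  # rad/s
    displacement: float      # metres moved during the sampling interval

@dataclass
class HandMotionSample:
    """One sample: parameters for each of the five fingers plus the palm."""
    fingers: List[MotionParams]  # index 0..4, thumb to little finger
    palm: MotionParams

palm = MotionParams(direction=[0.0, 1.0, 0.0], acceleration=0.5,
                    angular_velocity=0.1, displacement=0.02)
sample = HandMotionSample(fingers=[palm] * 5, palm=palm)
assert len(sample.fingers) == 5
```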
It should be noted that the end button and the start button can be the same button or two different buttons, and a button can be either a physical button or a virtual one.
Optionally, when the terminal outputs the text content found, it can display the text content found, play it in the form of voice, or play it in the form of voice while displaying it.
Specifically, the language displayed or played can also be set as required; for example, if the other party only knows English, text in English can be selected for display and/or playback.
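The display-and/or-voice output choice, including the language selection just described, can be sketched as a small dispatch function. The translation marker and the text-to-speech placeholder are assumptions; a real terminal would use its platform's rendering and speech APIs:

```python
def output_text(text, modes=("display", "voice"), language="en"):
    """Output recognized text on the chosen channels (sketch only)."""
    # Hypothetical localization step: emit in the language the other
    # party understands (e.g. English if they only read English).
    localized = text if language == "en" else f"[{language}] {text}"
    if "display" in modes:
        print(localized)   # stands in for on-screen rendering
    if "voice" in modes:
        pass               # placeholder for a text-to-speech call
    return localized

assert output_text("hello", modes=("display",)) == "hello"
```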
Based on the same inventive concept, an embodiment of the present invention also provides a gesture recognition method. Since the system corresponding to this method is the gesture recognition system of the embodiment of the present invention, the implementation of the method can refer to the implementation of the system, and repeated parts are not described again.
As shown in Fig. 2, the gesture recognition method of an embodiment of the present invention includes:
Step 200: the wearable device detects the user's gesture at preset intervals and obtains hand motion parameter information.
Step 201: the wearable device sends the hand motion parameter information to the terminal connected to the wearable device.
Optionally, the wearable device detects the user's gesture at preset intervals after sending a start flag to the terminal, the start flag indicating to the terminal the moment the gesture begins; it also sends an end flag to the terminal, the end flag indicating to the terminal the moment the gesture ends.
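Steps 200 and 201 amount to a periodic sample-and-send loop on the wearable device. A minimal sketch, with the preset interval, the sensor reader, and the send function left as injectable assumptions:

```python
import time

def detection_loop(read_sensors, send_to_terminal, interval_s=0.1, n_samples=5):
    """Detect the user's gesture every preset interval and forward the
    hand motion parameter information to the connected terminal."""
    sent = []
    for _ in range(n_samples):
        params = read_sensors()    # step 200: obtain motion parameters
        send_to_terminal(params)   # step 201: send them to the terminal
        sent.append(params)
        time.sleep(interval_s)     # wait out the preset interval
    return sent

out = []
result = detection_loop(lambda: {"accel": 1.0}, out.append,
                        interval_s=0.0, n_samples=3)
assert len(out) == 3
```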
As shown in Fig. 3, the gesture recognition method of an embodiment of the present invention includes:
Step 300: after receiving the hand motion information sent by the wearable device connected to it, the terminal finds the text content corresponding to the gesture in a database of preset gesture-to-text mappings according to the hand motion information.
Step 301: the terminal outputs the text content found.
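Steps 300 and 301 reduce to a lookup in the gesture-to-text database followed by output. A sketch in which classifying raw motion data into a gesture key is stubbed out and the database is a plain dictionary; both are assumptions standing in for whatever matching scheme and storage the patent leaves unspecified:

```python
# Preset database mapping recognized gestures to text content (illustrative).
GESTURE_DB = {"wave": "hello", "fist": "stop"}

def classify(motion_samples):
    """Stub: a real terminal would match motion parameters to a gesture."""
    return "wave" if motion_samples else None

def recognize_and_output(motion_samples):
    gesture = classify(motion_samples)  # step 300: determine the gesture
    text = GESTURE_DB.get(gesture)      # look up the corresponding text
    if text is not None:
        print(text)                     # step 301: output the text found
    return text

assert recognize_and_output([{"accel": 0.5}]) == "hello"
```

If no entry matches, nothing is output, mirroring the "after finding text content corresponding to the gesture" condition in the description.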
Optionally, the terminal receives the start flag and the end flag sent by the wearable device, the start flag indicating to the terminal the moment the gesture begins and the end flag the moment the gesture ends, and determines the user's gesture according to the hand motion parameter information received between the start flag and the end flag.
Optionally, the terminal displays the text content found, and/or plays the text content found in the form of voice.
Based on the same inventive concept, an embodiment of the present invention also provides a gesture recognition device. Since the system corresponding to this device is the gesture recognition system of the embodiment of the present invention, the implementation of the device can refer to the implementation of the system, and repeated parts are not described again.
As shown in Fig. 4, the gesture recognition device of an embodiment of the present invention includes a processing unit 400, a detection unit 401, and a transceiver unit 402. The processing unit 400 detects the user's gesture at preset intervals through the detection unit 401 and obtains hand motion parameter information; the transceiver unit 402 sends the hand motion parameter information to the terminal connected to the device.
Optionally, the processing unit 400 detects the user's gesture at preset intervals through the detection unit 401 after the transceiver unit 402 sends a start flag to the terminal, the start flag indicating to the terminal the moment the gesture begins; the transceiver unit 402 is also configured to send an end flag to the terminal, the end flag indicating to the terminal the moment the gesture ends.
It should be noted that, in the embodiment of the present invention, the processing unit 400 can be implemented by a processor, the detection unit by sensors, and the transceiver unit 402 by a transceiver.
As shown in Fig. 5, the gesture recognition device 500 can include a processor 510, sensors 520, a transceiver 530, and a memory 540. The memory 540 can store the program/code pre-installed when the device 500 leaves the factory, and can also store code to be executed by the processor 510.
The processor 510 can be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for performing the relevant operations to realize the technical scheme provided by the embodiment of the present invention.
It should be noted that although the device 500 shown in Fig. 5 only shows the processor 510, sensors 520, transceiver 530, and memory 540, in a specific implementation it should be apparent to those skilled in the art that the device also contains other components necessary for normal operation. Depending on specific needs, it should likewise be apparent that the device can also contain hardware for realizing other additional functions, or can contain only the components or modules necessary to realize the embodiment of the present invention, without including all the components shown in Fig. 5.
Those of ordinary skill in the art will appreciate that all or part of the flow of the above method embodiments can be completed by a computer program instructing the relevant hardware. The program can be stored in a computer-readable storage medium, and when executed can include the flow of each of the above method embodiments. The storage medium can be a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).
As shown in Fig. 6, the gesture recognition terminal of an embodiment of the present invention includes a transceiver unit 600, a processing unit 601, and an output unit 602. The transceiver unit 600 receives the hand motion information sent by the wearable device connected to the terminal; the processing unit 601 finds, according to the hand motion information, the text content corresponding to the gesture in a database of preset gesture-to-text mappings;
the output unit 602 outputs the text content found.
Optionally, the transceiver unit 600 also receives the start flag and the end flag sent by the wearable device, the start flag indicating to the terminal the moment the gesture begins and the end flag the moment the gesture ends;
the processing unit 601 determines the user's gesture according to the hand motion parameter information received between the start flag and the end flag.
Optionally, the output unit 602 is specifically configured to display the text content found, and/or play the text content found in the form of voice.
It should be noted that, in the embodiment of the present invention, the processing unit 601 can be implemented by a processor, the output unit 602 by a display and/or a player, and the transceiver unit 600 by a transceiver.
As shown in Fig. 7, the gesture recognition terminal 700 can include a processor 710, a display and/or player 720, a transceiver 730, and a memory 740. The memory 740 can store the program/code pre-installed when the terminal 700 leaves the factory, and can also store code to be executed by the processor 710.
The processor 710 can be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for performing the relevant operations to realize the technical scheme provided by the embodiment of the present invention.
It should be noted that although the terminal 700 shown in Fig. 7 only shows the processor 710, display and/or player 720, transceiver 730, and memory 740, in a specific implementation it should be apparent to those skilled in the art that the terminal also contains other components necessary for normal operation. Depending on specific needs, it should likewise be apparent that the terminal can also contain hardware for realizing other additional functions, or can contain only the components or modules necessary to realize the embodiment of the present invention, without including all the components shown in Fig. 7.
Those of ordinary skill in the art will appreciate that all or part of the flow of the above method embodiments can be completed by a computer program instructing the relevant hardware. The program can be stored in a computer-readable storage medium, and when executed can include the flow of each of the above method embodiments. The storage medium can be a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).
From the above it can be seen that the gesture recognition system of the embodiment of the present invention includes a wearable device and a terminal, the wearable device being connected to the terminal. The wearable device detects the user's gesture at preset intervals, obtains hand motion parameter information, and sends the hand motion parameter information to the terminal; the terminal receives the hand motion parameter information sent by the wearable device, determines the user's gesture according to it, finds the text content corresponding to the gesture in a database of preset gesture-to-text mappings, and outputs the text content found. With this technical scheme, a person who does not understand sign language can, through the wearable device and the terminal, understand the meaning of the signs from the text content output on the terminal, which simplifies the implementation of the sign language recognition device and ensures normal communication between deaf-mute and hearing people.
It should be understood by those skilled in the art that embodiments of the invention can be provided as a method, a system, or a computer program product. Therefore, the present invention can take the form of a complete hardware embodiment, a complete software embodiment, or an embodiment combining software and hardware. Moreover, the present invention can take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to magnetic disk storage, CD-ROM, and optical memory) containing computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks therein, can be implemented by computer program instructions. These computer program instructions can be provided to the processor of a general-purpose computer, a special-purpose computer, an embedded processor, or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions can also be stored in a computer-readable memory that can guide a computer or another programmable data processing device to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device, the instruction device realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
These computer program instructions can also be loaded onto a computer or another programmable data processing device, so that a series of operational steps is performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for realizing the functions specified in one or more flows of the flowchart and/or one or more blocks of the block diagram.
Although preferred embodiments of the present invention have been described, those skilled in the art, once aware of the basic inventive concept, can make other changes and modifications to these embodiments. The appended claims are therefore intended to be construed as including the preferred embodiments and all changes and modifications that fall within the scope of the invention.
Obviously, those skilled in the art can make various changes and modifications to the present invention without departing from its spirit and scope. Thus, if these modifications and variations of the present invention fall within the scope of the claims of the present invention and their technical equivalents, the present invention is also intended to include them.
Claims (14)
- 1. A gesture recognition system, characterized by comprising a wearable device and a terminal, the wearable device being connected to the terminal; the wearable device being configured to detect the user's gesture at preset intervals, obtain hand motion parameter information, and send the hand motion parameter information to the terminal; the terminal being configured to receive the hand motion parameter information sent by the wearable device, determine the user's gesture according to the hand motion parameter information, and, after finding the text content corresponding to the gesture in a database of preset gesture-to-text mappings, output the text content found.
- 2. The system according to claim 1, characterized in that, when detecting the user's gesture at preset intervals, the wearable device is specifically configured to detect the user's gesture at preset intervals after sending a start flag to the terminal, the start flag being used to indicate to the terminal the moment the gesture begins; the wearable device being further configured to send an end flag to the terminal, the end flag being used to indicate to the terminal the moment the gesture ends.
- 3. The system according to claim 2, characterized in that, when determining the user's gesture, the terminal is specifically configured to receive the start flag, the hand motion parameter information, and the end flag, and, after receiving the end flag, determine the user's gesture according to the hand motion parameter information received between the start flag and the end flag.
- 4. The system according to any one of claims 1 to 3, characterized in that, when outputting the text content found, the terminal is specifically configured to display the text content found; and/or play the text content found in the form of voice.
- 5. A gesture recognition method, characterized by comprising: a wearable device detecting the user's gesture at preset intervals and obtaining hand motion parameter information; and the wearable device sending the hand motion parameter information to a terminal connected to the wearable device.
- 6. The method according to claim 5, characterized in that the wearable device detecting the user's gesture at preset intervals comprises: the wearable device detecting the user's gesture at preset intervals after sending a start flag to the terminal, the start flag being used to indicate to the terminal the moment the gesture begins; the method further comprising: the wearable device sending an end flag to the terminal, the end flag being used to indicate to the terminal the moment the gesture ends.
- 7. A gesture recognition method, characterized by comprising: a terminal, after receiving the hand motion information sent by a wearable device connected to the terminal, finding the text content corresponding to the gesture in a database of preset gesture-to-text mappings according to the hand motion information; and the terminal outputting the text content found.
- 8. method as claimed in claim 7, it is characterised in that also include:The terminal receives startup flag and the end of identification position that the wearable device is sent, wherein, it is described to start mark Know position to be used at the time of indicate that the terminal gesture starts, when the end of identification position is used to indicating that the terminal gesture to terminate Carve;The terminal determines the gesture of user, specifically includes:The terminal according to the startup flag that receives and the hand exercise parameter information received between flag, Determine the gesture of user.
- 9. method as claimed in claim 7 or 8, it is characterised in that the word content found described in the terminal output, tool Body includes:The terminal show described in the word content that finds;And/orThe terminal by the form of voice play described in the word content that finds.
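Claims 7 to 9 describe the terminal side: buffer the hand motion parameters that arrive between the start and end flags, map the resulting gesture to text through the preset gesture-to-text database, and output the text on screen and/or as voice. A hedged sketch follows; the `classify_gesture` rule and the database entries are invented for illustration and stand in for whatever recognition logic and sign-language vocabulary a real implementation would use:

```python
def classify_gesture(params):
    """Toy stand-in for the patent's gesture determination step:
    here, just the number of buffered samples decides the label."""
    return "wave" if len(params) >= 2 else "tap"

# Preset database of gesture -> text content correspondences (claim 7);
# the entries are illustrative assumptions.
GESTURE_DB = {"wave": "hello", "tap": "yes"}

def handle_messages(messages, display, speak=None):
    """Buffer parameters between the start and end flags (claim 8),
    look up the text content, and display and/or speak it (claim 9)."""
    buffer, collecting = [], False
    for msg in messages:
        if msg["type"] == "start":
            buffer, collecting = [], True
        elif msg["type"] == "params" and collecting:
            buffer.append(msg["data"])
        elif msg["type"] == "end" and collecting:
            gesture = classify_gesture(buffer)
            text = GESTURE_DB.get(gesture, "<unknown gesture>")
            display(text)      # claim 9: display the found text content
            if speak:
                speak(text)    # and/or play it in the form of voice
            collecting = False

# Usage: feed a framed message stream and capture what is displayed.
shown = []
handle_messages(
    [{"type": "start"},
     {"type": "params", "data": {}},
     {"type": "params", "data": {}},
     {"type": "end"}],
    display=shown.append,
)
```

Passing a text-to-speech callback as `speak` would cover the "and/or voice" branch of claim 9 without changing the control flow.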
- 10. A gesture recognition device, comprising: a processing unit, configured to detect a gesture of a user every preset duration via a detection unit to obtain hand motion parameter information; and a transceiver unit, configured to send the hand motion parameter information to a terminal connected to the device.
- 11. The device of claim 10, wherein, when detecting the gesture of the user every preset duration via the detection unit, the processing unit is specifically configured to: after the transceiver unit sends a start flag to the terminal, detect the gesture of the user every preset duration via the detection unit, the start flag being used to indicate to the terminal the moment at which the gesture starts; the transceiver unit being further configured to: send an end flag to the terminal, the end flag being used to indicate to the terminal the moment at which the gesture ends.
- 12. A gesture recognition terminal, comprising: a transceiver unit, configured to receive hand motion information sent by a wearable device connected to the terminal; a processing unit, configured to search, according to the hand motion information, a database of preset correspondences between gestures and text content to find text content corresponding to the gesture; and an output unit, configured to output the found text content.
- 13. The terminal of claim 12, wherein the transceiver unit is further configured to: receive a start flag and an end flag sent by the wearable device, wherein the start flag is used to indicate to the terminal the moment at which the gesture starts, and the end flag is used to indicate to the terminal the moment at which the gesture ends; and wherein, when determining the gesture of the user, the processing unit is specifically configured to: determine the gesture of the user according to the hand motion parameter information received between the start flag and the end flag.
- 14. The terminal of claim 12 or 13, wherein the output unit is specifically configured to: display the found text content; and/or play the found text content in the form of voice.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610404848.XA CN107479686A (en) | 2016-06-08 | 2016-06-08 | A kind of system of gesture identification, method and apparatus |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107479686A true CN107479686A (en) | 2017-12-15 |
Family
ID=60593813
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610404848.XA Pending CN107479686A (en) | 2016-06-08 | 2016-06-08 | A kind of system of gesture identification, method and apparatus |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107479686A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112068700A (en) * | 2020-09-04 | 2020-12-11 | 北京服装学院 | Character information input method, device and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103257711A (en) * | 2013-05-24 | 2013-08-21 | 河南科技大学 | Space gesture input method |
CN104516483A (en) * | 2013-09-28 | 2015-04-15 | 南京专创知识产权服务有限公司 | Gesture language input identification system based on motion-sensing technology |
CN104980599A (en) * | 2015-06-17 | 2015-10-14 | 上海斐讯数据通信技术有限公司 | Sign language-voice call method and sign language-voice call system |
CN104978886A (en) * | 2015-06-29 | 2015-10-14 | 广西瀚特信息产业股份有限公司 | Sign language interpreting system based on motion sensing technology and processing method |
CN205158728U (en) * | 2015-11-19 | 2016-04-13 | 陆庆健 | Sign language translating system |
- 2016-06-08: CN application CN201610404848.XA filed (publication CN107479686A/en), status: Pending
Non-Patent Citations (1)
Title |
---|
Lu Humin: "Aircraft Cockpit Display and Control Technology" (《飞机座舱显示与控制技术》), 31 December 2015, Aviation Industry Press * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106055088B (en) | The air of interactive wearable device writes and gesture system | |
JP6602803B2 (en) | Tactilely enhanced interactivity of interactive content | |
US10446059B2 (en) | Hand motion interpretation and communication apparatus | |
KR101794493B1 (en) | Mobile devices and methods employing haptics | |
TWI457793B (en) | Real-time motion recognition method and inertia sensing and trajectory | |
CN103870199B (en) | The recognition methods of user operation mode and handheld device in handheld device | |
US20120270654A1 (en) | Method and apparatus for scaling gesture recognition to physical dimensions of a user | |
WO2018161906A1 (en) | Motion recognition method, device, system and storage medium | |
HUE027334T2 (en) | Method and apparatus for tracking orientation of a user | |
WO2012151471A2 (en) | Identifying gestures using multiple sensors | |
Liu et al. | Multi-HMM classification for hand gesture recognition using two differing modality sensors | |
CN104076920A (en) | Information processing apparatus, information processing method, and storage medium | |
Zhang et al. | FingOrbits: interaction with wearables using synchronized thumb movements | |
EP3304953B1 (en) | Transmitting athletic data using non-connected state of discovery signal | |
CN108760214A (en) | Projected angle of impact acquisition methods and Related product | |
CN108572719A (en) | The intelligent helmet control method and system identified using figure | |
KR101793607B1 (en) | System, method and program for educating sign language | |
CN106055958B (en) | A kind of unlocking method and device | |
CN107479686A (en) | A kind of system of gesture identification, method and apparatus | |
CN112001442B (en) | Feature detection method, device, computer equipment and storage medium | |
CN108646975A (en) | Information processing method and device | |
CN105931627B (en) | The musical instrument analogy method and device of action recognition based on artificial intelligence | |
CN108089710A (en) | A kind of electronic equipment control method, device and electronic equipment | |
CN114341947A (en) | System and method for exercise type recognition using wearable devices | |
CN107172302A (en) | Audio control method and device, computer installation and computer-readable recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20171215 |