CN103226386A - Gesture identification method and system based on mobile terminal - Google Patents
- Publication number: CN103226386A
- Application number: CN2013100804501A (CN201310080450A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- mobile phone
- phone users
- reflected
- portable terminal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention belongs to the field of motion recognition, and provides a gesture identification method and system based on a mobile terminal. Ultrasonic waves are emitted from four different positions, the distances between the gesture of the mobile terminal user and the four emission positions are computed, and a spatial model is built from the changes in the gesture's spatial position relative to those four positions. A gesture motion can thus be identified quickly and accurately, and the corresponding operation executed. This solves the prior-art problems that, when a motion trajectory is captured by a camera, recognition accuracy is difficult to guarantee and the recognition is easily disturbed by light and other factors.
Description
Technical field
The invention belongs to the field of motion recognition, and in particular relates to a gesture identification method and system based on a mobile terminal.
Background art
As more and more terminal devices gain motion recognition functions, people's lives have become more enjoyable and everyday operation more convenient.
In the prior art, however, most motion recognition is obtained by analyzing pictures captured by a camera. Not only is it difficult to reach a high standard of recognition accuracy, but such recognition is also easily disturbed by light and other factors; more importantly, the spatial depth of a motion is hard to capture, so true 3D motions are difficult to identify.
Summary of the invention
The invention provides a motion recognition method and system for a mobile terminal, intended to solve the prior-art problems that, when a motion trajectory is captured by a camera, recognition accuracy is difficult to guarantee and the recognition is easily disturbed by light and other factors.
To solve the above technical problems, the invention provides a gesture identification method based on a mobile terminal, the method comprising the following steps:
emitting continuous ultrasonic waves toward the mobile terminal user from 4 different positions, and receiving the reflected waves;
recording, for each of the 4 positions, the time difference between each emission of an ultrasonic wave and the reception of its reflection;
calculating the distance between the user's gesture and each of the 4 positions at the moment each emitted ultrasonic wave is reflected;
calculating the spatial position of the user's gesture relative to the 4 positions at the moment each emitted ultrasonic wave is reflected;
establishing a three-dimensional spatial model of the user's gesture according to the calculated changes in its spatial position relative to the 4 positions; and
identifying the gesture motion of the mobile terminal user.
Further, the 4 different positions are located respectively at the origin O of a spatial rectangular coordinate system established on the mobile terminal and on its X-axis, Y-axis and Z-axis, the three positions on the X-axis, Y-axis and Z-axis being at distances Kx, Ky and Kz from the origin O, respectively.
Further, the formula for calculating the distance between the user's gesture and each of the 4 positions at the moment each emitted ultrasonic wave is reflected is:

S = V × t / 2

wherein V denotes the speed of the ultrasonic wave, and t denotes the time from the emission of the wave until its reflection is received.
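As a minimal sketch, assuming the standard round-trip time-of-flight relation S = V × t / 2 and a speed of sound of roughly 340 m/s in air (function and constant names are illustrative, not from the patent):

```python
# Round-trip time-of-flight ranging: the wave travels to the hand and back,
# so the one-way distance is half of speed times elapsed time.
SPEED_OF_SOUND = 340.0  # m/s in air, approximate room-temperature value

def distance_from_echo(round_trip_time_s: float, v: float = SPEED_OF_SOUND) -> float:
    """Distance in metres to the reflecting gesture, from the emit-to-receive delay."""
    return v * round_trip_time_s / 2.0

# A 2 ms emit-to-receive delay corresponds to a hand about 0.34 m away.
print(distance_from_echo(0.002))
```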
Further, the formulas for calculating the spatial position (x, y, z) of the user's gesture relative to the 4 positions at the moment each emitted ultrasonic wave is reflected are:

x = (So² + Kx² - Sa²) / (2Kx)
y = (So² + Ky² - Sb²) / (2Ky)
z = (So² + Kz² - Sc²) / (2Kz)

wherein So, Sa, Sb and Sc denote the distances between the user's gesture and the emission positions located at the origin O and on the X-axis, Y-axis and Z-axis, respectively.
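Assuming the four emitters sit at O = (0, 0, 0), a = (Kx, 0, 0), b = (0, Ky, 0) and c = (0, 0, Kz), the sphere equations So² = x² + y² + z², Sa² = (x - Kx)² + y² + z², and so on, admit a closed-form solution. A sketch (the function name is illustrative):

```python
def gesture_position(So, Sa, Sb, Sc, Kx, Ky, Kz):
    """Solve for the gesture position (x, y, z) from the four measured
    distances, assuming emitters at the origin and on the three axes."""
    x = (So**2 + Kx**2 - Sa**2) / (2 * Kx)
    y = (So**2 + Ky**2 - Sb**2) / (2 * Ky)
    z = (So**2 + Kz**2 - Sc**2) / (2 * Kz)
    return x, y, z

# Emitters 0.1 m from the origin; distances taken from a hand at (0.1, 0.2, 0.3).
print(gesture_position(0.14**0.5, 0.13**0.5, 0.11**0.5, 0.3, 0.1, 0.1, 0.1))
# approximately (0.1, 0.2, 0.3)
```

Only three of the four distances are strictly needed for a unique solution; the fourth provides redundancy against measurement noise.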
Further, before the step of emitting continuous ultrasonic waves toward the mobile terminal user from 4 different positions and receiving the reflected waves, the method also includes:
presetting various three-dimensional gesture models of the mobile terminal user, together with the operation corresponding to each model.
Further, after the step of identifying the gesture motion of the mobile terminal user, the method also includes:
executing the operation corresponding to the recognized gesture motion.
The invention also provides a gesture identification system based on a mobile terminal, the system comprising:
4 ultrasonic emitting/receiving units, configured to emit continuous ultrasonic waves toward the mobile terminal user from 4 different positions and to receive the reflected waves;
a time recording unit, configured to record, for each of the 4 positions, the time difference between each emission of an ultrasonic wave and the reception of its reflection;
a data processing unit, configured to calculate the distance between the user's gesture and each of the 4 positions at the moment each emitted ultrasonic wave is reflected, and further configured to calculate the spatial position of the user's gesture relative to the 4 positions at that moment;
a modelling unit, configured to establish a three-dimensional spatial model of the user's gesture according to the calculated changes in its spatial position relative to the 4 positions; and
a recognition unit, configured to identify the gesture motion of the mobile terminal user.
Further, the system also comprises: a presetting unit, configured to preset various three-dimensional gesture models of the mobile terminal user, together with the operation corresponding to each model.
Further, the recognition unit is specifically configured to: compare the generated three-dimensional gesture model with the preset models, and thereby identify the gesture motion of the mobile terminal user.
Further, the system also comprises: an execution unit, configured to execute the operation corresponding to the recognized gesture motion.
In the present invention, ultrasonic waves are emitted from 4 different positions, the distances between the user's gesture and the 4 emission positions are calculated, and spatial modelling is performed on the changes in the gesture's spatial position relative to the 4 positions. A gesture motion can therefore be identified quickly and accurately and the corresponding operation executed, which solves the prior-art problems that, when a motion trajectory is captured by a camera, recognition accuracy is difficult to guarantee and the recognition is easily disturbed by light and other factors.
Description of drawings
Fig. 1 is a flow chart of the motion recognition method of a mobile terminal provided by the first embodiment of the invention;
Fig. 2 is a spatial rectangular coordinate diagram of the four ultrasonic emission positions and the position of the user's gesture, provided by an embodiment of the invention;
Fig. 3 is a flow chart of the motion recognition method of a mobile terminal provided by the second embodiment of the invention;
Fig. 4 is a flow chart of the motion recognition method of a mobile terminal provided by the third embodiment of the invention;
Fig. 5 is a block diagram of the motion recognition system of a mobile terminal provided by an embodiment of the invention;
O, a, b and c denote the spatial positions of the 4 ultrasonic emitters, and E denotes the spatial position of the user's gesture.
Detailed description
To make the objectives, technical solutions and advantages of the present invention clearer, the invention is further described below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here serve only to explain the invention and are not intended to limit it.
Fig. 1 shows the flow chart of the motion recognition method of the mobile terminal provided by the first embodiment of the invention. As one embodiment, it specifically comprises the following steps:
Step S102: emitting continuous ultrasonic waves toward the mobile terminal user from 4 different positions, and receiving the reflected waves;
Step S103: recording, for each of the 4 positions, the time difference between each emission of an ultrasonic wave and the reception of its reflection;
Step S104: calculating the distance between the user's gesture and each of the 4 positions at the moment each emitted ultrasonic wave is reflected;
Step S105: calculating the spatial position of the user's gesture relative to the 4 positions at the moment each emitted ultrasonic wave is reflected;
Step S106: establishing a three-dimensional spatial model of the user's gesture according to the calculated changes in its spatial position relative to the 4 positions;
Step S107: identifying the gesture motion of the mobile terminal user.
As shown in Fig. 2, in one embodiment the 4 different positions are located respectively at the origin O of a spatial rectangular coordinate system established on the mobile terminal and on its X-axis, Y-axis and Z-axis, the three positions on the axes being at distances Kx, Ky and Kz from the origin O, respectively.
In this embodiment, the origin O and the X, Y and Z axes of the coordinate system are laid out on the structure of the mobile terminal; O, a, b and c denote the coordinates of the 4 ultrasonic emission positions in the coordinate system, and E denotes the spatial position of the user's gesture.
As one embodiment, the formula for calculating the distance between the user's gesture and each of the 4 positions at the moment each emitted ultrasonic wave is reflected is:

S = V × t / 2

wherein V denotes the speed of the ultrasonic wave, and t denotes the time from the emission of the wave until its reflection is received.
In this embodiment, calculating these distances means calculating, for each emission, the four distances between the user's gesture and the positions at the origin O and on the X-axis, Y-axis and Z-axis.
As one embodiment, the formulas for calculating the spatial position (x, y, z) of the user's gesture relative to the 4 positions at the moment each emitted ultrasonic wave is reflected are:

x = (So² + Kx² - Sa²) / (2Kx)
y = (So² + Ky² - Sb²) / (2Ky)
z = (So² + Kz² - Sc²) / (2Kz)

wherein So, Sa, Sb and Sc denote the distances between the user's gesture and the emission positions located at the origin O and on the X-axis, Y-axis and Z-axis, respectively.
In this embodiment, by calculating the spatial position of the gesture relative to the 4 positions each time a continuously emitted wave is reflected, the changes in the gesture's spatial position, that is, the movement trajectory of the gesture, are obtained.
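The trajectory extraction described above can be sketched as turning the per-ping position fixes into successive displacement vectors (the helper name and sample data are illustrative):

```python
# Each ultrasonic ping yields one (x, y, z) fix; the gesture's movement
# trajectory is the sequence of displacements between consecutive fixes.
def trajectory_displacements(positions):
    return [
        (x2 - x1, y2 - y1, z2 - z1)
        for (x1, y1, z1), (x2, y2, z2) in zip(positions, positions[1:])
    ]

fixes = [(0.00, 0.0, 0.2), (0.05, 0.0, 0.2), (0.10, 0.0, 0.2)]
# Two displacements along +x: these fixes describe a left-to-right swipe.
print(trajectory_displacements(fixes))
```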
As one embodiment, as shown in Fig. 3, before step S102 the method also includes:
Step S101: presetting various three-dimensional gesture models of the mobile terminal user, together with the operation corresponding to each model.
As one embodiment, as shown in Fig. 4, after step S107 the method also includes:
Step S108: executing the operation corresponding to the recognized gesture motion.
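Step S108 amounts to a lookup from the recognized gesture to its preset operation. A sketch, with purely illustrative gesture names and operations:

```python
# Preset mapping from recognized gesture to the operation to execute
# (step S101 would populate this table; entries here are placeholders).
OPERATIONS = {
    "swipe_left": lambda: "previous page",
    "swipe_right": lambda: "next page",
    "push": lambda: "confirm",
}

def execute_gesture(gesture: str) -> str:
    """Run the operation preset for this gesture, or do nothing if unknown."""
    action = OPERATIONS.get(gesture)
    return action() if action else "no-op"

print(execute_gesture("swipe_right"))  # next page
```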
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments may be implemented by a program controlling the relevant hardware. The program may be stored in a computer-readable storage medium, such as a ROM/RAM, magnetic disk or optical disc.
As shown in Fig. 5, the invention also provides a gesture identification system 100 based on a mobile terminal, comprising:
4 ultrasonic emitting/receiving units 102, configured to emit continuous ultrasonic waves toward the mobile terminal user from 4 different positions and to receive the reflected waves;
a time recording unit 103, configured to record, for each of the 4 positions, the time difference between each emission of an ultrasonic wave and the reception of its reflection;
a data processing unit 104, configured to calculate the distance between the user's gesture and each of the 4 positions at the moment each emitted ultrasonic wave is reflected, and further configured to calculate the spatial position of the user's gesture relative to the 4 positions at that moment;
a modelling unit 105, configured to establish a three-dimensional spatial model of the user's gesture according to the calculated changes in its spatial position relative to the 4 positions; and
a recognition unit 106, configured to identify the gesture motion of the mobile terminal user.
Further, as one embodiment, the system also comprises:
a presetting unit 101, configured to preset various three-dimensional gesture models of the mobile terminal user, together with the operation corresponding to each model.
Further, as one embodiment, the recognition unit 106 is specifically configured to:
compare the established three-dimensional gesture model with the preset models, and thereby identify the gesture motion of the mobile terminal user.
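The patent does not specify the comparison metric; one simple possibility is a nearest-template match by summed point-to-point distance, under the assumption that the captured and preset trajectories are resampled to the same number of points:

```python
import math

def trajectory_distance(a, b):
    """Summed point-to-point Euclidean distance between equal-length trajectories."""
    return sum(math.dist(p, q) for p, q in zip(a, b))

def recognize(trajectory, templates):
    """Return the name of the preset template closest to the captured trajectory."""
    return min(templates, key=lambda name: trajectory_distance(trajectory, templates[name]))

templates = {
    "swipe_x": [(0.0, 0.0, 0.0), (0.1, 0.0, 0.0), (0.2, 0.0, 0.0)],
    "swipe_y": [(0.0, 0.0, 0.0), (0.0, 0.1, 0.0), (0.0, 0.2, 0.0)],
}
print(recognize([(0.01, 0.0, 0.0), (0.11, 0.0, 0.0), (0.19, 0.0, 0.0)], templates))
# swipe_x
```

A production system would likely normalize for speed and scale (e.g. dynamic time warping), but the nearest-template idea is the same.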
Further, as one embodiment, the system also comprises:
an execution unit 107, configured to execute the operation corresponding to the recognized gesture motion.
In the embodiments of the invention, the mobile terminal includes a mobile phone, a tablet computer and the like; the human-computer interaction module includes a keyboard and a display screen; and the ultrasonic transmitters and receivers may be the microphones of the mobile phone or tablet, or hardware modules provided separately.
In the embodiments of the invention, ultrasonic waves are emitted from 4 different positions, the distances between the user's gesture and the 4 emission positions are calculated, and spatial modelling is performed on the changes in the gesture's spatial position relative to the 4 positions. A gesture motion can therefore be identified quickly and accurately and the corresponding operation executed, which solves the prior-art problems that, when a motion trajectory is captured by a camera, recognition accuracy is difficult to guarantee and the recognition is easily disturbed by light and other factors.
The above are only preferred embodiments of the present invention and are not intended to limit it. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall fall within the protection scope of the present invention.
Claims (10)
1. A gesture identification method based on a mobile terminal, characterized in that the method comprises the following steps:
emitting continuous ultrasonic waves toward the mobile terminal user from 4 different positions, and receiving the reflected waves;
recording, for each of the 4 positions, the time difference between each emission of an ultrasonic wave and the reception of its reflection;
calculating the distance between the user's gesture and each of the 4 positions at the moment each emitted ultrasonic wave is reflected;
calculating the spatial position of the user's gesture relative to the 4 positions at the moment each emitted ultrasonic wave is reflected;
establishing a three-dimensional spatial model of the user's gesture according to the calculated changes in its spatial position relative to the 4 positions; and
identifying the gesture motion of the mobile terminal user.
2. The gesture identification method based on a mobile terminal as claimed in claim 1, characterized in that the 4 different positions are located respectively at the origin O of a spatial rectangular coordinate system established on the mobile terminal and on its X-axis, Y-axis and Z-axis, the three positions on the X-axis, Y-axis and Z-axis being at distances Kx, Ky and Kz from the origin O, respectively.
3. The gesture identification method based on a mobile terminal as claimed in claim 2, characterized in that the formula for calculating the distance between the user's gesture and each of the 4 positions at the moment each emitted ultrasonic wave is reflected is:

S = V × t / 2

wherein V denotes the speed of the ultrasonic wave, and t denotes the time from the emission of the wave until its reflection is received.
4. The gesture identification method based on a mobile terminal as claimed in claim 3, characterized in that the formulas for calculating the spatial position (x, y, z) of the user's gesture relative to the 4 positions at the moment each emitted ultrasonic wave is reflected are:

x = (So² + Kx² - Sa²) / (2Kx)
y = (So² + Ky² - Sb²) / (2Ky)
z = (So² + Kz² - Sc²) / (2Kz)

wherein So, Sa, Sb and Sc denote the distances between the user's gesture and the emission positions located at the origin O and on the X-axis, Y-axis and Z-axis, respectively.
5. The gesture identification method based on a mobile terminal as claimed in claim 1, characterized in that, before the step of emitting continuous ultrasonic waves toward the mobile terminal user from 4 different positions and receiving the reflected waves, the method also includes:
presetting various three-dimensional gesture models of the mobile terminal user, together with the operation corresponding to each model.
6. The gesture identification method based on a mobile terminal as claimed in claim 5, characterized in that, after the step of identifying the gesture motion of the mobile terminal user, the method also includes:
executing the operation corresponding to the recognized gesture motion.
7. A gesture identification system based on a mobile terminal, characterized in that the system comprises:
4 ultrasonic emitting/receiving units, configured to emit continuous ultrasonic waves toward the mobile terminal user from 4 different positions and to receive the reflected waves;
a time recording unit, configured to record, for each of the 4 positions, the time difference between each emission of an ultrasonic wave and the reception of its reflection;
a data processing unit, configured to calculate the distance between the user's gesture and each of the 4 positions at the moment each emitted ultrasonic wave is reflected, and further configured to calculate the spatial position of the user's gesture relative to the 4 positions at that moment;
a modelling unit, configured to establish a three-dimensional spatial model of the user's gesture according to the calculated changes in its spatial position relative to the 4 positions; and
a recognition unit, configured to identify the gesture motion of the mobile terminal user.
8. The gesture identification system based on a mobile terminal as claimed in claim 7, characterized in that the system also comprises:
a presetting unit, configured to preset various three-dimensional gesture models of the mobile terminal user, together with the operation corresponding to each model.
9. The gesture identification system based on a mobile terminal as claimed in claim 7, characterized in that the recognition unit is specifically configured to:
compare the established three-dimensional gesture model with the preset models, and thereby identify the gesture motion of the mobile terminal user.
10. The gesture identification system based on a mobile terminal as claimed in claim 7, characterized in that the system also comprises:
an execution unit, configured to execute the operation corresponding to the recognized gesture motion.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2013100804501A CN103226386A (en) | 2013-03-13 | 2013-03-13 | Gesture identification method and system based on mobile terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
CN103226386A (en) | 2013-07-31 |
Family
ID=48836868
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2013100804501A Pending CN103226386A (en) | 2013-03-13 | 2013-03-13 | Gesture identification method and system based on mobile terminal |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN103226386A (en) |
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103713536A (en) * | 2013-12-27 | 2014-04-09 | 广东省自动化研究所 | 3D gesture recognition controller and method based on ultrasonic locating |
CN104169858A (en) * | 2013-12-03 | 2014-11-26 | 华为技术有限公司 | Method and device of using terminal device to identify user gestures |
CN104375717A (en) * | 2014-07-17 | 2015-02-25 | 深圳市钛客科技有限公司 | Portable device, touch control system and touch device |
CN104699232A (en) * | 2013-12-09 | 2015-06-10 | 联想(北京)有限公司 | Three-dimensional position detector, electronic device and three-dimensional position detecting method |
WO2015196619A1 (en) * | 2014-06-26 | 2015-12-30 | 中兴通讯股份有限公司 | Player interaction method and apparatus, and storage medium |
CN105278669A (en) * | 2014-12-25 | 2016-01-27 | 维沃移动通信有限公司 | Mobile terminal control method and mobile terminal |
CN105306819A (en) * | 2015-10-15 | 2016-02-03 | 广东欧珀移动通信有限公司 | Gesture-based photographing control method and device |
CN105391854A (en) * | 2015-10-15 | 2016-03-09 | 广东欧珀移动通信有限公司 | Audio incoming call processing method and audio incoming call processing device |
CN105446475A (en) * | 2014-09-26 | 2016-03-30 | 联想(北京)有限公司 | Signal processing method and electronic equipment |
CN105474144A (en) * | 2013-08-21 | 2016-04-06 | 高通股份有限公司 | Ultrasound multi-zone hovering system |
CN105612483A (en) * | 2013-10-10 | 2016-05-25 | 高通股份有限公司 | System and method for multi-touch gesture detection using ultrasound beamforming |
CN105843402A (en) * | 2016-05-12 | 2016-08-10 | 深圳市联谛信息无障碍有限责任公司 | Screen reading application instruction input method and device based on wearable equipment |
CN105843404A (en) * | 2016-05-12 | 2016-08-10 | 深圳市联谛信息无障碍有限责任公司 | Screen reading application instruction input method and device |
CN105867639A (en) * | 2016-05-12 | 2016-08-17 | 深圳市联谛信息无障碍有限责任公司 | Screen reading application instruction input method and device based on sonar |
CN106662913A (en) * | 2014-03-10 | 2017-05-10 | 埃尔瓦有限公司 | Systems and methods for a dual modality sensor system |
CN106778179A (en) * | 2017-01-05 | 2017-05-31 | 南京大学 | A kind of identity identifying method based on the identification of ultrasonic wave lip reading |
CN106897018A (en) * | 2017-02-27 | 2017-06-27 | 努比亚技术有限公司 | Gesture operation method, device and mobile terminal |
CN106919261A (en) * | 2017-03-08 | 2017-07-04 | 广州致远电子股份有限公司 | A kind of infrared gesture identification method and device based on zone sequence reconstruct |
CN107169470A (en) * | 2017-06-06 | 2017-09-15 | 中控智慧科技股份有限公司 | A kind of gesture identification method, apparatus and system |
CN107483915A (en) * | 2017-08-23 | 2017-12-15 | 京东方科技集团股份有限公司 | The control method and device of 3-D view |
CN108475064A (en) * | 2017-05-16 | 2018-08-31 | 深圳市大疆创新科技有限公司 | Method, equipment and computer readable storage medium for equipment control |
CN108924417A (en) * | 2018-07-02 | 2018-11-30 | Oppo(重庆)智能科技有限公司 | Filming control method and Related product |
CN109597312A (en) * | 2018-11-26 | 2019-04-09 | 北京小米移动软件有限公司 | Speaker control method and device |
CN109660739A (en) * | 2018-11-13 | 2019-04-19 | 深圳艺达文化传媒有限公司 | The stacking method and Related product of short-sighted frequency certain effects |
CN109725704A (en) * | 2017-10-30 | 2019-05-07 | 腾讯科技(武汉)有限公司 | The method and device of control application operation |
CN109756672A (en) * | 2018-11-13 | 2019-05-14 | 深圳艺达文化传媒有限公司 | Short-sighted frequency animal model stacking method and Related product |
CN110096133A (en) * | 2018-01-30 | 2019-08-06 | 鸿富锦精密工业(武汉)有限公司 | Infrared gesture identifying device and method |
WO2023087629A1 (en) * | 2021-11-19 | 2023-05-25 | 北京小米移动软件有限公司 | Device control method and apparatus, device, and storage medium |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1584628A (en) * | 2004-06-08 | 2005-02-23 | 清华大学 | Ultrasonic positioning and ranging microacoustic system based on silicon microprocessing technology |
CN101226437A (en) * | 2008-01-30 | 2008-07-23 | 丁远彤 | Three-dimensional sensing system capable of being used for space telecontrol mouse and capturing motion |
CN101344586A (en) * | 2008-08-29 | 2009-01-14 | 华南理工大学 | Method and apparatus for three-dimensional multi-movement objective positioning by using multi-frequency sound wave |
US20100095206A1 (en) * | 2008-10-13 | 2010-04-15 | Lg Electronics Inc. | Method for providing a user interface using three-dimensional gestures and an apparatus using the same |
CN101720558A (en) * | 2007-04-19 | 2010-06-02 | 埃波斯开发有限公司 | Voice and position localization |
CN201886412U (en) * | 2010-11-11 | 2011-06-29 | 郑贤豪 | Input device applied in 3D image interaction system |
CN102347804A (en) * | 2011-09-26 | 2012-02-08 | 热土(上海)网络科技有限公司 | Mobile terminal ultrasonic communication system and method |
US20120194483A1 (en) * | 2011-01-27 | 2012-08-02 | Research In Motion Limited | Portable electronic device and method therefor |
CN202383598U (en) * | 2011-09-29 | 2012-08-15 | 上海华勤通讯技术有限公司 | Mobile terminal |
CN102937832A (en) * | 2012-10-12 | 2013-02-20 | 广东欧珀移动通信有限公司 | Gesture capturing method and device for mobile terminal |
Legal events: 2013-03-13, application CN2013100804501A filed in China, published as CN103226386A (en), status Pending.
Cited By (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105474144B (en) * | 2013-08-21 | 2019-01-11 | 高通股份有限公司 | Ultrasonic wave multizone hovering system |
CN105474144A (en) * | 2013-08-21 | 2016-04-06 | 高通股份有限公司 | Ultrasound multi-zone hovering system |
CN105612483A (en) * | 2013-10-10 | 2016-05-25 | 高通股份有限公司 | System and method for multi-touch gesture detection using ultrasound beamforming |
CN104169858A (en) * | 2013-12-03 | 2014-11-26 | 华为技术有限公司 | Method and device of using terminal device to identify user gestures |
WO2015081485A1 (en) * | 2013-12-03 | 2015-06-11 | 华为技术有限公司 | Method and device for terminal device to identify user gestures |
CN104169858B (en) * | 2013-12-03 | 2017-04-26 | 华为技术有限公司 | Method and device of using terminal device to identify user gestures |
CN104699232A (en) * | 2013-12-09 | 2015-06-10 | 联想(北京)有限公司 | Three-dimensional position detector, electronic device and three-dimensional position detecting method |
CN103713536A (en) * | 2013-12-27 | 2014-04-09 | 广东省自动化研究所 | 3D gesture recognition controller and method based on ultrasonic locating |
CN103713536B (en) * | 2013-12-27 | 2016-05-04 | 广东省自动化研究所 | A kind of 3D gesture identification controller and method based on localization by ultrasonic |
CN106662913A (en) * | 2014-03-10 | 2017-05-10 | 埃尔瓦有限公司 | Systems and methods for a dual modality sensor system |
WO2015196619A1 (en) * | 2014-06-26 | 2015-12-30 | 中兴通讯股份有限公司 | Player interaction method and apparatus, and storage medium |
CN104375717A (en) * | 2014-07-17 | 2015-02-25 | 深圳市钛客科技有限公司 | Portable device, touch control system and touch device |
CN105446475A (en) * | 2014-09-26 | 2016-03-30 | 联想(北京)有限公司 | Signal processing method and electronic equipment |
CN105446475B (en) * | 2014-09-26 | 2018-08-31 | 联想(北京)有限公司 | Signal processing method and electronic equipment |
CN105278669A (en) * | 2014-12-25 | 2016-01-27 | 维沃移动通信有限公司 | Mobile terminal control method and mobile terminal |
CN105278669B (en) * | 2014-12-25 | 2018-12-04 | 维沃移动通信有限公司 | Mobile terminal control method and mobile terminal |
CN105391854A (en) * | 2015-10-15 | 2016-03-09 | 广东欧珀移动通信有限公司 | Audio incoming call processing method and audio incoming call processing device |
CN105306819A (en) * | 2015-10-15 | 2016-02-03 | 广东欧珀移动通信有限公司 | Gesture-based photographing control method and device |
CN105843404A (en) * | 2016-05-12 | 2016-08-10 | 深圳市联谛信息无障碍有限责任公司 | Screen reading application instruction input method and device |
CN105867639A (en) * | 2016-05-12 | 2016-08-17 | 深圳市联谛信息无障碍有限责任公司 | Screen reading application instruction input method and device based on sonar |
CN105843402A (en) * | 2016-05-12 | 2016-08-10 | 深圳市联谛信息无障碍有限责任公司 | Screen reading application instruction input method and device based on wearable equipment |
CN106778179A (en) * | 2017-01-05 | 2017-05-31 | 南京大学 | Identity authentication method based on ultrasonic lip language identification |
CN106778179B (en) * | 2017-01-05 | 2021-07-09 | 南京大学 | Identity authentication method based on ultrasonic lip language identification |
CN106897018A (en) * | 2017-02-27 | 2017-06-27 | 努比亚技术有限公司 | Gesture operation method, device and mobile terminal |
CN106897018B (en) * | 2017-02-27 | 2020-07-28 | 努比亚技术有限公司 | Gesture operation method and device and mobile terminal |
CN106919261A (en) * | 2017-03-08 | 2017-07-04 | 广州致远电子股份有限公司 | Infrared gesture recognition method and device based on region sequence reconstruction |
CN108475064A (en) * | 2017-05-16 | 2018-08-31 | 深圳市大疆创新科技有限公司 | Method, apparatus, and computer-readable storage medium for apparatus control |
CN108475064B (en) * | 2017-05-16 | 2021-11-05 | 深圳市大疆创新科技有限公司 | Method, apparatus, and computer-readable storage medium for apparatus control |
CN107169470A (en) * | 2017-06-06 | 2017-09-15 | 中控智慧科技股份有限公司 | Gesture recognition method, apparatus and system |
CN107483915A (en) * | 2017-08-23 | 2017-12-15 | 京东方科技集团股份有限公司 | Control method and device for three-dimensional images |
CN109725704B (en) * | 2017-10-30 | 2023-05-12 | 腾讯科技(武汉)有限公司 | Method and device for controlling application running |
CN109725704A (en) * | 2017-10-30 | 2019-05-07 | 腾讯科技(武汉)有限公司 | Method and device for controlling application running |
CN110096133A (en) * | 2018-01-30 | 2019-08-06 | 鸿富锦精密工业(武汉)有限公司 | Infrared gesture identifying device and method |
CN108924417A (en) * | 2018-07-02 | 2018-11-30 | Oppo(重庆)智能科技有限公司 | Shooting control method and related product |
CN109756672A (en) * | 2018-11-13 | 2019-05-14 | 深圳艺达文化传媒有限公司 | Short video animal model superimposing method and related product |
CN109660739A (en) * | 2018-11-13 | 2019-04-19 | 深圳艺达文化传媒有限公司 | Short video special effect superimposing method and related product |
US11614540B2 (en) | 2018-11-26 | 2023-03-28 | Beijing Xiaomi Mobile Software Co., Ltd. | Method and apparatus for controlling sound box |
CN109597312A (en) * | 2018-11-26 | 2019-04-09 | 北京小米移动软件有限公司 | Speaker control method and device |
WO2023087629A1 (en) * | 2021-11-19 | 2023-05-25 | 北京小米移动软件有限公司 | Device control method and apparatus, device, and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103226386A (en) | Gesture identification method and system based on mobile terminal | |
CN103345301B (en) | Depth information acquisition method and device | |
EP3109785B1 (en) | Portable apparatus and method for changing screen of the same | |
Chen et al. | Your table can be an input panel: Acoustic-based device-free interaction recognition | |
KR102042461B1 (en) | Mobile terminal and method for controlling of the same | |
US9261995B2 (en) | Apparatus, method, and computer readable recording medium for selecting object by using multi-touch with related reference point | |
KR102051418B1 (en) | User interface controlling device and method for selecting object in image and image input device | |
CN105103457A (en) | Portable terminal, hearing aid, and method of indicating positions of sound sources in the portable terminal | |
CN109905754A (en) | Virtual gift collection method, device and storage device | |
CN104049728A (en) | Electronic device and control method thereof | |
CN101859226A (en) | Method of inputting a command in a portable terminal and portable terminal using the method | |
US9571930B2 (en) | Audio data detection with a computing device | |
CN105074615A (en) | Virtual sensor systems and methods | |
KR100940307B1 (en) | Method and apparatus for measuring position of the object using microphone | |
CN103455171A (en) | Three-dimensional interactive electronic whiteboard system and method | |
CN108924417A (en) | Filming control method and Related product | |
CN104076912A (en) | Intelligent pen type electronic device | |
KR102186815B1 (en) | Method, apparatus and recovering medium for clipping of contents | |
CN105278825A (en) | Screen capturing method and mobile terminal | |
CN103810073B (en) | Mobile terminal and method of controlling the mobile terminal | |
CN108196675A (en) | Interaction method and device for a touch terminal, and touch terminal | |
US20240119943A1 (en) | Apparatus for implementing speaker diarization model, method of speaker diarization, and portable terminal including the apparatus | |
CN105446550A (en) | Input device, positioning method of input device, electronic equipment and input system | |
CN104915627A (en) | Character identification method and apparatus | |
CN109358755B (en) | Gesture detection method and device for mobile terminal and mobile terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20130731 |