CN103226436A - Man-machine interaction method and system of intelligent terminal - Google Patents

Man-machine interaction method and system of intelligent terminal

Info

Publication number
CN103226436A
Authority
CN
China
Prior art keywords
intelligent terminal
control instruction
data
user
man
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2013100707944A
Other languages
Chinese (zh)
Inventor
胡慧
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN2013100707944A priority Critical patent/CN103226436A/en
Publication of CN103226436A publication Critical patent/CN103226436A/en
Priority to PCT/CN2014/072586 priority patent/WO2014135023A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention discloses a man-machine interaction method and system for an intelligent terminal. The intelligent terminal is connected to a sensing earphone by wire or wirelessly. The method comprises the following steps: a digital gyroscope in the earphone collects motion data while the earphone moves and sends the motion data to the intelligent terminal; the motion data are analyzed to obtain and execute the matching control instruction, the control instruction including movement of a focus indicator on the user display interface; a pressure-sensing device in the earphone obtains a sensing signal produced by a tensing or relaxing action of the user's ear and outputs sensed data to the intelligent terminal according to that signal; and the sensed data are analyzed to obtain and execute the matching control instruction, the control instruction including triggering of the item pointed to by the focus indicator. Because the sensing earphone detects the user's head movements and the tensing actions applied to the ears, the user can issue operating instructions without touching the phone while listening to music, watching a video, or even making detailed menu selections.

Description

Man-machine interaction method and system for an intelligent terminal
Technical field
The present invention relates to the field of human-computer interaction technology, and in particular to a man-machine interaction method and system for an intelligent terminal.
Background technology
With the development of electronic product technology, intelligent terminals such as smartphones and tablet computers have become commonplace in daily life. Because users demand ever more intelligence from their devices, improving the man-machine interaction experience has become a key concern for developers. The now widespread touch-screen technology, together with increasingly mature voice control and face recognition technologies, has been applied to intelligent terminals and has greatly improved the convenience of man-machine interaction. Taking the touch-screen mobile phone as an example, it has replaced the comparatively inconvenient mechanical keys, but it still requires contact with the terminal in order to operate it. The application of voice control is often limited by hardware conditions, since it requires an unobstructed distance between the operator and the controlled object to guarantee accurate recognition of the voice signal, while face recognition requires the device camera to be switched on and aimed at the face for accurate identification. Therefore, in practical use, when the user is unwilling or unable to operate the device by hand, the above human-computer interaction technologies often cannot satisfy the user's operating needs or accurately achieve the effect the user expects, giving the user an inconvenient and unsatisfactory operating experience.
Summary of the invention
In order to overcome the shortcomings of the prior art described above, the invention provides a novel man-machine interaction method and system for an intelligent terminal, so that a user can convey control instructions without touching the phone while enjoying audio or video, or even while making detailed menu selections.
The present invention is achieved by the following technical solutions:
A man-machine interaction method for an intelligent terminal, wherein the intelligent terminal establishes a wired or wireless connection with a sensing earphone, and the sensing earphone is provided with a digital gyroscope and at least one pressure-sensing device, the method comprising the steps of:
a. the digital gyroscope collects motion data while the earphone moves and sends the data to the intelligent terminal;
b. the processor of the intelligent terminal analyzes the motion data to obtain and execute the matching control instruction, the control instruction including movement of a focus indicator on the user display interface;
c. the pressure-sensing device obtains a sensing signal caused by a tensing or relaxing action of the user's ear and outputs sensed data to the intelligent terminal according to that signal;
d. the sensed data are analyzed to obtain and execute the matching control instruction, the control instruction including triggering of the item pointed to by the focus indicator in step b.
Further, the focus indicator includes, but is not limited to, a cursor, a mouse pointer, or a highlighting effect on text/symbols/graphics, and the item pointed to by the focus indicator includes, but is not limited to, buttons, menu options, and text/symbols/graphics with link functions.
In step b, the processor of the intelligent terminal obtains the control instruction matching the motion data as follows: it analyzes the motion data of the earphone relative to the intelligent terminal, taking the intelligent terminal as the reference, matches the relative motion data against preset instruction parameters, and determines the control instruction from the matching result; or it directly matches the motion data output by the digital gyroscope against the preset instruction parameters and determines the control instruction from the matching result.
The control instruction in step b may also include translating the page on the user display interface up/down/left/right and turning pages.
The sensed data in step c include the collected pressure values and the corresponding time values.
In step d, analyzing the sensed data to obtain the matching control instruction includes: comparing whether the maximum pressure difference caused by the tensing or relaxing action of the user's ear reaches the pressure-difference threshold in the preset instruction parameters, and comparing whether the frequency of the tensing or relaxing action matches a preset frequency reference value.
The invention also discloses a man-machine interaction system for an intelligent terminal, comprising a sensing earphone and an intelligent terminal connected by wire or wirelessly, the sensing earphone comprising:
a connection module, used to send earphone data to the intelligent terminal;
a gyroscope module, used to collect motion data while the earphone moves;
a pressure detection module, used to obtain the sensing signal caused by a tensing or relaxing action of the user's ear and to output sensed data according to that signal.
The intelligent terminal comprises:
a connection module, used to receive the data sent by the earphone;
a motion analysis module, used to analyze the motion data from the gyroscope module and obtain the matching control instruction, the control instruction including movement of a focus indicator on the user display interface;
a pressure analysis module, used to analyze the sensed data from the pressure detection module and obtain the matching control instruction, the control instruction including triggering of the item pointed to by the focus indicator on the user display interface;
an execution module, used to execute the control instructions.
Further, the focus indicator includes a cursor, a mouse pointer, or a highlighting effect on text/symbols/graphics, and the item pointed to by the focus indicator includes buttons, menu options, and text/symbols/graphics with link functions.
The motion analysis module obtains the control instruction matching the motion data as follows: it analyzes the motion data of the earphone relative to the intelligent terminal, taking the intelligent terminal as the reference, matches the relative motion data against preset instruction parameters, and determines the control instruction from the matching result; or it directly matches the motion data output by the digital gyroscope against the preset instruction parameters and determines the control instruction from the matching result.
The sensed data include the collected pressure values and the corresponding time values; the pressure analysis module obtains the control instruction matching the sensed data by comparing whether the maximum pressure difference caused by the tensing or relaxing action of the user's ear reaches the pressure-difference threshold in the preset instruction parameters, and by comparing whether the frequency of the tensing or relaxing action matches a preset frequency reference value.
Compared with the prior art, the invention provides a new mode of man-machine interaction for intelligent terminals: the gyroscope and the pressure-sensing device in an earphone connected to the intelligent terminal detect the user's head movements and the tensing or relaxing actions the ears apply to the earphone, and the user-interface operation instructions matching the detected signals are executed. The user does not need to operate the intelligent terminal directly to input instructions, which avoids interference from foreign objects and the frustration of being unable to convey a control signal. In particular, while enjoying audio or video, or even making detailed menu selections, the user can convey the relevant control instructions without touching the phone; with the support of existing sensing technology, the accuracy requirements of operation recognition can be met, giving the user a brand-new and convenient operating experience.
Description of drawings
Figure 1 is a schematic flowchart of the implementation of the man-machine interaction method for a smartphone according to an embodiment of the present invention;
Figure 2 is a schematic diagram of the composition of the man-machine interaction system for an intelligent terminal according to an embodiment of the present invention.
Embodiment
To facilitate understanding by those skilled in the art, the invention is further described below in conjunction with the accompanying drawings.
A man-machine interaction method for an intelligent terminal, wherein the intelligent terminal establishes a wired or wireless connection (the wireless connection includes a Bluetooth connection) with a sensing earphone, and the sensing earphone is provided with a digital gyroscope and at least one pressure-sensing device, the method comprising the steps of:
a. the digital gyroscope collects motion data while the earphone moves and sends the data to the intelligent terminal;
b. the processor of the intelligent terminal analyzes the motion data to obtain and execute the matching control instruction, the control instruction including movement of a focus indicator on the user display interface;
c. the pressure-sensing device obtains a sensing signal caused by a tensing or relaxing action of the user's ear and outputs sensed data to the intelligent terminal according to that signal;
d. the sensed data are analyzed to obtain and execute the matching control instruction, the control instruction including triggering of the item pointed to by the focus indicator in step b (the coordinate position of the focus indicator in step b is stored in the processor).
Further, the focus indicator includes, but is not limited to, a cursor, a mouse pointer (including any image indicator with directivity), or a highlighting effect on text/symbols/graphics, and the item pointed to by the focus indicator includes, but is not limited to, buttons, menu options, and text/symbols/graphics with link functions.
In step b, the processor of the intelligent terminal obtains the control instruction matching the motion data as follows: it analyzes the motion data of the earphone relative to the intelligent terminal (taking the intelligent terminal as the reference), matches the relative motion data against preset instruction parameters, and determines the control instruction from the matching result; or it directly matches the motion data output by the digital gyroscope against the preset instruction parameters and determines the control instruction from the matching result. Analyzing relative motion data requires the intelligent terminal to be equipped with a digital gyroscope as well. Matching the relative motion data or motion data against the preset instruction parameters means judging whether the relative motion data or motion data fall within the range of the preset instruction parameters.
Before step b, the method also includes: setting the correspondence between preset instruction parameters and control instructions, the preset instruction parameters including movement velocity and acceleration, both of which are three-dimensional vectors.
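As a minimal illustration of this range-matching idea, the following Python sketch checks whether a motion sample (velocity and acceleration as three-dimensional vectors) falls within the per-axis ranges of a preset instruction parameter. The class names, range values, and instruction labels are hypothetical and are not taken from the patent; they only illustrate the matching step under stated assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class PresetInstructionParameter:
    """Hypothetical preset: per-axis (min, max) ranges for velocity and acceleration."""
    name: str                               # control instruction this preset maps to
    velocity_range: Tuple[Vec3, Vec3]       # (min_xyz, max_xyz)
    acceleration_range: Tuple[Vec3, Vec3]   # (min_xyz, max_xyz)

def _in_range(v: Vec3, lo: Vec3, hi: Vec3) -> bool:
    # True when every component of v lies inside its [lo, hi] interval
    return all(l <= x <= h for x, l, h in zip(v, lo, hi))

def match_control_instruction(velocity: Vec3, acceleration: Vec3,
                              presets: List[PresetInstructionParameter]) -> Optional[str]:
    """Return the first control instruction whose preset ranges contain the sample."""
    for p in presets:
        if (_in_range(velocity, *p.velocity_range)
                and _in_range(acceleration, *p.acceleration_range)):
            return p.name
    return None  # no match: ignore this sample and keep collecting

# Example presets (values are purely illustrative):
PRESETS = [
    PresetInstructionParameter("move_focus_left",
                               ((-2.0, -0.2, -0.2), (-0.5, 0.2, 0.2)),
                               ((-5.0, -1.0, -1.0), (-0.5, 1.0, 1.0))),
    PresetInstructionParameter("move_focus_right",
                               ((0.5, -0.2, -0.2), (2.0, 0.2, 0.2)),
                               ((0.5, -1.0, -1.0), (5.0, 1.0, 1.0))),
]

print(match_control_instruction((-1.0, 0.0, 0.1), (-2.0, 0.0, 0.0), PRESETS))
# -> "move_focus_left"
```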
In other embodiments of the invention, the control instruction in step b may also include translating the page on the user display interface up/down/left/right and turning pages; that is, the user's head movements detected by the digital gyroscope control the up/down/left/right translation and turning of the page.
The sensed data in step c include the collected pressure values and the corresponding time values.
In step d, analyzing the sensed data to obtain the matching control instruction includes: comparing whether the maximum pressure difference caused by the tensing or relaxing action of the user's ear reaches the pressure-difference threshold in the preset instruction parameters, and comparing whether the frequency of the tensing or relaxing action matches a preset frequency reference value. Specifically, comparing whether the frequency of the tensing or relaxing action matches the preset frequency reference value includes: comparing whether the duration of a single tensing or relaxing action matches a preset duration range, where a single tensing or relaxing action means that the pressure values collected by the pressure-sensing device fluctuate upward or downward, the resulting pressure difference rising to the pressure-difference threshold in the preset instruction parameters and then falling back below it; comparing whether the number of tensing or relaxing actions matches a preset count; and comparing whether the interval between any two tensing or relaxing actions matches a preset interval.
Before step d, the method also includes: setting the correspondence between preset instruction parameters and control instructions, the preset instruction parameters including the pressure-difference threshold, the duration range, the number of actions, and the interval between actions.
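The following sketch, using the same hypothetical naming, illustrates how the sensed data (pressure values with timestamps) could be checked against such presets: it detects single tensing/relaxing actions from the pressure trace and then compares their duration, count, and spacing with the preset parameters. It is a simplified illustration under assumed units and values, not the patented implementation.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class PressurePreset:
    """Hypothetical preset parameters for one ear-tension gesture."""
    name: str                            # control instruction, e.g. "trigger_focused_item"
    pressure_diff_threshold: float       # minimum pressure swing (arbitrary units)
    duration_range: Tuple[float, float]  # allowed duration of one action, seconds
    action_count: int                    # required number of actions
    max_interval: float                  # max gap between consecutive actions, seconds

def detect_actions(samples: List[Tuple[float, float]], threshold: float,
                   baseline: float) -> List[Tuple[float, float]]:
    """Return (start, end) times of spans where |pressure - baseline| >= threshold."""
    actions, start = [], None
    for t, p in samples:
        above = abs(p - baseline) >= threshold
        if above and start is None:
            start = t
        elif not above and start is not None:
            actions.append((start, t))
            start = None
    return actions

def match_pressure_instruction(samples: List[Tuple[float, float]],
                               preset: PressurePreset,
                               baseline: float) -> Optional[str]:
    acts = detect_actions(samples, preset.pressure_diff_threshold, baseline)
    # each action's duration must fall inside the preset duration range
    ok_duration = all(preset.duration_range[0] <= e - s <= preset.duration_range[1]
                      for s, e in acts)
    # number of actions and the gaps between consecutive actions must also match
    ok_count = len(acts) == preset.action_count
    ok_interval = all(acts[i + 1][0] - acts[i][1] <= preset.max_interval
                      for i in range(len(acts) - 1))
    return preset.name if (ok_duration and ok_count and ok_interval) else None

# Illustrative use: a "double ear-tension" triggers the focused item.
preset = PressurePreset("trigger_focused_item", 0.3, (0.05, 0.6), 2, 1.0)
trace = [(0.0, 1.0), (0.1, 1.5), (0.2, 1.0), (0.5, 1.5), (0.6, 1.0)]
print(match_pressure_instruction(trace, preset, baseline=1.0))  # -> trigger_focused_item
```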
In a preferred embodiment of the invention, the intelligent terminal has a voice prompt function that is used in combination with the gyroscope module and the pressure detection module of the earphone: the earphone outputs prompt information, for example announcing menu options, which makes the forms of man-machine interaction under the concept of the invention more flexible and diversified.
Applications of the man-machine interaction method of the invention include menu selection, game control, and so on. Taking menu selection as an example, after the digital gyroscope and pressure-detection functions are switched on, the gyroscope detects forward/backward/up/down/left/right movement of the user's head and controls the position of the focus indicator, such as a mouse pointer, on the screen of the intelligent terminal. Once the focus indicator has landed on the desired position, the user tenses or relaxes an ear; the pressure-detection device detects this action and triggers the item pointed to by the focus indicator, for example the "next" button. Applied to a game, the method lets head movements control the movement of a game object, while tensing or relaxing the ear triggers a preset operation of the game object, such as "fire a shell".
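Conceptually, binding the same detected instructions to different application actions can be as simple as a per-application dispatch table; the instruction and action names below are hypothetical and only illustrate how one gesture set can serve both a menu and a game.

```python
# Hypothetical bindings: the same head/ear gestures drive different applications.
MENU_BINDINGS = {
    "move_focus_left": "highlight previous menu option",
    "move_focus_right": "highlight next menu option",
    "trigger_focused_item": "activate the highlighted option (e.g. 'next')",
}

GAME_BINDINGS = {
    "move_focus_left": "walk left",
    "move_focus_right": "walk right",
    "trigger_focused_item": "fire a shell",
}
```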
Specifically, as shown in Figure 1 and taking a smartphone as an example, the invention also provides the implementation flow of the man-machine interaction method for a smartphone according to one embodiment; its steps (a simplified code sketch of this loop follows the list below) comprise:
001, switch on the digital gyroscope and pressure-sensing functions of the earphone;
002, the digital gyroscope collects motion data while the earphone moves;
003, the earphone sends the motion data to the intelligent terminal via Bluetooth;
004, the processor of the intelligent terminal analyzes the motion data;
005, judge whether the motion data match the preset instruction parameters of a control instruction; if so, execute step 006, otherwise return to step 002;
006, the intelligent terminal executes the control instruction and moves the focus indicator on the user display interface to the position the user desires;
007, the pressure-sensing device obtains the sensing signal caused by the tensing or relaxing action of the user's ear;
008, the earphone outputs sensed data to the intelligent terminal according to the sensing signal;
009, the processor of the intelligent terminal analyzes the sensed data;
010, judge whether the sensed data match the preset instruction parameters of a control instruction; if so, execute step 011, otherwise return to step 007;
011, the intelligent terminal executes the control instruction and triggers the item pointed to by the focus indicator.
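A minimal sketch of this flow, reusing the hypothetical matchers sketched earlier and assuming placeholder objects for the Bluetooth transport and UI layer (none of these names come from the patent), might look as follows.

```python
def interaction_loop(gyro, pressure_sensor, ui,
                     motion_presets, pressure_preset, baseline):
    """Simplified event loop following steps 001-011.

    `gyro`, `pressure_sensor`, and `ui` are placeholder objects standing in for
    the earphone sensors (received over Bluetooth) and the terminal's UI layer.
    """
    while True:
        # steps 002-005: collect motion data and look for a matching instruction
        velocity, acceleration = gyro.read()            # hypothetical API
        instruction = match_control_instruction(velocity, acceleration,
                                                motion_presets)
        if instruction is not None:
            ui.move_focus(instruction)                  # step 006

        # steps 007-010: collect pressure samples and look for a matching gesture
        trace = pressure_sensor.read_window()           # hypothetical API
        if match_pressure_instruction(trace, pressure_preset, baseline):
            ui.trigger_focused_item()                   # step 011
```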
As shown in Figure 2, the invention also discloses a man-machine interaction system for an intelligent terminal, comprising a sensing earphone and an intelligent terminal connected by wire or wirelessly, the sensing earphone comprising:
a connection module, used to send earphone data to the intelligent terminal;
a gyroscope module, used to collect motion data while the earphone moves;
a pressure detection module, used to obtain the sensing signal caused by a tensing or relaxing action of the user's ear and to output sensed data according to that signal.
The intelligent terminal comprises:
a connection module, used to receive the data sent by the earphone;
a motion analysis module, used to analyze the motion data from the gyroscope module and obtain the matching control instruction, the control instruction including movement of a focus indicator on the user display interface;
a pressure analysis module, used to analyze the sensed data from the pressure detection module and obtain the matching control instruction, the control instruction including triggering of the item pointed to by the focus indicator on the user display interface;
an execution module, used to execute the control instructions.
Further, the focus indicator includes a cursor, a mouse pointer, or a highlighting effect on text/symbols/graphics, and the item pointed to by the focus indicator includes buttons, menu options, and text/symbols/graphics with link functions.
The motion analysis module obtains the control instruction matching the motion data as follows: it analyzes the motion data of the earphone relative to the intelligent terminal, taking the intelligent terminal as the reference, matches the relative motion data against preset instruction parameters, and determines the control instruction from the matching result; or it directly matches the motion data output by the digital gyroscope against the preset instruction parameters and determines the control instruction from the matching result.
The sensed data include the collected pressure values and the corresponding time values; the pressure analysis module obtains the control instruction matching the sensed data by comparing whether the maximum pressure difference caused by the tensing or relaxing action of the user's ear reaches the pressure-difference threshold in the preset instruction parameters, and by comparing whether the frequency of the tensing or relaxing action matches a preset frequency reference value.
Further, the man-machine interaction system of the invention also includes an instruction-parameter preset module in the intelligent terminal, used to set the correspondence between the preset instruction parameters and each control instruction.
The above further describes the invention in conjunction with specific preferred embodiments, but the specific implementation of the invention should not be regarded as limited to these descriptions. For those of ordinary skill in the art, simple deductions or substitutions made without departing from the concept of the invention should all be regarded as falling within the scope of protection determined by the submitted claims.

Claims (10)

1. A man-machine interaction method for an intelligent terminal, wherein the intelligent terminal establishes a wired or wireless connection with a sensing earphone, and the sensing earphone is provided with a digital gyroscope and at least one pressure-sensing device, the method comprising the steps of:
a. the digital gyroscope collects motion data while the earphone moves and sends the data to the intelligent terminal;
b. the processor of the intelligent terminal analyzes the motion data to obtain and execute the matching control instruction, the control instruction including movement of a focus indicator on the user display interface;
c. the pressure-sensing device obtains a sensing signal caused by a tensing or relaxing action of the user's ear and outputs sensed data to the intelligent terminal according to that signal;
d. the sensed data are analyzed to obtain and execute the matching control instruction, the control instruction including triggering of the item pointed to by the focus indicator in step b.
2. The man-machine interaction method for an intelligent terminal according to claim 1, characterized in that: the focus indicator includes a cursor, a mouse pointer, or a highlighting effect on text/symbols/graphics, and the item pointed to by the focus indicator includes buttons, menu options, and text/symbols/graphics with link functions.
3. The man-machine interaction method for an intelligent terminal according to claim 2, characterized in that, in step b, the processor of the intelligent terminal obtains the control instruction matching the motion data as follows: it analyzes the motion data of the earphone relative to the intelligent terminal, taking the intelligent terminal as the reference, matches the relative motion data against preset instruction parameters, and determines the control instruction from the matching result; or it directly matches the motion data output by the digital gyroscope against the preset instruction parameters and determines the control instruction from the matching result.
4. The man-machine interaction method for an intelligent terminal according to claim 3, characterized in that: the control instruction in step b also includes translating the page on the user display interface up/down/left/right and turning pages.
5. The man-machine interaction method for an intelligent terminal according to claim 3, characterized in that: the sensed data in step c include the collected pressure values and the corresponding time values.
6. The man-machine interaction method for an intelligent terminal according to claim 5, characterized in that, in step d, analyzing the sensed data to obtain the matching control instruction includes:
comparing whether the maximum pressure difference caused by the tensing or relaxing action of the user's ear reaches the pressure-difference threshold in the preset instruction parameters;
comparing whether the frequency of the tensing or relaxing action matches a preset frequency reference value.
7. A man-machine interaction system for an intelligent terminal, comprising a sensing earphone and an intelligent terminal connected by wire or wirelessly, characterized in that the sensing earphone comprises:
a connection module, used to send earphone data to the intelligent terminal;
a gyroscope module, used to collect motion data while the earphone moves;
a pressure detection module, used to obtain the sensing signal caused by a tensing or relaxing action of the user's ear and to output sensed data according to that signal;
and the intelligent terminal comprises:
a connection module, used to receive the data sent by the earphone;
a motion analysis module, used to analyze the motion data from the gyroscope module and obtain the matching control instruction, the control instruction including movement of a focus indicator on the user display interface;
a pressure analysis module, used to analyze the sensed data from the pressure detection module and obtain the matching control instruction, the control instruction including triggering of the item pointed to by the focus indicator on the user display interface;
an execution module, used to execute the control instructions.
8. The man-machine interaction system for an intelligent terminal according to claim 7, characterized in that: the focus indicator includes a cursor, a mouse pointer, or a highlighting effect on text/symbols/graphics, and the item pointed to by the focus indicator includes buttons, menu options, and text/symbols/graphics with link functions.
9. The man-machine interaction system for an intelligent terminal according to claim 8, characterized in that the motion analysis module obtains the control instruction matching the motion data as follows: it analyzes the motion data of the earphone relative to the intelligent terminal, taking the intelligent terminal as the reference, matches the relative motion data against preset instruction parameters, and determines the control instruction from the matching result; or it directly matches the motion data output by the digital gyroscope against the preset instruction parameters and determines the control instruction from the matching result.
10. The man-machine interaction system for an intelligent terminal according to claim 9, characterized in that the sensed data include the collected pressure values and the corresponding time values, and that the pressure analysis module obtains the control instruction matching the sensed data by:
comparing whether the maximum pressure difference caused by the tensing or relaxing action of the user's ear reaches the pressure-difference threshold in the preset instruction parameters;
comparing whether the frequency of the tensing or relaxing action matches a preset frequency reference value.
CN2013100707944A 2013-03-06 2013-03-06 Man-machine interaction method and system of intelligent terminal Pending CN103226436A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN2013100707944A CN103226436A (en) 2013-03-06 2013-03-06 Man-machine interaction method and system of intelligent terminal
PCT/CN2014/072586 WO2014135023A1 (en) 2013-03-06 2014-02-26 Man-machine interaction method and system of intelligent terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2013100707944A CN103226436A (en) 2013-03-06 2013-03-06 Man-machine interaction method and system of intelligent terminal

Publications (1)

Publication Number Publication Date
CN103226436A true CN103226436A (en) 2013-07-31

Family

ID=48836909

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2013100707944A Pending CN103226436A (en) 2013-03-06 2013-03-06 Man-machine interaction method and system of intelligent terminal

Country Status (2)

Country Link
CN (1) CN103226436A (en)
WO (1) WO2014135023A1 (en)


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106325481A (en) * 2015-06-30 2017-01-11 展讯通信(天津)有限公司 A non-contact type control system and method and a mobile terminal
CN109426498B (en) * 2017-08-24 2023-11-17 北京迪文科技有限公司 Background development method and device for man-machine interaction system
CN108269571B (en) * 2018-03-07 2024-01-09 佛山市云米电器科技有限公司 Voice control terminal with camera function
CN114554094B (en) * 2022-02-25 2024-04-26 深圳市豪恩声学股份有限公司 Camera shooting control method based on headset and headset


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101341727B1 (en) * 2011-08-29 2013-12-16 주식회사 팬택 Apparatus and Method for Controlling 3D GUI
CN103226436A (en) * 2013-03-06 2013-07-31 广东欧珀移动通信有限公司 Man-machine interaction method and system of intelligent terminal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070023519A1 (en) * 2005-07-27 2007-02-01 Beom-Su Chung System and method using movement of human body to control a function of a terminal
CN101232743A (en) * 2007-01-24 2008-07-30 鸿富锦精密工业(深圳)有限公司 Audio playing device and used earphone, automatic control method
CN101790125A (en) * 2008-11-24 2010-07-28 苹果公司 Detecting the repositioning of an earphone using a microphone and associated action
CN102473024A (en) * 2009-07-23 2012-05-23 高通股份有限公司 Method and apparatus for communicating control information by a wearable device to control mobile and consumer electronic devices
EP2293598A2 (en) * 2009-07-31 2011-03-09 Carlos De La Fe Dahlin Menusystem
US20110206215A1 (en) * 2010-02-21 2011-08-25 Sony Ericsson Mobile Communications Ab Personal listening device having input applied to the housing to provide a desired function and method

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014135023A1 (en) * 2013-03-06 2014-09-12 广东欧珀移动通信有限公司 Man-machine interaction method and system of intelligent terminal
CN103543843A (en) * 2013-10-09 2014-01-29 中国科学院深圳先进技术研究院 Man-machine interface equipment based on acceleration sensor and man-machine interaction method
CN103763440A (en) * 2014-02-19 2014-04-30 联想(北京)有限公司 Information processing method, electronic device accessory and electronic device
CN104935721B (en) * 2014-03-20 2018-02-13 宇龙计算机通信科技(深圳)有限公司 A kind of method and device interactive with intelligent terminal
CN104935721A (en) * 2014-03-20 2015-09-23 宇龙计算机通信科技(深圳)有限公司 Method and device for interaction with intelligent terminal
CN104080022A (en) * 2014-07-14 2014-10-01 深迪半导体(上海)有限公司 Drive-by-wire earphone with posture control
CN105446467A (en) * 2014-08-21 2016-03-30 刘小洋 Method and apparatus for controlling use of intelligent terminal device
CN105446467B (en) * 2014-08-21 2018-08-31 刘小洋 The method and apparatus that control intelligent terminal uses
CN105572870A (en) * 2014-11-05 2016-05-11 优利科技有限公司 Head-mounted display device
CN105867600A (en) * 2015-11-06 2016-08-17 乐视移动智能信息技术(北京)有限公司 Interaction method and device
CN105472496A (en) * 2015-11-21 2016-04-06 惠州Tcl移动通信有限公司 Bluetooth earphone and automatic on-off method thereof
CN106908642A (en) * 2015-12-23 2017-06-30 苏州普源精电科技有限公司 A kind of probe, oscillograph, movement recognition system and method
CN105718777B (en) * 2016-01-19 2018-12-25 宇龙计算机通信科技(深圳)有限公司 A kind of terminal solution/screen locking method, earphone, terminal and system
CN112969116A (en) * 2021-02-01 2021-06-15 深圳市美恩微电子有限公司 Interactive control system of wireless earphone and intelligent terminal
CN112835453A (en) * 2021-03-04 2021-05-25 网易(杭州)网络有限公司 Method, device and storage medium for simulating interface effect when human eyes are focused

Also Published As

Publication number Publication date
WO2014135023A1 (en) 2014-09-12

Similar Documents

Publication Publication Date Title
CN103226436A (en) Man-machine interaction method and system of intelligent terminal
US9696767B2 (en) Command recognition method including determining a hold gesture and electronic device using the method
CN110164440B (en) Voice interaction awakening electronic device, method and medium based on mouth covering action recognition
RU2636104C1 (en) Method and device for implementing touch-sensitive button and identifying fingerprints and terminal device
KR101742410B1 (en) Intelligent terminal with built-in screenshot function and implementation method thereof
CN102981622A (en) External control method and system of mobile terminal
JP6196398B2 (en) Apparatus, method, terminal device, program, and recording medium for realizing touch button and fingerprint authentication
US10860857B2 (en) Method for generating video thumbnail on electronic device, and electronic device
CN104657072B (en) It is a kind of to trigger the method and apparatus for performing operational order
CN111314784B (en) Video playing method and electronic equipment
CN109994111B (en) Interaction method, interaction device and mobile terminal
CN104866226A (en) Terminal device and method for controlling same
CN111158487A (en) Man-machine interaction method for interacting with intelligent terminal by using wireless earphone
CN106201108B (en) Gloves control mode touch mode control method and device and electronic equipment
CN108491282A (en) The method and apparatus that webpage and operating system are communicated
CN112306366A (en) Operation method, mobile terminal and storage medium
WO2015131590A1 (en) Method for controlling blank screen gesture processing and terminal
KR20160133305A (en) Gesture recognition method, a computing device and a control device
CN110703972A (en) File control method and electronic equipment
CN105335061A (en) Information display method and apparatus and terminal
CN107179835A (en) A kind of input method and device, a kind of device for being used to input
CN105187597B (en) A kind of management method of voice record, device and its mobile terminal
CN107544686A (en) Operation performs method and device
CN107025041A (en) Fingerprint input method and terminal
CN109213349A (en) Exchange method and device, computer readable storage medium based on touch screen

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20130731