WO2022041532A1 - Method for using a smart wearable device to control a wireless headset - Google Patents

Method for using a smart wearable device to control a wireless headset

Info

Publication number
WO2022041532A1
WO2022041532A1 (PCT/CN2020/132315)
Authority
WO
WIPO (PCT)
Prior art keywords
wireless headset
arm
wearable device
smart wearable
control command
Prior art date
Application number
PCT/CN2020/132315
Other languages
English (en)
Chinese (zh)
Inventor
王晓晨
刘若宇
董科
Original Assignee
歌尔股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 歌尔股份有限公司
Publication of WO2022041532A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/16 Sound input; Sound output
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 Constructional details or arrangements
    • G06F 1/1613 Constructional details or arrangements for portable computers
    • G06F 1/163 Wearable computers, e.g. on a belt
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 1/00 Details of transducers, loudspeakers or microphones
    • H04R 1/10 Earpieces; Attachments therefor; Earphones; Monophonic headphones
    • H04R 1/1041 Mechanical or electronic switches, or control elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R 2201/00 Details of transducers, loudspeakers or microphones covered by H04R 1/00 but not provided for in any of its subgroups
    • H04R 2201/10 Details of earpieces, attachments therefor, earphones or monophonic headphones covered by H04R 1/10 but not provided for in any of its subgroups

Definitions

  • the present application relates to the field of earphone control, and in particular, to a method for controlling a wireless earphone by using a smart wearable device, a smart wearable device, and a readable storage medium.
  • the wireless headset has its own pressure sensor, which can directly sense tap, long-press and slide gestures and perform the corresponding actions.
  • both existing control methods require the user to touch the control terminal or the wireless earphone itself to control the wireless earphone. Under certain circumstances (for example, when the hands are occupied or dirty), control of the wireless headset cannot be completed successfully.
  • the purpose of this application is to provide a method for controlling a wireless earphone by using a smart wearable device, a smart wearable device and a readable storage medium, which are used to control the wireless earphone conveniently and quickly.
  • the present application provides a method for controlling a wireless headset by using a smart wearable device, the method comprising:
  • the smart wearable device detects the distance from the wireless headset
  • a corresponding first control command is sent to the wireless headset according to the movement parameter of the arm, so that the wireless headset executes the first control command.
  • sending a corresponding first control command to the wireless headset according to the movement parameters of the arm includes:
  • the movement parameters of the arm include at least one of a movement direction, a movement path, and a movement number of the arm.
  • the corresponding relationship between the movement parameters of the arm and the first control command in the preset storage space is edited according to the editing command.
  • the first control command includes at least one of a song switching command, a call on/off command, a volume control command, a voice assistant wake-up command, and a fast-forward/fast-rewind command.
  • the smart wearable device detects the distance from the wireless headset, including:
  • a corresponding second control command is sent to the wireless headset according to the movement parameter of the hand, so that the wireless headset executes the second control command.
  • the movement parameters of the hand include palm movement parameters and/or fingertip movement parameters.
  • the application also provides a smart wearable device, the smart wearable device includes:
  • a first detection module used for the smart wearable device to detect the distance from the wireless headset
  • a second detection module configured to activate an inertial measurement sensor when the distance is less than a threshold, and use the inertial measurement sensor to detect whether the arm is in a moving state
  • a first sending module configured to send a corresponding first control command to the wireless headset according to a movement parameter of the arm if the arm is in a moving state, so that the wireless headset executes the first control command.
  • the application also provides a smart wearable device, the smart wearable device includes:
  • the processor is configured to implement the steps of the method for using the smart wearable device to control the wireless headset according to any one of the above when executing the computer program.
  • the present application also provides a readable storage medium, where a computer program is stored on the readable storage medium, and when the computer program is executed by a processor, the steps of the method for controlling a wireless headset by using a smart wearable device according to any one of the above are implemented.
  • the method of using a smart wearable device to control a wireless headset includes: the smart wearable device detects the distance from the wireless headset; when the distance is less than a threshold, the inertial measurement sensor is activated and used to detect whether the arm is in a moving state; if the arm is in a moving state, the corresponding first control command is sent to the wireless earphone according to the movement parameters of the arm, so that the wireless earphone executes the first control command.
  • in the technical solution provided by this application, the smart wearable device detects the distance from the wireless headset; when the distance is less than a threshold, the inertial measurement sensor is activated and used to detect whether the arm is in a moving state; if the arm is in a moving state, the corresponding first control command is sent to the wireless headset according to the movement parameters of the arm.
  • the whole process does not require the user to touch the control terminal or the wireless headset itself; control of the wireless headset can be completed conveniently and quickly according to the movement parameters of the arm.
  • the present application also provides a smart wearable device and a readable storage medium, which have the above beneficial effects, and will not be repeated here.
  • FIG. 1 is a flowchart of a method for controlling a wireless headset by using a smart wearable device according to an embodiment of the present application
  • FIG. 2 is a schematic diagram of a smart wearable device detecting a distance from a wireless headset according to an embodiment of the present application
  • FIG. 3 is a schematic diagram of coordinates of an arm movement provided by an embodiment of the present application.
  • FIG. 4 is a schematic diagram of an arm motion recognition provided by an embodiment of the present application.
  • FIG. 5 is a flowchart of another method for controlling a wireless headset by using a smart wearable device according to an embodiment of the present application
  • FIG. 6 is a schematic diagram of an arm motion recognition provided by an embodiment of the present application.
  • FIG. 7 is a schematic position diagram of a millimeter-wave radar sensor provided by an embodiment of the present application.
  • FIG. 8 is a structural diagram of a smart wearable device provided by an embodiment of the application.
  • FIG. 9 is a structural diagram of another smart wearable device provided by an embodiment of the present application.
  • the core of the present application is to provide a method for controlling a wireless earphone by using a smart wearable device, a smart wearable device and a readable storage medium, which are used to control the wireless earphone conveniently and quickly.
  • FIG. 1 is a flowchart of a method for controlling a wireless headset by using a smart wearable device according to an embodiment of the present application.
  • S101: the smart wearable device detects the distance from the wireless headset;
  • in the prior art, the user needs to touch the control terminal or the wireless headset itself to control the wireless headset, and under certain circumstances (for example, when the hands are occupied or dirty) the control cannot be completed successfully; in addition, although existing gesture-control methods can realize non-contact control of the wireless headset, they cannot determine whether the gesture is made by the wearer: when any hand intrudes into the recognition range, the headset cannot distinguish the wearer's gesture from someone else's, so other people may trigger misoperations; therefore, the present application provides a method for controlling a wireless headset by using a smart wearable device to solve the above problems;
  • FIG. 2 is a schematic diagram of a smart wearable device detecting a distance from a wireless headset according to an embodiment of the present application.
  • the present application detects the distance from the wireless earphone through the smart wearable device and controls the wireless earphone according to the state of the arm when the distance is less than the threshold. This neither requires the user to touch the control terminal or the wireless earphone itself nor recognizes the actions of outsiders as commands, thereby realizing convenient and quick control of the wireless earphone while avoiding misoperation.
  • the distance from the wireless headset can be detected by the smart wearable device through power calculation; that is, the smart wearable device detects the distance from the wireless headset by performing the following steps: transmitting a detection signal to the wireless earphone and determining the power value of the reflected signal of the detection signal; and calculating the distance between the smart wearable device and the wireless headset according to the power value.
  • alternatively, the distance between the smart wearable device and the wireless headset may also be calculated according to the reflection time of the signal, which is not specifically limited in this application.
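  • purely as an illustration (not part of the disclosed solution), the power-based distance calculation could be sketched as follows; the log-distance path-loss model, the reference loss at 1 m and the path-loss exponent are assumptions introduced for this sketch and would need per-device calibration:

```python
import math

# Illustrative sketch: estimate the wearable-to-headset distance from the
# power value of the reflected detection signal, assuming a log-distance
# path-loss model. PATH_LOSS_AT_1M_DB and PATH_LOSS_EXPONENT are assumed
# calibration constants, not values disclosed in this application.
PATH_LOSS_AT_1M_DB = 40.0     # assumed loss of the probe signal at 1 m, in dB
PATH_LOSS_EXPONENT = 2.0      # assumed free-space-like environment

def estimate_distance_m(tx_power_dbm: float, rx_power_dbm: float) -> float:
    """Return the estimated distance in metres from transmit/received power."""
    measured_loss_db = tx_power_dbm - rx_power_dbm
    exponent = (measured_loss_db - PATH_LOSS_AT_1M_DB) / (10.0 * PATH_LOSS_EXPONENT)
    return 10.0 ** exponent

if __name__ == "__main__":
    distance = estimate_distance_m(tx_power_dbm=0.0, rx_power_dbm=-26.0)
    print(f"estimated distance: {distance:.2f} m")   # ~0.20 m
    if distance < 0.20:                              # 20 cm threshold from the description
        print("within threshold: activate the inertial measurement sensor")
```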
  • S102 Activate the inertial measurement sensor when the distance is less than a threshold, and use the inertial measurement sensor to detect whether the arm is in a moving state;
  • if the arm is in a moving state, step S103 is performed;
  • that is, a corresponding first control command is sent to the wireless headset according to the movement parameter of the arm, so that the wireless headset executes the first control command.
  • specifically, the wireless earphone can use its wireless sensor as the reference point to establish a three-dimensional space with a radius of 20 cm as the operation recognition range.
  • the 20 cm radius mentioned here is based on ergonomic dimensions (head and shoulder sizes) and can be used as the threshold value of the operating range.
  • the threshold value can also be determined according to industry standards or other requirements, which is not specifically limited in this application.
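  • as an illustrative sketch only (the window length, sampling rate and angular-velocity threshold below are assumptions, not values disclosed in this application), step S102 could decide whether the arm is in a moving state from gyroscope samples like this:

```python
import math
from collections import deque

class ArmMotionDetector:
    """Sketch of step S102: decide whether the arm is moving from IMU data.

    Assumes roughly 50 Hz gyroscope samples; the 1 s window and the
    angular-velocity threshold are illustrative choices.
    """

    def __init__(self, window_size: int = 50, gyro_threshold_dps: float = 10.0):
        self.magnitudes = deque(maxlen=window_size)   # roughly 1 s of samples
        self.gyro_threshold_dps = gyro_threshold_dps

    def add_sample(self, gx: float, gy: float, gz: float) -> None:
        # Store the magnitude of the angular velocity (degrees per second).
        self.magnitudes.append(math.sqrt(gx * gx + gy * gy + gz * gz))

    def arm_is_moving(self) -> bool:
        # The arm is considered to be in a moving state if any sample in the
        # window exceeds the threshold.
        return any(m > self.gyro_threshold_dps for m in self.magnitudes)
```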
  • S103 Send a corresponding first control command to the wireless headset according to the movement parameter of the arm, so that the wireless headset executes the first control command.
  • the sending of the corresponding first control command to the wireless headset according to the movement parameters of the arm mentioned here may specifically be: searching the preset storage space for the first control command corresponding to the movement parameter of the arm, and sending the first control command to the wireless headset.
  • the movement parameters of the arm mentioned here may include at least one of the movement direction, movement path, and movement times of the arm.
  • the addition, modification and deletion of the correspondence between the movement parameters of the arm and the first control command in the preset storage space can also be implemented by performing the following steps: receiving an input editing command; and editing the correspondence between the movement parameters of the arm and the first control command in the preset storage space according to the editing command.
  • the first control command mentioned here includes at least one of a song switching command, a call on/off command, a volume control command, a voice assistant wake-up command, and a fast-forward/fast-rewind command.
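  • a minimal sketch of such a preset storage space, assuming the movement parameters are encoded as a (direction, repetition count) pair (an encoding chosen here for illustration only), could look like this; the editing step simply rewrites an entry of the mapping:

```python
# Sketch of a "preset storage space": a mapping from arm-movement parameters
# to first control commands. The (direction, repetitions) encoding and the
# command names are illustrative assumptions.
first_control_commands = {
    ("clockwise", 1): "NEXT_SONG",
    ("clockwise", 2): "FAST_FORWARD",
    ("counterclockwise", 1): "PREVIOUS_SONG",
    ("counterclockwise", 2): "REWIND",
}

def send_to_headset(command: str) -> None:
    # Placeholder for the wireless link to the earphone.
    print(f"-> headset: {command}")

def send_first_control_command(direction: str, repetitions: int) -> None:
    # Search the preset storage space for the command corresponding to the
    # detected movement parameters and send it to the headset.
    command = first_control_commands.get((direction, repetitions))
    if command is not None:
        send_to_headset(command)

def edit_correspondence(direction: str, repetitions: int, command: str) -> None:
    # Editing command: add or modify a correspondence (deletion would use `del`).
    first_control_commands[(direction, repetitions)] = command
```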
  • a coordinate system can be established according to the position of the human body to realize the recognition of the arm movement.
  • FIG. 3 and FIG. 4 are schematic diagrams of the arm-movement coordinate system and arm motion recognition according to an embodiment of the present application.
  • a three-dimensional coordinate system can be established in which the plane where the human body is located defines the X-axis and the Y-axis, and the direction the human body faces defines the Z-axis;
  • when the arm rotates, the sensor detects that the movement path is a rotation around the X-axis, so different control commands can be designed for different rotation directions. For example, taking 1 s as the detection time period and 10 degrees as the effective rotation threshold: gesture 1 can be set as one clockwise rotation being detected, upon which the wireless-headset-related application starts the song-switching function (next song); gesture 2 can be set as two clockwise rotations being detected, upon which the application starts the fast-forward function; gesture 3 can be set as one counterclockwise rotation being detected, upon which the application starts the song-switching function (previous song); and gesture 4 can be set as two counterclockwise rotations being detected, upon which the application starts the rewind function.
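  • the 1 s window and 10-degree threshold above could, for instance, be applied to gyroscope data as in the following sketch; the 100 Hz sampling rate and the sign convention for clockwise rotation are assumptions made for illustration, and the returned (direction, count) pair can then be looked up in a mapping such as the one sketched earlier:

```python
from typing import Iterable, Optional, Tuple

def classify_rotation(gyro_x_dps: Iterable[float],
                      sample_rate_hz: float = 100.0,
                      degree_threshold: float = 10.0) -> Optional[Tuple[str, int]]:
    """Classify one 1 s detection window of X-axis angular velocity.

    Integrates the angular velocity; every time the accumulated angle exceeds
    the effective rotation threshold (10 degrees), one rotation is counted.
    Positive angles are treated as clockwise, which is an assumed convention.
    """
    dt = 1.0 / sample_rate_hz
    accumulated = 0.0
    rotations = []
    for omega in gyro_x_dps:            # degrees per second about the X-axis
        accumulated += omega * dt
        if abs(accumulated) >= degree_threshold:
            rotations.append("clockwise" if accumulated > 0 else "counterclockwise")
            accumulated = 0.0
    if not rotations:
        return None
    direction = rotations[0]
    return direction, rotations.count(direction)

# Example: two clockwise rotations detected in one window would map to
# FAST_FORWARD in the mapping sketched above.
```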
  • the present application provides a method for controlling a wireless headset by using a smart wearable device.
  • the smart wearable device detects the distance from the wireless headset; when the distance is less than a threshold, the inertial measurement sensor is activated and used to detect whether the arm is in a moving state; if the arm is in a moving state, the corresponding first control command is sent to the wireless headset according to the movement parameters of the arm.
  • the whole process does not require the user to touch the control terminal or the wireless headset itself through the limbs.
  • the wireless headset can be controlled conveniently and quickly according to the movement parameters of the arm.
  • after step S102 is performed to detect whether the arm is in a moving state by using the inertial measurement sensor, if the arm is not in a moving state, the steps shown in FIG. 5 can be performed.
  • FIG. 5 is a flowchart of another method for controlling a wireless headset by using a smart wearable device according to an embodiment of the present application.
  • S501 Activate the millimeter-wave radar sensor, and use the millimeter-wave radar sensor to detect whether the hand is in a moving state;
  • if the hand is in a moving state, step S502 is executed;
  • that is, a corresponding second control command is sent to the wireless headset according to the movement parameter of the hand, so as to realize the control of the wireless headset;
  • if the smart wearable device detects that the distance from the wireless headset is less than the threshold but neither the arm nor the hand is moving, it indicates that the user has merely, without intending to operate the headset, placed the smart wearable device relatively close to the wireless headset, and nothing needs to be done at this time.
  • S502 Send a corresponding second control command to the wireless headset according to the movement parameter of the hand, so that the wireless headset executes the second control command.
  • the second control command mentioned here may be the same as or different from the first control command, which is not specifically limited in this application;
  • the movement parameters of the hand mentioned here may specifically include palm movement parameters and/or fingertip movement parameters.
  • a coordinate system can also be established according to the position of the human body to realize hand motion recognition.
  • FIG. 6 is a schematic diagram of hand motion recognition provided by an embodiment of the present application.
  • gesture 5 can be set as follows: when the millimeter-wave radar sensor recognizes that the displacement direction of four fingertip points is the negative direction of the X-axis, the wireless-headset-related application starts the volume-down function;
  • gesture 6 can be set as follows: when the millimeter-wave radar sensor recognizes that the displacement direction of four fingertip points is the positive direction of the X-axis, the wireless-headset-related application starts the volume-up function;
  • gesture 7 can be set as follows: when the millimeter-wave radar sensor recognizes that the displacement direction of only one fingertip point is the positive direction of the X-axis, the wireless-headset-related application starts the pause/answer function;
  • gesture 8 can be set as follows: when the millimeter-wave radar sensor recognizes that the displacement direction of only one fingertip point is the negative direction of the X-axis, the wireless-headset-related application activates the voice-assistant wake-up function.
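  • gestures 5 to 8 could be expressed as a lookup on the number of moving fingertip points and their X-axis displacement direction, as in the sketch below; the per-fingertip displacement input and the minimum-displacement value are assumptions made for illustration, not the radar sensor's actual interface:

```python
from typing import Optional, Sequence

# Gesture table from the description: (number of moving fingertips, direction).
SECOND_CONTROL_COMMANDS = {
    (4, "-x"): "VOLUME_DOWN",           # gesture 5
    (4, "+x"): "VOLUME_UP",             # gesture 6
    (1, "+x"): "PAUSE_OR_ANSWER",       # gesture 7
    (1, "-x"): "WAKE_VOICE_ASSISTANT",  # gesture 8
}

def classify_hand_gesture(fingertip_dx_m: Sequence[float],
                          min_displacement_m: float = 0.01) -> Optional[str]:
    """Map fingertip X-axis displacements (metres) to a second control command.

    `fingertip_dx_m` holds one displacement per tracked fingertip point, as it
    might be reported by a millimeter-wave radar sensor (assumed interface).
    """
    moving = [dx for dx in fingertip_dx_m if abs(dx) >= min_displacement_m]
    if not moving:
        return None
    direction = "+x" if sum(moving) > 0 else "-x"
    return SECOND_CONTROL_COMMANDS.get((len(moving), direction))

# Example: four fingertips sliding towards +X triggers the volume-up function.
assert classify_hand_gesture([0.03, 0.025, 0.028, 0.031]) == "VOLUME_UP"
```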
  • the number of millimeter-wave radar sensors can be one;
  • the number of millimeter-wave radar sensors may also be multiple;
  • FIG. 7 is a schematic diagram of the location of a millimeter-wave radar sensor provided by an embodiment of the present application.
  • when the smart wearable device is a smart watch, millimeter-wave radar sensors can be placed in the dial 1 and the wristband 2 of the smart watch, so that the two millimeter-wave radar sensors are symmetrically distributed on the upper and lower sides of the wrist, which ensures the measurement accuracy of the hand movement parameters as much as possible while saving costs.
  • FIG. 8 is a structural diagram of a smart wearable device provided by an embodiment of the present application.
  • the smart wearable device may include:
  • the first detection module 100 is used for the smart wearable device to detect the distance from the wireless headset;
  • the second detection module 200 is configured to activate the inertial measurement sensor when the distance is less than the threshold, and use the inertial measurement sensor to detect whether the arm is in a moving state;
  • the first sending module 300 is configured to send a corresponding first control command to the wireless earphone according to the movement parameter of the arm if the arm is in a moving state, so that the wireless earphone executes the first control command.
  • the first sending module 300 may include:
  • a first sending submodule used for searching the first control command corresponding to the movement parameter of the arm in the preset storage space, and sending the first control command to the wireless headset;
  • the movement parameters of the arm include at least one of the movement direction, movement path, and movement times of the arm.
  • the first sending module 300 may further include:
  • the receiving sub-module is used to receive the input editing command
  • the editing sub-module is used for editing the correspondence between the movement parameters of the arm and the first control command in the preset storage space according to the editing command.
  • the first control command includes at least one of a song switching command, a call on/off command, a volume control command, a voice assistant wake-up command, and a fast forward/rewind command. one.
  • the first detection module 100 may include:
  • a transmission sub-module used for transmitting the detection signal to the wireless earphone, and determining the power value of the reflected signal of the detection signal
  • the calculation sub-module is used to calculate the distance between the smart wearable device and the wireless headset according to the power value.
  • the smart wearable device may further include:
  • the third detection module is used to activate the millimeter-wave radar sensor when the arm is not in a moving state, and use the millimeter-wave radar sensor to detect whether the hand is in a moving state;
  • the second sending module is configured to send a corresponding second control command to the wireless earphone according to the movement parameter of the hand if the hand is in a moving state, so that the wireless earphone executes the second control command.
  • the movement parameters of the hand include palm movement parameters and/or fingertip movement parameters.
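  • the way the detection and sending modules cooperate could be composed roughly as in the following sketch; the module and sensor interfaces used here (distance_to_headset, arm_is_moving, and so on) are placeholders assumed for illustration, not the actual implementation of the device:

```python
class SmartWearableController:
    """Illustrative composition of the modules described above."""

    def __init__(self, distance_sensor, imu, radar, headset_link,
                 distance_threshold_m: float = 0.20):
        self.distance_sensor = distance_sensor   # first detection module
        self.imu = imu                           # second detection module
        self.radar = radar                       # third detection module
        self.headset_link = headset_link         # carries both sending modules
        self.distance_threshold_m = distance_threshold_m

    def tick(self) -> None:
        # First detection module: distance between the wearable and the headset.
        if self.distance_sensor.distance_to_headset() >= self.distance_threshold_m:
            return
        # Second detection module: inertial measurement sensor.
        if self.imu.arm_is_moving():
            # First sending module: command derived from arm movement parameters.
            self.headset_link.send(self.imu.first_command_from_arm_motion())
        # Third detection module: millimeter-wave radar sensor.
        elif self.radar.hand_is_moving():
            # Second sending module: command derived from hand movement parameters.
            self.headset_link.send(self.radar.second_command_from_hand_motion())
        # Otherwise the wearable is merely close to the headset; do nothing.
```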
  • since the embodiments of the smart wearable device part correspond to the embodiments of the method part, for the embodiments of the smart wearable device part, reference may be made to the description of the embodiments of the method part, which will not be repeated here.
  • FIG. 9 is a structural diagram of another smart wearable device provided by an embodiment of the present application.
  • the smart wearable device 900 may vary greatly due to different configurations or performance, and may include one or more processors (central processing units, CPU) 922, a memory 932, and one or more storage media 930 (for example, one or more mass storage devices) storing applications 942 or data 944.
  • the memory 932 and the storage medium 930 may be short-term storage or persistent storage.
  • the program stored in the storage medium 930 may include one or more modules (not shown in the figure), and each module may include a series of instruction operations on the smart wearable device.
  • the processor 922 may be configured to communicate with the storage medium 930, and execute a series of instruction operations in the storage medium 930 on the smart wearable device 900.
  • the smart wearable device 900 may also include one or more power supplies 929, one or more wired or wireless network interfaces 950, one or more input and output interfaces 958, and/or one or more operating systems 941, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, etc.
  • the steps in the method for controlling a wireless headset by using a smart wearable device described above in FIGS. 1 to 7 are implemented by the smart wearable device based on the structure shown in FIG. 9 .
  • the disclosed smart wearable device and method may be implemented in other ways.
  • the above-described smart wearable device embodiments are only illustrative.
  • the division of modules is only a logical function division. In actual implementation, there may be other division methods.
  • multiple modules or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or modules, and may be in electrical, mechanical or other forms.
  • Modules described as separate components may or may not be physically separated, and components shown as modules may or may not be physical modules, that is, they may be located in one place, or may be distributed to multiple network modules. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist physically alone, or two or more modules may be integrated into one module.
  • the above-mentioned integrated modules can be implemented in the form of hardware, and can also be implemented in the form of software function modules.
  • the integrated modules if implemented in the form of software functional modules and sold or used as independent products, can be stored in a computer-readable storage medium.
  • in essence, the technical solutions of the present application, or the parts thereof that contribute to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a function invocation device, a network device, etc.) to execute all or part of the steps of the methods in the various embodiments of the present application.
  • the aforementioned storage medium includes various media that can store program codes, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • a method for controlling a wireless headset by using a smart wearable device, the smart wearable device, and the readable storage medium provided in the present application are described above in detail. Specific examples are used herein to illustrate the principles and implementations of the present application, and the descriptions of the above embodiments are only used to help understand the methods and core ideas of the present application. It should be pointed out that for those of ordinary skill in the art, without departing from the principles of the present application, several improvements and modifications can also be made to the present application, and these improvements and modifications also fall within the protection scope of the claims of the present application.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed is a method for using a smart wearable device to control a wireless headset, comprising the following steps: a smart wearable device detects the distance between itself and a wireless headset; an inertial measurement sensor is activated when the distance is less than a threshold, and the inertial measurement sensor is used to detect whether an arm is in a moving state; and if the arm is in a moving state, a corresponding first control command is sent to the wireless headset according to movement parameters of the arm, so that the wireless headset executes the first control command. The method uses the inertial measurement sensor to detect whether the arm is in a moving state; if so, the corresponding first control command is sent to the wireless headset according to the movement parameters of the arm. The whole process controls the wireless headset conveniently and quickly according to the movement parameters of the arm without requiring the user to touch a control terminal or the wireless headset itself with their limbs. Also provided are a smart wearable device and a readable storage medium, which have the above beneficial effects.
PCT/CN2020/132315 2020-08-31 2020-11-27 Method for using a smart wearable device to control a wireless headset WO2022041532A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010895532.1 2020-08-31
CN202010895532.1A CN112035088A (zh) 2020-08-31 2020-08-31 一种利用智能穿戴设备控制无线耳机的方法

Publications (1)

Publication Number Publication Date
WO2022041532A1 (fr)

Family

ID=73586996

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/132315 WO2022041532A1 (fr) 2020-08-31 2020-11-27 Procédé d'utilisation d'un dispositif vestimentaire intelligent pour commander un casque d'écoute sans fil

Country Status (2)

Country Link
CN (1) CN112035088A (fr)
WO (1) WO2022041532A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112631427A (zh) * 2020-12-21 2021-04-09 深圳市爱都科技有限公司 一种处理通信信息的方法、装置、智能穿戴设备及存储介质
CN113721768A (zh) * 2021-08-30 2021-11-30 歌尔科技有限公司 一种穿戴设备的控制方法、装置、系统及可读存储介质
CN113835352B (zh) * 2021-09-29 2023-09-08 歌尔科技有限公司 一种智能设备控制方法、系统、电子设备及存储介质
CN116094861A (zh) * 2022-08-30 2023-05-09 荣耀终端有限公司 遥控电子设备的方法、设备及系统

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106685459A (zh) * 2016-12-27 2017-05-17 广东小天才科技有限公司 一种可穿戴设备操作的控制方法及可穿戴设备
US9801047B1 (en) * 2013-03-27 2017-10-24 Open Invention Network Llc Wireless device application interaction via external control detection
CN110505549A (zh) * 2019-08-21 2019-11-26 Oppo(重庆)智能科技有限公司 耳机的控制方法和装置
CN110536203A (zh) * 2019-08-13 2019-12-03 Oppo广东移动通信有限公司 一种蓝牙耳机、可穿戴设备、控制系统及控制方法
CN110543231A (zh) * 2018-05-28 2019-12-06 Oppo广东移动通信有限公司 电子装置控制方法及相关设备
CN110650405A (zh) * 2019-10-22 2020-01-03 Oppo(重庆)智能科技有限公司 无线耳机控制系统、方法、装置及存储介质

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105511603A (zh) * 2015-11-25 2016-04-20 小米科技有限责任公司 设备控制方法和装置
CN105721694A (zh) * 2016-01-29 2016-06-29 宇龙计算机通信科技(深圳)有限公司 控制方法、控制装置和可穿戴智能设备及终端
CN107817896B (zh) * 2017-10-09 2019-08-20 维沃移动通信有限公司 一种基于无线耳机的交互方法、无线耳机及移动终端

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9801047B1 (en) * 2013-03-27 2017-10-24 Open Invention Network Llc Wireless device application interaction via external control detection
CN106685459A (zh) * 2016-12-27 2017-05-17 广东小天才科技有限公司 一种可穿戴设备操作的控制方法及可穿戴设备
CN110543231A (zh) * 2018-05-28 2019-12-06 Oppo广东移动通信有限公司 电子装置控制方法及相关设备
CN110536203A (zh) * 2019-08-13 2019-12-03 Oppo广东移动通信有限公司 一种蓝牙耳机、可穿戴设备、控制系统及控制方法
CN110505549A (zh) * 2019-08-21 2019-11-26 Oppo(重庆)智能科技有限公司 耳机的控制方法和装置
CN110650405A (zh) * 2019-10-22 2020-01-03 Oppo(重庆)智能科技有限公司 无线耳机控制系统、方法、装置及存储介质

Also Published As

Publication number Publication date
CN112035088A (zh) 2020-12-04

Similar Documents

Publication Publication Date Title
WO2022041532A1 (fr) Procédé d'utilisation d'un dispositif vestimentaire intelligent pour commander un casque d'écoute sans fil
US9965033B2 (en) User input method and portable device
JP6545258B2 (ja) スマートリング
JP5858155B2 (ja) 携帯型端末装置のユーザインターフェースを自動的に切り替える方法、及び携帯型端末装置
US20200064932A1 (en) Devices and methods for generating input
US20150054630A1 (en) Remote Controller and Information Processing Method and System
US9310896B2 (en) Input method and electronic device using pen input device
WO2019105376A1 (fr) Procédé de reconnaissance de geste, terminal et support de stockage
TW201413538A (zh) 具有手姿勢控制之輸入裝置
WO2008132546A1 (fr) Procédé et algorithme de détection de mouvement d'un objet
KR20140114913A (ko) 사용자 기기의 센서 운용 방법 및 장치
US9811255B2 (en) Detection of gesture data segmentation in mobile devices
CN109844702B (zh) 一种对电子设备的控制方法以及输入设备
JP2006511862A (ja) 非接触型入力装置
KR102297473B1 (ko) 신체를 이용하여 터치 입력을 제공하는 장치 및 방법
TWI621037B (zh) 利用運動下達指令的觸控系統、觸控筆與其方法
WO2021197487A1 (fr) Procédé et appareil permettant de commander un écran de terminal au moyen d'une souris, souris et support de stockage
WO2018219279A1 (fr) Système tactile virtuel, procédé et dispositif
KR20220123036A (ko) 터치 키, 제어 방법 및 전자 장치
CN110851061A (zh) 一种指环式鼠标控制终端的方法
TWI724384B (zh) 用於可穿戴設備的資訊處理方法和裝置
WO2016049842A1 (fr) Procédé d'interaction hybride pour dispositif intelligent portatif ou à porter sur soi
RU2528079C2 (ru) Устройство ввода и способ масштабирования объекта с помощью устройства ввода
WO2021160000A1 (fr) Dispositif portable et procédé de commande
US20240028129A1 (en) Systems for detecting in-air and surface gestures available for use in an artificial-reality environment using sensors at a wrist-wearable device, and methods of use thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20951200

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20951200

Country of ref document: EP

Kind code of ref document: A1