CN109870984B - Multi-household-appliance control method based on wearable device - Google Patents

Multi-household-appliance control method based on wearable device

Info

Publication number
CN109870984B
CN109870984B CN201811601448.3A CN201811601448A
Authority
CN
China
Prior art keywords
wearable device
equipment
household appliance
data
room
Prior art date
Legal status
Active
Application number
CN201811601448.3A
Other languages
Chinese (zh)
Other versions
CN109870984A (en)
Inventor
董玮
蒋跃芳
高艺
周寒
刘汶鑫
宋心怡
汤涌江
Current Assignee
Soyea Jiurong Technology Co ltd
Zhejiang University ZJU
Original Assignee
Soyea Jiurong Technology Co ltd
Zhejiang University ZJU
Priority date
Filing date
Publication date
Application filed by Soyea Jiurong Technology Co ltd, Zhejiang University ZJU filed Critical Soyea Jiurong Technology Co ltd
Priority to CN201811601448.3A priority Critical patent/CN109870984B/en
Publication of CN109870984A publication Critical patent/CN109870984A/en
Application granted granted Critical
Publication of CN109870984B publication Critical patent/CN109870984B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A multi-household-appliance control method based on a wearable device comprises the following steps: step 1, acquiring the three-dimensional space information of the room where the user is located and the positions of the household appliances; step 2, placing ultrasonic ranging devices for positioning in the room; step 3, acquiring sensor information from the wearable device; step 4, in the training stage, extracting feature values from the sensor data and building a feature database and a classifier; and step 5, in the control stage, acquiring and executing user instructions according to the sensor information returned by the wearable device and the auxiliary positioning devices. The control mode is simple and natural and avoids the risk of privacy disclosure; the position and orientation of the wearable device can be calculated by analyzing the inertial measurement unit and ultrasonic ranging information, and the pre-established data table allows the appliance selected by the user to be accurately determined even when multiple appliances are present, improving the reliability of the control system.

Description

Multi-household-appliance control method based on wearable device
Technical Field
The invention relates to a multi-household-appliance control method based on a wearable device, and in particular to a low-cost, simple and convenient, privacy-risk-free multi-household-appliance control method that identifies user control instructions by fusing inertial sensor information with ultrasonic positioning information.
Background
Compared with an ordinary home, a smart home retains the traditional living functions while providing rich human-computer interaction, so that users can conveniently obtain environment and device information and control the devices. This saves users unnecessary time and makes home life safer and more comfortable. Since all smart home appliances are connected to a control server via wireless communication, the traditional control methods that rely on physical buttons on the appliances or on dedicated remote controls now appear cumbersome: the user must walk over to the appliance or keep several remote controls at hand. As the user base of wearable devices steadily grows, their closeness to the user and their rich built-in sensors have drawn increasing attention to wearable-based interaction techniques.
Voice control is a convenient and natural human-computer interaction mode, and the use of voice technology for smart device control has been adopted by many vendors and users. A large number of smart home products now provide a voice control interface, but two problems arise in practice: the effectiveness of speech recognition is hard to maintain in far-field conditions, so the command recognition rate drops when the user is far from the device; and from a privacy perspective, a voice device that continuously monitors ambient sound may become a channel for leaking the user's privacy.
A control method based on a graphical interface generally runs a dedicated application on the user's smartphone or tablet and requires the user to issue commands by tapping in the application. Because smartphones and tablets are so widespread, this control method has attracted many manufacturers and users. However, this interaction mode is relatively tedious: switching between devices and issuing commands require many taps, and the more home appliances there are, the more time is wasted in the app.
In summary, it is important to develop a user-friendly, privacy-risk-free method for controlling multiple household appliances.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a wearable device-based multi-household-appliance control method.
In order to achieve the above purpose, the invention adopts a wearable device-based multi-household-appliance control method, which comprises the following steps:
step 1, acquiring three-dimensional space information of a room where a user is located and a position of a household appliance, comprising the following steps of:
(1.1) acquiring information sufficient to describe the three-dimensional shape of the room;
(1.2) establishing an X-Y-Z three-dimensional right-handed coordinate system in the room;
(1.3) establishing a household-appliance data table for the room: for each network-controllable smart appliance in the room, calculating its three-dimensional space coordinate (i.e., the appliance center coordinate) in the coordinate system established in step (1.2), and storing the appliance name, size and coordinate in the data table;
step 2, placing ultrasonic ranging devices for positioning in the room, comprising the following steps:
(2.1) placing ultrasonic positioning devices at three arbitrary positions in the room; each device is equipped with a microphone and can receive ultrasonic waves at frequencies of 17 kHz and above, and also has wireless communication capability and can establish a wireless link with the wearable device;
(2.2) recording three-dimensional space coordinates of the three ultrasonic positioning devices under the coordinate system established in the step (1.2);
step 3, acquiring sensor information on the wearable device, including:
(3.1) collecting readings of a nine-axis inertial measurement unit of the wearable device, wherein the readings comprise readings of a three-axis accelerometer, a three-axis gyroscope and a three-axis magnetometer;
(3.2) transmitting the data acquired in the step (3.1) to a server for comprehensive processing and analysis;
step 4, in the training stage, extracting the characteristic value of the sensor data and establishing a characteristic database and a classifier, wherein the training stage comprises the following steps:
(4.1) predefining a number of control actions for a wearable device worn on the wrist; the user wears the wearable device and performs each predefined control action a number of times while the nine-axis inertial measurement unit readings during each action are collected;
(4.2) slicing the acquired sensor data: the accelerometer readings are monitored with a moving time window and the average of the acceleration data within the window is calculated; if the difference between the newest sensor reading and the window average exceeds a set threshold, the start/end of an action is detected, and the nine-axis inertial measurement unit readings from the start to the end of the action are stored;
(4.3) performing feature extraction on the sensor data obtained in the step (4.2);
(4.4) constructing an action-recognition classifier from the features obtained in step (4.3) and the ground-truth action-class labels; the classifier outputs which control action the current data belongs to;
and step 5, in the control stage, acquiring and executing user instructions according to the sensor information returned by the wearable device and the auxiliary positioning devices, comprising the following steps:
(5.1) the speaker of the wearable device continuously emits a frequency-modulated continuous wave (FMCW) signal; after the ultrasonic positioning devices placed indoors receive the signal, the relative distance R between the devices is calculated according to the time-of-flight ranging principle; the three ultrasonic positioning devices yield three relative distances <R1, R2, R3>, and a particle filter is used to calculate the 3-dimensional coordinate of the wearable device in space that best matches the ranging results;
(5.2) calculating the current direction of the wearable device by using an inertial navigation method for the acquired accelerometer data, gyroscope data and magnetometer data on the wearable device;
(5.3) calculating a pointing vector in space from the wearable device position obtained in step (5.1) and the wearable device orientation obtained in step (5.2), and judging, from this vector and the household-appliance data table established in step (1.3), which household appliance is currently in the pointed state;
(5.4) according to the result obtained in step (5.3), when any household appliance has been in the continuously pointed state for longer than a set time, the system judges that the household appliance enters the selected state, and the wearable device prompts the user that the household appliance is selected;
(5.5) after any household appliance enters the selected state, the user can issue any action control instruction; the action is sliced as in step (4.2), features are extracted as in step (4.3), and the action is finally recognized with the classifier trained in step (4.4);
(5.6) the wearable device sends the recognized action type to a household-appliance control server, and the server sends the corresponding control instruction to the household appliance to complete the control.
Further, the ultrasonic ranging device in step (2.1) of the present invention integrates an existing commercial ultrasonic receiving module.
Further, the wearable device in the step (3.1) of the present invention is an existing commercial smart watch.
The invention has the following beneficial effects: the user's control instruction is obtained through action recognition, so the control mode is simple and natural and avoids the risk of privacy disclosure; the position and orientation of the wearable device can be calculated by analyzing the inertial measurement unit and ultrasonic ranging information, and the pre-established data table allows the device selected by the user to be accurately determined even when multiple appliances are present, improving the reliability of the control system.
Drawings
FIG. 1 is a work flow diagram of the method of the present invention.
Detailed Description
The invention is further described below with reference to the accompanying drawings. The specific embodiment of the invention is as follows:
the invention discloses a wearable device-based multi-household-appliance control method, which comprises the following steps:
step 1, acquiring three-dimensional space information of a room where a user is located and a position of a household appliance, comprising the following steps of:
(1.1) taking a room with a rectangular floor plan as an example, the length, width and height of the room are obtained; for rooms of other shapes, information sufficient to describe their three-dimensional shape is likewise obtained;
(1.2) again taking a room with a rectangular floor plan as an example, an X-Y-Z three-dimensional right-handed coordinate system is established with a fixed point of the room (such as a wall corner) as the origin;
(1.3) establishing a household-appliance data table for the room: for each network-controllable smart appliance in the room, its three-dimensional space coordinate (i.e., the appliance center coordinate) in the coordinate system established in step (1.2) is calculated, and the appliance name, size and coordinate are stored in the data table;
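For concreteness, the data table of step (1.3) can be kept in a structure such as the following Python sketch; the field names and the example appliances are illustrative assumptions rather than part of the claimed method.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Appliance:
    """One row of the household-appliance data table of step (1.3)."""
    name: str                            # name of the network-controllable appliance
    center: Tuple[float, float, float]   # (x, y, z) center coordinate in the room frame of step (1.2), meters
    size: float                          # characteristic radius of the appliance, meters

# Example table for a rectangular room; all values are illustrative only.
appliance_table = [
    Appliance("ceiling_lamp",    (2.0, 1.5, 2.8), 0.3),
    Appliance("air_conditioner", (0.2, 1.5, 2.2), 0.5),
    Appliance("television",      (4.0, 0.1, 1.0), 0.6),
]
```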
step 2, placing ultrasonic ranging devices for positioning in the room, comprising the following steps:
(2.1) placing ultrasonic positioning devices at three arbitrary positions in the room; each device is equipped with a microphone, can receive ultrasonic waves at frequencies of 17 kHz and above, and samples at 48 kHz; each device also has WiFi communication capability and can establish a wireless link with the wearable device;
(2.2) recording three-dimensional space coordinates of the three ultrasonic positioning devices under the coordinate system established in the step (1.2);
step 3, acquiring sensor information on the wearable device, including:
(3.1) using a TicWatch2 as the wearable device, nine-axis inertial measurement unit readings, including three-axis accelerometer, three-axis gyroscope and three-axis magnetometer readings, are collected at a 50 Hz sampling frequency;
(3.2) transmitting the data acquired in the step (3.1) to a server for comprehensive processing and analysis;
step 4, in the training stage, extracting the characteristic value of the sensor data and establishing a characteristic database and a classifier, wherein the training stage comprises the following steps:
(4.1) taking a watch-type wearable device worn on the left wrist as an example, six control actions are predefined: wave up, wave down, wave left, wave right, wave forward and wave backward; the user wears the wearable device and performs each predefined control action 30 times while accelerometer, gyroscope and magnetometer data during each action are collected;
(4.2) slicing the acquired sensor data: the accelerometer readings are monitored with a moving time window and the average of the acceleration data within the window is calculated; if the difference between the newest sensor reading and the window average exceeds a set threshold, the start/end of an action is detected, and the nine-axis inertial measurement unit readings from the start to the end of the action are stored (illustrated in the sketch following step (4.4));
(4.3) features are extracted from the sensor data obtained in step (4.2), including the maximum, minimum, mean, standard deviation, lower quartile, median, upper quartile, slope and kurtosis;
(4.4) combining the features obtained in step (4.3) with the ground-truth action-class labels, a classifier is constructed using a support vector machine (SVM) with a radial basis function (RBF) kernel; the classifier outputs which predefined control action the current data belongs to;
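The following Python sketch illustrates steps (4.2) through (4.4): moving-window segmentation of the accelerometer stream, the statistical features listed in step (4.3) computed per sensor axis, and an RBF-kernel SVM trained on the labelled segments. The window length, detection threshold and array shapes are illustrative assumptions, not values given in this embodiment.

```python
import numpy as np
from scipy.stats import kurtosis
from sklearn.svm import SVC

WIN = 25        # moving-window length in samples (0.5 s at 50 Hz) -- assumed value
THRESH = 1.5    # start/end detection threshold in m/s^2 -- assumed value

def segment_actions(acc_mag):
    """Step (4.2): slice a 1-D accelerometer-magnitude stream into action segments.

    A segment starts when a new sample deviates from the moving-window mean by
    more than THRESH and ends when readings fall back within the threshold.
    Returns a list of (start, end) sample indices.
    """
    segments, start, in_action = [], None, False
    for i in range(WIN, len(acc_mag)):
        window_mean = acc_mag[i - WIN:i].mean()
        deviation = abs(acc_mag[i] - window_mean)
        if not in_action and deviation > THRESH:
            in_action, start = True, i
        elif in_action and deviation <= THRESH:
            segments.append((start, i))
            in_action = False
    return segments

def extract_features(segment):
    """Step (4.3): per-axis statistics of a (n_samples, 9) IMU segment."""
    feats = []
    for axis in segment.T:
        slope = np.polyfit(np.arange(len(axis)), axis, 1)[0]
        feats += [axis.max(), axis.min(), axis.mean(), axis.std(),
                  np.percentile(axis, 25), np.median(axis),
                  np.percentile(axis, 75), slope, kurtosis(axis)]
    return feats

def train_classifier(segments, labels):
    """Step (4.4): RBF-kernel SVM over the extracted features."""
    X = np.array([extract_features(s) for s in segments])
    clf = SVC(kernel="rbf", gamma="scale")
    return clf.fit(X, labels)
```

In the control stage, the features of a newly sliced action can be passed to the trained classifier's predict method to recover the action class.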
and step 5, in the control stage, acquiring and executing user instructions according to the sensor information returned by the wearable device and the auxiliary positioning devices, comprising the following steps:
(5.1) the speaker of the wearable device continuously emits a frequency-modulated continuous wave (FMCW) signal sweeping from 17 kHz to 22 kHz; after the indoor ultrasonic positioning devices receive the signal, the relative distance R between the devices is calculated according to the time-of-flight ranging principle, as shown in formula (1):
R = (c · f_p · T) / B,    formula (1)
where f_p is the frequency measured at the device receiving the sound signal, c is the propagation speed of sound in air, T is the signal period, and B is the signal bandwidth;
three relative distances are calculated by three ultrasonic positioning devices<R1,R2,R3>And the three-dimensional space coordinate Z of the wearable device under the coordinate system established in the step (1.2) should satisfy a formula II:
R1 = EuclideanDistance(D1, Z),
R2 = EuclideanDistance(D2, Z),
R3 = EuclideanDistance(D3, Z),    formula (2)
where D1, D2 and D3 respectively denote the 3-dimensional coordinates of the three ultrasonic positioning devices, and the function EuclideanDistance computes the Euclidean distance between two coordinates; a particle filter is used to find the 3-dimensional coordinate in space that best satisfies formula (2);
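A minimal sketch of the ranging and positioning in step (5.1) follows, assuming the anchor can align its reference sweep with the start of the transmitted chirp (the synchronization mechanism is not specified here): de-chirping the received signal against a complex reference sweep gives a spectral peak at the beat frequency f_p, which yields R through formula (1); the wearable position is then recovered from the three ranges. For brevity a least-squares solver stands in for the particle filter of this embodiment, and the sweep period T is an assumed value.

```python
import numpy as np
from scipy.optimize import least_squares

FS = 48_000               # anchor sampling rate, Hz (step (2.1))
F0, F1 = 17_000, 22_000   # FMCW sweep from 17 kHz to 22 kHz (step (5.1))
B = F1 - F0               # sweep bandwidth, Hz
T = 0.04                  # sweep period, s -- assumed value
C = 343.0                 # propagation speed of sound in air, m/s

def beat_distance(received):
    """Formula (1): R = c * f_p * T / B.  `received` is one sweep period of
    microphone samples, assumed aligned with the start of the reference sweep."""
    t = np.arange(len(received)) / FS
    reference = np.exp(2j * np.pi * (F0 * t + 0.5 * (B / T) * t ** 2))
    mixed = received * reference                 # de-chirp: the beat term lands at a low positive frequency
    spectrum = np.abs(np.fft.fft(mixed))
    freqs = np.fft.fftfreq(len(mixed), 1.0 / FS)
    low = (freqs > 0) & (freqs < 2_500)          # beat frequencies for room-scale distances
    f_p = freqs[low][np.argmax(spectrum[low])]
    return C * f_p * T / B

def locate(anchors, ranges, guess=(1.0, 1.0, 1.0)):
    """Find the 3-D point whose distances to the three anchors best match
    <R1, R2, R3> (formula (2)); least squares replaces the particle filter."""
    anchors = np.asarray(anchors, dtype=float)   # shape (3, 3): rows D1, D2, D3
    ranges = np.asarray(ranges, dtype=float)     # shape (3,):   R1, R2, R3
    residual = lambda z: np.linalg.norm(anchors - z, axis=1) - ranges
    return least_squares(residual, np.asarray(guess, dtype=float)).x
```

Calling locate((D1, D2, D3), (R1, R2, R3)) returns the estimated wearable coordinate Z in the room frame of step (1.2).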
(5.2) the current pointing direction of the wearable device is calculated from the collected accelerometer, gyroscope and magnetometer data using an inertial navigation method; taking a watch-type wearable device worn on the left wrist as an example, the vector of the device's positive X-axis direction in the coordinate system established in step (1.2) is computed; physically, this direction is the pointing direction of the user's forearm (from elbow to wrist) in space;
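As a simplified stand-in for the inertial-navigation computation of step (5.2), the sketch below estimates the watch's +X axis from a single static accelerometer and magnetometer sample, assuming the room coordinate system of step (1.2) is aligned with the local East-North-Up frame; both the static assumption and the frame alignment are simplifications not required by the method.

```python
import numpy as np

def pointing_vector(acc, mag):
    """Step (5.2), simplified: return the watch's +X axis (forearm direction)
    expressed in the world frame, from one accelerometer sample `acc` (which
    reads along world 'up' when static) and one magnetometer sample `mag`,
    both 3-vectors in the watch body frame."""
    up = np.asarray(acc, dtype=float)
    up /= np.linalg.norm(up)                 # world up, written in body coordinates
    east = np.cross(np.asarray(mag, dtype=float), up)
    east /= np.linalg.norm(east)             # world east, written in body coordinates
    north = np.cross(up, east)               # world north, written in body coordinates
    # Rows of R are the world E/N/U axes in body coordinates, so R maps
    # body-frame vectors into the world frame.
    R = np.vstack([east, north, up])
    x_world = R @ np.array([1.0, 0.0, 0.0])  # body +X axis in world coordinates
    return x_world / np.linalg.norm(x_world)
```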
(5.3) a pointing ray in space is calculated from the wearable device's indoor position obtained in step (5.1) and its orientation obtained in step (5.2); for each household appliance in the household-appliance data table, whether the distance from the appliance's coordinate to the pointing ray is smaller than the appliance's size is computed, and if so, the appliance is judged to be in the pointed state;
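The pointed-state test of step (5.3) reduces to a point-to-ray distance check; the sketch below assumes the appliance table of the step (1.3) sketch (objects with `center` and `size` attributes) and is an illustration rather than the claimed procedure.

```python
import numpy as np

def pointed_appliance(position, direction, table):
    """Step (5.3): return the first appliance whose center lies within its own
    `size` of the ray from `position` along `direction`, or None."""
    p = np.asarray(position, dtype=float)
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    for appliance in table:
        v = np.asarray(appliance.center, dtype=float) - p
        t = float(np.dot(v, d))
        if t < 0:                                        # appliance is behind the pointing direction
            continue
        if np.linalg.norm(v - t * d) < appliance.size:   # perpendicular distance to the ray
            return appliance
    return None
```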
(5.4) according to the result obtained in step (5.3), when any household appliance has been in the continuously pointed state for more than 0.5 seconds, the system judges that the appliance enters the selected state; the wearable device prompts the user that the appliance is selected by an on-screen prompt, voice, vibration or other means;
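Dwell-based selection (step (5.4)) can be tracked with a small state holder such as the sketch below; the 0.5 s threshold is the value given in this embodiment, while the class itself is only an illustrative assumption.

```python
import time

class DwellSelector:
    """Step (5.4): report an appliance as selected once it has been pointed at
    continuously for at least `dwell` seconds (0.5 s in this embodiment)."""

    def __init__(self, dwell=0.5):
        self.dwell = dwell
        self.current = None       # appliance currently being pointed at
        self.since = None         # time when `current` first became pointed at

    def update(self, appliance, now=None):
        """Call on every positioning update with the currently pointed appliance
        (or None); returns the appliance once its dwell time has elapsed."""
        now = time.monotonic() if now is None else now
        if appliance != self.current:
            self.current, self.since = appliance, now
            return None
        if appliance is not None and now - self.since >= self.dwell:
            return appliance
        return None
```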
(5.5) after any household appliance enters the selected state, the user can issue any action control instruction; the action is sliced as in step (4.2), features are extracted as in step (4.3), and the action is finally recognized with the classifier trained in step (4.4);
(5.6) the wearable device sends the recognized action type to a household-appliance control server, and the server sends the corresponding control instruction to the household appliance to complete the control.
The embodiments described in this specification are merely illustrative of implementations of the inventive concept and the scope of the present invention should not be considered limited to the specific forms set forth in the embodiments but rather by the equivalents thereof as may occur to those skilled in the art upon consideration of the present inventive concept.

Claims (1)

1. A wearable device-based multi-household-appliance control method is characterized by comprising the following steps:
step 1, acquiring three-dimensional space information of a room where a user is located and a position of a household appliance, comprising the following steps of:
(1.1) acquiring information sufficient to describe the three-dimensional shape of the room;
(1.2) establishing an X-Y-Z three-dimensional right-hand coordinate system of the room by taking a certain determined space point of the room as an origin;
(1.3) establishing a household appliance data table of the room; for each intelligent household appliance which can be controlled through a network in a room, calculating the three-dimensional space coordinate of the appliance under the coordinate system established in the step (1.2), and storing the name, the size and the coordinate of the appliance into a data table;
step 2, placing ultrasonic ranging equipment for positioning in a room; the method comprises the following steps:
(2.1) placing ultrasonic positioning equipment at three arbitrary positions in a room, wherein the equipment is provided with a microphone and can receive ultrasonic waves with frequencies of 17 kHz or more, and the sampling frequency of the equipment is more than twice the maximum ultrasonic frequency used in the ultrasonic ranging process; meanwhile, the equipment has wireless communication capability and establishes wireless transmission with the wearable device;
(2.2) recording three-dimensional space coordinates of the three ultrasonic positioning devices in the coordinate system established in the step (1.2);
step 3, acquiring sensor information on the wearable device, including:
(3.1) collecting readings of a nine-axis inertial measurement unit of the wearable device, wherein the readings comprise readings of a three-axis accelerometer, a three-axis gyroscope and a three-axis magnetometer;
(3.2) transmitting the data acquired in the step (3.1) to a server for comprehensive processing and analysis;
step 4, in the training stage, extracting the characteristic value of the sensor data and establishing a characteristic database and a classifier, wherein the training stage comprises the following steps:
(4.1) predefining a number of control actions for a wearable device worn on the wrist; a user wears wearable equipment to complete each predefined control action for a plurality of times, and simultaneously acquires data of an accelerometer, a gyroscope and a magnetometer in the action process;
(4.2) slicing the acquired sensor data; monitoring accelerometer readings using a moving time window and calculating an average of the acceleration data within the window; if the difference between the updated sensor reading and the average data in the moving window is larger than a set threshold value, the start/end of the action is detected, and the readings of the nine-axis inertia measurement unit in the process from the start to the end of each action are stored;
(4.3) extracting features from the sensor data obtained in step (4.2), wherein the features comprise the maximum, minimum, mean, standard deviation, lower quartile, median, upper quartile, slope and kurtosis;
(4.4) combining the features obtained in step (4.3) with the ground-truth labels of the action categories, constructing a classifier by using a support vector machine (SVM) with a radial basis function (RBF) kernel, wherein the classifier can output which predefined control action the current data belongs to;
and step 5, in the control stage, acquiring and executing a user instruction according to sensor information returned by the wearable device and the auxiliary positioning equipment, wherein the method comprises the following steps:
(5.1) continuously generating a frequency-modulated continuous wave (FMCW) signal above 17 kHz with the loudspeaker of the wearable device; after the indoor ultrasonic positioning equipment receives the signal, the relative distance R between the devices is calculated according to the time-of-flight ranging principle, as shown in formula (1):
R = (c · f_p · T) / B,    formula (1)
where f_p is the frequency measured at the device receiving the sound signal, c is the propagation speed of sound in air, T is the signal period, and B is the signal bandwidth;
three relative distances <R1, R2, R3> are calculated by the three ultrasonic positioning devices, and the three-dimensional space coordinate Z of the wearable device in the coordinate system established in step (1.2) should satisfy formula (2):
R1 = EuclideanDistance(D1, Z),
R2 = EuclideanDistance(D2, Z),
R3 = EuclideanDistance(D3, Z),    formula (2)
where D1, D2 and D3 respectively denote the 3-dimensional coordinates of the three ultrasonic positioning devices, and the function EuclideanDistance computes the Euclidean distance between two coordinates; a particle filter is used to calculate the 3-dimensional coordinate in space that best satisfies formula (2);
(5.2) calculating the current direction of the wearable device by using an inertial navigation method for the acquired accelerometer data, gyroscope data and magnetometer data on the wearable device; for wearable equipment worn on the wrist, calculating a vector representing the positive X-axis direction of the equipment under the coordinate system established in the step (1.2), wherein the physical meaning of the direction is the pointing direction of the forearm of the user in space;
(5.3) calculating a spatial vector based on the position and orientation using the wearable device indoor position obtained in step (5.1) and the wearable device orientation obtained in step (5.2); for each household appliance in the household appliance data table, calculating whether the distance between the coordinates of the household appliance and the pointing vector is smaller than the size of the household appliance, and if so, judging that the household appliance is in a pointed state;
(5.4) according to the result obtained in step (5.3), when the time that any household appliance is in the continuously pointed state exceeds a set duration, the system judges that the household appliance enters the selected state; the wearable device prompts the user that the household appliance is selected by an on-screen prompt;
(5.5) after any household appliance enters the selected state, the user can issue any action control instruction; slicing the motion by using the mode in the step (4.2), extracting features according to the mode in the step (4.3), and finally identifying the motion by using the classifier trained in the step (4.4);
and (5.6) the wearable equipment sends the recognized action type to a household appliance control server, and the server sends a corresponding control instruction to the household appliance to complete control.
CN201811601448.3A 2018-12-26 2018-12-26 Multi-household-appliance control method based on wearable device Active CN109870984B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811601448.3A CN109870984B (en) 2018-12-26 2018-12-26 Multi-household-appliance control method based on wearable device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811601448.3A CN109870984B (en) 2018-12-26 2018-12-26 Multi-household-appliance control method based on wearable device

Publications (2)

Publication Number Publication Date
CN109870984A CN109870984A (en) 2019-06-11
CN109870984B true CN109870984B (en) 2020-09-11

Family

ID=66917158

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811601448.3A Active CN109870984B (en) 2018-12-26 2018-12-26 Multi-household-appliance control method based on wearable device

Country Status (1)

Country Link
CN (1) CN109870984B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110262367B (en) * 2019-08-05 2021-03-30 刘盛荣 Intelligent household equipment control method and device
CN111141284A (en) * 2019-12-28 2020-05-12 西安交通大学 Intelligent building personnel thermal comfort level and thermal environment management system and method
CN112099368A (en) * 2020-09-25 2020-12-18 歌尔科技有限公司 Electrical equipment control method and system and wearable equipment
CN114838701B (en) * 2021-01-30 2023-08-22 华为技术有限公司 Method for acquiring attitude information and electronic equipment
CN113138665A (en) * 2021-04-02 2021-07-20 深圳大学 Smart watch-based smart device pointing control method and system
CN114390671B (en) * 2021-12-08 2023-04-18 珠海格力电器股份有限公司 Object positioning method and device, electronic equipment and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102184014A (en) * 2011-05-12 2011-09-14 浙江大学 Intelligent appliance interaction control method and device based on mobile equipment orientation
CN104298342A (en) * 2013-07-19 2015-01-21 中兴通讯股份有限公司 Three-dimensional space coordinate detection method, three-dimensional input method and corresponding devices
CN104765460A (en) * 2015-04-23 2015-07-08 王晓军 Intelligent ring and method for controlling intelligent terminal through intelligent ring via gestures
DE102014107683B3 (en) * 2014-06-02 2015-10-01 Insta Elektro Gmbh Method for operating a building installation with a situation monitor and building installation with a situation monitor
CN105549408A (en) * 2015-12-31 2016-05-04 歌尔声学股份有限公司 Wearable device and control method thereof, intelligent household server and control method thereof, and system
WO2016202079A1 (en) * 2015-06-16 2016-12-22 中兴通讯股份有限公司 Control method and apparatus for household appliance device
CN106843481A * 2017-01-19 2017-06-13 武汉大学 Three-dimensional freehand drawing device and method based on gesture control
CN106950927A (en) * 2017-02-17 2017-07-14 深圳大学 A kind of method and Intelligent worn device of control smart home

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130335203A1 (en) * 2012-06-19 2013-12-19 Yan Long Sun Portable electronic device for remotely controlling smart home electronic devices and method thereof
US9208676B2 (en) * 2013-03-14 2015-12-08 Google Inc. Devices, methods, and associated information processing for security in a smart-sensored home
CN103995607B (en) * 2014-05-22 2018-09-07 百度在线网络技术(北京)有限公司 Control method, control device and controlled method and controlled device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Smart home system based on somatosensory interaction technology in an Internet-of-Things environment; Su Benyue, Wang Guangjun, Zhang Jian; Journal of Central South University; 2013-07-30; full text *
Design and implementation of a smart home control system; Deng Zhongzuo; China Masters' Theses Full-text Database; 2016-02-15 (No. 02); full text *

Also Published As

Publication number Publication date
CN109870984A (en) 2019-06-11

Similar Documents

Publication Publication Date Title
CN109870984B (en) Multi-household-appliance control method based on wearable device
EP3340243B1 (en) Method for performing voice control on device with microphone array, and device thereof
CN107643509B (en) Localization method, positioning system and terminal device
CN106575150B (en) Method for recognizing gestures using motion data and wearable computing device
CN106898348B (en) Dereverberation control method and device for sound production equipment
EP3783604B1 (en) Method for responding to voice signal, electronic device, medium and system
EP3923273A1 (en) Voice recognition method and device, storage medium, and air conditioner
WO2015007092A1 (en) Method, apparatus and device for controlling antenna of mobile device
CN106093950B (en) Mobile terminal positioning device and method
CN104767807A (en) Information transmission method based on wearable devices and related devices
CN108966077A (en) A kind of control method and system of speaker volume
CN106713598B (en) Instruction transmission method and device based on indication direction and intelligent equipment
CN110493690A (en) A kind of sound collection method and device
CN104811862A (en) Volume control method and terminal for sound box
CN109917922A (en) A kind of exchange method and wearable interactive device
CN110517677B (en) Speech processing system, method, apparatus, speech recognition system, and storage medium
CN113406610B (en) Target detection method, device, equipment and storage medium
CN109472825B (en) Object searching method and terminal equipment
KR101995799B1 (en) Place recognizing device and method for providing context awareness service
CN209606794U (en) A kind of wearable device, sound-box device and intelligent home control system
CN111028494B (en) Virtual remote control method of electrical equipment, computer readable storage medium and intelligent household appliance
KR20150082085A (en) Computing system with command-sense mechanism and method of operation thereof
CN110277097B (en) Data processing method and related equipment
CN108108144A (en) A kind of electronic equipment and information processing method
CN115134523B (en) Remote control method, remote control device, operation terminal, control device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant