CN111766800A - Intelligent device control method based on scene and big data - Google Patents


Info

Publication number
CN111766800A
CN111766800A (application CN201910592601.9A)
Authority
CN
China
Prior art keywords: user, scene, different, setting parameters, intelligent
Prior art date
Legal status (the legal status is an assumption and is not a legal conclusion)
Pending
Application number
CN201910592601.9A
Other languages
Chinese (zh)
Inventor
池鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
IGRS ENGINEERING LAB Ltd
Original Assignee
IGRS ENGINEERING LAB Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by IGRS ENGINEERING LAB Ltd filed Critical IGRS ENGINEERING LAB Ltd
Priority to CN201910592601.9A
Publication of CN111766800A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/04: Programme control other than numerical control, i.e. in sequence controllers or logic controllers
    • G05B19/042: Programme control other than numerical control using digital processors
    • G05B19/0423: Input/output
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/25: Pc structure of the system
    • G05B2219/25257: Microcontroller

Abstract

The invention discloses an intelligent device control method based on scenes and big data, comprising the following steps: collect each user's identity information, user type and the parameters the user sets on different intelligent devices in different scenes, store them in a database, and analyze the setting parameters with a linear regression algorithm to obtain a fitted curve of the user's setting parameters; build a learning model, train it with the parameters in the database, and establish scene modes for users of the same type in different scenes; identify the user, and according to the user's identity information or user type and the scene the user is in, send the setting parameters of the different intelligent devices in the scene mode corresponding to the user, or to users of the same type, to a central control system; the central control system then configures the corresponding intelligent devices according to the received setting parameters. The invention adds personalized, accurate parameter settings for each user and improves the flexibility and diversity of intelligent device parameter setting.

Description

Intelligent device control method based on scene and big data
Technical Field
The invention relates to the technical field of Internet of Things applications, and in particular to an intelligent device control method based on scenes and big data.
Background
With the development of society, the progress of science and technology and the arrival of the information era, China's Internet of Things industry has developed rapidly. Smart devices generate large amounts of data of the same type, for example control data triggered by users, and server-side records such as temperature adjustments, switch-on/off times, light switching times, and lamp brightness. The secondary information carried by this data is endless and can help intelligent devices serve our lives better; for example, big data and Internet of Things technology can be applied to intelligent devices to set the corresponding parameters for different contextual models.
Currently, setting corresponding parameters of different contextual models of intelligent equipment by utilizing big data and internet of things technology mainly comprises the following steps:
the intelligent control terminal is connected with the cloud platform through WiFi and sends an instruction to the cloud platform;
the cloud platform is communicated with the intelligent gateway through a wide area network, and the intelligent gateway is communicated with the intelligent equipment of the Internet of things through an internal private network, so that the intelligent equipment of the Internet of things is controlled;
and the big data analysis module sets the parameters of the Internet of Things devices in each scene mode by analyzing characteristic data. For example, a scene-based intelligent illumination integrated control system can set different illumination parameters for different teaching scenes, providing a customized illumination scheme for each.
As can be seen from the prior art, traditional Internet of Things technology can control Internet of Things devices, but it does not achieve real human-machine interaction: the user still has to select the control device for the corresponding scene. Moreover, since parameters are set per scene mode rather than per person, the device parameters are not precise enough for each individual, and the method cannot be flexibly applied to every user.
In view of this, it is urgently needed to improve the existing parameter setting scheme of the intelligent device so as to achieve real human-computer interaction, perform personalized and accurate parameter setting on each user, and improve the flexibility and diversity of parameter setting of the intelligent device.
Disclosure of Invention
The technical problem to be solved by the invention is that existing parameter setting schemes for intelligent devices cannot set individual parameters for each user, resulting in poor parameter accuracy and flexibility.
In order to solve this technical problem, the technical scheme adopted by the invention is an intelligent device control method based on scenes and big data, comprising the following steps:
step S10, in the using process of a user, collecting user identity information, the user type of the user and setting parameters of the user to different intelligent devices in different scenes, storing the user identity information, the user type of the user and the setting parameters of the user to different intelligent devices in a database, and analyzing big data of the setting parameters in the database by using a linear regression algorithm to obtain a fitting curve of the user to the setting parameters of different intelligent devices in different scenes; the scene is the environment place, occasion and different time periods where the user is located;
s20, constructing a learning model, training the learning model by using the big data of the setting parameters in the database, and establishing scene modes of the same type of users in different scenes; the scene mode is a parameter set for habits of users of the same type on different intelligent devices in the scene;
step S30, identifying the user identity, and sending the setting parameters of the user aiming at different intelligent devices to the central control system according to the user identity information or the user type and the scene, or sending the setting parameters of different intelligent devices under the scene mode corresponding to the same type of user to the central control system;
and step S40, the central control system sets corresponding intelligent equipment according to the received setting parameters.
In the above scheme, step S10 specifically includes the following steps:
step S11, collecting, through the intelligent terminal, the user types and operation logs of the parameters the user sets on different intelligent devices in different scenes, and storing the logs in a log folder on the server;
step S12, monitoring the change of the log files in the log folder, and executing step S13 if the log files change; otherwise, monitoring all the time;
step S13, extracting the setting parameters in the log file, storing the setting parameters in a database, and cleaning the log file at regular time;
step S14, classifying the setting parameters in the database according to different users, different intelligent devices and scenes;
and step S15, analyzing the setting parameters of different intelligent devices of corresponding users in different scenes by running a linear regression algorithm of big data, and obtaining the fitting curve of the setting parameters of each user to different intelligent devices in different scenes.
In the above scheme, in step S30, the user identity is identified through the smart camera or the identity card, and the user identity is the user himself or the user category.
In the scheme, after the user identity is identified, if the scene of the user is not matched, the scene mode of the user of the same type in the scene is obtained according to the user category to which the user belongs, and different setting parameters of the intelligent equipment corresponding to the scene mode are sent to the central control system.
In the scheme, each scene is also provided with a sensor, the sensor sends the acquired field parameters to the central control system in real time, and the central control system compares the received field parameters with the set parameters in the scene and correspondingly adjusts the scene according to the comparison result.
In the above scheme, the sensor includes a temperature sensor, an illumination sensor, an air quality sensor and a decibel sensor.
In the scheme, the control instruction of the user is recognized through the voice recognition module, and the control instruction is sent to the central control system to automatically switch scenes.
In the scheme, the intelligent terminal is used for manually inputting a control instruction and sending the control instruction to the central control system, and manual scene switching is carried out.
In the above scheme, the intelligent terminal or the voice recognition module sends a control instruction to the central control system through the internal private network, and connects and controls the corresponding intelligent device according to the control instruction.
Compared with the prior art, the invention has the following advantages:
(1) the parameters of the Internet of things equipment are dynamically adjusted according to different scene modes, so that different use requirements of different users on the scene modes are better met;
(2) based on a diversified control mode of a specific user scene mode, user characteristics and habits can be obtained through secondary data analysis of big data and a trained learning model according to different scene modes, scene mode parameters of a user are finally obtained, and automatic control is performed on the Internet of things equipment in the corresponding user scene mode, so that user experience is improved;
(3) the control mode combining the automatic switching scene mode, the manual mode and the voice control improves the flexibility and the usability of the control.
Drawings
Fig. 1 is a flowchart of a method for controlling an intelligent device based on big data according to the present invention;
fig. 2 is a block diagram of an implementation of an intelligent device control method based on big data according to the present invention.
Detailed Description
The invention provides an intelligent equipment control method based on scenes and big data, which solves the problem of human-computer interaction by utilizing an identity recognition technology and a big data analysis and mining technology and increases humanized setting; big data analysis of specific users and scenes is carried out through secondary information data of terminal products (intelligent equipment), useful data are mined, and personalized setting parameters (such as intelligent equipment switch parameters, indoor temperature and humidity parameters, light brightness parameters and the like) of the users to different intelligent equipment in different scenes are obtained; when a specific user appears, after identity recognition is carried out, individual parameter setting is automatically carried out according to scenes, the scenes are flexibly switched, and the experience degree of the user is greatly improved. The invention is described in detail below with reference to the drawings and the detailed description.
As shown in fig. 1, the present invention provides a method for controlling an intelligent device based on a scene and big data, comprising the following steps:
step S10, collecting user identity information, the user's type, and the parameters the user sets on different intelligent devices in different scenarios, and storing the collected information in a database. The user type may be an individual person or a class of persons, such as students, teachers, doctors, nurses, patients, etc. The scene is the environmental place, occasion and time period the user is in. The setting parameters of the different intelligent devices are then classified according to the identity information and user types of different users and the users' different scenes, the classification is stored in the database, and the big data of setting parameters in the database is analyzed with a linear regression algorithm to obtain each user's fitted curve of setting parameters for different intelligent devices in different scenes.
The step aims to perform parameter setting on the intelligent device according to a set parameter fitting curve of a specific user after the specific user is identified subsequently.
S20, constructing a learning model, training it with the big data of setting parameters in the database, and establishing scene modes for users of the same type in different scenes; a scene mode sets parameters according to the habits of same-type users on the different intelligent devices in that scene.
This step allows the habit parameters of same-type users to be used for setting the intelligent devices when a subsequently identified user cannot be matched.
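As an illustrative sketch only (the patent does not specify the internals of its learning model), the scene modes of step S20 could be approximated by aggregating the stored parameters of all users of a given type; the record layout, names, and the averaging rule are all assumptions:

```python
# Hypothetical sketch of step S20: derive a scene mode for a user type
# by averaging the setting parameters of all users of that type.
# This averaging stands in for the patent's unspecified learning model.
from collections import defaultdict
from statistics import mean

def build_scene_modes(records):
    """records: (user_type, scene, device, value) tuples from the database."""
    groups = defaultdict(list)
    for user_type, scene, device, value in records:
        groups[(user_type, scene, device)].append(value)
    # One aggregated value per (type, scene, device) triple.
    return {key: mean(vals) for key, vals in groups.items()}

modes = build_scene_modes([
    ("nurse", "rest_room", "aircon", 25.0),
    ("nurse", "rest_room", "aircon", 25.5),
])
# modes[("nurse", "rest_room", "aircon")] is the averaged habit, 25.25
```

A production system would likely replace the mean with a trained model, but the grouping by (user type, scene, device) is the structure the patent describes.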
And S30, recognizing the user identity, and sending the setting parameters of the user aiming at different intelligent devices to the central control system according to the user identity information, the user type and the scene where the user is located, or sending the setting parameters of different intelligent devices under the scene mode corresponding to the user of the same type to the central control system.
And step S40, the central control system sets corresponding intelligent equipment according to the received setting parameters.
In the present invention, step S10 specifically includes the following steps:
step S11, collecting user identity information, user type and log of setting parameters of the user to different intelligent devices in different scenes through an application program (APP) on the intelligent terminal or a web browser on a mobile phone/a computer, and storing the log into a log folder in a server;
step S12, monitoring the change of the log files in the log folder on the server; if the log file is changed, the step S13 is executed; otherwise, monitoring all the time;
step S13, extracting the user type in the log file and storing the setting parameters of the user for different intelligent devices in different scenes into a database, and cleaning the log file regularly;
step S14, classifying the setting parameters in the database according to different users, different intelligent devices and scenes;
and step S15, analyzing the setting parameters of different intelligent devices of corresponding users in different scenes by running a linear regression algorithm of big data, and obtaining the fitting curve of the setting parameters of each user to different intelligent devices in different scenes.
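The linear regression of step S15 can be sketched as follows: for each (user, device, scene) group, fit the set value against the hour of day by ordinary least squares. The data layout and all names are illustrative, not taken from the patent:

```python
# Illustrative sketch of step S15: per-user, per-device, per-scene
# least-squares fit of a setting parameter against time of day.
from collections import defaultdict

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b; returns (a, b)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx if sxx else 0.0
    return a, my - a * mx

def fit_curves(records):
    """records: (user, device, scene, hour, value) rows from the database."""
    groups = defaultdict(list)
    for user, device, scene, hour, value in records:
        groups[(user, device, scene)].append((hour, value))
    return {key: fit_line([h for h, _ in pts], [v for _, v in pts])
            for key, pts in groups.items()}

curves = fit_curves([
    ("A", "aircon", "workroom", 10, 26.0),
    ("A", "aircon", "workroom", 15, 24.0),
])
a, b = curves[("A", "aircon", "workroom")]
# Predicted setting at hour h is a*h + b, the "fitting curve" of the text.
```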
As shown in fig. 2, a user's control instruction can be recognized by the voice recognition module and sent to the central control system to set device parameters; according to the instruction, the system automatically switches the scene, connects to an intelligent device, or adjusts the setting parameters of the intelligent devices in the current scene. A control instruction can also be entered manually on the intelligent terminal and sent to the central control system to switch scenes manually, connect to an intelligent device, or adjust the device parameters in the current scene mode.
In the invention, the user's identity is recognized by the intelligent camera or an identity authentication card. The technology for recognizing user identity and user type with a camera is not the focus of the invention and can be implemented with existing technology, such as the face recognition systems and image recognition techniques commonly used in the prior art.
After the user's identity is recognized, the scene the user is in is queried according to that identity, and the setting parameters of the different intelligent devices for that scene are sent to the central control system. If the user or the user's scene is not matched (i.e. the system has not collected that user's setting parameters for different intelligent devices in different scenes), the scene mode of same-type users in that scene is obtained according to the user's category, and the setting parameters of the different intelligent devices in that scene mode are sent to the central control system, which can then set the device parameters for the current scene using the habit parameters of users of the same type. That is, in the invention, the user himself is matched first; if matched, the user's own preference data is used for setting; if not, the habit parameters of same-type users learned by the learning model are used. This ensures normal operation of the system.
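The matching order just described, the user's own fitted parameters first and the same-type scene mode as a fallback, can be pictured as a small lookup; all names here are illustrative, not from the patent:

```python
# Minimal sketch of the matching order: prefer the identified user's
# own parameters; fall back to the scene mode of the user's type.
def resolve_parameters(user, user_type, scene, personal, scene_modes):
    """personal: {(user, scene): params}; scene_modes: {(type, scene): params}."""
    if (user, scene) in personal:
        return personal[(user, scene)]          # the user's own habits
    return scene_modes.get((user_type, scene))  # same-type fallback

params = resolve_parameters(
    "unknown_doctor", "doctor", "rest_room",
    personal={},  # no personal data collected for this user yet
    scene_modes={("doctor", "rest_room"): {"aircon": 24, "light": "bright white"}},
)
# With no personal entry, the doctor-type scene mode is returned.
```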
In the invention, the intelligent terminal or the voice recognition module sends a control instruction to the central control system through the internal private network, thereby achieving the purpose of connecting and controlling the intelligent equipment of the Internet of things.
The invention also includes sensors arranged in each scene. The sensors are all digital and include a temperature sensor, an illumination sensor, an air quality sensor, a decibel sensor, etc. The sensors in each scene send the collected field parameters to the central control system in real time, and the central control system compares the received field parameters with the parameters set for the scene. If they are the same, the central control system does nothing; if a field parameter is higher or lower than the corresponding scene parameter, the central control system re-sends a parameter setting instruction to the corresponding intelligent device or devices, adjusting the parameter down or up accordingly. This keeps the actual parameters of the intelligent devices in a scene consistent with the scene's set parameters in real time, achieving friendlier control.
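A minimal sketch of this sensor feedback loop, assuming a numeric tolerance and a generic command callback (both are assumptions; the patent does not specify either):

```python
# Hedged sketch of the sensor feedback loop: compare each live reading
# against the scene's target and re-issue the set point only when they
# differ by more than a tolerance.
def reconcile(scene_targets, live_readings, send_command, tolerance=0.5):
    adjustments = []
    for sensor, target in scene_targets.items():
        actual = live_readings.get(sensor)
        if actual is None or abs(actual - target) <= tolerance:
            continue                      # matches the scene: do nothing
        send_command(sensor, target)      # re-send the scene's set point
        adjustments.append(sensor)
    return adjustments

sent = []
adjusted = reconcile(
    {"temperature": 24.0, "illumination": 300.0},
    {"temperature": 26.2, "illumination": 300.2},
    lambda sensor, value: sent.append((sensor, value)),
)
# Only the temperature reading is out of tolerance here.
```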
In the present invention, step S10 is executed continuously during use, so that the setting parameters in each user's scenes come ever closer to that user's habits and characteristics.
After the intelligent devices controlled by the invention come online, they generate large amounts of data of the same type. The useful information people care about can be extracted from this data and fed back to the central control system, helping to improve its performance, so that the central control system has learning capability and can serve life more intelligently.
Several exemplary applications of the invention are as follows:
application example 1: and (4) an intelligent working room.
The air conditioner, lights and other devices in this intelligent workroom are smart devices connected to the Internet of Things. They can be set by voice or through an intelligent terminal, and the corresponding settings can be called up automatically according to different users and different scenes.
For example, user A sets the scenes according to his own habits, preferences and time periods as follows:
[Table: user A's setting parameters for each time period; the table image is not recoverable, but the values are restated in the text below.]
For example, user B sets the following according to his own habits, preferences and time periods:
[Table: user B's setting parameters for each time period; the table image is not recoverable, but the values are restated in the text below.]
the user A and the user B can set the parameters by using an intelligent terminal or voice and store the parameters in the server.
When user A enters the intelligent workroom, he is identified by the camera or identity authentication card in the room, and his setting parameters for the different intelligent devices are sent to the central control system according to his identity and the scene. The central control system then sets the intelligent devices according to user A's fitted setting-parameter curves for different devices in different scenes: when user A enters the intelligent workroom at 10:00, the air conditioner is automatically turned off and the light is automatically adjusted to bright white; when user A enters at 15:00, the air conditioner is automatically adjusted to 24°C and the light to bright yellow.
When user B enters the intelligent workroom, he is identified by the camera or identity authentication card, and his setting parameters for the different intelligent devices are sent to the central control system according to the scene he is in. The central control system sets the devices according to user B's fitted curves: when user B enters at 10:00, the air conditioner is automatically adjusted to 25°C and the light to bright yellow; when he enters at 15:00, the air conditioner is automatically adjusted to 24°C and the light to bright white.
Users can also adjust the different intelligent devices during use. These adjusted setting parameters are likewise collected for big data analysis and for training the learning model, so that the results come ever closer to the user's habits.
The time periods here are exemplary; in practical applications they can be further subdivided and refined.
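The per-user schedule of application example 1 can be pictured as a lookup table keyed by user and time period. The values are taken from the text above; the data structure itself is purely illustrative:

```python
# Illustrative lookup table for application example 1 (values from the
# text; the original table images were lost in extraction).
SCHEDULES = {
    ("A", 10): {"aircon": "off", "light": "bright white"},
    ("A", 15): {"aircon": 24, "light": "bright yellow"},
    ("B", 10): {"aircon": 25, "light": "bright yellow"},
    ("B", 15): {"aircon": 24, "light": "bright white"},
}

def settings_for(user, hour):
    """Return the stored settings for a user at a given hour, if any."""
    return SCHEDULES.get((user, hour))
```

In the patent's scheme these entries would come from the fitted curves and scene modes rather than a fixed table, with finer-grained time periods.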
Application example 2: an intelligent conference room.
Firstly, the air conditioner and lights of the intelligent meeting room are set as follows according to the scene and the number of people using the room:
[Table: meeting-room settings by number of people; the table image is not recoverable, but the values are restated in the text below.]
Thus, when 5 people use the intelligent meeting room at 10:00, the meeting-room camera recognizes that there are 5 people in the room and sends the corresponding setting parameters to the central control system, which automatically adjusts the air conditioner to 26°C and the light to common yellow.
When 10 people use the intelligent conference room at 15:00, the camera of the conference room recognizes that the number of people in the conference room is 10, corresponding setting parameters are sent to the central control system, and then the central control system automatically adjusts the air conditioner to 24 ℃ and automatically adjusts the light to bright yellow.
Application example 3: an intelligent conference room.
Firstly, the air conditioner and lights of the intelligent meeting room are set as follows according to the meeting occasion and time period:
[Table: meeting-room settings by meeting occasion; the table image is not recoverable, but the values are restated in the text below.]
Thus, when a meeting starts in the intelligent meeting room at 10:00 and a participant says that the current occasion is a technical discussion, the voice recognition module sends the corresponding control instruction to the central control system through the internal private network, and the central control system automatically adjusts the air conditioner to 25°C and the light to bright white. If a participant says the current occasion is a leadership summary, the voice recognition module sends the corresponding instruction, and the central control system automatically adjusts the air conditioner to 24°C and the light to common yellow.
In all three application examples above, once a user has set the parameters, the stored parameters are automatically called up and applied for that specific user.
Application example 4: intelligent rest room.
Users of different types can set the air conditioner and lights of the intelligent rest room according to their own preferences and time periods, and most users of the same type have similar usage habits. The system first saves the parameters a user sets and binds them to that user's group (the user's identity can be recognized by the camera in the rest room or by an identity card); it then trains the learning model with the big data of setting parameters in the database and establishes scene modes for same-type users in different scenes. For example, big data analysis yields the following settings for different types of people:
[Table: rest-room settings by user type; the table image is not recoverable, but the values are restated in the text below.]
when the intelligent rest room is used, the camera in the intelligent rest room identifies whether a user is a nurse or a doctor according to a nurse cap or an identity card, setting parameters of different intelligent equipment under scene modes corresponding to the same type of users are sent to the central control system, and when a nurse 10:00 uses the intelligent rest room, the central control system automatically adjusts the air conditioner to 25 ℃ and automatically adjusts light to dark yellow; when the doctor uses the intelligent rest room at a ratio of 10:00, the central control system automatically adjusts the air conditioner to 24 ℃ and automatically adjusts the light to be bright white.
This scheme prefers the user's own settings; the habit settings of same-type users are used only when the user cannot be matched.
The application examples above are only illustrative: the time periods can be further refined, and the intelligent devices are not limited to air conditioners and lights but may also include others, for example smart speakers, coffee machines, tea sets, etc.
The present invention is not limited to the above preferred embodiments; any structural change made under the teaching of the present invention that is identical or similar to its technical solution falls within the scope of protection of the present invention.

Claims (9)

1. An intelligent device control method based on scenes and big data is characterized by comprising the following steps:
step S10, in the using process of a user, collecting user identity information, the user type of the user and setting parameters of the user to different intelligent devices in different scenes, storing the user identity information, the user type of the user and the setting parameters of the user to different intelligent devices in a database, and analyzing big data of the setting parameters in the database by using a linear regression algorithm to obtain a fitting curve of the user to the setting parameters of different intelligent devices in different scenes; the scene is the environment place, occasion and different time periods where the user is located;
s20, constructing a learning model, training the learning model by using the big data of the setting parameters in the database, and establishing scene modes of the same type of users in different scenes; the scene mode is a parameter set for habits of users of the same type on different intelligent devices in the scene;
step S30, identifying the user identity, and sending the setting parameters of the user aiming at different intelligent devices to the central control system according to the user identity information or the user type and the scene, or sending the setting parameters of different intelligent devices under the scene mode corresponding to the same type of user to the central control system;
and step S40, the central control system sets corresponding intelligent equipment according to the received setting parameters.
2. The method according to claim 1, wherein step S10 specifically comprises the steps of:
step S11, collecting, through the intelligent terminal, the user types and operation logs of the parameters the user sets on different intelligent devices in different scenes, and storing the logs in a log folder on the server;
step S12, monitoring the change of the log files in the log folder, and executing step S13 if the log files change; otherwise, monitoring all the time;
step S13, extracting the setting parameters in the log file, storing the setting parameters in a database, and cleaning the log file at regular time;
step S14, classifying the setting parameters in the database according to different users, different intelligent devices and scenes;
and step S15, analyzing the setting parameters of different intelligent devices of corresponding users in different scenes by running a linear regression algorithm of big data, and obtaining the fitting curve of the setting parameters of each user to different intelligent devices in different scenes.
3. The method according to claim 1, wherein in step S30, the user identity is identified by a smart camera or an identity card, and the user identity is the user himself or the user category.
4. The method according to claim 3, wherein after the user identity is identified, if the scene of the user is not matched, the scene mode of the user of the same type in the scene is obtained according to the user category to which the user belongs, and the setting parameters of different intelligent devices corresponding to the scene mode are sent to the central control system.
5. The method according to claim 1, wherein each scene is further provided with sensors, the sensors transmit the acquired field parameters to the central control system in real time, and the central control system compares the received field parameters with the setting parameters for the scene and makes corresponding adjustments according to the comparison result.
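The comparison loop in claim 5 amounts to comparing live readings against the scene's targets and issuing corrections when the difference exceeds a tolerance. A hedged sketch, in which the tolerance values, parameter names, and `adjust` function are all assumptions made for illustration:

```python
# Illustrative sketch of claim 5's compare-and-adjust step.  Tolerances
# and parameter names are invented; the patent does not specify them.

TOLERANCE = {"temp_c": 0.5, "lux": 20.0}

def adjust(targets, readings):
    """Return corrections the central control system would apply:
    positive means raise the value on the device, negative means lower it."""
    commands = {}
    for key, target in targets.items():
        actual = readings.get(key)
        if actual is None:
            continue
        delta = target - actual
        if abs(delta) > TOLERANCE.get(key, 0.0):
            commands[key] = delta
    return commands

# A reading scene targets 24 degrees C and 300 lux; the room is warm and dim.
cmds = adjust({"temp_c": 24.0, "lux": 300.0}, {"temp_c": 26.0, "lux": 150.0})
```

Readings within tolerance produce no command, so devices are not nudged for negligible drift.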
6. The method of claim 5, wherein the sensors comprise a temperature sensor, a light sensor, an air quality sensor, and a decibel sensor.
7. The method of claim 1, wherein a voice recognition module recognizes the user's control command and sends it to the central control system for automatic scene switching.
8. The method of claim 7, wherein the intelligent terminal is used to manually input a control command and send it to the central control system, so as to switch scenes manually.
9. The method according to claim 8, wherein the intelligent terminal or the voice recognition module sends the control instruction to the central control system through an internal private network, and the central control system connects to and controls the corresponding intelligent device according to the control instruction.
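Claims 7 through 9 describe one command path: a control instruction, whether produced by the voice recognition module or typed on the intelligent terminal, reaches the central control system over the internal network and triggers a scene switch. A minimal sketch under the assumption that commands are simple dictionaries (the command format and class names are illustrative, not from the patent):

```python
# Hypothetical sketch of the command routing in claims 7-9.

class CentralControl:
    def __init__(self, scenes):
        self.scenes = scenes      # scene name -> device setting parameters
        self.active = None

    def handle(self, command):
        """Switch to the named scene and return its device settings,
        or None if the command is not a valid scene switch."""
        if command.get("action") == "switch_scene":
            scene = command.get("scene")
            if scene in self.scenes:
                self.active = scene
                return self.scenes[scene]   # settings pushed to devices
        return None

cc = CentralControl({"sleep": {"lamp": 0, "ac": 22}})
# Same entry point whether the command came from voice or the terminal.
settings = cc.handle({"action": "switch_scene", "scene": "sleep"})
```

Routing both input paths through one handler is what lets claim 8's manual switching reuse claim 7's automatic switching unchanged.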
CN201910592601.9A 2019-07-03 2019-07-03 Intelligent device control method based on scene and big data Pending CN111766800A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910592601.9A CN111766800A (en) 2019-07-03 2019-07-03 Intelligent device control method based on scene and big data

Publications (1)

Publication Number Publication Date
CN111766800A true CN111766800A (en) 2020-10-13

Family

ID=72718256

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910592601.9A Pending CN111766800A (en) 2019-07-03 2019-07-03 Intelligent device control method based on scene and big data

Country Status (1)

Country Link
CN (1) CN111766800A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112782990A (en) * 2020-12-31 2021-05-11 珠海格力电器股份有限公司 Control method and device of intelligent equipment, storage medium and electronic equipment
CN113965700A (en) * 2021-11-26 2022-01-21 四川长虹电器股份有限公司 Automatic adjusting method and system for intelligent television scene

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105068515A (en) * 2015-07-16 2015-11-18 华南理工大学 Intelligent household equipment voice control method based on self-learning algorithm
CN106294738A (en) * 2016-08-10 2017-01-04 武汉诚迈科技有限公司 A kind of Intelligent household scene collocation method
US20180014390A1 (en) * 2016-07-08 2018-01-11 Locoroll, Inc. Intelligent lighting control system scene list selection apparatuses, systems, and methods
CN108052014A (en) * 2017-12-18 2018-05-18 美的集团股份有限公司 Control method, system and the computer readable storage medium of smart home
CN108153158A (en) * 2017-12-19 2018-06-12 美的集团股份有限公司 Switching method, device, storage medium and the server of household scene
CN109429416A (en) * 2017-08-29 2019-03-05 美的智慧家居科技有限公司 Illumination control method, apparatus and system for multi-user scene


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination