CN113341743A - Intelligent household equipment control method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN113341743A (application number CN202110633950.8A)
- Authority
- CN
- China
- Prior art keywords
- target
- execution function
- function
- user
- scene
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/26—Pc applications
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The embodiments of the present application disclose a control method and apparatus for smart home devices, an electronic device, and a storage medium. The control method includes: acquiring behavior information of a user, and determining a target scene based on the behavior information; acquiring a target execution function based on the target scene and a preset association relation, where the preset association relation includes a first association relation, and the target execution function includes a target device to be controlled and a target action to be operated; controlling the target device to be controlled to execute the target action to be operated; and determining a function to be updated for the first association relation, and updating the first association relation based on the function to be updated. With this method, smart home devices are controlled automatically according to the user's behavior information, the user does not need to associate device functions with scenes one by one, the first association relation is updated dynamically, and user experience is improved.
Description
Technical Field
The present disclosure relates to the field of home devices, and more particularly, to a control method and apparatus for smart home devices, an electronic device, and a storage medium.
Background
In recent years, with rising income levels, consumers' purchasing power has grown steadily, and so have their expectations for consumer products. Because smart devices offer intelligent control, energy savings, and similar benefits, their share of the market keeps increasing. In the prior art, the user's intention is usually obtained through speech recognition: the smart device the user wants to control is identified, and the device is then controlled according to that intention. Although this approach is convenient, it is prone to misinterpreting the user's intention, since the same utterance may correspond to different intentions, so the control result may not match the user's expectation. In practice, therefore, users are often required to configure the trigger conditions of smart devices themselves, which is inconvenient.
Disclosure of Invention
The embodiments of the present application provide a control method and apparatus for smart home devices, an electronic device, and a storage medium.
In a first aspect, an embodiment of the present application provides a smart home device control method, including: acquiring behavior information of a user, and determining a target scene based on the behavior information; acquiring a target execution function based on the target scene and a preset association relation, where the preset association relation includes a first association relation, and the target execution function includes a target device to be controlled and a target action to be operated; controlling the target device to be controlled to execute the target action to be operated; and determining a function to be updated for the first association relation, and updating the first association relation based on the function to be updated.
In a second aspect, an embodiment of the present application provides a smart home device control apparatus, which may include a target scene determining module, a target execution function acquiring module, a function execution module, and an updating module. The target scene determining module is configured to acquire behavior information of a user and determine a target scene based on the behavior information. The target execution function acquiring module is configured to acquire a target execution function based on the target scene and a preset association relation, where the preset association relation includes a first association relation, and the target execution function includes a target device to be controlled and a target action to be operated. The function execution module is configured to control the target device to be controlled to execute the target action to be operated. The updating module is configured to determine a function to be updated for the first association relation and update the first association relation based on the function to be updated.
In a third aspect, an embodiment of the present application provides an electronic device, including one or more processors, a memory, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the one or more processors to perform the method described above.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing program code, where the method described above is performed when the program code is executed.
The embodiments of the present application provide a control method and apparatus for smart home devices, an electronic device, and a storage medium. With the smart home device control method provided by these embodiments, when a target scene is obtained from the user's behavior information, a target execution function can be obtained based on a first association relation, and the target device to be controlled is then controlled to adjust the environment the user is in. After the target device is controlled based on the target execution function, the user's habits and preferences are learned and the first association relation is updated, so that when the same target scene appears later, the corresponding target execution function can be queried in the first association relation and the working state of the smart home devices can be controlled automatically. The user neither needs to control each smart home device individually nor needs to associate device functions with scenes one by one, the first association relation is updated dynamically, and both the convenience of using smart home devices and the overall user experience are improved.
Drawings
Fig. 1 shows a schematic diagram of an application environment suitable for the embodiment of the present application.
Fig. 2 shows a schematic view of a control panel suitable for use in embodiments of the present application.
Fig. 3 shows a flowchart of a smart home device control method according to an embodiment of the present application.
Fig. 4 shows a flowchart of the step of determining a target execution function in the method shown in Fig. 3.
Fig. 5 shows a flowchart of the step of acquiring a second execution function in the method shown in Fig. 4.
Fig. 6 shows a further flowchart of the step of acquiring a second execution function in the method shown in Fig. 4.
Fig. 7 shows a further flowchart of the step of acquiring a target execution function in the method shown in Fig. 4.
Fig. 8 shows a flowchart of the step of updating the first association relation in the method shown in Fig. 3.
Fig. 9 shows a functional module block diagram of an intelligent home device control apparatus according to an embodiment of the present application.
Fig. 10 shows a functional block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without creative effort shall fall within the protection scope of the present application.
In current smart home device control methods, a user typically creates the trigger condition for each smart home device in advance through a control panel. For example, the user must manually enter a trigger condition to bind a specific scene to a working state of each device; when that scene occurs, each device enters the working state bound to it. Although this allows the devices to be controlled automatically when a specific scene appears, it presupposes that the user has created the scene-to-working-state bindings in advance and thus depends heavily on user operation. When there are many devices, newly added devices, or many scenes, the user must perform numerous operations to bind scenes to device working states, which is very inconvenient.
To avoid the above problems, the inventors of the present application conducted extensive research and proposed the smart home device control method of the present application, including: acquiring behavior information of a user, and determining a target scene based on the behavior information; acquiring a target execution function based on the target scene and a preset association relation, where the preset association relation includes a first association relation, and the target execution function includes a target device to be controlled and a target action to be operated; controlling the target device to be controlled to execute the target action to be operated; and determining a function to be updated for the first association relation, and updating the first association relation based on the function to be updated. With this method, when a target scene is obtained, a target execution function can be obtained based on the first association relation and the target device can then be controlled to adjust the user's environment. After the device is controlled based on the target execution function, the first association relation can be updated, so that when the same target scene appears later, the corresponding target execution function can be queried in the first association relation. In this way the user's habits and preferences are learned and the working state of the smart home devices is controlled automatically, without the user controlling each device individually or associating device functions with scenes one by one. The first association relation is updated dynamically, the convenience of using smart home devices is greatly improved, and the user's requirements can be responded to quickly.
First, the application environment of the smart home device control method provided by the embodiments of the present application is introduced.
Referring to Fig. 1, Fig. 1 shows an application scenario diagram of the smart home device control method provided in an embodiment of the present application. The method may be applied to a smart home system 100, which may include a control panel 101 and a smart home device 102.
In this embodiment, the control panel 101 includes an intelligent control device for controlling the smart home device 102. The intelligent control device can implement functions such as system information acquisition, information input, information output, relay transmission, relay control, centralized control, remote control, linkage control, and remote call. It can also handle security alarms, household appliance control, power-consumption information acquisition, and the like, and can exchange data with products such as intelligent interactive terminals wirelessly or over a wired connection. It should be noted that Fig. 1 is only an exemplary illustration of the application environment, and the number of control panels 101 may be increased or decreased as needed.
Referring to Fig. 2, the control panel 101 may include a control panel body 1011. The control panel body 1011 may be cylindrical, a regular frame, an irregular frame, or the like. Optionally, the control panel body 1011 is a regular frame, such as a rectangular parallelepiped or a square, and a fixing portion may be disposed on one side of the body so that the body can be fixed to a target object, such as a wall or a door.
Further, the control panel 101 may include at least one virtual key disposed on the control panel body 1011. Specifically, the control panel body 1011 may include a touch screen 1013 on which the at least one virtual key is disposed; the touch screen 1013 is used both to display the screen information output by the control panel 101 and to receive the user's touch operations, for example touching the at least one virtual key. The control panel 101 may further include at least one physical key 1012 disposed on the control panel body 1011. Physical keys and virtual keys can be associated with different operation instructions to realize different operations.
The control panel 101 may be any device capable of inputting and outputting information. For example, the control panel 101 may include one or more of a smart home control panel, a smart speaker, a smart television, a smartphone, a smart tablet, a notebook, a personal computer (PC), a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device (e.g., a smart watch, a smart bracelet, or smart glasses), and the like. In some examples, the operating system run by the control panel 101 may include, but is not limited to, Android, iOS, Symbian, UNIX, Linux, QNX, BlackBerry, Windows Phone 8, and so forth.
In this embodiment, the smart home device 102 may establish a communication connection with the control panel 101, and when the control panel 101 calls the smart home device 102, the two may interact. The smart home device 102 should likewise be able to input and output information; for example, the smart home device 102 may be a visualization manager, a listening manager, or the like.
In this embodiment, the control panel 101 and the smart home device 102 may be arranged separately or integrated into one structure; neither is specifically limited here. In addition, the smart home system 100 may further include a server that establishes a communication connection with the control panel 101. The control panel 101 may receive instructions sent by the user and may also receive control signals sent by the server, where a control signal is used to control the working state of the smart home device 102. In that case, the server receives the instruction forwarded by the control panel 101, performs recognition and analysis on it to generate a control signal, and sends the control signal back to the control panel 101 to control the smart home device 102. It should be noted that some or all of the steps of the smart home device control method provided in this embodiment may be executed on the server or on the control panel 101; this is not limited here.
The embodiments of the present application provide a smart home device control method that can be applied to the control panel 101 in the smart home system 100, to a server, or to both at the same time. A specific smart home device control method is described below. Based on the application environment and system described above, the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Referring to Fig. 3, once triggered, the method flow in this embodiment may run automatically. The steps may be performed sequentially in the order shown in the flowchart, or several steps may be performed simultaneously depending on the actual situation; no limitation is imposed here. The smart home device control method may include the following steps S11 to S14.
Step S11: acquire behavior information of the user, and determine a target scene based on the behavior information.
In this embodiment, the behavior information may include information related to the user's behavior, for example the user's actions, voice, and the like. In some examples, a smart home device may receive the user's voice through a microphone, or capture the user's gestures through a camera to obtain the user's actions. In other examples, the user's operation instruction may be obtained through a sensor: the sensor generates a corresponding electrical signal based on the user's operation, and the electrical signal therefore characterizes the user's action. For example, the sensor may include a wave switch, a resistive touch switch, a capacitive touch switch, and the like.
In this embodiment, the target scene may be the scene represented by the behavior information; that is, the target scene should be consistent with the scene the user desires. For example, the target scene may include a sleep scene, a wake-up scene, a washing scene, and the like.
In some examples, when the behavior information is voice, the control panel may perform speech recognition on the voice and obtain the target scene represented by the voice from the recognition result. When the behavior information is a user action, the smart home device may perform image recognition on one or more frames of images containing the action and obtain the target scene represented by the action from the recognition result. When the behavior information is an operation instruction acquired by a sensor, the target scene corresponding to the instruction may be looked up in a preset mapping table, where the mapping table records operation instructions, scenes, and the mapping relationships between them. For example, when the user says "sleep" or performs the action of going to bed to sleep, either the voice "sleep" or the action of going to bed can serve as behavior information, and the target scene is a sleep scene.
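The mapping-table lookup described above could be sketched as follows. This is an illustrative sketch only; the table contents and all names (`BEHAVIOR_SCENE_MAP`, `resolve_target_scene`) are assumptions, not taken from the patent.

```python
from typing import Optional

# Hypothetical mapping table: (information source, recognition result) -> scene.
# Entries are illustrative, echoing the "sleep" example above.
BEHAVIOR_SCENE_MAP = {
    ("voice", "sleep"): "sleep_scene",
    ("action", "go_to_bed"): "sleep_scene",
    ("voice", "wake up"): "wake_scene",
}

def resolve_target_scene(source: str, result: str) -> Optional[str]:
    """Map a recognized behavior (voice text, action label, or sensor
    instruction) to the target scene it represents, or None if unknown."""
    return BEHAVIOR_SCENE_MAP.get((source, result))
```

With this shape, supporting a new sensor or recognition source only adds table rows; the lookup itself stays unchanged.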
Step S12: acquire a target execution function based on the target scene and a preset association relation, where the preset association relation includes a first association relation, and the target execution function includes a target device to be controlled and a target action to be operated.
In this embodiment, the target device to be controlled may be one or more smart home devices, and the target action to be operated may include the operation the smart home device performs to implement the target execution function.
In this embodiment, the first association relation may include a mapping relationship between target scenes and device functions, where the mapping may be configured by the user in advance or determined from the user's historical operation and selection records. When the target scene is acquired, the device function mapped to that scene can be queried in the first association relation and used as the target execution function. A device function may include the device's working parameters, such as its working period, working mode, and working power.
In some examples, the first association relation may include a mapping between scenes and environments and a mapping between environments and device functions. An environment characterizes a property of the space the user is in; for example, an environment may be dark, silent, low-temperature, and so on. For example, when the target scene is a sleep scene, the target environment corresponding to the sleep scene is "dark", the device function corresponding to the environment "dark" is "off", and the device function "off" is determined as the target execution function, where the target execution function includes the target device to be controlled ("light") and the target action to be operated ("power off").
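The chained scene-to-environment-to-function lookup in this example could be sketched as two dictionary hops. Names and table contents below are illustrative assumptions mirroring the sleep-scene example.

```python
from typing import Dict, List, Optional, Tuple

# Illustrative tables: scene -> environment, environment -> device functions.
SCENE_TO_ENV: Dict[str, str] = {"sleep_scene": "dark"}
# environment -> list of (target device to be controlled, target action)
ENV_TO_FUNCTIONS: Dict[str, List[Tuple[str, str]]] = {
    "dark": [("light", "power_off")],
}

def lookup_execution_functions(scene: str) -> List[Tuple[str, str]]:
    """Resolve scene -> environment -> device functions; empty if unmapped."""
    env: Optional[str] = SCENE_TO_ENV.get(scene)
    return ENV_TO_FUNCTIONS.get(env, [])
```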
In some examples, the first association relation may include a scene-environment variable-device function mapping, where the device functions (or sets of functions) corresponding to the same scene under different environment variables are not entirely the same. An environment variable characterizes an environmental parameter of the space the user is in, such as weather, time, temperature, or brightness. For example, when the target scene is a getting-up scene and the current environment variable is 5 o'clock or a cloudy day, the recommended device function includes turning on the light, so "turn on the light" is determined as the target execution function; when the target scene is a getting-up scene and the environment variable is 10 o'clock or a sunny day, the recommended device function includes opening the curtain, so "open the curtain" is determined as the target execution function.
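The environment-variable-conditioned variant can be sketched with per-scene predicates over the current environment variables. The getting-up rules below paraphrase the example above; the thresholds and names are assumptions for illustration.

```python
# scene -> list of (predicate over environment variables, device function)
GETTING_UP_RULES = [
    (lambda v: v["hour"] <= 6 or v["weather"] == "cloudy", ("light", "turn_on")),
    (lambda v: v["hour"] >= 9 and v["weather"] == "sunny", ("curtain", "open")),
]
RULE_TABLE = {"getting_up_scene": GETTING_UP_RULES}

def recommend_functions(scene, env_vars, table=RULE_TABLE):
    """Return the device functions whose environment-variable predicate holds
    for the given scene; unknown scenes yield no recommendations."""
    return [fn for pred, fn in table.get(scene, []) if pred(env_vars)]
```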
In some examples, when the target scene is acquired, the target execution function corresponding to it may be obtained directly from the first association relation; this target execution function may be a device function that the user previously executed, or one executed by default, in the target scene before step S11.
In some examples, when the target execution function cannot be obtained from the first association relation, it may be obtained from a second association relation; alternatively, the target execution function may be determined from the first and second association relations together. In this embodiment, the second association relation may include scenes and device functions generated by recommendation based on scene attributes: when the target scene is acquired, a suitable target execution function may be recommended based on the attributes of the target scene, and/or the user's information may be acquired and a suitable target execution function recommended based on both the user's information and the scene attributes. In some examples, the target execution function may be generated by a user-population segmentation model, an expert model, or the like. The manner of obtaining the target execution function from the preset association relation is not specifically limited here.
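The first-relation-then-second-relation fallback described here can be sketched as below. The dictionary-of-lists representation of the association relations is an assumption for illustration.

```python
def get_target_functions(scene, first_assoc, second_assoc):
    """Query the first association relation; if it yields nothing for the
    scene, fall back to the recommendation-based second association relation."""
    functions = first_assoc.get(scene, [])
    return functions if functions else second_assoc.get(scene, [])
```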
Step S13: control the target device to be controlled to execute the target action to be operated.
In this embodiment, the target device to be controlled may be controlled through the target execution function to execute the target action to be operated. It should be noted that there may be one or more target execution functions; specifically, a target execution function may include several actions to be operated for one target device to be controlled, or several actions to be operated for several target devices to be controlled.
In some examples, when the smart home device control method provided in this embodiment is applied to a smart home system as shown in Fig. 1, the control panel may send the corresponding target execution function to the target device to be controlled (a smart home device), so that the target device executes the target action to be operated.
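Dispatching could look like the sketch below, where `send` stands in for whatever transport the control panel uses toward each device; the patent does not specify a protocol, so the callback shape is an assumption.

```python
def dispatch(target_functions, send):
    """Send each (device, action) pair of the target execution functions
    to its target device via the supplied transport callback."""
    for device, action in target_functions:
        send(device, action)
```

For example, `dispatch([("light", "power_off")], panel_send)` would forward the power-off action to the light through a hypothetical `panel_send` transport.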
Step S14: determine a function to be updated for the first association relation, and update the first association relation based on the function to be updated.
After the target execution function is determined based on the first association relation, it is determined whether there is a function to be updated for the first association relation. In some examples, the function to be updated may be determined by detecting the user's subsequent device operations in the current target scene, or from association data outside the first association relation, and the first association relation is then updated accordingly.
In some examples, the behavior of the user may be continuously monitored, and the first association relationship updated based on the resulting monitoring information. The monitoring may be active (for example, actively querying the user, collecting the user's audio through a microphone, or capturing images of the user through a camera) or passive (for example, receiving the user's operation instructions through keys or a touch screen, or collecting control log data of each smart home device). The type of monitoring information corresponds to the monitoring manner; that is, the monitoring information may include audio, images, control instructions, control log data, and so on.
Specifically, in a target scene, the device functions of the one or more smart home devices operated by the user are determined from the monitoring information. If a device function is not included in the first association relationship, both the device function and its mapping to the target scene are added to the first association relationship; if the device function exists but its mapping to the target scene does not, only the mapping is added. In addition, target scenes and the device functions within them can be obtained in real time from the monitoring information: when a new target scene is detected, the device functions executed by each smart home device in that scene are mapped to the scene, and the target scene, the device functions, and their mapping relationships are all added to the first association relationship.
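The monitoring-driven update above can be sketched as follows, under the simplifying assumption (not from the patent) that the first association relationship is a mapping from scene names to sets of device functions; a missing scene or a missing scene-function mapping is simply added:

```python
def update_first_association(assoc, scene, device_functions):
    """assoc: dict mapping scene -> set of device functions, a simplified
    stand-in for the first association relationship. Any monitored device
    function not yet mapped to the scene is added; a previously unseen
    target scene gets a fresh entry."""
    mapped = assoc.setdefault(scene, set())  # new target scene -> new entry
    for fn in device_functions:
        if fn not in mapped:                 # missing mapping -> add it
            mapped.add(fn)
    return assoc
```

A real implementation would distinguish "function missing" from "mapping missing" explicitly; in this dictionary encoding the two cases collapse into one insertion.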
In some examples, if the target execution function was determined from the second association relationship in the preset association relationship, or from the first and second association relationships together, it may be checked whether a mapping between the target execution function and the target scene already exists in the first association relationship. If it does, the step of updating the target execution function into the first association relationship need not be performed; if it does not, the target execution function is updated into the first association relationship and a mapping between the target execution function and the target scene is constructed.
In this embodiment, through the implementation of the above steps S11 to S14, when a target scene is obtained, a target execution function can be obtained based on the preset association relationship and used to control the target device to be controlled, thereby adjusting the environment the user is in. After the target device is controlled based on the target execution function, the first association relationship can be updated, so that when the same target scene appears later, the corresponding target execution function can be queried directly from the first association relationship. This implements learning of the user's habits and preferences and allows the working states of smart home devices to be controlled automatically, without the user controlling each smart home device individually or associating device functions with scenes one by one. The dynamic updating of the first association relationship greatly improves the convenience of using smart home devices and allows the user's requirements to be responded to quickly.
An embodiment of the present application further provides a smart home device control method, as shown in fig. 4, the smart home device control method may include the following steps S21 to S24. The method for controlling smart home devices provided in this embodiment may include the same or similar steps as those in the above embodiments, and for the execution of the same or similar steps, reference may be made to the foregoing description, and details are not repeated in this specification.
Step S21: and acquiring behavior information of the user, and determining a target scene based on the behavior information.
Step S22: and acquiring a target execution function based on the target scene and a preset incidence relation, wherein the preset incidence relation comprises a first incidence relation and a second incidence relation, and the target execution function comprises a target device to be controlled and a target action to be operated.
Further, as an implementation manner of this embodiment, when the execution functions obtained from the first association relationship intersect with those obtained from the second association relationship, the target execution function may be determined based on the priorities of the two relationships. The first association relationship, in which scenes map directly to device functions, may be given a higher priority than the second association relationship. As shown in fig. 4, step S22 may include the following steps S221 to S223.
Step S221: and acquiring a first execution function corresponding to the target scene based on the first incidence relation.
In this embodiment, the first association relationship may include scenes, device functions, and mappings between scenes and device functions. When a target scene is obtained, the device function corresponding to it can be queried in the first association relationship and used as the first execution function. A device function may include the working parameters of a device, such as its working period, working mode, and working power; that is, the working process of the smart home device can be determined from the first execution function. For example, when the target scene is "low temperature", the first execution function corresponding to it in the first association relationship may include setting the smart home device "air conditioner" to cooling mode at a temperature of 25 °C.
In some examples, the first association relationship may include scenes, environments, device functions, mappings between scenes and environments, and mappings between environments and device functions. An environment may be used to characterize a property of the space the user is in; for example, an environment may be darkness, silence, low temperature, and the like. For example, when the target scene is "sleeping", the target environment "darkness" corresponding to it may be queried in the first association relationship, and the first execution function corresponding to "darkness" then queried in turn; here the first execution function includes turning off the smart home device "lamp".
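The two-level lookup just described (scene to environment, then environment to device functions) can be sketched as follows; the dictionary encoding and all names are illustrative assumptions, not the patent's data model:

```python
def first_execution_functions(scene, scene_to_env, env_to_functions):
    """Two-level lookup mirroring the scene/environment/device-function
    mappings of the first association relationship: resolve the scene to
    its environment, then the environment to its device functions."""
    env = scene_to_env.get(scene)
    return env_to_functions.get(env, []) if env else []
```

For the "sleeping" example, `scene_to_env = {"sleep": "darkness"}` and `env_to_functions = {"darkness": ["light_off"]}` would yield `["light_off"]` as the first execution function.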
Step S222: and acquiring a second execution function corresponding to the target scene based on the second incidence relation.
Further, as an implementation manner of this embodiment, the second execution function may be determined based on a group to which the user belongs; the second association relationship may include group data, and as shown in fig. 5, the step S222 may include the following steps S2221 to S2222.
Step S2221: and acquiring the identity information of the user, and determining the group to which the user belongs based on the identity information.
In this embodiment, the identity information may include data characterizing the identity of the user. For example, the identity information may include an identification number, an account ID logged into the control panel, gender, ethnicity, age, preferences preset by the user, and the like; the content of the identity information is not particularly limited here.
In this embodiment, the group to which the user belongs may be used to characterize the type of the user; the identity information may then be portrait data of the user, and the user is classified based on that portrait data. For example, when the identity information is age, the groups may be infants, children, young adults, and so on; when the identity information is the user's behavior preference, the groups may be, for example, users without an established exercise habit, users fond of exercise, users who exercise only when required, and so on. The division criteria for the group to which the user belongs are not particularly limited here.
It should be noted that, depending on the division criteria, the same user may be assigned to one or more different groups; that is, one or more groups to which the user belongs may be acquired in step S2221.
Step S2222: and determining a group execution function corresponding to the user from the group data based on the target scene and the group to which the user belongs, and taking the group execution function as a second execution function.
In this embodiment, the group data may include the respective preference degrees that different types of user groups have for the device functions of the smart home devices in the same target scene. For example, data statistics may show that when the user is in a high-temperature environment and the target scene is determined to be "sleeping" based on the user's behavior information: if the user belongs to an elderly group, that group tends to turn on a fan rather than use air-conditioning refrigeration, so the preference degree of the device function "air-conditioning refrigeration" is lower than that of "turn on the fan"; if the user belongs to a young or middle-aged group, that group tends toward air-conditioning refrigeration rather than the fan, so the preference degree of "air-conditioning refrigeration" is higher than that of "turn on the fan".
In this embodiment, the group execution function may include the device functions that have higher priority for the user's group in the target scene. The device functions may be scored for priority, and those ranked within a preset ranking taken as group execution functions. For example, a device function's score may be determined from its execution result and the scene attributes of the target scene, and the score used to represent the priority of the device function.
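The group-data ranking above can be sketched as follows, assuming (purely for illustration) that the group data maps a (scene, group) pair to per-function preference scores; the top-ranked functions become the group execution functions:

```python
def group_execution_functions(group_data, scene, group, top_k=1):
    """group_data[(scene, group)] maps device function -> preference score.
    Return the top_k functions by score as the group execution functions;
    an unknown (scene, group) pair yields no group execution function."""
    scores = group_data.get((scene, group), {})
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked[:top_k]
```

For example, with `{("sleep", "elderly"): {"fan_on": 0.8, "ac_cool": 0.3}}`, the elderly group in the sleeping scene yields `["fan_on"]`, matching the fan-over-air-conditioning preference described above.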
In this embodiment, through the implementation of the above steps S2221 to S2222, the device functions corresponding to the user's group and the target scene can be obtained from the group data, improving the match between the acquired second execution function and the user and thereby improving the user experience.
Further, as an implementation manner of this embodiment, the second association relationship may further include scene general data, with the group data having higher priority than the scene general data. Step S222 may then further include: when no group execution function corresponding to the user exists in the group data, determining a general execution function corresponding to the user from the scene general data based on the target scene, and taking the general execution function as the second execution function.
In this embodiment, the scene general data may include the device functions with higher priority in the target scene; in effect, the scene general data can serve as an expert database of device functions for the smart home devices. The device functions may be scored for priority, and those ranked within a preset ranking taken as general execution functions. For example, a device function's score may be determined from its execution result and the scene attributes of the target scene, and the score used to represent the priority of the device function.
In some examples, when the user is in a high-temperature environment and the target scene is determined to be a "sleep scene" based on the user's behavior information, it may be determined from the scene general data that "turning on the fan" can increase airflow speed, "air-conditioning cooling" can cool the indoor space, and the light can be turned off; the general execution functions may then include "turn on the fan", "air-conditioning cooling", and "light off".
In this embodiment, the general execution function is taken as the second execution function when no group execution function corresponding to the user exists in the group data.
Step S223: a target execution function is determined based on the first execution function and the second execution function.
In this embodiment, both the first execution function and the second execution function may be determined as the target execution function, and one of the first execution function and the second execution function may also be determined as the target execution function.
In this embodiment, the priority of the first association relationship is higher than that of the second, so the first execution function acquired from the first association relationship takes priority over the second execution function. Specifically, when the smart home device corresponding to the first execution function is the same as the one corresponding to the second execution function, the first execution function may be determined as the target execution function. For example, when the target scene is sleeping, the first execution function raises the air conditioner temperature to 26 °C and the second execution function raises it to 25 °C; both control the air conditioner, and since the first execution function has the higher priority, it is determined as the target execution function.
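This same-device conflict rule can be sketched as follows, encoding each execution function as a (device, action) pair purely for illustration: when a first and a second execution function control the same device, the first (higher-priority) one wins; otherwise both are kept:

```python
def merge_by_priority(first_fns, second_fns):
    """Merge first and second execution functions, dropping any second
    execution function whose device is already covered by a first one."""
    first_devices = {dev for dev, _ in first_fns}
    kept_second = [(d, a) for d, a in second_fns if d not in first_devices]
    return first_fns + kept_second
```

For the sleeping example, `merge_by_priority([("air_conditioner", "set_26C")], [("air_conditioner", "set_25C"), ("fan", "on")])` keeps the 26 °C setting and the fan, discarding the conflicting 25 °C setting.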
In this embodiment, through implementation of the above steps S221 to S223, a first execution function may be obtained based on the first association relationship, a second execution function may be obtained based on the second association relationship, a target execution function is determined from the first execution function and the second execution function, and when the smart home devices corresponding to the first execution function are consistent with the smart home devices corresponding to the second execution function, the first execution function may be used as the target execution function, so that a control result of the smart home devices better meets an expectation of a user.
Further, as an implementation manner of this embodiment, in order to ensure that the acquired target execution function is more in line with the user expectation, the target execution function may be determined based on the confidence of the second execution function, as shown in fig. 6, and the step S223 may include the following steps S2231 to S2232.
Step S2231: and acquiring a second execution function different from the first execution function as a third execution function, and acquiring the confidence of the third execution function.
In the present embodiment, when the second execution function is the same as the first execution function, the first execution function may be taken as the target execution function; when the second execution function is different from the first execution function, the second execution function may be regarded as a third execution function, and the confidence of the third execution function is obtained.
In this embodiment, the confidence may be used to characterize the probability of determining the third execution function as the target execution function. Specifically, the user's control log data for the smart home devices may be acquired, the user's preference for each device function determined from it, and the confidence of the third execution function derived from that preference. Alternatively, portrait data of many users may be obtained through big data, those users' preferences for different device functions established, and the confidence of the third execution function derived from those preferences. The manner of obtaining the confidence of the third execution function is not particularly limited here.
Step S2232: and taking the third execution function and the first execution function with the confidence coefficient greater than or equal to a preset threshold value as target execution functions.
In this embodiment, the preset threshold may be set based on actual requirements, for example, the preset threshold may be 0.8, 0.9, 0.95, and the like, and the preset threshold is not particularly limited herein.
In this embodiment, through the implementation of the steps S2231 to S2232, the third execution function and the first execution function with the confidence level greater than or equal to the preset threshold may be used as the target execution function, so that the control result of the smart home device better meets the expectation of the user.
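Steps S2231 to S2232 can be sketched as follows, with the 0.9 threshold taken as one of the example values the text allows; the encoding of functions as strings and the `confidence` dictionary are illustrative assumptions:

```python
def select_targets(first_fns, second_fns, confidence, threshold=0.9):
    """Third execution functions are the second execution functions that
    differ from the first ones; keep those whose confidence meets the
    preset threshold, alongside all first execution functions."""
    third = [fn for fn in second_fns if fn not in first_fns]
    kept = [fn for fn in third if confidence.get(fn, 0.0) >= threshold]
    return list(first_fns) + kept
```

A second execution function identical to a first one is simply subsumed by the first, matching step S2231's handling of the "same" case.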
Further, as an implementation manner of this embodiment, as shown in fig. 7, the step S223 may further include the following steps S2233 to S2234.
Step S2233: and generating inquiry information based on the third execution function with the confidence coefficient smaller than the preset threshold value, and receiving feedback information sent by the user based on the inquiry information.
In this embodiment, the inquiry information may include information inquiring the user whether to execute the third execution function with the confidence level smaller than the preset threshold. The presentation form of the query information may be any one of audio, schematic image, text, character, etc., and the presentation form of the query information is not particularly limited herein.
In this embodiment, the feedback information may include information issued by the user based on the query information. The representation form of the feedback information may be any one of a video, an image, an instruction manually controlled by a user, and the like, and the representation form of the feedback information is not particularly limited herein.
In some examples, when the third execution function is "turn the fan to third gear", the query information may be audio asking "Turn the fan to third gear?", and the feedback information may be speech that accurately expresses the user's intention, such as "yes", "OK", "confirm", or "confirm turning the fan to third gear".
Step S2234: and taking the third execution function which has the feedback information as the execution permission as the target execution function.
In the present embodiment, it may be determined whether to take the third execution function as the target execution function based on the content of the feedback information.
In this embodiment, through the implementation of the above steps S2233 to S2234, for a third execution function whose confidence is smaller than the preset threshold, query information is sent, the user's feedback information is received, and whether the third execution function is taken as the target execution function is decided based on the content of the feedback, so that the control result of the smart home devices better meets the user's expectations.
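The query-and-feedback gate of steps S2233 to S2234 can be sketched as follows; `ask` is a hypothetical callback standing in for the audio/text prompting and feedback interpretation the text describes, returning True when the feedback approves execution:

```python
def confirm_low_confidence(third_fns, confidence, ask, threshold=0.9):
    """Keep high-confidence third execution functions directly; for those
    below the threshold, issue query information via `ask` and keep the
    function only if the user's feedback approves execution."""
    approved = []
    for fn in third_fns:
        if confidence.get(fn, 0.0) >= threshold:
            approved.append(fn)  # no query needed
        elif ask(fn):            # feedback indicates "approve execution"
            approved.append(fn)
    return approved
```

In practice `ask` would render the query as audio, text, or an image and parse the user's spoken or manual reply, as described above.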
Step S23: and controlling the target equipment to be controlled to execute the target action to be operated.
Step S24: and determining a function to be updated of the first association relation based on the target execution function, and updating the first association relation based on the function to be updated so as to update the execution function corresponding to the target scene in the first association relation to be the target execution function.
Further, as an implementation manner of the present embodiment, as shown in fig. 8, the above step S24 may include the following steps S241 to S243.
Step S241: and matching the target execution function with the first association relation.
Step S242: and screening the target execution function which does not match with the first incidence relation to serve as the function to be updated.
Step S243: and updating the first incidence relation based on the function to be updated and the target scene.
In this embodiment, when the first association relationship includes scenes, device functions, and mappings between scenes and device functions, a target execution function that does not match the first association relationship is taken as the function to be updated. If the function to be updated does not exist in the first association relationship, it is added to the first association relationship and mapped to the target scene; if it already exists, it is directly mapped to the target scene.
In this embodiment, when the first association relationship includes a scene, an environment, an equipment function, a mapping relationship between the scene and the environment, and a mapping relationship between the environment and the equipment function, if there is no function to be updated in the first association relationship, adding the function to be updated to the first association relationship, and performing mapping association between the function to be updated and the corresponding environment; and if the function to be updated exists in the first association relationship, mapping and associating the function to be updated with the corresponding environment.
In this embodiment, when the first association relationship includes a scene, an environment, an apparatus function, a mapping relationship between the scene and the environment, and a mapping relationship between the environment and the apparatus function, if a target environment corresponding to the target scene does not exist in the first association relationship, the target scene may be added to the first association relationship, and the target environment, the target scene, and the function to be updated are mapped and associated, respectively; if the target environment corresponding to the target scene exists in the first association relationship, the target environment, the target scene and the function to be updated can be mapped and associated respectively.
In this embodiment, through the implementation of the steps S241 to S243, the function to be updated may be updated to the first association relationship, so that when a target scene is subsequently obtained, the function to be updated may be obtained based on the first association relationship, the control of the corresponding smart home devices in the same scene is realized, and the user experience is improved.
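Steps S241 to S243 can be sketched as follows, again treating the first association relationship as a scene-to-functions dictionary (an illustrative simplification): the target execution functions are matched against the relationship, the unmatched ones screened out as functions to be updated, and those mapped to the target scene:

```python
def update_with_targets(assoc, scene, target_fns):
    """Match target execution functions against the first association
    relationship (scene -> set of functions), screen the unmatched ones
    as functions to be updated, and map them to the target scene."""
    mapped = assoc.setdefault(scene, set())
    to_update = [fn for fn in target_fns if fn not in mapped]  # screening
    mapped.update(to_update)  # update the first association relationship
    return to_update
```

The variant with an environment layer would map the functions to be updated to the target environment instead, following the same pattern.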
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
The embodiment of the application provides an intelligent household equipment control device, and the intelligent household equipment control device is approximately in one-to-one correspondence with the intelligent household equipment control method.
Referring to fig. 9, a smart home device control apparatus provided in an embodiment of the present application is shown, where the smart home device control apparatus may include a target scene determining module 71, a target execution function obtaining module 72, a function executing module 73, and an updating module 74. The target scene determining module 71 is configured to obtain behavior information of a user, and determine a target scene based on the behavior information. The target execution function obtaining module 72 is configured to obtain a target execution function based on a target scene and a preset association relationship, where the preset association relationship includes a first association relationship, and the target execution function includes a target device to be controlled and a target action to be operated. The function executing module 73 is used for controlling the target device to be controlled to execute the target action to be operated. The updating module 74 is configured to determine a function to be updated of the first association relationship, and update the first association relationship based on the function to be updated.
Further, as an implementation manner of this embodiment, the updating module 74 may include a matching unit, a function to be updated obtaining unit, and a first association relation updating unit. The matching unit is used for matching the target execution function with the first association relation. The function to be updated acquiring unit is used for screening the target execution function which is not matched with the first incidence relation, and the target execution function is used as the function to be updated. The first incidence relation updating unit is used for updating the first incidence relation based on the function to be updated and the target scene.
Further, as an implementation manner of this embodiment, the preset association relationship includes a second association relationship, the priority of the first association relationship is higher than that of the second association relationship, and the target execution function acquiring module 72 may include a first execution function acquiring unit, a second execution function acquiring unit, and a target execution function determining unit. The first executing function acquiring unit is used for acquiring a first executing function corresponding to the target scene based on the first incidence relation. The second execution function acquisition unit is used for acquiring a second execution function corresponding to the target scene based on the second incidence relation. The target execution function determination unit is used for determining a target execution function based on the first execution function and the second execution function.
Further, as an implementation manner of the present embodiment, the target execution function determination unit may include a confidence level acquisition sub-unit and a first target execution function acquisition sub-unit. The confidence coefficient acquiring subunit is configured to acquire, as a third execution function, a second execution function different from the first execution function, and acquire a confidence coefficient of the third execution function. The first target execution function acquisition subunit is configured to use the third execution function and the first execution function with the confidence level greater than or equal to a preset threshold as target execution functions.
Further, as an implementation manner of the present embodiment, the target execution function determination unit may further include a feedback information receiving subunit and a second target execution function acquiring subunit. The feedback information receiving subunit is configured to generate query information based on the third execution function whose confidence is smaller than the preset threshold, and to receive feedback information sent by the user based on the query information. The second target execution function acquiring subunit is configured to take the third execution function whose feedback information approves execution as the target execution function.
Further, as an implementation manner of the present embodiment, the second execution function acquiring unit may include a user-group determining subunit and a group execution function acquiring subunit. The user-group determining subunit is configured to acquire the identity information of the user and determine the group to which the user belongs based on the identity information. The group execution function acquiring subunit is configured to determine a group execution function corresponding to the user from the group data based on the target scene and the group to which the user belongs, and to take the group execution function as the second execution function.
Further, as an implementation manner of this embodiment, the second association relationship may further include scene general data, the priority of the group data is greater than that of the scene general data, and the second execution function acquiring unit may further include a general execution function acquiring subunit, where the general execution function acquiring subunit is configured to determine, when there is no group execution function corresponding to the user in the group data, a general execution function corresponding to the user from the scene general data based on the target scene, and use the general execution function as the second execution function.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of each module in the above-described apparatus may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the coupling or direct coupling or communication connection between the modules shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or modules may be in an electrical, mechanical or other form.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
Referring to fig. 10, an electronic device 800 according to an embodiment of the present application is shown, including: a processor 810, a communication module 820, a memory 830, and a bus. The processor 810, the communication module 820 and the memory 830 are connected to each other through a bus and perform communication with each other. The bus may be an ISA bus, PCI bus, EISA bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. Wherein:
The memory 830 is configured to store programs. In particular, the memory 830 may store software programs as well as various data, and may mainly include a program storage area and a data storage area, where the program storage area may store the application program required to operate at least one function and may include program codes comprising computer operating instructions. In addition to storing programs, the memory 830 may temporarily store messages that the communication module 820 needs to send. The memory 830 may comprise high-speed RAM and may also include non-volatile memory, such as at least one disk memory.
The processor 810 is configured to execute the programs stored in the memory 830. When executed by the processor 810, the programs implement the steps of the smart home device control method in each of the foregoing embodiments.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the smart home device control method embodiments and can achieve the same technical effects; to avoid repetition, details are not repeated here. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware alone, but in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods of the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.
Claims (10)
1. A smart home device control method is characterized by comprising the following steps:
acquiring behavior information of a user, and determining a target scene based on the behavior information;
acquiring a target execution function based on the target scene and a preset association relationship, wherein the preset association relationship comprises a first association relationship, and the target execution function comprises a target device to be controlled and a target action to be performed;
controlling the target device to be controlled to execute the target action to be performed;
and determining a function to be updated of the first association relationship, and updating the first association relationship based on the function to be updated.
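The flow of claim 1 can be illustrated with a minimal Python sketch. This is illustrative only and not part of the claims; all names (`determine_scene`, `control`, the behavior-to-scene table, the device/action pairs) are hypothetical, not from the patent:

```python
# Illustrative sketch of the method of claim 1; all names and mappings
# below are hypothetical examples, not defined by the patent.

def determine_scene(behavior):
    """Map observed user behavior to a target scene (placeholder logic)."""
    return {"lies_down": "sleep", "enters_kitchen": "cooking"}.get(behavior, "default")

def control(behavior, first_association):
    """Acquire and run the target execution functions for the inferred scene.

    first_association: dict mapping scene -> list of (device, action) pairs.
    Returns the scene and the list of (device, action) pairs "executed".
    """
    scene = determine_scene(behavior)
    executed = []
    for device, action in first_association.get(scene, []):
        executed.append((device, action))  # stand-in for driving the real device
    return scene, executed

scene, executed = control("lies_down", {"sleep": [("bedroom_light", "off")]})
```

In a real system the scene inference and device commands would go through sensors and a device gateway; here they are plain lookups so the control flow stays visible.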
2. The method according to claim 1, wherein the preset association relationship further comprises a second association relationship, the first association relationship having a higher priority than the second association relationship, and the acquiring a target execution function based on the target scene and the preset association relationship comprises:
acquiring a first execution function corresponding to the target scene based on the first association relationship;
acquiring a second execution function corresponding to the target scene based on the second association relationship;
and determining the target execution function based on the first execution function and the second execution function.
3. The method according to claim 2, wherein the determining a function to be updated of the first association relationship, and updating the first association relationship based on the function to be updated comprises:
matching the target execution function against the first association relationship;
screening out, based on the preset association relationship, the target execution function that does not match the first association relationship, as the function to be updated;
and updating the first association relationship based on the function to be updated and the target scene.
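The screening-and-update step of claim 3 amounts to a set difference against what is already recorded for the scene. A hypothetical Python sketch (function and variable names are illustrative, not from the patent):

```python
# Hypothetical sketch of the update step of claim 3: target execution
# functions not yet recorded for the scene become the functions to be updated.

def update_first_association(first_association, scene, target_functions):
    """first_association: dict mapping scene -> set of (device, action) pairs."""
    known = first_association.setdefault(scene, set())
    to_update = {fn for fn in target_functions if fn not in known}  # screening step
    known |= to_update  # update the association with the newly observed functions
    return to_update

assoc = {"sleep": {("bedroom_light", "off")}}
added = update_first_association(
    assoc, "sleep", {("bedroom_light", "off"), ("ac", "cool_26C")})
```

Functions that already match the association are left untouched; only the unmatched ones are written back, so repeated scenes converge instead of duplicating entries.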
4. The method of claim 2, wherein determining the target execution function based on the first execution function and the second execution function comprises:
acquiring the second execution function that differs from the first execution function as a third execution function, and acquiring a confidence of the third execution function;
and taking the third execution function whose confidence is greater than or equal to a preset threshold, together with the first execution function, as the target execution function.
5. The method of claim 4, wherein determining the target execution function based on the first execution function and the second execution function further comprises:
generating inquiry information based on the third execution function whose confidence is smaller than the preset threshold, and receiving feedback information sent by the user based on the inquiry information;
and taking the third execution function whose feedback information indicates permission to execute as a target execution function.
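The merging rule of claims 4 and 5 can be sketched as follows. This is an illustrative reading only; the function name, the confidence scores, and the `ask_user` callback are hypothetical stand-ins for the inquiry/feedback exchange:

```python
# Hypothetical sketch of claims 4-5: merge functions from both associations,
# keeping low-confidence suggestions only if the user approves them.

def merge_functions(first, second, confidence, threshold=0.8, ask_user=lambda fn: False):
    target = set(first)
    third = set(second) - set(first)  # functions unique to the second association
    for fn in third:
        if confidence.get(fn, 0.0) >= threshold:
            target.add(fn)            # high confidence: execute directly
        elif ask_user(fn):
            target.add(fn)            # low confidence: inquire, run only on approval
    return target

first = {("light", "off")}
second = {("light", "off"), ("ac", "cool"), ("tv", "on")}
conf = {("ac", "cool"): 0.9, ("tv", "on"): 0.3}
result = merge_functions(first, second, conf, ask_user=lambda fn: fn == ("tv", "on"))
```

The first association always wins outright; the second association only contributes functions the first one lacks, gated by confidence or explicit user permission.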
6. The method of claim 2, wherein the second association relationship comprises group data, and the acquiring a second execution function corresponding to the target scene based on the second association relationship comprises:
acquiring identity information of the user, and determining a group to which the user belongs based on the identity information;
and determining a group execution function corresponding to the user from the group data based on the target scene and the group to which the user belongs, and taking the group execution function as the second execution function.
7. The method according to claim 6, wherein the second association relationship further comprises scene-general data, the priority of the group data is higher than that of the scene-general data, and the acquiring a second execution function corresponding to the target scene based on the second association relationship further comprises:
when no group execution function corresponding to the user exists in the group data, determining a general execution function corresponding to the user from the scene-general data based on the target scene, and taking the general execution function as the second execution function.
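The group-then-general fallback of claims 6 and 7 is a two-level lookup. A minimal Python sketch, assuming hypothetical table shapes (the dict layouts and all example users, groups, and actions are illustrative, not from the patent):

```python
# Hypothetical sketch of claims 6-7: group data takes priority; fall back to
# scene-general data when no group-specific function exists for the user.

def second_execution_function(scene, user, user_groups, group_data, general_data):
    """user_groups: user -> group; group_data: (group, scene) -> function;
    general_data: scene -> function."""
    group = user_groups.get(user)
    fn = group_data.get((group, scene))
    if fn is not None:
        return fn                    # group execution function
    return general_data.get(scene)   # general execution function

groups = {"alice": "adults", "kid": "children"}
group_data = {("adults", "sleep"): ("ac", "26C")}
general = {"sleep": ("light", "off")}
```

With these tables, "alice" gets her group's setting while "kid", whose group has no entry for the scene, falls through to the scene-general default.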
8. A smart home device control apparatus, characterized by comprising:
the target scene determining module is used for acquiring behavior information of a user and determining a target scene based on the behavior information;
the target execution function acquisition module is used for acquiring a target execution function based on the target scene and a preset association relationship, wherein the preset association relationship comprises a first association relationship, and the target execution function comprises a target device to be controlled and a target action to be performed;
the function execution module is used for controlling the target device to be controlled to execute the target action to be performed;
and the updating module is used for determining a function to be updated of the first association relationship and updating the first association relationship based on the function to be updated.
9. An electronic device, comprising:
a memory;
one or more processors coupled with the memory;
one or more programs stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the smart home device control method of any of claims 1 to 7.
10. A computer-readable storage medium, wherein a program code is stored in the computer-readable storage medium, and the program code can be called by a processor to execute the smart home device control method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110633950.8A CN113341743B (en) | 2021-06-07 | 2021-06-07 | Smart home equipment control method and device, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113341743A true CN113341743A (en) | 2021-09-03 |
CN113341743B CN113341743B (en) | 2023-11-28 |
Family
ID=77474696
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110633950.8A Active CN113341743B (en) | 2021-06-07 | 2021-06-07 | Smart home equipment control method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113341743B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014147683A1 (en) * | 2013-03-19 | 2014-09-25 | パナソニック株式会社 | Environment control system |
CN104503253A (en) * | 2014-12-17 | 2015-04-08 | 宇龙计算机通信科技(深圳)有限公司 | Equipment control method, equipment control system and terminal |
CN104852975A (en) * | 2015-04-29 | 2015-08-19 | 北京海尔广科数字技术有限公司 | Household equipment calling method and household equipment calling device |
CN109597313A (en) * | 2018-11-30 | 2019-04-09 | 新华三技术有限公司 | Method for changing scenes and device |
US20190215184A1 (en) * | 2018-01-08 | 2019-07-11 | Brilliant Home Technology, Inc. | Automatic scene creation using home device control |
CN111338227A (en) * | 2020-05-18 | 2020-06-26 | 南京三满互联网络科技有限公司 | Electronic appliance control method and control device based on reinforcement learning and storage medium |
CN111650842A (en) * | 2020-05-09 | 2020-09-11 | 珠海格力电器股份有限公司 | Household appliance control method and device |
CN111752165A (en) * | 2020-07-10 | 2020-10-09 | 广州博冠智能科技有限公司 | Intelligent equipment control method and device of intelligent home system |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113703334A (en) * | 2021-09-22 | 2021-11-26 | 深圳市欧瑞博科技股份有限公司 | Intelligent scene updating method and device |
CN114124692A (en) * | 2021-10-29 | 2022-03-01 | 青岛海尔科技有限公司 | Intelligent device skill access method and device, electronic device and storage medium |
CN114124692B (en) * | 2021-10-29 | 2024-03-22 | 青岛海尔科技有限公司 | Intelligent equipment skill access method and device, electronic equipment and storage medium |
CN113848748A (en) * | 2021-11-15 | 2021-12-28 | 苏州蓝赫朋勃智能科技有限公司 | Intelligent home control method and intelligent home system |
CN114237068A (en) * | 2021-12-20 | 2022-03-25 | 珠海格力电器股份有限公司 | Intelligent device control method, intelligent device control module, intelligent device and storage medium |
CN114237068B (en) * | 2021-12-20 | 2024-05-03 | 珠海格力电器股份有限公司 | Intelligent device control method, module, intelligent device and storage medium |
CN114442536A (en) * | 2022-01-29 | 2022-05-06 | 北京声智科技有限公司 | Interaction control method, system, device and storage medium |
WO2023168856A1 (en) * | 2022-03-09 | 2023-09-14 | 青岛海尔科技有限公司 | Associated scene recommendation method and device, storage medium, and electronic device |
CN114639379A (en) * | 2022-03-10 | 2022-06-17 | 平安普惠企业管理有限公司 | Interaction method and device of intelligent electric appliance, computer equipment and medium |
CN115083399A (en) * | 2022-05-11 | 2022-09-20 | 深圳绿米联创科技有限公司 | Equipment control method, device, equipment and storage medium |
CN114942988A (en) * | 2022-05-17 | 2022-08-26 | 珠海格力电器股份有限公司 | Recommendation method and device, electronic equipment and storage medium |
WO2024001196A1 (en) * | 2022-06-29 | 2024-01-04 | 青岛海尔科技有限公司 | Household appliance control method and apparatus, storage medium, and electronic apparatus |
CN115064171A (en) * | 2022-08-18 | 2022-09-16 | 安徽立诺威智能科技有限公司 | Voice awakening method and system for intelligent air disinfection equipment |
Also Published As
Publication number | Publication date |
---|---|
CN113341743B (en) | 2023-11-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113341743B (en) | Smart home equipment control method and device, electronic equipment and storage medium | |
CN108919669B (en) | Intelligent home dynamic decision method and device and service terminal | |
US11782590B2 (en) | Scene-operation method, electronic device, and non-transitory computer readable medium | |
CN106487928B (en) | Message pushing method and device | |
WO2022267671A1 (en) | Air conditioner operation mode pushing method and apparatus, and air conditioner | |
CN106647645A (en) | Method and system for home control adjustment | |
WO2020024506A1 (en) | Air conditioner control method and device, storage medium, and processor | |
US20140129032A1 (en) | Genetic learning for environmental control automation | |
CN111965985B (en) | Smart home equipment control method and device, electronic equipment and storage medium | |
CN109951363B (en) | Data processing method, device and system | |
CN110726231B (en) | Control method and device of air conditioner | |
WO2020228030A1 (en) | Device recommendation method and apparatus, electronic device, and storage medium | |
CN112037785B (en) | Control method and device of intelligent equipment, electronic equipment and storage medium | |
CN111965989B (en) | System updating method and device, intelligent home control panel and storage medium | |
CN109450745A (en) | Information processing method, device, intelligence control system and intelligent gateway | |
CN111817936A (en) | Control method and device of intelligent household equipment, electronic equipment and storage medium | |
CN114253147A (en) | Intelligent device control method and device, electronic device and storage medium | |
CN107634887A (en) | Message treatment method, device and intelligence control system | |
WO2018163022A1 (en) | Cooker hood and household interconnection control method based on cooker hood | |
CN114724558A (en) | Method and device for voice control of air conditioner, air conditioner and storage medium | |
CN118483941A (en) | Equipment control method, device, electronic equipment and storage medium | |
CN113852657A (en) | Intelligent home local control method and system based on edge calculation | |
CN112331190A (en) | Intelligent equipment and method and device for self-establishing voice command thereof | |
CN115479370A (en) | Air conditioner control method, device and equipment and air conditioner | |
CN113220991A (en) | Method, system, device and storage medium for automatically recommending switching scenes |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||