CN113341743B - Smart home equipment control method and device, electronic equipment and storage medium - Google Patents

Smart home equipment control method and device, electronic equipment and storage medium Download PDF

Info

Publication number
CN113341743B
CN113341743B CN202110633950.8A CN202110633950A CN113341743B CN 113341743 B CN113341743 B CN 113341743B CN 202110633950 A CN202110633950 A CN 202110633950A CN 113341743 B CN113341743 B CN 113341743B
Authority
CN
China
Prior art keywords
target
execution function
function
user
scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110633950.8A
Other languages
Chinese (zh)
Other versions
CN113341743A (en
Inventor
祝琼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Oribo Technology Co Ltd
Original Assignee
Shenzhen Oribo Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Oribo Technology Co Ltd filed Critical Shenzhen Oribo Technology Co Ltd
Priority to CN202110633950.8A priority Critical patent/CN113341743B/en
Publication of CN113341743A publication Critical patent/CN113341743A/en
Application granted granted Critical
Publication of CN113341743B publication Critical patent/CN113341743B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00Systems controlled by a computer
    • G05B15/02Systems controlled by a computer electric
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00Programme-control systems
    • G05B19/02Programme-control systems electric
    • G05B19/418Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/20Pc systems
    • G05B2219/26Pc applications
    • G05B2219/2642Domotique, domestic, home control, automation, smart house
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The embodiment of the application discloses an intelligent household equipment control method, an intelligent household equipment control device, electronic equipment and a storage medium, wherein the intelligent household equipment control method comprises the following steps: acquiring behavior information of a user, and determining a target scene based on the behavior information; acquiring a target execution function based on a target scene and a preset association relationship, wherein the preset association relationship comprises a first association relationship, and the target execution function comprises target equipment to be controlled and target actions to be operated; controlling target equipment to be controlled to execute target operation actions; and determining a function to be updated of the first association relation, and updating the first association relation based on the function to be updated. By utilizing the method and the system, the intelligent home equipment is automatically controlled according to the behavior information of the user, the user does not need to associate the functions and the scenes of the intelligent home equipment one by one, the first association relation is dynamically updated, and the use experience of the user is improved.

Description

Smart home equipment control method and device, electronic equipment and storage medium
Technical Field
The application relates to the technical field of household equipment, in particular to an intelligent household equipment control method and device, electronic equipment and a storage medium.
Background
In recent years, with the continuous improvement of economic level, the consumer's consumption ability is gradually improved, and simultaneously, the consumer's requirement for consumer products is also continuously improved. Because the intelligent device has the characteristics of intelligent control, energy saving and the like, the occupancy of the intelligent device in the market is also continuously improved. In the prior art, the intention of a user is generally obtained through a voice recognition mode, and an intelligent device which the user wants to control is determined, so that the intelligent device is controlled according to the intention, and intelligent control is realized. Although the intelligent equipment can be controlled conveniently in the mode, deviation is easy to occur in the process of knowing the intention of the user, different user intentions can exist for the same section of voice, so that the result of controlling the intelligent equipment cannot reach the expectation of the user, and therefore, in the actual control process of the intelligent equipment, the user is often required to configure the triggering condition of the intelligent equipment by himself, and inconvenience is brought to the user.
Disclosure of Invention
The embodiment of the application provides an intelligent household equipment control method and device, electronic equipment and a storage medium.
In a first aspect, an embodiment of the present application provides a method for controlling an intelligent home device, where the method includes: acquiring behavior information of a user, and determining a target scene based on the behavior information; acquiring a target execution function based on the target scene and a preset association relation, wherein the preset association relation comprises a first association relation, and the target execution function comprises target equipment to be controlled and target actions to be operated; controlling target equipment to be controlled to execute target operation actions; and determining a function to be updated of the first association relation, and updating the first association relation based on the function to be updated.
In a second aspect, an embodiment of the present application provides an intelligent home device control apparatus, where the intelligent home device control apparatus may include a target scene determining module, a target execution function obtaining module, a function executing module, and an updating module. The target scene determining module is used for acquiring behavior information of the user and determining a target scene based on the behavior information. The target execution function acquisition module is used for acquiring a target execution function based on the target scene and a preset association relation, wherein the preset association relation comprises a first association relation, and the target execution function comprises target equipment to be controlled and target action to be operated. The function execution module is used for controlling the target equipment to be controlled to execute the target action to be operated. The updating module is used for updating the function to be updated of the first association relation, and the first association relation is updated based on the function to be updated.
In a third aspect, an embodiment of the present application provides an electronic device, including one or more processors and a memory; one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the methods described above.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having program code stored therein, wherein the above-described method is performed when the program code is run.
The embodiment of the application provides an intelligent household equipment control method and device, electronic equipment and a storage medium. Through implementation of the intelligent home equipment control method provided by the embodiment of the application, when the target scene is acquired according to the user behavior information, the target execution function can be acquired based on the first association relation, and then the target equipment to be controlled is controlled, so that the environment where the user is positioned is adjusted, and after the target equipment to be controlled is controlled based on the target execution function, the habit preference of the user is learned, and the first association relation is updated, so that when the same target scene appears later, the target execution function corresponding to the target scene can be inquired in the first association relation, and then the working state of the intelligent home equipment is automatically controlled, and the user does not need to control each intelligent home equipment independently, and does not need to associate the equipment function and the scene of the intelligent home equipment one by one, so that the first association relation is dynamically updated, the convenience of the user in using the intelligent home equipment is improved, and the use experience of the user is improved.
The following describes a specific control method of the smart home device. Based on the application environments and the system introduced above, embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Drawings
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
FIG. 1 illustrates a schematic diagram of an application environment suitable for use with embodiments of the present application.
Fig. 2 shows a schematic diagram of a control panel suitable for use in an embodiment of the application.
Fig. 3 shows a flow diagram of a smart home device control method according to an embodiment of the application.
Fig. 4 is a flow chart illustrating steps for determining a target execution function in the method shown in fig. 3.
Fig. 5 shows a flow chart of the step of obtaining the second execution function in the method shown in fig. 4.
Fig. 6 shows a further flow diagram of the step of obtaining the second execution function in the method shown in fig. 4.
Fig. 7 shows a further flow diagram of the step of obtaining the target execution function in the method shown in fig. 4.
Fig. 8 is a flowchart illustrating a step of updating the first association relationship in the method shown in fig. 3.
Fig. 9 shows a functional block diagram of a smart home device control apparatus according to an embodiment of the present application.
Fig. 10 shows a functional block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
In the current control method of the smart home devices, a user generally creates trigger conditions of each smart home device through a control panel in advance, for example, the user needs to manually input, bind a specific scene with the working state of each smart home device, and when the specific scene appears, each smart home device enters the working state bound corresponding to the specific scene. Although the mode can automatically control the intelligent household equipment to carry out the corresponding binding working state when a specific scene appears, the premise of the mode is that a user needs to create the binding relation between the scene and the working state of the intelligent household equipment in advance, the mode is seriously dependent on the operation of the user, and when the conditions of more intelligent household equipment, more newly accessed intelligent household equipment, more scenes and the like occur, the user needs to carry out various operations so as to bind the scenes with the working state of the intelligent household equipment, and great inconvenience is brought to the user.
In order to avoid the problems, the inventor of the present application has made a great deal of research and has proposed the control method of the smart home device of the present application, the control method of the smart home device includes: acquiring behavior information of a user, and determining a target scene based on the behavior information; acquiring a target execution function based on the target scene and a preset association relation, wherein the preset association relation comprises a first association relation, and the target execution function comprises target equipment to be controlled and target actions to be operated; controlling target equipment to be controlled to execute target operation actions; and determining a function to be updated of the first association relation, and updating the first association relation based on the function to be updated. Through implementation of the intelligent home equipment control method provided by the embodiment, when the target scene is acquired, the target execution function can be acquired based on the first association relation, and then the target equipment to be controlled is controlled so as to adjust the environment where the user is located, when the target equipment to be controlled is controlled based on the target execution function, the first association relation can be updated, so that when the same target scene appears later, the target execution function corresponding to the target scene can be queried in the first association relation, habit preference of the user is learned, and further the working state of the intelligent home equipment is automatically controlled, without independently controlling each intelligent home equipment by the user, and without associating the equipment functions and the scenes of the intelligent home equipment one by the user, thereby dynamically updating the first association relation, greatly improving the convenience of the user in using the intelligent 
home equipment and rapidly responding to the requirement of the user.
Firstly, an application environment of the intelligent home equipment control method provided by the embodiment of the application is introduced.
Referring to fig. 1, fig. 1 shows an application scenario schematic diagram of an intelligent home device control method according to an embodiment of the present application, where the intelligent home device control method may be applied to an intelligent home system 100, and the intelligent home system 100 may include a control panel 101 and an intelligent home device 102.
In the embodiment of the present application, the control panel 101 includes an intelligent control device for controlling the intelligent home device 102, where the intelligent control device may implement functions such as collection of system information, information input, information output, relay transmission, relay control, centralized control, remote control, coordinated control, and remote call. The intelligent control equipment can also be responsible for specific security alarm, household appliance control, electricity consumption information acquisition and the like. The intelligent control device can also interact data with products such as intelligent interaction terminals in a wireless mode or a wired mode. It should be noted that fig. 1 illustrates an application environment, and the number of control panels 101 in the drawing may be increased or decreased as required.
Referring to fig. 2, the control panel 101 may include a control panel body 1011, where the control panel body 1011 may be a cylinder, a regular frame, an irregular frame, or the like, alternatively, the control panel body 1011 may be a regular frame, for example, a cuboid, a cube, or the like, and one side of the control panel body 1011 may be provided with a fixing portion to fix the control panel body 1011 to a target object, for example, to fix the control panel body 1011 to a wall, to a door, or the like.
Further, the control panel 101 may include at least one virtual key disposed on the control panel body 1011. Specifically, the control panel body 1011 may include a touch screen 1013, at least one virtual key being disposed on the touch screen 1013, wherein the touch screen 1013 is used to display screen information outputted by the control panel 101 and is used for a user to touch operations, for example, touching at least one virtual key. The control panel 101 may further include at least one physical key 1012, the at least one physical key 1012 being disposed on the control panel body 1011. The physical keys and the virtual keys can be associated with different operation instructions to realize different operations.
The control panel 101 may include devices for specific information input and information output capabilities, among other things. For example, the control panel 101 may include one or more of various devices such as smart home control panels, smart speakers, smart televisions, smart phones, smart tablets, notebook computers, personal computers (Personal Computer, PCs), personal digital assistants (Personal Digital Assistant, PDAs), mobile internet devices (Mobile Internet Device, MIDs), wearable devices (e.g., smart watches, smart bracelets, smart glasses), and the like. Additionally, in some examples, the operating system run by the control panel 101 may include, but is not limited to, an Android (Android) operating system, an IOS operating system, a Symbian (sain) operating system, a UNIX operating system, a Linux operating system, a QNX operating system, a Black Berry operating system, a Windows Phone 8 operating system, and the like.
In this embodiment, the smart home device 102 may establish a communication connection with the control panel 101. When the control panel 101 calls the smart home device 102, the smart home device 102 may interact with the control panel 101. The smart home device 102 should also have the ability to input information and output information. For example, smart home device 102 may be a visualization manager, a listening manager, or the like.
In this embodiment, the control panel 101 and the smart home device 102 may be separately provided, or may be in an integrated structure, where the control panel 101 and the smart home device 102 are not particularly limited. In addition, the smart home system 100 may further include a server, where the server may establish a communication connection with the control panel 101, where the control panel 101 may be configured to receive an instruction sent by a user, and may also be configured to receive a control signal sent by the server, where the control signal is configured to control an operating state of the smart home device 102, and where the server may receive the instruction sent by the control panel 101, perform recognition analysis based on the instruction, thereby generating a control signal, and send the control signal to the control panel 101 to control the smart home device 102. It should be noted that, part or all of the steps in the control method for the smart home device provided in the present embodiment may be performed in the server, or may be performed in the control panel 101, which is not limited herein.
The embodiment of the application provides an intelligent household equipment control method, which is applied to a control panel 101 in an intelligent household system 100, can be applied to a server, and can be simultaneously applied to the control panel 101 and the server. The following describes a specific control method of the smart home device. Based on the application environments and the system introduced above, embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Referring to fig. 3, in the method for controlling an intelligent home device according to the embodiment of the present application, once triggered, the flow of the method in the embodiment may be automatically operated, where each step may be performed sequentially in the order shown in the flowchart when operating, or may be performed simultaneously according to the actual situation, and the method is not limited herein. The smart home device control method may include the following steps S11 to S14.
Step S11: and acquiring behavior information of the user, and determining a target scene based on the behavior information.
In this embodiment, the behavior information may include information related to the behavior of the user. For example, the behavior information may include actions of the user, voice, and the like. In some examples, the smart home device may receive the user's voice through a microphone, or may obtain the user's gesture through a camera to obtain the user's action. In other examples, the user's operating instructions may also be obtained by a sensor that may generate corresponding electrical signals based on the user's operation, and thus the electrical signals are used to characterize the user's actions. For example, the sensor may include a wave switch, a resistive touch switch, a capacitive touch switch, or the like.
In this embodiment, the target scene may include a scene characterized by behavior information, that is, the target scene may coincide with a scene desired by the user. For example, the target scene may include a sleep scene, a wake scene, a wash scene, and the like.
In some examples, when the behavior information is voice, the control panel may perform voice recognition on the voice to obtain a voice recognition result, and obtain a target scene represented by the voice from the voice recognition result. When the behavior information is the action of the user, the intelligent home equipment can perform image recognition on an image containing the action to obtain an image recognition result, and a target scene represented by the action is obtained from the image recognition result, wherein the image can be one or more frames of images; when the behavior information is an operation instruction obtained through a sensor, a target scene corresponding to the operation instruction can be obtained from a preset mapping table based on the operation instruction, wherein the mapping table can comprise the operation instruction and the scene, and a mapping relation exists between the operation instruction and the scene. For example, when the user makes a voice "sleep" or when the user's action is "go to bed" at this time, both the voice "sleep" and the action "go to bed" may be used as behavior information, and the target scene may be a sleep scene.
Step S12: and acquiring a target execution function based on the target scene and a preset association relationship, wherein the preset association relationship comprises a first association relationship, and the target execution function comprises target equipment to be controlled and target actions to be operated.
In this embodiment, the target device to be controlled may be one or more smart home devices, and the target operation to be performed may include an operation performed by the smart home device to implement a target execution function.
In this embodiment, the first association relationship may include a mapping relationship between the target scene and the device function, where the mapping relationship may be configured in advance by the user or may be determined based on a user history operation/selection record. When the target scene is acquired, the device function having a mapping relation with the target scene can be queried in the first association relation, and the device function is used as a target execution function, wherein the device function can comprise working parameters such as a working period, a working mode, working power and the like of the device.
In some examples, the first association may include a mapping between the scene and the environment and a mapping between the environment and the device function. The environment may be used to characterize properties of the space in which the user is located, for example, the environment may include darkness, silence, low temperature, etc. For example, when the target scene is a sleep scene, the target environment corresponding to the target scene "sleep scene" is dark, the device function corresponding to the target environment "dark" is turned off, the device function "turn off" is determined as the target execution function, and the target execution function includes the target device to be controlled "light" and the target action to be operated "power-off".
In some examples, the first association may include a mapping between scene-environment variable-device function. The corresponding device functions (sets) of the same scene under different environment variables are not identical. The environment variable may be used to characterize an environment in which the user is located, for example, the environment variable may include weather, time, temperature, brightness, and the like. For example, when the target scene is a getting up scene and the current environment variable is 5 points or cloudy days, the corresponding recommended device function includes turning on a light, so that the device function is determined to be a target execution function; when the target scene is a get-up scene and the environment variable is 10 points or sunny days, the corresponding recommended device function comprises a curtain opening, so the device function 'curtain opening' is determined as the target executing function.
In some examples, when the target scene is acquired, the target execution function corresponding to the target scene may be directly acquired based on the first association relationship, and before the step S11, the target execution function may be a device function executed by the user or executed by default in the target scene.
In some examples, when the target scene is acquired, when the target execution function cannot be acquired from the first association relationship, the target execution function may also be acquired based on the second association relationship; or, when the target scene is acquired, the target execution function may be determined based on the first association relationship and the second association relationship. In this embodiment, the second association relationship may include a scene and a device function generated based on an attribute recommendation of the scene, and when the target scene is acquired, a target execution function suitable for the target scene may be recommended based on the attribute of the target scene, and/or information of a user may be acquired, and a target execution function suitable for the target scene may be recommended based on the information of the user and the attribute of the target scene. In some examples, the target execution function may be generated by way of a user population subdivision model, an expert model, and the like. The manner of acquiring the target execution function from the preset association relationship is not particularly limited herein.
Step S13: and controlling the target equipment to be controlled to execute the target action to be operated.
In this embodiment, the target device to be controlled may be controlled by the target execution function to execute the target operation to be performed. It should be noted that the target execution function may be one or more, and specifically, the target execution function may include a plurality of actions to be operated of one target device to be controlled, or may include actions to be operated of a plurality of target devices to be controlled.
In some examples, when the smart home device control method provided in this embodiment is applied to the smart home system as shown in fig. 1, the control panel may send a corresponding target execution function to the target device to be controlled (smart home device), so that the target device to be controlled executes the target operation to be performed.
Step S14: and determining a function to be updated of the first association relation, and updating the first association relation based on the function to be updated.
After the target execution function is determined based on the first association relationship, it may be determined whether the first association relationship contains a function to be updated. In some examples, the function to be updated corresponding to the first association relationship is determined by detecting the user's subsequent device operations in the current target scene, or by consulting association data other than the first association relationship, and the first association relationship is then updated accordingly.
In some examples, the behavior of the user may be continuously monitored, and the first association relationship may be updated based on the monitoring information obtained. The monitoring manner may be active acquisition (such as actively querying the user, collecting the user's audio through a microphone, or collecting images of the user through a camera) or passive acquisition (such as acquiring the user's operation instructions through keys or a touch screen, or acquiring the control log data of each smart home device). The type of the monitoring information corresponds to the monitoring manner; that is, the monitoring information may include audio, images, control instructions, control log data, and the like.
Specifically, in the target scene, the device functions of one or more smart home devices controlled by the user's operations are determined based on the monitoring information. If a device function is not included in the first association relationship, the device function and the mapping relationship between the device function and the target scene are added to the first association relationship; if only the mapping relationship between the device function and the target scene is missing, that mapping relationship is added. In addition, the target scene and the device functions under the target scene may be acquired in real time based on the monitoring information: when a new target scene is acquired, the device functions executed by each smart home device under that scene may be mapped to the scene, and the target scene, the device functions, and the mapping relationships between them may all be added to the first association relationship.
In some examples, if the target execution function is determined by the second association relationship in the preset association relationship, or jointly by the first and second association relationships, it may be determined whether a mapping relationship between the target execution function and the target scene already exists in the first association relationship. If it does, the step of updating the target execution function into the first association relationship need not be performed; if it does not, the target execution function is added to the first association relationship and a mapping relationship between the target execution function and the target scene is constructed.
In this embodiment, through implementation of steps S11 to S14, when the target scene is acquired, the target execution function can be acquired based on the preset association relationship to control the target device to be controlled and thereby adjust the environment in which the user is located. When the target device to be controlled is controlled based on the target execution function, the first association relationship can be updated, so that when the same target scene appears later, the corresponding target execution function can be queried in the first association relationship. In this way the user's habits and preferences are learned and the working states of the smart home devices are controlled automatically: the user neither needs to control each smart home device individually nor needs to associate the device functions of the smart home devices with scenes one by one. Because the first association relationship is updated dynamically, the convenience of using the smart home devices is greatly improved and the user's requirements are responded to quickly.
The embodiment of the application also provides a control method for smart home devices; as shown in fig. 4, the method may comprise the following steps S21 to S24. The method in this embodiment may include steps that are the same as or similar to those in the foregoing embodiment; for such steps, reference may be made to the foregoing description, which is not repeated herein.
Step S21: and acquiring behavior information of the user, and determining a target scene based on the behavior information.
Step S22: and acquiring a target execution function based on the target scene and a preset association relationship, wherein the preset association relationship comprises a first association relationship and a second association relationship, and the target execution function comprises target equipment to be controlled and target action to be operated.
Further, as an implementation manner of this embodiment, when there is an intersection between the execution function acquired through the first association relationship and the execution function acquired through the second association relationship, the target execution function may be determined based on the priorities of the two relationships. Because the first association relationship contains a direct mapping relationship between the scene and the device function, it may be given a higher priority than the second association relationship. As shown in fig. 4, step S22 may include the following steps S221 to S223.
Step S221: and acquiring a first executive function corresponding to the target scene based on the first association relation.
In this embodiment, the first association relationship may include a scene, a device function, and a mapping relationship between the scene and the device function. When the target scene is acquired, the device function corresponding to the target scene may be queried in the first association relationship and used as the first execution function. The device function may include working parameters such as the working period, working mode, and working power of the device; that is, the working process of the smart home device may be determined based on the first execution function. For example, when the target scene is "low temperature", the first execution function corresponding to that scene in the first association relationship includes setting the smart home device "air conditioner" to cooling mode at a temperature of 25 ℃.
In some examples, the first association relationship may include a scene, an environment, a device function, a mapping relationship between the scene and the environment, and a mapping relationship between the environment and the device function. The environment may be used to characterize properties of the space in which the user is located; for example, the environment may include darkness, silence, low temperature, and so on. For example, when the target scene is sleeping, the first association relationship can be queried to find that the target environment corresponding to the "sleep" scene is dark, and the first execution function corresponding to the environment "dark" can then be queried; this first execution function includes turning off the smart home device "lamp".
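Both lookup paths — the direct scene-to-function mapping and the indirect scene-to-environment-to-function mapping — can be sketched together. All table contents and names below are illustrative assumptions:

```python
# Hypothetical sketch of step S221: querying the first execution function
# from the first association relationship, either via the direct
# scene -> function mapping or via scene -> environment -> function.
# A function is a (device, action, parameter) tuple; parameter may be None.

SCENE_TO_FUNCTIONS = {"low_temperature": [("air_conditioner", "cool", 25)]}
SCENE_TO_ENV = {"sleep": "dark"}
ENV_TO_FUNCTIONS = {"dark": [("light", "off", None)]}

def first_execution_function(scene):
    if scene in SCENE_TO_FUNCTIONS:           # direct scene -> function mapping
        return SCENE_TO_FUNCTIONS[scene]
    env = SCENE_TO_ENV.get(scene)             # scene -> environment
    return ENV_TO_FUNCTIONS.get(env, [])      # environment -> function

print(first_execution_function("sleep"))
```

The indirect path lets several scenes share one environment's device functions, which is why the environment layer appears in the first association relationship at all.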
Step S222: and acquiring a second execution function corresponding to the target scene based on the second association relation.
Further, as an implementation of this embodiment, the second execution function may be determined based on the group to which the user belongs. The second association relationship may include group data; as shown in fig. 5, step S222 may include the following steps S2221 to S2222.
Step S2221: and acquiring the identity information of the user, and determining the group to which the user belongs based on the identity information.
In this embodiment, the identity information may include data characterizing the identity of the user. For example, the identity information may include an identification number, an account ID registered on the control panel, gender, ethnicity, age, preferences preset by the user, and the like; the content of the identity information is not particularly limited herein.
In this embodiment, the group to which the user belongs may be used to characterize the type of the user. The identity information of the user may then be portrait data of the user, and users are classified based on that portrait data. For example, when the identity information is age, the groups may be infants, young children, teenagers, young adults, and so on; when the identity information is the user's behavior preference, the groups may be, for example, users who have not formed an exercise habit, users who enjoy exercising, and users who exercise only reluctantly. The criteria for dividing users into groups are not particularly limited herein.
It should be noted that, depending on the division criteria, the same user may be divided into one or more different groups; that is, the group to which the user belongs acquired in step S2221 may be one or more groups.
Step S2222: and determining a group execution function corresponding to the user from group data based on the target scene and the group to which the user belongs, and taking the group execution function as a second execution function.
In this embodiment, the group data may include the preference degrees of different types of user groups for the device functions of the smart home devices under the same target scene. For example, suppose data statistics show that a user is in a high-temperature environment and the target scene is determined to be sleeping based on the user's behavior information. If the user belongs to an elderly group, that group prefers turning on a fan over air-conditioning refrigeration, so its preference degree for the air-conditioning device function is smaller than that for the fan device function; if the user belongs to a young-adult group, that group prefers air-conditioning refrigeration over turning on the fan, so its preference degree for the air-conditioning device function is larger than that for the fan device function.
In this embodiment, the group execution function may include the device functions that the user's group prioritizes most highly in the target scene. The device functions ranked within a preset rank may be used as group execution functions by scoring the priority of each device function. For example, the score of a device function may be determined from the execution result of the device function and the scene attributes of the target scene, the score characterizing the priority of the device function.
In this embodiment, through implementation of steps S2221 to S2222, the device functions corresponding to the group data and the target scene may be obtained, which improves the matching degree between the acquired second execution function and the user and thus the user's experience.
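Steps S2221 to S2222 can be sketched as a scored lookup over group data. The scores, group names, and the top-k cutoff below are illustrative assumptions:

```python
# Hypothetical sketch: selecting the group execution function from group
# data, i.e. per-group preference scores for device functions in the same
# target scene. Scores and group labels are invented for illustration.

GROUP_DATA = {
    ("sleep_high_temp", "elderly"): {("fan", "on"): 0.9, ("ac", "cool"): 0.4},
    ("sleep_high_temp", "young_adult"): {("ac", "cool"): 0.9, ("fan", "on"): 0.5},
}

def group_execution_function(scene, group, top_k=1):
    prefs = GROUP_DATA.get((scene, group), {})
    ranked = sorted(prefs, key=prefs.get, reverse=True)  # score = priority
    return ranked[:top_k]            # keep functions ranked within top_k

print(group_execution_function("sleep_high_temp", "elderly"))
```

For the elderly group the fan wins, and for the young-adult group the air conditioner wins, matching the preference statistics described above.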
Further, as an implementation manner of this embodiment, the second association relationship may further include scene generic data, where the group data has a higher priority than the scene generic data. Step S222 may then further include: when no group execution function corresponding to the user exists in the group data, determining the generic execution function corresponding to the user from the scene generic data based on the target scene, and using the generic execution function as the second execution function.
In this embodiment, the scene generic data may include the device functions with higher priority in the target scene; the scene generic data can be regarded as an expert database of device functions for the smart home devices. The device functions ranked within a preset rank may be used as generic execution functions by scoring the priority of each device function. For example, the score of a device function may be determined from the execution result of the device function and the scene attributes of the target scene, the score characterizing the priority of the device function.
In some examples, when the user is in a high-temperature environment and the target scene is determined to be a "sleep" scene based on the user's behavior information, the scene generic data may indicate increasing the airflow by turning on a fan, cooling the room by air-conditioning refrigeration, and turning off the lamp; the generic execution functions then include turning on the fan, air-conditioning cooling, and turning off the lamp.
In this embodiment, the generic execution function is used as the second execution function only when no group execution function corresponding to the user exists in the group data.
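The priority between group data and scene generic data can be sketched as a simple fallback. The data tables and names below are illustrative assumptions:

```python
# Hypothetical sketch: the group data has higher priority than the scene
# generic ("expert database") data; fall back to the generic execution
# functions only when no group execution function exists for the user.

GROUP_DATA = {("sleep", "elderly"): [("fan", "on")]}
SCENE_GENERIC = {"sleep": [("fan", "on"), ("ac", "cool"), ("light", "off")]}

def second_execution_function(scene, group):
    group_fns = GROUP_DATA.get((scene, group))
    if group_fns:                         # group data takes priority
        return group_fns
    return SCENE_GENERIC.get(scene, [])   # fall back to scene generic data

print(second_execution_function("sleep", "teenager"))
```

A teenager has no entry in the group data, so the generic functions are returned; an elderly user gets the group-specific result instead.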
Step S223: the target execution function is determined based on the first execution function and the second execution function.
In this embodiment, both the first execution function and the second execution function may be determined as target execution functions, or only one of them may be determined as the target execution function.
In this embodiment, the priority of the first association relationship is higher than that of the second association relationship, so the first execution function acquired from the first association relationship has a higher priority than the second execution function. Specifically, when the smart home device corresponding to the first execution function is the same as that corresponding to the second execution function, the first execution function may be determined as the target execution function. For example, when the target scene is sleeping, the first execution function sets the air-conditioner temperature to 26 ℃ and the second execution function sets it to 25 ℃; both functions control the air conditioner, and the first execution function has the higher priority, so it is determined as the target execution function.
In this embodiment, through implementation of steps S221 to S223, the first execution function may be acquired based on the first association relationship, the second execution function based on the second association relationship, and the target execution function determined from the two. When the smart home device corresponding to the first execution function is the same as that corresponding to the second execution function, the first execution function may be used as the target execution function, so that the control result of the smart home devices better matches the user's expectations.
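The conflict-resolution rule — the first execution function wins when both functions target the same device — can be sketched as follows, with illustrative (device, action) tuples:

```python
# Hypothetical sketch of step S223: merging first and second execution
# functions. When both target the same device, the first (higher-priority)
# function is kept; second functions for other devices are kept as well.

def resolve_target(first_fns, second_fns):
    first_devices = {device for device, _ in first_fns}
    # drop second functions that conflict with a first function's device
    extra = [fn for fn in second_fns if fn[0] not in first_devices]
    return first_fns + extra

first = [("ac", "26C")]
second = [("ac", "25C"), ("fan", "on")]
print(resolve_target(first, second))
```

The conflicting air-conditioner setting from the second relationship is discarded, while the non-conflicting fan function survives alongside the first function.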
Further, as an implementation manner of this embodiment, to ensure that the acquired target execution function better matches the user's expectations, the target execution function may be determined based on the confidence of the second execution function. As shown in fig. 6, step S223 may include the following steps S2231 to S2232.
Step S2231: acquiring a second execution function that differs from the first execution function as a third execution function, and acquiring the confidence of the third execution function.
In the present embodiment, when the second execution function is the same as the first execution function, the first execution function may be regarded as the target execution function; when the second execution function is different from the first execution function, the second execution function can be used as a third execution function, and the confidence of the third execution function is obtained.
In this embodiment, the confidence may be used to characterize the probability that the third execution function should be determined as the target execution function. Specifically, the control log data of the user controlling the smart home devices may be acquired, the preference degrees for the controlled device functions determined from that log data, and the confidence of the third execution function obtained from those preference degrees. Alternatively, portrait data of many users may be obtained through big data, the preference degrees of those users for different device functions established, and the confidence of the third execution function obtained from them. The manner of obtaining the confidence of the third execution function is not particularly limited herein.
Step S2232: taking the third execution function whose confidence is greater than or equal to a preset threshold, together with the first execution function, as the target execution function.
In this embodiment, the preset threshold may be set based on actual requirements, for example, the preset threshold may be 0.8, 0.9, 0.95, etc., and the preset threshold is not particularly limited herein.
In this embodiment, through implementation of steps S2231 to S2232, the third execution function whose confidence is greater than or equal to the preset threshold, together with the first execution function, may be used as the target execution function, so that the control result of the smart home devices better matches the user's expectations.
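One of the confidence sources described above — preference frequencies taken from control log data — can be sketched minimally. The log format, the frequency-as-confidence mapping, and the 0.8 threshold are illustrative assumptions (the document names 0.8 only as one possible value):

```python
# Hypothetical sketch of steps S2231-S2232: derive a confidence for the
# third execution function from control-log preference counts, then keep it
# only if the confidence reaches the preset threshold.
from collections import Counter

def confidence(fn, control_log):
    counts = Counter(control_log)
    total = sum(counts.values())
    return counts[fn] / total if total else 0.0   # relative frequency

def target_functions(first_fn, third_fn, control_log, threshold=0.8):
    target = [first_fn]                # first function is always kept
    if confidence(third_fn, control_log) >= threshold:
        target.append(third_fn)        # third function passes the threshold
    return target

log = [("fan", "on")] * 9 + [("ac", "cool")]
print(target_functions(("light", "off"), ("fan", "on"), log))
```

With nine fan operations out of ten log entries, the fan function's confidence is 0.9 and it joins the target execution functions; the rarely used air-conditioner function would be filtered out.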
Further, as an implementation manner of the present embodiment, as shown in fig. 7, the above step S223 may further include the following steps S2233 to S2234.
Step S2233: and generating query information based on the third execution function with the confidence coefficient smaller than the preset threshold value, and receiving feedback information sent by the user based on the query information.
In this embodiment, the query information may include information for querying the user whether to execute the third execution function with the confidence level smaller than the preset threshold. The form of presentation of the query information may be any of audio, schematic images, text, characters, and the like, and is not particularly limited herein.
In this embodiment, the feedback information may include information sent by the user in response to the query information. The expression form of the feedback information may be any of video, image, an instruction manually issued by the user, and the like; the expression form of the feedback information is not particularly limited herein.
In some examples, when the third execution function is "turn the fan to speed three", the query information may be audio expressing "Turn the fan to speed three?", and the feedback information may be speech that accurately expresses the user's intention, such as "yes", "confirm", "OK", or "confirm turning the fan to speed three".
Step S2234: taking the third execution function for which the feedback information indicates consent as the target execution function.
In the present embodiment, whether or not to regard the third execution function as the target execution function may be determined based on the content of the feedback information.
In this embodiment, through implementation of steps S2233 to S2234, query information may be sent for a third execution function whose confidence is smaller than the preset threshold, the user's feedback information received, and whether to use the third execution function as the target execution function determined from the content of that feedback, so that the control result of the smart home devices better matches the user's expectations.
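The query-and-consent flow of steps S2233 to S2234 can be sketched as follows. The consent phrases and the callback interface are illustrative placeholders:

```python
# Hypothetical sketch: for a third execution function whose confidence is
# below the threshold, query the user and adopt the function only when the
# feedback indicates consent. Accepted phrases are invented placeholders.

CONSENT = {"yes", "ok", "confirm"}

def apply_feedback(third_fn, ask_user):
    """ask_user is any callable returning the user's reply to the query."""
    query = f"Execute {third_fn}?"          # generated query information
    reply = ask_user(query).strip().lower()
    return third_fn if reply in CONSENT else None

result = apply_feedback(("fan", "speed_3"), lambda q: "OK")
print(result)
```

In a real system, `ask_user` would play audio or show text on a panel and capture the user's voice or touch reply; here a lambda stands in for that interaction.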
Step S23: and controlling the target equipment to be controlled to execute the target action to be operated.
Step S24: and determining a function to be updated of the first association relation based on the target execution function, and updating the first association relation based on the function to be updated so as to update the execution function corresponding to the target scene in the first association relation into the target execution function.
Further, as an implementation manner of the present embodiment, as shown in fig. 8, the above step S24 may include the following steps S241 to S243.
Step S241: and matching the target execution function with the first association relation.
Step S242: and screening target execution functions which are not matched with the first association relation, and taking the target execution functions as functions to be updated.
Step S243: and updating the first association relation based on the function to be updated and the target scene.
In this embodiment, when the first association relationship includes a scene, a device function, and a mapping relationship between them, the target execution functions not matched with the first association relationship are used as the functions to be updated. If a function to be updated does not exist in the first association relationship, it is added to the first association relationship and mapped to the target scene; if it already exists in the first association relationship, it is directly mapped to the target scene.
In this embodiment, when the first association relationship includes a scene, an environment, a device function, a mapping relationship between the scene and the environment, and a mapping relationship between the environment and the device function, if the first association relationship does not include the function to be updated, the function to be updated is added to the first association relationship, and the function to be updated is mapped and associated with the corresponding environment; and if the function to be updated exists in the first association relation, mapping and associating the function to be updated with the corresponding environment.
In this embodiment, when the first association relationship includes a scene, an environment, a device function, a mapping relationship between the scene and the environment, and a mapping relationship between the environment and the device function, if a target environment corresponding to a target scene does not exist in the first association relationship, the target scene may be added to the first association relationship, and the target environment, the target scene, and the function to be updated are mapped and associated respectively; if the target environment corresponding to the target scene exists in the first association relation, the target environment, the target scene and the function to be updated can be mapped and associated respectively.
In this embodiment, through implementation of steps S241 to S243, the function to be updated may be added to the first association relationship, so that when the target scene is acquired later, the function can be obtained from the first association relationship to control the corresponding smart home devices in the same scene, improving the user experience.
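Steps S241 to S243 — matching, screening, and updating — can be sketched together. The scene-to-set layout and all names are illustrative assumptions:

```python
# Hypothetical sketch of steps S241-S243: match the target execution
# functions against the first association relationship, screen out the
# unmatched ones as functions to be updated, and map them to the target
# scene.

def update_from_targets(first_assoc, scene, target_fns):
    known = first_assoc.setdefault(scene, set())
    to_update = [fn for fn in target_fns if fn not in known]  # screening
    known.update(to_update)          # map functions to the target scene
    return to_update

assoc = {"sleep": {("light", "off")}}
pending = update_from_targets(assoc, "sleep", [("light", "off"), ("ac", "26C")])
print(pending)
```

Only the air-conditioner function is new, so it alone becomes a function to be updated, and the association for the "sleep" scene now covers both functions.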
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and the sequence numbers should not limit the implementation of the embodiments of the present application.
The embodiment of the application also provides a smart home device control apparatus corresponding approximately one-to-one to the smart home device control method described above.
Referring to fig. 9, a smart home device control apparatus provided by an embodiment of the present application may include a target scene determining module 71, a target execution function obtaining module 72, a function execution module 73, and an updating module 74. The target scene determining module 71 is configured to obtain behavior information of a user and determine a target scene based on the behavior information. The target execution function obtaining module 72 is configured to obtain a target execution function based on the target scene and a preset association relationship, where the preset association relationship includes a first association relationship, and the target execution function includes a target device to be controlled and a target action to be operated. The function execution module 73 is configured to control the target device to be controlled to execute the target action to be operated. The updating module 74 is configured to determine a function to be updated of the first association relationship, and update the first association relationship based on the function to be updated.
Further, as an implementation manner of the present embodiment, the update module 74 may include a matching unit, a function to be updated obtaining unit, and a first association relation updating unit. The matching unit is used for matching the target execution function with the first association relation. The function to be updated acquisition unit is used for screening target execution functions which are not matched with the first association relation and taking the target execution functions as the functions to be updated. The first association relation updating unit is used for updating the first association relation based on the function to be updated and the target scene.
Further, as an implementation manner of the present embodiment, the preset association relationship includes a second association relationship, and the first association relationship has a higher priority than the second association relationship, and the target execution function obtaining module 72 may include a first execution function obtaining unit, a second execution function obtaining unit, and a target execution function determining unit. The first executive function acquisition unit is used for acquiring a first executive function corresponding to the target scene based on the first association relation. The second execution function acquisition unit is used for acquiring a second execution function corresponding to the target scene based on a second association relation. The target execution function determining unit is used for determining a target execution function based on the first execution function and the second execution function.
Further, as an implementation manner of the present embodiment, the target execution function determining unit may include a confidence acquiring subunit and a first target execution function acquiring subunit. The confidence acquiring subunit is configured to acquire a second execution function different from the first execution function, and as a third execution function, acquire a confidence of the third execution function. The first target execution function acquisition subunit is configured to use the third execution function and the first execution function, where the confidence coefficient is greater than or equal to a preset threshold, as target execution functions.
Further, as an implementation manner of this embodiment, the target execution function determining unit may include a feedback information receiving subunit and a second target execution function obtaining subunit. The feedback information receiving subunit is configured to generate query information based on a third execution function whose confidence is smaller than the preset threshold, and to receive feedback information sent by the user in response to the query information. The second target execution function obtaining subunit is configured to take the third execution function for which the feedback information indicates consent as the target execution function.
Further, as an implementation manner of this embodiment, the second execution function obtaining unit may include a group determining subunit and a group execution function acquiring subunit. The group determining subunit is configured to acquire the identity information of the user and determine the group to which the user belongs based on the identity information. The group execution function acquiring subunit is configured to determine the group execution function corresponding to the user from the group data based on the target scene and the group to which the user belongs, and to use the group execution function as the second execution function.
Further, as an implementation manner of the present embodiment, the second association relationship may further include scene general data, a priority of the group data is greater than that of the scene general data, and the second execution function obtaining unit may further include a general execution function obtaining subunit, where the general execution function obtaining subunit is configured to determine, when the group execution function corresponding to the user does not exist in the group data, a general execution function corresponding to the user from the scene general data based on the target scene, and use the general execution function as the second execution function.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of each module in the apparatus described above may refer to the corresponding process in the foregoing method embodiments, which is not repeated herein.
In the several embodiments provided by the present application, the coupling, direct coupling, or communication connection between modules shown or discussed may be implemented through some interfaces; the indirect coupling or communication connection between devices or modules may be electrical, mechanical, or in other forms.
In addition, each functional module in each embodiment of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated modules may be implemented in hardware or in software functional modules.
Referring to fig. 10, an electronic device 800 according to an embodiment of the present application includes: processor 810, communication module 820, memory 830, and bus. The processor 810, the communication module 820, and the memory 830 are connected to each other through buses and perform communication with each other. The bus may be an ISA bus, a PCI bus, an EISA bus, or the like. The buses may be divided into address buses, data buses, control buses, etc. Wherein:
a memory 830 for storing a program. In particular, the memory 830 may be used to store software programs as well as various data. The memory 830 may mainly include a program storage area and a data storage area, where the program storage area may store an application program required by at least one function; the program may include program code comprising computer operation instructions. In addition to programs, the memory 830 may also store messages that the communication module 820 needs to send, and the like. The memory 830 may comprise high-speed RAM, and may also include non-volatile memory, such as at least one disk memory.
The processor 810 is configured to execute programs stored in the memory 830. The program, when executed by the processor, implements the steps of the smart home device control method in the above embodiments.
The embodiment of the application also provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, it implements the processes of the above smart home device control method embodiments and can achieve the same technical effects, which are not repeated here to avoid repetition. The computer-readable storage medium may be, for example, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method of the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to these embodiments, which are illustrative rather than restrictive. In light of the present application, those of ordinary skill in the art may devise many other forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (8)

1. A smart home device control method, characterized by comprising:
acquiring behavior information of a user, and determining a target scene based on the behavior information;
acquiring a target execution function based on the target scene and a preset association relationship, wherein the preset association relationship comprises a first association relationship and a second association relationship, the target execution function comprises a target device to be controlled and a target action to be performed, and the priority of the first association relationship is higher than that of the second association relationship;
wherein the acquiring the target execution function based on the target scene and the preset association relationship comprises:
acquiring a first execution function corresponding to the target scene based on the first association relationship;
acquiring a second execution function corresponding to the target scene based on the second association relationship;
and determining the target execution function based on the first execution function and the second execution function;
wherein the determining the target execution function based on the first execution function and the second execution function comprises:
acquiring, as a third execution function, a second execution function that differs from the first execution function, and acquiring a confidence of the third execution function;
taking a third execution function whose confidence is greater than or equal to a preset threshold, together with the first execution function, as the target execution function;
controlling the target device to be controlled to perform the target action to be performed;
and determining a function to be updated of the first association relationship, and updating the first association relationship based on the function to be updated.
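Read as an algorithm, claim 1 merges a high-priority first association relationship with a lower-priority second one and admits the latter's extra candidates only when their confidence clears a preset threshold. A minimal Python sketch of that selection step — all names, the tuple representation of an execution function, the dict-based confidence lookup, and the 0.8 threshold are illustrative assumptions, not details from the patent:

```python
# Hypothetical sketch of claim 1's target-execution-function selection.
from typing import Dict, List, NamedTuple

class ExecFn(NamedTuple):
    device: str  # target device to be controlled
    action: str  # target action to be performed

CONFIDENCE_THRESHOLD = 0.8  # assumed value of the "preset threshold"

def select_target_functions(first_fns: List[ExecFn],
                            second_fns: List[ExecFn],
                            confidence: Dict[ExecFn, float]) -> List[ExecFn]:
    # Second-association functions absent from the first association
    # become the "third" execution functions of the claim.
    third_fns = [fn for fn in second_fns if fn not in first_fns]
    # Keep third functions whose confidence meets the threshold,
    # alongside every first-association function.
    kept = [fn for fn in third_fns
            if confidence.get(fn, 0.0) >= CONFIDENCE_THRESHOLD]
    return list(first_fns) + kept
```

For a "going to bed" scene, for instance, a learned `("light", "off")` from the first association would always run, while a second-association `("ac", "sleep_mode")` would run only if its confidence is high enough.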
2. The method of claim 1, wherein the determining the function to be updated of the first association relationship and updating the first association relationship based on the function to be updated comprise:
matching the target execution function against the first association relationship;
screening out, based on the preset association relationship, a target execution function that does not match the first association relationship, as the function to be updated;
and updating the first association relationship based on the function to be updated and the target scene.
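Claim 2's update step amounts to: execute, compare against what the first association already knows, and learn the difference. A hypothetical sketch — representing the first association relationship as a scene-to-set dict is an assumption for illustration only:

```python
# Hypothetical sketch of claim 2's update of the first association relationship.
def update_first_association(first_assoc, target_fns, scene):
    """Match executed target functions against the first association and
    learn the unmatched ones (the "functions to be updated")."""
    existing = first_assoc.setdefault(scene, set())
    # Screen out target functions not matched by the first association.
    to_update = [fn for fn in target_fns if fn not in existing]
    existing.update(to_update)  # record them for future occurrences of the scene
    return to_update
```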
3. The method of claim 1, wherein the determining the target execution function based on the first execution function and the second execution function further comprises:
generating query information based on a third execution function whose confidence is smaller than the preset threshold, and receiving feedback information sent by the user in response to the query information;
and taking, as a target execution function, a third execution function that the feedback information consents to execute.
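Claim 3 handles the low-confidence remainder by asking the user instead of silently dropping it. In the hypothetical sketch below, the query/feedback channel is abstracted as an `ask_user` callback; that callback, the tuple format, and the 0.8 default threshold are all illustrative assumptions:

```python
# Hypothetical sketch of claim 3's query-and-consent step.
def confirm_low_confidence(third_fns, confidence, ask_user, threshold=0.8):
    """Query the user about third execution functions below the threshold
    and keep only those the feedback consents to execute."""
    confirmed = []
    for device, action in third_fns:
        if confidence.get((device, action), 0.0) < threshold:
            # Generate query information and act on the user's feedback.
            if ask_user(f"Execute {action} on {device}?"):
                confirmed.append((device, action))
    return confirmed
```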
4. The method of claim 1, wherein the second association relationship comprises group data, and the acquiring the second execution function corresponding to the target scene based on the second association relationship comprises:
acquiring identity information of the user, and determining a group to which the user belongs based on the identity information;
and determining, from the group data, a group execution function corresponding to the user based on the target scene and the group to which the user belongs, and taking the group execution function as the second execution function.
5. The method of claim 4, wherein the second association relationship further comprises scene-general data, the group data has a higher priority than the scene-general data, and the acquiring the second execution function corresponding to the target scene based on the second association relationship further comprises:
when no group execution function corresponding to the user exists in the group data, determining, from the scene-general data, a general execution function corresponding to the user based on the target scene, and taking the general execution function as the second execution function.
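Claims 4 and 5 together describe a two-level lookup: group-specific data first, scene-general data as the fallback. A hypothetical sketch — keying group data by `(scene, group)` and scene-general data by scene alone are assumed representations:

```python
# Hypothetical sketch of the claims 4-5 lookup with priority fallback.
def second_execution_function(group_data, scene_general_data, scene, group):
    """Group data outranks scene-general data: return the group execution
    function when one exists, else fall back to the general one."""
    fn = group_data.get((scene, group))
    if fn is not None:
        return fn  # group execution function for this user's group
    return scene_general_data.get(scene)  # general execution function
```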
6. A smart home device control apparatus, characterized by comprising:
a target scene determining module, configured to acquire behavior information of a user and determine a target scene based on the behavior information;
a target execution function acquisition module, configured to acquire a target execution function based on the target scene and a preset association relationship, wherein the preset association relationship comprises a first association relationship and a second association relationship, the target execution function comprises a target device to be controlled and a target action to be performed, and the priority of the first association relationship is higher than that of the second association relationship;
the target execution function acquisition module being further configured to: acquire a first execution function corresponding to the target scene based on the first association relationship;
acquire a second execution function corresponding to the target scene based on the second association relationship;
and determine the target execution function based on the first execution function and the second execution function;
a function execution module, configured to control the target device to be controlled to perform the target action to be performed;
and an updating module, configured to determine a function to be updated of the first association relationship, and update the first association relationship based on the function to be updated.
7. An electronic device, comprising:
a memory;
one or more processors coupled to the memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs being configured to perform the smart home device control method of any one of claims 1 to 5.
8. A computer-readable storage medium having stored therein program code callable by a processor to perform the smart home device control method of any one of claims 1 to 5.
CN202110633950.8A 2021-06-07 2021-06-07 Smart home equipment control method and device, electronic equipment and storage medium Active CN113341743B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110633950.8A CN113341743B (en) 2021-06-07 2021-06-07 Smart home equipment control method and device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN113341743A CN113341743A (en) 2021-09-03
CN113341743B true CN113341743B (en) 2023-11-28

Family

ID=77474696

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110633950.8A Active CN113341743B (en) 2021-06-07 2021-06-07 Smart home equipment control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113341743B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113703334A (en) * 2021-09-22 2021-11-26 深圳市欧瑞博科技股份有限公司 Intelligent scene updating method and device
CN114124692B (en) * 2021-10-29 2024-03-22 青岛海尔科技有限公司 Intelligent equipment skill access method and device, electronic equipment and storage medium
CN114442536A (en) * 2022-01-29 2022-05-06 北京声智科技有限公司 Interaction control method, system, device and storage medium
CN114690731A (en) * 2022-03-09 2022-07-01 青岛海尔科技有限公司 Associated scene recommendation method and device, storage medium and electronic device
CN115268282A (en) * 2022-06-29 2022-11-01 青岛海尔科技有限公司 Control method and device of household appliance, storage medium and electronic device
CN115064171A (en) * 2022-08-18 2022-09-16 安徽立诺威智能科技有限公司 Voice awakening method and system for intelligent air disinfection equipment

Citations (7)

Publication number Priority date Publication date Assignee Title
WO2014147683A1 (en) * 2013-03-19 2014-09-25 パナソニック株式会社 Environment control system
CN104503253A (en) * 2014-12-17 2015-04-08 宇龙计算机通信科技(深圳)有限公司 Equipment control method, equipment control system and terminal
CN104852975A (en) * 2015-04-29 2015-08-19 北京海尔广科数字技术有限公司 Household equipment calling method and household equipment calling device
CN109597313A (en) * 2018-11-30 2019-04-09 新华三技术有限公司 Method for changing scenes and device
CN111338227A (en) * 2020-05-18 2020-06-26 南京三满互联网络科技有限公司 Electronic appliance control method and control device based on reinforcement learning and storage medium
CN111650842A (en) * 2020-05-09 2020-09-11 珠海格力电器股份有限公司 Household appliance control method and device
CN111752165A (en) * 2020-07-10 2020-10-09 广州博冠智能科技有限公司 Intelligent equipment control method and device of intelligent home system

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US11057238B2 (en) * 2018-01-08 2021-07-06 Brilliant Home Technology, Inc. Automatic scene creation using home device control



Similar Documents

Publication Publication Date Title
CN113341743B (en) Smart home equipment control method and device, electronic equipment and storage medium
CN108919669B (en) Intelligent home dynamic decision method and device and service terminal
CN107370649B (en) Household appliance control method, system, control terminal and storage medium
CN105471705B (en) Intelligent control method, equipment and system based on instant messaging
US11782590B2 (en) Scene-operation method, electronic device, and non-transitory computer readable medium
CN106487928B (en) Message pushing method and device
CN111025922B (en) Target equipment control method and electronic equipment
CN110793167B (en) Air conditioner control method and device
CN113531818B (en) Running mode pushing method and device for air conditioner and air conditioner
CN106371326B (en) Storage method and device for equipment working scene
CN111965989B (en) System updating method and device, intelligent home control panel and storage medium
WO2020228030A1 (en) Device recommendation method and apparatus, electronic device, and storage medium
CN112037785B (en) Control method and device of intelligent equipment, electronic equipment and storage medium
CN111965985A (en) Intelligent household equipment control method and device, electronic equipment and storage medium
CN112201242A (en) Method and device for waking up equipment, electronic equipment and storage medium
CN115793481A (en) Device control method, device, electronic device and storage medium
CN109974233B (en) Control method, control device, air conditioner system and storage medium
CN114724558A (en) Method and device for voice control of air conditioner, air conditioner and storage medium
CN110597074A (en) Control method and device of intelligent household equipment and intelligent panel
CN103905837A (en) Image processing method and device and terminal
CN114826805A (en) Computer readable storage medium, mobile terminal and intelligent home control method
CN112331190A (en) Intelligent equipment and method and device for self-establishing voice command thereof
TW202022650A (en) Method for controlling devices based on voice message and control terminal device
CN110425693B (en) Intelligent air conditioner and use method thereof
CN110347047B (en) Method, device and system for deleting equipment, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant