CN111352350B - Method and device for determining execution of smart home scene - Google Patents

Method and device for determining execution of smart home scene

Info

Publication number
CN111352350B
CN111352350B (application number CN202010180858.6A)
Authority
CN
China
Prior art keywords
action
scene
equipment
current
execution sequence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010180858.6A
Other languages
Chinese (zh)
Other versions
CN111352350A (en)
Inventor
王艳青 (Wang Yanqing)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hisense Co Ltd
Qingdao Hisense Smart Life Technology Co Ltd
Original Assignee
Hisense Co Ltd
Qingdao Hisense Smart Life Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hisense Co Ltd and Qingdao Hisense Smart Life Technology Co Ltd
Priority to CN202010180858.6A
Publication of CN111352350A (application publication)
Application granted
Publication of CN111352350B (granted patent)
Legal status: Active

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00: Systems controlled by a computer
    • G05B15/02: Systems controlled by a computer electric
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/26: Pc applications
    • G05B2219/2642: Domotique, domestic, home control, automation, smart house
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

An embodiment of the invention provides a method and a device for determining execution of a smart home scene. The method includes: acquiring to-be-executed scene information, which includes the action corresponding to the current action device involved in the to-be-executed scene; determining, according to that action, whether a device action the same as or opposite to it exists in the scene execution sequence; and, if so, when the action corresponding to the current action device is determined to have cancellation capability, deleting the same or opposite device action from the scene execution sequence and inserting the action corresponding to the current action device into the sequence according to its delay time.

Description

Method and device for determining execution of smart home scene
Technical Field
The embodiments of the present invention relate to the field of smart home technology, and in particular to a method and a device for determining execution of a smart home scene.
Background
At present, scenes in the smart home industry can be divided into click-type scenes, condition-triggered scenes and timed scenes. All of these involve a scene action, which is composed of one or more device actions. Some smart home solutions support setting a single delay time for the scene action as a whole: after the scene is triggered, the actions of the action devices involved in the scene are executed once the specified delay has elapsed. Other solutions additionally support setting a separate delay time for each action device in the scene. When a scene is executed, a future execution time is calculated for each device action from its delay time and the current time, the device actions are placed into an execution sequence ordered by execution time, and a dedicated thread monitors and processes that sequence to carry out the scene. However, in these solutions, when multiple scenes are executed in an interleaved manner, identical or opposite actions of the same action device conflict during execution, and there is no unified scene execution rule to govern them.
In summary, a method for determining execution of smart home scenes is needed to solve the prior-art problem that, when multiple scenes are executed in an interleaved manner, identical or opposite actions of the same action device conflict during execution.
Disclosure of Invention
An embodiment of the present invention provides a method and a device for determining execution of a smart home scene, which are used to solve the prior-art problem that, when multiple scenes are executed in an interleaved manner, identical or opposite actions of the same action device conflict during execution.
In a first aspect, an embodiment of the present invention provides a method for determining execution of an intelligent home scene, including:
acquiring scene information to be executed, wherein the scene information to be executed comprises actions corresponding to current action equipment related to a scene to be executed;
determining whether a device action which is the same as or opposite to the action corresponding to the current action device exists in a scene execution sequence according to the action corresponding to the current action device;
if so, deleting the device action which is the same as or opposite to the action corresponding to the current action device from the scene execution sequence when the action corresponding to the current action device is determined to have the cancellation capability, and inserting the action corresponding to the current action device into the scene execution sequence according to the delay time.
In this technical solution, it is determined, according to the action corresponding to the current action device, whether a device action the same as or opposite to it exists in the scene execution sequence. If so, and the action corresponding to the current action device has cancellation capability, the same or opposite device action is deleted from the scene execution sequence and the action corresponding to the current action device is inserted into the sequence according to its delay time, so that the current action becomes the device's final action and the conflict between interleaved scenes is resolved.
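As an illustration of this insert-with-cancellation rule, the following is a minimal Python sketch, not taken from the patent: the names `DeviceAction` and `insert_action` and the on/off opposite table are assumptions. It keeps the scene execution sequence ordered by execution time and deletes any queued action on the same device that is the same as or opposite to the new action before inserting it:

```python
import bisect
from dataclasses import dataclass, field

@dataclass(order=True)
class DeviceAction:
    execute_at: float                       # absolute execution time (now + delay)
    device_id: str = field(compare=False)   # which device the action targets
    action: str = field(compare=False)      # e.g. "on" / "off"
    cancels: bool = field(compare=False)    # cancellation capability flag

OPPOSITE = {"on": "off", "off": "on"}

def insert_action(sequence: list, new: DeviceAction) -> None:
    """Insert `new` into the time-ordered sequence. If it has cancellation
    capability, first delete every queued action on the same device that is
    the same as, or the opposite of, the new action."""
    if new.cancels:
        sequence[:] = [a for a in sequence
                       if not (a.device_id == new.device_id
                               and a.action in (new.action, OPPOSITE.get(new.action)))]
    bisect.insort(sequence, new)  # ordering uses execute_at only
```

With this sketch, a later "off" on a device removes a still-pending "on" for the same device, while actions without cancellation capability simply queue alongside it.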
Optionally, the method further comprises:
and if the scene execution sequence does not have the device action which is the same as or opposite to the action corresponding to the current action device, inserting the action corresponding to the current action device into the scene execution sequence according to the delay time.
In the above technical solution, when no device action the same as or opposite to the action corresponding to the current action device exists in the scene execution sequence, that action is simply inserted into the scene execution sequence according to its delay time, so that it can be executed smoothly when its execution time arrives.
Optionally, the method further comprises:
and when the action corresponding to the current action equipment is determined not to have the cancellation capability, inserting the action corresponding to the current action equipment into the scene execution sequence according to the delay time.
In the above technical solution, when the action corresponding to the current action device is determined not to have cancellation capability, it is inserted into the scene execution sequence according to its delay time, so that an action without cancellation capability can still be executed smoothly when its execution time arrives.
Optionally, before acquiring the scene information to be executed, the method further includes:
acquiring a setting instruction of action equipment related to each scene, wherein the setting instruction is used for setting whether the action of the action equipment related to each scene has cancellation capability or not;
and setting the action equipment related to each scene according to the setting instruction.
In this technical solution, before the to-be-executed scene information is acquired, a setting instruction for the action devices involved in each scene is acquired, and whether each action device, or the action corresponding to it, has cancellation capability is set according to that instruction. This makes it possible to resolve in advance the prior-art conflict between identical or opposite device actions when multiple scenes are executed in an interleaved manner, simplifies the judgment made when the actions of the devices involved in each scene are executed, and improves the user experience.
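One way to model this setting instruction, sketched in Python with hypothetical names (the patent does not specify a data model), is a scene-wide default that an individual action's own setting can override:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ActionConfig:
    device_id: str
    action: str
    delay_s: float = 0.0
    cancels: Optional[bool] = None   # None means: inherit the scene-wide setting

@dataclass
class Scene:
    name: str
    default_cancels: bool            # scene-wide setting from the setting instruction
    actions: list

    def cancellation_of(self, cfg: ActionConfig) -> bool:
        # A per-action setting, when present, overrides the scene-wide default.
        return cfg.cancels if cfg.cancels is not None else self.default_cancels
```

This mirrors the two granularities described above: set cancellation capability uniformly for the whole scene, or individually for a specific device action.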
Optionally, after inserting the action corresponding to the current action device into the scene execution sequence according to the delay time, the method further includes:
determining whether the action corresponding to the current action equipment is inserted into the starting position of the scene execution sequence;
and if so, deleting the timer of the action after the action corresponding to the current action equipment in the scene execution sequence, and triggering a scene execution processing thread to process the scene execution sequence.
In the above technical solution, after the action corresponding to the current action device is inserted into the scene execution sequence according to its delay time, it is determined whether the action was inserted at the start position of the sequence. If so, the timer of the action that follows the action corresponding to the current action device is deleted, and the scene execution processing thread is triggered to process the sequence, so that the thread immediately re-evaluates the sequence from its new head instead of waiting on a stale timer.
In a second aspect, an embodiment of the present invention further provides an apparatus for determining execution of an intelligent home scene, where the apparatus includes:
the device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring scene information to be executed, and the scene information to be executed comprises actions corresponding to current action equipment related to a scene to be executed;
the processing unit is used for determining whether a device action which is the same as or opposite to the action corresponding to the current action device exists in a scene execution sequence according to the action corresponding to the current action device; if so, deleting the device action which is the same as or opposite to the action corresponding to the current action device from the scene execution sequence when the action corresponding to the current action device is determined to have the cancellation capability, and inserting the action corresponding to the current action device into the scene execution sequence according to the delay time.
Optionally, the processing unit is further configured to:
and if the scene execution sequence does not have the equipment action which is the same as or opposite to the action corresponding to the current action equipment, inserting the action corresponding to the current action equipment into the scene execution sequence according to the delay time.
Optionally, the processing unit is further configured to:
and when determining that the action corresponding to the current action equipment does not have the cancellation capability, inserting the action corresponding to the current action equipment into the scene execution sequence according to the delay time.
Optionally, the processing unit is further configured to:
before acquiring scene information to be executed, acquiring a setting instruction of action equipment related to each scene, wherein the setting instruction is used for setting whether the action of the action equipment related to each scene has cancellation capability or not;
and setting the action equipment related to each scene according to the setting instruction.
Optionally, the processing unit is further configured to:
after the action corresponding to the current action equipment is inserted into the scene execution sequence according to the delay time, determining whether the action corresponding to the current action equipment is inserted into the initial position of the scene execution sequence;
and if so, deleting the timer of the action after the action corresponding to the current action equipment in the scene execution sequence, and triggering a scene execution processing thread to process the scene execution sequence.
In a third aspect, an embodiment of the present invention provides a computing device, including:
a memory for storing program instructions;
and the processor is used for calling the program instruction stored in the memory and executing the method for determining the intelligent household scene execution according to the obtained program.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, where computer-executable instructions are stored, and the computer-executable instructions are configured to enable a computer to execute a method for determining an execution of an intelligent home scene.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Clearly, the drawings described below show only some embodiments of the present invention, and other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
Fig. 1 is a schematic diagram of a system architecture according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a method for determining execution of an intelligent home scene according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of an action insertion scene execution sequence corresponding to an action device involved in a scene according to an embodiment of the present invention;
fig. 4 is a schematic flowchart illustrating a scene execution sequence processed by a scene execution processing thread according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an apparatus for determining execution of an intelligent home scene according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
Fig. 1 is a system architecture provided in an embodiment of the present invention. As shown in fig. 1, the system architecture may be a server 100 including a processor 110, a communication interface 120, and a memory 130.
The communication interface 120 is used to communicate with terminal devices, sending and receiving the information they transmit.
The processor 110 is a control center of the server 100, connects various parts of the entire server 100 using various interfaces and lines, performs various functions of the server 100 and processes data by running or executing software programs and/or modules stored in the memory 130 and calling data stored in the memory 130. Optionally, processor 110 may include one or more processing units.
The memory 130 may be used to store software programs and modules; the processor 110 performs various functional applications and data processing by running the software programs and modules stored in the memory 130. The memory 130 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, an application program required by at least one function, and the like, while the data storage area may store data created during business processing, etc. Further, the memory 130 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
It should be noted that the structure shown in fig. 1 is only an example, and the embodiment of the present invention is not limited thereto.
Based on the foregoing description, fig. 2 exemplarily shows a flow of a method for determining execution of an intelligent home scene, where the flow may be executed by a device for determining execution of an intelligent home scene, and the device may be located in the server 100 shown in fig. 1, or may be the server 100.
As shown in fig. 2, the process specifically includes:
step 201, scene information to be executed is acquired.
In this embodiment of the present invention, the to-be-executed scene information may include the action corresponding to the current action device involved in the to-be-executed scene. Before this information is acquired, a scene must be created, and it must be determined whether the action devices in the scene, or the actions corresponding to them, are configured with cancellation capability. When the scene is created, cancellation capability can be set uniformly for all action devices in the scene, or it can be set individually for the action corresponding to a specific action device. Cancellation capability means cancelling every queued device action that is the same as or opposite to the action corresponding to the current action device, i.e. treating the action corresponding to the current action device as the device's final action.
Step 202, determining whether a device action the same as or opposite to the action corresponding to the current action device exists in the scene execution sequence according to the action corresponding to the current action device.
In the embodiment of the invention, it is determined, according to the action corresponding to the current action device, whether a device action the same as or opposite to it exists in the scene execution sequence. If such a device action exists, it must be judged whether the action corresponding to the current action device has cancellation capability, and further processing is carried out according to that judgment. If no such device action exists, the action corresponding to the current action device is inserted into the scene execution sequence according to its delay time.
Step 203, if there is a device action that is the same as or opposite to the action corresponding to the current action device in the scene execution sequence, deleting the device action that is the same as or opposite to the action corresponding to the current action device from the scene execution sequence when it is determined that the action corresponding to the current action device has cancellation capability, and inserting the action corresponding to the current action device into the scene execution sequence according to the delay time.
In the embodiment of the invention, when the action corresponding to the current action device is determined to have cancellation capability, the device action the same as or opposite to it is deleted from the scene execution sequence and the action corresponding to the current action device is inserted according to its delay time; when the action is determined not to have cancellation capability, it is inserted into the scene execution sequence according to its delay time without any deletion. After the insertion, it is determined whether the action corresponding to the current action device was inserted at the start position of the scene execution sequence; if so, the timer of the action that follows it in the sequence is deleted, and the scene execution processing thread is triggered to process the sequence. Specifically, cancellation capability may be set, or not set, when the scene is created. When a device action has been given cancellation capability and scenes are executed in an interleaved manner such that the same device appears in both, the action of the later-executed scene's action device can cancel the not-yet-executed action of the earlier-executed scene's action device. The delay time specifies, at scene creation, how long the action of a designated action device is deferred: after the scene is triggered, the device action is executed once the specified time has elapsed. The timer is set according to the difference between the current time and the execution time.
For example, there are two scenes a and b: in scene a the air conditioner AC is turned on after 10 s, and in scene b it is turned off after 5 s. The user triggers scene a first, intending to turn the AC on after 10 s, and within 2 seconds also triggers scene b, intending to turn the AC off. When scene b is executed, the turn-on action queued by the earlier scene a for the AC's action device is deleted from the scene execution sequence, so that only scene b's action of turning the AC off is ultimately executed. Alternatively, when creating scene b, the user may withhold cancellation capability from all of scene b's action devices, or just from the action that turns the AC off. In that case, executing scene b inserts its turn-off action into the scene execution sequence while scene a's turn-on action remains queued; scene b's turn-off action is then executed first, and scene a's turn-on action is executed later.
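The air-conditioner example can be replayed with a small self-contained sketch (illustrative only, not the patent's implementation; times are seconds measured from "now", and `run_scene` is an invented helper):

```python
queue = []  # list of (execute_at, device, action), kept sorted by time

def run_scene(queue, device, action, delay, cancels, now):
    # Deleting queued same/opposite actions models cancellation capability.
    opposite = {"on": "off", "off": "on"}[action]
    if cancels:
        queue[:] = [e for e in queue
                    if not (e[1] == device and e[2] in (action, opposite))]
    queue.append((now + delay, device, action))
    queue.sort()

run_scene(queue, "AC", "on", 10, cancels=True, now=0)   # scene a triggered at t=0
run_scene(queue, "AC", "off", 5, cancels=True, now=2)   # scene b triggered at t=2
print(queue)  # [(7, 'AC', 'off')] : scene a's queued turn-on was cancelled
```

Rerunning the second call with `cancels=False` instead leaves both entries queued, so the turn-off at t=7 executes before the turn-on at t=10, matching the alternative described above.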
To better explain how the action corresponding to an action device involved in a scene is inserted into the scene execution sequence, this flow, provided by an embodiment of the present invention, is described below through a specific implementation scenario.
As shown in fig. 3, the process includes the following steps:
step 301, whether the action corresponding to the current action device has the cancellation capability or not.
When creating a scene, a user sets whether the action device or the action corresponding to the action device in the scene has a function of canceling the capability. When the scene is executed, whether the action corresponding to the current action device related in the scene has the cancellation capability is judged, if yes, step 302 is executed, and if not, step 303 is executed.
Step 302, delete the device action that is the same as or opposite to the action corresponding to the currently acting device.
And when the action corresponding to the current action equipment is determined to have the cancellation capability, deleting the equipment action which is the same as or opposite to the action corresponding to the current action equipment from the scene execution sequence.
And 303, inserting the action corresponding to the current action equipment into the scene execution sequence according to the delay time.
And when the action corresponding to the current action equipment is determined not to have the cancellation capability, inserting the action corresponding to the current action equipment into the scene execution sequence according to the delay time.
Step 304, determine whether the action corresponding to the current action device was inserted at the start position of the scene execution sequence.
After the action corresponding to the current action device is inserted into the scene execution sequence, it is further determined whether it was inserted at the start position of the sequence; if so, step 305 is executed, and if not, the action simply waits until its execution time arrives and is then executed.
Step 305, deleting the action timer after the action corresponding to the current action device in the scene execution sequence, and triggering the scene execution processing thread to process the scene execution sequence.
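The head-insertion handling of steps 304 and 305 can be sketched as follows. This is a minimal Python illustration under stated assumptions, not the patent's implementation; `SceneExecutor` and `after_insert` are invented names:

```python
import threading

class SceneExecutor:
    """Holds the scene execution sequence, the timer armed for its head
    action, and an event used to wake the processing thread."""
    def __init__(self):
        self.sequence = []               # time-ordered (execute_at, device, action)
        self.timer = None                # threading.Timer armed for the current head
        self.wakeup = threading.Event()  # signals the processing thread

    def after_insert(self, inserted_index):
        # Steps 304-305: if the new action landed at the start position, the
        # timer armed for the old head is now stale; cancel it and wake the
        # processing thread so it re-reads the new first action.
        if inserted_index == 0:
            if self.timer is not None:
                self.timer.cancel()
                self.timer = None
            self.wakeup.set()
```

An insertion anywhere other than the head changes nothing: the timer for the current head action remains valid, so the processing thread is left undisturbed.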
This embodiment shows that the prior-art problem, in which identical or opposite actions of the same action device conflict during execution when multiple scenes are executed in an interleaved manner, can be solved: the conflicting actions across interleaved scenes are reconciled, the final control state of the devices involved becomes predictable, and the user experience is improved.
In order to better explain the embodiment of the present invention for processing a scene execution sequence by a scene execution processing thread, a flow of processing the scene execution sequence by the scene execution processing thread provided by the embodiment of the present invention is described below by using a specific implementation scenario. The scene execution processing thread is responsible for processing and executing the action corresponding to the action device involved in the scene execution sequence.
As shown in fig. 4, the process includes the following steps:
step 401, obtaining an action corresponding to a first action device to be executed from a scene execution sequence.
When the scene is executed, the action corresponding to the first action device to be executed is obtained from the scene execution sequence, and the execution time of the action corresponding to the action device to be executed is obtained.
Step 402, determine whether the execution time has been reached.
After the action corresponding to the first to-be-executed action device and its execution time have been obtained from the scene execution sequence, it is determined whether that execution time has arrived; if so, step 404 is executed, and if not, step 403 is executed.
And step 403, setting a timer and delaying triggering.
After determining that the execution time of the action corresponding to the to-be-executed action device has not yet arrived, a timer is set for that action, and the scene execution sequence is processed again once the timer fires.
And step 404, executing the action corresponding to the action device to be executed.
After the execution time of the action corresponding to the to-be-executed action device has arrived, that action is executed.
Step 405: delete the action corresponding to the action device to be executed from the scene execution sequence.
After the action corresponding to the action device to be executed has been executed, it is deleted from the scene execution sequence, and the next action to be executed in the scene execution sequence is then obtained.
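As an illustrative sketch only (the class and method names are assumptions, not taken from the patent), steps 401 to 405 can be modeled as a loop over a time-ordered queue:

```python
import heapq
import threading
import time

class SceneExecutionQueue:
    """Sketch of the scene execution sequence handled in steps 401-405."""

    def __init__(self):
        self._heap = []            # entries: (execution_time, device_id, action)
        self._lock = threading.Lock()
        self.executed = []         # record of executed actions, for illustration

    def insert(self, execution_time, device_id, action):
        # Actions are kept ordered by execution time.
        with self._lock:
            heapq.heappush(self._heap, (execution_time, device_id, action))

    def process(self):
        """One run of the scene execution processing thread."""
        while True:
            with self._lock:
                if not self._heap:
                    return
                execution_time, device_id, action = self._heap[0]   # step 401
            delay = execution_time - time.monotonic()               # step 402
            if delay > 0:
                # Step 403: a real implementation would arm a timer and
                # return; this sketch simply sleeps until the time arrives.
                time.sleep(delay)
            with self._lock:
                self.executed.append((device_id, action))           # step 404
                heapq.heappop(self._heap)                           # step 405
```

A production implementation would set an actual timer in step 403 and return instead of sleeping, so that actions inserted later can preempt the wait.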
This embodiment shows that the prior-art problem, in which the same or opposite actions of one action device conflict during execution when multiple scenes are executed in an interleaved manner, can be resolved, thereby improving the user experience.
Based on the same technical concept, fig. 5 exemplarily shows an apparatus for determining execution of a smart home scene, which can execute the flow of the method for determining execution of a smart home scene according to an embodiment of the present invention.
As shown in fig. 5, the apparatus includes:
an obtaining unit 501, configured to obtain scene information to be executed, where the scene information to be executed includes the action corresponding to the current action device involved in the scene to be executed;
a processing unit 502, configured to determine, according to the action corresponding to the current action device, whether a device action that is the same as or opposite to that action exists in the scene execution sequence; and if so, when the action corresponding to the current action device is determined to have the cancellation capability, to delete the same or opposite device action from the scene execution sequence and insert the action corresponding to the current action device into the scene execution sequence according to its delay time.
Optionally, the processing unit 502 is further configured to:
if no device action that is the same as or opposite to the action corresponding to the current action device exists in the scene execution sequence, insert the action corresponding to the current action device into the scene execution sequence according to its delay time.
Optionally, the processing unit 502 is further configured to:
when it is determined that the action corresponding to the current action device does not have the cancellation capability, insert the action corresponding to the current action device into the scene execution sequence according to its delay time.
Optionally, the processing unit 502 is further configured to:
before the scene information to be executed is obtained, obtain a setting instruction for the action devices involved in each scene, where the setting instruction is used to set whether the actions of those action devices have the cancellation capability;
and configure the action devices involved in each scene according to the setting instruction.
Optionally, the processing unit 502 is further configured to:
after the action corresponding to the current action device is inserted into the scene execution sequence according to its delay time, determine whether it was inserted at the start position of the scene execution sequence;
and if so, delete the timer of the action that follows the action corresponding to the current action device in the scene execution sequence, and trigger the scene execution processing thread to process the scene execution sequence.
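The conflict check, cancellation, and delayed insertion performed by the processing unit can be sketched as follows (the data layout, the opposite-command table, and all names are illustrative assumptions, not from the patent):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DeviceAction:
    device_id: str          # the action device this action targets
    command: str            # e.g. "on" / "off"
    execution_time: float   # absolute time at which to execute
    cancellable: bool       # whether this action has the cancellation capability

# Assumed table mapping each command to its opposite.
OPPOSITES = {"on": "off", "off": "on"}

def insert_action(sequence: List[DeviceAction], new: DeviceAction) -> bool:
    """Insert `new` into the time-ordered scene execution sequence.

    If the sequence already holds the same or the opposite action of the
    same device and `new` has the cancellation capability, the old action
    is deleted first, so `new` becomes the device's final action.
    Returns True when `new` lands at the start of the sequence.
    """
    conflict = next(
        (a for a in sequence
         if a.device_id == new.device_id
         and (a.command == new.command or OPPOSITES.get(a.command) == new.command)),
        None,
    )
    if conflict is not None and new.cancellable:
        sequence.remove(conflict)
    # Insert according to delay time, keeping the sequence sorted.
    pos = next((i for i, a in enumerate(sequence)
                if a.execution_time > new.execution_time), len(sequence))
    sequence.insert(pos, new)
    return pos == 0
```

The boolean return value corresponds to the start-position check described above: when it is True, the caller would delete the timer of the following action and re-trigger the scene execution processing thread.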
Based on the same technical concept, an embodiment of the present invention provides a computing device, including:
a memory for storing program instructions;
a processor, configured to call the program instructions stored in the memory and execute the method for determining execution of a smart home scene according to the obtained program.
Based on the same technical concept, embodiments of the present invention provide a computer-readable storage medium storing computer-executable instructions, where the computer-executable instructions are configured to cause a computer to execute the method for determining execution of a smart home scene.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including the preferred embodiment and all changes and modifications that fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present application and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. A method for determining execution of a smart home scene, applied to a plurality of scenes executed in an interleaved manner, comprising:
acquiring scene information to be executed, wherein the scene information to be executed comprises an action corresponding to a current action device involved in a scene to be executed;
determining, according to the action corresponding to the current action device, whether a device action that is the same as or opposite to the action corresponding to the current action device exists in a scene execution sequence; wherein the scene to which the same or opposite device action belongs and the scene to which the action corresponding to the current action device belongs are two scenes executed in an interleaved manner; the same or opposite device action and the action corresponding to the current action device are actions of the same action device; and the time at which the scene to which the same or opposite device action belongs executes its action differs from the time at which the scene to which the action corresponding to the current action device belongs executes its action;
if so, when it is determined that the action corresponding to the current action device has the cancellation capability, deleting the device action that is the same as or opposite to the action corresponding to the current action device from the scene execution sequence, and inserting the action corresponding to the current action device into the scene execution sequence according to a delay time; wherein the cancellation capability indicates that the device action that is the same as or opposite to the action corresponding to the current action device is cancelled, and the action corresponding to the current action device is taken as the final action of the same action device.
2. The method of claim 1, wherein the method further comprises:
if no device action that is the same as or opposite to the action corresponding to the current action device exists in the scene execution sequence, inserting the action corresponding to the current action device into the scene execution sequence according to the delay time.
3. The method of claim 1, wherein the method further comprises:
when it is determined that the action corresponding to the current action device does not have the cancellation capability, inserting the action corresponding to the current action device into the scene execution sequence according to the delay time.
4. The method of claim 1, wherein, before the scene information to be executed is acquired, the method further comprises:
acquiring a setting instruction for the action devices involved in each scene, wherein the setting instruction is used to set whether the actions of the action devices involved in each scene have the cancellation capability;
and configuring the action devices involved in each scene according to the setting instruction.
5. The method of any one of claims 1 to 4, wherein after inserting the action corresponding to the current action device into the scene execution sequence according to the delay time, further comprising:
determining whether the action corresponding to the current action device is inserted at the start position of the scene execution sequence;
and if so, deleting the timer of the action that follows the action corresponding to the current action device in the scene execution sequence, and triggering a scene execution processing thread to process the scene execution sequence.
6. An apparatus for determining execution of a smart home scene, applied to a plurality of scenes executed in an interleaved manner, comprising:
an obtaining unit, configured to obtain scene information to be executed, wherein the scene information to be executed comprises an action corresponding to a current action device involved in a scene to be executed;
a processing unit, configured to determine, according to the action corresponding to the current action device, whether a device action that is the same as or opposite to the action corresponding to the current action device exists in a scene execution sequence; wherein the scene to which the same or opposite device action belongs and the scene to which the action corresponding to the current action device belongs are two scenes executed in an interleaved manner; the same or opposite device action and the action corresponding to the current action device are actions of the same action device; and the time at which the scene to which the same or opposite device action belongs executes its action differs from the time at which the scene to which the action corresponding to the current action device belongs executes its action; and if so, when it is determined that the action corresponding to the current action device has the cancellation capability, to delete the device action that is the same as or opposite to the action corresponding to the current action device from the scene execution sequence and insert the action corresponding to the current action device into the scene execution sequence according to a delay time; wherein the cancellation capability indicates that the device action that is the same as or opposite to the action corresponding to the current action device is cancelled, and the action corresponding to the current action device is taken as the final action of the same action device.
7. The apparatus of claim 6, wherein the processing unit is further configured to:
insert the action corresponding to the current action device into the scene execution sequence according to the delay time if no device action that is the same as or opposite to the action corresponding to the current action device exists in the scene execution sequence.
8. The apparatus of claim 6, wherein the processing unit is further configured to:
insert the action corresponding to the current action device into the scene execution sequence according to the delay time when it is determined that the action corresponding to the current action device does not have the cancellation capability.
9. A computing device, comprising:
a memory for storing program instructions;
a processor for invoking program instructions stored in said memory for executing the method of any of claims 1 to 5 in accordance with the obtained program.
10. A computer-readable storage medium having stored thereon computer-executable instructions for causing a computer to perform the method of any one of claims 1 to 5.
CN202010180858.6A 2020-03-16 2020-03-16 Method and device for determining execution of smart home scene Active CN111352350B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010180858.6A CN111352350B (en) 2020-03-16 2020-03-16 Method and device for determining execution of smart home scene


Publications (2)

Publication Number Publication Date
CN111352350A CN111352350A (en) 2020-06-30
CN111352350B true CN111352350B (en) 2023-02-17

Family

ID=71197497

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010180858.6A Active CN111352350B (en) 2020-03-16 2020-03-16 Method and device for determining execution of smart home scene

Country Status (1)

Country Link
CN (1) CN111352350B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112286167B (en) * 2020-10-15 2022-04-29 青岛海尔科技有限公司 Internet of things scene conflict detection method and device, storage medium and electronic equipment
CN112578757A (en) * 2020-12-24 2021-03-30 珠海格力电器股份有限公司 Control method and device of intelligent household equipment
CN114578705B (en) * 2022-04-01 2022-12-27 深圳冠特家居健康系统有限公司 Intelligent home control system based on 5G Internet of things

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106371326A (en) * 2016-09-18 2017-02-01 海信集团有限公司 Storage method and apparatus of equipment work scenes
CN107808669A (en) * 2017-09-30 2018-03-16 深圳市艾特智能科技有限公司 Sound control method, intelligent domestic system, storage medium and computer equipment
CN108111378A (en) * 2017-12-22 2018-06-01 南京物联传感技术有限公司 The conflict coordination system and method for work of a kind of scene settings of smart home
CN108572594A (en) * 2018-05-09 2018-09-25 深圳绿米联创科技有限公司 Generation method, device and the terminal device of smart machine control instruction

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10084609B2 (en) * 2014-11-10 2018-09-25 Sengled Optoelectronics Co., Ltd. Method, apparatus, and system for controlling smart home environment using LED lighting device


Also Published As

Publication number Publication date
CN111352350A (en) 2020-06-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 266100 Songling Road, Laoshan District, Qingdao, Shandong Province, No. 399

Applicant after: Qingdao Hisense Smart Life Technology Co.,Ltd.

Applicant after: HISENSE Co.,Ltd.

Address before: 266100 Songling Road, Laoshan District, Qingdao, Shandong Province, No. 399

Applicant before: QINGDAO HISENSE SMART HOME SYSTEMS Co.,Ltd.

Applicant before: HISENSE Co.,Ltd.

GR01 Patent grant