CN114035444B - Control method for intelligent home - Google Patents


Info

Publication number
CN114035444B
CN114035444B (application CN202111478960.5A)
Authority
CN
China
Prior art keywords
environment
category
target
determining
parameter monitoring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111478960.5A
Other languages
Chinese (zh)
Other versions
CN114035444A (en)
Inventor
王军 (Wang Jun)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Jinlin High Tech Technology Co ltd
Original Assignee
Beijing Jinlin High Tech Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Jinlin High Tech Technology Co ltd filed Critical Beijing Jinlin High Tech Technology Co ltd
Priority claimed from CN202111478960.5A
Publication of CN114035444A
Application granted
Publication of CN114035444B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/26 Pc applications
    • G05B2219/2642 Domotique, domestic, home control, automation, smart house
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Air Conditioning Control Device (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The invention relates to a control method for a smart home, comprising the following steps: acquiring an environment control instruction; parsing the environment control instruction to obtain at least one target smart home device and generating a target control instruction corresponding to each target smart home device; and sending the corresponding target control instruction to each target smart home device, so that the smart home is controlled according to the environment control instruction.

Description

Control method for intelligent home
Technical Field
The invention relates to the technical field of smart homes, and in particular to a control method for a smart home.
Background
Smart home control has become an important direction in home management. Through smart home control technology, electrical appliances, lighting and other supporting equipment in the home can be controlled anytime and anywhere, so that the home environment can be adjusted adaptively and automatically according to the occupants' actual needs.
However, existing smart home control requires a clearly specified control object; that is, the control instruction must be sent precisely to the specific smart home device to be controlled.
Disclosure of Invention
(I) Technical problem to be solved
In view of the problems in the prior art, the invention provides a control method for a smart home.
(II) Technical solution
In order to achieve the above purpose, the main technical solution adopted by the invention is as follows:
A method for controlling a smart home, the method comprising:
S101, acquiring an environment control instruction;
S102, parsing the environment control instruction to obtain at least one target smart home device, and generating a target control instruction corresponding to each target smart home device;
S103, sending the corresponding target control instruction to each target smart home device.
Optionally, S102 includes:
S102-1, parsing the environment control instruction to obtain an environment range, an environment category and an environment parameter; the environment categories are the acoustic environment, the light environment and the thermal environment;
S102-2, determining candidate smart home devices according to the environment category;
S102-3, determining target smart home devices from the candidate smart home devices according to the environment range;
S102-4, generating the target control instruction corresponding to each target smart home device according to the target smart home devices and the environment parameter.
Optionally, the environment control instruction is a piece of voice or text describing the target environment;
S102-1 includes:
carrying out semantic recognition on the environment control instruction to obtain a position keyword, a category keyword and an expected value corresponding to the category keyword;
matching an environment range according to the position keyword;
if the category keyword includes a first keyword, determining that the environment category is the acoustic environment; if the category keyword includes a second keyword, determining that the environment category is the light environment; if the category keyword includes a third keyword, determining that the environment category is the thermal environment; the first keywords are keywords describing sound and/or noise; the second keywords are keywords describing brightness; the third keywords are keywords describing temperature and/or humidity and/or air flow rate;
and determining the environment parameter according to the expected value to be reached by the target environment category.
Optionally, step S102-2 includes:
acquiring the environment labels of all smart home devices, wherein an environment label describes the environment categories that a smart home device affects;
and taking the smart home devices whose environment labels match the environment category as candidate smart home devices.
Optionally, step S102-3 includes:
determining the effective working range of each candidate smart home device;
and determining the candidate smart home devices whose effective working ranges coincide with the environment range as target smart home devices.
Optionally, step S102-4 includes:
determining the current parameter of the environment category;
and generating the target control instruction of each target smart home device according to the current parameter and the environment parameter.
Optionally, before determining the current parameter of the environment category, the method further includes:
determining each subspace according to the current partitioning of the whole smart home space;
carrying out grid division for each subspace;
and configuring parameter monitoring sensors of the respective environment categories in each grid.
Optionally, the grid division for each subspace includes:
for any one subspace,
determining the monitoring range A_i of the parameter monitoring sensor of each environment category, wherein i is an environment category identifier;
determining the minimum circumscribed rectangle of the subspace, the side lengths of the minimum circumscribed rectangle being a and b respectively;
dividing the circumscribed rectangle into an X × Y grid, wherein X and Y are determined from the side lengths a and b and the monitoring ranges A_i;
and configuring parameter monitoring sensors of each environment category at the four vertexes of each grid.
Optionally, determining the current parameter of the environment category includes:
determining the monitoring values of each target parameter monitoring sensor within a first time range, wherein the ending time of the first time range is the current time, and a target parameter monitoring sensor is a parameter monitoring sensor of the environment category within the environment range;
determining the standard deviation σ_j of the monitoring values of each target parameter monitoring sensor, wherein j is a parameter monitoring sensor identifier;
determining the total standard deviation σ of all monitoring values;
determining the target parameter monitoring sensors with σ_j ≤ σ as effective parameter monitoring sensors;
determining the current parameter of the environment category as the weighted average B = Σ_{u=1..N} w_u · avg_u, wherein u is an effective parameter monitoring sensor identifier, N is the total number of effective parameter monitoring sensors, avg_u is the average of the monitoring values of effective parameter monitoring sensor u, and w_u is the weight of effective parameter monitoring sensor u.
Optionally, w_u is determined from the distance between the effective parameter monitoring sensor and the target smart home device, wherein v is a target smart home device identifier, d_uv is the distance between effective parameter monitoring sensor u and target smart home device v, and maxd_v is the maximum distance between any effective parameter monitoring sensor and target smart home device v.
(III) Beneficial effects
The invention acquires an environment control instruction; parses it to obtain at least one target smart home device and generates a target control instruction corresponding to each target smart home device; and sends each target control instruction to its target smart home device, so that the smart home devices are controlled according to a description of the desired environment rather than by naming each device explicitly.
Drawings
Fig. 1 is a schematic flow chart of a smart home control method according to an embodiment of the invention;
Fig. 2 is a schematic diagram of the parameter monitoring sensor configuration according to an embodiment of the invention.
Detailed Description
The invention is explained in more detail below through specific embodiments with reference to the drawings.
Existing smart home control requires a clearly specified control object; that is, a control instruction must be sent precisely to the device to be controlled.
On this basis, the invention provides a control method for a smart home, comprising: acquiring an environment control instruction; parsing the environment control instruction to obtain at least one target smart home device and generating a target control instruction corresponding to each; and sending the corresponding target control instruction to each target smart home device, so that the smart home is controlled according to the environment control instruction.
Referring to Fig. 1, the control method of the smart home provided in this embodiment is implemented as follows:
S101, acquiring an environment control instruction.
The environment control instruction is a piece of voice or text describing the target environment.
For example, the environment control instruction may be: "Adjust the temperature in the room to 26 degrees Celsius." Or: "The room is too bright."
S102, parsing the environment control instruction to obtain at least one target smart home device, and generating a target control instruction corresponding to each target smart home device.
Specifically, this step is implemented as follows:
S102-1, parsing the environment control instruction to obtain an environment range, an environment category and an environment parameter.
The environment categories are the acoustic environment, the light environment and the thermal environment.
The acoustic environment relates to sound, e.g. voice and noise.
The light environment relates to light, e.g. brightness and illuminance.
The thermal environment relates to temperature, humidity and air flow.
Specifically:
1. Semantic recognition is carried out on the environment control instruction to obtain the position keyword, the category keyword and the expected value corresponding to the category keyword.
The position keyword is a word indicating a location, the category keyword is a word indicating an environment category, and the expected value corresponding to the category keyword is the target value for that category.
Since the environment control instruction is a piece of voice or text describing the target environment, an instruction in voice form can either be semantically recognized directly or first converted into text, with semantic recognition then applied to the text.
An instruction in text form can be semantically recognized directly.
The position keyword, the category keyword and the expected value may all be clearly identifiable in the environment control instruction acquired in step S101. For example, for the instruction "Adjust the temperature in the room to 26 degrees Celsius", the position keyword is identified as "room", the category keyword as "temperature", and the expected value as 26 degrees Celsius.
The position keyword, the category keyword or the expected value may also not be clearly identifiable in the instruction; in that case the missing value can be obtained through semantic analysis and a preset semantic library. For example, for the instruction "The room is too bright", only the position keyword "room" and the category keyword "bright" can be identified directly; the expected value cannot. The preset semantic library gives the optimal brightness of a room as 300 lux, so the expected value corresponding to the category keyword is determined to be 300 lux.
When the position keyword, the category keyword or the expected value cannot be clearly identified, the missing value can also be obtained from the user's current attributes instead of from the preset semantic library.
The user's current attributes may include, but are not limited to: the user's current location, the current time, the user's current action (e.g. watching television), and the user's current object of attention (e.g. the television content being watched). These attributes can be obtained through monitoring sensors, including preconfigured video and audio sensors as well as smart home devices such as televisions and mobile phones; for instance, the user's location may be determined through a camera, and the content the user is currently watching obtained through the television.
For example, for the instruction "It is too noisy", only the category keyword "noisy" can be identified; neither the position keyword nor the expected value can. The position keyword is then determined to be "room" from the user's current location (the user is currently in the room), and the expected value is obtained from the preset semantic library as a room volume of 30 dB; the expected value corresponding to the category keyword is therefore determined to be 30 dB.
The semantic library is set up using an existing procedure, which is not described here.
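The keyword-extraction step described above can be sketched as follows. This is a minimal illustration rather than the patent's implementation: the keyword lists and function names are assumptions, and the semantic-library defaults (300 lux, 30 dB, 26 degrees Celsius) are taken from the examples in this description.

```python
# Hypothetical sketch of step S102-1 keyword extraction.
# Keyword lists are illustrative assumptions, not from the patent text.
FIRST_KEYWORDS = {"noisy", "loud", "quiet", "sound", "noise"}       # acoustic
SECOND_KEYWORDS = {"bright", "dark", "brightness", "dim"}           # light
THIRD_KEYWORDS = {"temperature", "hot", "cold", "humid", "stuffy"}  # thermal

# Preset "semantic library": default expected values used when the
# instruction names no explicit target (e.g. "The room is too bright").
SEMANTIC_LIBRARY = {
    "light": 300.0,     # lux
    "acoustic": 30.0,   # dB
    "thermal": 26.0,    # degrees Celsius
}

def classify_category(words):
    """Map category keywords to an environment category."""
    if FIRST_KEYWORDS & words:
        return "acoustic"
    if SECOND_KEYWORDS & words:
        return "light"
    if THIRD_KEYWORDS & words:
        return "thermal"
    return None

def parse_instruction(text, explicit_value=None):
    """Return (category, expected value) for a plain-text instruction."""
    words = set(text.lower().replace(".", "").split())
    category = classify_category(words)
    value = explicit_value if explicit_value is not None \
        else SEMANTIC_LIBRARY.get(category)
    return category, value

print(parse_instruction("The room is too bright"))   # ('light', 300.0)
print(parse_instruction("Adjust the temperature in the room", 26.0))
```

A real implementation would replace the word-set lookup with proper semantic recognition of voice or text, as the description requires.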
2. The environment range is matched according to the position keyword.
For example, if the position keyword is "room", the environment range is the area occupied by the room.
3. The environment category is determined according to the category keyword.
If the category keyword includes a first keyword, the environment category is determined to be the acoustic environment.
The first keywords are keywords describing sound and/or noise.
For example, the category keyword "noisy" describes sound/noise, so the environment category is determined to be the acoustic environment.
If the category keyword includes a second keyword, the environment category is determined to be the light environment.
The second keywords are keywords describing brightness.
For example, if the category keyword is "bright", the environment category is determined to be the light environment.
If the category keyword includes a third keyword, the environment category is determined to be the thermal environment.
The third keywords are keywords describing temperature and/or humidity and/or air flow rate.
For example, if the category keyword is "temperature", the environment category is determined to be the thermal environment.
4. The environment parameter is determined according to the expected value to be reached by the target environment category.
For example, if the expected value corresponding to the category keyword is 26 degrees Celsius, the environment parameter is determined to be 26 degrees Celsius.
Similarly, if the expected value is 300 lux, the environment parameter is 300 lux.
And if the expected value is 30 dB, the environment parameter is 30 dB.
S102-2, determining candidate smart home devices according to the environment category.
Specifically:
1. The environment labels of all smart home devices are acquired.
An environment label describes the environment categories that a smart home device affects.
The environment label is configured for each smart home device when the device is installed.
For example, a lamp's environment label is the light environment.
As another example, an air conditioner affects not only temperature but also air flow, so its environment label is the thermal environment.
One smart home device may be configured with multiple environment labels.
For example, a curtain affects not only brightness but also air flow, and also blocks outdoor sound; its environment labels are therefore the light environment, the thermal environment and the acoustic environment.
2. The smart home devices whose environment labels match the environment category are taken as candidate smart home devices.
This step finds all smart home devices in the home that are relevant to the environment control instruction.
That is, all smart home devices whose labels include the environment category in the environment control instruction are taken as candidate smart home devices.
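Step S102-2 amounts to a simple tag filter, sketched below. The device names and labels are illustrative assumptions, with the curtain's three labels taken from the example above.

```python
# Hypothetical device inventory: device name -> set of environment labels.
DEVICES = {
    "ceiling_lamp":    {"light"},
    "air_conditioner": {"thermal"},
    "curtain":         {"light", "thermal", "acoustic"},
}

def candidate_devices(category, devices=DEVICES):
    """Devices whose environment labels include the requested category."""
    return sorted(name for name, tags in devices.items() if category in tags)

print(candidate_devices("light"))  # ['ceiling_lamp', 'curtain']
```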
S102-3, determining the target smart home devices from the candidate smart home devices according to the environment range.
Specifically:
1. The effective working range of each candidate smart home device is determined.
Each smart home device has an effective working range, which is likewise determined when the device is installed.
For example, an air conditioner installed in the living room has the living room as its effective working range.
A room air conditioner has that room as its effective working range.
A central air conditioner has the whole house as its effective working range.
2. The candidate smart home devices whose effective working ranges coincide with the environment range are determined as target smart home devices.
This step determines the candidate smart home devices that satisfy the environment range in the environment control instruction as target smart home devices.
For example, if the environment range is a room, the air conditioner in that room is a target smart home device, while the air conditioner in the living room, although a candidate, is not.
Executing the above obtains all smart home devices matching both the environment range and the environment category in the environment control instruction, i.e. the target smart home devices.
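The two steps of S102-3 can be sketched as follows. Modeling a working range as a set of room names is an assumption made for illustration; the patent only requires that the effective working range coincide with the environment range.

```python
# Hypothetical effective working ranges: device -> set of covered rooms.
WORKING_RANGE = {
    "bedroom_ac": {"bedroom"},
    "living_ac":  {"living_room"},
    "central_ac": {"bedroom", "living_room", "kitchen"},
}

def target_devices(candidates, environment_range):
    """Candidates whose working range overlaps the environment range."""
    return sorted(d for d in candidates
                  if WORKING_RANGE.get(d, set()) & environment_range)

print(target_devices(["bedroom_ac", "living_ac", "central_ac"], {"bedroom"}))
# ['bedroom_ac', 'central_ac']
```

Note how the central air conditioner qualifies for any room while the living-room unit is excluded, matching the example above.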
S102-4, generating the target control instruction corresponding to each target smart home device according to the target smart home devices and the environment parameter.
Specifically:
1. The current parameter of the environment category is determined.
The current parameter is obtained through the parameter monitoring sensors.
In addition, the setting process of the parameter monitoring sensor is as follows:
1) Each subspace is determined according to the current partitioning of the whole smart home space.
For example, a two-bedroom apartment may comprise 5 subspaces: a master bedroom subspace, a second bedroom subspace, a living room subspace, a kitchen subspace and a bathroom subspace; or 6 subspaces, adding a balcony subspace.
The division into subspaces follows the architectural layout of the whole smart home space.
2) Grid division is carried out for each subspace.
For any one subspace:
(1) The monitoring range A_i of the parameter monitoring sensor of each environment category is determined.
Here i is an environment category identifier.
There may be multiple parameter monitoring sensors, corresponding to the multiple environment categories: each environment category has its own sensors, and one category may involve several sensor types. For example, thermal environment sensors include temperature sensors and humidity sensors, acoustic environment sensors include sound sensors, and light environment sensors include illuminance sensors. There may also be several sensors of each type, e.g. multiple temperature sensors.
Different sensors may have different monitoring ranges; here the monitoring range A_i of the parameter monitoring sensor of each environment category is obtained.
If the parameter monitoring sensors of the same environment category have the same monitoring range (for example, because they are the same model), the monitoring range of any one of them can be used as the monitoring range for that category. If they have different monitoring ranges (for example, different models), the smallest monitoring range is used as the monitoring range for that category.
(2) The minimum circumscribed rectangle of the subspace is determined.
The side lengths of the minimum circumscribed rectangle are a and b respectively.
The minimum circumscribed rectangle is determined using an existing scheme, which is not repeated here.
(3) The circumscribed rectangle is divided into an X × Y grid,
where X and Y are determined from the side lengths a and b and the monitoring ranges A_i.
The division of the circumscribed rectangle is shown in Fig. 2.
(4) Parameter monitoring sensors of each environment category are configured at the four vertexes of each grid.
As shown in Fig. 2, a parameter monitoring sensor of each environment category is configured at each black dot.
Since the rectangle is the minimum circumscribed rectangle of the subspace, a grid vertex may fall outside the actual subspace (for example, point Z in Fig. 2). In that case, the point in the subspace closest to that vertex (for example, point Z' in Fig. 2) is determined, and the parameter monitoring sensors of each environment category are configured at that closest point.
3) Parameter monitoring sensors of the respective environment categories are configured in each grid.
A given black dot may be configured with a parameter monitoring sensor of one environment category or with parameter monitoring sensors of several environment categories.
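The grid division and sensor placement can be sketched as follows. The patent's formula for X and Y did not survive in this text, so the cell size used below is an assumption: the smallest monitoring range A_i is treated as a radius, and each square cell is sized so its half-diagonal does not exceed that radius, letting the four corner sensors cover the whole cell.

```python
import math

def grid_dimensions(a, b, monitoring_ranges):
    """Grid columns X and rows Y for an a-by-b bounding rectangle.

    monitoring_ranges: environment category -> monitoring radius A_i.
    Assumed sizing rule: cell side <= r * sqrt(2), so that a corner
    sensor of radius r reaches the cell centre (half-diagonal <= r).
    """
    r = min(monitoring_ranges.values())   # most restrictive sensor range
    cell = r * math.sqrt(2)
    return math.ceil(a / cell), math.ceil(b / cell)

def sensor_positions(a, b, x, y):
    """Grid vertices (four per cell, shared between neighbouring cells)."""
    return [(i * a / x, j * b / y) for i in range(x + 1) for j in range(y + 1)]

# An 8 m x 6 m bounding rectangle with thermal range 3 m and light range 4 m:
x, y = grid_dimensions(8.0, 6.0, {"thermal": 3.0, "light": 4.0})
print(x, y, len(sensor_positions(8.0, 6.0, x, y)))  # 2 2 9
```

Vertices falling outside the subspace would, per the description, be snapped to the nearest in-subspace point (point Z to Z' in Fig. 2); that projection step is omitted here.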
Based on the parameter monitoring sensors, the current parameter of the environment category is determined as follows:
A. The monitoring values of each target parameter monitoring sensor within a first time range are determined.
The ending time of the first time range is the current time.
A target parameter monitoring sensor is a parameter monitoring sensor of the environment category within the environment range.
For example, the first time range may be the half hour ending at the current time.
Taking the room as the environment range and the thermal environment as the environment category, the target parameter monitoring sensors are the thermal environment sensors configured in the room, such as the temperature sensors in the room.
Since the target parameter monitoring sensors are configured on a grid within the subspace, they are located at different positions in the room.
For example, if the target parameter monitoring sensors are sensor 1, sensor 2 and sensor 3, the monitoring values of each of the three sensors within the half hour are determined.
B. The standard deviation σ_j of the monitoring values of each target parameter monitoring sensor is determined.
Here j is a parameter monitoring sensor identifier.
For example, the standard deviation σ_1 of the monitoring values of sensor 1, σ_2 of sensor 2 and σ_3 of sensor 3 are determined.
C. The total standard deviation σ of all monitoring values is determined.
For example, the standard deviation of all monitoring values of sensors 1, 2 and 3 taken together is determined.
D. The target parameter monitoring sensors with σ_j ≤ σ are determined as effective parameter monitoring sensors.
If σ_1 > σ, sensor 1 is determined to be an invalid parameter monitoring sensor.
If σ_2 < σ, sensor 2 is determined to be an effective parameter monitoring sensor.
If σ_3 = σ, sensor 3 is determined to be an effective parameter monitoring sensor.
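Steps A through D can be sketched as follows. The readings are made-up values, and using the population standard deviation for both σ_j and σ is an assumption, since the patent does not specify which estimator is meant.

```python
import statistics

def effective_sensors(readings):
    """Keep sensors whose own standard deviation does not exceed the
    standard deviation of all readings pooled together (steps B-D).

    readings: sensor identifier -> list of monitoring values.
    """
    pooled = [v for vs in readings.values() for v in vs]
    sigma = statistics.pstdev(pooled)  # total standard deviation
    return sorted(j for j, vs in readings.items()
                  if statistics.pstdev(vs) <= sigma)

# Made-up half-hour temperature readings; sensor 1 is erratic.
readings = {
    1: [20.0, 30.0, 22.0],
    2: [25.0, 25.5, 25.2],
    3: [24.8, 25.1, 25.0],
}
print(effective_sensors(readings))  # [2, 3]
```

The erratic sensor is filtered out because its own spread exceeds the pooled spread, which is the screening effect the description aims for.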
E. Determine the current parameter of the environment category as the weighted average Σ_{u=1}^{N} w_u · avg_u, where u is the effective parameter monitoring sensor identifier, N is the total number of effective parameter monitoring sensors, avg_u is the average of the monitoring values of effective parameter monitoring sensor u, and w_u is the weight of effective parameter monitoring sensor u.
The weight w_u is determined from the distance d_uv and the maximum distance maxd_v, where v is the target smart home identifier, d_uv is the distance between effective parameter monitoring sensor u and target smart home v, and maxd_v is the maximum distance between all the effective parameter monitoring sensors and target smart home v.
For example, if there are 2 effective parameter monitoring sensors, namely sensor 2 and sensor 3, then N = 2, avg_1 is the average of the monitoring values of sensor 2, avg_2 is the average of the monitoring values of sensor 3, w_1 is the weight of sensor 2, and w_2 is the weight of sensor 3.
The target smart home is air conditioner 1, identified as I1; then d_1I1 is the distance between sensor 2 and air conditioner 1, and maxd_I1 = max{ distance between sensor 2 and air conditioner 1, distance between sensor 3 and air conditioner 1 }. The current parameter of the environment category is then w_1 · avg_1 + w_2 · avg_2.
The above determines the current parameter for one target smart home. If there are multiple target smart homes, this step is performed for each of them, that is, the current parameter of the corresponding environment category is determined for each target smart home.
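As an illustration only, steps A through E can be sketched in code. The σj ≤ σ filtering follows the text above; the patent gives the weight formula only as an image, so the normalization below (weights proportional to 2·maxd − d_uv, so nearer sensors count more) is an assumption, not the patent's actual expression:

```python
import math

def current_parameter(readings, distances):
    """readings: {sensor_id: [monitoring values within the first time range]}
    distances: {sensor_id: distance to the target smart home}"""
    def std(vals):
        # Population standard deviation of a list of values.
        m = sum(vals) / len(vals)
        return math.sqrt(sum((v - m) ** 2 for v in vals) / len(vals))

    # B. Per-sensor standard deviation sigma_j.
    sigma_j = {j: std(vals) for j, vals in readings.items()}

    # C. Total standard deviation sigma over all monitoring values.
    all_vals = [v for vals in readings.values() for v in vals]
    sigma = std(all_vals)

    # D. Keep only sensors whose sigma_j <= sigma (effective sensors).
    effective = [j for j in readings if sigma_j[j] <= sigma]

    # E. Distance-based weights; this normalization is an assumption
    # standing in for the formula image in the original.
    maxd = max(distances[j] for j in effective)
    raw = {j: 2 * maxd - distances[j] for j in effective}  # nearer -> larger
    total = sum(raw.values())
    w = {j: raw[j] / total for j in effective}

    # Weighted average of the per-sensor means avg_u.
    return sum(w[j] * (sum(readings[j]) / len(readings[j])) for j in effective)
```

With sensor 1 noisy and sensors 2 and 3 closer to and farther from the target smart home, the sketch excludes sensor 1 and returns a distance-weighted blend of the two remaining averages.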
2. Generate the target control instruction of each target smart home according to the current parameters and the environment parameters.
Specifically, the current parameter is compared with the environment parameter, and the target control instruction of each target smart home is generated.
Since the current parameter of the corresponding environment category has been determined for each target smart home, the target control instruction of each target smart home is generated from that current parameter and the environment parameter.
For example, for air conditioner 1, if the environment parameter is 26 degrees Celsius and the current parameter of the environment category corresponding to the air conditioner is 30 degrees Celsius, the generated instruction is "I1, refrigeration, 26 degrees celsius".
The instruction can be generated in an existing manner, which is not described in detail here.
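A minimal sketch of this comparison step, using the instruction format from the example above; the symmetric heating rule and the `None` return for an already-satisfied target are assumptions not stated in the original:

```python
def make_instruction(device_id, current, target):
    """Compare the current parameter with the environment parameter and
    build a target control instruction string (format as in the example)."""
    if current > target:
        mode = "refrigeration"   # too warm: cool down to the target
    elif current < target:
        mode = "heating"         # too cold: warm up (assumed symmetric rule)
    else:
        return None              # already at the desired value
    return f"{device_id}, {mode}, {target} degrees celsius"
```

For instance, `make_instruction("I1", 30, 26)` produces the instruction "I1, refrigeration, 26 degrees celsius" from the example above.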
S103, sending a corresponding target control instruction to the target intelligent home.
If the instruction generated in S102 is "I1, refrigeration, 26 degrees celsius", then "refrigeration, 26 degrees celsius" is sent to I1, or the full "I1, refrigeration, 26 degrees celsius" is sent to I1. I1 (namely, air conditioner 1) then executes the target control instruction and adjusts the temperature to 26 degrees Celsius.
In the method of this embodiment, an environment control instruction is obtained; the environment control instruction is parsed to obtain at least one target smart home, and the target control instruction corresponding to each target smart home is generated; and the corresponding target control instruction is sent to each target smart home, so that the smart homes are controlled according to the environment control instruction.
In order that the above-described aspects may be better understood, exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions.
Furthermore, it should be noted that in the description of the present specification, the terms "one embodiment," "some embodiments," "example," "specific example," or "some examples," etc., refer to a specific feature, structure, material, or characteristic described in connection with the embodiment or example being included in at least one embodiment or example of the present invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art upon learning the basic inventive concepts. Therefore, the appended claims should be construed to include preferred embodiments and all such variations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, the present invention should also include such modifications and variations provided that they come within the scope of the following claims and their equivalents.

Claims (6)

1. The control method for the intelligent home is characterized by comprising the following steps:
S101, acquiring an environment control instruction;
s102, analyzing the environment control instruction to obtain at least one target intelligent home, and generating target control instructions corresponding to the target intelligent home;
The S102 includes: s102-1, analyzing the environment control instruction to obtain an environment range, an environment category and environment parameters; the environment types are acoustic environment, optical environment and thermal environment; s102-2, determining candidate intelligent households according to the environment category; s102-3, determining a target intelligent home from the candidate intelligent home according to the environment range; s102-4, generating target control instructions corresponding to all target intelligent households according to the target intelligent households and the environmental parameters; specifically, determining current parameters of the environment category; generating target control instructions of all target intelligent households according to the current parameters and the environment parameters;
The determining the current parameters of the environment category includes:
Determining each monitoring value of each target parameter monitoring sensor in a first time range; the ending time of the first time range is the current time; the target parameter monitoring sensor is a parameter monitoring sensor of the environmental category in the environmental range;
Determining a standard deviation σj of the monitoring values of each target parameter monitoring sensor, j being a parameter monitoring sensor identifier; determining a total standard deviation σ of all monitoring values; determining a target parameter monitoring sensor with σj ≤ σ as an effective parameter monitoring sensor; determining the current parameter of the environment category as the weighted average of the monitoring values of the effective parameter monitoring sensors, wherein u is an effective parameter monitoring sensor identifier, N is the total number of the effective parameter monitoring sensors, avg_u is the average of the monitoring values of effective parameter monitoring sensor u, and w_u is the weight of effective parameter monitoring sensor u; the weight w_u is determined from the distance d_uv and the maximum distance maxd_v, wherein v is a target smart home identifier, d_uv is the distance between effective parameter monitoring sensor u and target smart home v, and maxd_v is the maximum distance between all the effective parameter monitoring sensors and target smart home v;
S103, sending a corresponding target control instruction to the target intelligent home.
2. The method of claim 1, wherein the environmental control instruction is a piece of speech or a piece of text describing a target environment;
the S102-1 comprises:
Carrying out semantic recognition on the environment control instruction to obtain a position keyword, a category keyword and an expected value corresponding to the category keyword;
Matching an environment range according to the position keywords;
If the category keywords comprise first keywords, determining that the environment category is an acoustic environment; if the category keywords comprise second keywords, determining that the environment category is an optical environment; if the category keywords comprise third keywords, determining that the environment category is a thermal environment; the first keywords are keywords for describing sound and/or noise; the second keywords are keywords for describing brightness; the third keywords are keywords for describing temperature and/or humidity and/or air flow rate;
And determining the environment parameter according to the expected value to be reached for the environment category.
3. The method according to claim 1, wherein S102-2 comprises:
acquiring environment labels of all intelligent home furnishings, wherein the environment labels are used for describing environment types related to all the intelligent home furnishings;
And taking the smart home with the environment label matched with the environment category as a candidate smart home.
4. The method according to claim 1, wherein S102-3 comprises:
Determining the effective working range of each candidate intelligent home;
And determining the candidate intelligent home with the effective working range coincident with the environment range as a target intelligent home.
5. The method of claim 4, wherein prior to determining the current parameters of the environmental category, further comprising:
Determining each subspace according to the current segmentation condition of the space of all intelligent households;
Grid division is carried out for each subspace;
parameter monitoring sensors for the respective environmental categories are configured in each grid.
6. The method of claim 5, wherein meshing each subspace comprises:
For any one of the subspaces,
Determining a monitoring range A_i of the parameter monitoring sensors of each environment category, wherein i is an environment category identifier;
Determining the minimum circumscribed rectangle of any subspace; the side lengths of the minimum circumscribed rectangle are a and b respectively;
Dividing the circumscribed rectangle into X×Y grids, wherein X and Y are determined from a, b and A_i;
Parameter monitoring sensors of each environment category are respectively configured at four vertexes of each grid.
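The expressions for X and Y in claim 6 appear as images in the original and are not reproduced here. A sketch under one plausible assumption, namely that sensors placed at the four vertices of each grid cell must jointly cover the whole cell, is:

```python
import math

def grid_counts(a, b, coverage_radius):
    """Split an a-by-b rectangle into X*Y grid cells small enough that
    sensors of range `coverage_radius` at the four vertices cover each cell.
    The point of a cell farthest from every vertex is its center, at half
    the cell diagonal, so a square cell is covered when its side does not
    exceed coverage_radius * sqrt(2). This cell-size bound is an assumption
    standing in for the patent's unreproduced formulas for X and Y."""
    max_side = coverage_radius * math.sqrt(2)
    x = math.ceil(a / max_side)   # number of columns along side a
    y = math.ceil(b / max_side)   # number of rows along side b
    return x, y
```

For a 10-by-6 rectangle and sensors with range 2, this yields a 4×3 grid; rectangular cells produced by uneven division remain covered, since shrinking either side only shortens the half-diagonal.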
CN202111478960.5A 2021-12-06 2021-12-06 Control method for intelligent home Active CN114035444B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111478960.5A CN114035444B (en) 2021-12-06 2021-12-06 Control method for intelligent home


Publications (2)

Publication Number Publication Date
CN114035444A CN114035444A (en) 2022-02-11
CN114035444B true CN114035444B (en) 2024-05-07

Family

ID=80146421

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111478960.5A Active CN114035444B (en) 2021-12-06 2021-12-06 Control method for intelligent home

Country Status (1)

Country Link
CN (1) CN114035444B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105487396A (en) * 2015-12-29 2016-04-13 宇龙计算机通信科技(深圳)有限公司 Method and device of controlling smart home
CN106302041A (en) * 2016-08-05 2017-01-04 深圳博科智能科技有限公司 A kind of intelligent home equipment control method and device
CN111578465A (en) * 2020-04-27 2020-08-25 青岛海尔空调器有限总公司 Intelligent adjusting method and system for indoor environment
CN111596560A (en) * 2020-04-27 2020-08-28 青岛海尔空调器有限总公司 Intelligent regulation and control method and system for preparation scene before sleep
CN112286115A (en) * 2020-11-17 2021-01-29 珠海格力电器股份有限公司 Control method and device of intelligent household equipment



Similar Documents

Publication Publication Date Title
US11721186B2 (en) Systems and methods for categorizing motion events
CN108899023B (en) Control method and device
US11243502B2 (en) Interactive environmental controller
US20220317641A1 (en) Device control method, conflict processing method, corresponding apparatus and electronic device
US9449229B1 (en) Systems and methods for categorizing motion event candidates
WO2020244573A1 (en) Voice instruction processing method and device, and control system
WO2019034083A1 (en) Voice control method for smart home, and smart device
CN110687817A (en) Smart home control method and device, terminal and computer-readable storage medium
CN111665737B (en) Smart home scene control method and system
CN110942773A (en) Method and device for controlling intelligent household equipment through voice
CN106664773B (en) Light scene creation or modification by means of lighting device usage data
WO2022247244A1 (en) Voice control method for air conditioner, and air conditioner
WO2022247245A1 (en) Voice control method for air conditioner and air conditioner
WO2022268136A1 (en) Terminal device and server for voice control
CN109741747A (en) Voice scene recognition method and device, voice control method and device and air conditioner
CN114067798A (en) Server, intelligent equipment and intelligent voice control method
CN114035444B (en) Control method for intelligent home
CN113676382B (en) IOT voice command control method, system and computer readable storage medium
CN116774599A (en) Intelligent equipment control method based on knowledge graph, computer device and computer readable storage medium
WO2018023523A1 (en) Motion and emotion recognizing home control system
CN110941189A (en) Intelligent household system and control method thereof and readable storage medium
WO2023115659A1 (en) Method and apparatus for automatically identifying environmental parameters
CN114137847A (en) User-oriented intelligent household graphical programming method and system
CN113542689A (en) Image processing method based on wireless Internet of things and related equipment
CN111306714A (en) Air conditioner and control method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant