CN108919655A - A scene judgment method and device based on user behavior - Google Patents

A scene judgment method and device based on user behavior

Info

Publication number
CN108919655A
CN108919655A
Authority
CN
China
Prior art keywords
scene
control instruction
control
time
matched
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810597909.8A
Other languages
Chinese (zh)
Inventor
林锦滨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Fang Sheng Intelligent Engineering Co Ltd
Original Assignee
Guangzhou Fang Sheng Intelligent Engineering Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Fang Sheng Intelligent Engineering Co Ltd filed Critical Guangzhou Fang Sheng Intelligent Engineering Co Ltd
Priority to CN201810597909.8A priority Critical patent/CN108919655A/en
Publication of CN108919655A publication Critical patent/CN108919655A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/26 Pc applications
    • G05B2219/2642 Domotique, domestic, home control, automation, smart house
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The present invention provides a scene judgment method and device based on user behavior. The method captures the control instructions a user sends to smart devices and, according to a scenario table preset in a database, obtains the scene mode matched with the user's current control behavior; the scene mode is then sent to the intelligent controller in the home, so that the intelligent controller controls the corresponding smart home devices according to the received scene mode, reducing the user's control burden and improving the user experience.

Description

A scene judgment method and device based on user behavior
Technical field
The present invention relates to the smart home field, and in particular to a scene judgment method and device based on user behavior.
Background technique
The development of smart home networking technology in China began in the late 1990s. Around 2000, the concept of the smart home started to be publicized on a large scale, and ordinary residents in China began to understand and accept it. Nowadays, developers give more consideration to intelligent infrastructure during the design phase of residential districts and houses; a few high-end residential districts are already equipped with fairly complete smart home networks, and many developers heavily promote the "intelligentization" of housing as a selling point in real estate advertising.
However, today's smart homes largely remain at the level of intelligent remote control: users remotely operate the various smart devices in the home, or check their status, through devices such as mobile phones, tablets, and PCs. This is still some distance away from truly fully automatic intelligent control.
The ideal form of the smart home is one that lets people forget the existence of "intelligent products", because eventually all products will be intelligent, and what awaits us is an era in which "things connect to things". Having experienced the convenience that networked intelligence brings on computers and mobile phones, users have a strong demand for smart homes. In daily life, people are increasingly dissatisfied with non-intelligent environments; this dissatisfaction will grow ever stronger, and the resulting demand will become more and more urgent.
Summary of the invention
It is an object of the invention to overcome the shortcomings of the prior art by providing a scene judgment method and device based on user behavior, which can judge, from a preset scenario table, the scene mode the user currently needs according to the control instructions the user sends, and send the obtained scene mode to the intelligent controller in the home.
To achieve the above object, the present invention employs the following technical solutions:
In a first aspect, the present invention provides a scene judgment method based on user behavior, including:
obtaining a control instruction sent by a user;
reading a preset scenario table, wherein the preset scenario table includes at least one scene control mode and at least one scene trigger control instruction matched with the scene control mode;
obtaining, from the preset scenario table, the scene trigger control instruction matched with the control instruction;
obtaining, from the preset scenario table, the scene control mode matched with the obtained scene trigger control instruction; and
sending the scene control mode to a controller.
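The first-aspect steps above amount to a table lookup followed by a send. The following is an illustrative sketch, not the patented implementation: the table layout, the instruction strings, and the function names are assumptions.

```python
# Preset scenario table modeled as (trigger instruction, scene control mode) rows.
PRESET_SCENARIO_TABLE = [
    ("open porch lamp", "come-home-from-work mode"),
    ("user comes home", "come-home-from-work mode"),
    ("close compartment lamp", "rest mode"),
]

def judge_scene(control_instruction, scenario_table):
    """Return the scene control mode matched with the instruction, or None."""
    for trigger_instruction, scene_mode in scenario_table:
        if trigger_instruction == control_instruction:
            return scene_mode  # matched trigger -> matched scene control mode
    return None  # no match: the learning embodiment would take over

def send_to_controller(scene_mode):
    """Stand-in for sending the scene control mode to the home controller."""
    return {"scene_mode": scene_mode}

message = send_to_controller(judge_scene("open porch lamp", PRESET_SCENARIO_TABLE))
```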
In an embodiment of the invention, the scene judgment method based on user behavior further includes:
when there is no scene trigger control instruction matched with the first control instruction in the preset scenario table:
obtaining a preset learning time;
obtaining a preset instruction table, the preset instruction table including at least one scene control mode and at least one scene control instruction group matched with the scene control mode;
obtaining the control instructions the user sends within the preset learning time, and generating a comparison instruction group from the obtained control instructions;
obtaining, from the preset instruction table, the scene control instruction group matched with the comparison instruction group, denoted as the candidate instruction group;
obtaining, from the preset instruction table, the scene control mode matched with the candidate instruction group, denoted as the paired scene control mode; and
denoting the first control instruction as the scene trigger control instruction matched with the paired scene control mode, and writing it into the preset scenario table.
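The learning branch above can be sketched as follows. The set-equality notion of "matched" between the comparison instruction group and a stored scene control instruction group is an assumption, as are the table shapes and names.

```python
# Preset instruction table: scene control mode -> scene control instruction group.
PRESET_INSTRUCTION_TABLE = {
    "come-home-from-work mode": {"open parlor lamp", "open water heater", "open TV"},
    "rest mode": {"close parlor lamp", "close TV"},
}

def learn_trigger(first_instruction, collected_instructions,
                  instruction_table, scenario_table):
    """Pair an unmatched first instruction with a scene control mode and
    write the new trigger into the scenario table."""
    comparison_group = set(collected_instructions)
    for scene_mode, instruction_group in instruction_table.items():
        if instruction_group == comparison_group:  # candidate instruction group found
            scenario_table.append((first_instruction, scene_mode))
            return scene_mode  # the paired scene control mode
    return None

scenario_table = []
paired = learn_trigger(
    "open air conditioner",
    ["open parlor lamp", "open water heater", "open TV"],
    PRESET_INSTRUCTION_TABLE,
    scenario_table,
)
```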
In an embodiment of the invention, the preset learning time includes a preset duration and a preset interval time;
then the step of obtaining the control instructions the user sends within the preset learning time, and generating a control instruction group from the obtained control instructions, specifically includes:
obtaining a control instruction sent by the user, and starting timing;
when no new control instruction has been obtained after the preset interval time is exceeded, or the timed duration exceeds the preset duration:
stopping obtaining the control instructions sent by the user; and
generating the control instruction group from the control instructions obtained.
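The stopping rule above (preset interval exceeded, or preset duration exceeded) can be sketched over simulated timestamps. The 10-minute duration and 2-minute interval match the concrete example in the description; the event encoding is an assumption.

```python
PRESET_DURATION = 10 * 60   # seconds (preset duration: 10 minutes)
PRESET_INTERVAL = 2 * 60    # seconds (preset interval time: 2 minutes)

def collect_instruction_group(timed_instructions,
                              duration=PRESET_DURATION,
                              interval=PRESET_INTERVAL):
    """timed_instructions: list of (seconds_since_start, instruction).
    Collection stops at the first gap longer than the interval, or once
    the preset duration has elapsed."""
    group, last_t = [], 0.0
    for t, instruction in timed_instructions:
        if t - last_t > interval or t > duration:
            break  # interval or duration exceeded: stop obtaining instructions
        group.append(instruction)
        last_t = t
    return group

events = [(5, "open parlor lamp"), (70, "open water heater"),
          (150, "open TV"), (400, "open curtains")]  # 400 - 150 > 120s gap
group = collect_instruction_group(events)
```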
In an embodiment of the present invention, the preset scenario table further includes at least one scene trigger time matched with the scene trigger control instruction;
the scene control mode is matched with at least one of the scene trigger times;
then the step of obtaining, from the preset scenario table, the scene control mode matched with the obtained scene trigger control instruction specifically includes:
obtaining the current time, denoted as the first time;
obtaining, from the preset scenario table, the scene trigger times matched with the scene trigger control instruction, denoted as second times;
denoting the second time matched with the first time as the candidate time; and
obtaining, from the preset scenario table, the scene control mode matched with the candidate time.
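With trigger times added, the lookup can be sketched like this. The minutes-plus-weekday encoding and the "weekend morning mode" row are illustrative assumptions, not taken from the patent.

```python
# Rows: (trigger instruction, weekdays 0=Mon..6=Sun, start_min, end_min, scene mode)
TIMED_SCENARIO_TABLE = [
    ("open porch lamp", {0, 1, 2, 3, 4}, 19 * 60, 22 * 60, "come-home-from-work mode"),
    ("open porch lamp", {5, 6}, 7 * 60, 10 * 60, "weekend morning mode"),  # assumed
]

def judge_scene_by_time(instruction, weekday, minute, table):
    """Return the scene mode whose trigger time (second time) matches the
    current time (first time); the matching row's time is the candidate time."""
    for trigger, days, start, end, scene_mode in table:
        if trigger == instruction and weekday in days and start <= minute <= end:
            return scene_mode
    return None

# Tuesday 19:30 -> weekday=1, minute=19*60+30
mode = judge_scene_by_time("open porch lamp", 1, 19 * 60 + 30, TIMED_SCENARIO_TABLE)
```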
In an embodiment of the invention, the scene judgment method based on user behavior further includes:
when there is no second time matched with the first time in the preset scenario table:
obtaining a preset learning time;
obtaining a preset instruction table, the preset instruction table including at least one scene control mode and at least one scene control instruction group matched with the scene control mode;
obtaining the control instructions the user sends within the preset learning time, and generating a comparison instruction group from the obtained control instructions;
obtaining, from the preset instruction table, the scene control instruction group matched with the comparison instruction group, denoted as the candidate instruction group;
obtaining, from the preset instruction table, the scene control mode matched with the candidate instruction group, denoted as the paired scene control mode; and
denoting the first time as the scene trigger time matched with the paired scene control mode, and writing it into the preset scenario table.
In an embodiment of the invention, the preset learning time includes a preset duration and a preset interval time;
then the step of obtaining the control instructions the user sends within the preset learning time, and generating a control instruction group from the obtained control instructions, specifically includes:
obtaining a control instruction sent by the user, and starting timing;
when no new control instruction has been obtained after the preset interval time is exceeded, or the timed duration exceeds the preset duration:
stopping obtaining the control instructions sent by the user; and
generating the control instruction group from the control instructions obtained.
In a second aspect, the present invention also provides a scene judgment device based on user behavior, including a control instruction obtaining module, a preset scenario table reading module, a control instruction matching module, a scene control mode obtaining module, and a sending module;
wherein the control instruction obtaining module is used for obtaining the control instruction sent by the user;
the preset scenario table reading module is used for reading the preset scenario table, wherein the preset scenario table includes at least one scene control mode and at least one scene trigger control instruction matched with the scene control mode;
the control instruction matching module is used for obtaining, from the preset scenario table, the scene trigger control instruction matched with the control instruction;
the scene control mode obtaining module is used for obtaining, from the preset scenario table, the scene control mode matched with the obtained scene trigger control instruction; and
the sending module is used for sending the scene control mode to the controller.
Beneficial effects of the present invention:
The present invention provides a scene judgment method and device based on user behavior that can capture the control instructions a user sends to smart devices and, according to a scenario table preset in a database, obtain the scene mode matched with the user's current control behavior. The scene mode is sent to the intelligent controller in the home, so that the intelligent controller controls the corresponding smart home devices according to the received scene mode, reducing the user's control burden and improving the user experience.
Detailed description of the invention
Fig. 1 is a kind of flow diagram of the scene judgment method based on user behavior in one embodiment of the invention;
Fig. 2 is a kind of structural schematic diagram of the scene judgment means based on user behavior in one embodiment of the invention.
Specific embodiment
The present invention is further described below with reference to the accompanying drawings and specific embodiments; the illustrative examples and explanations therein are only used to explain the present invention and are not intended as limitations of the invention.
In a first aspect, as shown in Figure 1, the present invention provides a scene judgment method based on user behavior, including:
S100: obtaining a first control instruction;
Specifically, the first control instruction can be actively issued by the user, for example sent through a mobile terminal; it can also be issued automatically by a sensor, for example sent by a presence sensor arranged in the entrance hall when it senses that the user has come home.
S200: reading a preset scenario table, wherein the preset scenario table includes at least one scene trigger control instruction and at least one scene control mode matched with the scene trigger control instruction;
S300: obtaining, from the preset scenario table, the scene trigger control instruction matched with the first control instruction;
S400: obtaining, from the preset scenario table, the scene control mode matched with the obtained scene trigger control instruction;
S500: sending the scene control mode to a controller.
In a concrete application scenario of the invention, the method provided by the first aspect of the present invention is executed by a background server in which the preset scenario table is stored; the preset scenario table includes at least one scene trigger control instruction and at least one scene control mode matched with the scene trigger control instruction, as in the following table:
Scene trigger control instruction | Scene mode
Open porch lamp | Come-home-from-work mode
User-comes-home instruction | Come-home-from-work mode
Close compartment lamp | Rest mode
…… | ……
When the user arrives home, he sends an open-porch-lamp instruction to the background server through a mobile phone; or the presence sensor arranged in the entrance hall, after sensing that the user has come home, automatically sends a user-comes-home instruction to the background server. According to the received instruction, the background server reads from the preset scenario table the scene mode matched with the open-porch-lamp instruction or the user-comes-home instruction, namely the come-home-from-work mode. The background server then sends the obtained come-home-from-work mode to the intelligent home controller in the house; the controller, according to the received mode, obtains the corresponding control instruction group from its stored preset instruction table and, according to that group, controls the working states of the corresponding devices in the home, such as opening the water heater, the TV, and the air conditioner.
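The controller side of this example — receiving the come-home-from-work mode and applying its stored instruction group — might look like the following sketch; the device names and the dictionary-based instruction table are assumptions.

```python
# Controller-side preset instruction table: scene mode -> instruction group.
CONTROLLER_INSTRUCTION_TABLE = {
    "come-home-from-work mode": ["open water heater", "open TV", "open air conditioner"],
}

def apply_scene_mode(scene_mode, instruction_table, device_states):
    """Execute each instruction of the mode's group against device states."""
    for instruction in instruction_table.get(scene_mode, []):
        action, _, device = instruction.partition(" ")  # e.g. "open", "water heater"
        device_states[device] = "on" if action == "open" else "off"
    return device_states

states = apply_scene_mode("come-home-from-work mode",
                          CONTROLLER_INSTRUCTION_TABLE, {})
```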
In an embodiment of the invention, the scene judgment method based on user behavior further includes:
when there is no scene trigger control instruction matched with the first control instruction in the preset scenario table:
obtaining a preset learning time;
obtaining a preset instruction table, the preset instruction table including at least one scene control mode and at least one scene control instruction group matched with the scene control mode;
obtaining the control instructions the user sends within the preset learning time, and generating a comparison instruction group from the obtained control instructions;
obtaining, from the preset instruction table, the scene control instruction group matched with the comparison instruction group, denoted as the candidate instruction group;
obtaining, from the preset instruction table, the scene control mode matched with the candidate instruction group, denoted as the paired scene control mode; and
denoting the first control instruction as the scene trigger control instruction matched with the paired scene control mode, and writing it into the preset scenario table.
In an embodiment of the invention, the preset learning time includes a preset duration and a preset interval time;
then the step of obtaining the control instructions the user sends within the preset learning time, and generating a control instruction group from the obtained control instructions, specifically includes:
obtaining a control instruction sent by the user, and starting timing;
when no new control instruction has been obtained after the preset interval time is exceeded, or the timed duration exceeds the preset duration:
stopping obtaining the control instructions sent by the user; and
generating the control instruction group from the control instructions obtained.
In a concrete application scenario of the invention, the method provided by the first aspect of the present invention is executed by a background server in which the preset scenario table and the preset instruction table are stored.
After the user arrives home, he sends an air-conditioner-on instruction to the background server through a mobile phone. According to the received instruction, the background server queries the preset scenario table for a matching scene trigger control instruction. When the background server finds no scene trigger instruction matched with the air-conditioner-on instruction, it enters learning mode and obtains the preset learning time, for example a preset duration of 10 minutes and a preset interval time of 2 minutes. Meanwhile, the background server starts timing and collects the control instructions the user sends within 10 minutes. When the user has sent no new control instruction for more than 2 minutes, or the time exceeds 10 minutes, the background server generates a comparison instruction group from the instructions obtained, such as opening the parlor lamp, opening the water heater, and opening the TV. It then obtains, from the preset instruction table, the scene control instruction group matched with the comparison instruction group, and from the obtained scene control instruction group obtains the matched scene control mode, such as the come-home-from-work mode. The background server thus judges that the scene control mode matched with the air-conditioner-on instruction is the come-home-from-work mode, denotes the air-conditioner-on instruction as a scene trigger control instruction of the come-home-from-work mode, and writes it into the preset scenario table.
In an embodiment of the present invention, the preset scenario table further includes at least one scene trigger time matched with the scene trigger control instruction;
the scene control mode is matched with at least one of the scene trigger times;
then step S400 specifically includes:
obtaining the current time, denoted as the first time;
obtaining, from the preset scenario table, the scene trigger times matched with the scene trigger control instruction, denoted as second times;
denoting the second time matched with the first time as the candidate time; and
obtaining, from the preset scenario table, the scene control mode matched with the candidate time.
Specifically, a scene trigger time may be a specific moment, such as 8:00; a specific period, such as 8:00 to 10:00; or a combination of a date and a moment or period, such as Monday to Friday, 8:00 to 10:00.
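The three trigger-time forms just described (a specific moment, a period, and a day-plus-period combination) can be checked with a single predicate; the tuple encodings below are assumptions for illustration.

```python
def matches_trigger_time(spec, weekday, minute):
    """spec: ("moment", m) | ("period", start, end) | ("days", dayset, start, end).
    weekday: 0=Monday .. 6=Sunday; minute: minutes since midnight."""
    kind = spec[0]
    if kind == "moment":
        return minute == spec[1]
    if kind == "period":
        return spec[1] <= minute <= spec[2]
    if kind == "days":
        return weekday in spec[1] and spec[2] <= minute <= spec[3]
    return False

# Monday-to-Friday, 8:00-10:00, checked on a Tuesday at 9:00
ok = matches_trigger_time(("days", {0, 1, 2, 3, 4}, 8 * 60, 10 * 60), 1, 9 * 60)
```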
In a concrete application scenario of the invention, the method provided by the first aspect of the present invention is executed by a background server in which the preset scenario table is stored; the preset scenario table includes at least one scene trigger control instruction, at least one scene trigger time matched with the scene trigger control instruction, and at least one scene control mode matched with the scene trigger time, for example:
When the user arrives home, he sends an open-porch-lamp instruction to the background server through a mobile phone. According to the received instruction, the background server reads from the preset scenario table the scene trigger times matched with the open-porch-lamp instruction, including: Monday to Friday, 19:00-22:00; Monday to Friday, 7:00-8:00; and weekends, 7:00-10:00. Meanwhile, the background server obtains the current time, for example Tuesday, 19:30. The background server then obtains from the preset scenario table, according to the current time, the matched scene trigger time, Monday to Friday, 19:00-22:00, and obtains the scene mode matched with that scene trigger time, namely the come-home-from-work mode. The background server sends the obtained come-home-from-work mode to the intelligent home controller in the house; the controller, according to the received mode, obtains the corresponding control instruction group from its stored preset instruction table and, according to that group, controls the working states of the corresponding devices in the home, such as opening the water heater, the TV, and the air conditioner.
In an embodiment of the invention, the scene judgment method based on user behavior further includes:
when there is no second time matched with the first time in the preset scenario table:
obtaining a preset learning time;
obtaining a preset instruction table, the preset instruction table including at least one scene control mode and at least one scene control instruction group matched with the scene control mode;
obtaining the control instructions the user sends within the preset learning time, and generating a comparison instruction group from the obtained control instructions;
obtaining, from the preset instruction table, the scene control instruction group matched with the comparison instruction group, denoted as the candidate instruction group;
obtaining, from the preset instruction table, the scene control mode matched with the candidate instruction group, denoted as the paired scene control mode; and
denoting the first time as the scene trigger time matched with the paired scene control mode, and writing it into the preset scenario table.
In an embodiment of the invention, the preset learning time includes a preset duration and a preset interval time;
then the step of obtaining the control instructions the user sends within the preset learning time, and generating a control instruction group from the obtained control instructions, specifically includes:
obtaining a control instruction sent by the user, and starting timing;
when no new control instruction has been obtained after the preset interval time is exceeded, or the timed duration exceeds the preset duration:
stopping obtaining the control instructions sent by the user; and
generating the control instruction group from the control instructions obtained.
In a concrete application scenario of the invention, following the example above, the user sends an open-porch-lamp instruction to the background server through a mobile phone. According to the received instruction, the background server reads from the preset scenario table the scene trigger times matched with the open-porch-lamp instruction, and meanwhile obtains the current time, for example Tuesday, 22:30. When the background server finds no scene trigger time matched with the current time, it enters learning mode and obtains the preset learning time, for example a preset duration of 10 minutes and a preset interval time of 2 minutes. Meanwhile, the background server starts timing and collects the control instructions the user sends within 10 minutes. When the user has sent no new control instruction for more than 2 minutes, or the time exceeds 10 minutes, the background server generates a control instruction group from the instructions obtained, such as opening the parlor lamp, opening the water heater, and opening the TV. It then obtains, from the preset instruction table, the scene control instruction group matched with the comparison instruction group, and from the obtained scene control instruction group obtains the matched scene control mode, such as the come-home-from-work mode. The background server then judges that the scene control mode matched with the current time is the come-home-from-work mode, denotes the current time as a scene trigger time of the come-home-from-work mode, and writes it into the preset scenario table.
In a second aspect, the present invention also provides a scene judgment device based on user behavior, including a control instruction obtaining module 100, a preset scenario table reading module 200, a control instruction matching module 300, a scene control mode obtaining module 400, and a sending module 500;
wherein the control instruction obtaining module 100 is used for obtaining the control instruction sent by the user;
the preset scenario table reading module 200 is used for reading the preset scenario table, wherein the preset scenario table includes at least one scene control mode and at least one scene trigger control instruction matched with the scene control mode;
the control instruction matching module 300 is used for obtaining, from the preset scenario table, the scene trigger control instruction matched with the control instruction;
the scene control mode obtaining module 400 is used for obtaining, from the preset scenario table, the scene control mode matched with the obtained scene trigger control instruction; and
the sending module 500 is used for sending the scene control mode to the controller.
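A structural sketch of how the five modules could be composed follows; the class names, the table format, and the callback-style sending module are illustrative assumptions, not taken from the patent.

```python
class ScenarioTableReader:  # preset scenario table reading module (200)
    def __init__(self, table):
        self.table = table  # [(trigger instruction, scene control mode), ...]

    def read(self):
        return self.table

class SceneJudgmentDevice:
    def __init__(self, reader, sender):
        self.reader = reader   # reading module 200
        self.sender = sender   # sending module 500

    def handle(self, control_instruction):  # modules 100, 300, 400 combined
        for trigger, scene_mode in self.reader.read():
            if trigger == control_instruction:     # matching module 300
                return self.sender(scene_mode)     # send mode to the controller
        return None

sent = []
device = SceneJudgmentDevice(
    ScenarioTableReader([("open porch lamp", "come-home-from-work mode")]),
    sender=lambda mode: sent.append(mode) or mode,
)
result = device.handle("open porch lamp")
```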
In an embodiment of the invention, the scene judgment device based on user behavior further includes a learning module and a preset instruction table obtaining module;
when the control instruction matching module 300 judges that there is no scene trigger control instruction matched with the first control instruction in the preset scenario table:
the learning module is used for obtaining the preset learning time;
the preset instruction table obtaining module is used for obtaining the preset instruction table, the preset instruction table including at least one scene control mode and at least one scene control instruction group matched with the scene control mode;
the learning module is also used for obtaining the control instructions the user sends within the preset learning time, and generating a control instruction group from the obtained control instructions;
the learning module is also used for obtaining, from the preset instruction table, the scene control instruction group matched with the comparison instruction group, denoted as the candidate instruction group;
the learning module is also used for obtaining, from the preset instruction table, the scene control mode matched with the candidate instruction group, denoted as the paired scene control mode; and
the learning module is also used for denoting the first control instruction as the scene trigger control instruction matched with the paired scene control mode, and writing it into the preset scenario table.
In an embodiment of the present invention, the preset scenario table further includes at least one scene trigger time matched with the control instruction;
then the scene judgment device based on user behavior further includes a first time obtaining module, a second time obtaining module, and a judgment module;
wherein the first time obtaining module is used for obtaining the current time, denoted as the first time;
the second time obtaining module is used for obtaining, from the preset scenario table, the scene trigger times matched with the scene trigger control instruction, denoted as second times;
the judgment module is used for denoting the second time matched with the first time as the candidate time; and
the scene control mode obtaining module 400 is also used for obtaining, from the preset scenario table, the scene control mode matched with the candidate time.
In an embodiment of the invention, the scene judgment device based on user behavior further includes a learning module and a preset instruction table obtaining module;
when the scene control mode obtaining module 400 judges that there is no second time matched with the first time in the preset scenario table:
the learning module is used for obtaining the preset learning time;
the preset instruction table obtaining module is used for obtaining the preset instruction table, the preset instruction table including at least one scene control mode and at least one scene control instruction group matched with the scene control mode;
the learning module is also used for obtaining the control instructions the user sends within the preset learning time, and generating a control instruction group from the obtained control instructions;
the learning module is also used for obtaining, from the preset instruction table, the scene control instruction group matched with the comparison instruction group, denoted as the candidate instruction group;
the learning module is also used for obtaining, from the preset instruction table, the scene control mode matched with the candidate instruction group, denoted as the paired scene control mode; and
the learning module is also used for denoting the first time as the scene trigger time matched with the paired scene control mode, and writing it into the preset scenario table.
Obviously, the above embodiments are merely examples given to clearly illustrate the technical solution of the present invention, and are not intended to limit the embodiments of the present invention. Those skilled in the art can make variations or modifications in other forms on the basis of the above description; insofar as they do not depart from the inventive concept, all such variations fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (7)

1. A scene judgment method based on user behavior, characterized by comprising:
obtaining a control instruction sent by a user;
reading a preset scenario table, wherein the preset scenario table includes at least one scene control mode and at least one scene trigger control instruction matching the scene control mode;
obtaining, from the preset scenario table, the scene trigger control instruction matching the control instruction;
obtaining, from the preset scenario table, the scene control mode matching the obtained scene trigger control instruction;
sending the scene control mode to a controller.
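The two-step lookup of claim 1 can be sketched as a pair of table lookups. The table layout and all identifiers below are illustrative assumptions, not part of the claimed method:

```python
# Minimal sketch of the claim-1 flow: map a user's control instruction to a
# scene trigger control instruction, then to the matching scene control mode.
# The flat dictionary stands in for the preset scenario table (an assumption).

PRESET_SCENARIO_TABLE = {
    # scene trigger control instruction -> scene control mode
    "light_on": "evening_mode",
    "curtain_close": "sleep_mode",
}

def judge_scene(control_instruction, table=PRESET_SCENARIO_TABLE):
    """Return the scene control mode matching the instruction, or None."""
    # Step 1: look for a matching scene trigger control instruction.
    if control_instruction not in table:
        return None  # claim 2 falls back to a learning phase in this case
    # Step 2: obtain the scene control mode matched to that trigger.
    return table[control_instruction]

def send_to_controller(mode):
    # Stand-in for dispatching the selected mode to the scene controller.
    print(f"controller <- {mode}")

mode = judge_scene("light_on")
if mode is not None:
    send_to_controller(mode)  # prints "controller <- evening_mode"
```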
2. The scene judgment method based on user behavior according to claim 1, characterized in that the scene judgment method based on user behavior further comprises:
when no scene trigger control instruction matching a first control instruction exists in the preset scenario table:
obtaining a preset learning time;
obtaining a preset instruction table, wherein the preset instruction table includes at least one scene control mode and at least one scene control instruction group matching the scene control mode;
obtaining the control instructions sent by the user within the preset learning time, and generating a control instruction group from the obtained control instructions;
obtaining, from the preset instruction table, the scene control instruction group matching the generated control instruction group, which is denoted as a candidate instruction group;
obtaining, from the preset instruction table, the scene control mode matching the candidate instruction group, which is denoted as a paired scene control mode;
denoting the first control instruction as the scene trigger control instruction matching the paired scene control mode, and writing it into the preset scenario table.
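The learning fallback of claim 2 can be sketched as follows. The claim does not specify how a collected group "matches" a stored group, so exact set equality is used here as an assumption, and all names are illustrative:

```python
# Sketch of the claim-2 learning fallback: when the first instruction has no
# stored trigger, collect the instructions sent during the learning window,
# match the collected group against the preset instruction table, and write
# the first instruction into the scenario table as a new trigger.
# Table contents and the set-equality matching rule are assumptions.

PRESET_INSTRUCTION_TABLE = {
    # scene control mode -> scene control instruction group
    "evening_mode": frozenset({"light_on", "curtain_close", "tv_on"}),
    "away_mode": frozenset({"light_off", "lock_door"}),
}

def learn_trigger(first_instruction, collected, scenario_table,
                  instruction_table=PRESET_INSTRUCTION_TABLE):
    """Match the collected group; write a new trigger into scenario_table."""
    group = frozenset(collected)  # generated control instruction group
    for mode, candidate_group in instruction_table.items():
        if group == candidate_group:                  # candidate instruction group
            scenario_table[first_instruction] = mode  # paired scene control mode
            return mode
    return None  # nothing learned if no stored group matches

scenario_table = {}
learned = learn_trigger("light_on",
                        ["light_on", "curtain_close", "tv_on"],
                        scenario_table)
print(learned, scenario_table)  # evening_mode {'light_on': 'evening_mode'}
```

After this write, the next "light_on" instruction resolves directly through the claim-1 lookup instead of triggering the learning phase again.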
3. The scene judgment method based on user behavior according to claim 2, characterized in that the preset learning time includes a preset duration and a preset interval time;
then the obtaining of the control instructions sent by the user within the preset learning time and the generating of the control instruction group from the obtained control instructions specifically comprise:
obtaining a control instruction sent by the user, and starting timing;
when no new control instruction has been obtained after the preset interval time has elapsed, or when the timed duration exceeds the preset duration:
stopping obtaining the control instructions sent by the user;
generating the control instruction group from the control instructions obtained.
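The collection rule of claim 3 has two stopping conditions: an idle gap longer than the preset interval time, or a total elapsed time beyond the preset duration. A sketch over a timestamped event list (an assumption standing in for a live instruction stream):

```python
# Sketch of the claim-3 collection rule: gather instructions until either no
# new instruction arrives within the preset interval time, or the total
# elapsed time exceeds the preset duration. Timestamps are in seconds from
# an arbitrary origin; the event-list representation is an assumption.

def collect_instruction_group(events, preset_interval, preset_duration):
    """events: list of (timestamp_seconds, instruction), sorted by time."""
    if not events:
        return []
    start = events[0][0]          # timing starts with the first instruction
    collected = [events[0][1]]
    last = start
    for ts, instr in events[1:]:
        if ts - last > preset_interval:   # idle gap too long: stop collecting
            break
        if ts - start > preset_duration:  # learning window exhausted: stop
            break
        collected.append(instr)
        last = ts
    return collected

events = [(0, "light_on"), (5, "curtain_close"), (8, "tv_on"), (60, "fan_on")]
group = collect_instruction_group(events, preset_interval=30, preset_duration=40)
print(group)  # ['light_on', 'curtain_close', 'tv_on'] (fan_on arrives too late)
```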
4. The scene judgment method based on user behavior according to claim 1, characterized in that the preset scenario table further includes at least one scene trigger time matching the scene trigger control instruction;
the scene control mode matches at least one of the scene trigger times;
then the obtaining of the matching scene control mode from the preset scenario table according to the obtained scene trigger control instruction specifically comprises:
obtaining the current time, which is denoted as a first time;
obtaining, from the preset scenario table, the scene trigger time matching the scene trigger control instruction, which is denoted as a second time;
denoting the second time matching the first time as a candidate time;
obtaining, from the preset scenario table, the scene control mode matching the candidate time.
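Claim 4 lets one trigger instruction select different scene control modes depending on when it arrives. The claim does not define how the first and second times are "matched", so a tolerance window is assumed here for illustration, and all names are hypothetical:

```python
# Sketch of the claim-4 time-based disambiguation: a trigger instruction maps
# to several (scene trigger time, scene control mode) pairs; the entry whose
# trigger time matches the current time wins. The minute-tolerance comparison
# is an assumption; the patent leaves the matching rule unspecified.

from datetime import datetime, time

# scene trigger control instruction -> [(scene trigger time, scene control mode)]
TIMED_SCENARIO_TABLE = {
    "light_on": [(time(19, 0), "evening_mode"), (time(7, 0), "morning_mode")],
}

def match_scene_by_time(instruction, now, table=TIMED_SCENARIO_TABLE,
                        tolerance_minutes=60):
    """Return the scene control mode whose trigger time matches `now`, or None."""
    first_time = now.hour * 60 + now.minute            # current time, in minutes
    for trigger_time, mode in table.get(instruction, []):
        second_time = trigger_time.hour * 60 + trigger_time.minute
        if abs(first_time - second_time) <= tolerance_minutes:
            return mode  # this trigger time is the candidate time
    return None  # claim 5 falls back to a learning phase in this case

now = datetime(2018, 6, 11, 19, 30)
print(match_scene_by_time("light_on", now))  # evening_mode
```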
5. The scene judgment method based on user behavior according to claim 4, characterized in that the scene judgment method based on user behavior further comprises:
when no second time matching the first time exists in the preset scenario table:
obtaining a preset learning time;
obtaining a preset instruction table, wherein the preset instruction table includes at least one scene control mode and at least one scene control instruction group matching the scene control mode;
obtaining the control instructions sent by the user within the preset learning time, and generating a control instruction group from the obtained control instructions;
obtaining, from the preset instruction table, the scene control instruction group matching the generated control instruction group, which is denoted as a candidate instruction group;
obtaining, from the preset instruction table, the scene control mode matching the candidate instruction group, which is denoted as a paired scene control mode;
denoting the first time as the scene trigger time matching the paired scene control mode, and writing it into the preset scenario table.
6. The scene judgment method based on user behavior according to claim 5, characterized in that the preset learning time includes a preset duration and a preset interval time;
then the obtaining of the control instructions sent by the user within the preset learning time and the generating of the control instruction group from the obtained control instructions specifically comprise:
obtaining a control instruction sent by the user, and starting timing;
when no new control instruction has been obtained after the preset interval time has elapsed, or when the timed duration exceeds the preset duration:
stopping obtaining the control instructions sent by the user;
generating the control instruction group from the control instructions obtained.
7. A scene judgment apparatus based on user behavior, characterized by comprising a control instruction obtaining module, a preset scenario table reading module, a control instruction matching module, a scene control mode obtaining module, and a sending module;
wherein the control instruction obtaining module is configured to obtain a control instruction sent by a user;
the preset scenario table reading module is configured to read a preset scenario table, wherein the preset scenario table includes at least one scene control mode and at least one scene trigger control instruction matching the scene control mode;
the control instruction matching module is configured to obtain, from the preset scenario table, the scene trigger control instruction matching the control instruction;
the scene control mode obtaining module is configured to obtain, from the preset scenario table, the scene control mode matching the obtained scene trigger control instruction;
the sending module is configured to send the scene control mode to a controller.
CN201810597909.8A 2018-06-11 2018-06-11 A kind of scene judgment method and device based on user behavior Pending CN108919655A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810597909.8A CN108919655A (en) 2018-06-11 2018-06-11 A kind of scene judgment method and device based on user behavior

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810597909.8A CN108919655A (en) 2018-06-11 2018-06-11 A kind of scene judgment method and device based on user behavior

Publications (1)

Publication Number Publication Date
CN108919655A true CN108919655A (en) 2018-11-30

Family

ID=64418852

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810597909.8A Pending CN108919655A (en) 2018-06-11 2018-06-11 A kind of scene judgment method and device based on user behavior

Country Status (1)

Country Link
CN (1) CN108919655A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110381385A (en) * 2019-07-25 2019-10-25 四川长虹电器股份有限公司 A kind of polymerization TV Internet of Things platform intelligent scene edit methods based on web
CN111474884A (en) * 2020-04-28 2020-07-31 广州方胜智能工程有限公司 Intelligent scene linkage method and system
CN114500139A (en) * 2022-01-27 2022-05-13 青岛海尔科技有限公司 Instruction group sending method and device, storage medium and electronic device

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8490006B1 (en) * 2012-09-04 2013-07-16 State Farm Mutual Automobile Insurance Company Scene creation for building automation systems
CN104834223A (en) * 2015-05-08 2015-08-12 丰唐物联技术(深圳)有限公司 Information pushing method and device thereof
CN105182777A (en) * 2015-09-18 2015-12-23 小米科技有限责任公司 Equipment controlling method and apparatus
CN105511286A (en) * 2016-01-14 2016-04-20 上海岂控信息科技有限公司 Smart home situation control method and system based on multi-condition judgment
CN105785777A (en) * 2016-03-04 2016-07-20 橙朴(上海)智能科技有限公司 Intelligent home control system based on learning
CN106019971A (en) * 2016-07-22 2016-10-12 宁波久婵物联科技有限公司 Anti-theft scenario system for simulating presence of person at home
CN106773766A (en) * 2016-12-31 2017-05-31 广东博意建筑设计院有限公司 Smart home house keeper central control system and its control method with learning functionality
CN107809355A (en) * 2016-09-08 2018-03-16 北京京东尚科信息技术有限公司 A kind of method and system of smart machine coordinated signals
CN107991892A (en) * 2017-11-02 2018-05-04 珠海格力电器股份有限公司 The methods, devices and systems of control device
CN108052014A (en) * 2017-12-18 2018-05-18 美的集团股份有限公司 Control method, system and the computer readable storage medium of smart home



Similar Documents

Publication Publication Date Title
CN108572559A (en) A kind of smart home scene control method and system
US8390432B2 (en) Apparatus and method of controlling digital appliances based on parking management
CN108919655A (en) A kind of scene judgment method and device based on user behavior
CN105840037B (en) A kind of multifunctional window and curtain Controller and method
CN205696571U (en) Wardrobe
CN106707770B (en) Intelligent electrical appliance control and intelligent Home Appliance Controller
CN111025925A (en) Intelligent home furnishing system based on cloud computing
CN110308660A (en) Smart machine control method and device
CN105162971B (en) A kind of mobile intelligent terminal controls the method and system of intelligent domestic system
CN104009898A (en) Household appliance and control method and device thereof
CN106292308A (en) User terminal, household central controller, intelligent home furnishing control method and system
CN211043962U (en) Intelligent household control system
CN204880542U (en) Air conditioner intelligent control box
CN103616879A (en) Informationized smart home life control system
CN110287937A (en) Equipment state reminding method, control equipment and the control system of knowledge based map
CN105042779A (en) Intelligent air conditioner control box and method
CN107566227A (en) Control method, device, smart machine and the storage medium of home appliance
CN102997369A (en) Humidifier control method
CN110308661A (en) Smart machine control method and device based on machine learning
CN106909078A (en) Home gateway and intelligent domestic system, the control method of household electrical appliance
CN113009843A (en) Household appliance control method and device, household appliance and storage medium
CN110973846A (en) Intelligent wardrobe
CN108873834A (en) A kind of smart machine inter-linked controlling method and device
CN111431776A (en) Information configuration method, device and system
CN208241681U (en) A kind of Internet of Things management equipment and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20181130