CN115348115B - Attack prediction model training method, attack prediction method and system for smart home - Google Patents

Attack prediction model training method, attack prediction method and system for smart home

Info

Publication number
CN115348115B
CN115348115B (application CN202211282236.XA)
Authority
CN
China
Prior art keywords
value
training
attack prediction
prediction model
attack
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211282236.XA
Other languages
Chinese (zh)
Other versions
CN115348115A (en)
Inventor
Lai Fangmin (赖方民)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Youkegu Technology Co ltd
Original Assignee
Guangzhou Youkegu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Youkegu Technology Co ltd filed Critical Guangzhou Youkegu Technology Co ltd
Priority to CN202211282236.XA priority Critical patent/CN115348115B/en
Publication of CN115348115A publication Critical patent/CN115348115A/en
Application granted granted Critical
Publication of CN115348115B publication Critical patent/CN115348115B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00: Network architectures or network communication protocols for network security
    • H04L63/14: Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1408: Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00: Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/14: Network analysis or design
    • H04L41/147: Network analysis or design for predicting network behaviour
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L41/00: Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
    • H04L41/16: Arrangements for maintenance, administration or management of data switching networks using machine learning or artificial intelligence
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00: Network architectures or network communication protocols for network security
    • H04L63/20: Network architectures or network communication protocols for network security for managing network security; network security policies in general

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention relates to an attack prediction model training method for smart homes, comprising the following steps. S1: a smart home device constructs a training data set; the device receives raw data from other devices, labels each piece of raw data as y_i according to the influence of that data access on the device, extracts features from the raw data to obtain a feature vector x_i, and forms a training sample from the tuple (x_i, y_i); the training data set constructed by the device contains n training samples, with i ∈ [1, n]. S2: construct a prediction model. S3: enter an iterative solution process; the iteration variable j has an initial value of 0, and the iteration ends when j ≥ J, where J is a preset constant.

Description

Attack prediction model training method, attack prediction method and system for smart home
Technical Field
The invention relates to the technical field of the Internet of Things, and in particular to an attack prediction model training method, an attack prediction method, and an attack prediction system for smart homes.
Background
With Internet of Things and intelligent hardware technologies maturing, smart homes have moved from concept to application. A smart home integrates artificial intelligence, sensor, network, and automatic control technologies into one system, forming a small Internet of Things with a house or building as its unit and improving the efficiency and intelligence of information exchange among devices. The smart home system automatically monitors and controls home devices such as lighting, temperature and humidity control, entertainment systems, and electric appliances. It must also keep the household's occupants and equipment in a safe operating state, for example through access control and alarm services. For reasons of privacy protection and system protection, smart home systems are generally rather closed, so a security protection algorithm needs to be developed to detect the attack behavior of external malicious nodes.
At present, the common practice is to train a model on collected sample data with deep learning algorithms and then use the trained model to classify unknown samples, yielding a probability estimate of whether the sender of a sample is an attacker. Such algorithms place high demands on device hardware and reach satisfactory prediction accuracy only when the sample set, and in particular the number of valid samples, is sufficiently large. However, smart home equipment mostly uses low-performance processors, making it difficult to achieve both prediction accuracy and real-time performance. In addition, a smart home scenario does not easily yield enough samples, which further degrades deep learning algorithms. Therefore, developing a fast attack prediction model training algorithm for small samples in the smart home field is one of the important supporting technologies for the further popularization and application of smart homes.
Disclosure of Invention
The aim of the invention is to provide an attack prediction model training method for smart homes that, under the limited computing capacity of smart home devices, uses a fast-converging training algorithm designed for small samples, so that the model fits the training samples well and has high robustness.
To achieve this aim, the technical scheme is as follows:
The attack prediction model training method for smart homes comprises the following steps:
S1. The smart home device constructs a training data set: the smart home device receives raw data from other devices and labels each piece of raw data as y_i according to the influence of that data access on the device; features are extracted from the raw data to obtain a feature vector x_i; the tuple (x_i, y_i) forms a training sample; the training data set constructed by the smart home device contains n training samples, i ∈ [1, n];
S2. Construct a prediction model f(x), expressed through ⟨x, x_i⟩, the inner product of the input vector x and the vector x_i, and the model parameters u_i and v to be solved; the initial values of u_i and v are 0; u_i ∈ [0, C], where C is a preset constant;
S3. Enter the iterative solution process of steps S3.1 to S3.3; the initial value of the iteration variable j is 0, and the iteration ends when j ≥ J, where J is a preset constant;
S3.1. Let T = 0; T marks whether the model parameters u_i and v of the prediction model are adjusted during the current iteration round;
S3.2. For the training sample (x_k, y_k), perform the operations of S3.2.1 to S3.2.3, where k ∈ [1, n] and the initial value of k is 1;
S3.2.1. Input the x_k of the training sample (x_k, y_k) into the prediction model, which outputs the predicted value f(x_k); calculate the error E_k between the predicted value and the actual value y_k;
S3.2.2. Judge whether the error E_k reaches the preset precision; if so, execute step S3.2.3; otherwise perform the adjustment steps S3.2.2.1 to S3.2.2.4;
S3.2.2.1. Randomly select a training sample (x_l, y_l) from the training data set as the reference sample, with k ≠ l; calculate the error E_l by the method of step S3.2.1; record the current values of u_k, u_l, and v;
S3.2.2.2. Judge whether the model parameters u_k and v satisfy an overfitting condition:
(1) for y_k and y_l, the lower bound of the dynamic variation range of the parameter equals its upper bound;
(2) the cross inner product of x_l and x_k is greater than the mean of the sum of their respective self inner products;
(3) the change in the value of the reference sample's u_l is smaller than a preset constant;
When any one of conditions (1), (2), and (3) is satisfied, the model parameters u_k and v satisfy the overfitting condition; in that case let k = k + 1 and jump to step S3.2; otherwise execute step S3.2.2.3;
S3.2.2.3. When the current model parameters u_k and v do not satisfy the overfitting condition, take (x_k, y_k) and (x_l, y_l) as valid samples and update the model parameters u_k and v;
S3.2.2.4. Update T = T + 1;
S3.2.3. Judge whether k is greater than or equal to n; if so, execute step S3.3; otherwise let k = k + 1 and return to step S3.2;
S3.3. After the current round of iteration ends, judge whether T equals 0; if so, let j = j + 1 and return to step S3.1; otherwise keep the iteration variable j at its original value and return to step S3.1.
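To make the control flow of steps S3 to S3.3 easier to follow, a minimal Python sketch of the training loop is given below. It is only a sketch under assumptions: the prediction model is written in the common kernel-expansion form f(x) = Σ_i u_i·y_i·⟨x, x_i⟩ + v (the patent defines the model through its own formulas), the error is taken as the difference between the predicted and actual values, and the precision test, the overfitting test, and the parameter update are left as placeholder helpers; all helper names are illustrative.

```python
import random
import numpy as np

def predict(x, X, y, u, v):
    # Assumed kernel-expansion form of the prediction model (an illustrative
    # assumption, not the patent's literal formula):
    # f(x) = sum_i u_i * y_i * <x, x_i> + v
    return float(np.sum(u * y * (X @ x)) + v)

def train(X, y, reaches_precision, is_overfitting, update_parameters, J=10, C=1.0):
    """Skeleton of steps S3 to S3.3. The three helper callables stand in for
    the precision test (S3.2.2), the overfitting test (S3.2.2.2) and the
    parameter update (S3.2.2.3), whose exact formulas are given in the patent."""
    n = len(y)
    u = np.zeros(n)      # parameters u_i, initial value 0, u_i in [0, C]
    v = 0.0              # parameter v, initial value 0
    j = 0                # iteration variable; training ends when j >= J
    while j < J:
        T = 0            # S3.1: counts parameter updates in the current round
        k = 0            # S3.2: index of the current training sample
        while k < n:
            E_k = predict(X[k], X, y, u, v) - y[k]               # S3.2.1: error
            if not reaches_precision(E_k, u[k], C):              # S3.2.2: adjust
                l = random.choice([i for i in range(n) if i != k])   # S3.2.2.1
                E_l = predict(X[l], X, y, u, v) - y[l]
                if is_overfitting(X, y, u, k, l, C):             # S3.2.2.2
                    k += 1                                       # skip this sample
                    continue
                u, v = update_parameters(X, y, u, v, k, l, E_k, E_l, C)  # S3.2.2.3
                T += 1                                           # S3.2.2.4
            k += 1                                               # S3.2.3
        if T == 0:                                               # S3.3
            j += 1
    return u, v
```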
Preferably, in step S3.2.2, the error E_k is judged not to have reached the preset precision when either of the following two conditions is satisfied, and to have reached it otherwise:
(1) [first formula inequality] and [second formula inequality];
(2) [third formula inequality] and [fourth formula inequality];
where the threshold constant appearing in these inequalities is a preset constant.
Preferably, in step S3.2.2.2, the lower bound of the dynamic variation range of the parameter is expressed as [formula using max()], and the upper bound of the dynamic variation range of the parameter is expressed as [formula using min()], where max() denotes taking the larger value and min() denotes taking the smaller value.
Preferably, in step S3.2.2.2, the condition that the cross inner product of x_l and x_k is greater than the mean of the sum of their respective self inner products is specifically expressed as ⟨x_l, x_k⟩ > (⟨x_l, x_l⟩ + ⟨x_k, x_k⟩) / 2.
preferably, the step S3.2.2.3 updates the model parametersu k vThe concrete expression is as follows:
Figure 942563DEST_PATH_IMAGE040
Figure 183051DEST_PATH_IMAGE042
wherein
Figure 602531DEST_PATH_IMAGE044
As a parameter of the modelu k An updated value;
Figure 219457DEST_PATH_IMAGE046
as a parameter of the modelvThe updated value.
Preferably, in step S1, y_i is 1 or -1; when y_i is 1, the data access is a safe access; when y_i is -1, the data access is dangerous.
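As an illustration of step S1 and the label convention above, the sketch below shows how a device might turn one received data access into a training sample (x_i, y_i); the concrete feature fields (packet length, port, access frequency, payload entropy) are hypothetical examples, since the patent does not prescribe a specific feature set.

```python
import numpy as np

def build_training_sample(raw_packet: dict, was_harmful: bool):
    """Turn one received data access into a training sample (x_i, y_i).

    The feature fields below are illustrative assumptions; any numeric
    description of the raw data can serve as the feature vector x_i.
    """
    x_i = np.array([
        raw_packet.get("length", 0),               # packet length in bytes
        raw_packet.get("port", 0),                 # destination port
        raw_packet.get("requests_per_min", 0.0),   # access frequency
        raw_packet.get("payload_entropy", 0.0),    # payload randomness
    ], dtype=float)
    # Label convention from step S1: 1 = safe access, -1 = dangerous access.
    y_i = -1 if was_harmful else 1
    return x_i, y_i

# Example: the training data set is the list of (x_i, y_i) tuples the device collects.
dataset = [build_training_sample({"length": 128, "port": 80,
                                  "requests_per_min": 2.0,
                                  "payload_entropy": 3.1}, was_harmful=False)]
```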
Meanwhile, the invention also provides an attack prediction method for the smart home, and the specific scheme is as follows:
the method comprises the following steps:
each smart home device trains its own prediction model by applying the above attack prediction model training method;
after each smart home device has trained its prediction model, if a smart home device A receives a suspicious data packet, it broadcasts the packet to the smart home devices of the whole network;
each smart home device in the network feeds the broadcast suspicious packet into its own prediction model for attack prediction and returns the prediction result to device A;
device A computes a weighted average of the attack prediction results returned by the smart home devices of the whole network; when the proportion of results that judge the suspicious packet to be a dangerous access exceeds a set threshold, the suspicious packet is judged to be an attack, otherwise it is judged to be safe.
In addition, the invention also provides an attack prediction system for smart homes, which comprises a plurality of smart home devices; when the smart home devices perform attack prediction, they execute the above attack prediction method for smart homes.
Compared with the prior art, the invention has the beneficial effects that:
(1) The attack prediction model training method for smart homes provided by the invention uses, under the limited computing capacity of smart home devices, a fast-converging training algorithm designed for small samples, so that the model fits the training samples well and has high robustness; at the same time, the computation steps of the designed algorithm are kept as simple as possible, so it can run on low-cost chips.
(2) The attack prediction method for smart homes provided by the invention improves the accuracy with which the smart home system predicts potential attack behavior under small-sample conditions and offers good real-time performance, especially when the computing capability of the smart home devices is limited.
(3) The attack prediction method for smart homes provided by the invention adopts a linear classification model and an easily computed randomized training process, allowing the algorithm to converge quickly on the training set while retaining generalization ability; it therefore has high robustness, helps protect smart home devices from attacks by malicious nodes, and provides good technical support for the further popularization and application of smart homes.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a schematic flow chart of an attack prediction method for smart home.
Fig. 2 is a schematic structural diagram of an attack prediction system of a smart home.
Detailed Description
In order to make the objects, features and advantages of the present invention more obvious and understandable, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the embodiments described below are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
The embodiment provides an attack prediction model training method for smart home, which comprises the following steps:
S1. The smart home device constructs a training data set: the smart home device receives raw data from other devices and labels each piece of raw data as y_i according to the influence of that data access on the device; features are extracted from the raw data to obtain a feature vector x_i; the tuple (x_i, y_i) forms a training sample; the training data set constructed by the smart home device contains n training samples, i ∈ [1, n];
S2. Construct a prediction model f(x), expressed through ⟨x, x_i⟩, the inner product of the input vector x and the vector x_i, and the model parameters u_i and v to be solved; the initial values of u_i and v are 0; u_i ∈ [0, C], where C is a preset constant;
S3. Enter the iterative solution process of steps S3.1 to S3.3; the initial value of the iteration variable j is 0, and the iteration ends when j ≥ J, where J is a preset constant;
S3.1. Let T = 0; T marks whether the model parameters u_i and v of the prediction model are adjusted during the current iteration round;
S3.2. For the training sample (x_k, y_k), perform the operations of S3.2.1 to S3.2.3, where k ∈ [1, n] and the initial value of k is 1;
S3.2.1. Input the x_k of the training sample (x_k, y_k) into the prediction model, which outputs the predicted value f(x_k); calculate the error E_k between the predicted value and the actual value y_k;
S3.2.2. Judge whether the error E_k reaches the preset precision; if so, execute step S3.2.3; otherwise perform the adjustment steps S3.2.2.1 to S3.2.2.4;
S3.2.2.1. Randomly select a training sample (x_l, y_l) from the training data set as the reference sample, with k ≠ l; calculate the error E_l by the method of step S3.2.1; record the current values of u_k, u_l, and v;
S3.2.2.2. Judge whether the model parameters u_k and v satisfy an overfitting condition:
(1) for y_k and y_l, the lower bound of the dynamic variation range of the parameter equals its upper bound;
(2) the cross inner product of x_l and x_k is greater than the mean of the sum of their respective self inner products;
(3) the change in the value of the reference sample's u_l is smaller than a preset constant;
When any one of conditions (1), (2), and (3) is satisfied, the model parameters u_k and v satisfy the overfitting condition; in that case let k = k + 1 and jump to step S3.2; otherwise execute step S3.2.2.3;
S3.2.2.3. When the current model parameters u_k and v do not satisfy the overfitting condition, take (x_k, y_k) and (x_l, y_l) as valid samples and update the model parameters u_k and v;
S3.2.2.4. Update T = T + 1;
S3.2.3. Judge whether k is greater than or equal to n; if so, execute step S3.3; otherwise let k = k + 1 and return to step S3.2;
S3.3. After the current round of iteration ends, judge whether T equals 0; if so, let j = j + 1 and return to step S3.1; otherwise keep the iteration variable j at its original value and return to step S3.1.
In a specific implementation, in step S3.2.2, the error E_k is judged not to have reached the preset precision when either of the following two conditions is satisfied, and to have reached it otherwise:
(1) [first formula inequality] and [second formula inequality];
(2) [third formula inequality] and [fourth formula inequality];
where the threshold constant appearing in these inequalities is a preset constant.
In a specific implementation process, in step S3.2.2.2, the lower bound of the dynamic variation range of the parameter is expressed as [formula using max()], and the upper bound of the dynamic variation range of the parameter is expressed as [formula using min()], where max() denotes taking the larger value and min() denotes taking the smaller value.
In a specific implementation, in step S3.2.2.2, the condition that the cross inner product of x_l and x_k is greater than the mean of the sum of their respective self inner products is specifically expressed as ⟨x_l, x_k⟩ > (⟨x_l, x_l⟩ + ⟨x_k, x_k⟩) / 2.
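Because this inequality involves only the two feature vectors, it can be checked directly; a small helper is sketched below (the function name is illustrative).

```python
import numpy as np

def cross_inner_product_too_large(x_l: np.ndarray, x_k: np.ndarray) -> bool:
    """Overfitting condition (2): the cross inner product of x_l and x_k
    exceeds the mean of the sum of their self inner products."""
    return float(x_l @ x_k) > (float(x_l @ x_l) + float(x_k @ x_k)) / 2.0
```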
In a specific implementation process, step S3.2.2.3 updates the model parameters u_k and v as follows: [update formula for u_k], giving the updated value of the model parameter u_k, and [update formula for v], giving the updated value of the model parameter v.
In a specific implementation, in step S1, y_i is 1 or -1; when y_i is 1, the data access is a safe access; when y_i is -1, the data access is dangerous.
Example 2
This embodiment provides an attack prediction method for smart homes, which trains prediction models by applying the attack prediction model training method described in Embodiment 1; a flowchart of the method is shown in Fig. 1. The scheme of the attack prediction method for smart homes specifically comprises the following steps:
preparing respective training data sets and training models for each intelligent household device
(II) Perform iterative training on the model; the initial value of the iteration variable j is 0, and the iteration ends when j ≥ J.
The iteration body is as follows:
1. Set T = 0, which marks whether any parameter is adjusted during the current iteration round;
2. Perform the following loop on each sample in turn:
2.1. calculate the error between the sample's predicted value and its actual value;
2.2. judge whether the error reaches the precision required by the model;
2.3. randomly select a reference sample and check whether the parameters satisfy the overfitting condition;
2.4. when the overfitting condition is not satisfied, update each parameter of the model and update T = T + 1;
3. If any parameter was updated during this iteration round, i.e. T ≠ 0, the model is not yet stable and the iteration variable j keeps its original value; otherwise j = j + 1.
(III) After each smart home device has trained its prediction model, when one device receives a suspicious data packet it broadcasts the packet over the whole network; each device feeds the packet into its local prediction model and returns the prediction result to the requesting device; the requesting device computes a weighted average of the prediction results, and when the proportion of results predicting danger exceeds a preset threshold it judges the behavior to be an attack, otherwise a safe behavior.
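For completeness, the local prediction each device performs on the broadcast packet can be sketched as follows; it again assumes the kernel-expansion form f(x) = Σ_i u_i·y_i·⟨x, x_i⟩ + v used in the training sketch of Embodiment 1 and reads the sign of f(x) as safe (+1) or dangerous (-1), which is an assumption rather than the patent's literal formula.

```python
import numpy as np

def local_predict(x, X, y, u, v):
    """Evaluate the locally trained model on a broadcast packet's feature
    vector x. X and y are the device's training vectors and labels, u and v
    the trained parameters (kernel-expansion form assumed)."""
    score = float(np.sum(u * y * (X @ x)) + v)
    return 1 if score >= 0 else -1   # +1 = safe access, -1 = dangerous access
```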
Example 3
This embodiment provides an attack prediction system for smart homes, as shown in Fig. 2, which comprises a plurality of smart home devices; when the smart home devices perform attack prediction, they execute the attack prediction method for smart homes described in Embodiment 2.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (8)

1. An attack prediction model training method for smart homes, characterized by comprising the following steps:
S1. The smart home device constructs a training data set: the smart home device receives raw data from other devices and labels each piece of raw data as y_i according to the influence of that data access on the device; features are extracted from the raw data to obtain a feature vector x_i; the tuple (x_i, y_i) forms a training sample; the training data set constructed by the smart home device contains n training samples, i ∈ [1, n];
S2. Construct a prediction model f(x), expressed through ⟨x, x_i⟩, the inner product of the input vector x and the vector x_i, and the model parameters u_i and v to be solved; the initial values of u_i and v are 0; u_i ∈ [0, C], where C is a preset constant;
S3. Enter the iterative solution process of steps S3.1 to S3.3; the initial value of the iteration variable j is 0, and the iteration ends when j ≥ J, where J is a preset constant;
S3.1. Let T = 0; T marks whether the model parameters u_i and v of the prediction model are adjusted during the current iteration round;
S3.2. For the training sample (x_k, y_k), perform the operations of S3.2.1 to S3.2.3, where k ∈ [1, n] and the initial value of k is 1;
S3.2.1. Input the x_k of the training sample (x_k, y_k) into the prediction model, which outputs the predicted value f(x_k); calculate the error E_k between the predicted value and the actual value y_k;
S3.2.2. Judge whether the error E_k reaches the preset precision; if so, execute step S3.2.3; otherwise perform the adjustment steps S3.2.2.1 to S3.2.2.4;
S3.2.2.1. Randomly select a training sample (x_l, y_l) from the training data set as the reference sample, with k ≠ l; calculate the error E_l by the method of step S3.2.1; record the current values of u_k, u_l, and v;
S3.2.2.2. Judge whether the model parameters u_k and v satisfy an overfitting condition:
(1) for y_k and y_l, the lower bound of the dynamic variation range of the parameter equals its upper bound;
(2) the cross inner product of x_l and x_k is greater than the mean of the sum of their respective self inner products;
(3) the change in the value of the reference sample's u_l is smaller than a preset constant;
When any one of conditions (1), (2), and (3) is satisfied, the model parameters u_k and v satisfy the overfitting condition; in that case let k = k + 1 and jump to step S3.2; otherwise execute step S3.2.2.3;
S3.2.2.3. When the current model parameters u_k and v do not satisfy the overfitting condition, take (x_k, y_k) and (x_l, y_l) as valid samples and update the model parameters u_k and v;
S3.2.2.4. Update T = T + 1;
S3.2.3. Judge whether k is greater than or equal to n; if so, execute step S3.3; otherwise let k = k + 1 and return to step S3.2;
S3.3. After the current round of iteration ends, judge whether T equals 0; if so, let j = j + 1 and return to step S3.1; otherwise keep the iteration variable j at its original value and return to step S3.1.
2. The attack prediction model training method for smart homes according to claim 1, characterized in that: in step S3.2.2, the error E_k is judged not to have reached the preset precision when either of the following two conditions is satisfied, and to have reached it otherwise:
(1) [first formula inequality] and [second formula inequality];
(2) [third formula inequality] and [fourth formula inequality];
where the threshold constant appearing in these inequalities is a preset constant.
3. The attack prediction model training method for smart homes according to claim 1, characterized in that: in step S3.2.2.2, the lower bound of the dynamic variation range of the parameter is expressed as [formula using max()], and the upper bound of the dynamic variation range of the parameter is expressed as [formula using min()], where max() denotes taking the larger value and min() denotes taking the smaller value.
4. The attack prediction model training method for smart homes according to claim 1, characterized in that: in step S3.2.2.2, the condition that the cross inner product of x_l and x_k is greater than the mean of the sum of their respective self inner products is specifically expressed as ⟨x_l, x_k⟩ > (⟨x_l, x_l⟩ + ⟨x_k, x_k⟩) / 2.
5. The attack prediction model training method for smart homes according to claim 4, characterized in that: step S3.2.2.3 updates the model parameters u_k and v as follows: [update formula for u_k], giving the updated value of the model parameter u_k, and [update formula for v], giving the updated value of the model parameter v.
6. The attack prediction model training method for smart homes according to any one of claims 1 to 5, characterized in that: in step S1, y_i is 1 or -1; when y_i is 1, the data access is a safe access; when y_i is -1, the data access is dangerous.
7. An attack prediction method for smart homes, characterized by comprising the following steps:
each smart home device trains its own prediction model by applying the attack prediction model training method of any one of claims 1 to 6;
after each smart home device has trained its prediction model, if a smart home device A receives a suspicious data packet, it broadcasts the packet to the smart home devices of the whole network;
each smart home device in the network feeds the broadcast suspicious packet into its own prediction model for attack prediction and returns the prediction result to device A;
device A computes a weighted average of the attack prediction results returned by the smart home devices of the whole network; when the proportion of results that judge the suspicious packet to be a dangerous access exceeds a set threshold, the suspicious packet is judged to be an attack, otherwise it is judged to be safe.
8. An attack prediction system for smart homes, characterized in that it comprises a plurality of smart home devices; when the smart home devices perform attack prediction, they execute the attack prediction method for smart homes according to claim 7.
CN202211282236.XA 2022-10-19 2022-10-19 Attack prediction model training method, attack prediction method and system for smart home Active CN115348115B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211282236.XA CN115348115B (en) 2022-10-19 2022-10-19 Attack prediction model training method, attack prediction method and system for smart home

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211282236.XA CN115348115B (en) 2022-10-19 2022-10-19 Attack prediction model training method, attack prediction method and system for smart home

Publications (2)

Publication Number Publication Date
CN115348115A CN115348115A (en) 2022-11-15
CN115348115B true CN115348115B (en) 2022-12-20

Family

ID=83957553

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211282236.XA Active CN115348115B (en) 2022-10-19 2022-10-19 Attack prediction model training method, attack prediction method and system for smart home

Country Status (1)

Country Link
CN (1) CN115348115B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113489674A (en) * 2021-05-25 2021-10-08 南京邮电大学 Malicious traffic intelligent detection method and application for Internet of things system
CN114221790A (en) * 2021-11-22 2022-03-22 浙江工业大学 BGP (Border gateway protocol) anomaly detection method and system based on graph attention network
CN114679310A (en) * 2022-03-22 2022-06-28 安徽赛福贝特信息技术有限公司 Network information security detection method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111340214B (en) * 2020-02-21 2021-06-08 腾讯科技(深圳)有限公司 Method and device for training anti-attack model

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113489674A (en) * 2021-05-25 2021-10-08 南京邮电大学 Malicious traffic intelligent detection method and application for Internet of things system
CN114221790A (en) * 2021-11-22 2022-03-22 浙江工业大学 BGP (Border gateway protocol) anomaly detection method and system based on graph attention network
CN114679310A (en) * 2022-03-22 2022-06-28 安徽赛福贝特信息技术有限公司 Network information security detection method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research and Design of a Multi-Agent Based Smart Home System; Li Xiaohua et al.; Computer Engineering; 2006-11-05; Vol. 32, No. 21; pp. 233-234 *
An IoT Security Model Based on Level Division; Sun Zhixin et al.; Computer Engineering; 2011-05-20; Vol. 37, No. 10; pp. 1-7 *

Also Published As

Publication number Publication date
CN115348115A (en) 2022-11-15

Similar Documents

Publication Publication Date Title
CN112181666B (en) Equipment assessment and federal learning importance aggregation method based on edge intelligence
WO2021155713A1 (en) Weight grafting model fusion-based facial recognition method, and related device
CN113485144B (en) Intelligent home control method and system based on Internet of things
CN112491818B (en) Power grid transmission line defense method based on multi-agent deep reinforcement learning
CN114491525B (en) Android malicious software detection feature extraction method based on deep reinforcement learning
CN113839926B (en) Method, system and device for modeling intrusion detection system based on characteristic selection of wolf algorithm
CN114077913A (en) Method and system for multi-step prediction of future wind speed based on automatic reservoir neural network
CN113628059A (en) Associated user identification method and device based on multilayer graph attention network
WO2023168838A1 (en) Sentence text recognition method and apparatus, and storage medium and electronic apparatus
CN112272074B (en) Information transmission rate control method and system based on neural network
CN110166344A (en) A kind of identity recognition methods, device and relevant device
CN114329109A (en) Multimodal retrieval method and system based on weakly supervised Hash learning
Shen et al. Active perception in adversarial scenarios using maximum entropy deep reinforcement learning
Shan et al. NeuPot: A neural network-based honeypot for detecting cyber threats in industrial control systems
CN115062709A (en) Model optimization method, device, equipment, storage medium and program product
CN115348115B (en) Attack prediction model training method, attack prediction method and system for smart home
CN117932455A (en) Internet of things asset identification method and system based on neural network
CN109242294A (en) Improve the power communication performance method for early warning and device of fuzzy neural network
WO2024001196A1 (en) Household appliance control method and apparatus, storage medium, and electronic apparatus
CN116595528A (en) Method and device for poisoning attack on personalized recommendation system
CN110071845B (en) Method and device for classifying unknown applications
CN115510978A (en) Industrial control system intrusion detection method and device and electronic equipment
CN111212423B (en) Credible cooperative interference node selection method based on hidden Markov model
CN115019359A (en) Cloud user identity recognition task allocation and parallel processing method
Cui et al. Prototype generation based shift graph convolutional network for semi-supervised anomaly detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant