CN116112194A - User behavior analysis method and device, electronic equipment and computer storage medium


Info

Publication number: CN116112194A
Application number: CN202211302471.9A
Authority: CN (China)
Legal status: Pending
Other languages: Chinese (zh)
Prior art keywords: data, analyzed, feature vector, algorithm, anomaly
Inventors: 王亮, 朱豪杰
Current assignee: Chengdu Westone Information Industry Inc
Original assignee: Chengdu Westone Information Industry Inc
Application filed by Chengdu Westone Information Industry Inc

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00 Network architectures or network communication protocols for network security
    • H04L63/14 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic
    • H04L63/1408 Network architectures or network communication protocols for network security for detecting or protecting against malicious traffic by monitoring network traffic
    • H04L63/1425 Traffic logging, e.g. anomaly detection
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], i.e. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Computer Hardware Design (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Testing And Monitoring For Control Systems (AREA)

Abstract

The disclosure provides a user behavior analysis method and apparatus, an electronic device, and a computer storage medium. Data to be analyzed of a target user in a target intranet is acquired; the data to be analyzed is converted into corresponding feature vectors; anomaly detection is performed on the feature vectors based on a pre-trained anomaly detection model to obtain abnormal data in the data to be analyzed, and/or the feature vectors are detected based on a machine learning algorithm to obtain the abnormal data; and a behavior portrait of the target user is generated based on the abnormal data. Because the data to be analyzed of the target user in the target intranet is converted into feature vectors that can be processed by the pre-trained anomaly detection model and/or the machine learning algorithm, the abnormal data in the data to be analyzed can be detected accurately, and the user behavior in the intranet can therefore be analyzed accurately.

Description

User behavior analysis method and device, electronic equipment and computer storage medium
Technical Field
The present disclosure relates to the field of network security technologies, and more particularly, to a user behavior analysis method, apparatus, electronic device, and computer storage medium.
Background
Currently, to facilitate communication among multiple users, the users may be placed in an intranet (local area network) so that they can communicate with one another through the intranet. However, because the computers in the intranet can access and communicate with one another and can share resources, the security of the intranet is affected by its users; for example, if a user maliciously propagates or tampers with an encrypted file, the security of the intranet is reduced.
To ensure the security of the intranet, user behavior can be detected based on a feature knowledge base and rule-matching identification in order to judge whether the intranet is secure. However, such detection is a black-or-white technique that depends heavily on domain technology and knowledge and requires the support of domain experts or specialized expertise. Expert knowledge can only build a rule base for malicious characteristics that have occurred in the past or that can be predicted in advance, whereas user behavior is highly random and flexible and cannot be fully anticipated by experts. Detection based on a feature knowledge base and rule matching therefore cannot detect many potentially non-compliant behaviors triggered by users in the intranet, missed detections easily occur, and the security of the intranet is difficult to guarantee.
In summary, how to accurately analyze the user behavior in the intranet is a problem to be solved by those skilled in the art.
Disclosure of Invention
The disclosure aims to provide a user behavior analysis method that can, to a certain extent, solve the technical problem of how to accurately analyze user behavior in an intranet. The disclosure also provides a user behavior analysis apparatus, an electronic device, and a computer-readable storage medium.
According to a first aspect of an embodiment of the present disclosure, there is provided a user behavior analysis method, including:
acquiring data to be analyzed of a target user in a target intranet;
converting the data to be analyzed into corresponding feature vectors;
performing anomaly detection on the feature vector based on a pre-trained anomaly detection model to obtain anomaly data in the data to be analyzed, and/or detecting the feature vector based on a machine learning algorithm to obtain the anomaly data;
and generating a behavior portrait of the target user based on the abnormal data.
Preferably, the converting the data to be analyzed into the corresponding feature vector includes:
performing normalization processing on the data to be analyzed to obtain a processing result;
converting the processing result into the corresponding feature vector;
wherein the normalization processing comprises information completion, unified attribute mapping, data verification and merging, association integration, filtering, aggregation, and abnormal behavior identification.
Preferably, the converting the processing result into the corresponding feature vector includes:
and performing feature processing and conversion on the processing result based on a one-hot encoding feature processing operator, and/or an N-Gram model, and/or a TF-IDF model, and/or a preset feature operator, and/or a preset algorithm, to obtain the corresponding feature vector, wherein the preset feature operator comprises a count, a variance, and a mean value.
Preferably, the performing feature processing and conversion on the processing result based on the one-hot encoding feature processing operator, and/or the N-Gram model, and/or the TF-IDF model, and/or the preset feature operator, and/or the preset algorithm to obtain the corresponding feature vector includes:
performing feature processing and conversion on enumeration-type data in the data to be analyzed based on the one-hot encoding feature processing operator to obtain the corresponding feature vector;
and/or carrying out feature processing and conversion on the data to be aggregated in the data to be analyzed based on the preset feature operator to obtain the corresponding feature vector;
and/or carrying out feature processing and conversion on text character strings in the data to be analyzed based on the N-Gram model and the TF-IDF model to obtain the corresponding feature vectors;
and/or carrying out feature processing and conversion on the processing result based on the preset algorithm to obtain the corresponding feature vector.
Preferably, the anomaly detection for the feature vector based on the pre-trained anomaly detection model includes:
and carrying out anomaly detection on the feature vector based on the anomaly detection model trained in advance, wherein the anomaly detection model comprises a detection model generated based on a time sequence algorithm and/or a classification algorithm and/or a statistical analysis algorithm.
Preferably, the abnormality detection model includes:
a first detection model which is constructed based on the time sequence algorithm and the statistical analysis algorithm and is used for carrying out anomaly detection on the single-dimensional time sequence data in the data to be analyzed;
and a second detection model which is built based on the classification algorithm and is used for carrying out anomaly detection on the multidimensional labeled data in the data to be analyzed, wherein the classification algorithm comprises SVM and xgboost.
Preferably, the detecting the feature vector based on the machine learning algorithm includes:
and detecting, based on the machine learning algorithm, the feature vector corresponding to multidimensional unlabeled data in the data to be analyzed, wherein the machine learning algorithm comprises a clustering algorithm and a graph algorithm.
Preferably, the type of the data to be analyzed includes user behavior information, secret-related information circulation data, network flow data, the user's own information, and intranet information.
Preferably, the generating the behavioral portraits of the target users based on the abnormal data includes:
determining an abnormal behavior category and an abnormal behavior event of the target user based on the abnormal data;
the behavioral profile is generated based on the abnormal behavior category and the abnormal behavior event.
Preferably, the abnormal behavior category comprises one or more of a login anomaly, an override (unauthorized privilege) violation, an operation violation, a login violation, an execution anomaly, and an unauthorized data export; the abnormal behavior event comprises one or more of: printing downgraded data in violation of policy, unauthorized use of peripherals, login from an unusual location, unauthorized handling of confidential files, unauthorized host login, unauthorized database login, unauthorized access to confidential files, login from a shared device, frequent account deletion, frequent file changes, unauthorized process control, unauthorized modification of registration information, and login outside working hours.
According to a second aspect of the embodiments of the present disclosure, there is provided a user behavior analysis apparatus including:
the first acquisition module is used for acquiring data to be analyzed of a target user in a target intranet;
the first conversion module is used for converting the data to be analyzed into corresponding feature vectors;
the first detection module is used for carrying out anomaly detection on the feature vector based on a pre-trained anomaly detection model to obtain anomaly data in the data to be analyzed, and/or carrying out detection on the feature vector based on a machine learning algorithm to obtain the anomaly data;
and the first generation module is used for generating the behavior portraits of the target users based on the abnormal data.
According to a third aspect of embodiments of the present disclosure, there is provided an electronic device, comprising:
a memory for storing a computer program;
a processor for executing the computer program in the memory to implement the steps of any of the methods as described above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any of the methods described above.
The user behavior analysis method provided by the disclosure acquires data to be analyzed of a target user in a target intranet; converts the data to be analyzed into corresponding feature vectors; performs anomaly detection on the feature vectors based on a pre-trained anomaly detection model to obtain abnormal data in the data to be analyzed, and/or detects the feature vectors based on a machine learning algorithm to obtain the abnormal data; and generates a behavior portrait of the target user based on the abnormal data. Because the data to be analyzed of the target user in the target intranet is converted into corresponding feature vectors, subsequent processing of the data is made easier; anomaly detection can then be performed on the feature vectors based on the pre-trained anomaly detection model, and/or the feature vectors can be detected based on the machine learning algorithm, so that the abnormal data in the data to be analyzed is detected accurately. A behavior portrait generated from this abnormal data can therefore accurately reflect the abnormal behavior of the target user, so that the user behavior in the intranet is analyzed accurately. The user behavior analysis apparatus, electronic device, and computer-readable storage medium provided by the disclosure solve the corresponding technical problems.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings required by the embodiments or by the description of the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present disclosure, and that other drawings may be obtained from them by those of ordinary skill in the art without inventive effort.
FIG. 1 is a first flow chart illustrating a method of user behavior analysis according to an exemplary embodiment;
FIG. 2 is a second flowchart illustrating a user behavior analysis method according to an exemplary embodiment;
FIG. 3 is a third flowchart illustrating a user behavior analysis method according to an exemplary embodiment;
FIG. 4 is a schematic illustration of a visual interface;
FIG. 5 is a schematic diagram of a multitasking parallel computing framework;
FIG. 6 is a fourth flowchart illustrating a method of user behavior analysis, according to an example embodiment;
FIG. 7 is a schematic diagram of a user behavior analysis device according to an exemplary embodiment;
Fig. 8 is a block diagram of an electronic device 900, according to an example embodiment.
Detailed Description
The following description of the technical solutions in the embodiments of the present disclosure will be made clearly and completely with reference to the accompanying drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, not all embodiments. Based on the embodiments in this disclosure, all other embodiments that a person of ordinary skill in the art would obtain without making any inventive effort are within the scope of protection of this disclosure.
Currently, to facilitate communication among multiple users, the users may be placed in an intranet (local area network) so that they can communicate through the intranet. However, because the computers in the intranet can access and communicate with one another and can share resources, the security of the intranet is affected by its users. In recent years data-leakage incidents have occurred frequently, and investigation reports by authorities show that more than 70% of information-leakage incidents come from "legitimate" users inside the intranet, that is, users who hold legitimate rights in the intranet environment but seriously threaten its security through unintentional or malicious actions. Because their operations are legitimate, such users can completely bypass the security protection mechanism, break through the network boundary, and steal sensitive data.
The reason is that the intranet has the following problems: the intranet is not a completely isolated network, and its computers can access and communicate with one another, for example exchanging information and circulating data through tools such as e-mail and instant messaging; some computers support sharing, so data resources can be shared and transmitted through devices such as printers and burners; human error can cause policy misconfiguration, for example a security administrator assigning high privileges to a low-privilege account; and the intranet system itself has technical or administrative vulnerabilities. On this basis, if intranet users have weak security and confidentiality awareness, they can easily trigger actions such as sharing confidential information with personnel outside the authorized scope, transferring high-classification information to a low-classification domain through e-mail or instant messaging tools, attempting to tamper with the classification label of a confidential file, attempting to tamper with the security level of a confidential document, attempting to collect confidential information through low-frequency, long-term operations, or neglecting to install system patches. All of these actions threaten the security of the intranet.
To ensure the security of the intranet, user behavior can be detected based on a feature knowledge base and rule-matching identification in order to judge whether the intranet is secure. However, such detection is a black-or-white technique that depends heavily on domain technology and knowledge and requires the support of domain experts or specialized expertise; expert knowledge can only build a rule base for malicious characteristics that have occurred in the past or that can be predicted in advance, whereas user behavior is highly random and flexible and cannot be fully anticipated by experts. Detection based on a feature knowledge base and rule matching therefore cannot detect many potentially non-compliant behaviors triggered by users in the intranet, missed detections easily occur, and the security of the intranet is difficult to guarantee. The user behavior analysis method provided by the present disclosure can accurately analyze user behavior in the intranet.
Referring to fig. 1, fig. 1 is a first flowchart illustrating a user behavior analysis method according to an exemplary embodiment.
The user behavior analysis method related to the disclosure may include the following steps:
Step S101: and acquiring data to be analyzed of the target user in the target intranet.
It can be understood that the type of the data to be analyzed can be determined according to the application scenario. For example, the data to be analyzed may include user behavior information, secret-related information circulation data, network flow data, the user's own information, intranet information, and the like. The user behavior information refers to information describing the user's behavior in the intranet, such as the user's file-sending behavior and file-deleting behavior; the secret-related information circulation data refers to circulation data of information related to confidentiality, such as transmission data of an encrypted file in the intranet; the network flow data refers to traffic data generated by user behavior in the intranet; and the user's own information and the intranet information may include personnel attribute data, confidential-information attribute data, organization information, and the like. Further, after the data to be analyzed of the target user in the target intranet is acquired, it can be stored by category so that the corresponding data can be processed quickly later; for example, the user's own information, secret-related information circulation data, and intranet information can be stored in a relational database, while the user behavior information, network flow data, and intranet information can be stored in an Elasticsearch cluster or a queue.
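As a rough illustration of storing the collected data by category, the following Python sketch (not part of the disclosure) routes records either to a relational database or to an Elasticsearch cluster/queue; the category labels and backend names are assumptions introduced for the example.

```python
# Minimal sketch: route a collected record to a storage backend by data category.
RELATIONAL_TYPES = {"user_info", "classified_info_flow", "intranet_info"}
SEARCH_TYPES = {"user_behavior", "network_flow"}

def route_record(record: dict) -> str:
    """Return the storage target for one collected record based on its category."""
    category = record.get("category", "")
    if category in RELATIONAL_TYPES:
        return "relational_db"            # e.g. slowly changing attribute data
    if category in SEARCH_TYPES:
        return "elasticsearch_or_queue"   # high-volume event/flow data
    return "elasticsearch_or_queue"       # default for uncategorised events

print(route_record({"category": "user_behavior", "user": "u1"}))
```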
It can be appreciated that the process of acquiring the data to be analyzed of the target user in the target intranet may also be determined according to the application scenario; for example, the data may be acquired through syslog (system log), JDBC (Java Database Connectivity), or REST (Representational State Transfer) interfaces, and the like. In addition, the acquired data to be analyzed may be in the form of logs or the like, and the disclosure is not specifically limited herein.
It should be noted that, in an application scenario, the data to be analyzed of the target user may be recorded from the perspective of fields such as beat.hostname, beat.name, beat.version, event category, event generation time, event level, event name, event responsibility department, event responsibility department ID, event responsibility person ID, event responsibility person name, event responsibility person security level, event tag, event type, host IP, log type, operation result, key of the event-reporting device asset, event-reporting device IP, syslog reporting module, timestamp, and user name, so that the user can accurately learn the corresponding information of the data to be analyzed in detail. For example, beat.hostname may be K-PC, beat.version may be 7.3.2, the event category may be audit log, the event generation time may be 2021/3/10T16:00:00.000+0800, the event level may be information, the event name may be file download audit, the event responsibility department may be the security application product line, the event responsibility department ID may be 01001001003, the event responsibility person ID may be e6e6ad937b5a464cb9dfc591d36b1001b, the event responsibility person name may be Huang XX4565, the event responsibility person security level may be important secret, the event tags may be behavior-related and file circulation, the event type may be file operation audit, the host IP may be 192.168.100.231, the log type may be download file output, the operation result may be operation success, the key of the event-reporting device asset may be collection agent/Windows/V1.0, the event-reporting device IP may be 10.131.110.10, the syslog reporting module may be AUD, the timestamp may be 2021/11/26 13:14:11, and the user name may be Huang XX4565. The disclosure is not specifically limited herein.
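For illustration only, such a record can be represented as a structured event like the one below; the field names are an assumed normalization of the fields listed in the description and are not prescribed by the disclosure.

```python
# Illustrative only: one file-download audit event shaped like the record described above.
audit_event = {
    "beat.hostname": "K-PC",
    "beat.version": "7.3.2",
    "event.category": "audit_log",
    "event.name": "file download audit",
    "event.time": "2021-03-10T16:00:00.000+0800",
    "event.level": "information",
    "responsible.department": "security application product line",
    "responsible.department_id": "01001001003",
    "responsible.person": "Huang XX4565",
    "responsible.security_level": "important secret",
    "host.ip": "192.168.100.231",
    "log.type": "download file output",
    "operation.result": "success",
    "reporting_device.ip": "10.131.110.10",
    "user.name": "Huang XX4565",
}
print(audit_event["event.name"], audit_event["operation.result"])
```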
Step S102: and converting the data to be analyzed into corresponding feature vectors.
It can be understood that after the data to be analyzed of the target user in the target intranet is acquired, the data can be converted into corresponding feature vectors so that the feature vectors, and thus the data to be analyzed, can later be processed by means of machine learning algorithms.
Step S103: and carrying out anomaly detection on the feature vector based on a pre-trained anomaly detection model to obtain anomaly data in the data to be analyzed, and/or detecting the feature vector based on a machine learning algorithm to obtain the anomaly data.
Step S104: a behavioral portrayal of the target user is generated based on the anomaly data.
It can be understood that after the data to be analyzed is converted into the corresponding feature vectors, anomaly detection can be performed on the feature vectors based on a pre-trained anomaly detection model to obtain abnormal data in the data to be analyzed, and/or the feature vectors can be detected based on a machine learning algorithm to obtain the abnormal data; further, a behavior portrait of the target user can be generated based on the abnormal data.
In the disclosure, the frequency with which anomaly detection is performed on the feature vectors based on the pre-trained anomaly detection model, and/or with which the feature vectors are detected based on the machine learning algorithm, to obtain the abnormal data can be determined according to the application scenario. For example, the detection may be performed at a fixed time every day, or at an interval of several days, for example every 3 days; the disclosure is not specifically limited herein.
The user behavior analysis method provided by the disclosure acquires data to be analyzed of a target user in a target intranet; converts the data to be analyzed into corresponding feature vectors; performs anomaly detection on the feature vectors based on a pre-trained anomaly detection model to obtain abnormal data in the data to be analyzed, and/or detects the feature vectors based on a machine learning algorithm to obtain the abnormal data; and generates a behavior portrait of the target user based on the abnormal data. Because the data to be analyzed is converted into corresponding feature vectors, subsequent processing is made easier; the abnormal data in the data to be analyzed can then be detected accurately by the pre-trained anomaly detection model and/or the machine learning algorithm, and the behavior portrait generated from this abnormal data can accurately reflect the abnormal behavior of the target user, so that the user behavior in the intranet is analyzed.
Referring to fig. 2, fig. 2 is a second flowchart illustrating a user behavior analysis method according to an exemplary embodiment.
The user behavior analysis method related to the disclosure may include the following steps:
step S201: and acquiring data to be analyzed of the target user in the target intranet.
Step S202: carrying out normative processing on data to be analyzed to obtain a processing result; the normalization processing comprises information completion, attribute unified mapping, data verification merging, association integration, filtering, aggregation and abnormal behavior identification.
Step S203: and converting the processing result into a corresponding feature vector.
It can be understood that, in order to process the data to be analyzed accurately and quickly, in the process of converting the data to be analyzed into corresponding feature vectors, normalization processing can first be performed on the data to be analyzed to obtain a processing result, and the processing result can then be converted into the corresponding feature vectors. The type of normalization processing can be determined according to the application scenario; for example, it may include information completion, unified attribute mapping, data verification and merging, association integration, filtering, aggregation, abnormal behavior identification, and the like. It should be noted that information completion may include completing organization information; association integration may include associating and integrating relatively isolated items in the data to be analyzed; filtering and aggregation may be used to remove spurious data from the data to be analyzed; and abnormal behavior identification may include identifying sensitive operations, abnormal use, illegal access, abnormal privilege escalation, and the like. The disclosure is not specifically limited herein.
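A minimal sketch of a few of these normalization steps (information completion, unified attribute mapping, filtering) is given below; it is illustrative only, and the field names and the organization directory are assumptions introduced for the example.

```python
# Sketch of normalisation: complete missing organisation info, map raw event names
# onto one naming scheme, and filter out records that fail basic validation.
from typing import Iterable

def normalise(records: Iterable[dict], org_directory: dict) -> list[dict]:
    out = []
    for rec in records:
        rec = dict(rec)
        # information completion: fill organisation info from a directory (assumed source)
        rec.setdefault("department", org_directory.get(rec.get("user"), "unknown"))
        # unified attribute mapping: map raw event names onto one naming scheme
        rec["event_type"] = rec.get("event_type", rec.get("raw_event", "other")).lower()
        # filtering: drop records missing mandatory fields
        if not rec.get("user") or not rec.get("event_time"):
            continue
        out.append(rec)
    return out

sample = [{"user": "u1", "event_time": "2022-05-04T10:00:00", "raw_event": "FileDownload"}]
print(normalise(sample, {"u1": "R&D"}))
```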
It may be appreciated that the process of converting the processing result into the corresponding feature vector may be determined according to the application scenario. For example, feature processing and conversion may be performed on the processing result based on a one-hot encoding feature processing operator, and/or an N-Gram model, and/or a TF-IDF (term frequency-inverse document frequency) model, and/or a preset feature operator, and/or a preset algorithm, to obtain the corresponding feature vector, where the preset feature operator may include one or more of a count, a variance, a mean value, and the like. It should be noted that, in order to convert the processing result into the corresponding feature vector accurately, an appropriate conversion method may be selected based on the type of the processing result. That is, enumeration-type data in the data to be analyzed may be processed and converted based on the one-hot encoding feature processing operator to obtain the corresponding feature vector; and/or data to be aggregated in the data to be analyzed may be processed and converted based on the preset feature operator to obtain the corresponding feature vector; and/or text character strings in the data to be analyzed may be processed and converted based on the N-Gram model and the TF-IDF model, for example by segmenting the text string with a tokenization (part-of-speech tagging) model, constructing a fixed-length bag-of-words model from the N-Grams, and finally computing word frequencies/weights, to obtain the corresponding feature vector; and/or the processing result may be processed and converted based on the preset algorithm to obtain the corresponding feature vector, where the type of the preset algorithm can be determined according to the specific application scenario. The present application is not specifically limited herein.
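The feature conversions described above can be sketched as follows; scikit-learn is used here only as a convenient stand-in for the described operators (the disclosure does not name a library), and the sample data is invented for illustration.

```python
# Sketch: one-hot encoding for enumeration data, aggregation operators for numeric
# data, and N-Gram + TF-IDF for text strings, concatenated into one feature vector.
import numpy as np
from sklearn.preprocessing import OneHotEncoder
from sklearn.feature_extraction.text import TfidfVectorizer

# one-hot encoding for enumeration-type data (e.g. event type)
enum_data = np.array([["login"], ["file_download"], ["login"]])
onehot = OneHotEncoder().fit_transform(enum_data).toarray()

# aggregation operators (count/sum, mean, variance) for data to be aggregated
per_user_counts = np.array([3.0, 7.0, 2.0])
agg_features = [per_user_counts.sum(), per_user_counts.mean(), per_user_counts.var()]

# N-Gram + TF-IDF for text character strings (e.g. operation descriptions)
texts = ["copy secret.doc to usb", "print report.pdf", "copy secret.doc to mail"]
tfidf = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(texts).toarray()

feature_vector = np.hstack([onehot[0], agg_features, tfidf[0]])
print(feature_vector.shape)
```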
Step S204: and carrying out anomaly detection on the feature vector based on a pre-trained anomaly detection model to obtain anomaly data in the data to be analyzed, and/or detecting the feature vector based on a machine learning algorithm to obtain the anomaly data.
Step S205: a behavior portrait of the target user is generated based on the abnormal data.
Referring to fig. 3, fig. 3 is a third flowchart illustrating a user behavior analysis method according to an exemplary embodiment.
The user behavior analysis method related to the disclosure may include the following steps:
step S301: and acquiring data to be analyzed of the target user in the target intranet.
Step S302: and converting the data to be analyzed into corresponding feature vectors.
Step S303: performing anomaly detection on the feature vector based on a pre-trained anomaly detection model to obtain anomaly data in the data to be analyzed, and/or detecting the feature vector based on a machine learning algorithm to obtain anomaly data; the anomaly detection model comprises a detection model generated based on a time sequence algorithm and/or a classification algorithm and/or a statistical analysis algorithm.
It may be appreciated that, in the disclosure, the type of anomaly detection model applied when performing anomaly detection on the feature vectors may be determined according to the application scenario. For example, the anomaly detection model may include a detection model generated based on a time-series algorithm and/or a classification algorithm and/or a statistical analysis algorithm. In some implementation scenarios, the anomaly detection model may be chosen according to the type of feature vector being processed; for example, the anomaly detection model may include a first detection model that is constructed based on the time-series algorithm and the statistical analysis algorithm and performs anomaly detection on single-dimensional time-series data in the data to be analyzed, and the first detection model may be a historical baseline model or the like; the anomaly detection model may also include a second detection model that is constructed based on a classification algorithm, such as an SVM (Support Vector Machine) or xgboost, and performs anomaly detection on multi-dimensional labeled data in the data to be analyzed. The disclosure is not specifically limited herein. In addition, in an application scenario, the anomaly detection model, the abnormal data, and the like may be stored in a database or a file, for example in a MySQL database or an Elasticsearch database, so that they can be managed by means of the database or file. In practical applications, the detection of abnormal data may be recorded from perspectives such as the anomaly detection time, the anomaly detection type, the anomaly detection type description, the abnormal data, the anomaly detection model name, and the type of algorithm used to generate the anomaly detection model, so that the detection process can later be traced from the recorded data. For example, the anomaly detection time may be 2022/5/4, the anomaly detection type may be anomaly detection of the predicted device-reported data volume, the anomaly detection type description may be anomaly detection using a single-dimensional time-series prediction model of the device-reported log data volume, the abnormal data may be the log reporting volume, the anomaly detection model name may be the device-reported data volume detection model, and the model generation algorithm may be FbProphet or the like; the specific algorithm type is not limited by the disclosure herein.
It can be understood that, in an application scenario, the type of the first detection model may be determined according to actual needs. For example, the first detection model may include a model that is built based on a statistical analysis algorithm and detects single-dimensional numerical vector features, where the processing of the statistical analysis algorithm may include computing the feature vector mean, computing the feature vector standard deviation, configuring the mean-deviation parameter, performing detection, and the like. The first detection model may also include a model built based on a time-series algorithm to detect single-dimensional time-series numerical data, where the processing of the time-series algorithm may include building a data trend prediction model, prediction model learning, confidence region setting, and the like.
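A minimal sketch of the statistical-analysis branch of such a first detection model is given below, assuming a mean/standard-deviation baseline with a configurable deviation multiple; the time-series branch would instead fit a trend-prediction model and use its confidence region. The sample data and the deviation multiple are illustrative only.

```python
# Sketch: learn mean/standard deviation of a single-dimensional metric and flag
# values that deviate from the baseline by more than a configured multiple.
import numpy as np

def fit_baseline(history: np.ndarray) -> tuple[float, float]:
    return float(history.mean()), float(history.std())

def detect(value: float, mean: float, std: float, k: float = 3.0) -> bool:
    """True if the value falls outside the mean +/- k*std baseline."""
    return abs(value - mean) > k * std

daily_log_volume = np.array([980, 1010, 995, 1020, 1005, 990, 1000], dtype=float)
mean, std = fit_baseline(daily_log_volume)
print(detect(5000.0, mean, std))   # an abnormally large reporting volume -> True
```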
It may be understood that the process of training the second detection model with the classification algorithm may be determined according to the application scenario. For example, the multi-dimensional data in the feature vectors may first be labeled and then divided into training data and verification data according to the labeling results; the training data is classified with the classification algorithm; the second detection model is configured with parameters and trained with the training data and the corresponding classification results; and after training is completed, the verification data is used to verify the second detection model. If verification shows that the second detection model is the optimal model, it is output; otherwise, the process returns to the parameter configuration step of the second detection model until the optimal second detection model is output.
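This labeling, splitting, training, and verification loop can be sketched roughly as follows, using an SVM classifier from scikit-learn as a stand-in (xgboost would be used analogously); the synthetic data, the F1 metric, and the acceptance threshold are assumptions introduced for illustration.

```python
# Sketch: label multi-dimensional data, split it, train a classifier, and keep the
# model only if it validates well; otherwise retune parameters and retrain.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                 # placeholder multi-dimensional features
y = (X[:, 0] + X[:, 1] > 1.5).astype(int)     # placeholder "abnormal" labels

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
model = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)

score = f1_score(y_val, model.predict(X_val), zero_division=0)
if score >= 0.8:          # acceptance threshold is an assumption, not from the patent
    print("keep model, F1 =", round(score, 3))
else:
    print("retune parameters and retrain, F1 =", round(score, 3))
```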
It may be understood that the process of detecting the feature vectors based on the machine learning algorithm may be determined according to the application scenario. For example, the feature vectors corresponding to multi-dimensional unlabeled data in the data to be analyzed may be detected based on the machine learning algorithm; in this process, the machine learning algorithm can mine and learn the regularities of those feature vectors, find outlying data points among them, and treat the outlying data points as abnormal data. The type of machine learning algorithm applied can also be determined according to the application scenario; for example, it may include a clustering algorithm, a graph algorithm, and the like. In one implementation scenario, the processing of the clustering algorithm may include feature clustering computation, anomaly-recognition parameter configuration and learning, and the like, and the processing of the graph algorithm may include determining whether HMM (Hidden Markov Model) learning is to be performed, performing HMM learning followed by damping-coefficient learning if so, or performing damping-coefficient learning directly if not, and then determining the Top-K parameter based on the damping-coefficient learning result.
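As an illustration of finding outliers in multi-dimensional unlabeled data with a clustering algorithm, the sketch below uses DBSCAN; the disclosure only names "clustering algorithm" and "graph algorithm" generically, so the specific algorithm, parameters, and data here are assumptions.

```python
# Sketch: cluster unlabeled multi-dimensional behaviour records and treat the
# points that belong to no cluster as abnormal data.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, size=(100, 4)),      # normal behaviour cluster
               np.array([[5.0, 5.0, 5.0, 5.0]])])      # one outlying behaviour record

labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(X)
outlier_rows = np.where(labels == -1)[0]                # DBSCAN marks outliers as -1
print("abnormal record indices:", outlier_rows)
```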
It can be understood that the training process of the anomaly detection model can be flexibly determined according to the application scenario, for example, the anomaly detection model can be trained at regular time, and the anomaly detection model can be verified to screen out a final anomaly detection model and the like; in addition, in the process of processing the feature vector by using the anomaly detection model, the feature vector may be matched by using the anomaly detection model, and if the feature vector data does not match the anomaly detection model, the feature vector may be regarded as anomaly data.
Step S304: a behavioral portrayal of the target user is generated based on the anomaly data.
It can be understood that, in an application scenario, in order to make it convenient for users to manage and control the user behavior analysis process provided by the disclosure, a corresponding visual management interface may be provided so that users can manage and control the user behavior analysis process through it. The form of the visual management interface may be determined according to actual needs; for example, the visual interface may be as shown in FIG. 4 and may include task visual management and user-behavior visual management. Task visual management may provide creation, editing, deletion, cloning, execution, and result viewing for anomaly detection model training tasks and anomaly detection tasks, where execution may be manual or timed execution of a task, deletion is the deletion of a task, and result viewing is the viewing of a task's analysis results. An anomaly detection model training task may include task creation/editing/cloning and, specifically, data screening configuration, feature-processing template configuration, analysis algorithm configuration (such as algorithm parameters, model parameters, and threshold parameters), task information, execution period, and the like. An anomaly detection task may likewise include task creation/editing/cloning, such as detection model configuration, data screening configuration, feature-processing template configuration, task information, and the like. User-behavior visual management may provide management of a supervised-user list, management of a supervised user-behavior-event list, viewing of user behavior portraits, management of the detailed information of a user behavior timeline, and the like. The disclosure is not specifically limited herein.
It can be understood that, in an application scenario, in order to improve the operating efficiency of the user behavior analysis method provided by the disclosure, the scheme of the disclosure may be implemented in a multi-task parallel manner, and the specific manner may be determined according to actual needs. For example, a multi-task parallel computing framework may be as shown in FIG. 5; that is, the framework may include a WEB-side module, a computing-framework master control module, a Kafka cluster module, a computing node module, a monitoring module, and a database cluster. The WEB-side module is used for creating and submitting user behavior analysis tasks and for receiving and displaying task status and task result feedback after task execution. The computing-framework master control module is responsible for receiving task applications and execution requests from the WEB-side module and distributing the tasks to the Kafka cluster. The Kafka cluster module is used for distributing computing tasks, that is, distributing task messages to different consumers in the same consumer group, thereby achieving concurrent execution of multiple tasks. The computing node module is used for receiving allocation instructions from the computing-framework master control module, starting the execution of specific computing tasks, and feeding back the running status and results of the tasks to the monitoring module. The monitoring data collection module is used for collecting the task running status information of each computing node, including but not limited to it, and pushing the collected node information to the WEB side for display. The database cluster is used for providing each computing node with queries of the computation data, receiving and storing the computation results returned by each computing node, and providing the WEB side with the data to be displayed. It should be noted that, in the multi-task parallel computing framework disclosed in this embodiment, the tasks corresponding to the user behavior analysis process may be anomaly detection model training tasks, abnormal data analysis tasks, feature vector conversion tasks, and the like; in other words, the user behavior analysis process in this disclosure may be split into tasks and executed task by task, which is not specifically limited herein.
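The Kafka-based task distribution described above can be sketched as follows. This is illustrative only: it assumes the kafka-python client (the disclosure does not name a client library), and the broker address, topic name, and consumer-group name are placeholders rather than values from the disclosure.

```python
# Sketch: master control publishes analysis tasks to a Kafka topic; compute nodes
# in one consumer group share the task stream, so tasks run concurrently.
import json
from kafka import KafkaProducer, KafkaConsumer

BROKERS = "kafka:9092"            # placeholder broker address
TOPIC = "uba-analysis-tasks"      # placeholder topic for analysis tasks

def submit_task(task: dict) -> None:
    """Master-control side: publish one analysis task to the Kafka cluster."""
    producer = KafkaProducer(bootstrap_servers=BROKERS,
                             value_serializer=lambda v: json.dumps(v).encode())
    producer.send(TOPIC, task)
    producer.flush()

def worker_loop() -> None:
    """Compute-node side: consumers in the same group each receive a share of tasks."""
    consumer = KafkaConsumer(TOPIC, bootstrap_servers=BROKERS,
                             group_id="uba-compute-nodes",
                             value_deserializer=lambda v: json.loads(v.decode()))
    for msg in consumer:
        task = msg.value
        print("running task", task.get("task_id"), task.get("kind"))
        # ... run model training / anomaly detection, then report status and results

# submit_task({"task_id": 1, "kind": "anomaly_detection_model_training"})
```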
Referring to fig. 6, fig. 6 is a fourth flowchart illustrating a user behavior analysis method according to an exemplary embodiment.
The user behavior analysis method related to the disclosure may include the following steps:
step S401: and acquiring data to be analyzed of the target user in the target intranet.
Step S402: and converting the data to be analyzed into corresponding feature vectors.
Step S403: and carrying out anomaly detection on the feature vector based on a pre-trained anomaly detection model to obtain anomaly data in the data to be analyzed, and/or detecting the feature vector based on a machine learning algorithm to obtain the anomaly data.
Step S404: and determining the abnormal behavior category and the abnormal behavior event of the target user based on the abnormal data.
Step S405: a behavioral profile is generated based on the abnormal behavior categories and the abnormal behavior events.
It can be appreciated that the process of generating the behavior portraits of the target users based on the abnormal data can be determined according to the application scenario, for example, the abnormal behavior categories and the abnormal behavior events of the target users can be determined based on the abnormal data; and generating a behavior portrait based on the abnormal behavior category and the abnormal behavior event.
It will be appreciated that the abnormal behavior categories and abnormal behavior events may be determined according to the application scenario. For example, the abnormal behavior category may include one or more of a login anomaly, an override (unauthorized privilege) violation, an operation violation, a login violation, an execution anomaly, an unauthorized data export, and the like; the abnormal behavior event may include one or more of: printing downgraded data in violation of policy, unauthorized use of peripherals, login from an unusual location, unauthorized handling of confidential files, unauthorized host login, unauthorized database login, unauthorized access to confidential files, login from a shared device, frequent account deletion, frequent file changes, unauthorized process control, unauthorized modification of registration information, login outside working hours, and the like.
It can be appreciated that, further, the abnormal behavior events of the target user may be graded by risk, for example into super-risk, high-risk, medium-risk, and low-risk, and a risk score may be assigned to the target user, for example a score between 1 and 100, so that the risk the target user poses to the target intranet can be quantified by means of the grading result, the risk score, and the like, and the threat of the target user to the target intranet can be conveniently evaluated from these quantified results.
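A minimal sketch of such grading and scoring is given below; the per-event weights and the grade cut-offs are assumptions, since the disclosure only states that events are graded (super/high/medium/low risk) and that users are scored within 1-100.

```python
# Sketch: map a user's abnormal-behaviour events to a 1-100 risk score and a grade.
EVENT_WEIGHTS = {            # hypothetical weights per abnormal-behaviour event type
    "unauthorized data export": 30,
    "unauthorized access to confidential files": 25,
    "login outside working hours": 5,
}

def risk_score(events: list[str]) -> int:
    return max(1, min(100, sum(EVENT_WEIGHTS.get(e, 10) for e in events)))

def risk_grade(score: int) -> str:
    if score >= 80:
        return "super-risk"
    if score >= 60:
        return "high-risk"
    if score >= 30:
        return "medium-risk"
    return "low-risk"

events = ["unauthorized data export", "login outside working hours"]
score = risk_score(events)
print(score, risk_grade(score))
```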
It can be understood that, in some application scenarios, in order to make it convenient for operation-and-maintenance personnel, security analysts, and others to obtain information such as the abnormal behavior of the target user, each user and the corresponding abnormal behavior events can be recorded by means of a supervised-user list and a supervised user-behavior-event list, from which such personnel can obtain the users' abnormal-behavior-event information. In addition, in the process of generating behavior portraits, operation-and-maintenance personnel, security analysts, and others can select the user abnormal behavior events to be viewed in the supervised-user list and the supervised user-behavior-event list in order to generate the corresponding behavior portraits; the events to be viewed can be selected directly in those lists, or they can be selected with the help of a provided search box, and so on. Furthermore, when the abnormal behavior events of a user are recorded by means of the supervised-user list and the supervised user-behavior-event list, they can be recorded along a timeline so that operation-and-maintenance personnel, security analysts, and others can analyze and handle the user's abnormal behavior events along that timeline. The disclosure is not specifically limited herein.
Referring to fig. 7, fig. 7 is a schematic diagram illustrating a structure of a user behavior analysis apparatus according to an exemplary embodiment.
A user behavior analysis apparatus 700 according to the present disclosure may include:
the first obtaining module 710 is configured to obtain data to be analyzed of the target user in the target intranet;
the first conversion module 720 is configured to convert the data to be analyzed into corresponding feature vectors;
the first detection module 730 is configured to perform anomaly detection on the feature vector based on a pre-trained anomaly detection model to obtain anomaly data in the data to be analyzed, and/or detect the feature vector based on a machine learning algorithm to obtain anomaly data;
a first generation module 740 for generating a behavioral portrayal of the target user based on the anomaly data.
The present disclosure relates to a user behavior analysis apparatus, and a first conversion module may include:
the first processing unit is used for performing normalization processing on the data to be analyzed to obtain a processing result;
the first conversion unit is used for converting the processing result into a corresponding feature vector;
wherein the normalization processing comprises information completion, unified attribute mapping, data verification and merging, association integration, filtering, aggregation, and abnormal behavior identification.
In the user behavior analysis apparatus of the disclosure, the first conversion unit may be configured to: perform feature processing and conversion on the processing result based on a one-hot encoding feature processing operator, and/or an N-Gram model, and/or a TF-IDF model, and/or a preset feature operator, and/or a preset algorithm, to obtain the corresponding feature vector, wherein the preset feature operator comprises a count, a variance, and a mean value.
In the user behavior analysis apparatus of the disclosure, the first conversion unit may be configured to: perform feature processing and conversion on enumeration-type data in the data to be analyzed based on the one-hot encoding feature processing operator to obtain the corresponding feature vector; and/or perform feature processing and conversion on data to be aggregated in the data to be analyzed based on the preset feature operator to obtain the corresponding feature vector; and/or perform feature processing and conversion on text character strings in the data to be analyzed based on the N-Gram model and the TF-IDF model to obtain the corresponding feature vectors; and/or perform feature processing and conversion on the processing result based on the preset algorithm to obtain the corresponding feature vector.
The present disclosure relates to a user behavior analysis device, and the first detection module may include:
the first detection unit is used for carrying out anomaly detection on the feature vector based on a pre-trained anomaly detection model, wherein the anomaly detection model comprises a detection model generated based on a time sequence algorithm and/or a classification algorithm and/or a statistical analysis algorithm.
The present disclosure relates to a user behavior analysis device, an abnormality detection model including:
a first detection model which is built based on a timing sequence algorithm and a statistical analysis algorithm and is used for carrying out anomaly detection on single-dimensional timing sequence data in the data to be analyzed;
and a second detection model which is constructed based on a classification algorithm and is used for carrying out anomaly detection on the multidimensional labeled data in the data to be analyzed, wherein the classification algorithm comprises SVM and xgboost.
The present disclosure relates to a user behavior analysis device, and the first detection module may include:
the second detection unit is used for detecting the feature vector corresponding to the multidimensional label-free data in the data to be separated based on a machine learning algorithm, and the machine learning algorithm comprises a clustering algorithm and a graph algorithm.
The type of data to be analyzed can comprise user behavior information, secret-related information circulation data, network flow data, user own information and intranet information.
The present disclosure relates to a user behavior analysis device, and the first generation module may include:
the first determining unit is used for determining the abnormal behavior category and the abnormal behavior event of the target user based on the abnormal data;
The first generation unit is used for generating a behavior portrait based on the abnormal behavior category and the abnormal behavior event.
In the user behavior analysis apparatus of the disclosure, the abnormal behavior category may include one or more of a login anomaly, an override (unauthorized privilege) violation, an operation violation, a login violation, an execution anomaly, and an unauthorized data export; the abnormal behavior event may include one or more of: printing downgraded data in violation of policy, unauthorized use of peripherals, login from an unusual location, unauthorized handling of confidential files, unauthorized host login, unauthorized database login, unauthorized access to confidential files, login from a shared device, frequent account deletion, frequent file changes, unauthorized process control, unauthorized modification of registration information, and login outside working hours.
Fig. 8 is a block diagram of an electronic device 900, according to an example embodiment. As shown in fig. 8, the electronic device 900 may include: processor 901, memory 902. The electronic device 900 may also include one or more of a multimedia component 903, an input/output (I/O) interface 904, and a communication component 905.
The processor 901 is configured to control the overall operation of the electronic device 900 so as to perform all or part of the steps of the user behavior analysis method described above. The memory 902 is used to store various types of data to support operation of the electronic device 900; such data may include, for example, instructions for any application or method operating on the electronic device 900, as well as application-related data such as contact data, sent and received messages, pictures, audio, and video. The memory 902 may be implemented by any type of volatile or non-volatile memory device or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, a magnetic disk, or an optical disk. The multimedia component 903 may include a screen and an audio component, where the screen may be, for example, a touch screen, and the audio component is used for outputting and/or inputting audio signals; for example, the audio component may include a microphone for receiving external audio signals, and the received audio signals may be further stored in the memory 902 or transmitted through the communication component 905. The audio component further includes at least one speaker for outputting audio signals. The I/O interface 904 provides an interface between the processor 901 and other interface modules, which may be a keyboard, a mouse, buttons, and so on; these buttons may be virtual buttons or physical buttons. The communication component 905 is used for wired or wireless communication between the electronic device 900 and other devices. Wireless communication may be, for example, Wi-Fi, Bluetooth, near field communication (NFC), 2G, 3G, or 4G, or a combination of one or more of them, so the corresponding communication component 905 may include a Wi-Fi module, a Bluetooth module, and an NFC module.
In an exemplary embodiment, the electronic device 900 may be implemented by one or more application specific integrated circuits (Application Specific Integrated Circuit, abbreviated as ASIC), digital signal processors (Digital Signal Processor, abbreviated as DSP), digital signal processing devices (Digital Signal Processing Device, abbreviated as DSPD), programmable logic devices (Programmable Logic Device, abbreviated as PLD), field programmable gate arrays (Field Programmable Gate Array, abbreviated as FPGA), controllers, microcontrollers, microprocessors, or other electronic components for performing the user behavior analysis method described above.
In another exemplary embodiment, a computer readable storage medium is also provided comprising program instructions which, when executed by a processor, implement the steps of the user behavior analysis method described above. For example, the computer readable storage medium may be the memory 902 described above including program instructions executable by the processor 901 of the electronic device 900 to perform the user behavior analysis method described above.
For the relevant parts of the user behavior analysis device, the electronic device, and the computer readable storage medium provided in the embodiments of the present disclosure, reference may be made to the detailed description of the corresponding parts of the user behavior analysis method provided in the embodiments of the present disclosure, which is not repeated here. In addition, the parts of the foregoing technical solutions provided in the embodiments of the present disclosure that are consistent with the implementation principles of the corresponding technical solutions in the prior art are not described in detail, so as to avoid redundant description.
It is further noted that relational terms such as first and second, and the like, are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (13)

1. A method of user behavior analysis, comprising:
acquiring data to be analyzed of a target user in a target intranet;
converting the data to be analyzed into corresponding feature vectors;
performing anomaly detection on the feature vector based on a pre-trained anomaly detection model to obtain anomaly data in the data to be analyzed, and/or detecting the feature vector based on a machine learning algorithm to obtain the anomaly data;
and generating a behavior portrait of the target user based on the anomaly data.
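Read as a pipeline, claim 1 amounts to four steps executed in sequence. The sketch below is a minimal, self-contained illustration of that flow; every function, field name, and threshold is a toy stand-in chosen for the example, not something defined by the disclosure.

```python
# Minimal sketch of the claimed four-step flow with toy stand-ins.
def collect_intranet_data(user):
    # Step 1 (placeholder): in practice this would pull logs, traffic, etc.
    return [{"user": user, "login_hour": 3, "bytes_out": 5_000_000},
            {"user": user, "login_hour": 10, "bytes_out": 12_000}]

def to_feature_vectors(records):
    # Step 2 (placeholder): one numeric vector per record.
    return [[r["login_hour"], r["bytes_out"]] for r in records]

def detect_anomalies(vectors):
    # Step 3 (placeholder): flag unusually large outbound transfers.
    return [v for v in vectors if v[1] > 1_000_000]

def analyze_user_behavior(user):
    raw = collect_intranet_data(user)              # step 1: acquire data
    vectors = to_feature_vectors(raw)              # step 2: feature vectors
    anomalies = detect_anomalies(vectors)          # step 3: anomaly detection
    return {"user": user, "anomalies": anomalies}  # step 4: simplified portrait

print(analyze_user_behavior("u1001"))
```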
2. The method of claim 1, wherein the converting the data to be analyzed into corresponding feature vectors comprises:
performing normalization processing on the data to be analyzed to obtain a processing result;
converting the processing result into the corresponding feature vector;
the normalization processing comprises information completion, unified attribute mapping, data verification and merging, association and integration, filtering, aggregation, and abnormal behavior identification.
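As an illustration of the kind of normalization this claim describes, the sketch below applies information completion, unified attribute mapping, verification and merging, filtering and aggregation to a small pandas DataFrame. The column names, values, and rules are assumptions made for the example only.

```python
import pandas as pd

# Illustrative raw log records; field names are assumptions, not from the disclosure.
raw = pd.DataFrame([
    {"user": "u1001", "dept": None,  "action": "login", "host": "srv-01", "ts": "2022-10-24 02:13"},
    {"user": "u1001", "dept": "R&D", "action": "login", "host": "srv-01", "ts": "2022-10-24 02:13"},
    {"user": "u1002", "dept": "HR",  "action": "PRINT", "host": "srv-02", "ts": "2022-10-24 10:05"},
])

# Information completion: fill missing attributes from other records of the same user.
raw["dept"] = raw.groupby("user")["dept"].transform(lambda s: s.ffill().bfill())

# Unified attribute mapping: normalize action names to one vocabulary.
raw["action"] = raw["action"].str.lower().map({"login": "login", "print": "print"})

# Verification and merging: drop exact duplicates produced by repeated collection.
clean = raw.drop_duplicates()

# Filtering and aggregation: per-user, per-action counts as one simple aggregate.
agg = clean.groupby(["user", "action"]).size().rename("count").reset_index()
print(agg)
```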
3. The method of claim 2, wherein said converting said processing result into the corresponding feature vector comprises:
performing feature processing and conversion on the processing result based on one-hot encoding, and/or an N-Gram model, and/or a TF-IDF model, and/or a preset feature operator, and/or a preset algorithm to obtain the corresponding feature vector, wherein the preset feature operator comprises a count, a variance, and a mean value.
4. The method according to claim 3, wherein the performing feature processing and conversion on the processing result based on one-hot encoding, and/or the N-Gram model, and/or the TF-IDF model, and/or the preset feature operator, and/or the preset algorithm to obtain the corresponding feature vector comprises:
performing feature processing and conversion on enumeration-type data in the data to be analyzed based on one-hot encoding to obtain the corresponding feature vector;
and/or performing feature processing and conversion on data to be aggregated in the data to be analyzed based on the preset feature operator to obtain the corresponding feature vector;
and/or performing feature processing and conversion on text character strings in the data to be analyzed based on the N-Gram model and the TF-IDF model to obtain the corresponding feature vector;
and/or performing feature processing and conversion on the processing result based on the preset algorithm to obtain the corresponding feature vector.
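The branches of this claim correspond to standard feature-engineering operations. The sketch below shows one possible realization with scikit-learn and pandas on made-up data; it is not the claimed implementation, and how the resulting blocks are fused into a single feature vector is left open here.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import OneHotEncoder
from sklearn.feature_extraction.text import TfidfVectorizer

# Enumeration-type data -> one-hot encoding.
enum_data = pd.DataFrame({"action": ["login", "print", "login", "delete"]})
onehot = OneHotEncoder().fit_transform(enum_data).toarray()

# Data to be aggregated -> count / variance / mean as preset feature operators.
bytes_out = pd.Series([120, 95_000, 300, 88_000])
agg_features = np.array([bytes_out.count(), bytes_out.var(), bytes_out.mean()])

# Text character strings -> character N-grams weighted by TF-IDF.
commands = ["scp /secret/plan.doc ext-host:", "ls -la /home", "rm -rf /tmp/cache"]
tfidf = TfidfVectorizer(analyzer="char", ngram_range=(2, 3)).fit_transform(commands)

# A combined per-record feature vector could concatenate these blocks; the
# fusion strategy is an open design choice, not specified by the claim.
print(onehot.shape, agg_features.shape, tfidf.shape)
```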
5. The method of claim 1, wherein the anomaly detection of the feature vector based on a pre-trained anomaly detection model comprises:
performing anomaly detection on the feature vector based on the pre-trained anomaly detection model, wherein the anomaly detection model comprises a detection model generated based on a time-series algorithm and/or a classification algorithm and/or a statistical analysis algorithm.
6. The method of claim 5, wherein the anomaly detection model comprises:
a first detection model, constructed based on the time-series algorithm and the statistical analysis algorithm, for performing anomaly detection on single-dimensional time-series data in the data to be analyzed;
and a second detection model, constructed based on the classification algorithm, for performing anomaly detection on multidimensional labeled data in the data to be analyzed, wherein the classification algorithm comprises SVM and XGBoost.
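As one concrete reading of these two models, the sketch below uses a 3-sigma rule to stand in for the time-series/statistical-analysis path on single-dimensional data, and an SVM classifier for the multidimensional labeled path. Both choices and all data are illustrative assumptions; XGBoost could be substituted for the classifier in the same way.

```python
import numpy as np
from sklearn.svm import SVC

# First detection model (sketch): 3-sigma rule on a single-dimensional time
# series, standing in for the time-series / statistical analysis algorithm.
def three_sigma_anomalies(series):
    mu, sigma = np.mean(series), np.std(series)
    return [i for i, x in enumerate(series) if abs(x - mu) > 3 * sigma]

logins_per_hour = np.array([2, 3, 2, 4, 3, 2, 3, 2, 40, 3, 2])
print("time-series anomalies at indices:", three_sigma_anomalies(logins_per_hour))

# Second detection model (sketch): classifier on multidimensional labeled data.
X = np.array([[2, 0.1], [3, 0.2], [2, 0.1], [50, 9.5], [45, 8.7]])  # toy features
y = np.array([0, 0, 0, 1, 1])                                       # 1 = anomalous
clf = SVC(kernel="rbf").fit(X, y)
print("predicted label for [48, 9.0]:", clf.predict([[48, 9.0]]))
```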
7. The method of claim 1, wherein the detecting the feature vector based on a machine learning algorithm comprises:
detecting, based on the machine learning algorithm, the feature vector corresponding to multidimensional unlabeled data in the data to be analyzed, wherein the machine learning algorithm comprises a clustering algorithm and a graph algorithm.
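For the unlabeled multidimensional case, one common clustering-based interpretation is to treat points that no cluster absorbs as anomalies. The DBSCAN sketch below follows that interpretation on made-up vectors; it is an assumption for illustration, not necessarily the clustering or graph algorithm of the disclosure.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Toy multidimensional, unlabeled feature vectors, e.g. [logins, MB sent, files touched].
X = np.array([
    [3, 1.2, 10], [4, 0.9, 12], [3, 1.1, 11], [5, 1.4, 9],
    [2, 0.8, 13], [4, 1.0, 10],
    [40, 250.0, 300],   # an obvious outlier
])

labels = DBSCAN(eps=3.0, min_samples=3).fit_predict(X)
anomalous = np.where(labels == -1)[0]   # DBSCAN marks noise points with label -1
print("anomalous record indices:", anomalous)
```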
8. The method according to any one of claims 1 to 7, wherein the data to be analyzed includes user behavior information, confidential information flow data, network traffic data, the user's own information, and intranet information.
9. The method of claim 8, wherein the generating a behavior portrait of the target user based on the anomaly data comprises:
determining an abnormal behavior category and an abnormal behavior event of the target user based on the anomaly data;
and generating the behavior portrait based on the abnormal behavior category and the abnormal behavior event.
10. The method of claim 9, wherein the abnormal behavior category includes one or more of a login anomaly, an override violation, an operation violation, a login violation, a performance anomaly, and a data outgoing violation; and the abnormal behavior event includes one or more of: printing downgraded data, unauthorized use of peripherals, login from an off-site address, unauthorized alteration of confidential files, unauthorized host login, unauthorized database login, unauthorized access to confidential files, login from a shared device, frequent account deletion, frequent file modification, unauthorized process control, unauthorized modification of registration information, and login during non-working hours.
11. A user behavior analysis apparatus, comprising:
the first acquisition module is used for acquiring data to be analyzed of a target user in a target intranet;
the first conversion module is used for converting the data to be analyzed into corresponding feature vectors;
the first detection module is used for performing anomaly detection on the feature vector based on a pre-trained anomaly detection model to obtain anomaly data in the data to be analyzed, and/or detecting the feature vector based on a machine learning algorithm to obtain the anomaly data;
and the first generation module is used for generating a behavior portrait of the target user based on the anomaly data.
12. An electronic device, comprising:
a memory for storing a computer program;
a processor for executing the computer program in the memory to implement the steps of the method of any one of claims 1 to 10.
13. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method according to any one of claims 1 to 10.
CN202211302471.9A 2022-10-24 2022-10-24 User behavior analysis method and device, electronic equipment and computer storage medium Pending CN116112194A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211302471.9A CN116112194A (en) 2022-10-24 2022-10-24 User behavior analysis method and device, electronic equipment and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211302471.9A CN116112194A (en) 2022-10-24 2022-10-24 User behavior analysis method and device, electronic equipment and computer storage medium

Publications (1)

Publication Number Publication Date
CN116112194A true CN116112194A (en) 2023-05-12

Family

ID=86262039

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211302471.9A Pending CN116112194A (en) 2022-10-24 2022-10-24 User behavior analysis method and device, electronic equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN116112194A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116451190A (en) * 2023-06-15 2023-07-18 恺恩泰(南京)科技有限公司 Data authority setting method based on Internet medical service system
CN116451190B (en) * 2023-06-15 2023-08-18 恺恩泰(南京)科技有限公司 Data authority setting method based on Internet medical service system
CN116702229A (en) * 2023-08-04 2023-09-05 四川蓉城蕾茗科技有限公司 Safety house information safety control method and system
CN116702229B (en) * 2023-08-04 2023-11-21 四川蓉城蕾茗科技有限公司 Safety house information safety control method and system

Similar Documents

Publication Publication Date Title
US11586972B2 (en) Tool-specific alerting rules based on abnormal and normal patterns obtained from history logs
US11157629B2 (en) Identity risk and cyber access risk engine
US11336669B2 (en) Artificial intelligence cyber security analyst
EP3925194B1 (en) Systems and methods for detecting security incidents across cloud-based application services
US11012472B2 (en) Security rule generation based on cognitive and industry analysis
US10771493B2 (en) Cognitive security exposure analysis and resolution based on security trends
US10264009B2 (en) Automated machine learning scheme for software exploit prediction
US11258814B2 (en) Methods and systems for using embedding from Natural Language Processing (NLP) for enhanced network analytics
WO2017037443A1 (en) Predictive human behavioral analysis of psychometric features on a computer network
CN116112194A (en) User behavior analysis method and device, electronic equipment and computer storage medium
CN111475370A (en) Operation and maintenance monitoring method, device and equipment based on data center and storage medium
Costante et al. A white-box anomaly-based framework for database leakage detection
CN113704328B (en) User behavior big data mining method and system based on artificial intelligence
US20230281249A1 (en) Computer-implemented methods, systems comprising computer-readable media, and electronic devices for enabled intervention into a network computing environment
US10291483B2 (en) Entity embedding-based anomaly detection for heterogeneous categorical events
CN113704772B (en) Safety protection processing method and system based on user behavior big data mining
US11290325B1 (en) System and method for change reconciliation in information technology systems
Paul et al. An ontology-based integrated assessment framework for high-assurance systems
KR102433233B1 (en) Security compliance automation method
Khan et al. Context-based irregular activity detection in event logs for forensic investigations: An itemset mining approach
US20240080332A1 (en) System and method for gathering, analyzing, and reporting global cybersecurity threats
Malek et al. GUI-based user behavior intrusion detection
Wang et al. Security situational awareness of power information networks based on machine learning algorithms
Baror et al. Functional Architectural Design of a Digital Forensic Readiness Cybercrime Language as a Service
US20240073229A1 (en) Real time behavioral alert processing in computing environments

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination