CN116702229B - Secure room information security management and control method and system - Google Patents


Info

Publication number
CN116702229B
CN116702229B (application CN202310974064.0A)
Authority
CN
China
Prior art keywords: file, sequence, behavior, user, preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310974064.0A
Other languages
Chinese (zh)
Other versions
CN116702229A (en)
Inventor
梁鹏
吴皓月
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Leiming Technology Co., Ltd.
Sichuan Rongcheng Leiming Technology Co., Ltd.
Original Assignee
Suzhou Leiming Technology Co., Ltd.
Sichuan Rongcheng Leiming Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Leiming Technology Co., Ltd. and Sichuan Rongcheng Leiming Technology Co., Ltd.
Priority to CN202310974064.0A
Publication of CN116702229A
Application granted
Publication of CN116702229B
Legal status: Active


Classifications

    • G06F 21/64: Protecting data integrity, e.g. using checksums, certificates or signatures
    • G06F 21/604: Tools and structures for managing or administering access control systems
    • G06F 21/6209: Protecting access to data via a platform, e.g. using keys or access control rules, to a single file or object
    • G06N 3/04: Neural networks; architecture, e.g. interconnection topology
    • G06N 3/08: Neural networks; learning methods
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]


Abstract

The invention provides a method and a system for secure room information security management and control. The method comprises: identifying user behavior within a secret-related area, the user behavior including action behavior and/or operation behavior; judging whether the user behavior satisfies a preset condition; and, in response to the preset condition being satisfied, determining that the user behavior is a leakage behavior. The method and system provided by the embodiments of this specification can accurately identify leakage behavior, effectively prevent the leakage of important information, and improve the reliability of the information security management and control system.

Description

Secure room information security management and control method and system
Technical Field
The present disclosure relates to the field of information security, and in particular to a method and system for secure room information security management and control.
Background
With the spread of computers, the Internet, and enterprise informatization, most enterprise information is managed and stored as electronic files, which facilitates the production, storage, retrieval, sharing, and dissemination of enterprise information. However, without an effective information security management mechanism, a single information leak can cause immeasurable losses to an enterprise. Guaranteeing information security during information circulation is therefore particularly important. The secure room exists precisely to guarantee information security: used properly, it makes information safer and more controllable as it circulates.
To address information leakage, CN110087238B provides an information security protection system for mobile electronic devices, comprising a user mobile terminal, wireless communication equipment, and a server terminal. The server terminal includes modules for encrypting file data and managing usage rights, managing the rights of mobile electronic devices, detecting and promptly isolating viruses in the system, and performing security monitoring of websites and networks. However, the system is inflexible and sometimes interferes with normal work.
A method and system for secure room information security management and control is therefore helpful for protecting enterprise information security and preventing information leakage.
Disclosure of Invention
To improve the reliability of the information security management and control system and solve the technical problem of identifying leakage behavior so as to prevent the leakage of important information, the invention intelligently evaluates user behavior to judge whether it constitutes leakage behavior, and can accurately single out potentially abnormal user behavior, thereby effectively avoiding security incidents such as the leakage of important information.
The invention provides a secure room information security management and control system, comprising: an identification module for identifying user behavior within a secret-related area, the user behavior including action behavior and/or operation behavior; a judging module for judging whether the user behavior satisfies a preset condition; and a determining module for determining, in response to the preset condition being satisfied, that the user behavior is a leakage behavior.
In some embodiments the system further comprises: a generation module for constructing an association degree map and calculating an association degree sequence based on the map; and a prediction module for determining an anomaly probability through a prediction model based on a file operation sequence, a security level sequence, and the association degree sequence.
The invention also provides a secure room information security management and control method, executed by a processor, comprising: identifying user behavior within the secret-related area, the user behavior including action behavior and/or operation behavior; judging whether the user behavior satisfies a preset condition; and, in response to the preset condition being satisfied, determining that the user behavior is a leakage behavior.
Drawings
The present specification is further described by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. The embodiments are not limiting; in the figures, like numerals denote like structures.
FIG. 1 is an exemplary flowchart of a secure room information security management and control method according to some embodiments of the present specification;
FIG. 2 is an exemplary flowchart for determining an anomaly probability according to some embodiments of the present specification;
FIG. 3 is an exemplary schematic diagram of an association degree map according to some embodiments of the present specification;
FIG. 4 is an exemplary schematic diagram of a prediction model according to some embodiments of the present specification.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings that are required to be used in the description of the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some examples or embodiments of the present specification, and it is possible for those of ordinary skill in the art to apply the present specification to other similar situations according to the drawings without inventive effort. Unless otherwise apparent from the context of the language or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It will be appreciated that "system," "apparatus," "unit" and/or "module" as used herein is one method for distinguishing between different components, elements, parts, portions or assemblies at different levels. However, if other words can achieve the same purpose, the words can be replaced by other expressions.
As used in this specification and the claims, "a," "an," and/or "the" do not denote the singular and may include the plural unless the context clearly indicates otherwise. In general, the terms "comprise" and "include" merely indicate that explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
Flowcharts are used in this specification to describe the operations performed by systems according to its embodiments. It should be appreciated that these operations are not necessarily performed exactly in order; rather, steps may be processed in reverse order or concurrently, and other operations may be added to, or removed from, these processes.
To improve information security supervision, CN110087238B proposes an information security protection system for mobile electronic devices that manages and monitors the rights of users of those devices; it is inflexible, however, and easily interferes with normal work. The embodiments of this specification therefore monitor user behavior through a secure room information security management and control system, calculate the anomaly probability of user behavior with a prediction model, and evaluate the likelihood of leakage by the user from that probability, reducing the risk of information leakage.
In some embodiments, the secure room can effectively resolve the contradiction between execution efficiency and data access security management in big-data processing. In some embodiments, the secure room method and terminal are applicable to various data flows and, on the basis of data isolation, well guarantee security during data use.
In some embodiments, a secure room information security management and control system includes: an identification module for identifying user behavior within a secret-related area, the user behavior including action behavior and/or operation behavior; a judging module for judging whether the user behavior satisfies a preset condition; and a determining module for determining, in response to the preset condition being satisfied, that the user behavior is a leakage behavior.
In some embodiments, the preset condition includes the degree of abnormality of the user behavior being greater than a preset abnormality threshold, and the determining module is further configured to calculate the degree of abnormality of the user behavior based on preset employee behavior rules.
In some embodiments, the preset condition includes the anomaly probability of the user behavior being greater than a preset probability threshold, and the judging module is further configured to: determine a file operation sequence within a preset time period based on the user behavior, the file operation sequence including a file opening sequence and a file downloading sequence; generate a security level sequence and an association degree sequence based on the file operation sequence; and predict the anomaly probability based on the security level sequence and the association degree sequence.
In some embodiments, the system further comprises a generation module for constructing an association degree map, whose nodes comprise files and file tags and whose edges represent the association relationships among the nodes, and for calculating the association degree sequence based on the map.
In some embodiments, the system further comprises a prediction module for determining the anomaly probability through a prediction model, a machine learning model, based on the file operation sequence, the security level sequence, and the association degree sequence.
In some embodiments, the identification module, the judging module, the determining module, the generation module, and the prediction module may be integrated into a processor, and the process 100 in FIG. 1 and the process 200 in FIG. 2 below may be regarded as executed by that processor.
FIG. 1 is an exemplary flowchart of a secure room information security management and control method according to some embodiments of the present specification.
In some embodiments, the process 100 may be implemented based on the secure room information security management and control system. As shown in FIG. 1, the process 100 includes the following steps:
In step 110, user behavior within the secret-related area is identified, the user behavior including action behavior and/or operation behavior.
The secret-related area is an area that must be kept confidential; it may include, for example, confidential sites, confidential departments, and the areas in which secret-related devices are used. The area of use of a secret-related device refers to the region covered by the monitoring camera. A secret-related device is a device that stores data requiring confidentiality or that is located in a secret-related area; it includes, for example, at least one of a secret-related computer, a secret-related storage device, and the like. In some embodiments, the secret-related device may be used to monitor user behavior within the secret-related area (e.g., plugging in a USB flash drive, photographing the screen, opening or closing files, etc.).
User behavior refers to behavior a user exhibits within the secret-related area. In some embodiments, user behavior may include action behavior and/or operation behavior.
Action behavior refers to physical actions a user takes on a secret-related device in the secret-related area, for example turning hardware off and/or on, connecting an external device, blocking a camera, plugging in a network cable, turning on Bluetooth, etc.
Operation behavior refers to a user's operations on confidential data in the secret-related area, for example mouse operations that open, view, download, copy, or send/receive confidential data, as well as usage records of computer programs (e.g., statistics on screen captures, uploads by chat tools, etc.).
Confidential data is data that must be kept secret, whether expressed as text, numbers, symbols, graphics, images, sound, or the like. In some embodiments, confidential data includes the files on a secret-related device.
In some embodiments, user behavior may also include any other behavior exhibited within the secret-related area.
User behavior can be obtained in a variety of ways. For example, the identification module may monitor the screen, programs, files, etc. on a secret-related device through the monitoring programs installed on it, thereby obtaining the user's operation behavior.
In some embodiments, the identification module may also obtain surveillance images from cameras installed in the secret-related area and recognize action behavior from them, for example photographing the screen with a mobile phone, peeping at the screen, and the like.
Step 120, determining whether the user behavior satisfies a preset condition.
A preset condition is a judgment condition for evaluating whether leakage behavior exists; leakage behavior is described further below. For example, the preset condition may be any condition that is satisfied when the user behavior matches a preset abnormal behavior. Preset abnormal behaviors are behaviors with a high risk of leakage, for example high-frequency screen captures, copying confidential files to an unknown USB flash drive, disabling a camera, or recording confidential data and carrying it out (e.g., capturing data with a hidden recording device, or transcribing it in writing or otherwise and carrying it out of the secret-related area).
In some embodiments, the preset condition may include the degree of abnormality of the user behavior being greater than a preset abnormality threshold. The judging module can calculate the degree of abnormality of the user behavior based on preset employee behavior rules.
Employee behavior rules are rules, set in advance (e.g., by the company), for judging whether user behavior is abnormal. For example, they may include at least one of the following: when accessing a file, the user holds the user rights corresponding to that file; the file belongs to the user's department; the camera is not blocked; and so on.
In some embodiments, the judging module may be configured in advance with the file-access rights of employees in different positions and with the access rights attached to each storage space on the secret-related device (corresponding to the user rights). The user rights and file access rights of each position can be expressed in various ways.
The preset abnormality threshold is the maximum tolerated degree of abnormality of user behavior. In some embodiments it may be preset from experience, or calculated from statistics: for example, the judging module may statistically analyze historical user behavior data (e.g., over the past week or month), determine the standard deviation of the degree of abnormality, and weight that standard deviation by a weighting coefficient to obtain the preset abnormality threshold. The weighting coefficient may be a system default or a manually set value, for example 3.
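A minimal sketch of this statistical thresholding, assuming the historical abnormality degrees are available as a list of floats; adding the historical mean as a baseline term is an assumption (in the spirit of a 3-sigma rule), since the text only specifies weighting the standard deviation by a coefficient such as 3:

```python
import statistics

def preset_abnormality_threshold(history: list[float], k: float = 3.0) -> float:
    """Derive the preset abnormality threshold from historical degrees of
    abnormality: weighting coefficient k times their standard deviation,
    plus the historical mean (the mean term is an assumption)."""
    return statistics.mean(history) + k * statistics.pstdev(history)

# e.g. abnormality degrees observed over the past month
print(preset_abnormality_threshold([0.0, 1.0, 1.0, 2.0, 0.0, 1.0]))
```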
The degree of abnormality of the user behavior refers to the degree of deviation of the user behavior from the employee behavior rules.
In some embodiments, the judging module may determine the degree of abnormality of the user behavior from the degree of deviation of the behavior from each employee behavior rule, for example from the sum of the deviations. Illustratively, the degree of abnormality may be determined by formula (1):
D1 = E0 + E1 + E2 + … + En (1)
where D1 is the degree of abnormality of the user behavior; E0 is the file-access-right deviation, a preset value derived from the user rights of the employee in question (for example, 0 if the user rights match the file access rights and 1 otherwise); and E1, E2, … En are the deviations of the other user behaviors from their respective rules. For example, E1 indicates whether the file belongs to the employee's department (0 if it does, 1 if not); E2 indicates whether the camera is blocked (0 if unblocked, 1 if blocked); the remaining En map the other employee behavior rules to numerical values in the same way. The access time T, a duration of file access evaluated against time ranges that may differ by security level, can influence the values of E0 through En.
In some embodiments, the judging module may instead use a weighted sum of the deviations:
D1 = k0·E0 + k1·E1 + k2·E2 + … + kn·En (2)
where k0, k1, k2, … kn are weight coefficients, values in the interval [0,1] preset for the different deviations.
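The two formulas reduce to a plain and a weighted sum over the per-rule deviations. A minimal sketch, assuming each deviation has already been mapped to a number as described above (the function and variable names are illustrative):

```python
def abnormality_degree(deviations, weights=None):
    """Formula (1): plain sum of the per-rule deviations [E0, E1, ..., En].
    Formula (2): weighted sum with coefficients [k0, ..., kn] in [0, 1].
    E0 is the file-access-right deviation; its value (and the others') may
    depend on the access time T, which is resolved before this call."""
    if weights is None:
        return sum(deviations)                               # formula (1)
    return sum(k * e for k, e in zip(weights, deviations))   # formula (2)

# file within access rights (E0=0), wrong department (E1=1), camera clear (E2=0)
print(abnormality_degree([0, 1, 0]))                   # 1
print(abnormality_degree([0, 1, 0], [0.5, 0.8, 0.6]))  # 0.8
```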
The overall degree of abnormality can be obtained in several ways. For example, the judging module may calculate the degree of abnormality of each behavior of a given user within a preset time period and sum them to obtain the user's final degree of abnormality. The preset time period may be a default or manually set time range.
In some embodiments, for a highly sensitive file (e.g., one with a higher security level), the degrees of abnormality of all user behaviors involving that file within the preset time period (possibly from different users) may be summed in the same way: the judging module calculates the degree of abnormality of each behavior that touches the file and sums them over the period. Aggregating over all behaviors that involve a high-sensitivity file improves the monitoring of that file and the reliability of confidentiality.
In some embodiments of this specification, calculating the degree of abnormality against preset employee behavior rules allows user behavior to be analyzed comprehensively and accurately, which helps to accurately identify leakage behavior later.
In some embodiments, the preset condition may further include the anomaly probability of the user behavior being greater than a preset probability threshold; see the description of FIG. 2 for related embodiments.
In some embodiments, the judging module may determine that the preset condition is satisfied when the final degree of abnormality exceeds the preset abnormality threshold.
In some embodiments, the degree of abnormality may be determined in any other way, for example by using a machine learning model to measure the similarity between the user behavior and preset abnormal behaviors.
Step 130, in response to the preset condition being satisfied, determining that the user behavior is a leakage behavior.
A leakage behavior is a user behavior through which confidential data has been, or may be, leaked.
In some embodiments, the determining module may obtain records of the user's file accesses, copies, and similar operations; calculate the degree of abnormality of each behavior in the records; sum (arithmetically or with weights) all of the degrees within the preset time period; and judge that the employee exhibits leakage behavior when the sum satisfies the preset condition (e.g., exceeds the preset abnormality threshold).
In some embodiments, for the several user behaviors (possibly from different users) that involve a high-sensitivity file within the preset time period, the determining module may sum their degrees of abnormality and, when the sum satisfies the preset condition, judge that the file may have been leaked.
In some embodiments, different preset conditions may be set for different time ranges: for example, different preset abnormality thresholds by security level and time range, with higher security levels given lower thresholds.
In some embodiments, the determining module may take the maximum or another statistic (e.g., the standard deviation) of all behavior abnormality degrees within the preset time period, set a corresponding preset condition (e.g., a corresponding preset abnormality threshold), and judge that the preset condition is met, and leakage behavior exists, when that statistic exceeds the threshold.
In some embodiments, a leakage behavior may be a combination of multiple user behaviors in chronological order, which the determining module judges against preset event rules.
A preset event rule is a rule for judging leakage behavior, composed of multiple events, where an event is an occurrence related to user behavior within the secret-related area, for example a user opening a file and disabling a camera while material with a high security level is present on a secret-related device in the area.
In some embodiments, the determining module raises the leakage risk level when the user behaviors, ordered in time, meet the event ordering and category requirements of a preset event rule.
For example, when confidential information is present, the leakage risk level is raised upon monitoring one or more of the following user behaviors: opening a file, disabling a camera, inserting a storage device such as a USB flash drive, copying a file, capturing the screen, sending a picture or compressed package, sending a screenshot file or compressed package, or disconnecting the public network or the computer monitor.
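One plausible way to check such a rule is an ordered-subsequence match over the time-sorted behavior log: every event of the rule must appear, in order, though other behaviors may occur in between. A sketch under that assumption (the patent does not fix the matching algorithm):

```python
def matches_event_rule(behaviors, rule):
    """True if the events of `rule` occur in `behaviors` in the given order,
    possibly with unrelated behaviors interleaved."""
    it = iter(behaviors)               # membership tests consume the iterator,
    return all(e in it for e in rule)  # enforcing the chronological order

rule = ("open file", "block camera", "insert usb drive", "copy file")
log = ["log in", "open file", "browse", "block camera",
       "insert usb drive", "copy file", "log out"]
if matches_event_rule(log, rule):
    print("raise leakage risk level")
```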
In some embodiments, the determining module may determine that confidential information is present after the monitoring program observes an important document being generated on a secret-related device or a screenshot/document being uploaded, or after a camera observes an event such as a contract being signed or a customer visit.
In some embodiments, raising the leakage risk level may include determining that the user behavior is a leakage behavior, issuing an alarm, revoking the corresponding user rights, and so on.
In some embodiments, the determining module may judge whether user behavior constitutes leakage behavior according to the actual situation (e.g., whether confidential information is present) combined with the type, frequency, and combination of the user behaviors.
In some embodiments, the determining module may mark confidential files so that file-opening behavior can be monitored separately per file type (e.g., confidential vs. non-confidential), allowing targeted security management and control (e.g., a higher monitoring frequency for confidential files).
In some embodiments, the insertion of a USB flash drive may be detected by monitoring the interface usage of the secret-related device, and whether that operation constitutes leakage behavior is judged together with whether confidential information is present: with no confidential information present, inserting a USB flash drive is not a leakage behavior; with confidential information present, it is.
In some embodiments, the leakage risk level is raised when the number or frequency of USB flash drive insertions exceeds a USB flash drive threshold, a system default or manually preset value.
In some embodiments, the leakage risk level is raised when the screenshot frequency (e.g., at least one page of confidential information captured as a picture while a screenshot/compressed package is being sent) exceeds a screenshot threshold, likewise a default or preset value.
In some embodiments, because sending a specific file is easy to monitor while sending a screenshot file or compressed package is not, the leakage risk level is raised when the opened file is a confidential file and the interval before a screenshot file/compressed package is sent falls below a time threshold (a default or preset value).
In some embodiments, the determining module judges that leakage behavior exists when it observes the public network being disconnected or the computer monitor being turned off.
In some embodiments, for key secret-related equipment, the determining module may judge from surveillance images whether mobile phone photographing or screen-peeping behavior exists.
In some embodiments of this specification, monitoring user behavior makes it possible to accurately identify leakage behavior, effectively prevent the leakage of important information, and further improve the reliability of the information security management and control system.
It should be noted that the above description of the flow is only for the purpose of illustration and description, and does not limit the application scope of the present specification. Various modifications and changes to the flow may be made by those skilled in the art under the guidance of this specification. However, such modifications and variations are still within the scope of the present description.
FIG. 2 is an exemplary flowchart for determining an anomaly probability according to some embodiments of the present specification.
In some embodiments, the process 200 may be implemented based on a secure room information security management and control system. As shown in fig. 2, the process 200 includes the steps of:
step 210, determining a file operation sequence in a preset time period based on the user behavior.
For more on user behavior, see the relevant description of fig. 1.
The preset time period is a time range of preset length ending at the current moment.
A file operation sequence is a sequence of operation information concerning different files; the operation information may include the user's operations on a file, such as opening, deleting, modifying, or downloading it.
In some embodiments, the judging module may construct the file operation sequence chronologically from the operation information at each point in time.
In some embodiments, the file operation sequence may include a file opening sequence and a file downloading sequence.
The file opening sequence is composed of information about the files the user opened, such as the opening start time, the opening end time, the file path, and the security level (on which more below).
The file downloading sequence is composed of information about the files the user downloaded, such as the download start time, the download end time, the file path, and the security level.
In some embodiments, the judging module may build both sequences chronologically from the user's file-opening and file-downloading records at each point in time, and may monitor the user behavior on each file through the monitoring programs installed on the secret-related device to obtain the file operation sequence.
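A minimal sketch of building the two sequences from monitored records, with illustrative field names (the patent specifies only start time, end time, path, and security level):

```python
from dataclasses import dataclass

@dataclass
class FileOp:
    start_time: float      # operation start, e.g. a UNIX timestamp
    end_time: float
    path: str
    security_level: str    # e.g. "secret" / "confidential" / "top secret"
    action: str            # "open" or "download"

def build_sequences(ops: list[FileOp]):
    """Sort monitored operations chronologically and split them into the
    file opening sequence and the file downloading sequence of step 210."""
    ops = sorted(ops, key=lambda o: o.start_time)
    return ([o for o in ops if o.action == "open"],
            [o for o in ops if o.action == "download"])
```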
Step 220, generating a security level sequence and an association degree sequence based on the file operation sequence.
The security level sequence is composed of the security levels of the files in the file operation sequence.
In some embodiments, the judging module may construct it from the security level of each file in the file operation sequence; for example, it may collect the security levels of the files in the file opening sequence and the file downloading sequence and arrange them chronologically.
The security level is the confidentiality grade of a file and reflects its importance. It may be preset by the system or set manually; for example, it may comprise three levels such as secret, confidential, and top secret.
In some embodiments of this specification, constructing the security level sequence makes the importance of each file in the opening and downloading sequences explicit, improving the efficiency of abnormal-behavior judgment and the security of file management and control.
The association degree sequence may be composed of the association relationships among the files in the file opening and downloading sequences. An association relationship may be sameness or similarity: sameness covers files with the same version (e.g., the same file) or other identical attributes; similarity covers files stored near each other (e.g., in the same folder), files with related content, or files otherwise interrelated.
The association degree sequence may be determined in several ways. For example, if the file opening sequence is (file 1, file 2, file 3, file 4, file 5), the corresponding association degree sequence is (association of files 1 and 2, of files 2 and 3, of files 3 and 4, of files 4 and 5); if the computed similarities of those pairs are 0.9, 0.6, 0.8, and 0.7, the generated sequence is (0.9, 0.6, 0.8, 0.7). File similarity here means the similarity of the files' related information, e.g., file format, file path, and file content, which may be combined by weighted summation.
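A sketch of this pairwise construction; the weighted combination of format, path, and content similarity (and its weights) is an assumption, since the text only names those factors:

```python
def pair_similarity(a, b, content_sim, w=(0.2, 0.3, 0.5)):
    """Weighted sum of format, storage-path and content similarity for two
    file records; `content_sim` is any content scorer returning [0, 1]."""
    return (w[0] * float(a["format"] == b["format"])
            + w[1] * float(a["dir"] == b["dir"])
            + w[2] * content_sim(a, b))

def association_sequence(files, sim):
    """Similarity of each consecutive file pair: five files yield the four
    degrees of the (0.9, 0.6, 0.8, 0.7) example above."""
    return [sim(a, b) for a, b in zip(files, files[1:])]
```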
In some embodiments, the determining module may find highly associated sequences by searching frequent item sets: obtain the file operation sequence; cut it into several file operation subsequences using a preset sliding window size and step; match each subsequence in the frequent item database to obtain its association degree; and combine the association degrees of the subsequences into the association degree sequence.
The preset sliding window is a window that slides over the file operation sequence to divide it. Its size is the number of operation entries it contains, and the preset step is the number of entries between adjacent window positions (e.g., a step of 3 leaves 3 entries between two windows).
In some embodiments, each time the window slides by the preset step, the judging module takes the operation information inside the window as one file operation subsequence, and so on until the whole sequence has been cut.
For example, with the file operation sequence (operation information 1, operation information 2, operation information 3, operation information 4, operation information 5), a window size of 3, and a step of 1, sliding the window yields the following 3 subsequences:
file operation subsequence 1: (operation information 1, operation information 2, operation information 3);
file operation subsequence 2: (operation information 2, operation information 3, operation information 4);
file operation subsequence 3: (operation information 3, operation information 4, operation information 5).
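The windowing itself is a few lines; this sketch reproduces the example above:

```python
def split_subsequences(ops, window=3, step=1):
    """Cut a file operation sequence into subsequences using a preset
    sliding window size and step."""
    return [tuple(ops[i:i + window])
            for i in range(0, len(ops) - window + 1, step)]

ops = ["op info 1", "op info 2", "op info 3", "op info 4", "op info 5"]
for sub in split_subsequences(ops):
    print(sub)   # ('op info 1', 'op info 2', 'op info 3'), then shifted by 1
```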
The frequent item database stores, indexes, and queries frequent items, typically a large number of them. A frequent item is a historical file operation subsequence; each has a support degree (each greater than a set support threshold), i.e., the frequency with which the item occurs in the historical data.
In some embodiments, the frequent items are mined from the normal file operation sequences (sequences without leakage) of a large number of users, and the frequent item database stores the items so obtained.
In some embodiments, the judging module matches each file operation subsequence in the frequent item database and obtains the corresponding association degree. For example, for the subsequence (file 2, file 3, file 5) produced by operating files 2, 3, and 5 in order within the preset time period, the module searches the database; if a frequent item consisting of (file 2, file 3, file 5) is found, it is the matched target frequent item, and its support degree serves as the association degree of the subsequence (possibly after coefficient conversion).
In some embodiments, for each file operation subsequence the determining module matches frequent items as follows.
If a frequent item matches exactly, its support degree is taken as the association degree of the subsequence (possibly after coefficient conversion).
If none matches, the similarity between the subsequence and each frequent item is computed (e.g., from the proportion of coinciding files, or in any other way) and the association degree is derived from the maximum similarity: either the similarity itself, or the support degree of the most similar frequent item, may be used directly.
In some embodiments, the similarity may instead be used as a weight on that support degree. For example, if the most similar frequent item has similarity 70% and support degree 3, the association degree is 3 × 70% = 2.1.
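A sketch of the matching step, assuming the frequent item database is held as a mapping from subsequence to support degree and that similarity is the file coincidence proportion mentioned above:

```python
def association_degree(subseq, frequent_items, sim):
    """Exact match: take the item's support degree directly; otherwise
    weight the support of the most similar frequent item by that
    similarity, as in the 3 x 70% = 2.1 example."""
    if subseq in frequent_items:
        return frequent_items[subseq]
    best = max(frequent_items, key=lambda item: sim(subseq, item))
    return sim(subseq, best) * frequent_items[best]

def coincidence(a, b):
    """File coincidence proportion (overlap over union)."""
    return len(set(a) & set(b)) / max(len(set(a) | set(b)), 1)

db = {("file 2", "file 3", "file 5"): 3.0}
print(association_degree(("file 2", "file 3", "file 4"), db, coincidence))
```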
In some embodiments, the determining module combines the association degrees of the subsequences chronologically to form the association degree sequence.
In some embodiments of this specification, matching against the frequent item database yields an accurate association degree for each subsequence, improving the accuracy of association determination and thereby the processing efficiency of information security management and control.
In some embodiments, the determining module may calculate the association degree sequence based on an association degree map; for more on the map, see the description of FIG. 3.
Step 230, predicting the anomaly probability based on the security level sequence and the association degree sequence.
In some embodiments, the determining module may obtain the anomaly probability in several ways, for example from a preset table or vector database built from the security level sequence and the association degree sequence, characterizing the correspondence between those sequences and the anomaly probability.
In some embodiments, the determination module may determine the anomaly probability through a predictive model, and for more on determining the anomaly probability through a predictive model, reference may be made to the associated description of FIG. 4.
In some embodiments of this specification, deriving the security level sequence and the association degree sequence from the file operation sequence and predicting the anomaly probability evaluates abnormal user behavior efficiently and accurately, outperforms manual evaluation, and effectively prevents incidents such as the leakage of important information.
FIG. 3 is an exemplary schematic diagram of an association degree map 300 according to some embodiments of the present specification.
In some embodiments, generating the security level sequence and the association degree sequence based on the file operation sequence comprises: constructing an association degree map, whose nodes comprise files and file tags and whose edges represent the association relationships among the nodes; and calculating the association degree sequence based on the map.
The association degree map is built from the associations between files, and between files and file tags; it consists of nodes and edges.
Nodes include files and file tags. File tags include the department, project, and client a file belongs to, the file type (e.g., contract document, meeting record), the file update time, the file security level, and so on.
Edges represent the degree of association between files, between files and tags, and between tags. When a user opens/downloads a similar number of files but the association between them is low, the user's leakage likelihood is high; when the association is high, the user is plausibly doing focused research on closely related files, and the leakage likelihood is low.
In the association degree map 300, file A, file B, file C, file D, the technology department tag, and the business department tag are nodes. Edges connect nodes, and the edge length is the distance between nodes (i.e., the files' degree of association): the edge between file A and the technology tag has length 6, technology tag to file B is 4, file B to file C is 2, technology tag to business tag is 12, and business tag to file D is 6.
In some embodiments, the processor may determine the degree of association in various ways, for example with a preset table keyed on the file's department, project, client, type, update time, security level, and so on.
In some embodiments, an association degree sequence may be generated from these associations: the association of a group of files on the map is obtained by computing the distance between their nodes, and the distances for each group of files in the file opening and/or downloading sequence are combined, in the order of that sequence, into the association degree sequence.
The distance between nodes is the shortest path from one node to another on the association degree map and may be computed with Dijkstra's algorithm, a depth- or breadth-first search, or other means.
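A sketch of the distance computation with Dijkstra's algorithm, using the edge lengths of map 300 above (shorter distance means stronger association):

```python
import heapq

GRAPH = {  # edge lengths from the association degree map 300 (FIG. 3)
    "file A": {"tech dept": 6},
    "tech dept": {"file A": 6, "file B": 4, "business dept": 12},
    "file B": {"tech dept": 4, "file C": 2},
    "file C": {"file B": 2},
    "business dept": {"tech dept": 12, "file D": 6},
    "file D": {"business dept": 6},
}

def node_distance(graph, src, dst):
    """Shortest-path distance between two nodes on the map (Dijkstra)."""
    best = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            return d
        if d > best.get(u, float("inf")):
            continue
        for v, w in graph[u].items():
            if d + w < best.get(v, float("inf")):
                best[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return float("inf")

print(node_distance(GRAPH, "file A", "file C"))  # 6 + 4 + 2 = 12
```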
A group of files is any combination of files on the association degree map. The map may be built from the data held on a secret-related device, in a database, or elsewhere; for example, one map may cover all files and tags on a secret-related device, or all files and tags in a database.
In some embodiments, the processor computes the map distance for each group of files in the file opening and/or downloading sequence and combines the distances into the association degree sequence.
In some embodiments, the lengths of one or more edges are preset by rule: edges connecting a tag to a file, or a tag to a tag, need preset lengths, while edges between two files may be computed or preset.
In some embodiments, edge lengths may also be computed from normal operation sequences by the frequent item method above. For example, to compute the edge between file B and file C, a sliding window of size 2 yields the opening subsequence (file B, file C); the matching frequent item is retrieved from the database, and its association degree is taken as the distance between the two nodes (see FIG. 2 and its description).
In some embodiments of this specification, obtaining the association of a group of files as a node distance, computing that distance for each group in the opening and/or downloading sequence, and combining the results into the association degree sequence quantifies the associations between files efficiently; calculating the sequence from an association degree map in turn enables effective prediction of the anomaly probability of user behavior.
FIG. 4 is an exemplary schematic diagram of a prediction model according to some embodiments of the present specification.
In some embodiments, predicting the anomaly probability based on the security level sequence and the association degree sequence comprises: determining the anomaly probability through a prediction model, a machine learning model, based on the file operation sequence, the security level sequence, and the association degree sequence.
The prediction model 420 is a model for predicting the anomaly probability and may be a deep neural network (DNN) model.
In some embodiments, the prediction model 420 includes an embedding layer 421 and a prediction layer 423.
In some embodiments, the inputs of the embedding layer 421 are the file operation sequence 411, the security level sequence 412, and the association degree sequence 413, and its output is the embedding vector 422. The security level sequence and the association degree sequence are determined from the file operation sequence; see the description of FIG. 2.
In some embodiments, the file operation sequence may further include, for each operation, the corresponding file's features and the foreground opening duration.
File features are attributes of the file, such as its size and type. The foreground opening duration is how long the user keeps the file open in the foreground, obtainable from the device's background system statistics.
In some embodiments of this specification, feeding the file features and the foreground opening duration into the embedding layer helps build an accurate embedding vector and thus improves the accuracy of the prediction model.
In some embodiments, the inputs of the prediction layer 423 are the external features 431, the user identity 432, and the embedding vector 422, and its output is the anomaly probability 440.
External features are features of the user's action behavior, such as the screenshot frequency or the length of time a camera is blocked.
User identity is information related to the user's identity, such as position and employee number.
In some embodiments, the output of the embedding layer serves as an input of the prediction layer, and the prediction model is obtained by jointly training the embedding layer and the prediction layer.
In some embodiments, the prediction model may be trained on labeled training samples obtained from historical leakage users or simulated leakage behavior.
In some embodiments, positive samples and their labels are built from the leakage behavior of caught leakage users: a positive sample corresponds to leakage behavior and carries the label 1, and positive samples can be augmented by simulating leakage behavior to give the model more training data and hence better accuracy (see the description of FIG. 1 for leakage behavior). Negative samples, labeled 0, are built from users' normal operation behavior, i.e., user behavior containing no leakage.
In some embodiments, the joint training data thus includes sample file operation sequences, sample security level sequences, and sample association degree sequences labeled 1 or 0 (positive and negative samples). The sample sequences are fed into the embedding layer to obtain an embedding vector; that vector, together with the sample external features and the sample user identity, is fed into the prediction layer to obtain a prediction. A loss function is built from the labels and predictions, the parameters of the embedding layer and the prediction layer are updated synchronously, and the trained prediction model is obtained through these parameter updates.
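A minimal PyTorch sketch of this joint training, assuming the three sequences are already encoded as fixed-length vectors; every layer size is an illustrative assumption:

```python
import torch
from torch import nn

class PredictionModel(nn.Module):
    """Embedding layer + prediction layer, trained jointly (FIG. 4)."""
    def __init__(self, seq_dim=32, emb_dim=16, ext_dim=4, id_dim=4):
        super().__init__()
        self.embedding = nn.Sequential(   # encodes the three sequences
            nn.Linear(3 * seq_dim, 64), nn.ReLU(), nn.Linear(64, emb_dim))
        self.prediction = nn.Sequential(  # outputs the anomaly probability
            nn.Linear(emb_dim + ext_dim + id_dim, 32), nn.ReLU(),
            nn.Linear(32, 1), nn.Sigmoid())

    def forward(self, op_seq, level_seq, assoc_seq, external, identity):
        emb = self.embedding(torch.cat([op_seq, level_seq, assoc_seq], -1))
        return self.prediction(torch.cat([emb, external, identity], -1))

model = PredictionModel()
loss_fn = nn.BCELoss()                 # labels: 1 = leakage, 0 = normal
optimizer = torch.optim.Adam(model.parameters())
# One joint update, given a batch (sequences, external, identity, labels):
#   p = model(op_seq, level_seq, assoc_seq, external, identity)
#   loss = loss_fn(p, labels)
#   optimizer.zero_grad(); loss.backward(); optimizer.step()
```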
In some embodiments of this specification, the prediction model's two-stage design, an embedding layer followed by a prediction layer, effectively improves the accuracy of the prediction: the file operation, security level, and association degree sequences are embedded first, and the embedding vector, external features, and user identity are then fed to the prediction layer. Determining the anomaly probability in this way evaluates the leakage likelihood of user behavior and effectively prevents information leakage.
Having described the basic concepts, it will be apparent to those skilled in the art after reading this detailed disclosure that the foregoing is presented by way of example only and is not limiting. Although not explicitly stated here, those skilled in the art may make various modifications, improvements, and amendments to this specification; such changes are suggested by this specification and fall within the spirit and scope of its exemplary embodiments.
Meanwhile, this specification uses specific words to describe its embodiments. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of this specification. It should therefore be emphasized that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various places in this specification do not necessarily refer to the same embodiment. Furthermore, particular features, structures, or characteristics of one or more embodiments of this specification may be combined as appropriate.
Furthermore, the order in which processing elements and sequences are described, the use of alphanumeric labels, or the use of other designations in this specification is not intended to limit the order of the processes and methods of this specification unless expressly recited in the claims. While certain presently useful inventive embodiments have been discussed in the foregoing disclosure by way of various examples, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments; on the contrary, they are intended to cover modifications and equivalent arrangements that are within the spirit and scope of the embodiments of this specification. For example, although the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing server or mobile device.
Likewise, it should be noted that, to simplify the presentation of this disclosure and thereby aid in understanding one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, claimed subject matter may lie in less than all features of a single disclosed embodiment.
In some embodiments, numbers describing quantities of components and attributes are used. It should be understood that such numbers used in the description of the embodiments are qualified in some examples by the modifiers "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the stated number allows for a variation of ±20%. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending on the desired properties of the individual embodiment. In some embodiments, numerical parameters should take into account the specified significant digits and employ ordinary rounding. Although the numerical ranges and parameters used to define the breadth of some embodiments of this specification are approximations, in specific embodiments such numerical values are set as precisely as practicable.
Each patent, patent application, patent application publication, and other material, such as an article, book, specification, publication, or document, referred to in this specification is hereby incorporated by reference in its entirety, except for any prosecution history that is inconsistent with or conflicts with the content of this specification, and except for any document that would limit the broadest scope of the claims now or later associated with this specification. If the description, definition, and/or use of a term in material accompanying this specification is inconsistent with or conflicts with what is described in this specification, the description, definition, and/or use of the term in this specification controls.
Finally, it should be understood that the embodiments described in this specification merely illustrate the principles of the embodiments of this specification. Other variations may also fall within the scope of this specification. Thus, by way of example and not limitation, alternative configurations of the embodiments of this specification may be regarded as consistent with its teachings. Accordingly, the embodiments of this specification are not limited to those explicitly described and depicted herein.

Claims (4)

1. A secure room information security management and control system, the system comprising:
an identification module configured to identify user behavior in a secure area, the user behavior comprising action behavior and/or operation behavior;
a judging module configured to judge whether the user behavior meets a preset condition, the preset condition comprising that an anomaly probability of the user behavior is greater than a preset probability threshold, the judging module being further configured to:
determine a file operation sequence within a preset time period based on the user behavior, the file operation sequence comprising at least a file opening sequence and a file downloading sequence, wherein the file opening sequence is a sequence formed from related information of files opened by the user, the related information of an opened file comprising at least one of an opening start time, an opening end time, a file path, and a security level, and the file downloading sequence is a sequence formed from related information of files downloaded by the user, the related information of a downloaded file comprising at least one of a download start time, a download end time, a file path, and a security level;
generate a security level sequence based on the security level of each file in the file operation sequence;
generate an association degree sequence through a generating module based on the file operation sequence, wherein the association degree sequence is a sequence formed from the association relationships of the files in the file operation sequence, the association relationships comprising identical relationships and/or similar relationships, an identical relationship being identical file versions and/or identical file attributes, and a similar relationship being similar file storage locations and/or related file content; the generating module is configured to: construct a relevance map, the relevance map being a map built from the associations between files and file labels, nodes of the relevance map comprising files and file labels and edges of the relevance map representing the association relationships between nodes; calculate, based on the relevance map and the file operation sequence, the distance on the relevance map of each group of files in the file operation sequence; and generate the association degree sequence based on the file operation sequence and the corresponding distances;
and predict the anomaly probability through a prediction module based on the security level sequence and the association degree sequence, the prediction module being configured to determine the anomaly probability through a prediction model based on the file operation sequence, the security level sequence, and the association degree sequence, the prediction model being a machine learning model; and
a determining module configured to determine, in response to the preset condition being met, that the user behavior belongs to leakage behavior.
2. The system of claim 1, wherein the preset condition further comprises that an abnormality degree of the user behavior is greater than a preset abnormality threshold, and the judging module is further configured to:
calculate the abnormality degree of the user behavior based on preset employee behavior rules.
3. A secure room information security management and control method, the method being performed by a processor and comprising:
identifying user behavior in a secure area, the user behavior comprising action behavior and/or operation behavior;
judging whether the user behavior meets a preset condition, the preset condition comprising that an anomaly probability of the user behavior is greater than a preset probability threshold, the judging comprising:
determining a file operation sequence within a preset time period based on the user behavior, the file operation sequence comprising at least a file opening sequence and a file downloading sequence, wherein the file opening sequence is a sequence formed from related information of files opened by the user, the related information of an opened file comprising at least one of an opening start time, an opening end time, a file path, and a security level, and the file downloading sequence is a sequence formed from related information of files downloaded by the user, the related information of a downloaded file comprising at least one of a download start time, a download end time, a file path, and a security level;
generating a security level sequence based on the security level of each file in the file operation sequence;
generating an association degree sequence through a generating module based on the file operation sequence, wherein the association degree sequence is a sequence formed from the association relationships of the files in the file operation sequence, the association relationships comprising identical relationships and/or similar relationships, an identical relationship being identical file versions and/or identical file attributes, and a similar relationship being similar file storage locations and/or related file content; the generating module is configured to: construct a relevance map, the relevance map being a map built from the associations between files and file labels, nodes of the relevance map comprising files and file labels and edges of the relevance map representing the association relationships between nodes; calculate, based on the relevance map and the file operation sequence, the distance on the relevance map of each group of files in the file operation sequence; and generate the association degree sequence based on the file operation sequence and the corresponding distances;
predicting the anomaly probability based on the security level sequence and the association degree sequence, comprising: determining the anomaly probability through a prediction model based on the file operation sequence, the security level sequence, and the association degree sequence, the prediction model being a machine learning model; and
determining, in response to the preset condition being met, that the user behavior belongs to leakage behavior.
4. The method of claim 3, wherein the preset condition further comprises that an abnormality degree of the user behavior is greater than a preset abnormality threshold, the determining of the abnormality degree comprising:
calculating the abnormality degree of the user behavior based on preset employee behavior rules.
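Merely by way of illustration of the relevance map and distance computation recited in claims 1 and 3, the sketch below builds a small map with networkx; the file names, label nodes, edges, and the use of shortest-path length as the distance are assumptions for illustration only.

```python
import networkx as nx
from itertools import combinations

# Nodes are files and file labels; edges represent association
# relationships (identical versions, shared labels, and so on).
G = nx.Graph()
G.add_edge("report_v1.doc", "report_v2.doc")    # identical relationship (versions)
G.add_edge("report_v1.doc", "label:project-x")  # file <-> file label
G.add_edge("report_v2.doc", "label:project-x")
G.add_edge("budget.xls", "label:finance")

file_operation_sequence = ["report_v1.doc", "report_v2.doc", "budget.xls"]

# Distance on the relevance map for each group (pair) of files in the
# file operation sequence; unreachable pairs get an infinite distance.
association_degree_sequence = []
for a, b in combinations(file_operation_sequence, 2):
    try:
        d = nx.shortest_path_length(G, a, b)
    except nx.NetworkXNoPath:
        d = float("inf")
    association_degree_sequence.append(((a, b), d))

print(association_degree_sequence)
# [(('report_v1.doc', 'report_v2.doc'), 1),
#  (('report_v1.doc', 'budget.xls'), inf),
#  (('report_v2.doc', 'budget.xls'), inf)]
```

Smaller distances indicate more closely associated files; together with the security level sequence, such an association degree sequence is the kind of input the claimed prediction model consumes.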
CN202310974064.0A 2023-08-04 2023-08-04 Safety house information safety control method and system Active CN116702229B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310974064.0A CN116702229B (en) 2023-08-04 2023-08-04 Safety house information safety control method and system


Publications (2)

Publication Number Publication Date
CN116702229A (en) 2023-09-05
CN116702229B (en) 2023-11-21

Family

Family ID: 87832531

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310974064.0A Active CN116702229B (en) 2023-08-04 2023-08-04 Safety house information safety control method and system

Country Status (1)

Country Link
CN (1) CN116702229B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117993876A (en) * 2024-04-03 2024-05-07 四川蓉城蕾茗科技有限公司 Resume evaluation system, method, device and medium


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6648531B2 (en) * 2016-01-14 2020-02-14 富士通株式会社 File operation check device, file operation check program, and file operation check method

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101256507B1 (en) * 2012-11-13 2013-04-19 국방과학연구소 An malicious insider detection system via user behavior analysis and method thereof
CN107302520A (en) * 2017-05-15 2017-10-27 北京明朝万达科技股份有限公司 A kind of dynamic anti-leak of data and method for early warning and system
CN108717510A (en) * 2018-05-11 2018-10-30 深圳市联软科技股份有限公司 A kind of method, system and terminal by clustering file abnormal operation behavior
CN111046415A (en) * 2018-10-15 2020-04-21 珠海格力电器股份有限公司 Intelligent grading early warning system and method for confidential files
CN109800589A (en) * 2019-01-25 2019-05-24 深信服科技股份有限公司 A kind of information security management and control method, system, device and readable storage medium storing program for executing
CN110087238A (en) * 2019-05-13 2019-08-02 商洛学院 A kind of information safety of mobile electronic equipment protection system
KR20200136620A (en) * 2019-05-28 2020-12-08 타우데이타 주식회사 Prediction Algorithm for Industrial Technology Leakage Based on Machine Learning and its Prediction System and Method
CN115565242A (en) * 2022-08-03 2023-01-03 江苏天罡居正数字技术有限公司 Data leakage prevention system based on AI behavior recognition technology
CN116112194A (en) * 2022-10-24 2023-05-12 成都卫士通信息产业股份有限公司 User behavior analysis method and device, electronic equipment and computer storage medium
CN115470524A (en) * 2022-10-31 2022-12-13 中国电力科学研究院有限公司 Method, system, equipment and medium for detecting leakage of confidential documents

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A User Grouping Method for Detection of Data Leakage by Cooperation Insiders; Shin, Hee-Jin et al.; Journal of The Korea Society of Information Technology Policy & Management, Vol. 10, No. 1; full text *
Design and Implementation of a Content-Based Intranet Security Management and Control System; Qin Yiwen; China Master's Theses Full-text Database, Information Science and Technology, No. 04 (2018); I139-265 *
Research on Insider Threat Detection Based on Role Behavior Pattern Mining; Li Dianwei; He Mingliang; Yuan Fang; Netinfo Security, No. 03; full text *
Design and Analysis of a Data Leakage Prevention System; Li Yong; Li Xiufen; Wang Peng; Inner Mongolia Electric Power, No. 04; full text *

Also Published As

Publication number Publication date
CN116702229A (en) 2023-09-05

Similar Documents

Publication Publication Date Title
WO2017065070A1 (en) Suspicious behavior detection system, information-processing device, method, and program
Lu et al. Insider threat detection with long short-term memory
US11704439B2 (en) Systems and methods for managing privacy policies using machine learning
US20210357508A1 (en) Method and a system for testing machine learning and deep learning models for robustness, and durability against adversarial bias and privacy attacks
Nisioti et al. Data-driven decision support for optimizing cyber forensic investigations
US20220360597A1 (en) Cyber security system utilizing interactions between detected and hypothesize cyber-incidents
CN116702229B (en) Safety house information safety control method and system
US20230135660A1 (en) Educational Tool for Business and Enterprise Risk Management
Lin et al. Collaborative alert ranking for anomaly detection
CN114422224A (en) Attack tracing-oriented threat information intelligent analysis method and system
CN109388949B (en) Data security centralized management and control method and system
Wall et al. A Bayesian approach to insider threat detection
US10291483B2 (en) Entity embedding-based anomaly detection for heterogeneous categorical events
Ramadhan et al. Forensic malware identification using naive bayes method
US20230396641A1 (en) Adaptive system for network and security management
Dai et al. Homeguardian: Detecting anomaly events in smart home systems
Mora et al. Enforcing corporate security policies via computational intelligence techniques
Kan et al. User-level malicious behavior analysis model based on the NMF-GMM algorithm and ensemble strategy
Sabnani Computer security: A machine learning approach
Le Machine Learning based Framework for User-Centered Insider Threat Detection
Mihailescu et al. Unveiling Threats: Leveraging User Behavior Analysis for Enhanced Cybersecurity
Jain et al. An Approach to Identify Vulnerable Features of Instant Messenger
Kubigenova et al. Prospects for Information Security in Big Data Technology
Zhang Countering cybersecurity vulnerabilities in the power system
CN117763519A (en) Trusted user architecture construction method, trusted user architecture construction system and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant