CN115033889B - Illegal right-raising detection method and device, storage medium and computer equipment - Google Patents
Illegal right-raising detection method and device, storage medium and computer equipment
- Publication number
- CN115033889B (application CN202210716028.XA)
- Authority
- CN
- China
- Prior art keywords
- event
- data
- illegal
- call
- detection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/57—Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
- G06F21/577—Assessing vulnerabilities and evaluating computer system security
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/50—Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
- G06F21/55—Detecting local intrusion or implementing counter-measures
- G06F21/56—Computer malware detection or handling, e.g. anti-virus arrangements
Abstract
The disclosure provides an illegal right-raising detection method and device, a storage medium and a computer device, and relates to the field of computer technology. The method comprises the following steps: acquiring a first event set corresponding to a first process, and determining security baseline information based on the first event set; acquiring a second event set corresponding to a process to be detected, and performing illegal right-raising detection based on the second event set to obtain a second detection result, wherein the second event set comprises a second user space function call event and a second kernel space capability call event; and determining, based on the second detection result and the security baseline information, whether the process to be detected is an illegal right-raising operation. The method and device address the problems of a high false-detection rate and low accuracy of illegal right-raising detection in the related art.
Description
Technical Field
The disclosure relates to the technical field of computers, and in particular relates to an illegal right-raising detection method and device, a storage medium and computer equipment.
Background
A right-raising (privilege escalation) attack generally means that a low-privilege user illegally raises its privileges, for example through a hardware and/or processor vulnerability of the device, to a high privilege level in order to carry out intrusion operations on the computer system, seriously affecting system security.

In the related art, illegal right-raising is judged from operating-system log data, which suffers from a high false-detection rate and low detection accuracy. How to detect illegal right-raising operations effectively is therefore a technical problem to be solved in this field.
It should be noted that the information disclosed in the above background section is only for enhancing understanding of the background of the present disclosure and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
Disclosure of Invention
An object of the embodiments of the present disclosure is to provide an illegal right-raising detection method and apparatus, a storage medium and a computer device, so as to solve, at least to a certain extent, the problems of a high false-detection rate and low accuracy of illegal right-raising detection in the related art.
According to a first aspect of the present disclosure, there is provided an illegal right-raising detection method, the method comprising: acquiring a first event set corresponding to a first process, and determining security baseline information based on the first event set; acquiring a second event set corresponding to a process to be detected, and performing illegal right-raising detection based on the second event set to obtain a second detection result, wherein the second event set comprises a second user space function call event and a second kernel space capability call event; and determining, based on the second detection result and the security baseline information, whether the process to be detected is an illegal right-raising operation.
Optionally, the first event set includes a first user space function call event and a first kernel space capability call event, and determining the security baseline information based on the first event set includes: performing feature extraction on the first user space function call event and the first kernel space capability call event to obtain first feature data; detecting the first feature data with a right-raising detection model to obtain a first detection result; and determining the security baseline information based on the first detection result. The right-raising detection model is obtained by training on a training sample set, and the training sample set includes normal right-raising events and illegal right-raising events.
Optionally, performing the illegal right-raising detection based on the second event set includes: performing feature extraction on the second user space function call event and the second kernel space capability call event to obtain second feature data; and performing illegal right-raising detection on the second feature data with the right-raising detection model to obtain the second detection result.
Optionally, the first event set further includes first process identification information and the second event set further includes second process identification information. Before feature extraction is performed on the first event set, the method further includes: grouping the first user space function call event and the first kernel space capability call event according to a time window and the first process identification information, so that processes related to the same first process identification information are divided into one group per time window. Before feature extraction is performed on the second event set, the method further includes: grouping the second user space function call event and the second kernel space capability call event according to the time window and the second process identification information, so that processes related to the same second process identification information are divided into one group per time window.
Optionally, performing the illegal right-raising detection based on the second event set includes: extracting process creation feature data, system call timing feature data, user space function call feature data and capability call path feature data from each group of data corresponding to the second event set; and fusing the process creation feature data, the system call timing feature data, the user space function call feature data and the kernel capability call path feature data with the right-raising detection model to obtain the second detection result.
Optionally, determining, based on the second detection result and the security baseline information, whether the process to be detected is an illegal right-raising operation includes: determining that the process to be detected is an illegal right-raising operation when the second detection result is greater than the security baseline information.
Optionally, before the illegal right-raising detection based on the second event set, the method further comprises: performing data filtering on the second event set to filter out trusted function call data and/or capability call data.
According to a second aspect of the present disclosure, there is provided an illegal right-raising detection apparatus, the apparatus including: a first determining module, configured to acquire a first event set corresponding to a first process and determine security baseline information based on the first event set; a detection module, configured to acquire a second event set corresponding to a process to be detected and perform illegal right-raising detection based on the second event set to obtain a second detection result, the second event set comprising a second user space function call event and a second kernel space capability call event; and a second determining module, configured to determine, based on the second detection result and the security baseline information, whether the process to be detected is an illegal right-raising operation.
According to a third aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon a computer program which when executed by a processor performs a method of any of the above.
According to a fourth aspect of the present disclosure, there is provided a computer device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any of the above via execution of executable instructions.
Exemplary embodiments of the present disclosure may have some or all of the following advantages:
In the illegal right-raising detection method provided by the example embodiments of the present disclosure, on one hand, a first event set corresponding to a first process is acquired, security baseline information is determined based on the first event set, and whether the process to be detected is an illegal right-raising operation is then judged against that security baseline. Providing a reference security baseline for the detection result reduces the false-detection rate of illegal right-raising detection and the workload of system security maintenance staff. On the other hand, the illegal right-raising detection performed on the second event set of the process to be detected combines user space function call events with kernel space capability call events, so the detection draws on event data of multiple dimensions and the detection accuracy is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort.
Fig. 1 schematically illustrates an application scenario of an illegal right-raising detection method and apparatus according to an embodiment of the present disclosure.
Fig. 2 schematically illustrates a flow diagram of an illegal right-raising detection method according to an embodiment of the present disclosure.
Fig. 3 schematically illustrates a flow diagram for determining security baseline information according to one embodiment of the disclosure.
Fig. 4 schematically illustrates one flow diagram of illegal right-raising detection according to an embodiment of the present disclosure.
Fig. 5 schematically illustrates a second flow diagram of illegal right-raising detection according to an embodiment of the present disclosure.
Fig. 6 schematically shows a block diagram of an illegal right-raising detection apparatus according to an embodiment of the present disclosure.
Fig. 7 schematically illustrates an exemplary computer device block diagram according to one embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the present disclosure. However, those skilled in the art will recognize that the aspects of the present disclosure may be practiced with one or more of the specific details, or with other methods, components, devices, steps, etc. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
To facilitate an understanding of the present disclosure, the following terms are explained.
Capability (CAP for short): the capability mechanism of the Linux kernel. It breaks up the all-or-nothing superuser/ordinary-user model of UNIX/Linux operating systems, so that an ordinary user can also perform work that previously only the superuser could complete.

The main idea of capabilities is to partition the privileges of the root user into different capabilities, each representing a particular privileged operation. For example, the capability CAP_SYS_MODULE represents the privilege of loading (or unloading) kernel modules, and CAP_SETUID represents the privilege of modifying the user identity of a process. With capabilities, the system performs access control for privileged operations according to the capabilities a process possesses.
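As a concrete illustration of the capability mechanism (not part of the claimed method), the sketch below reads the effective capability mask of a process from /proc/<pid>/status and decodes the set bits into capability names; the name table is deliberately abbreviated.

```python
# Illustrative only: decode the effective capability set (CapEff) of a process.
# Bit numbers follow include/uapi/linux/capability.h; the table is abbreviated.
CAP_NAMES = {0: "CAP_CHOWN", 7: "CAP_SETUID", 12: "CAP_NET_ADMIN",
             16: "CAP_SYS_MODULE", 21: "CAP_SYS_ADMIN"}

def effective_caps(pid="self"):
    with open(f"/proc/{pid}/status") as status:
        for line in status:
            if line.startswith("CapEff:"):
                mask = int(line.split()[1], 16)
                return [CAP_NAMES.get(bit, f"CAP_{bit}")
                        for bit in range(64) if mask >> bit & 1]
    return []

if __name__ == "__main__":
    # A root shell typically reports the full capability set; an ordinary
    # user process usually reports an empty list.
    print(effective_caps())
```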
Referring to fig. 1, a schematic diagram of a system architecture 100 of an exemplary application environment of an illegal right-raising detection method and apparatus provided in some embodiments of the present disclosure. As shown in fig. 1, the system architecture 100 may include one or more of the computer devices 110 and a server 120. The network is the medium used to provide communications links between computer devices 110 and servers 120. The network may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others. The computer device 110 may be a variety of network devices with a display screen including, but not limited to, a desktop computer, a portable computer, a smart phone, a tablet computer, and the like.
The server 120 may be a server providing various services, such as obtaining a first set of events corresponding to a first process from the computer device 110; determining security baseline information based on the first set of events; acquiring a second event set corresponding to a process to be detected, and performing illegal right-raising detection based on the second event set; and determining whether the process to be detected is illegal right raising operation or not based on the detection result and the safety baseline information.
The server 120 may be hardware or software. When the server 120 is hardware, it may be implemented as a distributed server cluster formed by a plurality of servers, or may be implemented as a single server. When the server 120 is software, it may be implemented as a plurality of software or software modules, or as a single software or software module. The present invention is not particularly limited herein.
It should be understood that the number of computer devices, networks, and servers in fig. 1 are illustrative only. There may be any number of computer devices, networks, and servers, as desired for implementation. For example, the server 120 may be a server cluster formed by a plurality of servers.
The method for detecting illegal rights provided in the embodiments of the present disclosure may be executed in the server 120, and accordingly, the illegal rights detecting device is generally disposed in the server 120. The illegal right-raising detection method provided by the embodiment of the present disclosure may also be executed in the computer device 110, and accordingly, the illegal right-raising detection apparatus is generally disposed in the computer device 110.
Referring to fig. 2, the illegal right-raising detection method according to one embodiment provided by the present disclosure may be applied to a Linux computer device, and includes the following steps S210 to S230.
Step S210, a first event set corresponding to the first process is obtained, and security baseline information is determined based on the first event set.
In this example embodiment, the first process may include processes in common software whose behaviour resembles a right-raising process, normal high-privilege processes (such as root-privilege processes), or special user-group processes (such as an ordinary user group granted special privileges). For example, the first process may include processes similar or related to the right-raising processes that occur while the following software runs: FTP (File Transfer Protocol), SSH (Secure Shell), KVM (Kernel-based Virtual Machine), VMware virtualization software, Apache, Nginx, the Redis database, the relational database management systems MySQL and PostgreSQL, the in-memory computing framework Spark, and the like.
Illustratively, eBPF (extended Berkeley Packet Filter) may be used to capture network packets, to process and forward kernel-mode network packets quickly, and to hook the kernel through LSM (Linux Security Module) hook points, so that eBPF can perform security monitoring and access control on the Linux kernel; the eBPF events of the first process are captured as the first event set. On newer Linux kernel versions, eBPF can turn any kernel function call into a user space event that can carry arbitrary data.
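The patent does not name a specific collection tool. The sketch below is one possible way to capture kernel-space capability call events, assuming the BCC (BPF Compiler Collection) Python bindings and a kernel that exposes the cap_capable() function; all names in the sketch are illustrative.

```python
# Minimal sketch (assumes the bcc toolkit): attach a kprobe to cap_capable()
# and stream (timestamp, pid, command, capability number) events to user space.
from bcc import BPF

prog = r"""
#include <uapi/linux/ptrace.h>

struct event_t { u64 ts; u32 pid; int cap; char comm[16]; };
BPF_PERF_OUTPUT(events);

int trace_cap_capable(struct pt_regs *ctx) {
    struct event_t ev = {};
    ev.ts  = bpf_ktime_get_ns();
    ev.pid = bpf_get_current_pid_tgid() >> 32;
    ev.cap = (int)PT_REGS_PARM3(ctx);   /* 3rd argument of cap_capable() is the capability number */
    bpf_get_current_comm(&ev.comm, sizeof(ev.comm));
    events.perf_submit(ctx, &ev, sizeof(ev));
    return 0;
}
"""

b = BPF(text=prog)
b.attach_kprobe(event="cap_capable", fn_name="trace_cap_capable")

def handle(cpu, data, size):
    ev = b["events"].event(data)
    print(ev.ts, ev.pid, ev.comm.decode(errors="replace"), "cap", ev.cap)

b["events"].open_perf_buffer(handle)
while True:
    b.perf_buffer_poll()
```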
In this example embodiment, the first event set may include function/capability call data (function/capability call path, identifier, time, etc.) of a kernel space corresponding to the first process, call stack data of a user space (e.g., user space call path, identifier, time, etc.), system call data (e.g., system call name, call time), creation of new process data (e.g., number of new processes created within a predetermined time, average, etc., new process creation path, new process creation time difference).
In this example embodiment, a trained random forest model or support vector machine may be employed to determine the security baseline information corresponding to the first set of events. The security baseline information may be dynamically changed or may remain unchanged for a period of time, which is not limited in this example.
Step S220, a second event set corresponding to the process to be detected is obtained, and illegal right-raising detection is performed based on the second event set to obtain a second detection result.
In this example embodiment, the second set of events includes a second user space function call event and a second kernel space capability call event. The second user space function calling event may include system calling data corresponding to a process to be detected, related data of a new process creating event, calling event identification information, identification information of adjacent events before and after the calling event, function calling time information, process identification information (such as a process number) corresponding to the calling event, function information (such as a function name) executed in a user space, and the like. The second kernel space Capabilities (Capabilities) call event may include current capability call path information (may include related information of a current call and a call before and after the current call) corresponding to the process to be detected, user information of the current call capability, a command of the current call capability, and current call capability information (such as a Linux CAP name).
In this example embodiment, the illegal right-raising detection may be performed using a random forest model and/or a support vector machine.
Step S230, based on the second detection result and the security baseline information, determining whether the process to be detected is an illegal right raising operation.
In this example embodiment, the second detection result may be compared with the security baseline information, and whether the process to be detected is an illegal right-raising operation is determined from the comparison result. When the security baseline information is a single value, a process whose second detection result is greater than the security baseline information may be determined to be an illegal right-raising operation. When the security baseline information is a value set, a process whose second detection result is greater than the average/maximum of the baseline value set may be determined to be an illegal right-raising operation, or the values in the baseline value set may be weighted and compared with the second detection result to decide whether the operation is an illegal right-raising; other decision conditions may also be set, which this example does not limit.
In the illegal right-raising detection method provided by the embodiments of the present disclosure, on one hand, the security baseline information determined from the first event set of the first process serves as a reference baseline for deciding whether the process to be detected is an illegal right-raising operation, which reduces the false-detection rate during detection and the workload of system security maintenance staff. On the other hand, the illegal right-raising detection performed on the second event set of the process to be detected combines user space function call events with kernel space capability call events, so the detection uses event data of multiple dimensions and the accuracy is improved.
In some embodiments, referring to fig. 3, the first event set includes a first user space function call event and a first kernel space capability call event, and determining the security baseline information based on the first event set includes: performing feature extraction on the first user space function call event and the first kernel space capability call event to obtain first feature data.
In this example embodiment, the first user space function call event may include system call data corresponding to the first process, related data of a new process creation event, call event identification information, identification information of adjacent events before and after the call event, function call time information, process identification information corresponding to the call event (e.g., a process number), function information (e.g., a function name) executed in the user space, and the like. The first kernel space capability calling event may include current capability calling path information (may include related information of a current call and a call before and after the current call) corresponding to the first process, user information of the current calling capability, a command of the current calling capability, and current calling capability information (such as a Linux CAP name). The data in the part or all of the first user space function call event and the first kernel space capability call event may be used as the first feature data, or the part or all of the data in the first user space function call event and the first kernel space capability call event may be processed and then used as the first feature data, which is not limited in this example.
Illustratively, the first feature data may include, for the first process: new-process count information (such as the maximum, mean and variance of the number of new processes created per second within a preset time), a time-ordered collection of system calls (such as system call names ordered by time), user space function call data (such as the number of the current event, the numbers of the adjacent events before and after it, the function call timestamp, the process number, the function name executed in user space, and the function call time differences of the same child process), new-process creation paths (such as the new process number, the numbers of the adjacent processes before and after it, the parent process identification of the new process, the command starting the new process, its execution result, the parameters of the starting command, and the creation time differences between child processes of the same parent process), and capability call paths (such as the capability-calling process number, the numbers of the adjacent processes before and after the capability-calling process, the user information of the capability call, and the command, parameters and name of the called capability).
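The following sketch shows one way such first feature data could be aggregated from one group of events; the event field names ("ts", "type", "name", "cap") are assumptions made for the example, not names used by the patent.

```python
# Illustrative sketch: aggregate one group of events (same parent process,
# one time window) into a feature dictionary of the kinds listed above.
from collections import Counter
from statistics import mean, pvariance

def extract_features(events, window_seconds=600):
    timeline = sorted(events, key=lambda e: e["ts"])
    creations = [e["ts"] for e in timeline if e["type"] == "new_process"]
    base = int(creations[0]) if creations else 0
    per_second = Counter(int(ts) - base for ts in creations)
    counts = [per_second.get(s, 0) for s in range(max(1, int(window_seconds)))]
    return {
        "new_proc_max_per_s": max(counts),
        "new_proc_mean_per_s": mean(counts),
        "new_proc_var_per_s": pvariance(counts),
        "creation_time_diffs": [b - a for a, b in zip(creations, creations[1:])],
        "syscall_sequence": [e["name"] for e in timeline if e["type"] == "syscall"],
        "cap_call_path": [e["cap"] for e in timeline if e["type"] == "capability"],
    }
```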
The first feature data is then detected with the right-raising detection model to obtain a first detection result.
In this example embodiment, the right-raising detection model may be obtained by training on a training sample set that includes normal right-raising events and illegal right-raising events. The normal right-raising events may be obtained by running common software, normal high-privilege processes, or special right-raising user-group processes. The illegal right-raising events may come from captured historical illegal right-raising events or from running malicious right-raising software. The data in the training sample set is labeled data.
The right-raising detection model may be a random forest model: samples in the training sample set are drawn with replacement, part of the features in the first feature data are selected at random each time, and the selected data is split completely to obtain a decision tree. The final result is determined by the classification results of the individual decision trees, and the proportion of votes for each category may be used as the first detection result.
In another example embodiment, the right-raising detection model may be a support vector machine. The feature values corresponding to the first feature data can be mapped into a high-dimensional space by a kernel function; learning in that space on the feature values of each training sample produces a classifier, which can output the corresponding class probability value as the first detection result.
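A minimal sketch of the two model choices described above, assuming scikit-learn as the example library (the patent does not prescribe an implementation):

```python
# Sketch: train either of the two right-raising detection models named above
# and read out class-probability scores as detection results.
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

def train_detector(X_train, y_train, kind="random_forest"):
    """X_train: numeric feature matrix; y_train: 1 = illegal right-raising, 0 = normal."""
    if kind == "random_forest":
        model = RandomForestClassifier(n_estimators=100, bootstrap=True)
    else:
        model = SVC(kernel="rbf", probability=True)  # probability output enabled
    model.fit(X_train, y_train)
    return model

def detection_scores(model, X):
    # Probability of the "illegal right-raising" class is used as the detection result.
    return model.predict_proba(X)[:, 1]
```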
Based on the first detection result, security baseline information is determined.
In this example embodiment, the maximum value of the first detection results may be used as the security baseline information, the top few values of the first detection results may be selected as the security baseline information, or the maximum value of the first detection results may be multiplied by a preset safety coefficient and used as the security baseline information.
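The three baseline strategies just described could be sketched as follows; the safety-coefficient value and the parameter names are illustrative assumptions.

```python
# Sketch of the baseline strategies described above.
def security_baseline(first_results, strategy="max", top_k=3, safety_coeff=1.1):
    ordered = sorted(first_results, reverse=True)
    if strategy == "max":
        return ordered[0]                   # single maximum value
    if strategy == "top_k":
        return ordered[:top_k]              # a baseline value set
    return ordered[0] * safety_coeff        # maximum scaled by a preset coefficient
```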
In some embodiments, referring to fig. 4, performing the illegal right-raising detection based on the second event set includes: performing feature extraction on the second user space function call event and the second kernel space capability call event to obtain second feature data.
In this example embodiment, the second feature data may be data of the process to be detected corresponding to the first feature data, that is, for the process to be detected: the number of newly created processes, the new-process creation paths, the user space function call paths, the kernel capability call paths, the system calls, and so on.
And carrying out illegal right-raising detection on the second characteristic data by adopting a right-raising detection model so as to obtain a second detection result.
In this example embodiment, for the random forest model, a decision threshold (e.g., 70%) may be set: when the proportion of votes for the illegal right-raising class exceeds the threshold, the current process is determined to be an illegal right-raising process. For a support vector machine, a probability threshold may likewise be set for the determination.
In some embodiments, the first set of events further includes first process identification information, the second set of events further includes second process identification information, and the method further includes, prior to feature extraction of the first set of events:
And grouping the first user space function call event and the first kernel space capability call event according to the time window and the first process identification information, so that processes related to the same first process identification information are divided into a group according to the time window.
In this example embodiment, the first process identification information may be the parent process number in the first process, and the data related to the function call events and capability call events of that parent process and its child processes may be segmented into groups by time window. For example, the function call and capability call event data generated within ten minutes by the related processes (parent process and child processes) of the same parent process number are divided into one group of data.
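A small sketch of this grouping step is given below; the event field names ("ts", "ppid") and the ten-minute window are assumptions taken from the example above.

```python
# Sketch: split function-call and capability-call events into groups keyed by
# (parent process number, time window index).
from collections import defaultdict

def group_events(events, window_seconds=600):
    groups = defaultdict(list)
    for e in events:
        window_index = int(e["ts"] // window_seconds)
        groups[(e["ppid"], window_index)].append(e)
    return groups   # each value is one group of related parent/child events
```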
Before feature extraction is performed on the second set of events, the method further comprises:
and grouping the second user space function call event and the second kernel space capability call event according to the time window and the second process identification information, so that processes related to the same second process identification information are divided into a group according to the time window.
In this example embodiment, the second process identification information may be the parent process number in the process to be detected, and the data related to the function call events and capability call events of that parent process and its child processes may likewise be segmented into groups by time window (for example, 5 to 10 minutes).
Based on the grouping results of the above embodiments, performing the illegal right-raising detection based on the second event set includes: extracting process creation feature data, system call timing feature data, user space function call feature data and capability call path feature data from each group of data corresponding to the second event set.
In this example embodiment, feature extraction may be performed on each set of data, and detection may be performed in units of each set of data, so as to conform to a process running mode and ensure accuracy of a result.
The right-raising detection model is then used to fuse the process creation feature data, the system call timing feature data, the user space function call feature data and the kernel capability call path feature data to obtain the second detection result.
In this example embodiment, the feature data in each group of data are fused using the model parameters of the right-raising detection model, so that features of multiple dimensions are combined and the detection accuracy is improved.
In some embodiments, before the illegal right-raising detection based on the second event set, the method further comprises: performing data filtering on the second event set to filter out trusted function call data and/or capability call data.
In this example embodiment, the trusted function call data and/or capability call data may be whitelisted (e.g., by function name, capability name, i.e., Linux CAP name, or user ID) to reduce the amount of data the detection process has to handle.
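The whitelist filter could look like the sketch below; the whitelist entries and field names are purely illustrative assumptions.

```python
# Sketch of the trusted-call whitelist filter described above.
TRUSTED_FUNCTIONS = {"getpid", "clock_gettime"}      # example entries only
TRUSTED_CAPS = {"CAP_NET_BIND_SERVICE"}              # example entries only
TRUSTED_UIDS = {1001}                                # e.g. a vetted service account

def filter_trusted(events):
    return [e for e in events
            if e.get("func") not in TRUSTED_FUNCTIONS
            and e.get("cap") not in TRUSTED_CAPS
            and e.get("uid") not in TRUSTED_UIDS]
```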
In some embodiments, referring to fig. 5, the illegal right-raising detection method of one specific embodiment of the present disclosure may include the following steps.
First, a first event set corresponding to a first process is obtained.
In this example, the first process may include the parts of common software processes that resemble right-raising processes, as well as normal high-privilege right-raising processes. The first event set may be the eBPF events produced while these processes run.
Second, the first user space function call events and the first kernel space capability call events are grouped according to the time window and the first process identification information, so that data with the same first process identification information is divided into one group per time window.
In this example, the processes may be grouped according to the process identification information (e.g., process ID) of the parent process: a parent process and its related child processes are placed in one group in time order and segmented by time window, with each time window corresponding to one group of data.
Third, features are extracted from each group of grouped first-process data to obtain the first feature data.
In this example, the first feature data may include process creation feature data, system call timing feature data, user space function call feature data and capability call path feature data corresponding to the first process. The process creation feature data may include the new-process creation paths, the numbers and time differences of adjacent created processes, and the creation time differences between the current process and child processes of the same parent in the process tree. The user space function call feature data may include the function call path (the number of the current function call event and the numbers of the adjacent function call events before and after it), the time differences of adjacent function call events, the function executed in user space, time information, the corresponding process number, and so on. The system call timing feature data may include the system call time and name (syscall name). The capability call path feature data may include the current capability call number and the adjacent capability call numbers before and after it, the capability call time, the process ID of the current CAP call, the user ID of the current CAP call, the command of the current CAP call, the name of the current CAP call, and the like.
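For the capability call path features in particular, a sketch of how one group's capability-call events might be flattened into such records is shown below; all field names are assumptions for illustration.

```python
# Sketch: turn one group's capability-call events into capability call path
# records (adjacent call numbers, time differences, caller PID/UID/command, CAP name).
def cap_call_path_features(cap_events):
    ordered = sorted(cap_events, key=lambda e: e["ts"])
    records = []
    for i, e in enumerate(ordered):
        records.append({
            "call_no": i,
            "prev_no": i - 1 if i > 0 else None,
            "next_no": i + 1 if i < len(ordered) - 1 else None,
            "dt_prev": e["ts"] - ordered[i - 1]["ts"] if i > 0 else 0.0,
            "pid": e["pid"], "uid": e["uid"], "command": e["comm"],
            "cap_name": e["cap"],            # e.g. "CAP_SETUID"
        })
    return records
```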
Fourth, security baseline information is determined from the extracted first feature data.
In this example, the first feature data may be input into the right-raising detection model, and the maximum value of its outputs may then be selected as the security baseline.
Fifth, a second event set of the process to be detected is acquired.
In this example, the second set of events may be obtained by grabbing the eBPF event run by the process under inspection.
Sixth, data filtering is performed on the second event set to filter out trusted function call data and/or capability call data.
Seventh, the second event set is grouped according to the time window and the second process identification information, so that data with the same second process identification information is divided into one group per time window.
Eighth, features are extracted from each group of data of the process to be detected to obtain the corresponding process creation feature data (creation paths and time differences), system call timing feature data, user space function call feature data (call paths and time differences) and capability call path feature data, i.e., the second feature data.
Ninth, the extracted second feature data is detected with the right-raising detection model to obtain a second detection result.
Tenth, it is judged whether the second detection result is greater than the security baseline information; if so, the process to be detected is determined to be an illegal right-raising operation, otherwise the method returns to the third step and continues monitoring.
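Continuing the earlier sketches, the tenth step could be expressed as the small decision function below (the averaging rule for a baseline value set is just one of the options mentioned earlier):

```python
# Sketch of the final decision step.
def is_illegal_right_raising(second_result, baseline):
    if isinstance(baseline, (list, tuple)):          # baseline given as a value set
        baseline = sum(baseline) / len(baseline)     # e.g. compare against the average
    return second_result > baseline

# Usage with the earlier sketches:
#   score = detection_scores(model, [feature_vector])[0]
#   if is_illegal_right_raising(score, baseline): raise an alert; else keep monitoring.
```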
The order of the steps in the above embodiments is merely exemplary, and the order of the steps may be adjusted accordingly as needed. For example, the first step and the fifth step may be performed simultaneously.
This disclosure uses eBPF data and, from the real-time function call and CAP capability call monitoring data, fuses features of multiple dimensions, such as new-process creation features (including counts and paths), SYSCALL system call timing features, user space function call features (including call paths and time differences), the creation time differences of child processes in the process tree, and CAP capability call features (including paths and time differences). Normal and illegal right-raising can thus be distinguished comprehensively across multiple dimensions, which improves the detection accuracy for illegal right-raising operations (such as abnormal right-raising behaviour on a Linux system).
Setting the security baseline information avoids wrongly reporting a specific user group as an illegal right-raising process when a special service process is accessed by a special user, i.e., an ordinary user with special service access rights (such as an ordinary user with MySQL service access rights); normal access to special service processes invoked by ordinary users can thus be effectively excluded, which lowers the false-alarm rate and further improves detection accuracy. The security baseline information of this method can be adjusted dynamically, allowing flexible management and recognition of normal process behaviour, greatly strengthening the ability of a Linux host to recognize right-raising, and ensuring that the host's security detection capability is optimized stably and sustainably.
Referring to fig. 6, this example embodiment further provides an illegal right-raising detection apparatus 600. The apparatus 600 includes a first determining module 610, a detection module 620 and a second determining module 630. The first determining module 610 is configured to obtain a first event set corresponding to the first process and determine security baseline information based on the first event set. The detection module 620 is configured to obtain a second event set corresponding to the process to be detected and perform illegal right-raising detection based on the second event set to obtain a second detection result; the second event set includes a second user space function call event and a second kernel space capability call event. The second determining module 630 is configured to determine, based on the second detection result and the security baseline information, whether the process to be detected is an illegal right-raising operation.
In one embodiment of the present disclosure, the first event set includes a first user space function call event and a first kernel space capability call event, and the first determining module 610 includes a first feature extraction submodule, a first detection submodule and a first determining submodule. The first feature extraction submodule is configured to perform feature extraction on the first user space function call event and the first kernel space capability call event to obtain first feature data; the first detection submodule is configured to detect the first feature data with the right-raising detection model to obtain a first detection result; and the first determining submodule is configured to determine the security baseline information based on the first detection result. The right-raising detection model is obtained by training on a training sample set, and the training sample set includes normal right-raising events and illegal right-raising events.
In one embodiment of the present disclosure, the detection module 620 includes a second feature extraction submodule and a second detection submodule. The second feature extraction submodule is configured to perform feature extraction on the second user space function call event and the second kernel space capability call event to obtain second feature data; the second detection submodule is configured to perform illegal right-raising detection on the second feature data with the right-raising detection model to obtain a second detection result.
In one embodiment of the present disclosure, the first event set further includes first process identification information, the second event set further includes second process identification information, and the apparatus 600 further includes a first grouping module and a second grouping module, where the first grouping module is configured to group the first user space function call event and the first kernel space capability call event according to the time window and the first process identification information before feature extraction of the first event set, so that data of the same first process identification information is divided into a group according to the time window;
the second grouping module is used for grouping the second user space function call event and the second kernel space capability call event according to the time window and the second process identification information before the feature extraction is carried out on the second event set, so that the data of the same second process identification information are divided into a group according to the time window.
In one embodiment of the present disclosure, the detection module 620 may also be used to: extract the process creation feature data, system call timing feature data, user space function call feature data and capability call path feature data of each group of data corresponding to the second event set; and fuse the process creation feature data, the system call timing feature data, the user space function call feature data and the kernel capability call path feature data with the right-raising detection model to obtain the second detection result.
In one embodiment of the present disclosure, the second determining module is further configured to determine that the process to be detected is an illegal right-raising operation when the second detection result is greater than the security baseline information.
In one embodiment of the present disclosure, the apparatus 600 further comprises a data filtering module, which may be configured to perform data filtering on the second event set to filter out trusted function call data and/or capability call data before the illegal right-raising detection based on the second event set.
The specific details of each module/unit of the illegal right-raising detection apparatus in the above embodiments have already been described in detail in the corresponding illegal right-raising detection method and are not repeated here.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments or may exist alone without being assembled into that apparatus. The computer-readable medium carries one or more programs which, when executed by a device, cause the device to implement the methods in the embodiments described above; for example, the device may implement the steps shown in fig. 2 to 5.
It should be noted that the computer readable medium shown in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
In addition, in an exemplary embodiment of the present disclosure, an apparatus capable of implementing the above method is also provided. Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," a "module," or a "system."
Referring to fig. 7, fig. 7 is a schematic structural diagram of a computer device according to an embodiment of the present application. As shown in fig. 7, the computer device 700 includes a processor 710, a memory 720, an input-output interface 730, and a communication bus 740. Processor 710 is coupled to memory 720 and input-output interface 730, for example, processor 710 may be coupled to memory 720 and input-output interface 730 through communication bus 740. The processor 710 is configured to support the computer device to perform the corresponding functions of the illegal override detection method of fig. 2-5. The processor 710 may be a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), a hardware chip, or any combination thereof. The hardware chip may be an Application-specific integrated circuit (ASIC), a programmable logic device (Programmable Logic Device, PLD), or a combination thereof. The PLD may be a complex programmable logic device (Complex Programmable Logic Device, CPLD), a Field programmable gate array (Field-Programmable Gate Array, FPGA), general array logic (Generic Array Logic, GAL), or any combination thereof. The memory 720 is used for storing program codes and the like. Memory 720 may include volatile memory (VolatileMemory, VM), such as random access memory (Random Access Memory, RAM); the Memory 720 may also include a Non-Volatile Memory (NVM), such as Read-Only Memory (ROM), flash Memory (flash Memory), hard Disk (HDD) or Solid State Drive (SSD); memory 720 may also include a combination of the above types of memory.
The input-output interface 730 is used to input or output data.
Processor 710 may call the above-described program code to perform the following operations:
acquiring a first event set corresponding to a first process, and determining safety baseline information based on the first event set; acquiring a second event set corresponding to the process to be detected, and performing illegal right-raising detection based on the second event set to acquire a second detection result; the second event set comprises a second user space function call event and a second kernel space capability call event; and determining whether the process to be detected is illegal right raising operation or not based on the second detection result and the safety baseline information.
Optionally, the first event set includes a first user space function call event and a first kernel space capability call event, and the processor 710 may further determine the security baseline information based on the first event set by performing the following operations: performing feature extraction on the first user space function call event and the first kernel space capability call event to obtain first feature data; detecting the first feature data with the right-raising detection model to obtain a first detection result; and determining the security baseline information based on the first detection result; the right-raising detection model is obtained by training on a training sample set, and the training sample set includes normal right-raising events and illegal right-raising events.
Optionally, the processor 710 may further perform the illegal right-raising detection based on the second event set by performing the following operations: performing feature extraction on the second user space function call event and the second kernel space capability call event to obtain second feature data; and performing illegal right-raising detection on the second feature data with the right-raising detection model to obtain a second detection result.
Optionally, the first event set further includes first process identification information, and before the feature extraction of the first event set, the processor 710 may further perform the following operations:
and grouping the first user space function call event and the first kernel space capability call event according to the time window and the first process identification information, so that the data of the same first process identification information are divided into a group according to the time window.
Optionally, the second event set further includes second process identification information, and before the feature extraction of the second event set, the processor 710 may further perform the following operations:
and grouping the second user space function call event and the second kernel space capability call event according to the time window and the second process identification information, so that the data of the same second process identification information are divided into a group according to the time window.
Optionally, the processor 710 may further perform the following operations based on the second detection result and the security baseline information: when the second detection result is greater than the security baseline information, determining that the process to be detected is an illegal right-raising operation.
Optionally, the processor 710 may further perform the following operations: the second set of events is data filtered to filter out trusted function call data and/or capability call data.
It should be noted that implementation of each operation may also correspond to the corresponding description of the method embodiment shown with reference to fig. 2-5; the processor 710 may also cooperate with the input-output interface 730 to perform other operations in the method embodiments described above.
From the above description of embodiments, those skilled in the art will readily appreciate that the example embodiments described herein may be implemented in software, or in combination with the necessary hardware. Thus, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a usb disk, a mobile hard disk, etc.) or on a network, comprising several instructions to cause a device to perform a method according to the embodiments of the present disclosure.
Furthermore, the above-described figures are only schematic illustrations of processes included in the method according to the exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
It should be noted that although the steps of the methods of the present disclosure are illustrated in the accompanying drawings in a particular order, this does not require or imply that the steps must be performed in that particular order, or that all of the illustrated steps must be performed, to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into one step, and/or one step may be decomposed into multiple steps; all such variations are considered part of the present disclosure.
It should be understood that the present disclosure disclosed and defined herein extends to all alternative combinations of two or more of the individual features mentioned or evident from the text and/or drawings. All of these different combinations constitute various alternative aspects of the present disclosure. Embodiments of the present disclosure describe the best mode known for carrying out the disclosure and will enable one skilled in the art to utilize the disclosure.
Claims (8)
1. An illegal privilege escalation detection method, characterized in that the method comprises:
acquiring a first event set corresponding to a first process, and determining security baseline information based on the first event set;
acquiring a second event set corresponding to a process to be detected, wherein the second event set comprises a second user space function call event, a second kernel space capability call event and second process identification information;
grouping the second user space function call event and the second kernel space capability call event according to a time window and the second process identification information, so that data sharing the same second process identification information are divided into the same group within each time window;
extracting process creation feature data, system call time sequence feature data, user space function call feature data and capability call path feature data from each group of data corresponding to the second user space function call event and the second kernel space capability call event;
fusing, with a privilege escalation detection model, the process creation feature data, the system call time sequence feature data, the user space function call feature data and the capability call path feature data to obtain a second detection result;
and determining whether the process to be detected is an illegal privilege escalation operation based on the second detection result and the security baseline information.
2. The illegal privilege escalation detection method according to claim 1, wherein the first event set includes a first user space function call event and a first kernel space capability call event, and the determining security baseline information based on the first event set includes:
performing feature extraction on the first user space function call event and the first kernel space capability call event to obtain first feature data;
detecting the first feature data with a privilege escalation detection model to obtain a first detection result;
determining the security baseline information based on the first detection result;
wherein the privilege escalation detection model is trained on a training sample set, and the training sample set comprises normal privilege escalation events and illegal privilege escalation events.
3. The illegal privilege escalation detection method according to claim 2, wherein the first event set further includes first process identification information, and the method further comprises, before the feature extraction on the first user space function call event and the first kernel space capability call event:
grouping the first user space function call event and the first kernel space capability call event according to a time window and the first process identification information, so that data sharing the same first process identification information are divided into the same group within each time window.
4. The illegal privilege escalation detection method according to any one of claims 1 to 3, wherein the determining whether the process to be detected is an illegal privilege escalation operation based on the second detection result and the security baseline information includes:
when the second detection result is greater than the security baseline information, determining that the process to be detected is an illegal privilege escalation operation.
5. The illegal privilege escalation detection method according to claim 1, wherein before the illegal privilege escalation detection based on the second event set, the method further comprises:
performing data filtering on the second event set to filter out trusted function call data and/or capability call data.
6. An illegal privilege escalation detection device, characterized in that the device comprises:
a first determining module, configured to acquire a first event set corresponding to a first process and determine security baseline information based on the first event set;
a detection module, configured to acquire a second event set corresponding to a process to be detected, wherein the second event set comprises a second user space function call event, a second kernel space capability call event and second process identification information; the detection module is further configured to group the second user space function call event and the second kernel space capability call event according to a time window and the second process identification information, so that data sharing the same second process identification information are divided into the same group within each time window; to extract process creation feature data, system call time sequence feature data, user space function call feature data and capability call path feature data from each group of data corresponding to the second user space function call event and the second kernel space capability call event; and to fuse, with a privilege escalation detection model, the process creation feature data, the system call time sequence feature data, the user space function call feature data and the capability call path feature data to obtain a second detection result;
and a second determining module, configured to determine whether the process to be detected is an illegal privilege escalation operation based on the second detection result and the security baseline information.
7. A computer readable storage medium having a computer program stored thereon, characterized in that the computer program, when executed by a processor, implements the method of any one of claims 1-5.
8. A computer device, comprising: a processor; and
a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any one of claims 1-5 via execution of the executable instructions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210716028.XA CN115033889B (en) | 2022-06-22 | 2022-06-22 | Illegal right-raising detection method and device, storage medium and computer equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115033889A (en) | 2022-09-09
CN115033889B (en) | 2023-10-31
Family ID: 83127617
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210716028.XA (Active) CN115033889B (en) | Illegal right-raising detection method and device, storage medium and computer equipment | 2022-06-22 | 2022-06-22
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115033889B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116049817B (en) * | 2023-01-17 | 2023-09-08 | 安芯网盾(北京)科技有限公司 | Real-time detection and blocking process weighting method and device based on Linux kernel |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104679593A (en) * | 2015-03-13 | 2015-06-03 | 浪潮集团有限公司 | Task scheduling optimization method based on SMP system |
CN105245543A (en) * | 2015-10-28 | 2016-01-13 | 中国人民解放军国防科学技术大学 | Operating system mandatory access control method based on security marker randomization |
CN108038049A (en) * | 2017-12-13 | 2018-05-15 | 西安电子科技大学 | Real-time logs control system and control method, cloud computing system and server |
WO2019033973A1 (en) * | 2017-08-18 | 2019-02-21 | 阿里巴巴集团控股有限公司 | Privilege escalation prevention detection method and device |
CN111191226A (en) * | 2019-07-04 | 2020-05-22 | 腾讯科技(深圳)有限公司 | Method, device, equipment and storage medium for determining program by using privilege-offering vulnerability |
CN111259386A (en) * | 2018-12-03 | 2020-06-09 | 阿里巴巴集团控股有限公司 | Kernel security detection method, device, equipment and storage medium |
CN111291364A (en) * | 2018-12-07 | 2020-06-16 | 阿里巴巴集团控股有限公司 | Kernel security detection method, device, equipment and storage medium |
CN111782416A (en) * | 2020-06-08 | 2020-10-16 | Oppo广东移动通信有限公司 | Data reporting method, device, system, terminal and computer readable storage medium |
CN113821316A (en) * | 2021-06-10 | 2021-12-21 | 腾讯科技(深圳)有限公司 | Abnormal process detection method and device, storage medium and electronic equipment |
CN113868626A (en) * | 2021-09-29 | 2021-12-31 | 杭州默安科技有限公司 | Method and system for detecting permission promotion vulnerability and computer readable storage medium |
CN113987435A (en) * | 2021-09-26 | 2022-01-28 | 奇安信科技集团股份有限公司 | Illegal copyright detection method and device, electronic equipment and storage medium |
CN114143037A (en) * | 2021-11-05 | 2022-03-04 | 山东省计算中心(国家超级计算济南中心) | Malicious encrypted channel detection method based on process behavior analysis |
US11314859B1 (en) * | 2018-06-27 | 2022-04-26 | FireEye Security Holdings, Inc. | Cyber-security system and method for detecting escalation of privileges within an access token |
CN114547175A (en) * | 2022-03-01 | 2022-05-27 | 北京京东振世信息技术有限公司 | Data processing method, device, storage medium and computer system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10984098B2 (en) * | 2018-04-06 | 2021-04-20 | Palo Alto Networks, Inc. | Process privilege escalation protection in a computing environment |
Non-Patent Citations (3)
Title |
---|
Additional Kernel Observer to Prevent Privilege Escalation Attacks by Focusing on System Call Privilege Changes; Toshihiro Yamauchi et al.; 2018 IEEE Conference on Dependable and Secure Computing (DSC); full text *
Research on Row Hammer vulnerability attacks; Wang Wenwei; Liu Peishun; Chinese Journal of Network and Information Security (No. 01); pp. 73-79 *
An automated rapid vulnerability protection method based on early-warning information; Xu Qiwang; Chen Zhenhang; Peng Guojun; Zhang Huanguo; Journal of Cyber Security (No. 01); pp. 78-86 *
Also Published As
Publication number | Publication date |
---|---|
CN115033889A (en) | 2022-09-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10872151B1 (en) | System and method for triggering analysis of an object for malware in response to modification of that object | |
US9690606B1 (en) | Selective system call monitoring | |
US9838405B1 (en) | Systems and methods for determining types of malware infections on computing devices | |
CN108280350B (en) | Android-oriented mobile network terminal malicious software multi-feature detection method | |
US10007786B1 (en) | Systems and methods for detecting malware | |
US20170083703A1 (en) | Leveraging behavior-based rules for malware family classification | |
CN109586282B (en) | Power grid unknown threat detection system and method | |
JP7531816B2 (en) | Image-based malicious code detection method and device and artificial intelligence-based endpoint threat detection and response system using the same | |
WO2024007615A1 (en) | Model training method and apparatus, and related device | |
CN115033889B (en) | Illegal right-raising detection method and device, storage medium and computer equipment | |
CN112738094B (en) | Expandable network security vulnerability monitoring method, system, terminal and storage medium | |
US11222115B2 (en) | Data scan system | |
CN109800569A (en) | Program identification method and device | |
CN114598512A (en) | Honeypot-based network security guarantee method and device and terminal equipment | |
CN113569240B (en) | Method, device and equipment for detecting malicious software | |
CN113378161A (en) | Security detection method, device, equipment and storage medium | |
CN112769595A (en) | Abnormality detection method, abnormality detection device, electronic device, and readable storage medium | |
CN112487265A (en) | Data processing method and device, computer storage medium and electronic equipment | |
CN112035831A (en) | Data processing method, device, server and storage medium | |
US10846405B1 (en) | Systems and methods for detecting and protecting against malicious software | |
CN115859298A (en) | Dynamic trusted computing environment architecture and method for power master station system | |
KR102541888B1 (en) | Image-based malicious code analysis method and apparatus and artificial intelligence-based endpoint detection and response system using the same | |
CN115396142A (en) | Information access method and device based on zero trust, computer equipment and medium | |
CN115086081A (en) | Escape prevention method and system for honeypots | |
CN115296849A (en) | Associated alarm method and system, storage medium and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||
EE01 | Entry into force of recordation of patent licensing contract ||
Application publication date: 2022-09-09
Assignee: Tianyiyun Technology Co.,Ltd.
Assignor: CHINA TELECOM Corp.,Ltd.
Contract record no.: X2024110000020
Denomination of invention: Illegal privilege escalation detection method and device, storage medium and computer equipment
Granted publication date: 2023-10-31
License type: Common License
Record date: 2024-03-15