CN113010896A - Method, apparatus, device, medium and program product for determining an abnormal object - Google Patents


Publication number
CN113010896A
Authority
CN
China
Prior art keywords: node, feature, abnormal, nodes, representing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110287780.2A
Other languages
Chinese (zh)
Other versions
CN113010896B (en)
Inventor
秦洋洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110287780.2A
Publication of CN113010896A
Application granted
Publication of CN113010896B
Current legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00: Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/50: Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F 21/57: Certifying or maintaining trusted computer platforms, e.g. secure boots or power-downs, version controls, system software checks, secure updates or assessing vulnerabilities
    • G06F 21/577: Assessing vulnerabilities and evaluating computer system security
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20: Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/28: Databases characterised by their database models, e.g. relational or object models
    • G06F 16/284: Relational databases
    • G06F 16/288: Entity relationship models

Abstract

According to example embodiments of the present disclosure, a method, apparatus, device, medium, and program product for determining an abnormal object are provided. The disclosure relates to the field of artificial intelligence, and in particular to the technical fields of network security and deep learning. The specific implementation scheme is as follows: acquiring behavior data related to an object; constructing, based on the behavior data, a relationship graph of the object, wherein the relationship graph comprises a plurality of nodes representing identifiers related to the object and a plurality of edges representing relationships among the plurality of nodes, the edges represent interaction relationships among the identifiers, and each node has a corresponding node feature representing a behavior pattern of the object; determining an object feature of the object based on a node feature of at least one node of the plurality of nodes; and determining, based on the object feature, whether the object is an abnormal object. According to embodiments of the disclosure, the features of an object can be determined accurately, and hence whether the object is an abnormal object can be determined accurately.

Description

Method, apparatus, device, medium and program product for determining an abnormal object
Technical Field
The present disclosure relates to the field of artificial intelligence, and more particularly, to methods, apparatuses, devices, computer-readable storage media and computer program products for determining an abnormal object.
Background
In recent years, with the rapid development of the internet, industries that infringe network security (such as the network "black and gray market") have also developed. Their activities include, but are not limited to, abnormal operations performed by abnormal objects in various applications and platforms, such as Trojan viruses, account farming and fake-order brushing, promotion abuse ("wool-pulling"), telecom fraud, knowledge piracy, and traffic hijacking. These abnormal operations violate the privacy and legitimate interests of normal objects and platforms, resulting in a poor user experience. Therefore, a technical solution for accurately identifying abnormal objects that violate network security is needed.
Disclosure of Invention
According to example embodiments of the present disclosure, a method, an apparatus, a device, a computer-readable storage medium and a computer program product for determining an abnormal object are provided.
In a first aspect of the disclosure, a method of determining an abnormal object is provided. The method comprises the following steps: acquiring behavior data related to an object; based on the behavior data, constructing a relation graph of the object, wherein the relation graph comprises a plurality of nodes representing identifications related to the object and a plurality of edges representing relations among the plurality of nodes, the edges represent interactive relations among the identifications, and the nodes have corresponding node characteristics and are used for representing behavior modes of the object; determining an object feature of the object based on a node feature of at least one node of the plurality of nodes; and determining whether the object is an abnormal object based on the object characteristics.
In a second aspect of the present disclosure, an apparatus for determining an abnormal object is provided. The device includes: a first data acquisition module configured to acquire behavior data related to an object; a relationship graph building module configured to build a relationship graph of the object based on the behavior data, the relationship graph including a plurality of nodes representing identifiers related to the object and a plurality of edges representing relationships between the plurality of nodes, the edges representing interactive relationships between the identifiers, the nodes having corresponding node characteristics for representing behavior patterns of the object; a first object feature determination module configured to determine an object feature of an object based on a node feature of at least one node of the plurality of nodes; and a first abnormal object determination module configured to determine whether the object is an abnormal object based on the object characteristics.
In a third aspect of the disclosure, an electronic device is provided that includes one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method according to the first aspect of the disclosure.
In a fourth aspect of the present disclosure, a computer-readable medium is provided, on which a computer program is stored which, when executed by a processor, implements a method according to the first aspect of the present disclosure.
In a fifth aspect of the present disclosure, there is provided a computer program product comprising computer program instructions which, when executed by a processor, implement the method according to the first aspect of the present disclosure.
It should be understood that the content described in this section is not intended to identify key or critical features of the embodiments of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become readily apparent from the following description.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. In the drawings, the same or similar reference numerals denote the same or similar elements. The accompanying drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure, in which:
FIG. 1 illustrates a schematic diagram of an example environment in which embodiments of the present disclosure can be implemented;
FIG. 2 illustrates a flow diagram of an example of determining an anomalous object in accordance with some embodiments of the present disclosure;
FIG. 3 shows a schematic diagram of a table of behavior data of objects, in accordance with an embodiment of the present disclosure;
FIG. 4 shows a schematic diagram of a relationship map according to an embodiment of the present disclosure;
FIG. 5 shows a schematic block diagram of an apparatus to determine an abnormal object according to an embodiment of the present disclosure; and
FIG. 6 illustrates a block diagram of a computing device capable of implementing various embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
In describing embodiments of the present disclosure, the term "include" and its derivatives should be interpreted as open-ended, i.e., "including but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The terms "first", "second", and the like may refer to different or the same objects. Other explicit and implicit definitions may also be included below.
In the description of embodiments of the present disclosure, the term "model" refers to an entity that can learn, from training data, the association between inputs and outputs, so that after training is completed a given input is processed based on the trained set of parameters to generate a corresponding output. A "model" may also be referred to as a "neural network", "learning model", "learning network", or "network"; these terms are used interchangeably herein.
The term "feature" refers to the representation of an entity, such as a message, an expression, or an action, by a low-dimensional vector. A property of such feature vectors is that entities whose vectors are close in distance have similar meanings. Because a "feature" encodes an entity as a low-dimensional vector while preserving its meaning, it is well suited to deep learning.
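The distance property of feature vectors described above can be illustrated with a minimal sketch; the `euclidean` helper and the example vectors are purely illustrative and not part of the disclosure:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Illustrative low-dimensional features: two similar behaviors and one dissimilar one.
browse_pictures = [1.0, 0.9]
browse_videos = [0.9, 1.0]
traffic_hijacking = [-1.0, 0.2]

similar = euclidean(browse_pictures, browse_videos)
dissimilar = euclidean(browse_pictures, traffic_hijacking)
```

Vectors that are close in distance (here, the two browsing behaviors) are treated as having similar meanings, which is the property the embodiments rely on when comparing node and object features.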
As mentioned above, abnormal objects need to be identified accurately. Conventional schemes for identifying abnormal objects include the following: (1) identifying abnormal objects using manually established, predetermined rules. This approach requires a large amount of manual work, and the established rules are inflexible and often cannot adapt to the frequently changing attack patterns of the black and gray market; (2) modeling a known abnormal operation sequence of an abnormal object to identify other abnormal objects. Since even the operation sequences of the same object may differ, this recognition method is not accurate; (3) identifying abnormal objects through simple association relationships. This approach determines, for example, a plurality of objects using the same WIFI as associated objects, and when one of them is an abnormal object, it also determines the other objects to be abnormal. This scheme is not accurate enough and produces a large number of false identifications. Existing schemes therefore have obvious shortcomings in the accuracy, flexibility, and robustness of abnormal object identification.
An example embodiment of the present disclosure proposes a scheme for determining an abnormal object. In this scheme, behavior data of an object is first acquired. A relationship graph of the object is then constructed from the behavior data. The nodes in the relationship graph represent identifiers of the object, and the edges represent interaction relationships between those identifiers. The node feature of each node represents a behavior pattern of the object. The object feature of the object is then determined from the node features of the nodes in the relationship graph, and finally whether the object is an abnormal object is determined from the object feature. By constructing a relationship graph associated with the object's behavior, the object feature can be determined accurately, and, from that accurate object feature, whether the object is an abnormal object can be determined efficiently and accurately.
FIG. 1 illustrates a schematic diagram of an example environment 100 in which embodiments of the present disclosure can be implemented. As shown, the example environment 100 may include an object 110, a computing device 120, behavior data 130, and an abnormal object detection model 140. An object herein may refer to one or more users, devices used by the users, accounts used by the users, IP addresses from which the users log in, and the like. An object may be any entity that performs an abnormal operation or action, and the disclosure is not limited in this respect. Although only one object is illustrated, this number is merely exemplary; those skilled in the art will appreciate that multiple objects may exist simultaneously. The present disclosure is not limited thereto.
Computing device 120 may, in response to receiving an access request for object 110, obtain behavior data 130 for object 110 and then determine whether object 110 is an abnormal object based on behavior data 130. The behavior data 130 may be divided by time into current behavior data and historical behavior data. In some embodiments, the computing device 120 may train an initial model (not shown) on the historical behavior data to obtain the abnormal object detection model 140. The initial model may be, but is not limited to, a Support Vector Machine (SVM) model, a Bayesian model, a random forest model, or one of various deep learning/neural network models such as Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs). In some embodiments, computing device 120 may use the current behavior data and the trained abnormal object detection model 140 described above to determine whether object 110 is an abnormal object. This is described in detail below.
Alternatively, in some embodiments, in addition to obtaining training data from the behavior data 130 of the object 110, manually labeled and/or automatically labeled training data may be obtained from big data for training the abnormal object detection model 140.
Although computing device 120 is shown as including abnormal object detection model 140, abnormal object detection model 140 may also be an entity external to computing device 120. Computing device 120 may be any device with computing capabilities. By way of non-limiting example, the computing device 120 may be any type of stationary, mobile, or portable computing device, including but not limited to a desktop computer, laptop computer, notebook computer, netbook computer, tablet computer, multimedia computer, mobile phone, or the like; all or a portion of the components of computing device 120 may be distributed in the cloud. Computing device 120 contains at least a processor, memory, and other components typically found in a general purpose computer to implement computing, storage, communication, control, and the like functions.
In order to more clearly understand the aspects provided by the embodiments of the present disclosure, embodiments of the present disclosure will be further described with reference to fig. 2 to 4. FIG. 2 shows a flow diagram of a process 200 of determining an anomalous object in accordance with an embodiment of the present disclosure. Process 200 may be implemented by computing device 120 of fig. 1. For ease of discussion, process 200 will be described in conjunction with FIG. 1.
At 210 of FIG. 2, computing device 120 obtains behavioral data associated with object 110. For example, computing device 120 may obtain behavior data 130 associated with an account number of object 110. In some embodiments, computing device 120 obtains behavior data 130 associated with object 110 in response to an access request for the object.
First, behavior data of an object is described with reference to FIG. 3. FIG. 3 shows a schematic diagram of a table 300 of behavior data of objects according to an embodiment of the disclosure. The behavior data 130 of the object may comprise the various operations performed by the object 110, under the account ID of the object, in different scenes at different points in time. For example, as shown in FIG. 3, the account of the object 110 performed browsing operations on pictures and videos at 6 a.m., as well as comment operations. Note that the points in time, the scenes involved, and the types of operations in the figure are merely exemplary, and other forms of behavior data may also exist. For example, behavior data 130 associated with the device or IP address of the object may also be obtained, and the disclosure is not limited thereto. Computing device 120 may construct a relationship graph of object 110 from this behavior data 130 and further determine the object feature of object 110, as explained below.
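One way the 0/1 entries of a table such as table 300 could be turned into a node feature vector is sketched below. The time buckets, scenes, and operation types are hypothetical placeholders, since the disclosure does not fix these vocabularies:

```python
from itertools import product

# Hypothetical vocabularies (the disclosure does not fix these values).
TIME_BUCKETS = ["00-06", "06-12", "12-18", "18-24"]
SCENES = ["picture", "video", "comment"]
OPS = ["browse", "like", "post"]

# Assign every (time bucket, scene, operation) combination one slot of the vector.
SLOTS = {combo: i for i, combo in enumerate(product(TIME_BUCKETS, SCENES, OPS))}

def encode_behavior(records):
    """Encode (time_bucket, scene, op) records as a 0/1 vector,
    mirroring the 0's and 1's of table 300."""
    vec = [0] * len(SLOTS)
    for rec in records:
        vec[SLOTS[rec]] = 1
    return vec

# The 6 a.m. browsing of pictures and videos from the example above.
features = encode_behavior([("06-12", "picture", "browse"),
                            ("06-12", "video", "browse")])
```

Each account node would carry one such vector, so that the operations an object performs become a comparable behavior-pattern feature.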
Returning to the description of FIG. 2, at 220 of FIG. 2, the computing device 120 builds a relationship graph of the object 110 based on the behavior data 130, the relationship graph including a plurality of nodes representing identifications related to the object 110 and a plurality of edges representing relationships between the plurality of nodes, the edges representing interactive relationships between the identifications, the nodes having corresponding node characteristics for representing behavior patterns of the object 110.
In some embodiments, the identifier of object 110 includes at least one of: an account associated with the object, a device associated with the object, or an IP address associated with the object. The identifier of the object 110 may be any suitable entity that represents the object when accessing an application, and is not limited to an account, a device, or an IP address. Through interactions between the identifiers, edges can be established between them to form a relationship graph related to the object 110. The relationship graph is further described in conjunction with FIG. 4.
Fig. 4 shows a schematic diagram 400 of a relationship graph according to an embodiment of the present disclosure. For example, as shown in relationship graph 410, if the object 110 logs in to account ID1 through device ID1 and device ID2, and obtains the verification codes required by operations in the application through mobile phone number 2 and mobile phone number 1 on account ID1, an association may be established with account ID1 as the center and device ID1, device ID2, mobile phone number 1, and mobile phone number 2 as first neighbor nodes. Device ID2 and mobile phone number 2 may in turn be associated with the second neighbor nodes IP address 1 and account ID2. Herein, for clarity, the relationship graph is exemplarily established with the object's account ID as the central node; a first neighbor node is a node one edge away from the central node, and a second neighbor node is a node two edges away. Those skilled in the art will appreciate that the relationship graph may also be established with any identifier as the central node, as desired. The present disclosure is not limited thereto.
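The association structure described for relationship graph 410 can be sketched as a plain adjacency map; this is a simplification, and the identifier strings are illustrative:

```python
from collections import defaultdict

def build_graph(edges):
    """Build an undirected relationship graph from (identifier, identifier) pairs."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    return adj

# The interactions described for relationship graph 410.
graph = build_graph([
    ("account ID1", "device ID1"),
    ("account ID1", "device ID2"),
    ("account ID1", "phone number 1"),
    ("account ID1", "phone number 2"),
    ("device ID2", "IP address 1"),
    ("phone number 2", "account ID2"),
])

# First neighbor nodes are one edge from the central node (account ID1);
# second neighbor nodes are two edges away.
first_hop = graph["account ID1"]
second_hop = set().union(*(graph[n] for n in first_hop)) - {"account ID1"}
```

The same construction works regardless of which identifier is chosen as the central node.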
In some embodiments, the relationship graph is a heterogeneous graph, as shown by relationship graph 410. A heterogeneous graph is a relationship graph in which at least two types of identifiers are present. By establishing relationships between different object identifiers, the connections between operations associated with the same object and operations associated with different objects can be mined more thoroughly, so that a complete relationship graph is established, laying a foundation for the subsequent identification of abnormal objects. In the present disclosure, the heterogeneous graph is described merely as an example and is not intended to limit the scope of the present disclosure.
Alternatively, in some embodiments, the relationship graph may be a homogeneous graph, that is, a graph in which all nodes are identifiers of the same type, for example all object account IDs.
Additionally or alternatively, in some embodiments, the plurality of nodes in the relationship graph includes a node representing an identifier of at least one other object. For example, if object 110 often interacts with another object in an application, e.g., often comments on or likes that object, then a node for that other object may be included in the relationship graph of object 110. Because abnormal objects in the black and gray industrial chain resemble one another, including objects that interact with each other in the same relationship graph helps mine the relationships among them. The features of an object can thus be determined more accurately, and abnormal objects further identified.
The node features in the relationship graph 410, which represent the behavior patterns of the object 110, may be determined from the behavior data 130 of the object 110 in FIG. 3. In some embodiments, the node features of the plurality of nodes may be determined based on at least one of: a type of operation associated with the object, a scene to which the operation relates, and a time at which the operation occurs. For example, the feature of account ID1 may be determined from the 0's and 1's in table 300, which represent the operation scenes and operation types at different points in time. Different factors can thus be integrated, the feature of each node can be determined accurately from the operations of the object, and a foundation is laid for the subsequent determination of the object feature. Those skilled in the art will appreciate that node features may also be determined according to any suitable algorithm, and the disclosure is not limited thereto.
Returning to the description of FIG. 2, at 230 of FIG. 2, computing device 120 determines an object feature of object 110 based on a node feature of at least one node of the plurality of nodes. For example, computing device 120 may determine the object feature of object 110 from the node features of the nodes in relationship graph 410 determined as described above.
In one example, the computing device 120 may determine a target node in the relationship graph, the target node representing an account associated with the object. Computing device 120 may then determine the object feature based on the target node feature of the target node and the first node features of at least one first neighbor node adjacent to the target node. For example, the computing device 120 may take the account ID1 of the object 110 in the relationship graph 410 in FIG. 4 as the target node. Device ID1 and mobile phone number 2 may then be taken as first neighbor nodes adjacent to the target node (account ID1).
In some embodiments, the weight of every edge in the relationship graph is the same. In this case, the computing device 120 may randomly choose first neighbor nodes adjacent to the target node. Alternatively, in some embodiments, the edge weights differ. In this case, the computing device 120 may choose as first neighbor nodes those nodes connected by edges with higher weights (i.e., weights greater than a threshold). Those skilled in the art will appreciate that an appropriate selection rule and number of neighbor nodes may be set according to the requirements of the scenario, and the disclosure is not limited herein.
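The two selection rules above (random sampling for equal-weight edges, thresholding for weighted ones) can be sketched as follows; the edge weights, the threshold of 0.5, and the default of two sampled neighbors are illustrative assumptions:

```python
import random

def select_neighbors(weighted_edges, threshold=None, k=2, seed=0):
    """Pick first neighbor nodes of a target: sample k at random when edges
    are treated as equal-weight (threshold=None), else keep those whose
    weight exceeds the threshold."""
    if threshold is None:
        rng = random.Random(seed)  # seeded for reproducibility
        return set(rng.sample(sorted(weighted_edges), k))
    return {node for node, w in weighted_edges.items() if w > threshold}

# Hypothetical edge weights from account ID1 to its neighbors.
edges = {"device ID1": 0.9, "device ID2": 0.7,
         "phone number 1": 0.2, "phone number 2": 0.8}
picked = select_neighbors(edges, threshold=0.5)
sampled = select_neighbors(edges)  # equal-weight case: 2 random neighbors
```

Either rule yields the set of first neighbor nodes whose features are fused in the next step.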
After the first neighbor nodes are determined, second neighbor nodes adjacent to them may also be determined. The computing device 120 may determine at least one second neighbor node adjacent to the at least one first neighbor node, along with its second node feature. The first node feature is then combined with the second node feature to determine a third node feature, and finally the target node feature and the third node feature are combined into the object feature of object 110.
Continuing with the relationship graph 410 as an example, the computing device 120 may determine the second node feature of a second neighbor node (IP address 1) adjacent to a first neighbor node (e.g., mobile phone number 2). The determination of features is described above and is not repeated here. Computing device 120 may then combine the second node feature of the second neighbor node (IP address 1) with the node feature of the first neighbor node (e.g., mobile phone number 2); for example, the node in relationship graph 420 representing mobile phone number 2 has merged the node feature of IP address 1. Finally, as shown in relationship graphs 420 and 430 in FIG. 4, computing device 120 combines the node features of mobile phone number 2 and device ID1 with the node feature of account ID1 to determine the object feature of object 110. The object feature is determined by fusing the features of two (or more) layers of neighbor nodes of the target node; that is, the object feature can be determined from multiple dimensions and fused using the relationship graph information of the object's own behavior data, so that the object feature can be determined accurately. This avoids the risk of misidentification caused by simply associating objects through information such as shared WIFI.
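The two-layer fusion of graphs 410 -> 420 -> 430 can be sketched as below. The disclosure does not fix the "combine" operator, so an element-wise mean is assumed here, and all feature values are illustrative:

```python
def combine(features):
    """One possible 'combine' operator: element-wise mean of equal-length vectors."""
    return [sum(col) / len(features) for col in zip(*features)]

def two_hop_feature(target_feat, first_hop, second_hop_of):
    """Fold second-neighbor features into each first neighbor, then fold the
    first neighbors into the target node, yielding the object feature."""
    folded = []
    for node, feat in first_hop.items():
        second = second_hop_of.get(node, [])
        folded.append(combine([feat] + second) if second else feat)
    return combine([target_feat] + folded)

target = [1.0, 0.0]                                   # account ID1's feature
first = {"device ID1": [0.0, 1.0], "phone number 2": [1.0, 1.0]}
second = {"phone number 2": [[1.0, 0.0]]}             # its second neighbor's feature
obj_feature = two_hop_feature(target, first, second)
```

Other combine operators (concatenation, weighted sums, learned aggregation) would fit the same skeleton.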
At 240 of fig. 2, computing device 120 determines whether object 110 is an anomalous object based on the object features.
In some embodiments, the computing device 120 obtains an abnormal object detection model 140 associated with the sample object, the abnormal object detection model 140 describing a relationship between sample object features associated with the sample object and the abnormal object. Computing device 120 then determines whether the object is an abnormal object based on the abnormal object detection model and the object features.
The sample object features may be determined from the earlier history in the behavior data 130 of the object 110; for example, the behavior data may be divided into successive 10-minute periods, and the sample object features determined from the periods preceding the most recent 10 minutes. The computing device 120 can then train the abnormal object detection model 140 based on the sample object features and the labeled abnormal objects. Finally, the computing device may construct a relationship graph from the most recent 10 minutes of behavior data 130, determine the object feature of the object 110, and input it into the trained abnormal object detection model 140 to determine whether the object 110 is an abnormal object. If the object 110 is determined to be an abnormal object, it is denied access to the application or is blacklisted. The 10-minute interval above is merely exemplary; other times may be set for training and using the model, and the disclosure is not limited thereto.
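The train-then-classify flow above can be sketched with a deliberately simple stand-in for the abnormal object detection model 140. A nearest-centroid classifier is assumed here purely for illustration, since the disclosure permits SVMs, Bayesian models, random forests, or neural networks; the feature values and labels are made up:

```python
def train_centroids(samples):
    """Learn the mean feature vector (centroid) per label from
    (sample object feature, label) pairs."""
    sums, counts = {}, {}
    for feat, label in samples:
        acc = sums.setdefault(label, [0.0] * len(feat))
        for i, v in enumerate(feat):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def predict(centroids, feat):
    """Return the label whose centroid is nearest (squared Euclidean distance)."""
    return min(centroids,
               key=lambda lbl: sum((a - b) ** 2
                                   for a, b in zip(feat, centroids[lbl])))

# Illustrative sample object features from earlier 10-minute periods.
history = [([0.9, 0.8], "abnormal"), ([0.8, 0.9], "abnormal"),
           ([0.1, 0.2], "normal"), ([0.2, 0.1], "normal")]
model = train_centroids(history)

# Object feature computed from the most recent 10 minutes of behavior data.
label = predict(model, [0.85, 0.9])
```

An "abnormal" prediction would correspond to denying the object access or blacklisting it, as described above.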
According to the embodiments of the disclosure, the object feature can be determined accurately by constructing a relationship graph associated with the object's behavior. Furthermore, whether the object is an abnormal object can be determined efficiently and accurately from that accurately determined object feature.
FIG. 5 shows a schematic block diagram of an apparatus 500 for determining an abnormal object according to an embodiment of the present disclosure. As shown in FIG. 5, the apparatus 500 includes: a first data acquisition module 510 configured to acquire behavior data related to an object; a relationship graph building module 520 configured to build a relationship graph of the object based on the behavior data, the relationship graph including a plurality of nodes representing identifiers related to the object and a plurality of edges representing relationships between the plurality of nodes, the edges representing interaction relationships between the identifiers, the nodes having corresponding node features for representing behavior patterns of the object; a first object feature determination module 530 configured to determine an object feature of the object based on a node feature of at least one node of the plurality of nodes; and a first abnormal object determination module 540 configured to determine whether the object is an abnormal object based on the object feature.
In some embodiments, the plurality of nodes may include a node representing an identification of at least one other object.
In some embodiments, the first object feature determination module 530 may include: a target node determination module configured to determine a target node in the relationship graph, the target node representing an account associated with the object; and a second object feature determination module configured to determine the object feature based on a target node feature of the target node and the first node feature of at least one first neighbor node adjacent to the target node.
In some embodiments, the second object characteristic determination module may comprise: a neighbor node characteristic determination module configured to determine a second node characteristic of at least one second neighbor node adjacent to the at least one first neighbor node; a feature combination module configured to combine the first node feature with the second node feature to determine a third node feature; and a third object feature determination module configured to combine the target node feature and the third node feature as the object feature.
In some embodiments, the first abnormal object determination module 640 may include: a detection model acquisition module configured to acquire an abnormal object detection model associated with a sample object, the abnormal object detection model describing a relationship between sample object features related to the sample object and abnormal objects; and a second abnormal object determination module configured to determine whether the object is an abnormal object based on the abnormal object detection model and the object feature.
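The detection step, applying a model learned from sample-object features to a new object feature, might look like the sketch below. The patent does not fix a model family; the logistic (sigmoid) scorer, the pre-trained `weights`/`bias` parameters, and the 0.5 threshold are assumptions made for illustration.

```python
import math

def is_abnormal(object_feature, weights, bias, threshold=0.5):
    """Score an object feature with a pre-trained linear model and flag
    the object as abnormal when the estimated probability crosses the
    threshold."""
    z = sum(w * x for w, x in zip(weights, object_feature)) + bias
    prob = 1.0 / (1.0 + math.exp(-z))  # logistic link maps score to (0, 1)
    return prob >= threshold
```

In practice the `weights` and `bias` would come from fitting on labeled sample objects, so the model encodes the relationship between sample object features and the abnormal label.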
In some embodiments, the node characteristics of the plurality of nodes may be determined based on at least one of: the type of operation associated with the object, the scenario to which the operation associated with the object relates, and the time at which the operation associated with the object occurred.
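One plausible encoding of the three signals listed above (operation type, operation scenario, and time of occurrence) into a node feature vector is sketched below. The category vocabularies and the four time-of-day buckets are illustrative assumptions, not values fixed by the disclosure.

```python
OP_TYPES = ["login", "post", "purchase"]   # assumed operation-type vocabulary
SCENARIOS = ["web", "mobile_app", "api"]   # assumed scenario vocabulary

def node_feature(op_type, scenario, hour_of_day):
    """One-hot encode the categorical signals and bucket the time of
    occurrence into four coarse periods of the day."""
    op_vec = [1.0 if op_type == t else 0.0 for t in OP_TYPES]
    sc_vec = [1.0 if scenario == s else 0.0 for s in SCENARIOS]
    bucket = hour_of_day // 6  # night / morning / afternoon / evening
    time_vec = [1.0 if bucket == b else 0.0 for b in range(4)]
    return op_vec + sc_vec + time_vec
```

Features built this way can serve directly as the per-node vectors consumed by the aggregation and detection stages.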
In some embodiments, the identification of the object may include at least one of: an account number associated with the object, a device associated with the object, or an IP address associated with the object.
In some embodiments, the first data acquisition module 610 may include a second data acquisition module configured to acquire the behavior data related to the object in response to an access request of the object.
FIG. 6 illustrates a schematic block diagram of an example electronic device 600 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the device 600 includes a computing unit 601, which can perform various appropriate actions and processes according to a computer program stored in a read-only memory (ROM) 602 or a computer program loaded from a storage unit 608 into a random access memory (RAM) 603. The RAM 603 can also store various programs and data required for the operation of the device 600. The computing unit 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
A number of components in the device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, a mouse, or the like; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the device 600 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The computing unit 601 may be any of various general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the computing unit 601 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any appropriate processor, controller, or microcontroller. The computing unit 601 performs the various methods and processes described above, such as the process 200. For example, in some embodiments, the process 200 may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into the RAM 603 and executed by the computing unit 601, one or more steps of the process 200 described above may be performed. Alternatively, in other embodiments, the computing unit 601 may be configured to perform the process 200 in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be realized in digital electronic circuitry, integrated circuitry, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on a chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may be implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose, and which receives data and instructions from, and transmits data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback), and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which the user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include local area networks (LANs), wide area networks (WANs), and the Internet.
The computer system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server (also known as a cloud computing server or cloud host), a host product in a cloud computing service system that addresses the drawbacks of traditional physical hosts and VPS ("Virtual Private Server") services, namely high management difficulty and weak service scalability. The server may also be a server of a distributed system, or a server that incorporates a blockchain.
It should be understood that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved; no limitation is imposed herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.

Claims (19)

1. A method of determining an abnormal object, the method comprising:
acquiring behavior data related to an object;
building a relationship graph of the object based on the behavior data, wherein the relationship graph comprises a plurality of nodes representing identifications related to the object and a plurality of edges representing relationships among the plurality of nodes, the edges represent interaction relationships among the identifications, and the nodes have corresponding node features for representing behavior patterns of the object;
determining an object feature of the object based on a node feature of at least one of the plurality of nodes; and
determining whether the object is an abnormal object based on the object features.
2. The method of claim 1, wherein the plurality of nodes comprises nodes representing an identification of at least one other object.
3. The method of claim 1 or 2, wherein determining the object feature of the object based on the node feature of at least one of the plurality of nodes comprises:
determining a target node in the relationship graph, the target node representing an account associated with the object;
determining the object feature based on a target node feature of the target node and a first node feature of at least one first neighbor node adjacent to the target node.
4. The method of claim 3, wherein determining the object feature based on the target node feature of the target node and a first node feature of at least one first neighbor node adjacent to the target node comprises:
determining a second node feature of at least one second neighbor node adjacent to the at least one first neighbor node;
combining the first node feature with the second node feature to determine a third node feature; and
combining the target node feature and the third node feature as the object feature.
5. The method of claim 1 or 2, wherein determining whether the object is an abnormal object based on the object feature comprises:
obtaining an abnormal object detection model associated with a sample object, the abnormal object detection model describing a relationship between sample object features related to the sample object and an abnormal object; and
determining whether the object is an abnormal object based on the abnormal object detection model and the object features.
6. The method of claim 1 or 2, wherein the node characteristics of the plurality of nodes are determined based on at least one of: a type of operation associated with the object, a scenario to which the operation associated with the object relates, and a time at which the operation associated with the object occurs.
7. The method according to claim 1 or 2, wherein the identification of the object comprises at least one of:
an account number associated with the object,
a device associated with the object, or
An IP address associated with the object.
8. The method of claim 1 or 2, wherein obtaining behavioral data related to the subject comprises:
acquiring the behavior data related to the object in response to an access request of the object.
9. An apparatus to determine an abnormal object, the apparatus comprising:
a first data acquisition module configured to acquire behavior data related to an object;
a relationship graph building module configured to build a relationship graph of the object based on the behavior data, the relationship graph including a plurality of nodes representing identifications related to the object and a plurality of edges representing relationships between the plurality of nodes, the edges representing interaction relationships between the identifications, the nodes having corresponding node characteristics for representing behavior patterns of the object;
a first object feature determination module configured to determine an object feature of the object based on a node feature of at least one node of the plurality of nodes; and
a first abnormal object determination module configured to determine whether the object is an abnormal object based on the object feature.
10. The apparatus of claim 9, wherein the plurality of nodes comprises nodes representing an identification of at least one other object.
11. The apparatus of claim 9 or 10, wherein the first object feature determination module comprises:
a target node determination module configured to determine a target node in the relationship graph, the target node representing an account associated with the object;
a second object feature determination module configured to determine the object feature based on a target node feature of the target node and a first node feature of at least one first neighbor node adjacent to the target node.
12. The apparatus of claim 11, wherein the second object feature determination module comprises:
a neighbor node feature determination module configured to determine a second node feature of at least one second neighbor node adjacent to the at least one first neighbor node;
a feature combination module configured to combine the first node feature with the second node feature to determine a third node feature; and
a third object feature determination module configured to combine the target node feature and the third node feature as the object feature.
13. The apparatus of claim 9 or 10, wherein the first abnormal object determination module comprises:
a detection model acquisition module configured to acquire an abnormal object detection model associated with a sample object, the abnormal object detection model describing a relationship between a sample object feature related to the sample object and an abnormal object; and
a second abnormal object determination module configured to determine whether the object is an abnormal object based on the abnormal object detection model and the object feature.
14. The apparatus according to claim 9 or 10, wherein the node characteristics of the plurality of nodes are determined based on at least one of: a type of operation associated with the object, a scenario to which the operation associated with the object relates, and a time at which the operation associated with the object occurs.
15. The apparatus according to claim 9 or 10, wherein the identification of the object comprises at least one of:
an account number associated with the object,
a device associated with the object, or
An IP address associated with the object.
16. The apparatus of claim 9 or 10, wherein the first data acquisition module comprises:
a second data acquisition module configured to acquire the behavior data related to the object in response to an access request of the object.
17. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-8.
18. A non-transitory computer-readable storage medium having stored thereon computer instructions for causing a computer to perform the method of any one of claims 1-8.
19. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1-8.
CN202110287780.2A 2021-03-17 2021-03-17 Method, apparatus, device, medium and program product for determining abnormal object Active CN113010896B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110287780.2A CN113010896B (en) 2021-03-17 2021-03-17 Method, apparatus, device, medium and program product for determining abnormal object

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110287780.2A CN113010896B (en) 2021-03-17 2021-03-17 Method, apparatus, device, medium and program product for determining abnormal object

Publications (2)

Publication Number Publication Date
CN113010896A true CN113010896A (en) 2021-06-22
CN113010896B CN113010896B (en) 2023-10-03

Family

ID=76409393

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110287780.2A Active CN113010896B (en) 2021-03-17 2021-03-17 Method, apparatus, device, medium and program product for determining abnormal object

Country Status (1)

Country Link
CN (1) CN113010896B (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019114344A1 (en) * 2017-12-15 2019-06-20 阿里巴巴集团控股有限公司 Graphical structure model-based method for prevention and control of abnormal accounts, and device and equipment
CN110110093A (en) * 2019-04-08 2019-08-09 深圳众赢维融科技有限公司 A kind of recognition methods, device, electronic equipment and the storage medium of knowledge based map
CN110348215A (en) * 2019-07-16 2019-10-18 深圳众赢维融科技有限公司 Exception object recognition methods, device, electronic equipment and medium
CN112398819A (en) * 2020-11-02 2021-02-23 杭州海康威视数字技术股份有限公司 Method and device for recognizing abnormality

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
S. KLEVTSOV: "Tracking Significant Changes in a Technical Parameter by the Example of Acceleration of a Moving Dynamic Object", IEEE Xplore *
JIA Zhonghao; GU Tianlong; BIN Chenzhong; CHANG Liang; ZHANG Weitao; ZHU Guiming: "Scenic Spot Recommendation Based on Feature Learning of a Tourism Knowledge Graph", CAAI Transactions on Intelligent Systems, no. 03

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113254672A (en) * 2021-06-23 2021-08-13 中国平安人寿保险股份有限公司 Abnormal account identification method, system, equipment and readable storage medium
CN113254672B (en) * 2021-06-23 2023-07-18 中国平安人寿保险股份有限公司 Method, system, equipment and readable storage medium for identifying abnormal account
CN113553370A (en) * 2021-07-27 2021-10-26 百度在线网络技术(北京)有限公司 Abnormality detection method, abnormality detection device, electronic device, and readable storage medium
CN113610521A (en) * 2021-07-27 2021-11-05 胜斗士(上海)科技技术发展有限公司 Method and apparatus for detecting anomalies in behavioral data
CN113553370B (en) * 2021-07-27 2023-07-21 百度在线网络技术(北京)有限公司 Abnormality detection method, abnormality detection device, electronic device, and readable storage medium
CN113904943A (en) * 2021-09-26 2022-01-07 北京百度网讯科技有限公司 Account detection method and device, electronic equipment and storage medium
CN113904943B (en) * 2021-09-26 2024-04-05 北京百度网讯科技有限公司 Account detection method and device, electronic equipment and storage medium
CN114422267A (en) * 2022-03-03 2022-04-29 北京天融信网络安全技术有限公司 Flow detection method, device, equipment and medium
CN114422267B (en) * 2022-03-03 2024-02-06 北京天融信网络安全技术有限公司 Flow detection method, device, equipment and medium

Also Published As

Publication number Publication date
CN113010896B (en) 2023-10-03

Similar Documents

Publication Publication Date Title
CN113010896A (en) Method, apparatus, device, medium and program product for determining an abnormal object
US11822670B2 (en) Security risk assessment and control for code
US9727723B1 (en) Recommendation system based approach in reducing false positives in anomaly detection
US10999311B2 (en) Risk score generation for assets of an enterprise system utilizing user authentication activity
US10165003B2 (en) Identifying an imposter account in a social network
US20210126936A1 (en) Predicting vulnerabilities affecting assets of an enterprise system
US10380590B2 (en) Transaction authentication based on metadata
US20200067980A1 (en) Increasing security of network resources utilizing virtual honeypots
US20150172096A1 (en) System alert correlation via deltas
Zhang et al. A trust model stemmed from the diffusion theory for opinion evaluation
CN111931048B (en) Artificial intelligence-based black product account detection method and related device
US10505963B1 (en) Anomaly score generation based on adaptive clustering of user location
CN115883187A (en) Method, device, equipment and medium for identifying abnormal information in network traffic data
CN110599278B (en) Method, apparatus, and computer storage medium for aggregating device identifiers
CN113312560A (en) Group detection method and device and electronic equipment
US20190132306A1 (en) User authentication based on predictive applications
CN116451210A (en) Rights recovery method, device, equipment and storage medium
CN115333783A (en) API call abnormity detection method, device, equipment and storage medium
US9519864B1 (en) Method and system for identifying dependent components
CN113642919A (en) Risk control method, electronic device, and storage medium
US11012463B2 (en) Predicting condition of a host for cybersecurity applications
CN113591088B (en) Identification recognition method and device and electronic equipment
US11930000B2 (en) Detection of anomalous authentications
CN116112245A (en) Attack detection method, attack detection device, electronic equipment and storage medium
CN117421681A (en) Abnormal user detection method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant