CN118154292A - Money laundering identification method based on BiLSTM and graph convolutional neural network combination - Google Patents

Money laundering identification method based on BiLSTM and graph convolutional neural network combination

Info

Publication number
CN118154292A
Authority
CN
China
Prior art keywords
node
graph
data
vector
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410340372.2A
Other languages
Chinese (zh)
Inventor
吴万文
黄靛
付婧斐
朱小宝
胡正
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanchang Hangkong University
Original Assignee
Nanchang Hangkong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanchang Hangkong University
Priority to CN202410340372.2A
Publication of CN118154292A
Legal status: Pending

Classifications

    • G06Q40/02 Banking, e.g. interest calculation or account maintenance
    • G06F18/213 Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F18/24 Classification techniques
    • G06N3/042 Knowledge-based neural networks; Logical representations of neural networks
    • G06N3/0442 Recurrent networks characterised by memory or gating, e.g. long short-term memory [LSTM] or gated recurrent units [GRU]
    • G06N3/045 Combinations of networks
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G06N3/08 Learning methods
    • G06Q40/04 Trading; Exchange, e.g. stocks, commodities, derivatives or currency exchange


Abstract

The invention relates to the technical field of data processing and provides a money laundering identification method based on a combination of BiLSTM and a graph convolutional neural network, which comprises the following steps: step S1, truncating or zero-padding each user's transaction time-series data to a common length and feeding the processed data into a BiLSTM network; step S2, extracting statistical quantities from each user's transaction data and assembling them, as elements, into a vector; step S3, concatenating the vectors output by step S1 and step S2 into a new vector; step S4, constructing a graph with users as nodes and payment flows as edges, using the vector output by step S3 as the node attribute features of a graph convolutional neural network, then training the network and outputting the classification result for each node. With little training time overhead, the invention better captures the dynamic characteristics of the transaction network and achieves a recall rate of up to 0.963; rich node features can be extracted for finer-grained pattern recognition.

Description

Money laundering identification method based on BiLSTM and graph convolutional neural network combination
Technical Field
The invention belongs to the technical field of data processing, and particularly relates to a money laundering identification method based on a combination of BiLSTM and a graph convolutional neural network.
Background
Traditional anti-money-laundering techniques identify money laundering in individual transactions using logistic regression or clustering algorithms. However, today's transaction networks are increasingly large, and money laundering behaviour tends to be complex and collective, so traditional pattern recognition can no longer meet current requirements for identifying money laundering.
The advent of graph convolutional networks addresses the difficulty of capturing data correlations and structural features in the anti-money-laundering field, but their application to this field still faces several problems: (1) Most graph convolutional network research targets static graphs, whereas in digital currency transaction research the graph structure changes dynamically; this dynamics is one of the characteristics that must be captured, and a traditional static graph convolutional network cannot capture the dynamic characteristics of the transaction network well, so the recall rate stays low. (2) Existing dynamic graph convolutional networks such as EvolveGCN incur a large time overhead during training. (3) Increasingly collective money laundering behaviour requires richer node features to enable finer-grained pattern recognition. Therefore, a money laundering identification method based on a combination of BiLSTM and a graph convolutional neural network is proposed.
Disclosure of Invention
The invention aims to provide a money laundering identification method based on a combination of BiLSTM and a graph convolutional neural network, so as to solve the problems described in the background art.
In order to achieve the above purpose, the present invention provides the following technical solutions:
A money laundering identification method based on a combination of BiLSTM and a graph convolutional neural network comprises the following steps:
Step S1: truncate or zero-pad each user's transaction time-series data, the target sequence length being the average of all data lengths, and feed the processed data into a BiLSTM network;
Step S2: extract statistical quantities from each user's transaction data and assemble them, as elements, into a vector;
Step S3: concatenate the vectors output by step S1 and step S2 into a new vector;
Step S4: construct a graph with users as nodes and payment flows as edges, use the vector output by step S3 as the node attribute features of a graph convolutional neural network, then train the network and output the classification result for each node.
Further, the specific operations of step S1 are as follows:
Suppose the transaction data of user $v_i$ is $T_i = [t_1, t_2, \ldots, t_{a_i}]$, where $a_i$ is the number of transaction records of user $i$. The raw transaction data of each user is truncated or padded to the target length $\bar{a}$ (the average of all sequence lengths $a_i$), giving the processed data
$$\hat{T}_i = f_p(T_i),$$
where $f_p$ is the truncate-or-pad function: when the amount of a user's transaction data is greater than $\bar{a}$, the excess data is cut off; when it is less than $\bar{a}$, the missing entries are padded with 0 until the amount of transaction data equals $\bar{a}$; when it equals $\bar{a}$, the raw transaction data is kept unchanged.
The processed data $\hat{T}_i$ is fed into the BiLSTM network $f_R$; the vector output by the last hidden layer is taken as the output and, after a linear layer and an activation function, yields a vector $V_1$ of dimension $b$:
$$V_1 = \sigma\big(\mathrm{linear}(f_R(\hat{T}_i))\big) = [x_1, x_2, \ldots, x_b] \in \mathbb{R}^b,$$
where $V_1$ is obtained from the truncated-or-padded transaction data.
Further, the specific operations of step S2 are as follows:
Statistical features are extracted from the raw transaction data of each user, and the statistics are assembled into a vector $V_2 = [y_1, y_2, \ldots, y_d] \in \mathbb{R}^d$.
Further, the vector output by step S3 is the feature vector $V_i$ of each node $v_i$, defined as follows:
$$V_i = \mathrm{concat}(V_1, V_2) = [x_1, x_2, \ldots, x_b, y_1, y_2, \ldots, y_d].$$
Further, the specific operations of step S4 are as follows:
Define the graph $G = (V, E)$, where $V$ is the set of all nodes $v_i$ in the graph, here the set of all users, i.e. $v_i \in V$; $E$ is the set of all edges $e_{ij}$, here the set of all payment flows, i.e. $e_{ij} = (v_i, v_j) \in E$ denotes a transfer or payment from $v_i$ to $v_j$. The adjacency matrix of the graph is denoted $A$, with $A_{ij} = 1$ if $(v_i, v_j) \in E$ and $A_{ij} = 0$ otherwise.
The neighbour set of node $v_i$ is $N(v_i) = \{u_i \in V \mid (v_i, u_i) \in E\}$, where $u_i$ denotes a node directly connected to $v_i$, i.e. a neighbour of node $v_i$.
The graph convolutional neural network aggregates information from neighbouring nodes by stacking convolution layers; the layer-2 embedding vector $h_i$ of node $v_i$ is obtained from
$$H^{(2)} = \sigma\!\left(\tilde{D}^{-\frac{1}{2}} \tilde{A}\, \tilde{D}^{-\frac{1}{2}} H^{(1)} W^{(1)}\right),$$
where $\tilde{A} = A + I_N$ is the adjacency matrix of the undirected graph with self-loops added and $I_N$ is the identity matrix; $\tilde{D}$ is a diagonal matrix with $\tilde{D}_{ii} = \sum_j \tilde{A}_{ij}$, i.e. the degree of node $v_i$; $\sigma(\cdot)$ is the ReLU activation function, defined as $\mathrm{ReLU}(x) = \max(0, x)$; $W^{(l)} \in \mathbb{R}^{F \times F'}$ is the layer-wise linear transformation matrix optimized during training, where $F$ and $F'$ are the node feature dimensions of layers 1 and 2 respectively.
For a single node $v_i$, the node update formula can be written as
$$h_i^{(l+1)} = \sigma\!\left(\sum_{v_j \in N(v_i) \cup \{v_i\}} \frac{1}{\sqrt{\tilde{D}_{ii}\,\tilde{D}_{jj}}}\, h_j^{(l)} W^{(l)}\right).$$
The graph convolutional neural network outputs for each node $v_i$ a feature vector $h_i$ of dimension 32; finally, a linear layer and the log_softmax activation function produce the vector $V_{out}$ of two-class probabilities for the node:
$$V_{out} = \mathrm{log\_softmax}(\mathrm{linear}(h_i)).$$
Compared with the prior art, the invention has the following beneficial effects:
The money laundering identification method based on the combination of BiLSTM and a graph convolutional neural network adopts only a single convolution layer in its structure, can better capture the dynamic characteristics of the transaction network with little training time overhead, and achieves a recall rate of up to 0.963; before the graph convolutional neural network is trained, rich node features are extracted to enable finer-grained pattern recognition.
Drawings
Fig. 1 is a schematic structural view of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Specific implementations of the invention are described in detail below in connection with specific embodiments.
As shown in Fig. 1, the money laundering identification method based on the combination of BiLSTM and a graph convolutional neural network is described below using real bank transaction flow data as an example; the method comprises the following steps:
Step S1: truncate or zero-pad each user's transaction time-series data, the target sequence length being the average of all data lengths, and feed the processed data into a BiLSTM network.
The specific operations of step S1 are as follows:
Suppose the transaction data of user $v_i$ is $T_i = [t_1, t_2, \ldots, t_{a_i}]$, where $a_i$ is the number of transaction records of user $i$. To facilitate the subsequent time-series processing, the raw transaction data of each user is truncated or padded to the target length $\bar{a}$ (the average of all sequence lengths $a_i$), giving the processed data
$$\hat{T}_i = f_p(T_i),$$
where $f_p$ is the truncate-or-pad function: when the amount of a user's transaction data is greater than $\bar{a}$, the excess data is cut off; when it is less than $\bar{a}$, the missing entries are padded with 0 until the amount of transaction data equals $\bar{a}$; when it equals $\bar{a}$, the raw transaction data is kept unchanged.
The processed data $\hat{T}_i$ is then fed into the BiLSTM network $f_R$; the vector output by the last hidden layer is taken as the output and, after a linear layer and an activation function, yields a vector $V_1$ of dimension $b$:
$$V_1 = \sigma\big(\mathrm{linear}(f_R(\hat{T}_i))\big) = [x_1, x_2, \ldots, x_b] \in \mathbb{R}^b.$$
Because $V_1$ is obtained from the truncated-or-padded data, it carries no information about the data that was cut off; to compensate for this loss, statistical features are additionally extracted from each user's raw transaction data in step S2.
Step S2: extract statistical quantities from each user's transaction data and assemble them, as elements, into a vector.
The specific operations of step S2 are as follows:
Statistical features are extracted from the raw transaction data of each user. Based on experience, the extracted statistics are: for the $c$ counterparty users with whom user $i$ transacts most frequently, the transaction time, the transaction amount, the mean, median, maximum, minimum and standard deviation of the transaction amount, the proportion of the transaction amount to the total transaction amount, the proportion of transactions whose amount is a whole multiple of ten to the total number of transactions, the number of transaction locations, and the proportion of each credit/debit identifier (payment out, payment in, loan, borrowing) to the total number of transactions; and, for user $i$ itself, the total number of transaction counterparties, the number of transaction card numbers, the standard deviation of the transaction times, the mean transaction amount, the mean transaction balance, the proportion of transactions whose amount is a whole multiple of ten to the total number of transactions, and so on. These statistics-based quantities form the vector $V_2 = [y_1, y_2, \ldots, y_d] \in \mathbb{R}^d$.
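As an illustration of step S2, the sketch below computes a few of the listed statistics per user with pandas; the DataFrame column names (user_id, amount, timestamp, counterparty, card_no) are hypothetical, and the full feature set described above is larger.

```python
import pandas as pd


def user_statistics(df: pd.DataFrame) -> pd.DataFrame:
    """Per-user statistics vector V2 (a subset of the features listed in the text)."""
    feats = df.groupby("user_id").agg(
        n_tx=("amount", "size"),
        amt_mean=("amount", "mean"),
        amt_median=("amount", "median"),
        amt_max=("amount", "max"),
        amt_min=("amount", "min"),
        amt_std=("amount", "std"),
        n_counterparties=("counterparty", "nunique"),
        n_cards=("card_no", "nunique"),
        # "timestamp" is assumed to be a datetime64 column; its spread is taken in nanoseconds.
        time_std=("timestamp", lambda s: s.astype("int64").std()),
    )
    # Proportion of transactions whose amount is a whole multiple of ten.
    feats["round_amount_ratio"] = (
        df.assign(is_round=(df["amount"] % 10 == 0)).groupby("user_id")["is_round"].mean()
    )
    return feats.fillna(0.0)  # each row is the statistics-based vector V2 for one user (node)
```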
Step S3: concatenate the vectors output by step S1 and step S2 into a new vector.
The vector output by step S3 is the feature vector $V_i$ of each node $v_i$, defined as follows:
$$V_i = \mathrm{concat}(V_1, V_2) = [x_1, x_2, \ldots, x_b, y_1, y_2, \ldots, y_d].$$
Step S4: construct a graph with users as nodes and payment flows as edges, use the vector output by step S3 as the node attribute features of a graph convolutional neural network, then train the network and output the node classification result.
The specific operations of step S4 are as follows:
Define the graph $G = (V, E)$, where $V$ is the set of all nodes $v_i$ in the graph, here the set of all users, i.e. $v_i \in V$; $E$ is the set of all edges $e_{ij}$, here the set of all payment flows, i.e. $e_{ij} = (v_i, v_j) \in E$ denotes a transfer or payment from $v_i$ to $v_j$. The adjacency matrix of the graph is denoted $A$, with $A_{ij} = 1$ if $(v_i, v_j) \in E$ and $A_{ij} = 0$ otherwise.
The neighbour set of node $v_i$ is $N(v_i) = \{u_i \in V \mid (v_i, u_i) \in E\}$, where $u_i$ denotes a node directly connected to $v_i$: if there is an edge $(v_i, u_i) \in E$ between node $v_i$ and node $u_i$, then $u_i$ is a neighbour of $v_i$, and $N(v_i)$ is the set of all nodes directly connected to $v_i$.
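For step S4, the following sketch builds the adjacency matrix from an edge list of payment flows and computes the self-looped, symmetrically normalised matrix used in the convolution formula below; treating the graph as undirected and using a dense tensor representation are illustrative assumptions, not requirements of the method.

```python
import torch


def build_normalised_adjacency(edges: list[tuple[int, int]], num_nodes: int) -> torch.Tensor:
    """Dense A~ = A + I normalised as D~^(-1/2) A~ D~^(-1/2), built from (payer, payee) edges."""
    a = torch.zeros(num_nodes, num_nodes)
    for payer, payee in edges:            # one edge e_ij = (v_i, v_j) per payment flow
        a[payer, payee] = 1.0
        a[payee, payer] = 1.0             # treat the transaction graph as undirected
    a_tilde = a + torch.eye(num_nodes)    # add self-loops: A~ = A + I_N
    d_inv_sqrt = a_tilde.sum(dim=1).pow(-0.5)   # from D~_ii = sum_j A~_ij
    return d_inv_sqrt.unsqueeze(1) * a_tilde * d_inv_sqrt.unsqueeze(0)
```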
The graph convolutional neural network aggregates information from neighbouring nodes; by stacking convolution layers it captures information from at most $k$ hops around each node, where $k$ is the number of stacked GCN layers and is taken as 2 here. The layer-2 embedding vector $h_i$ of node $v_i$ is obtained from
$$H^{(2)} = \sigma\!\left(\tilde{D}^{-\frac{1}{2}} \tilde{A}\, \tilde{D}^{-\frac{1}{2}} H^{(1)} W^{(1)}\right),$$
where $\tilde{A} = A + I_N$ is the adjacency matrix of the undirected graph with self-loops added, which allows a node's own representation to be incorporated when it is updated, and $I_N$ is the identity matrix; $\tilde{D}$ is a diagonal matrix with $\tilde{D}_{ii} = \sum_j \tilde{A}_{ij}$, i.e. the degree of node $v_i$; $\sigma(\cdot)$ is the ReLU activation function, defined as $\mathrm{ReLU}(x) = \max(0, x)$; $W^{(l)} \in \mathbb{R}^{F \times F'}$ is the layer-wise linear transformation matrix optimized during training, where $F$ and $F'$ are the node feature dimensions of layers 1 and 2 respectively.
For a single node $v_i$, the node update formula can be written as
$$h_i^{(l+1)} = \sigma\!\left(\sum_{v_j \in N(v_i) \cup \{v_i\}} \frac{1}{\sqrt{\tilde{D}_{ii}\,\tilde{D}_{jj}}}\, h_j^{(l)} W^{(l)}\right).$$
The graph convolutional neural network outputs for each node $v_i$ a feature vector $h_i$ of dimension 32; finally, a linear layer and the log_softmax activation function produce the vector $V_{out}$ of two-class probabilities for the node:
$$V_{out} = \mathrm{log\_softmax}(\mathrm{linear}(h_i)).$$
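A minimal sketch of the graph-convolution stage under the formulas above follows; the number of convolution layers, the hidden dimension of 32 for $h_i$, and the training details shown in the comments are illustrative assumptions rather than the exact patented configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNClassifier(nn.Module):
    """One graph convolution, sigma(D~^-1/2 A~ D~^-1/2 X W), followed by linear + log_softmax."""

    def __init__(self, in_dim: int, hidden_dim: int = 32, num_classes: int = 2):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hidden_dim, bias=False)  # layer-wise weight matrix W
        self.out = nn.Linear(hidden_dim, num_classes)        # classification head

    def forward(self, x: torch.Tensor, a_norm: torch.Tensor) -> torch.Tensor:
        h = F.relu(self.w1(a_norm @ x))              # node embeddings h_i of dimension 32
        return F.log_softmax(self.out(h), dim=1)     # V_out: per-node log class probabilities


# Training sketch (x = stacked [V1, V2] node features from step S3, a_norm from the
# build_normalised_adjacency sketch above; labels and train_mask are assumed given):
#   model = GCNClassifier(in_dim=x.size(1))
#   loss = F.nll_loss(model(x, a_norm)[train_mask], labels[train_mask])
```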
The working principle of the invention is as follows:
The money laundering identification method based on the combination of BiLSTM and a graph convolutional neural network adopts only a single convolution layer in its structure to reduce the large time overhead incurred when applying a graph convolutional network. Features of each user's transaction time-series data are extracted by the bidirectional LSTM neural network and combined with the statistical quantities extracted from the transaction flow data by an empirical-rule method; the combined vector is used as the node attribute features of the graph convolutional neural network. By combining BiLSTM-based and statistics-based feature extraction, the method captures the dynamic characteristics of the transaction network well, obtains richer node features, and supports finer-grained pattern recognition.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art can make modifications and improvements without departing from the spirit of the present invention, and, provided they do not affect the effect of implementing the invention or the utility of the patent, these should also be considered as falling within the scope of the present invention.

Claims (5)

1. A money laundering identification method based on a combination of BiLSTM and a graph convolutional neural network, characterized by comprising the following steps:
Step S1: truncate or zero-pad each user's transaction time-series data, the target sequence length being the average of all data lengths, and feed the processed data into a BiLSTM network;
Step S2: extract statistical quantities from each user's transaction data and assemble them, as elements, into a vector;
Step S3: concatenate the vectors output by step S1 and step S2 into a new vector;
Step S4: construct a graph with users as nodes and payment flows as edges, use the vector output by step S3 as the node attribute features of a graph convolutional neural network, then train the network and output the classification result for each node.
2. The money laundering identification method based on a combination of BiLSTM and a graph convolutional neural network according to claim 1, wherein the specific operations of step S1 are as follows:
suppose the transaction data of user $v_i$ is $T_i = [t_1, t_2, \ldots, t_{a_i}]$, where $a_i$ is the number of transaction records of user $i$; the raw transaction data of each user is truncated or padded to the target length $\bar{a}$, giving the processed data
$$\hat{T}_i = f_p(T_i),$$
where $f_p$ is the truncate-or-pad function: when the amount of a user's transaction data is greater than $\bar{a}$, the excess data is cut off; when it is less than $\bar{a}$, the missing entries are padded with 0 until the amount of transaction data equals $\bar{a}$; when it equals $\bar{a}$, the raw transaction data is kept unchanged;
the processed data $\hat{T}_i$ is fed into the BiLSTM network $f_R$; the vector output by the last hidden layer is taken as the output and, after a linear layer and an activation function, yields a vector $V_1$ of dimension $b$:
$$V_1 = \sigma\big(\mathrm{linear}(f_R(\hat{T}_i))\big) = [x_1, x_2, \ldots, x_b] \in \mathbb{R}^b,$$
where $V_1$ is obtained from the truncated-or-padded transaction data.
3. The money laundering identification method based on a combination of BiLSTM and a graph convolutional neural network according to claim 2, wherein the specific operations of step S2 are as follows:
statistical features are extracted from the raw transaction data of each user, and the statistics are assembled into a vector $V_2 = [y_1, y_2, \ldots, y_d] \in \mathbb{R}^d$.
4. The money laundering identification method based on a combination of BiLSTM and a graph convolutional neural network according to claim 3, wherein the vector output by step S3 is the feature vector $V_i$ of each node $v_i$, defined as follows:
$$V_i = \mathrm{concat}(V_1, V_2) = [x_1, x_2, \ldots, x_b, y_1, y_2, \ldots, y_d].$$
5. The money laundering identification method based on a combination of BiLSTM and a graph convolutional neural network according to claim 4, wherein the specific operations of step S4 are as follows:
define the graph $G = (V, E)$, where $V$ is the set of all nodes $v_i$ in the graph, here the set of all users, i.e. $v_i \in V$; $E$ is the set of all edges $e_{ij}$, here the set of all payment flows, i.e. $e_{ij} = (v_i, v_j) \in E$ denotes a transfer or payment from $v_i$ to $v_j$; the adjacency matrix of the graph is denoted $A$, with $A_{ij} = 1$ if $(v_i, v_j) \in E$ and $A_{ij} = 0$ otherwise;
the neighbour set of node $v_i$ is $N(v_i) = \{u_i \in V \mid (v_i, u_i) \in E\}$, where $u_i$ denotes a node directly connected to $v_i$, i.e. a neighbour of node $v_i$;
the graph convolutional neural network aggregates information from neighbouring nodes by stacking convolution layers; the layer-2 embedding vector $h_i$ of node $v_i$ is obtained from
$$H^{(2)} = \sigma\!\left(\tilde{D}^{-\frac{1}{2}} \tilde{A}\, \tilde{D}^{-\frac{1}{2}} H^{(1)} W^{(1)}\right),$$
where $\tilde{A} = A + I_N$ is the adjacency matrix of the undirected graph with self-loops added and $I_N$ is the identity matrix; $\tilde{D}$ is a diagonal matrix with $\tilde{D}_{ii} = \sum_j \tilde{A}_{ij}$, i.e. the degree of node $v_i$; $\sigma(\cdot)$ is the ReLU activation function, defined as $\mathrm{ReLU}(x) = \max(0, x)$; $W^{(l)} \in \mathbb{R}^{F \times F'}$ is the layer-wise linear transformation matrix optimized during training, where $F$ and $F'$ are the node feature dimensions of layers 1 and 2 respectively;
for a single node $v_i$, the node update formula can be written as
$$h_i^{(l+1)} = \sigma\!\left(\sum_{v_j \in N(v_i) \cup \{v_i\}} \frac{1}{\sqrt{\tilde{D}_{ii}\,\tilde{D}_{jj}}}\, h_j^{(l)} W^{(l)}\right);$$
the graph convolutional neural network outputs for each node $v_i$ a feature vector $h_i$ of dimension 32, and finally a linear layer and the log_softmax activation function produce the vector $V_{out}$ of two-class probabilities for the node:
$$V_{out} = \mathrm{log\_softmax}(\mathrm{linear}(h_i)).$$
CN202410340372.2A 2024-03-25 2024-03-25 Money laundering identification method based on BiLSTM and graph convolutional neural network combination Pending CN118154292A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410340372.2A CN118154292A (en) 2024-03-25 2024-03-25 Money laundering identification method based on BiLSTM and graph convolutional neural network combination


Publications (1)

Publication Number Publication Date
CN118154292A (en) 2024-06-07

Family

ID=91298265

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410340372.2A Pending CN118154292A (en) 2024-03-25 2024-03-25 Money laundering identification method based on BiLSTM and graph convolutional neural network combination

Country Status (1)

Country Link
CN (1) CN118154292A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination