CN117437019A - Credit card overdue risk prediction method, apparatus, device, medium and program product - Google Patents

Credit card overdue risk prediction method, apparatus, device, medium and program product

Info

Publication number
CN117437019A
CN117437019A
Authority
CN
China
Prior art keywords
credit card
training
overdue
prediction
model
Prior art date
Legal status
Pending
Application number
CN202311000163.5A
Other languages
Chinese (zh)
Inventor
李泰丞
沙一菲
李冰洁
杨恺
Current Assignee
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202311000163.5A priority Critical patent/CN117437019A/en
Publication of CN117437019A publication Critical patent/CN117437019A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00 Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/03 Credit; Loans; Processing thereof
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Finance (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Accounting & Taxation (AREA)
  • Technology Law (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Marketing (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)

Abstract

The present disclosure provides a credit card overdue risk prediction method, apparatus, electronic device, medium, and computer program product, which can be used in the technical field of artificial intelligence. The credit card overdue risk prediction method comprises the following steps: acquiring credit card information; determining an overdue prediction value of the credit card according to the credit card information and a pre-trained credit card overdue prediction model, wherein the credit card overdue prediction model is obtained by training a restricted Boltzmann machine, the m training samples used for training the restricted Boltzmann machine are obtained by screening n first pre-training samples according to a convolutional neural network model, a self-attention network model and a recurrent neural network model, n is an integer greater than or equal to 1, and m is an integer greater than or equal to 1 and less than or equal to n; and identifying the credit card as a credit card having an overdue risk when the overdue prediction value is greater than a set overdue threshold.

Description

Credit card overdue risk prediction method, apparatus, device, medium and program product
Technical Field
The present disclosure relates to the field of artificial intelligence, and more particularly, to a credit card overdue risk prediction method, apparatus, electronic device, medium, and computer program product.
Background
A traditional credit card overdue risk early warning model generally relies on a single machine learning model trained on raw data of a single type to perform simple prediction and classification. In practice, however, the volume of credit card data is large and its distribution is complex, so the prediction accuracy of traditional credit card risk early warning methods is generally low.
Disclosure of Invention
In view of the above, the present disclosure provides a credit card overdue risk prediction method, apparatus, electronic device, computer readable storage medium and computer program product with high accuracy.
One aspect of the present disclosure provides a credit card overdue risk prediction method, including: acquiring credit card information; determining an overdue prediction value of the credit card according to the credit card information and a pre-trained credit card overdue prediction model, wherein the credit card overdue prediction model is obtained by training a restricted Boltzmann machine, the m training samples used for training the restricted Boltzmann machine are obtained by screening n first pre-training samples according to a convolutional neural network model, a self-attention network model and a recurrent neural network model, n is an integer greater than or equal to 1, and m is an integer greater than or equal to 1 and less than or equal to n; and identifying the credit card as a credit card having an overdue risk when the overdue prediction value is greater than a set overdue threshold.
According to the credit card overdue risk prediction method of the embodiments of the present disclosure, the overdue prediction value of a credit card can be determined according to the credit card information and the pre-trained credit card overdue prediction model, and the credit card can be identified as having an overdue risk when the overdue prediction value is greater than the set overdue threshold. The method can therefore effectively identify credit cards with overdue risk, so that a customer manager can be reminded to monitor and collect on those cards as a priority. The credit card overdue prediction model used in the method is trained with m training samples, which are obtained by screening n first pre-training samples according to a convolutional neural network model, a self-attention network model and a recurrent neural network model. The sample quality of the m training samples is therefore higher, and the model parameters trained with them, namely the mapping relation between credit card information and overdue prediction values, are more accurate. On this basis, the credit cards with overdue risk predicted by the credit card overdue prediction model of the present disclosure are identified more accurately.
In some embodiments, the step of pre-training the credit card overdue prediction model includes: determining the n first pre-training samples according to acquired first original data of the credit card, wherein the data features of the first original data include cardholder basic information, credit card consumption records, credit card repayment records and credit card application history data, each first pre-training sample includes sample features and a sample label, the sample features are determined according to the data features of the first original data, and the sample label is determined according to the credit card repayment records; screening the n first pre-training samples according to a convolutional neural network model, a self-attention network model and a recurrent neural network model to obtain the m training samples; adjusting initial model parameters set in the restricted Boltzmann machine according to the m training samples to obtain training model parameters; and applying the training model parameters as the model parameters of the credit card overdue prediction model.
In some embodiments, the step of screening the n first pre-training samples according to a convolutional neural network model, a self-attention network model and a recurrent neural network model to obtain the m training samples includes: predicting the overdue condition of each of the n first pre-training samples by using a pre-trained convolutional neural network model to obtain a first predicted credit result; predicting the overdue condition of each of the n first pre-training samples by using a pre-trained self-attention network model to obtain a second predicted credit result; predicting the overdue condition of each of the n first pre-training samples by using a pre-trained recurrent neural network model to obtain a third predicted credit result; and, when any two of the first, second and third predicted credit results of the same first pre-training sample are the same, screening that first pre-training sample as a training sample, so that the m training samples are screened out of the n first pre-training samples.
In some embodiments, the step of screening the n first pre-training samples according to a convolutional neural network model, a self-attention network model and a recurrent neural network model to obtain the m training samples includes: predicting the overdue condition of each of the n first pre-training samples by using a pre-trained convolutional neural network model to obtain a first predicted credit result; predicting the overdue condition of each of the n first pre-training samples by using a pre-trained self-attention network model to obtain a second predicted credit result; predicting the overdue condition of each of the n first pre-training samples by using a pre-trained recurrent neural network model to obtain a third predicted credit result; when any two of the first, second and third predicted credit results of the same first pre-training sample are the same, screening that first pre-training sample as a second pre-training sample, so that k second pre-training samples are screened out of the n first pre-training samples, wherein the two or more identical predicted credit results are used as the predicted credit result of the second pre-training sample, and k is an integer greater than or equal to 1 and less than or equal to n; and comparing the predicted credit result of each of the k second pre-training samples with the sample label of that second pre-training sample, screening the second pre-training sample as a training sample when the predicted credit result is consistent with the sample label, so that the m training samples are screened out of the k second pre-training samples.
In some embodiments, prior to the step of applying the training model parameters as the model parameters of the credit card overdue prediction model, the step of pre-training the credit card overdue prediction model further comprises: determining x first pre-verification samples according to acquired second original data of the credit card, wherein the data features of the second original data include cardholder basic information, credit card consumption records, credit card repayment records and credit card application history data, each first pre-verification sample includes verification sample features and a verification sample label, the verification sample features are determined according to the data features of the second original data, and the verification sample label is determined according to the credit card repayment records; screening the x first pre-verification samples according to a convolutional neural network model, a self-attention network model and a recurrent neural network model to obtain y verification samples; verifying the training model parameters according to the y verification samples; if the verification passes, applying the training model parameters as the model parameters of the credit card overdue prediction model; and if the verification does not pass, repeating the step of determining the n first pre-training samples according to the acquired first original data of the credit card until the verification passes.
In some embodiments, the step of determining the n first pre-training samples from the acquired first original data of the credit card comprises: performing data cleaning on the acquired n pieces of first original data; and performing feature engineering on the cleaned first original data to obtain the n first pre-training samples.
In some embodiments, the step of performing data cleaning on the acquired n pieces of first original data includes: performing missing value processing, abnormal value detection and filtering, duplicate data deletion and error checking on the acquired n pieces of first original data; and/or the step of performing feature engineering on the cleaned first original data to obtain the n first pre-training samples includes: performing feature normalization on the cleaned first original data, and performing dimension reduction on the feature-normalized first original data to obtain the n first pre-training samples.
Another aspect of the present disclosure provides a credit card overdue risk prediction apparatus, including: an acquisition module configured to acquire credit card information; a determining module configured to determine an overdue prediction value of the credit card according to the credit card information and a pre-trained credit card overdue prediction model, wherein the credit card overdue prediction model is obtained by training a restricted Boltzmann machine, the m training samples used for training the restricted Boltzmann machine are obtained by screening n first pre-training samples according to a convolutional neural network model, a self-attention network model and a recurrent neural network model, n is an integer greater than or equal to 1, and m is an integer greater than or equal to 1 and less than or equal to n; and an identification module configured to identify the credit card as a credit card having an overdue risk when the overdue prediction value is greater than a set overdue threshold.
Another aspect of the present disclosure provides an electronic device comprising one or more processors and one or more memories, wherein the memories are configured to store executable instructions that, when executed by the processors, implement the method as described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions that, when executed, are configured to implement a method as described above.
Another aspect of the present disclosure provides a computer program product comprising a computer program comprising computer executable instructions which, when executed, are for implementing a method as described above.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent from the following description of embodiments thereof with reference to the accompanying drawings in which:
FIG. 1 schematically illustrates an exemplary system architecture to which the methods and apparatuses according to embodiments of the present disclosure may be applied;
FIG. 2 schematically illustrates a flow chart of a credit card overdue risk prediction method according to an embodiment of the disclosure;
FIG. 3 schematically illustrates a flowchart of steps for pre-training a credit card overdue prediction model, according to an embodiment of the present disclosure;
FIG. 4 schematically illustrates a flowchart of steps for determining n first pre-training samples from first raw data of an acquired credit card, according to an embodiment of the disclosure;
FIG. 5 schematically illustrates a flowchart of steps for screening n first pre-training samples according to a convolutional neural network model, a self-attention network model, and a recurrent neural network model to obtain m training samples, in accordance with an embodiment of the present disclosure;
FIG. 6 schematically illustrates a flowchart of steps for screening n first pre-training samples according to a convolutional neural network model, a self-attention network model, and a recurrent neural network model to obtain m training samples, in accordance with an embodiment of the present disclosure;
FIG. 7 schematically illustrates a flowchart of steps for pre-training a credit card overdue prediction model, in accordance with an embodiment of the present disclosure;
FIG. 8 schematically illustrates a block diagram of a credit card overdue risk prediction apparatus according to an embodiment of the disclosure;
fig. 9 schematically illustrates a block diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is only exemplary and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the present disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. In addition, in the following description, descriptions of well-known structures and techniques are omitted so as not to unnecessarily obscure the concepts of the present disclosure.
In the technical solution of the present disclosure, the user information involved (including, but not limited to, user personal information, user image information, and user equipment information such as location information) and the data involved (including, but not limited to, data for analysis, stored data, and displayed data) are information and data authorized by the user or fully authorized by all parties. The collection, storage, use, processing, transmission, provision, disclosure and application of such data comply with the relevant laws, regulations and standards of the relevant countries and regions, necessary security measures are taken, public order and good customs are not violated, and corresponding operation entries are provided for the user to choose to authorize or refuse.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and/or the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
Where a formulation similar to "at least one of A, B or C, etc." is used, it should in general be interpreted in accordance with the ordinary understanding of one skilled in the art (e.g. "a system with at least one of A, B or C" would include, but not be limited to, systems with A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). The terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined by "first" or "second" may explicitly or implicitly include one or more of the described features.
A traditional credit card overdue risk early warning model generally relies on a single machine learning model trained on raw data of a single type to perform simple prediction and classification. In practice, however, the volume of credit card data is large and its distribution is complex, so the prediction accuracy of traditional credit card risk early warning methods is generally low.
Embodiments of the present disclosure provide a credit card overdue risk prediction method, apparatus, electronic device, computer readable storage medium, and computer program product. The credit card overdue risk prediction method comprises the following steps: acquiring credit card information; determining an overdue prediction value of the credit card according to the credit card information and a pre-trained credit card overdue prediction model, wherein the credit card overdue prediction model is obtained by training a restricted Boltzmann machine, the m training samples used for training the restricted Boltzmann machine are obtained by screening n first pre-training samples according to a convolutional neural network model, a self-attention network model and a recurrent neural network model, n is an integer greater than or equal to 1, and m is an integer greater than or equal to 1 and less than or equal to n; and identifying the credit card as a credit card having an overdue risk when the overdue prediction value is greater than the set overdue threshold.
It should be noted that the credit card overdue risk prediction method, apparatus, electronic device, computer readable storage medium and computer program product of the present disclosure may be used in the field of artificial intelligence technology, and may also be used in any field other than the field of artificial intelligence technology, such as the financial field; the field of application of the present disclosure is not limited herein.
FIG. 1 schematically illustrates an exemplary system architecture 100 in which credit card overdue risk prediction methods, apparatuses, electronic devices, computer-readable storage media, and computer program products may be applied in accordance with embodiments of the present disclosure. It should be noted that fig. 1 is only an example of a system architecture to which embodiments of the present disclosure may be applied to assist those skilled in the art in understanding the technical content of the present disclosure, but does not mean that embodiments of the present disclosure may not be used in other devices, systems, environments, or scenarios.
As shown in fig. 1, a system architecture 100 according to this embodiment may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
The user may interact with the server 105 via the network 104 using the terminal devices 101, 102, 103 to receive or send messages or the like. Various communication client applications, such as shopping class applications, web browser applications, search class applications, instant messaging tools, mailbox clients, social platform software, etc. (by way of example only) may be installed on the terminal devices 101, 102, 103.
The terminal devices 101, 102, 103 may be a variety of electronic devices having a display screen and supporting web browsing, including but not limited to smartphones, tablets, laptop and desktop computers, and the like.
The server 105 may be a server providing various services, such as a background management server (by way of example only) providing support for websites browsed by users using the terminal devices 101, 102, 103. The background management server may analyze and process the received data such as the user request, and feed back the processing result (e.g., the web page, information, or data obtained or generated according to the user request) to the terminal device.
It should be noted that the credit card overdue risk prediction method provided by the embodiments of the present disclosure may be generally executed by the server 105. Accordingly, the credit card overdue risk prediction apparatus provided in the embodiments of the present disclosure may be generally disposed in the server 105. The credit card overdue risk prediction method provided by the embodiments of the present disclosure may also be performed by a server or a server cluster that is different from the server 105 and is capable of communicating with the terminal devices 101, 102, 103 and/or the server 105. Accordingly, the credit card overdue risk prediction apparatus provided by the embodiments of the present disclosure may also be provided in a server or a server cluster that is different from the server 105 and is capable of communicating with the terminal devices 101, 102, 103 and/or the server 105.
It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
The credit card overdue risk prediction method according to an embodiment of the present disclosure will be described in detail with reference to fig. 2 to 7 based on the scenario described in fig. 1.
Fig. 2 schematically illustrates a flowchart of a credit card overdue risk prediction method according to an embodiment of the disclosure.
As shown in fig. 2, the credit card overdue risk prediction method of this embodiment includes operations S210 to S230.
In operation S210, credit card information is acquired. The credit card information comprises cardholder basic information, credit card consumption records, credit card repayment records and credit card application history data.
In some examples, cardholder base information includes: user name, identification number, gender and age, etc.
In some examples, the credit card basic information includes: the credit card limit, the years since the card was opened, and the like.
In some examples, the credit card consumption record includes: the amount per time the credit card is consumed, the frequency of credit card consumption, the total amount of credit card consumption, and the like.
In some examples, the credit card repayment record includes: the due repayment date of each historical billing period, the actual repayment date of each historical billing period, the repayment amount of each historical billing period, the due repayment date of the current billing period, and the amount due in the current billing period.
In some examples, the credit card application history data includes: the frequency of credit limit upgrades for the credit card, the number of times the cardholder actively applied for a credit limit upgrade, the number of times the bank recommended a credit limit upgrade, and the bank's audit results for the credit limit upgrades.
In operation S220, the overdue prediction value of the credit card is determined according to the credit card information and a pre-trained credit card overdue prediction model, wherein the credit card overdue prediction model is obtained by training a restricted Boltzmann machine, the m training samples used for training the restricted Boltzmann machine are obtained by screening n first pre-training samples according to a convolutional neural network model, a self-attention network model and a recurrent neural network model, n is an integer greater than or equal to 1, and m is an integer greater than or equal to 1 and less than or equal to n.
It can be appreciated that the credit card information can be processed according to the feature dimension and data type required by the credit card overdue prediction model for its input data, so as to obtain credit card information conforming to that feature dimension and data type. The credit card overdue prediction model includes model parameters, which can be understood as the mapping relation between credit card information and overdue prediction values; using this mapping relation, the overdue prediction value of the credit card can be obtained from the credit card information.
In operation S230, when the overdue prediction value is greater than the set overdue threshold, the credit card is identified as a credit card having an overdue risk.
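As an illustration of operations S210 to S230, the following is a minimal sketch of the prediction and thresholding logic. The names overdue_model, preprocess, predict_proba and OVERDUE_THRESHOLD are hypothetical and not part of the disclosure; the sketch assumes the trained model exposes a probability-style overdue prediction value in [0, 1].

```python
import numpy as np

OVERDUE_THRESHOLD = 0.5  # assumed value of the "set overdue threshold"

def predict_overdue_risk(credit_card_info: dict, overdue_model, preprocess) -> bool:
    """Return True when the card should be flagged as having overdue risk.

    credit_card_info: raw fields (cardholder info, consumption/repayment
    records, application history) as described in operation S210.
    preprocess: converts the raw fields into the feature vector expected
    by the pre-trained credit card overdue prediction model (S220).
    """
    features = preprocess(credit_card_info)               # match feature dimension / data type
    overdue_value = float(overdue_model.predict_proba(
        np.asarray(features).reshape(1, -1))[0, 1])       # overdue prediction value
    return overdue_value > OVERDUE_THRESHOLD              # operation S230
```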
According to the credit card overdue risk prediction method of the embodiments of the present disclosure, the overdue prediction value of a credit card can be determined according to the credit card information and the pre-trained credit card overdue prediction model, and the credit card can be identified as having an overdue risk when the overdue prediction value is greater than the set overdue threshold. The method can therefore effectively identify credit cards with overdue risk, so that a customer manager can be reminded to monitor and collect on those cards as a priority. The credit card overdue prediction model used in the method is trained with m training samples, which are obtained by screening n first pre-training samples according to a convolutional neural network model, a self-attention network model and a recurrent neural network model. The sample quality of the m training samples is therefore higher, and the model parameters trained with them, namely the mapping relation between credit card information and overdue prediction values, are more accurate. On this basis, the credit cards with overdue risk predicted by the credit card overdue prediction model of the present disclosure are identified more accurately.
FIG. 3 schematically illustrates a flowchart of the steps for pre-training a credit card overdue prediction model, according to an embodiment of the disclosure.
As shown in FIG. 3, the steps of pre-training the credit card overdue prediction model of this embodiment include operations S310 to S330 and S370.
In operation S310, n first pre-training samples are determined according to the acquired first original data of the credit card, wherein the data features of the first original data include cardholder basic information, credit card consumption records, credit card repayment records and credit card application history data, each first pre-training sample includes sample features and a sample label, the sample features are determined according to the data features of the first original data, and the sample label is determined according to the credit card repayment records.
In some examples, cardholder base information includes: user name, identification number, gender and age, etc.
In some examples, the credit card basic information includes: the credit card limit, the years since the card was opened, and the like.
In some examples, the credit card consumption record includes: the amount per time the credit card is consumed, the frequency of credit card consumption, the total amount of credit card consumption, and the like.
In some examples, the credit card repayment record includes: the due repayment date of each historical billing period, the actual repayment date of each historical billing period, the repayment amount of each historical billing period, and the like.
In some examples, the credit card application history data includes: the frequency of credit limit upgrades for the credit card, the number of times the cardholder actively applied for a credit limit upgrade, the number of times the bank recommended a credit limit upgrade, and the bank's audit results for the credit limit upgrades.
As a possible implementation manner, as shown in fig. 4, the step of determining n first pre-training samples according to the acquired first raw data of the credit card in operation S310 includes operation S311 and operation S312.
In operation S311, data cleansing is performed on the acquired n first raw data.
In some specific examples, the step of performing data cleaning on the acquired n pieces of first original data in operation S311 includes: performing missing value processing, abnormal value detection and filtering, duplicate data deletion and error checking on the acquired n pieces of first original data.
It is understood that data cleaning refers to processing, screening, converting and normalizing the first original data obtained directly from the database, so as to improve the performance of the subsequent models. Data cleaning mainly includes the following four operations.
Missing value processing: filling missing data with the average value.
Abnormal value detection and filtering: judging whether abnormal points exist according to the statistical distribution characteristics of the data.
Duplicate data deletion: deleting duplicate data during the data cleaning process.
Error checking: extracting specific data details, cross-checking them with the business side and the agreed calculation caliber, and randomly verifying whether the data are correct.
Therefore, performing missing value processing, abnormal value detection and filtering, duplicate data deletion and error checking on the acquired n pieces of first original data conveniently implements the data cleaning of the acquired n pieces of first original data.
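A minimal pandas sketch of the four cleaning operations described above (mean imputation for missing values, outlier filtering based on the statistical distribution, duplicate removal, and a simple error check). The column handling, the 3-sigma rule and the non-negativity check are illustrative assumptions, not requirements of the disclosure.

```python
import pandas as pd

def clean_raw_data(df: pd.DataFrame, numeric_cols: list[str]) -> pd.DataFrame:
    """Data cleaning sketch for the n pieces of first original data."""
    df = df.copy()

    # Missing value processing: fill numeric gaps with the column mean.
    df[numeric_cols] = df[numeric_cols].fillna(df[numeric_cols].mean())

    # Abnormal value detection and filtering: drop rows more than 3 standard
    # deviations from the mean (one possible statistical-distribution rule).
    z = (df[numeric_cols] - df[numeric_cols].mean()) / df[numeric_cols].std(ddof=0)
    df = df[(z.abs() <= 3).all(axis=1)]

    # Duplicate data deletion.
    df = df.drop_duplicates()

    # Error checking: simple sanity rule; real checks would be agreed with
    # the business side (e.g. amounts and counts must be non-negative).
    df = df[(df[numeric_cols] >= 0).all(axis=1)]

    return df
```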
In operation S312, the cleaned first raw data is subjected to feature engineering to obtain n first pre-training samples.
In some specific examples, the step of performing feature engineering on the cleaned first original data in operation S312 to obtain the n first pre-training samples includes: performing feature normalization on the cleaned first original data, and performing dimension reduction on the feature-normalized first original data to obtain the n first pre-training samples.
It will be appreciated that feature normalization eliminates the influence of differing feature scales on model robustness. Because the data features of the first original data obtained from the database are comprehensive and the feature dimension is high, a Principal Component Analysis (PCA) algorithm is used to reduce the dimensionality of the feature-normalized data. Reducing the high-dimensional data to low-dimensional data while preserving the important information of the first original data as much as possible can reduce the training cost of the restricted Boltzmann machine.
In some examples, the sample features of a first pre-training sample can be understood as the dimension-reduced data features of the first original data. The sample label of a first pre-training sample may be overdue or not overdue: the acquired credit card repayment record includes the due repayment date and the actual repayment date of each historical billing period, from which it can be determined whether the corresponding credit card is overdue, so overdue or not overdue can be used as the sample label of the corresponding credit card.
In some examples, a credit card determined to be overdue from the historical due repayment dates and the historical actual repayment dates is labeled with the number 1, and a credit card determined not to be overdue is labeled with the number 0, so that 1 or 0 can be used as the sample label of the corresponding credit card.
Therefore, performing feature normalization on the cleaned first original data and then reducing the dimensionality of the feature-normalized data conveniently implements the feature engineering of the cleaned first original data and yields the n first pre-training samples.
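A minimal sketch of the feature engineering in operation S312: feature normalization, PCA dimension reduction, and derivation of the 1/0 sample label from the due and actual repayment dates. The column names and the number of retained components are assumptions for illustration.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

def build_pretraining_samples(df: pd.DataFrame, feature_cols: list[str],
                              n_components: int = 16):
    """Return (sample_features, sample_labels) for the n first pre-training samples."""
    # Feature normalization removes the influence of differing feature scales.
    normalized = StandardScaler().fit_transform(df[feature_cols])

    # PCA reduces the high-dimensional features while keeping most of the
    # information, which lowers the training cost of the restricted Boltzmann machine.
    sample_features = PCA(n_components=n_components).fit_transform(normalized)

    # Sample label: 1 = overdue (actual repayment later than the due date), 0 = not overdue.
    due = pd.to_datetime(df["due_repayment_date"])
    actual = pd.to_datetime(df["actual_repayment_date"])
    sample_labels = (actual > due).astype(int).to_numpy()

    return sample_features, sample_labels
```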
Determining n first pre-training samples from the acquired first raw data of the credit card may be facilitated through operation S311 and operation S312.
In operation S320, the n first pre-training samples are screened according to the convolutional neural network model, the self-attention network model and the recurrent neural network model to obtain the m training samples.
As a possible implementation, as shown in fig. 5, the step of screening the n first pre-training samples according to the convolutional neural network model, the self-attention network model and the recurrent neural network model in operation S320 to obtain the m training samples includes operations S321 to S324.
In operation S321, the overdue condition of each of the n first pre-training samples is predicted by using the pre-trained convolutional neural network model to obtain a first predicted credit result.
For example, the convolutional neural network model (CNN) used in the present disclosure is the VGG-16 model. The network has 16 layers in total: 13 convolutional layers and 3 fully connected layers. The convolutional layers use 3×3 "same" convolutions with a stride of 1, and the pooling layers are all 2×2 max pooling with a stride of 2. The CNN of the present disclosure is trained with the PyTorch deep learning framework, and the convolutional neural network training set used for training the CNN includes a plurality of convolutional neural network samples, which may be obtained in the same manner as the first pre-training samples. The convolutional neural network samples are input into the VGG-16 model, and the VGG-16 model is trained. After the VGG-16 model is fitted, the classification accuracy of whether a credit card is overdue reaches more than 95% on the convolutional neural network verification set and more than 85% on the convolutional neural network test set. The model is saved once every 50 iterations during training, and the model file with the best performance on the convolutional neural network training, verification and test sets is finally selected. The convolutional neural network verification set and test set are obtained in the same manner as the convolutional neural network training set.
In this way, the overdue condition of each of the n first pre-training samples can be predicted by using the pre-trained convolutional neural network model to obtain the first predicted credit result.
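The following PyTorch sketch illustrates only the training details stated above (binary overdue classification and saving a checkpoint every 50 iterations). It uses a small 1-D convolutional classifier as a simplified stand-in for the VGG-16 network named in the disclosure; the class names, shapes and hyper-parameters are assumptions.

```python
import torch
import torch.nn as nn

class SmallOverdueCNN(nn.Module):
    """Simplified stand-in for the VGG-16 classifier (3x3-style convs, max pooling)."""
    def __init__(self, n_features: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, stride=1, padding=1), nn.ReLU(),
            nn.MaxPool1d(kernel_size=2, stride=2),
            nn.Conv1d(16, 32, kernel_size=3, stride=1, padding=1), nn.ReLU(),
            nn.MaxPool1d(kernel_size=2, stride=2),
        )
        self.classifier = nn.Linear(32 * (n_features // 4), 2)  # overdue / not overdue

    def forward(self, x):                       # x: (batch, n_features)
        x = self.features(x.unsqueeze(1))       # -> (batch, 32, n_features // 4)
        return self.classifier(x.flatten(1))

def train_cnn(model, loader, epochs: int = 10, ckpt_every: int = 50):
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    step = 0
    for _ in range(epochs):
        for features, labels in loader:
            optimizer.zero_grad()
            loss = criterion(model(features), labels)
            loss.backward()
            optimizer.step()
            step += 1
            if step % ckpt_every == 0:          # save the model every 50 iterations
                torch.save(model.state_dict(), f"cnn_step{step}.pt")
```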
In operation S322, the overdue condition of each of the n first pre-training samples is predicted by using the pre-trained self-attention network model to obtain a second predicted credit result.
For example, the self-attention network model (GNN) used in the present disclosure is a graph attention network (Graph Attention Networks, GAT), and the model is built with PyTorch. The self-attention network training set used for training the GNN includes a plurality of self-attention network samples, which may be obtained in the same manner as the first pre-training samples. The self-attention network samples are input into the GAT model, and the GAT model is trained. After the GAT model is fitted, the classification accuracy of whether a credit card is overdue reaches more than 95% on the self-attention network verification set and more than 85% on the self-attention network test set. The model is saved once every 50 iterations during training, and the model file with the best performance on the self-attention network training, verification and test sets is finally selected. The self-attention network verification set and test set are obtained in the same manner as the self-attention network training set.
In this way, the overdue condition of each of the n first pre-training samples can be predicted by using the pre-trained self-attention network model to obtain the second predicted credit result.
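A minimal graph attention network sketch built with PyTorch Geometric. How cardholders are connected into a graph (the edge_index) is not specified in this excerpt, so the two-layer architecture, the number of attention heads and the graph construction are assumptions.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv

class OverdueGAT(torch.nn.Module):
    """Two-layer GAT that classifies each cardholder node as overdue / not overdue."""
    def __init__(self, in_features: int, hidden: int = 8, heads: int = 4):
        super().__init__()
        self.gat1 = GATConv(in_features, hidden, heads=heads, dropout=0.6)
        self.gat2 = GATConv(hidden * heads, 2, heads=1, concat=False, dropout=0.6)

    def forward(self, x, edge_index):
        # x: (num_cardholders, in_features); edge_index: (2, num_edges)
        x = F.elu(self.gat1(x, edge_index))
        return self.gat2(x, edge_index)   # per-node logits: overdue vs. not overdue
```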
In operation S323, the overdue condition of each of the n first pre-training samples is predicted by using the pre-trained recurrent neural network model to obtain a third predicted credit result.
For example, the recurrent neural network model used in the present disclosure is the LSTM model. The LSTM model constructed in the present disclosure stacks 4 LSTM layers and two dropout layers in total, and a weight decay strategy is used when training the model. The recurrent neural network training set used for training the LSTM includes a plurality of recurrent neural network samples, which may be obtained in the same manner as the first pre-training samples. The recurrent neural network samples are input into the LSTM model, and the LSTM model is trained. After the LSTM model is fitted, the classification accuracy of whether a credit card is overdue reaches more than 95% on the recurrent neural network verification set and more than 85% on the recurrent neural network test set. The model is saved once every 50 iterations during training, and the model file with the best performance on the recurrent neural network training, verification and test sets is finally selected. The recurrent neural network verification set and test set are obtained in the same manner as the recurrent neural network training set.
In this way, the overdue condition of each of the n first pre-training samples can be predicted by using the pre-trained recurrent neural network model to obtain the third predicted credit result.
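A PyTorch sketch of an LSTM classifier matching the description above (four LSTM layers, two dropout layers, weight decay applied during training). Treating the cardholder's monthly records as the input sequence, as well as the hidden size and weight-decay coefficient, are assumptions.

```python
import torch
import torch.nn as nn

class OverdueLSTM(nn.Module):
    """Four stacked LSTM layers with two dropout layers and a binary head."""
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.dropout_in = nn.Dropout(0.2)
        self.lstm = nn.LSTM(n_features, hidden, num_layers=4, batch_first=True)
        self.dropout_out = nn.Dropout(0.2)
        self.head = nn.Linear(hidden, 2)

    def forward(self, x):                    # x: (batch, months, n_features)
        out, _ = self.lstm(self.dropout_in(x))
        return self.head(self.dropout_out(out[:, -1]))  # last time step -> logits

# The weight decay strategy is applied through the optimizer, for example:
# model = OverdueLSTM(n_features=16)
# optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
```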
In operation S324, when any two of the first, second and third predicted credit results of the same first pre-training sample are the same, that first pre-training sample is screened as a training sample, and the m training samples are screened out of the n first pre-training samples.
Operations S321 to S324 thus conveniently implement the screening of the n first pre-training samples according to the convolutional neural network model, the self-attention network model and the recurrent neural network model to obtain the m training samples. The quality of the m screened training samples is higher, so the credit card overdue prediction model trained with them can reduce the misclassification that arises when a single model cannot fit the real data distribution.
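A sketch of the screening rule of operations S321 to S324: a first pre-training sample is kept as a training sample when at least two of the three predicted credit results agree, and the value shared by the majority is used as its consensus prediction. The array-based representation of the results is an assumption.

```python
import numpy as np

def screen_by_vote(cnn_pred: np.ndarray, gat_pred: np.ndarray, lstm_pred: np.ndarray):
    """Return (indices of kept samples, consensus prediction per kept sample).

    Each argument holds one 0/1 predicted credit result per first pre-training sample.
    """
    preds = np.stack([cnn_pred, gat_pred, lstm_pred])      # shape (3, n)
    consensus = (preds.sum(axis=0) >= 2).astype(int)       # value predicted by the majority
    # With binary results at least two models always agree; the check simply
    # mirrors the rule stated in operation S324.
    agree = (preds == consensus).sum(axis=0) >= 2
    kept = np.flatnonzero(agree)
    return kept, consensus[kept]
```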
As another possible implementation, as shown in fig. 6, the step of screening the n first pre-training samples according to the convolutional neural network model, the self-attention network model and the recurrent neural network model in operation S320 to obtain the m training samples includes operations S325 to S329.
In operation S325, the overdue condition of each of the n first pre-training samples is predicted by using the pre-trained convolutional neural network model to obtain a first predicted credit result.
For example, the convolutional neural network model (CNN) used in the present disclosure is the VGG-16 model, as already described for operation S321: 16 layers in total, comprising 13 convolutional layers and 3 fully connected layers, with 3×3 "same" convolutions of stride 1 and 2×2 max pooling of stride 2. The CNN is trained with the PyTorch deep learning framework on a convolutional neural network training set whose samples may be obtained in the same manner as the first pre-training samples. After the VGG-16 model is fitted, the classification accuracy of whether a credit card is overdue reaches more than 95% on the convolutional neural network verification set and more than 85% on the convolutional neural network test set. The model is saved once every 50 iterations during training, and the model file with the best performance on the convolutional neural network training, verification and test sets is finally selected. The convolutional neural network verification set and test set are obtained in the same manner as the convolutional neural network training set.
In this way, the overdue condition of each of the n first pre-training samples can be predicted by using the pre-trained convolutional neural network model to obtain the first predicted credit result.
In operation S326, the overdue condition of each of the n first pre-training samples is predicted by using the pre-trained self-attention network model to obtain a second predicted credit result.
For example, the self-attention network model (GNN) used in the present disclosure is a graph attention network (Graph Attention Networks, GAT) built with PyTorch, as already described for operation S322. The self-attention network training set used for training the GNN includes a plurality of self-attention network samples, which may be obtained in the same manner as the first pre-training samples. After the GAT model is fitted, the classification accuracy of whether a credit card is overdue reaches more than 95% on the self-attention network verification set and more than 85% on the self-attention network test set. The model is saved once every 50 iterations during training, and the model file with the best performance on the self-attention network training, verification and test sets is finally selected. The self-attention network verification set and test set are obtained in the same manner as the self-attention network training set.
In this way, the overdue condition of each of the n first pre-training samples can be predicted by using the pre-trained self-attention network model to obtain the second predicted credit result.
In operation S327, the overdue condition of each of the n first pre-training samples is predicted by using the pre-trained recurrent neural network model to obtain a third predicted credit result.
For example, the recurrent neural network model used in the present disclosure is the LSTM model, as already described for operation S323: it stacks 4 LSTM layers and two dropout layers, and a weight decay strategy is used during training. The recurrent neural network training set used for training the LSTM includes a plurality of recurrent neural network samples, which may be obtained in the same manner as the first pre-training samples. After the LSTM model is fitted, the classification accuracy of whether a credit card is overdue reaches more than 95% on the recurrent neural network verification set and more than 85% on the recurrent neural network test set. The model is saved once every 50 iterations during training, and the model file with the best performance on the recurrent neural network training, verification and test sets is finally selected. The recurrent neural network verification set and test set are obtained in the same manner as the recurrent neural network training set.
In this way, the overdue condition of each of the n first pre-training samples can be predicted by using the pre-trained recurrent neural network model to obtain the third predicted credit result.
In operation S328, when any two of the first, second and third predicted credit results of the same first pre-training sample are the same, that first pre-training sample is screened as a second pre-training sample, and k second pre-training samples are screened out of the n first pre-training samples, wherein the two or more identical predicted credit results are used as the predicted credit result of the second pre-training sample, and k is an integer greater than or equal to 1 and less than or equal to n.
In operation S329, the predicted credit result of each of the k second pre-training samples is compared with the sample label of that second pre-training sample; when the predicted credit result is consistent with the sample label, the second pre-training sample is screened as a training sample, and the m training samples are screened out of the k second pre-training samples.
Operations S325 to S329 thus conveniently implement the screening of the n first pre-training samples according to the convolutional neural network model, the self-attention network model and the recurrent neural network model to obtain the m training samples. The m screened training samples are all high-quality samples, so the credit card overdue prediction model trained with them can reduce the misclassification that arises when a single model cannot fit the real data distribution. Moreover, the credit card overdue prediction model of the present disclosure integrates the advantages of the convolutional neural network model, the self-attention network model, the recurrent neural network model and the restricted Boltzmann machine, so its robustness is higher.
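The second implementation (operations S328 and S329) additionally requires the consensus prediction to match the sample label. A short self-contained sketch of this two-stage filter follows; as before, the array representation is an assumption.

```python
import numpy as np

def screen_by_vote_and_label(cnn_pred, gat_pred, lstm_pred, sample_labels):
    """Operations S328-S329: keep samples whose consensus prediction matches the label."""
    preds = np.stack([cnn_pred, gat_pred, lstm_pred])                 # 0/1 predicted credit results
    consensus = (preds.sum(axis=0) >= 2).astype(int)                  # value shared by at least two models
    second = np.flatnonzero((preds == consensus).sum(axis=0) >= 2)    # k second pre-training samples
    labels = np.asarray(sample_labels)[second]
    return second[consensus[second] == labels]                        # indices of the m training samples
```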
In operation S330, the initial model parameters set in the restricted Boltzmann machine are adjusted according to the m training samples to obtain the training model parameters.
In operation S370, the training model parameters are applied as the model parameters of the credit card overdue prediction model. Operations S310 to S330 and S370 thus facilitate the pre-training of the credit card overdue prediction model.
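The disclosure does not detail how the restricted Boltzmann machine turns its trained parameters into an overdue prediction value, so the sketch below is only one common construction under that assumption: fit scikit-learn's BernoulliRBM on the m screened training samples and pair the learned features with a logistic head.

```python
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

def train_overdue_model(train_features, train_labels):
    """Adjust the RBM's initial parameters with the m training samples (operation S330)."""
    model = Pipeline([
        ("scale", MinMaxScaler()),                 # BernoulliRBM expects inputs in [0, 1]
        ("rbm", BernoulliRBM(n_components=64, learning_rate=0.05,
                             n_iter=30, random_state=0)),
        ("head", LogisticRegression(max_iter=1000)),
    ])
    model.fit(train_features, train_labels)        # yields the training model parameters
    return model                                   # exposes predict_proba for S220/S230
```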
In some examples, the weight values and the cost function threshold used in the credit card overdue prediction model are derived with a Bayesian method. Compared with the traditional practice of setting the relevant weight values and cost function threshold manually, determining the weight values and threshold with a Bayesian method can significantly improve the robustness of the model.
FIG. 7 schematically illustrates a flowchart of the steps for pre-training a credit card overdue prediction model, according to an embodiment of the disclosure.
As shown in FIG. 7, the steps of pre-training the credit card overdue prediction model of this embodiment include operations S310 to S380.
In operation S310, n first pre-training samples are determined according to the acquired first original data of the credit card, wherein the data features of the first original data include cardholder basic information, credit card consumption records, credit card repayment records and credit card application history data, each first pre-training sample includes sample features and a sample label, the sample features are determined according to the data features of the first original data, and the sample label is determined according to the credit card repayment records.
In operation S320, the n first pre-training samples are screened according to the convolutional neural network model, the self-attention network model and the recurrent neural network model to obtain the m training samples.
In operation S330, the initial model parameters set in the restricted Boltzmann machine are adjusted according to the m training samples to obtain the training model parameters.
In operation S340, x first pre-verification samples are determined according to the acquired second original data of the credit card, wherein the data features of the second original data include cardholder basic information, credit card consumption records, credit card repayment records and credit card application history data, each first pre-verification sample includes verification sample features and a verification sample label, the verification sample features are determined according to the data features of the second original data, and the verification sample label is determined according to the credit card repayment records. It will be appreciated that the x first pre-verification samples are determined in the same manner as the n first pre-training samples, which is not described in detail here.
In operation S350, the x first pre-verification samples are screened according to the convolutional neural network model, the self-attention network model and the recurrent neural network model to obtain y verification samples. It can be understood that this screening is performed in the same manner as the screening of the n first pre-training samples to obtain the m training samples, which is not repeated here.
In operation S360, training model parameters are verified based on y verification samples.
In operation S370, if the verification is passed, the training model parameters are applied as the model parameters of the credit card overdue prediction model.
In operation S380, if the verification is not passed, the step of determining n first pre-training samples according to the acquired first raw data of the credit card in operation S310 is repeated until the verification is passed.
In some examples, a predicted value can be obtained by verifying the credit card overdue prediction model with the y verification samples, and an actual value can be obtained from the verification sample labels of the y verification samples. A loss value between the predicted value and the actual value is calculated using a loss function. If the loss value meets the set model threshold, the verification is considered to have passed, and the training model parameters are applied as the model parameters of the credit card overdue prediction model; if the loss value does not meet the set model threshold, the verification is considered to have failed, and operations S310 to S360 are repeated until the verification passes.
In some examples, applying the verified training model parameters makes the model parameters of the credit card overdue prediction model more accurate, so that the model processes data better and produces more accurate classification results. Operations S310 to S380 facilitate the pre-construction of a credit card overdue prediction model capable of predicting the overdue risk of a credit card.
The credit card overdue risk prediction method according to an embodiment of the present disclosure is described in detail as follows. It is to be understood that the following description is exemplary only and is not intended to limit the disclosure in any way.
The present disclosure is based on the bank's existing historical credit card usage data, such as the credit card consumption frequency, the total consumption amount and the repayment frequency. Overdue label information for each cardholder can be obtained from this historical data, and the volume of the bank's credit card data is very large, so deep learning is used to learn from and classify the data. Considering that a single model cannot fully fit the distribution of the data, three deep learning methods, namely a convolutional neural network (Convolutional Neural Networks, CNN), a graph neural network (Graph Neural Networks, GNN) and a Long Short-Term Memory network (LSTM), are used to perform binary classification on the cardholders' historical data, and integrating their classification results yields a classification with higher accuracy. On this basis, the samples whose actual labels and classification results both indicate a non-overdue cardholder are input into a restricted Boltzmann machine (Restricted Boltzmann Machine, RBM) for training. After the cardholder data is updated at the end of the repayment period, the data is input into the integrated deep learning model and the RBM respectively to obtain their overdue results, and whether the cardholder is overdue is judged according to the weight information obtained from the historical training data.
The training phase of the present disclosure is divided into 10 steps, which are described in detail below.
Step1 data acquisition
Cardholder-level data is extracted from the database. First, the number of credit card holders to be extracted is determined; second, the time range of the extraction is determined. Because the credit card repayment period is one month, the minimum statistical range of the data is one month. The corresponding cardholder data is then obtained, specifically including: cardholder basic information, such as gender and age; cardholder credit card information, such as credit limit and opening year; credit card consumption records, such as consumption frequency and total consumption amount; credit card repayment records, such as repayment frequency, repayment amount and whether repayment was on time; and application history data, such as the update frequency of the cardholder's credit limit, the number of applications initiated by the cardholder or recommended by the bank, and the related audit results.
Step2 data cleaning
Data cleaning refers to processing, screening, converting and normalizing the raw data taken directly from the database to improve the performance of the subsequent models. It mainly includes: 1. missing value processing, filling missing data with the column mean; 2. outlier detection and filtering, judging whether abnormal points exist through the statistical distribution characteristics of the data; 3. duplicate data processing, deleting repeated records during cleaning; 4. error checking, extracting specific data details, reconciling the calculation caliber with the business side, and randomly verifying whether the data is correct.
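A minimal sketch of such a cleaning pass is shown below, assuming the extracted records arrive as a pandas DataFrame; the column name "age" and the 3-sigma outlier rule are illustrative assumptions rather than details given in the text.

```python
import pandas as pd

def clean_cardholder_data(df: pd.DataFrame) -> pd.DataFrame:
    numeric_cols = df.select_dtypes(include="number").columns

    # 1. Missing values: fill numeric gaps with the column mean.
    df[numeric_cols] = df[numeric_cols].fillna(df[numeric_cols].mean())

    # 2. Outliers: keep rows within 3 standard deviations of the column mean
    #    (a simple stand-in for the statistical-distribution check).
    z = (df[numeric_cols] - df[numeric_cols].mean()) / df[numeric_cols].std()
    df = df[(z.abs() <= 3).all(axis=1)]

    # 3. Duplicates: drop exact repeats.
    df = df.drop_duplicates()

    # 4. Error checking: basic range assertion; reconciliation with the
    #    business side remains a manual step.
    assert df["age"].between(18, 100).all(), "unexpected cardholder age"
    return df
```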
Step3 feature engineering
The feature engineering performed on the cleaned data in the present disclosure is as follows: 1. feature normalization, to eliminate the influence of differing data dimensions on model robustness; 2. because the extracted data is comprehensive and the feature dimension is high, a principal component analysis (Principal Component Analysis, PCA) algorithm is used to reduce the dimensionality of the normalized data, preserving as much of the important information of the original data as possible while reducing the high-dimensional data to a low dimension and lowering the training cost of the deep learning models.
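The normalization and dimensionality-reduction step could look like the following scikit-learn sketch; the 95% explained-variance target and the variable name X_cleaned are assumptions.

```python
from sklearn.decomposition import PCA
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

feature_pipeline = Pipeline([
    ("scale", StandardScaler()),      # remove the influence of differing feature scales
    ("pca", PCA(n_components=0.95)),  # keep enough components for ~95% of the variance
])

# X_cleaned: the feature matrix produced by the Step2 cleaning pass.
X_reduced = feature_pipeline.fit_transform(X_cleaned)
```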
Step4 data labelling
The agreed repayment date and the actual repayment date for each month of the cardholder are obtained together with the cardholder data. A field indicating whether the account is overdue is added according to these data: the cardholder's monthly data is labelled 1 if overdue and 0 if not overdue.
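A possible implementation of this labelling rule, assuming the agreed and actual repayment dates are stored in columns named agreed_repay_date and actual_repay_date:

```python
import pandas as pd

def add_overdue_label(df: pd.DataFrame) -> pd.DataFrame:
    agreed = pd.to_datetime(df["agreed_repay_date"])
    actual = pd.to_datetime(df["actual_repay_date"])
    # 1 = repaid after the agreed date (overdue), 0 = repaid on time or early.
    df["is_overdue"] = (actual > agreed).astype(int)
    return df
```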
Step5 sample generation
The present disclosure constructs samples from the labelled data, selecting 60% of the data as the training set, 20% as the validation set and 20% as the test set. In practice the data is highly unbalanced, because overdue cardholders are relatively rare. The present disclosure therefore upsamples the unbalanced data, adding minority-class samples to balance the data distribution.
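One way to realise the 60/20/20 split and the upsampling with scikit-learn is sketched below; restricting the upsampling to the training partition and stratifying the split by the label are assumptions not stated explicitly in the text.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.utils import resample

# samples: labelled DataFrame with an "is_overdue" column (see Step4).
train, rest = train_test_split(samples, test_size=0.4,
                               stratify=samples["is_overdue"], random_state=42)
valid, test = train_test_split(rest, test_size=0.5,
                               stratify=rest["is_overdue"], random_state=42)

# Upsample the overdue minority class so the training set is balanced.
majority = train[train["is_overdue"] == 0]
minority = train[train["is_overdue"] == 1]
minority_up = resample(minority, replace=True, n_samples=len(majority), random_state=42)
train_balanced = pd.concat([majority, minority_up]).sample(frac=1, random_state=42)
```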
In addition, since the model feature construction modes of different base learners are different, the sample is constructed according to different base learners.
Step6 base learner training
Step6.1 CNN-based learner training
The CNN-based learner used in the present disclosure is the VGG-16 model. The network has 16 layers in total: 13 convolutional layers and 3 fully connected layers. The convolutional layers use 3×3 same convolutions with stride 1, and the pooling layers all use 2×2 max pooling with stride 2. The CNN of the present disclosure is trained with the PyTorch deep learning framework: the samples constructed in Step 5 are input into the VGG-16 model and the model is trained. After the model has converged, the binary classification of whether a credit card is overdue reaches an accuracy of more than 95% on the validation set and more than 85% on the test set. During training the model is saved every 50 iterations, and the model file that performs best on the training, validation and test sets is finally selected.
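A minimal PyTorch sketch of such a VGG-16 base learner follows; reshaping the Step 5 samples into 3×224×224 tensors, the learning rate and the use of the torchvision implementation are assumptions.

```python
import torch
import torch.nn as nn
from torchvision.models import vgg16

model = vgg16(weights=None)               # 13 convolutional layers + 3 fully connected layers
model.classifier[6] = nn.Linear(4096, 2)  # two classes: overdue / not overdue

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(x: torch.Tensor, y: torch.Tensor) -> float:
    # x: (batch, 3, 224, 224) sample tensor, y: (batch,) 0/1 labels.
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Checkpointing every 50 iterations and selecting the best model file would be added around this training step.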
Step6.2 GNN-based learner training
The GNN-based learner used in the present disclosure is a graph attention network (Graph Attention Networks, GAT), built on PyTorch according to the official GAT specification. The GAT samples constructed in Step 5, namely the node feature matrix, the adjacency relations and the edge weights, are input into the GAT model and the model is trained. After the model has converged, the binary classification of whether a credit card is overdue reaches an accuracy of more than 95% on the validation set and more than 85% on the test set. During training the model is saved every 50 iterations, and the model file that performs best on the training, validation and test sets is finally selected.
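The following is a hedged sketch of a GAT base learner built on the torch_geometric library; the two-layer depth, hidden size and number of attention heads are assumptions, and the Step 5 edge weights would be passed as edge attributes in a fuller implementation.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv

class GATClassifier(torch.nn.Module):
    def __init__(self, in_dim: int, hidden: int = 64, heads: int = 4):
        super().__init__()
        self.gat1 = GATConv(in_dim, hidden, heads=heads)
        self.gat2 = GATConv(hidden * heads, 2, heads=1)  # node-level binary output

    def forward(self, x, edge_index):
        # x: node feature matrix, edge_index: adjacency relations in COO form.
        h = F.elu(self.gat1(x, edge_index))
        return self.gat2(h, edge_index)
```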
Step6.3 LSTM-based learner training
The recurrent neural network base learner used in the present disclosure is an LSTM model. The LSTM model constructed in the present disclosure stacks 4 LSTM layers and two dropout layers, and a weight decay strategy is used when training the model. The samples constructed in Step 5 are input into the constructed LSTM model and the model is trained. After the model has converged, the binary classification of whether a credit card is overdue reaches an accuracy of more than 95% on the validation set and more than 85% on the test set. During training the model is saved every 50 iterations, and the model file that performs best on the training, validation and test sets is finally selected.
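A sketch of such an LSTM base learner in PyTorch follows; the hidden size, dropout rate and the use of Adam's weight_decay argument to realise the weight decay strategy are assumptions.

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, in_dim: int, hidden: int = 64):
        super().__init__()
        # Four stacked LSTM layers with dropout applied between them.
        self.lstm = nn.LSTM(in_dim, hidden, num_layers=4,
                            batch_first=True, dropout=0.2)
        self.dropout = nn.Dropout(0.2)
        self.fc = nn.Linear(hidden, 2)

    def forward(self, x):
        # x: (batch, seq_len, in_dim) monthly cardholder features.
        out, _ = self.lstm(x)
        return self.fc(self.dropout(out[:, -1]))  # classify from the last time step

model = LSTMClassifier(in_dim=32)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
```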
Step7 ensemble learning
The models of the three base learners obtained in Step 6 are integrated into an ensemble using a voting method. The training data constructed in Step 5 for the three models is input into the base learners VGG-16, GAT and LSTM respectively to obtain three classification results for whether a cardholder's credit card is overdue, and the results are then combined: if the classification results of two or more models agree, the sample is added to a candidate sample set. The candidate sample set is used for subsequent training.
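The voting rule can be expressed in a few lines of NumPy; note that with binary 0/1 outputs at least two of the three learners always agree, so every sample receives a majority label. The prediction arrays below are assumed to hold 0/1 class labels.

```python
import numpy as np

def majority_vote(pred_cnn: np.ndarray, pred_gat: np.ndarray,
                  pred_lstm: np.ndarray) -> np.ndarray:
    preds = np.stack([pred_cnn, pred_gat, pred_lstm])  # shape (3, n_samples)
    votes = preds.sum(axis=0)
    return (votes >= 2).astype(int)                    # label agreed by at least two learners
```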
Step8 enhanced dataset acquisition
The classification result of each candidate sample obtained in Step 7 is compared with the true label of whether the cardholder's credit card is overdue. When the ensemble classification result is consistent with the true label and the label is not overdue, the sample is added to the enhanced training set used to train the RBM.
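Continuing the previous sketch, the enhanced training set can be selected as follows; X_candidates and y_true are assumed to be the candidate feature matrix and the true overdue labels.

```python
# Keep candidates whose ensemble label matches the true label and whose
# true label is "not overdue" (0); these rows form the RBM training set.
ensemble_label = majority_vote(pred_cnn, pred_gat, pred_lstm)
keep = (ensemble_label == y_true) & (y_true == 0)
X_enhanced = X_candidates[keep]
```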
Step9 RBM training
In practice, overdue credit card accounts are relatively rare, so the positive and negative samples are very unbalanced once the actual data is constructed into samples. Although techniques such as sample balancing can be applied during training, the recall rate of such a model can still be low at prediction time. For this situation, unsupervised learning and non-linear representation models are well suited. The present disclosure trains an RBM on the enhanced data set obtained in Step 8, from which the RBM learns a representation of the data. The enhanced data set used for RBM training consists entirely of good-quality, non-overdue cardholder data, so the fitted model parameters describe the characteristics of normal samples. When a sample with overdue risk is input into the model, the model output differs greatly from the normal output; when the difference exceeds a threshold, the cardholder can be judged to have an overdue risk in the current repayment period.
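An illustrative RBM stage using scikit-learn's BernoulliRBM is sketched below. Scoring new samples by pseudo-log-likelihood and flagging those below a threshold is one way to realise the "output differs greatly from normal" test described above; the hidden-unit count, the [0, 1] scaling step and the scoring rule are assumptions.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.preprocessing import MinMaxScaler

scaler = MinMaxScaler()                    # BernoulliRBM expects values in [0, 1]
X_rbm = scaler.fit_transform(X_enhanced)   # X_enhanced: non-overdue samples from Step8

rbm = BernoulliRBM(n_components=64, learning_rate=0.01, n_iter=50, random_state=42)
rbm.fit(X_rbm)                             # fit only on normal (non-overdue) data

def rbm_risk_flag(X_new: np.ndarray, threshold: float) -> np.ndarray:
    # Low pseudo-log-likelihood means the sample deviates from the normal pattern.
    scores = rbm.score_samples(scaler.transform(X_new))
    return (scores < threshold).astype(int)  # 1 = likely overdue risk
```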
Step10 parameter acquisition
Based on the ensemble strong learner obtained in Step 7, the RBM model trained in Step 9 and the enhanced data set obtained in Step 8, a Bayesian estimation method is used to find the most appropriate weight parameters for the strong learner and the RBM, as well as the threshold of the RBM model.
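As one illustration, the search could be run with Optuna's TPE sampler, a Bayesian optimisation method; using validation-set recall as the objective and the particular search ranges are assumptions rather than details from the text.

```python
import optuna
from sklearn.metrics import recall_score

# ensemble_label_valid, X_valid, y_valid and rbm_risk_flag come from the
# earlier sketches; all of them are assumed names.
def objective(trial: optuna.Trial) -> float:
    w_ensemble = trial.suggest_float("w_ensemble", 0.0, 1.0)
    rbm_threshold = trial.suggest_float("rbm_threshold", -200.0, 0.0)

    rbm_flag = rbm_risk_flag(X_valid, rbm_threshold)
    score = w_ensemble * ensemble_label_valid + (1 - w_ensemble) * rbm_flag
    y_pred = (score >= 0.5).astype(int)
    return recall_score(y_valid, y_pred)    # maximise overdue recall

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=100)
best_params = study.best_params             # weight and RBM threshold to use at inference
```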
The detailed steps of the model application phase of the present disclosure are described below.
Step1 sample generation
The actual data of the cardholder in the current repayment period is acquired, processed through the feature engineering pipeline, and then constructed into samples applicable to VGG-16, GAT, LSTM and the RBM according to the operation of Step 5 of the training phase.
Step2 analysis of results
The cardholder samples obtained in Step 1 are input into the VGG-16, GAT, LSTM and RBM models respectively. The results of VGG-16, GAT and LSTM are combined to obtain the result of the strong learner, and the result of the strong learner and the result of the RBM are then combined according to the parameters found by the Bayesian estimation method in Step 10 of the training phase, yielding a high-recall credit card overdue risk score for the cardholder.
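A minimal sketch of this combination step, reusing the helpers from the earlier sketches; the 0/1 inputs and the score scale are assumptions.

```python
def overdue_risk_score(pred_cnn, pred_gat, pred_lstm, rbm_flag, params):
    # Weighted combination of the strong-learner vote and the RBM anomaly flag,
    # using the weight found by the Bayesian search in training Step10.
    ensemble_label = majority_vote(pred_cnn, pred_gat, pred_lstm)
    w = params["w_ensemble"]
    return w * ensemble_label + (1 - w) * rbm_flag  # higher score = higher overdue risk
```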
The present disclosure can significantly improve the recall rate of credit card overdue risk: compared with a traditional credit card overdue risk prediction model, the weighted combination of ensemble deep learning and the RBM proposed here improves the early-warning recall rate in actual risk early warning. The present disclosure calculates the weight values and the RBM cost function threshold by a Bayesian method: compared with manually setting the relevant weight values and the RBM cost function threshold, determining them with a Bayesian method can significantly improve the robustness of the model. The present disclosure can also significantly reduce the misclassification caused by a single model failing to fit the real data distribution, because it combines the strengths of CNN, GNN, LSTM and RBM, giving higher model robustness. Finally, the method is simple to operate: after the cardholder data is updated at the end of the repayment period, no parameter setting is needed, and inputting the data into the complete model yields a highly accurate overdue result.
Based on the credit card overdue risk prediction method, the disclosure also provides a credit card overdue risk prediction device. The credit card overdue risk prediction apparatus will be described in detail with reference to fig. 8.
Fig. 8 schematically illustrates a block diagram of a credit card overdue risk prediction apparatus 10 according to an embodiment of the present disclosure.
The credit card overdue risk prediction apparatus 10 includes an acquisition module 1, a determination module 2, and an identification module 3.
Acquisition module 1, acquisition module 1 is configured to perform operation S210: credit card information is acquired.
A determining module 2, the determining module 2 is configured to perform operation S220: determining an overdue prediction value of the credit card according to the credit card information and a pre-trained credit card overdue prediction model, wherein the credit card overdue prediction model is obtained by training a restricted Boltzmann machine, the m training samples used for training the restricted Boltzmann machine are obtained by screening n first pre-training samples according to a convolutional neural network model, a self-attention network model and a cyclic neural network model, n is an integer greater than or equal to 1, and m is an integer greater than or equal to 1 and less than or equal to n.
An identification module 3, the identification module 3 is configured to perform operation S230: identifying the credit card as a credit card with overdue risk when the overdue prediction value is greater than the set overdue threshold.
According to some embodiments of the present disclosure, the credit card overdue risk prediction apparatus further includes a pre-building module, where the pre-building module is configured to pre-train the credit card overdue prediction model, and the pre-building module includes a first acquisition unit, a first determination unit, a training unit, and an application unit.
The first acquisition unit is used for determining n first pre-training samples according to the acquired first original data of the credit card, wherein the data characteristics of the first original data comprise cardholder basic information, credit card consumption records, credit card repayment records and credit card application history data, the first pre-training samples comprise sample characteristics and sample labels, the sample characteristics are determined according to the data characteristics of the first original data, and the sample labels are determined according to the credit card repayment records.
The first determining unit is used for screening the n first pre-training samples according to the convolutional neural network model, the self-attention network model and the cyclic neural network model to obtain m training samples.
The training unit is used for adjusting the initial model parameters set in the restricted Boltzmann machine according to the m training samples to obtain training model parameters.
And the application unit is used for applying the training model parameters as the model parameters of the credit card overdue prediction model.
According to some embodiments of the present disclosure, the first determining unit comprises a first determining element, a second determining element, a third determining element and a first screening element.
The first determining element is used for predicting the overdue condition of each first pre-training sample in the n first pre-training samples by utilizing a pre-trained convolutional neural network model to obtain a first prediction credit result.
And the second determining element is used for predicting the overdue condition of each first pre-training sample in the n first pre-training samples by utilizing the pre-trained self-attention network model to obtain a second prediction credit result.
And the third determining element is used for predicting the overdue condition of each first pre-training sample in the n first pre-training samples by utilizing a pre-trained cyclic neural network model to obtain a third prediction credit result.
And the first screening element is used for screening the first pre-training sample as training samples when any two of the first prediction credit result, the second prediction credit result and the third prediction credit result of the same first pre-training sample are the same, and screening m training samples from n first pre-training samples.
According to some embodiments of the present disclosure, the first determining unit comprises a first determining element, a second determining element, a third determining element, a second screening element and a third screening element.
The first determining element is used for predicting the overdue condition of each first pre-training sample in the n first pre-training samples by utilizing a pre-trained convolutional neural network model to obtain a first prediction credit result.
And the second determining element is used for predicting the overdue condition of each first pre-training sample in the n first pre-training samples by utilizing the pre-trained self-attention network model to obtain a second prediction credit result.
And the third determining element is used for predicting the overdue condition of each first pre-training sample in the n first pre-training samples by utilizing a pre-trained cyclic neural network model to obtain a third prediction credit result.
And the second screening element is used for screening the first pre-training sample into a second pre-training sample and screening k second pre-training samples from n first pre-training samples when any two of the first prediction credit result, the second prediction credit result and the third prediction credit result of the same first pre-training sample are the same, wherein the two or more same prediction credit results are used as the prediction credit result of the second pre-training sample, and k is an integer greater than or equal to 1 and less than or equal to n.
And the third screening element is used for comparing the prediction credit result of each second pre-training sample in the k second pre-training samples with the sample label of the second pre-training sample, screening the second pre-training sample as a training sample when the prediction credit result is consistent with the sample label, and screening m training samples in the k second pre-training samples.
According to some embodiments of the present disclosure, the credit card overdue risk prediction apparatus further includes a pre-building module for pre-training a credit card overdue prediction model, the pre-building module including a first acquisition unit, a first determination unit, a training unit, a second acquisition unit, a second determination unit, a verification unit, an application unit, and a repetition unit.
The first acquisition unit is used for determining n first pre-training samples according to the acquired first original data of the credit card, wherein the data characteristics of the first original data comprise cardholder basic information, credit card consumption records, credit card repayment records and credit card application history data, the first pre-training samples comprise sample characteristics and sample labels, the sample characteristics are determined according to the data characteristics of the first original data, and the sample labels are determined according to the credit card repayment records.
The first determining unit is used for screening the n first pre-training samples according to the convolutional neural network model, the self-attention network model and the cyclic neural network model to obtain m training samples.
The training unit is used for adjusting the initial model parameters set in the restricted Boltzmann machine according to the m training samples to obtain training model parameters.
The second acquisition unit is used for determining x first pre-verification samples according to the acquired second original data of the credit card, wherein the data characteristics of the second original data comprise cardholder basic information, credit card consumption records, credit card repayment records and credit card application history data, the first pre-verification samples comprise verification sample characteristics and verification sample labels, the verification sample characteristics are determined according to the data characteristics of the second original data, and the verification sample labels are determined according to the credit card repayment records.
And the second determining unit is used for screening the x first pre-verification samples according to the convolutional neural network model, the self-attention network model and the cyclic neural network model to obtain y verification samples.
The verification unit is used for verifying the training model parameters according to the y verification samples.
The application unit is used for applying the training model parameters as the model parameters of the credit card overdue prediction model if the verification is passed.
The repeating unit is used for repeating the step of determining n first pre-training samples according to the acquired first original data of the credit card until the verification is passed, if the verification is not passed.
According to some embodiments of the present disclosure, the first acquisition unit comprises a cleaning element and a feature engineering element.
And the cleaning element is used for cleaning the acquired n pieces of first original data.
And the characteristic engineering element is used for carrying out characteristic engineering on the cleaned first original data to obtain n first pre-training samples.
According to the credit card overdue risk prediction apparatus 10 of the embodiment of the present disclosure, the overdue prediction value of the credit card can be determined according to the credit card information and the pre-trained credit card overdue prediction model, and the credit card can be identified as a credit card with overdue risk when the overdue prediction value is greater than the set overdue threshold. The credit card overdue risk prediction method can therefore effectively identify credit cards with overdue risk, so that a customer manager can be prompted to closely monitor them and follow up on collection. The credit card overdue prediction model used in the method is obtained by training on m training samples, which are obtained by screening n first pre-training samples according to a convolutional neural network model, a self-attention network model and a cyclic neural network model; the quality of the m training samples is therefore higher, and the model parameters trained with them, namely the mapping relationship between credit card information and the overdue prediction value, are more accurate. On this basis, the credit cards with overdue risk predicted by the credit card overdue prediction model of the present disclosure are more accurate.
In addition, according to an embodiment of the present disclosure, any of the plurality of modules of the acquisition module 1, the determination module 2, and the identification module 3 may be incorporated in one module to be implemented, or any of the plurality of modules may be split into a plurality of modules. Alternatively, at least some of the functionality of one or more of the modules may be combined with at least some of the functionality of other modules and implemented in one module.
According to embodiments of the present disclosure, at least one of the acquisition module 1, the determination module 2 and the identification module 3 may be at least partially implemented as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in hardware or firmware in any other reasonable way of integrating or packaging the circuits, or in any one of or a suitable combination of any of the three.
Alternatively, at least one of the acquisition module 1, the determination module 2 and the identification module 3 may be at least partially implemented as a computer program module, which, when executed, may perform the respective functions.
Fig. 9 schematically shows a block diagram of an electronic device adapted to implement the above-described method according to an embodiment of the present disclosure.
As shown in fig. 9, an electronic device 900 according to an embodiment of the present disclosure includes a processor 901 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 902 or a program loaded from a storage portion 908 into a Random Access Memory (RAM) 903. The processor 901 may include, for example, a general purpose microprocessor (e.g., a CPU), an instruction set processor and/or an associated chipset and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), or the like. Processor 901 may also include on-board memory for caching purposes. Processor 901 may include a single processing unit or multiple processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
In the RAM 903, various programs and data necessary for the operation of the electronic device 900 are stored. The processor 901, the ROM 902, and the RAM 903 are connected to each other by a bus 904. The processor 901 performs various operations of the method flow according to the embodiments of the present disclosure by executing programs in the ROM 902 and/or the RAM 903. Note that the program may be stored in one or more memories other than the ROM 902 and the RAM 903. The processor 901 may also perform various operations of the method flow according to embodiments of the present disclosure by executing programs stored in the one or more memories.
According to an embodiment of the disclosure, the electronic device 900 may also include an input/output (I/O) interface 905, the input/output (I/O) interface 905 also being connected to the bus 904. The electronic device 900 may also include one or more of the following components connected to the I/O interface 905: an input section 906 including a keyboard, a mouse, and the like; an output portion 907 including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage portion 908 including a hard disk or the like; and a communication section 909 including a network interface card such as a LAN card, a modem, or the like. The communication section 909 performs communication processing via a network such as the internet. The drive 910 is also connected to an input/output (I/O) interface 905 as needed. A removable medium 911 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is installed as needed on the drive 910 so that a computer program read out therefrom is installed into the storage section 908 as needed.
The present disclosure also provides a computer-readable storage medium that may be embodied in the apparatus/device/system described in the above embodiments; or may exist alone without being assembled into the apparatus/device/system. The computer-readable storage medium carries one or more programs which, when executed, implement methods in accordance with embodiments of the present disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example, but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. For example, according to embodiments of the present disclosure, the computer-readable storage medium may include ROM 902 and/or RAM 903 and/or one or more memories other than ROM 902 and RAM 903 described above.
Embodiments of the present disclosure also include a computer program product comprising a computer program containing program code for performing the methods shown in the flowcharts. The program code, when executed in a computer system, causes the computer system to perform the methods of embodiments of the present disclosure.
The above-described functions defined in the system/apparatus of the embodiments of the present disclosure are performed when the computer program is executed by the processor 901. The systems, apparatus, modules, units, etc. described above may be implemented by computer program modules according to embodiments of the disclosure.
In one embodiment, the computer program may be carried on a tangible storage medium such as an optical storage device or a magnetic storage device. In another embodiment, the computer program may also be transmitted and distributed over a network medium in the form of a signal, downloaded and installed via the communication portion 909, and/or installed from the removable medium 911. The computer program may include program code that may be transmitted using any appropriate network medium, including but not limited to wireless and wired media, or any suitable combination of the foregoing. In such an embodiment, the above-described functions defined in the system of the embodiments of the present disclosure are performed when the computer program is executed by the processor 901.
According to embodiments of the present disclosure, the program code of the computer programs provided by embodiments of the present disclosure may be written in any combination of one or more programming languages; in particular, such computer programs may be implemented in high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. Programming languages include, but are not limited to, Java, C++, Python, C, or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, via the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that the features recited in the various embodiments of the disclosure and/or in the claims may be combined in various combinations and/or combinations, even if such combinations or combinations are not explicitly recited in the disclosure. In particular, the features recited in the various embodiments of the present disclosure and/or the claims may be variously combined and/or combined without departing from the spirit and teachings of the present disclosure. All such combinations and/or combinations fall within the scope of the present disclosure.
The embodiments of the present disclosure are described above. However, these examples are for illustrative purposes only and are not intended to limit the scope of the present disclosure. Although the embodiments are described above separately, this does not mean that the measures in the embodiments cannot be used advantageously in combination. The scope of the disclosure is defined by the appended claims and equivalents thereof. Various alternatives and modifications can be made by those skilled in the art without departing from the scope of the disclosure, and such alternatives and modifications are intended to fall within the scope of the disclosure.

Claims (11)

1. A credit card overdue risk prediction method, comprising:
acquiring credit card information;
determining an overdue prediction value of the credit card according to the credit card information and a pre-trained credit card overdue prediction model, wherein the credit card overdue prediction model is obtained by training a restricted Boltzmann machine, m training samples used for training the restricted Boltzmann machine are obtained by screening n first pre-training samples according to a convolutional neural network model, a self-attention network model and a cyclic neural network model, n is an integer greater than or equal to 1, and m is an integer greater than or equal to 1 and less than or equal to n; and
And identifying the credit card as a credit card with overdue risk when the overdue predicted value is greater than a set overdue threshold.
2. The method of claim 1, wherein the step of pre-training the credit card overdue prediction model comprises:
determining the n first pre-training samples according to the acquired first original data of the credit card, wherein the data characteristics of the first original data comprise cardholder basic information, credit card consumption records, credit card repayment records and credit card application history data, the first pre-training samples comprise sample characteristics and sample labels, the sample characteristics are determined according to the data characteristics of the first original data, and the sample labels are determined according to the credit card repayment records;
screening the n first pre-training samples according to a convolutional neural network model, a self-attention network model and a cyclic neural network model to obtain m training samples;
adjusting initial model parameters set in the restricted Boltzmann machine according to the m training samples to obtain training model parameters; and
applying the training model parameters as the model parameters of the credit card overdue prediction model.
3. The method of claim 2, wherein the step of screening the n first pre-training samples according to a convolutional neural network model, a self-attention network model, and a cyclic neural network model to obtain the m training samples comprises:
predicting the overdue condition of each first pre-training sample in the n first pre-training samples by utilizing a pre-trained convolutional neural network model to obtain a first prediction credit result;
predicting the overdue condition of each first pre-training sample in the n first pre-training samples by utilizing a pre-trained self-attention network model to obtain a second prediction credit result;
predicting the overdue condition of each first pre-training sample in the n first pre-training samples by using a pre-trained cyclic neural network model to obtain a third prediction credit result; and
and screening the first pre-training sample as a training sample when any two of the first prediction credit result, the second prediction credit result and the third prediction credit result of the same first pre-training sample are the same, and screening the m training samples from the n first pre-training samples.
4. The method of claim 2, wherein the step of screening the n first pre-training samples according to a convolutional neural network model, a self-attention network model, and a cyclic neural network model to obtain the m training samples comprises:
predicting the overdue condition of each first pre-training sample in the n first pre-training samples by utilizing a pre-trained convolutional neural network model to obtain a first prediction credit result;
predicting the overdue condition of each first pre-training sample in the n first pre-training samples by utilizing a pre-trained self-attention network model to obtain a second prediction credit result;
predicting the overdue condition of each first pre-training sample in the n first pre-training samples by using a pre-trained cyclic neural network model to obtain a third prediction credit result;
when any two of the first prediction credit result, the second prediction credit result and the third prediction credit result of the same first pre-training sample are the same, screening the first pre-training sample into a second pre-training sample, and screening the k second pre-training samples from the n first pre-training samples, wherein the two or more same prediction credit results are used as the prediction credit results of the second pre-training sample, and k is an integer greater than or equal to 1 and less than or equal to n; and
And comparing the prediction credit result of each second pre-training sample in the k second pre-training samples with the sample label of the second pre-training sample, screening the second pre-training sample as a training sample when the prediction credit result is consistent with the sample label, and screening the m training samples in the k second pre-training samples.
5. The method of claim 2, wherein, before the step of applying the training model parameters as the model parameters of the credit card overdue prediction model, the step of pre-training the credit card overdue prediction model further comprises:
determining x first pre-verification samples according to second original data of the acquired credit card, wherein the data characteristics of the second original data comprise cardholder basic information, credit card consumption records, credit card repayment records and credit card application history data, the first pre-verification samples comprise verification sample characteristics and verification sample labels, the verification sample characteristics are determined according to the data characteristics of the second original data, and the verification sample labels are determined according to the credit card repayment records;
Screening the x first pre-verification samples according to a convolutional neural network model, a self-attention network model and a cyclic neural network model to obtain y verification samples;
verifying the training model parameters according to the y verification samples;
if the verification is passed, the training model parameters are used as the model parameters of the credit card overdue prediction model; and
and if the verification is not passed, repeating the step of determining the n first pre-training samples according to the acquired first original data of the credit card until the verification is passed.
6. The method of claim 2, wherein the step of determining the n first pre-training samples from the acquired first raw data of the credit card comprises:
performing data cleaning on the obtained n pieces of first original data; and
and performing feature engineering on the cleaned first original data to obtain n first pre-training samples.
7. The method of claim 6, wherein the step of data cleansing the acquired n first raw data comprises: carrying out missing value processing, abnormal value detection and filtering, and deleting repeated data and error checking on the obtained n pieces of first original data; and/or
Performing feature engineering on the cleaned first original data to obtain n first pre-training samples, wherein the step of obtaining the n first pre-training samples comprises the following steps of: and carrying out feature normalization processing on the cleaned first original data, and carrying out dimension reduction on the first original data subjected to feature normalization to obtain n first pre-training samples.
8. A credit card overdue risk prediction apparatus, comprising:
the acquisition module is used for executing acquisition of credit card information;
the determining module is used for determining an overdue prediction value of the credit card according to the credit card information and a pre-trained credit card overdue prediction model, wherein the credit card overdue prediction model is obtained by training a restricted Boltzmann machine, m training samples used for training the restricted Boltzmann machine are obtained by screening n first pre-training samples according to a convolutional neural network model, a self-attention network model and a cyclic neural network model, n is an integer greater than or equal to 1, and m is an integer greater than or equal to 1 and less than or equal to n; and
and the identification module is used for identifying the credit card as the credit card with overdue risk when the overdue predicted value is larger than a set overdue threshold value.
9. An electronic device, comprising:
one or more processors;
one or more memories for storing executable instructions which, when executed by the processor, implement the method of any of claims 1-7.
10. A computer readable storage medium, characterized in that the storage medium has stored thereon executable instructions which, when executed by a processor, implement the method according to any of claims 1-7.
11. A computer program product comprising a computer program comprising one or more executable instructions which when executed by a processor implement the method according to any one of claims 1 to 7.
CN202311000163.5A 2023-08-09 2023-08-09 Credit card overdue risk prediction method, apparatus, device, medium and program product Pending CN117437019A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311000163.5A CN117437019A (en) 2023-08-09 2023-08-09 Credit card overdue risk prediction method, apparatus, device, medium and program product

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311000163.5A CN117437019A (en) 2023-08-09 2023-08-09 Credit card overdue risk prediction method, apparatus, device, medium and program product

Publications (1)

Publication Number Publication Date
CN117437019A true CN117437019A (en) 2024-01-23

Family

ID=89552299

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311000163.5A Pending CN117437019A (en) 2023-08-09 2023-08-09 Credit card overdue risk prediction method, apparatus, device, medium and program product

Country Status (1)

Country Link
CN (1) CN117437019A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117726181A (en) * 2024-02-06 2024-03-19 山东科技大学 Collaborative fusion and hierarchical prediction method for typical disaster risk heterogeneous information of coal mine
CN117726181B (en) * 2024-02-06 2024-04-30 山东科技大学 Collaborative fusion and hierarchical prediction method for typical disaster risk heterogeneous information of coal mine

Similar Documents

Publication Publication Date Title
US10755196B2 (en) Determining retraining of predictive models
CN111401777B (en) Enterprise risk assessment method, enterprise risk assessment device, terminal equipment and storage medium
CN110705719A (en) Method and apparatus for performing automatic machine learning
CN109118316B (en) Method and device for identifying authenticity of online shop
CN110766481A (en) Client data processing method and device, electronic equipment and computer readable medium
CN110717597A (en) Method and device for acquiring time sequence characteristics by using machine learning model
CN117437019A (en) Credit card overdue risk prediction method, apparatus, device, medium and program product
CN111178687A (en) Financial risk classification method and device and electronic equipment
CN114638695A (en) Credit evaluation method, device, equipment and medium
CN111191677B (en) User characteristic data generation method and device and electronic equipment
CN112950359B (en) User identification method and device
CN113610625A (en) Overdue risk warning method and device and electronic equipment
CN113128773A (en) Training method of address prediction model, address prediction method and device
CN116664306A (en) Intelligent recommendation method and device for wind control rules, electronic equipment and medium
CN116503092A (en) User reservation intention recognition method and device, electronic equipment and storage medium
CN116720946A (en) Credit risk prediction method, device and storage medium based on recurrent neural network
US11935075B2 (en) Card inactivity modeling
CN116091249A (en) Transaction risk assessment method, device, electronic equipment and medium
CN116245630A (en) Anti-fraud detection method and device, electronic equipment and medium
CN115099986A (en) Vehicle insurance renewal processing method and device and related equipment
CN114493853A (en) Credit rating evaluation method, credit rating evaluation device, electronic device and storage medium
CN112348584A (en) Vehicle estimation method, device and equipment
CN112734352A (en) Document auditing method and device based on data dimensionality
CN113537363B (en) Abnormal object detection method and device, electronic equipment and storage medium
US20240113936A1 (en) Method and system for artificial intelligence-based acceleration of zero-touch processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination