CN111506798A - User screening method, device, equipment and storage medium - Google Patents

User screening method, device, equipment and storage medium

Info

Publication number
CN111506798A
CN111506798A (application CN202010144416.6A)
Authority
CN
China
Prior art keywords
user, data, identified, feature, offline
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010144416.6A
Other languages
Chinese (zh)
Inventor
余雯
黄承伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN202010144416.6A priority Critical patent/CN111506798A/en
Priority to PCT/CN2020/093424 priority patent/WO2021174699A1/en
Publication of CN111506798A publication Critical patent/CN111506798A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90: Details of database functions independent of the retrieved data types
    • G06F16/95: Retrieval from the web
    • G06F16/953: Querying, e.g. by the use of web search engines
    • G06F16/9535: Search customisation based on user profiles and personalisation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G06F18/243: Classification techniques relating to the number of classes
    • G06F18/24323: Tree-organised classifiers

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The application relates to the field of data analysis and discloses a user screening method, device, equipment and storage medium. The method includes: acquiring offline image data and online data of a user to be identified, the offline image data including a visit record image; performing image processing on the visit record image to obtain offline visit data of the user to be identified; taking the offline visit data and the online data as basic data of the user to be identified, and performing feature derivation on the basic data to obtain feature data of the user to be identified; performing feature factor matching on the feature data of the user to be identified according to a pre-trained user identification model; and when the matching degree between the feature data of the user to be identified and the feature factors in the user identification model meets a preset condition, determining that the user to be identified is a target user. This improves the accuracy of screening target users from new users.

Description

User screening method, device, equipment and storage medium
Technical Field
The present application relates to the field of feature recognition, and in particular, to a method, an apparatus, a device, and a storage medium for user screening.
Background
Currently, in order to identify target users in a user group, a machine model is generally used to perform statistical analysis on users' historical purchase information so as to screen target users out of the group.
However, a new user has little historical purchase information and therefore little usable information, so when a machine model tries to identify a new user as a target user, it cannot accurately screen target users out of the new users, and the screening accuracy for target users is low.
Therefore, how to improve the accuracy of screening target users from new users has become an urgent problem to be solved.
Disclosure of Invention
The application provides a user screening method, a user screening device, user screening equipment and a storage medium, so that the screening accuracy of screening a target user from new users is improved.
In a first aspect, the present application provides a user screening method, including:
acquiring offline image data and online data of a user to be identified, wherein the offline image data comprises a visit record image;
performing image processing on the visit record image to obtain offline visit data of the user to be identified;
taking the offline visit data and the online data as basic data of the user to be identified, and performing feature derivation on the basic data to obtain feature data of the user to be identified;
performing feature factor matching on the feature data of the user to be identified according to a pre-trained user identification model, where the pre-trained user identification model is used to identify the feature factors of target users;
and when the matching degree between the feature data of the user to be identified and the feature factors in the user identification model meets a preset condition, determining that the user to be identified is a target user.
In a second aspect, the present application further provides a user screening apparatus, the apparatus including:
a data acquisition module, configured to acquire offline image data and online data of a user to be identified, the offline image data including a visit record image;
an image processing module, configured to perform image processing on the visit record image to obtain offline visit data of the user to be identified;
a feature derivation module, configured to take the offline visit data and the online data as basic data of the user to be identified and perform feature derivation on the basic data to obtain feature data of the user to be identified;
a feature matching module, configured to perform feature factor matching on the feature data of the user to be identified according to a pre-trained user identification model, where the pre-trained user identification model is used to identify the feature factors of target users;
and a user determination module, configured to determine that the user to be identified is a target user when the matching degree between the feature data of the user to be identified and the feature factors in the user identification model meets a preset condition.
In a third aspect, the present application further provides a computer device comprising a memory and a processor; the memory is used for storing a computer program; the processor is configured to execute the computer program and implement the user screening method as described above when executing the computer program.
In a fourth aspect, the present application further provides a computer-readable storage medium storing a computer program, which when executed by a processor causes the processor to implement the user screening method as described above.
The application discloses a user screening method, device, equipment and storage medium. Offline image data and online data of a user to be identified are obtained; the offline image data is subjected to image processing to obtain offline visit data of the user to be identified; the offline visit data and the online data are taken as basic data of the user to be identified, and feature derivation is performed on the basic data to obtain feature data of the user to be identified; feature factor matching is performed on the feature data according to a pre-trained user identification model; and when the matching degree between the feature data and the feature factors in the user identification model meets a preset condition, the user to be identified is determined to be a target user. Performing image processing on the offline image data improves both the accuracy and the speed of acquiring the offline visit data. Performing feature derivation on the basic data of the user to be identified enriches the feature data of the new user, and performing feature factor matching with the pre-trained user identification model to determine whether the user is a target user improves the screening accuracy for target users.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic flow chart of a user recognition model training method provided in an embodiment of the present application;
fig. 2 is a schematic flow chart of a user screening method provided in an embodiment of the present application;
FIG. 3 is a flow diagram illustrating sub-steps of the user screening method provided in FIG. 2;
fig. 4 is a schematic flowchart of a user screening method provided in an embodiment of the present application;
FIG. 5 is a schematic block diagram of a user recognition model training apparatus according to an embodiment of the present application;
fig. 6 is a schematic block diagram of a user screening apparatus according to an embodiment of the present application;
fig. 7 is a schematic block diagram of a user screening apparatus according to an embodiment of the present application;
fig. 8 is a schematic block diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The flow diagrams depicted in the figures are merely illustrative and do not necessarily include all of the elements and operations/steps, nor do they necessarily have to be performed in the order depicted. For example, some operations/steps may be decomposed, combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
It is to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
The embodiments of the present application provide a user screening method, apparatus, computer device, and storage medium. The user screening method can screen new users so as to pick out target users from among them, improving the accuracy of screening target users from new users. The target user may be a new user with purchasing potential for one or more specific products. In this embodiment, for convenience of description and understanding, a detailed description is given taking users with purchasing potential for a specific product as an example.
Some embodiments of the present application will be described in detail below with reference to the accompanying drawings. The embodiments described below and the features of the embodiments can be combined with each other without conflict.
Referring to fig. 1, fig. 1 is a schematic flow chart of a user recognition model training method according to an embodiment of the present application. According to the user identification model training method, sample data are subjected to characteristic derivation to obtain sample characteristic data, and then training is performed by using a random forest algorithm, so that the screening accuracy of a model obtained through training on a target user is improved.
As shown in fig. 1, the user recognition model training method specifically includes: step S101 to step S104.
S101, obtaining sample data, and sequentially performing feature extraction and feature derivation on the sample data to obtain sample feature data.
Specifically, the sample data includes a sample user's non-insurance sales data, online interaction data, offline interaction data, basic information, asset information, liability information, third-party user portrait, and so on over a historical period, and also includes the sample user's current purchase status for a specific product, where the specific product is the product to be promoted.
The third party user representation may be a user representation generated by the sample user on another platform, and the third party user representation may include information such as user preferences and purchasing preferences.
The sample feature data comprise sample features and sample feature values, and the sample feature data are used for constructing the user identification model.
And performing feature extraction on the sample data to obtain each sample feature and a feature value corresponding to the sample feature, and taking the obtained sample feature and the feature value corresponding to the sample feature as sample feature data.
For example, the user basic information may include sample features such as gender, age, and occupation, and the user asset information may include features such as annual income, movable property, and real estate. When the user's basic information is gender male and age 35, and a specific product has been purchased, the sample features and values are: gender = male, age = 35, and specific-product purchase = purchased.
Feature derivation on the sample data means performing feature learning on the sample data to obtain new sample user data, which together with the original sample data forms the sample feature data. Feature derivation may be performed by a basic transformation of a single feature, by adding a time dimension to a feature and time-slicing it, or by operations such as adding or multiplying multiple features. For example, two features may be added to obtain a new feature, such as adding the user's asset information and liability information to obtain the user's actual asset information, which is then used as the derived new feature.
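As an illustration of the two derivation styles just described, a minimal sketch in Python (all field names and values are hypothetical, not taken from the patent):

```python
import math

# A hypothetical user record; liabilities are stored as negative values.
user = {"assets": 500_000, "liabilities": -200_000, "income": 80_000}

# Basic transformation of a single feature: log-transform income.
user["log_income"] = math.log(user["income"])

# Operation over multiple features: actual (net) assets
# derived by adding asset and liability information.
user["net_assets"] = user["assets"] + user["liabilities"]
```

Each derived value becomes a new feature alongside the originals in the sample feature data.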
And S102, carrying out data cleaning on the sample characteristic data to obtain the cleaned sample characteristic data.
Specifically, the data cleaning of the sample characteristic data refers to the data cleaning of characteristic values in the sample characteristic data, and specifically includes data null value detection, data abnormal value detection, data exploration and the like.
When performing null-value inspection on the sample feature data, because the sample feature data contains multiple different types of data, a single global-constant filling strategy cannot be used; the different types of feature data must be considered together.
For example, for some interactive behavior features, zero-value filling can be performed in the case of unavailable acquisition or missing; for user revenue, assets can be populated with averages; the null values may be left unknown for gender, occupation, etc.
During data exploration of the sample feature data, the missing rate, maximum, minimum, mean, variance, percentile values, and so on of each sample feature's values are computed; sample features (and their corresponding feature values) whose missing rate is too high, or whose variance is so small that certain values carry no practical significance, are discarded. When an outlier exists, such as an abnormal maximum, it can be replaced by the 95th (or another) percentile.
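A minimal sketch of the cleaning rules above: type-aware null filling and percentile capping. The feature-type labels and the nearest-rank quantile approximation are simplifying assumptions:

```python
def fill_missing(value, feature_type, mean=None):
    """Type-aware null filling, per the cleaning rules above:
    interaction features -> 0, income/asset features -> the mean,
    categorical features (gender, occupation) -> 'unknown'."""
    if value is not None:
        return value
    if feature_type == "interaction":
        return 0
    if feature_type == "numeric":
        return mean
    return "unknown"

def cap_outliers(values, pct=0.95):
    """Replace abnormally large values with the pct quantile
    (nearest-rank approximation of the percentile)."""
    s = sorted(values)
    cap = s[int(pct * (len(s) - 1))]
    return [min(v, cap) for v in values]
```

For example, `cap_outliers([1, 2, 3, 100])` replaces the abnormal maximum 100 with the capped value 3.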
In some embodiments, after obtaining the cleaned sample feature data, the method may further include: and performing variable processing on the cleaned sample characteristic data to obtain processed sample characteristic data.
Specifically, the variable processing of the cleaned sample feature data includes removing repetitive sample features and calculating an IV value of the sample features in the cleaned sample data.
The removing of the repetitive sample features refers to determining whether sample features with high correlation exist in the sample features, for example, two features, namely a working city and a living city, exist in the sample features, and the two features have high correlation, so that any one of the living city and the working city can be removed.
The IV (information value) is used to measure the predictive power of a sample feature. To calculate the IV of the sample features in the cleaned sample data, the feature values of each sample feature are usually binned, the weight of evidence (WOE) of each bin is computed, and the IV is then calculated from the WOE values.
The WOE value is calculated as:

WOE_i = ln( (g_i / g) / (b_i / b) )

The IV value is calculated as:

IV = Σ_i ( g_i / g − b_i / b ) × WOE_i

where WOE_i is the WOE value of the i-th attribute (bin) of a sample feature, g_i is the number of users in the i-th attribute who purchased the product, g is the total number of purchasing users in the sample, b_i is the number of users in the i-th attribute who did not purchase the product, and b is the total number of non-purchasing users in the sample.
A higher WOE value means that users in the group are less likely to be non-purchasers. A larger IV value means a larger difference in the feature's distribution between sample users who purchased the specific product and those who did not, i.e. the sample feature has better discriminating power.
If the calculated IV value of a sample feature is too high (e.g., greater than 0.5), check whether the sample feature contains a posterior factor, i.e. information generated after the product purchase that could bias the model. For example, a transaction record created after a sample user purchases the specific product is strongly correlated with whether the product was purchased, but the sample feature must be removed precisely because it arises after the purchase.
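The WOE and IV computations above can be sketched directly from per-bin counts (the bin counts below are illustrative):

```python
import math

def woe_iv(purchased, not_purchased):
    """Per-bin WOE_i = ln((g_i/g) / (b_i/b)) and
    IV = sum_i (g_i/g - b_i/b) * WOE_i.
    purchased[i] / not_purchased[i] are the counts of purchasing
    and non-purchasing users falling in the i-th bin."""
    g, b = sum(purchased), sum(not_purchased)
    woes, iv = [], 0.0
    for gi, bi in zip(purchased, not_purchased):
        woe = math.log((gi / g) / (bi / b))
        woes.append(woe)
        iv += (gi / g - bi / b) * woe
    return woes, iv
```

A bin dominated by purchasers gets a positive WOE, one dominated by non-purchasers a negative WOE, and a feature whose bins separate the two groups well accumulates a large IV.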
S103, classifying the cleaned sample characteristic data according to a product purchase rule to respectively obtain a positive sample set and a negative sample set.
Specifically, the product purchase rule refers to a rule of whether a sample user purchases a specific product. In a specific implementation, the product purchase rules may be preset by an engineer.
According to the sample features and feature values related to specific-product purchase in the cleaned sample feature data, judge whether a sample user has purchased the specific product. If the sample user has purchased the specific product, classify the user as purchased and record the corresponding sample feature data as a positive sample; all such sample feature data in the cleaned data form the positive sample set. If the sample user has not purchased the specific product, classify the user as not purchased and record the corresponding sample feature data as a negative sample; all such sample feature data form the negative sample set. In a specific implementation, the purchased category may be labeled 1 and the not-purchased category 0.
In some embodiments, in order to avoid the situation that the proportion of the number of the samples in the positive sample set and the negative sample set is not uniform, which results in an overfitting when training the user recognition model, after step S103, the method may further include:
judging whether the difference between the numbers of samples in the negative sample set and the positive sample set is greater than a preset threshold; if so, analyzing the samples in the positive sample set to synthesize new samples, which are added to the positive sample set to construct a new positive sample set.
Specifically, the preset threshold may be preset by an engineer. In one implementation, the preset threshold may be 40% of the sum of the numbers of samples in the positive sample set and the negative sample set. For example, when the sum of the numbers of samples in the positive sample set and the negative sample set is 100, when the number of samples in the positive sample set is 20 and the number of samples in the negative sample set is 80, the difference between the numbers of samples in the positive sample set and the negative sample set is 60%, which indicates that the number of samples in the positive sample set is small, and therefore the samples in the positive sample set are analyzed to synthesize a new sample, and a new positive sample set is constructed.
In a specific implementation, the samples in the positive sample set are analyzed; for example, the positive sample set can be up-sampled with the SMOTE algorithm to synthesize new samples, which are added to the positive sample set to construct a new positive sample set.
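A toy sketch of the SMOTE up-sampling idea follows. Real SMOTE interpolates a minority sample toward one of its k nearest neighbours; this simplified version stands in a second randomly chosen minority sample for the neighbour:

```python
import random

def synthesize(minority, rng):
    """SMOTE-style synthesis step: pick two distinct minority samples
    and interpolate a new point somewhere on the segment between them."""
    a, b = rng.sample(minority, 2)
    t = rng.random()  # interpolation factor in [0, 1)
    return tuple(ai + t * (bi - ai) for ai, bi in zip(a, b))

rng = random.Random(0)
positives = [(0.0, 0.0), (1.0, 1.0), (0.5, 0.2)]  # illustrative feature vectors
new_sample = synthesize(positives, rng)  # appended to the positive set
```

Because the new point lies between two existing positives, it stays inside the minority class's region of feature space rather than duplicating an existing sample.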
And S104, training a user identification model by utilizing a random forest algorithm based on the positive sample set and the negative sample set to obtain the pre-trained user identification model.
The positive sample set and the negative sample set are combined into one sample set, which is divided into a training set and a test set such that the ratio of positive to negative samples is the same in both. The user identification model is trained on the training set with a random forest algorithm, and the trained model is verified on the test set.
In the specific implementation process, the specific training process for training the user recognition model by using the training set and the random forest algorithm is as follows:
1. Randomly draw, with replacement, a sample S' of the same size as the training set S from the training set; the examples never drawn form the out-of-bag data. Repeat the sampling n times to generate n training subsets;
2. For the n training subsets, train n CART decision tree models G_m(x), m ∈ {1, 2, ..., n}, respectively;
3. For the t-th decision tree model G_t(x), assuming the feature dimension of the training samples is W, randomly select a subset of w features at each node (w < W) and, according to the information gain index, choose the best feature to split on. For a class x_i, its information is defined as I(X = x_i) = −log₂ P(x_i), where I(X) denotes the information of the random variable and P(x_i) denotes the probability that x_i occurs.
4. Split each tree in the manner of step 3 until all training examples at a node belong to the same class; no pruning is needed during the splitting of the decision trees.
5. The generated decision trees form a random forest F, completing the training of the user identification model.
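Step 1 of the procedure above, bootstrap sampling with out-of-bag bookkeeping, can be sketched as:

```python
import random

def bootstrap_subsets(train, n_trees, seed=0):
    """For each of the n trees, draw with replacement a sample S' the
    same size as the training set S; the examples never drawn form
    that tree's out-of-bag data."""
    rng = random.Random(seed)
    subsets, oob = [], []
    for _ in range(n_trees):
        idx = [rng.randrange(len(train)) for _ in range(len(train))]
        idx_set = set(idx)
        subsets.append([train[i] for i in idx])
        oob.append([x for j, x in enumerate(train) if j not in idx_set])
    return subsets, oob
```

Each of the n subsets would then be used to grow one CART tree, and the matching out-of-bag set is held back for the error estimates used later when computing feature importance.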
After the user identification model is built, the importance of each sample feature in the sample set can be output, computed as follows:
1. For each decision tree in the random forest F, compute the out-of-bag error on its corresponding out-of-bag data, denoted errOOB1;
2. Randomly add noise interference to sample feature X in the out-of-bag data, recompute the out-of-bag error, and denote it errOOB2;
3. Assuming there are N trees in the random forest, the importance of sample feature X is:

P = (1/N) Σ (errOOB2 − errOOB1)

where P is the importance of sample feature X and the sum runs over the N trees.
After the importance of each sample feature is calculated, the sample features are sorted by importance, and the top-ranked sample features are used as the features for identifying target users.
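The importance formula can be computed directly from the per-tree out-of-bag errors (the error values in the usage line are illustrative):

```python
def permutation_importance(err_oob1, err_oob2):
    """P = (1/N) * sum_t (errOOB2_t - errOOB1_t): the mean rise in
    out-of-bag error after perturbing feature X, over the N trees.
    err_oob1[t] / err_oob2[t] are the errors for tree t before and
    after the perturbation."""
    n = len(err_oob1)
    return sum(e2 - e1 for e1, e2 in zip(err_oob1, err_oob2)) / n

# If perturbing X raises each tree's OOB error, X carries signal.
p = permutation_importance([0.10, 0.20], [0.30, 0.40])
```

A feature whose perturbation barely changes the out-of-bag error scores near zero and would fall to the bottom of the importance ranking.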
In the user identification model training method provided by this embodiment, sample feature data is obtained by performing feature extraction and feature derivation on the sample data, and the sample feature data is then cleaned. Feature derivation enlarges the obtained sample feature data, and data cleaning reduces the interference of dirty data with the user identification model. The user identification model is then trained on the cleaned sample feature data with a random forest algorithm, which improves the accuracy of the model's feature factor identification.
Referring to fig. 2, fig. 2 is a schematic flowchart of a user screening method according to an embodiment of the present application. The user screening method can perform image processing on the offline image data of the user to be identified, and matches the feature factors in the feature data of the user to be identified by using the pre-trained user identification model according to the processed user data of the user to be identified, so that the target user is screened out.
As shown in fig. 2, the user screening method specifically includes: step S201 to step S205.
S201, obtaining offline image data and online data of a user to be identified, wherein the offline image data comprises a visit record image.
Specifically, the offline image data includes a visit record image and a face recognition image. The visit record image includes a scanned or photographed image of the offline visit record that a salesperson makes, according to the communication with the user to be identified, when communicating with and visiting the user offline.
The online data comprises non-insurance sales data, online interactive data, basic information, asset information, liability information, third-party user representation and the like of the user to be identified. The third-party user portrait can be a user portrait generated by the user to be identified on other platforms, and the third-party user portrait can include information such as user preferences and purchasing preferences.
S202, image processing is performed on the visit record image to obtain offline visit data of the user to be identified.
Specifically, the offline visit data may include the user's name, ID number, the user's understanding of products, the names and prices of products the user has purchased, the user's needs, and so on. Image processing is performed on the offline visit records made by the salesperson, so as to extract the offline visit data of the user to be identified.
In some embodiments, referring to fig. 3, in order to improve the accuracy of the image processing, step S202 may include:
s2021, preprocessing the visit record image.
Specifically, the preprocessing may include binarization, noise removal, tilt correction, and the like, wherein, when the visit record image is a color image, the visit record image may be binarized to obtain a black-and-white binarized map. And preprocessing the visit record image to improve the identification accuracy of the visit record image.
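The binarization mentioned in this pre-processing step can be sketched as follows, representing the image as rows of 0-255 grayscale intensities (a simplifying assumption; the patent does not specify an image format or threshold):

```python
def binarize(gray, threshold=128):
    """Turn a grayscale visit-record image (rows of 0-255 intensities)
    into a black-and-white binary map: pixels at or above the threshold
    become white (255), the rest black (0)."""
    return [[255 if p >= threshold else 0 for p in row] for row in gray]
```

Noise removal and tilt correction would follow the same pattern, each transforming the pixel grid before layout analysis and character recognition.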
S2022, performing layout analysis and character recognition on the preprocessed visit record image to obtain a recognition result.
Specifically, when performing layout analysis and character recognition on the preprocessed visit record image, layout analysis means segmenting the visit record image into paragraphs and lines according to the text content it contains; after segmentation into paragraphs and lines, character cutting and character recognition are performed on the text content of each paragraph and line, thereby recognizing the text content contained in the visit record image.
S2023, determining offline visit data according to the recognition result.
Specifically, determining the offline visit data according to the recognition result may mean post-processing the recognition result and using the post-processed data as the offline visit data. Post-processing the recognition result means correcting it according to the relations of the specific language context, which improves the accuracy of the image processing; the post-processed recognition result is stored as the offline visit data.
S203, taking the offline visit data and the online data as basic data of the user to be identified, and performing feature derivation on the basic data to obtain feature data of the user to be identified.
Specifically, the offline visit data obtained through processing and the online data are jointly used as the basic data of the user to be identified, and feature derivation is performed on the basic data to obtain the feature data of the user to be identified.
The feature data of the user to be identified comprises features and feature values. For example, the user basic information may include features such as gender, age, and occupation, and the user asset information may include features such as annual income, movable property, and real estate. When the basic information of the user to be identified is gender male and age 35, one feature is gender with feature value male, and another feature is age with feature value 35.
The characteristic derivation of the basic data of the user to be identified refers to the characteristic learning of the basic data of the user to be identified to obtain new user data to be identified, and the new user data to be identified and the basic data of the user to be identified are jointly used as the characteristic data of the user to be identified.
In a specific implementation, the characteristic derivation can be performed by the following three methods:
1. Basic transformation of a single feature, such as squaring, taking the square root, or applying a log transformation to a single feature.
2. Deriving features by adding a time dimension; for example, time slicing may be performed on the basic data of the user to be identified to obtain the user's online interaction data, offline interaction data, non-insurance sales data, and the like within 1 month, 3 months, 6 months, and 12 months respectively.
3. Multi-feature operations, such as adding or multiplying two features, or computing a ratio between features, to obtain a new feature. For example, the user asset information and user liability information may be summed, or combined through operations such as addition and multiplication, to obtain new user data to be identified.
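The three derivation methods above can be sketched as follows (an illustration only; the field names and time window are assumed):

```python
import math

def derive_features(user, interaction_days_ago):
    """Derive new features from basic user data via the three methods."""
    derived = dict(user)
    # 1. Basic transformation of a single feature: log of annual income.
    derived["log_income"] = math.log(user["annual_income"])
    # 2. Time-dimension feature: interactions within the last 90 days.
    derived["interactions_90d"] = sum(1 for d in interaction_days_ago if d <= 90)
    # 3. Multi-feature operation: debt-to-asset ratio.
    derived["debt_ratio"] = user["liabilities"] / user["assets"]
    return derived

user = {"annual_income": 100000, "assets": 500000, "liabilities": 100000}
# Interaction times expressed as "days ago" for simplicity.
features = derive_features(user, interaction_days_ago=[10, 45, 200])
```

The derived features are kept alongside the original basic data, matching the description that old and new data together form the feature data.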
And S204, performing characteristic factor matching on the characteristic data of the user to be recognized according to a pre-trained user recognition model.
Specifically, the pre-trained user recognition model is used to recognize the feature factors of the target user. Because the trained user recognition model outputs the importance of features, the model is used to perform feature matching on the feature data of the user to be identified, judging whether the feature data includes the important features output by the user recognition model.
S205, when the matching degree of the feature data of the user to be recognized and the feature factors in the user recognition model meets a preset condition, determining that the user to be recognized is a target user.
Specifically, when the matching degree between the feature data of the user to be identified and the feature factors in the user identification model meets a preset condition, it is indicated that the user to be identified is a user with purchase potential, and the identified user is determined to be a target user.
The preset condition may be that the matching degree between the feature data of the user to be recognized and the feature factor in the user recognition model reaches a preset threshold, or that the matching degree between the feature data of the user to be recognized and the feature factor in the user recognition model is within a numerical range.
When the preset condition is that the matching degree between the feature data of the user to be recognized and the feature factors in the user recognition model reaches a preset threshold, the preset threshold may be a percentage or a specific numerical value preset by an engineer. When the preset threshold is a percentage, the percentage refers to the proportion of the important features output by the user recognition model that are present among the features of the user to be recognized.
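When the threshold is a percentage, the matching degree can be sketched as the fraction of the model's important features present in the user's feature data (the feature names and the 0.8 threshold below are illustrative):

```python
def matching_degree(user_features, important_features):
    """Fraction of the model's important features found in the user's data."""
    hits = sum(1 for f in important_features if f in user_features)
    return hits / len(important_features)

def is_target_user(user_features, important_features, threshold=0.8):
    """Apply the preset-threshold condition from the description."""
    return matching_degree(user_features, important_features) >= threshold

user_features = {"age": 35, "annual_income": 100000,
                 "real_estate": 1, "debt_ratio": 0.2}
important = ["annual_income", "real_estate", "debt_ratio", "age", "occupation"]
degree = matching_degree(user_features, important)  # 4 of 5 matched -> 0.8
```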
In other embodiments, other models may be trained to calculate and adjust the specific values of the threshold, and the models may also be trained to adjust the weighting coefficients of the important features.
The user screening method disclosed in the above embodiment performs image processing on the acquired offline image data of the user to be identified to obtain offline visiting data, and uses the offline visiting data and the online data together as the basic data of the user to be identified, thereby increasing the amount of basic data. Feature derivation is performed on the basic data to obtain feature data, which increases the amount of feature data and enriches the data of new users. Feature factor matching is performed on the feature data according to the user recognition model, and the user to be identified is determined to be a target user when the matching degree meets a preset condition; using the trained user recognition model improves the screening accuracy for target users.
Referring to fig. 4, fig. 4 is a schematic flowchart of a user screening method according to an embodiment of the present application. The user screening method can perform image processing on the offline image data of the user to be identified, associate the identified offline data with the online data by using the face data to obtain the user data, and screen the target user for the user to be identified by using the pre-trained user identification model.
As shown in fig. 4, the user screening method specifically includes: step S301 to step S306.
S301, obtaining offline image data and online data of a user to be identified, wherein the offline image data comprises a visit record image and a face identification image.
Specifically, the offline image data includes a visit record image and a face recognition image. The visit record image comprises a scanned or photographed image of the offline visit record made by service personnel, according to the communication with the user to be identified, during an offline communication visit. The face recognition image comprises a face image collected during the offline communication visit to the user to be identified.
S302, image processing is carried out on the visit record image so as to obtain offline visiting data of the user to be identified.
Specifically, the offline visiting data may include the user's name, identification number, the user's understanding of the product, the name and price of the product purchased by the user, the user's demands, and the like. Image processing is performed on the offline visit records made by the service personnel, thereby extracting the offline visiting data of the user to be identified.
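Extraction of structured offline visiting data from the recognized text can be sketched with regular expressions (the "Field: value" record layout and field names are assumed):

```python
import re

def parse_visit_record(text):
    """Extract the user name and ID number from recognized visit-record
    text, assuming one 'Field: value' pair per line."""
    fields = {}
    for pattern, key in [(r"Name:\s*(\S+)", "name"),
                         (r"ID:\s*(\d+)", "id_number")]:
        m = re.search(pattern, text)
        if m:
            fields[key] = m.group(1)
    return fields

record = parse_visit_record("Name: Zhang\nID: 110101199001011234\nDemand: annuity")
```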
In some embodiments, the face recognition image may further include a face image acquired when the user to be recognized performs face recognition signing while participating in the offline activity.
Specifically, when a user to be identified participates in an offline activity, a face capture device is arranged at the manual sign-in location; the device collects the face information of each signed-in user, and the collected face information is stored in a database together with the sign-in time and location. Then, when the visit record image is processed to obtain the offline visiting data of the user to be identified, the face image collected when the user participated in the offline activity can be matched against the offline visiting data, so that the sign-in time and location corresponding to the user's participation in the offline activity are also included in the user's offline visiting data.
And S303, associating the offline visiting data with the online data according to the face recognition image to obtain basic data of the user to be recognized, and performing feature derivation on the basic data to obtain feature data of the user to be recognized.
Specifically, since the online data of the user to be identified includes the user's basic information, that is, information such as name, gender, identification number, and face image, after the offline visiting data is obtained through image processing, the obtained offline visiting data and the user's online data need to be stored together as the user's basic data.
Therefore, when data association is carried out, matching is carried out according to the face recognition image corresponding to the offline visiting data and the face image in the online data; and when the face recognition image is successfully matched, establishing the association between the offline visiting data and the online data, and integrating the offline visiting data and the online data to obtain the basic data of the user to be recognized.
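The association step can be sketched with cosine similarity between face embeddings (the embedding vectors and the 0.9 similarity threshold are illustrative; a real system would obtain embeddings from a face recognition model):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def associate(offline_record, online_users, threshold=0.9):
    """Link an offline visiting record to the online user whose stored
    face embedding best matches the record's face embedding, then merge
    both data sources into the user's basic data."""
    best, best_sim = None, threshold
    for user in online_users:
        sim = cosine_similarity(offline_record["face"], user["face"])
        if sim >= best_sim:
            best, best_sim = user, sim
    if best is None:
        return None  # no sufficiently close face match
    return {**best["online_data"], **offline_record["visit_data"]}

online_users = [
    {"face": [0.9, 0.1, 0.4], "online_data": {"name": "Zhang", "gender": "male"}},
    {"face": [0.1, 0.8, 0.6], "online_data": {"name": "Li", "gender": "female"}},
]
offline_record = {"face": [0.88, 0.12, 0.41], "visit_data": {"demand": "annuity"}}
basic_data = associate(offline_record, online_users)
```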
Using face recognition image matching instead of manual data entry by service personnel improves data entry efficiency. Moreover, because offline visiting data is usually entered by screening on names, the online data and offline visiting data of users with the same name are easily confused; face recognition image matching also avoids this confusion during offline visiting data entry.
The characteristic derivation of the basic data of the user to be identified refers to the characteristic learning of the basic data of the user to be identified to obtain new user data to be identified, and the new user data to be identified and the basic data of the user to be identified are jointly used as the characteristic data of the user to be identified.
S304, carrying out data cleaning on the feature data of the user to be identified to obtain the cleaned feature data of the user to be identified.
Specifically, the data cleaning of the feature data of the user to be identified comprises data null value detection, data abnormal value detection and the like.
When performing null value detection on the feature data, because the feature data comprises several different types of data, a single global constant cannot be used for filling; the different types of feature data need to be considered separately.
For example, for some interaction behavior features, zero-value filling can be used when the value could not be collected or is missing; user income and assets can be filled with the average value; and null values for gender, occupation, and the like may be left as unknown.
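These type-aware filling rules can be sketched as follows (the column names and strategies are illustrative):

```python
def fill_nulls(rows):
    """Fill missing values per column type: zeros for interaction counts,
    the column mean for income, and 'unknown' for categorical fields."""
    incomes = [r["income"] for r in rows if r["income"] is not None]
    mean_income = sum(incomes) / len(incomes)
    for r in rows:
        if r["clicks"] is None:
            r["clicks"] = 0              # interaction feature: zero fill
        if r["income"] is None:
            r["income"] = mean_income    # asset/income feature: mean fill
        if r["gender"] is None:
            r["gender"] = "unknown"      # categorical: keep as unknown
    return rows

rows = fill_nulls([
    {"clicks": 3, "income": 80000, "gender": "male"},
    {"clicks": None, "income": None, "gender": None},
    {"clicks": 5, "income": 120000, "gender": "female"},
])
```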
S305, carrying out characteristic factor matching on the characteristic data of the user to be recognized according to a pre-trained user recognition model.
Specifically, because the trained user recognition model outputs the importance of feature factors, the model is used to perform feature factor matching on the cleaned feature data of the user to be recognized, judging whether the cleaned feature data includes the important feature factors output by the user recognition model.
S306, when the matching degree of the feature data of the user to be identified and the feature factors in the user identification model meets a preset condition, determining that the user to be identified is a target user.
Specifically, when the matching degree between the feature data of the cleaned user to be identified and the feature factors in the user identification model meets a preset condition, it is determined that the user to be identified is a user with purchase potential, and the identified user is determined to be a target user.
The embodiment discloses a user screening method, which associates offline visiting data and online data by using a face recognition image, so that the offline visiting data and the online data are jointly used as basic data, and the accuracy and the efficiency of offline visiting data entry are improved. And the basic data is subjected to feature derivation to obtain feature data, so that the data volume of the feature data is increased, and the data of a new user is enriched. And the data cleaning is carried out on the characteristic data, so that the interference of dirty data on a user identification model is reduced, and the accuracy of characteristic factor matching is improved. And matching the characteristic factors of the characteristic data according to the user identification model, determining the user to be identified as a target user when the matching degree meets a preset condition, and improving the identification accuracy of the target user by using the trained user identification model.
Referring to fig. 5, fig. 5 is a schematic block diagram of a user recognition model training apparatus according to an embodiment of the present application, which may be configured in a server for executing the user recognition model training method.
As shown in fig. 5, the user recognition model training apparatus 400 includes: a data acquisition module 401, a data cleansing module 402, a data classification module 403, and a model training module 404.
The data obtaining module 401 is configured to obtain sample data, and perform feature extraction and feature derivation on the sample data in sequence to obtain sample feature data.
A data cleaning module 402, configured to perform data cleaning on the sample feature data to obtain cleaned sample feature data.
A data classification module 403, configured to classify the cleaned sample feature data according to a product purchase rule, so as to obtain a positive sample set and a negative sample set respectively.
And a model training module 404, configured to train a user identification model based on the positive sample set and the negative sample set by using a random forest algorithm, so as to obtain the pre-trained user identification model.
Referring to fig. 6, fig. 6 is a schematic block diagram of a user screening apparatus according to an embodiment of the present application, where the user screening apparatus is configured to perform the user screening method. The user screening device may be configured in a server or a terminal.
The server may be an independent server or a server cluster. The terminal can be an electronic device such as a mobile phone, a tablet computer, a notebook computer, a desktop computer, a personal digital assistant and a wearable device.
As shown in fig. 6, the user filtering apparatus 500 includes: a data acquisition module 501, an image processing module 502, a feature derivation module 503, a feature matching module 504, and a user determination module 505.
The data obtaining module 501 is configured to obtain offline image data and online data of a user to be identified, where the offline image data includes a visit record image.
An image processing module 502, configured to perform image processing on the visit record image to obtain offline visiting data of the user to be identified.
Among other things, the image processing module 502 includes a pre-processing sub-module 5021, an identification result sub-module 5022, and a data determination sub-module 5023.
Specifically, the preprocessing sub-module 5021 is configured to preprocess the visit record image.
And the recognition result sub-module 5022 is used for performing layout analysis and character recognition on the preprocessed visit record images to obtain recognition results.
And the data determination submodule 5023 is used for determining offline visiting data according to the identification result.
The feature derivation module 503 is configured to use the offline visiting data and the online data as basic data of the user to be identified, and perform feature derivation on the basic data to obtain feature data of the user to be identified.
A feature matching module 504, configured to perform feature factor matching on the feature data of the user to be identified according to a pre-trained user identification model, where the pre-trained user identification model is used to identify a feature factor of a target user.
And a user determining module 505, configured to determine that the user to be identified is a target user when a matching degree between the feature data of the user to be identified and the feature factor in the user identification model meets a preset condition.
Referring to fig. 7, fig. 7 is a schematic block diagram of a user screening apparatus according to an embodiment of the present application, where the user screening apparatus is configured to perform the user screening method. The user screening device may be configured in a server or a terminal.
As shown in fig. 7, the user filtering apparatus 600 includes: a data acquisition module 601, an image processing module 602, a feature derivation module 603, a data cleansing module 604, a feature matching module 605, and a user determination module 606.
The data acquisition module 601 is configured to acquire offline image data and online data of a user to be identified, where the offline image data includes a visit record image and a face identification image.
An image processing module 602, configured to perform image processing on the visit record image to obtain offline visiting data of the user to be identified.
The feature derivation module 603 is configured to associate the offline visiting data with the online data according to the face recognition image to obtain basic data of the user to be recognized, and perform feature derivation on the basic data to obtain feature data of the user to be recognized.
A data cleaning module 604, configured to perform data cleaning on the feature data of the user to be identified, so as to obtain the cleaned feature data of the user to be identified.
A feature matching module 605, configured to perform feature factor matching on the feature data of the user to be identified according to a pre-trained user identification model, where the pre-trained user identification model is used to identify a feature factor of a target user.
A user determining module 606, configured to determine that the user to be identified is a target user when a matching degree between the feature data of the user to be identified and the feature factor in the user identification model meets a preset condition.
It should be noted that, as will be clear to those skilled in the art, for convenience and brevity of description, the specific working processes of the user recognition model training apparatus and each module described above and the specific working processes of the user screening apparatus and each module described above may refer to the corresponding processes in the foregoing user recognition model training method and user screening method embodiments, and are not described herein again.
The user screening apparatus described above may be implemented in the form of a computer program which may be run on a computer device as shown in fig. 8.
Referring to fig. 8, fig. 8 is a schematic block diagram of a computer device according to an embodiment of the present disclosure. The computer device may be a server or a terminal.
Referring to fig. 8, the computer device includes a processor, a memory, and a network interface connected through a system bus, wherein the memory may include a nonvolatile storage medium and an internal memory.
The non-volatile storage medium may store an operating system and a computer program. The computer program includes program instructions that, when executed, cause a processor to perform any of the user screening methods.
The processor is used for providing calculation and control capability and supporting the operation of the whole computer equipment.
The internal memory provides an environment for the execution of a computer program on a non-volatile storage medium, which when executed by the processor, causes the processor to perform any of the user screening methods.
The network interface is used for network communication, such as sending assigned tasks and the like. Those skilled in the art will appreciate that the architecture shown in fig. 8 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or fewer components than those shown, may combine certain components, or may have a different arrangement of components.
It should be understood that the Processor may be a Central Processing Unit (CPU), and the Processor may be other general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, etc. Wherein a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Wherein, in one embodiment, the processor is configured to execute a computer program stored in the memory to implement the steps of:
acquiring offline image data and online data of a user to be identified, wherein the offline image data comprises a visit record image; performing image processing on the visit record image to obtain offline visiting data of the user to be identified; taking the offline visiting data and the online data as basic data of the user to be identified, and performing feature derivation on the basic data to obtain feature data of the user to be identified; performing feature factor matching on the feature data of the user to be recognized according to a pre-trained user recognition model, wherein the pre-trained user recognition model is used for recognizing the feature factors of a target user; and when the matching degree of the feature data of the user to be identified and the feature factors in the user identification model meets a preset condition, determining that the user to be identified is a target user.
In one embodiment, when the image processing on the visit record image is implemented to obtain offline visiting data of the user to be identified, the processor is configured to implement:
preprocessing the visit record image, wherein the preprocessing comprises binarization, noise removal and tilt correction; performing layout analysis and character recognition on the preprocessed visit record image to obtain a recognition result; and determining offline visiting data according to the recognition result.
In one embodiment, the offline image data further comprises a face recognition image; the processor is configured to, when the offline visiting data and the online data are used as basic data of the user to be identified, implement: and associating the offline visiting data with online data according to the face recognition image to obtain basic data of the user to be recognized.
In one embodiment, before implementing the feature factor matching of the feature data of the user to be recognized according to the pre-trained user recognition model, the processor is further configured to implement:
and carrying out data cleaning on the characteristic data of the user to be identified to obtain the cleaned characteristic data of the user to be identified.
In one embodiment, the processor is further configured to implement:
acquiring sample data, and sequentially performing feature extraction and feature derivation on the sample data to obtain sample feature data; carrying out data cleaning on the sample characteristic data to obtain cleaned sample characteristic data; classifying the cleaned sample characteristic data according to product purchasing rules to respectively obtain a positive sample set and a negative sample set; and training a user identification model by utilizing a random forest algorithm based on the positive sample set and the negative sample set to obtain the pre-trained user identification model.
In one embodiment, the processor is further configured to implement: performing variable processing on the cleaned sample characteristic data to obtain processed sample characteristic data; the processor is configured to, when classifying the cleaned sample feature data according to a product purchase rule to obtain a positive sample set and a negative sample set, respectively: and classifying the processed sample data according to a product purchasing rule to respectively obtain a positive sample set and a negative sample set.
In one embodiment, the processor, prior to implementing the training of the user recognition model with the random forest algorithm based on the positive and negative sample sets, is further configured to implement:
judging whether the difference of the sample numbers of the negative sample set and the positive sample set is greater than a preset threshold value or not; if the number difference between the negative sample set and the positive sample set is larger than a preset threshold value, analyzing the samples in the positive sample set to synthesize a new sample and adding the new sample to the positive sample set to construct a new positive sample set.
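The synthesis step resembles SMOTE-style oversampling: a new positive sample is created by interpolating between an existing positive sample and one of its neighbors. A minimal sketch (the interpolation factor is fixed for clarity; SMOTE proper picks it at random and uses nearest neighbors):

```python
def synthesize(sample, neighbor, alpha=0.5):
    """Create a new positive sample by interpolating between an existing
    positive sample and one of its neighbors (SMOTE-style)."""
    return [s + alpha * (n - s) for s, n in zip(sample, neighbor)]

positives = [[1.0, 2.0], [3.0, 4.0]]
negatives = [[9.0, 9.0]] * 10  # far more negatives than positives

# Add synthetic positives until the sample-count difference between the
# negative set and the positive set falls within the preset threshold.
PRESET_THRESHOLD = 2
while len(negatives) - len(positives) > PRESET_THRESHOLD:
    positives.append(synthesize(positives[0], positives[1]))
```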
The embodiment of the application further provides a computer-readable storage medium, wherein a computer program is stored in the computer-readable storage medium, the computer program comprises program instructions, and the processor executes the program instructions to realize any user screening method provided by the embodiment of the application.
The computer-readable storage medium may be an internal storage unit of the computer device described in the foregoing embodiment, for example, a hard disk or a memory of the computer device. The computer readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the computer device.
While the invention has been described with reference to specific embodiments, the scope of the invention is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A user screening method is characterized by comprising the following steps:
acquiring offline image data and online data of a user to be identified, wherein the offline image data comprises a visit record image;
performing image processing on the visit record image to obtain offline visiting data of the user to be identified;
taking the offline visiting data and the online data as basic data of the user to be identified, and performing feature derivation on the basic data to obtain feature data of the user to be identified;
carrying out characteristic factor matching on the characteristic data of the user to be recognized according to a pre-trained user recognition model, wherein the pre-trained user recognition model is used for recognizing the characteristic factor of a target user;
and when the matching degree of the feature data of the user to be identified and the feature factors in the user identification model meets a preset condition, determining that the user to be identified is a target user.
2. The user screening method according to claim 1, wherein the image processing of the visit record image to obtain offline visiting data of the user to be identified comprises:
preprocessing the visit record image, wherein the preprocessing comprises binarization, noise removal and tilt correction;
performing layout analysis and character recognition on the preprocessed visit record image to obtain a recognition result;
and determining offline visiting data according to the identification result.
3. The user screening method of claim 1, wherein the offline image data further includes a face recognition image;
the taking the offline visiting data and the online data as the basic data of the user to be identified comprises the following steps:
and associating the offline visiting data with online data according to the face recognition image to obtain basic data of the user to be recognized.
4. The user screening method according to claim 1, before the performing feature factor matching on the feature data of the user to be recognized according to the pre-trained user recognition model, further comprising:
and carrying out data cleaning on the characteristic data of the user to be identified to obtain the cleaned characteristic data of the user to be identified.
5. The user screening method according to claim 1, further comprising:
acquiring sample data, and sequentially performing feature extraction and feature derivation on the sample data to obtain sample feature data;
carrying out data cleaning on the sample characteristic data to obtain cleaned sample characteristic data;
classifying the cleaned sample characteristic data according to product purchasing rules to respectively obtain a positive sample set and a negative sample set;
and training a user identification model by utilizing a random forest algorithm based on the positive sample set and the negative sample set to obtain the pre-trained user identification model.
6. The user screening method according to claim 5, further comprising:
performing variable processing on the cleaned sample characteristic data to obtain processed sample characteristic data;
classifying the cleaned sample characteristic data according to product purchase rules to respectively obtain a positive sample set and a negative sample set, including:
and classifying the processed sample data according to a product purchasing rule to respectively obtain a positive sample set and a negative sample set.
7. The user screening method of claim 5, further comprising, before the training a user recognition model using a random forest algorithm based on the positive and negative sample sets:
judging whether the difference of the sample numbers of the negative sample set and the positive sample set is greater than a preset threshold value or not;
if the number difference between the negative sample set and the positive sample set is larger than a preset threshold value, analyzing the samples in the positive sample set to synthesize a new sample and adding the new sample to the positive sample set to construct a new positive sample set.
8. A user screening apparatus, comprising:
the system comprises a data acquisition module, a data acquisition module and a data acquisition module, wherein the data acquisition module is used for acquiring offline image data and online data of a user to be identified, and the offline image data comprises a visit record image;
the image processing module is used for performing image processing on the visit record image to obtain offline visiting data of the user to be identified;
the characteristic derivation module is used for taking the offline visiting data and the online data as basic data of the user to be identified and carrying out characteristic derivation on the basic data to obtain characteristic data of the user to be identified;
the characteristic matching module is used for matching characteristic factors of the characteristic data of the user to be recognized according to a pre-trained user recognition model, and the pre-trained user recognition model is used for recognizing the characteristic factors of a target user;
and the user determining module is used for determining the user to be identified as the target user when the matching degree of the feature data of the user to be identified and the feature factors in the user identification model meets a preset condition.
9. A computer device, wherein the computer device comprises a memory and a processor;
the memory is used for storing a computer program;
the processor is used for executing the computer program and implementing the user screening method of any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement the user screening method of any one of claims 1 to 7.
CN202010144416.6A 2020-03-04 2020-03-04 User screening method, device, equipment and storage medium Pending CN111506798A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010144416.6A CN111506798A (en) 2020-03-04 2020-03-04 User screening method, device, equipment and storage medium
PCT/CN2020/093424 WO2021174699A1 (en) 2020-03-04 2020-05-29 User screening method, apparatus and device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010144416.6A CN111506798A (en) 2020-03-04 2020-03-04 User screening method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN111506798A true CN111506798A (en) 2020-08-07

Family

ID=71863921

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010144416.6A Pending CN111506798A (en) 2020-03-04 2020-03-04 User screening method, device, equipment and storage medium

Country Status (2)

Country Link
CN (1) CN111506798A (en)
WO (1) WO2021174699A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105426857A (en) * 2015-11-25 2016-03-23 小米科技有限责任公司 Training method and device of face recognition model
CN106022317A (en) * 2016-06-27 2016-10-12 北京小米移动软件有限公司 Face identification method and apparatus
CN106789844A (en) * 2015-11-23 2017-05-31 阿里巴巴集团控股有限公司 Malicious user identification method and device
CN109784351A (en) * 2017-11-10 2019-05-21 财付通支付科技有限公司 Data classification method, disaggregated model training method and device
CN110415065A (en) * 2018-04-28 2019-11-05 K11集团有限公司 User data collection system and information-pushing method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107657048B (en) * 2017-09-21 2020-12-04 麒麟合盛网络技术股份有限公司 User identification method and device
US20190324621A1 (en) * 2018-04-23 2019-10-24 Qualcomm Incorporated System and Methods for Utilizing Multi-Finger Touch Capability to Efficiently Perform Content Editing on a Computing Device
CN109218390B (en) * 2018-07-12 2021-09-10 北京比特智学科技有限公司 User screening method and device
CN110049094B (en) * 2019-02-28 2022-03-04 创新先进技术有限公司 Information pushing method and offline display terminal
CN110175298B (en) * 2019-04-12 2023-11-14 腾讯科技(深圳)有限公司 User matching method

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021164232A1 (en) * 2020-02-17 2021-08-26 平安科技(深圳)有限公司 User identification method and apparatus, and device and storage medium
CN113743752A (en) * 2021-08-23 2021-12-03 南京星云数字技术有限公司 Data processing method and device
CN117250521A (en) * 2023-11-17 2023-12-19 江西驴宝宝通卡科技有限公司 Charging pile battery capacity monitoring system and method
CN117250521B (en) * 2023-11-17 2024-02-20 江西驴充充物联网科技有限公司 Charging pile battery capacity monitoring system and method

Also Published As

Publication number Publication date
WO2021174699A1 (en) 2021-09-10

Similar Documents

Publication Publication Date Title
WO2019214248A1 (en) Risk assessment method and apparatus, terminal device, and storage medium
CN107025596B (en) Risk assessment method and system
CN107066616B (en) Account processing method and device and electronic equipment
WO2021164232A1 (en) User identification method and apparatus, and device and storage medium
CN110555372A (en) Data entry method, device, equipment and storage medium
CN109583966B (en) High-value customer identification method, system, equipment and storage medium
CN111506798A (en) User screening method, device, equipment and storage medium
CN112102073A (en) Credit risk control method and system, electronic device and readable storage medium
CN112559900B (en) Product recommendation method and device, computer equipment and storage medium
CN112017040B (en) Credit scoring model training method, scoring system, equipment and medium
CN112766824A (en) Data processing method and device, electronic equipment and storage medium
CN111127110A (en) Merchant score calculation method, device, equipment and storage medium
CN115936841A (en) Method and device for constructing credit risk assessment model
CN114743048A (en) Method and device for detecting abnormal straw picture
CN115713424A (en) Risk assessment method, risk assessment device, equipment and storage medium
CN114170000A (en) Credit card user risk category identification method, device, computer equipment and medium
CN115114851A (en) Scoring card modeling method and device based on five-fold cross validation
CN114626940A (en) Data analysis method and device and electronic equipment
CN114066564A (en) Service recommendation time determination method and device, computer equipment and storage medium
CN109308565B (en) Crowd performance grade identification method and device, storage medium and computer equipment
CN113435741A (en) Training plan generation method, device, equipment and storage medium
CN114548620A (en) Logistics punctual insurance service recommendation method and device, computer equipment and storage medium
CN110610200B (en) Vehicle and merchant classification method and device, computer equipment and storage medium
CN117037167A (en) Sensitive information detection method, device, equipment and medium based on artificial intelligence
CN114022284A (en) Abnormal transaction detection method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40031456

Country of ref document: HK