CN111159397A - Text classification method and device and server - Google Patents

Text classification method and device and server

Info

Publication number
CN111159397A
CN111159397A (application CN201911227485.7A)
Authority
CN
China
Prior art keywords
text
classification model
text classification
training
category
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911227485.7A
Other languages
Chinese (zh)
Other versions
CN111159397B (en)
Inventor
马良庄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd filed Critical Alipay Hangzhou Information Technology Co Ltd
Priority to CN201911227485.7A
Publication of CN111159397A
Application granted
Publication of CN111159397B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; database structures therefor; file system structures therefor
    • G06F16/30: Information retrieval of unstructured textual data
    • G06F16/35: Clustering; classification

Abstract

Embodiments of this specification provide a text classification method, apparatus, and server. A second text classification model with relatively weak feature extraction capability outputs a prediction category for a first training text, and received text information is then classified by a first text classification model trained on the first training text together with its real category and that prediction category. Because the feature extraction capability of the second text classification model is weaker than that of the first, the second model constrains the first, which effectively controls overfitting of the first text classification model and improves the accuracy of text classification.

Description

Text classification method and device and server
Technical Field
The specification relates to the technical field of artificial intelligence, in particular to a text classification method, a text classification device and a server.
Background
In everyday applications, text information often needs to be classified. For example, in an intelligent robot customer service scenario, a user may send text information to the robot customer service. The text may relate to account operations, such as how to register an account or how to bind a mobile phone number to an account; it may relate to orders, such as how to cancel an order or how long a refund takes after an order is cancelled; or it may be of another type altogether. To improve the response efficiency of the robot customer service, this text information needs to be classified, and it is therefore necessary to improve the accuracy of text classification.
Disclosure of Invention
In view of the above, embodiments of this specification provide a text classification method, apparatus, and server.
According to a first aspect of embodiments herein, there is provided a text classification method, the method comprising:
receiving text information;
classifying the text information through a pre-trained first text classification model to determine the category of the text information;
the first text classification model is obtained by training based on a first training text, a real class of the first training text and a first prediction class of the first training text output by a pre-trained second text classification model, and the feature extraction capability of the second text classification model is lower than that of the first text classification model.
According to a second aspect of embodiments herein, there is provided a text classification apparatus, the apparatus comprising:
the receiving module is used for receiving text information;
the classification module is used for classifying the text information through a pre-trained first text classification model so as to determine the category of the text information;
the first text classification model is obtained by training based on a first training text, a real class of the first training text and a first prediction class of the first training text output by a pre-trained second text classification model, and the feature extraction capability of the second text classification model is lower than that of the first text classification model.
According to a third aspect of embodiments herein, there is provided a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any of the embodiments.
According to a fourth aspect of the embodiments of the present specification, there is provided a server comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method according to any of the embodiments when executing the program.
With the scheme of the embodiments of this specification, the second text classification model, whose feature extraction capability is relatively weak, outputs the prediction category of the first training text, and the received text information is classified by the first text classification model trained on the first training text together with its real category and that prediction category. Because the feature extraction capability of the second text classification model is weaker than that of the first, the second model constrains the first, which effectively controls overfitting of the first text classification model and improves the accuracy of text classification.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the specification.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present specification and together with the description, serve to explain the principles of the specification.
Fig. 1 is a flowchart of a text classification method according to an embodiment of the present specification.
Fig. 2A is a schematic diagram of the training process of the second model according to an embodiment of the present specification.
Fig. 2B is a schematic diagram of the training process of the first model according to an embodiment of the present specification.
Fig. 3 is a schematic diagram of the training process of the first model according to another embodiment of the present specification.
Fig. 4 is a block diagram of a text classification apparatus according to an embodiment of the present specification.
Fig. 5 is a schematic diagram of a computer device for implementing the methods of the embodiments of the present specification.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described below do not represent all embodiments consistent with the present specification; rather, they are merely examples of apparatus and methods consistent with certain aspects of the specification, as detailed in the appended claims.
The terminology used in the description herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the description. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various information, the information should not be limited by these terms, which serve only to distinguish one type of information from another. For example, without departing from the scope of this specification, first information may also be referred to as second information, and similarly second information as first information. Depending on the context, the word "if" as used herein may be interpreted as "upon", "when", or "in response to determining".
As shown in fig. 1, an embodiment of the present specification provides a text classification method, which may include:
step S102: receiving text information;
step S104: classifying the text information through a pre-trained first text classification model to determine the category of the text information;
the first text classification model is obtained by training based on a first training text, a real class of the first training text and a first prediction class of the first training text output by a pre-trained second text classification model, and the feature extraction capability of the second text classification model is lower than that of the first text classification model.
The steps in the embodiments of this specification may be performed by an intelligent robot customer service located on the server side. For step S102, the text information may be sent by a user to the robot customer service through a client. The user enters text information on the client, and the client sends it to the robot customer service. The client may be an application installed on an electronic device such as a smartphone, tablet, or desktop computer, for example Taobao, an online banking application, or Alipay. The text entered by the user may relate to account operations (e.g., how to register an account or how to bind a mobile phone number to an account), to orders (e.g., how to cancel an order or how long a refund takes after an order is cancelled), or to other topics.
In some embodiments, the user may also send the client information in formats other than text. After receiving such information, the client can extract text from it and then send the text to the robot customer service. For example, when the information is a picture, text can be recognized from it by OCR (Optical Character Recognition). Further, for received or extracted text, stop words may be filtered out before the filtered text is sent to the robot customer service.
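As a rough illustration, this client-side preprocessing might look as follows. This is a minimal sketch, assuming the pytesseract OCR library and a placeholder STOP_WORDS set; the embodiment does not prescribe a particular OCR engine or stop-word list.

```python
from PIL import Image
import pytesseract

# Placeholder stop-word list; a real deployment would use a full list.
STOP_WORDS = {"the", "a", "an", "is", "to", "of"}

def extract_text_from_image(image_path: str) -> str:
    # Recognize text information from a picture via OCR.
    return pytesseract.image_to_string(Image.open(image_path))

def filter_stop_words(text: str) -> str:
    # Filter stop words before forwarding the text to the robot customer service.
    return " ".join(w for w in text.split() if w.lower() not in STOP_WORDS)

message = filter_stop_words(extract_text_from_image("screenshot.png"))
```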
For step S104, the robot customer service classifies the text information to determine the category to which it belongs. For example, "how to register an account" and "how to bind a mobile phone number to an account" fall into the "account operation" category, while "how to cancel an order" and "how long a refund takes after an order is cancelled" fall into the "order" category. The category in this step may come from a category database that holds multiple categories, each preset according to actual requirements. If necessary, the categories may be updated periodically: new categories created, rarely used categories deleted, existing categories adjusted (for example, renamed), and so on. Each category corresponds to a group of response messages used by the robot customer service to reply to text information. After determining the category of the text information, the robot customer service searches the category's response group for the response that best matches the text and returns it to the client. Classifying in this way improves both the accuracy and the efficiency of responses.
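The lookup step could be sketched as below, with a hypothetical in-memory category database and difflib similarity as one simple matching choice; the embodiment does not specify how the best-matching response is selected.

```python
import difflib

# Hypothetical category database: each category maps to its response group.
CATEGORY_RESPONSES = {
    "account operation": ["To register an account, open ...",
                          "To bind a mobile phone number, ..."],
    "order": ["To cancel an order, go to ...",
              "Refunds are normally processed within ..."],
}

def best_response(category: str, text: str) -> str:
    # Search the category's response group for the reply most similar
    # to the user's text information.
    candidates = CATEGORY_RESPONSES.get(category, ["Sorry, please rephrase your question."])
    return max(candidates,
               key=lambda r: difflib.SequenceMatcher(None, text, r).ratio())
```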
The second text classification model is a machine learning model whose feature extraction capability is weaker than that of the first text classification model. A model of smaller scale than the first text classification model may be used as the second, where scale may refer to the number of model parameters, the number of model layers, or other characteristics of the model. "Smaller" is relative to the first text classification model: if the second model has fewer parameters, or fewer layers, than the first, it is a smaller-scale machine learning model. In practical applications, the scale of the second text classification model may be at least one order of magnitude smaller than that of the first; for example, its parameter count or layer count is at least an order of magnitude lower.
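For concreteness, the scale relation can be checked by counting parameters; the sketch below uses PyTorch, and the two models are toy examples invented for illustration.

```python
import torch.nn as nn

def num_params(model: nn.Module) -> int:
    return sum(p.numel() for p in model.parameters())

# Toy examples only: a larger first model, and a second model whose
# parameter count is at least one order of magnitude smaller.
first_model = nn.Sequential(nn.Linear(300, 1024), nn.ReLU(), nn.Linear(1024, 100))
second_model = nn.Sequential(nn.Linear(300, 64), nn.ReLU(), nn.Linear(64, 100))

assert num_params(second_model) * 10 <= num_params(first_model)
```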
The first and second text classification models may be machine learning models of various types, such as neural networks, decision trees, or Bayesian classifiers; their types may be the same or different, and this specification does not limit them. To prevent inaccurate classification results caused by overfitting of the first text classification model, an under-fitting machine learning model may be adopted as the second text classification model.
Each first training text used to train the first text classification model carries two class labels: the prediction category output by the second text classification model and the real category of the first training text. The two may or may not coincide. For example, when the first training text is "how to bind a mobile phone number to an account", the prediction category and the real category might be "account security" and "account operation", respectively.
In short, the second text classification model, with its relatively weak feature extraction capability, outputs the prediction category of the first training text, and received text information is classified by the first text classification model trained on the first training text together with its real category and that prediction category. Because the second model's feature extraction capability is weaker than the first's, the second model constrains the first, which effectively controls overfitting of the first text classification model and improves the accuracy of text classification.
In some embodiments, as shown in fig. 2A, the second text classification model is trained by taking a second training text as the input of the second text classification model and the real category of the second training text as its target output.
During training, the ID number of the real category of the second training text may be obtained; the ID number uniquely identifies each real category, and the serial number of the category in the category database may serve as the ID. The second training text may also be converted into a vector, for instance with word2vec, although other conversion methods are possible and this specification does not limit the choice. The vector is then used as the input of the second text classification model and the ID number as its target output during training.
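A minimal sketch of this data preparation, assuming gensim's word2vec implementation and hypothetical tokenized texts; mean-pooling of word vectors is one common conversion, not mandated by the embodiment.

```python
import numpy as np
from gensim.models import Word2Vec

# Hypothetical tokenized second training texts and their real-category IDs
# (serial numbers of the real categories in the category database).
texts = [["how", "to", "register", "an", "account"],
         ["how", "to", "cancel", "an", "order"]]
category_ids = [0, 1]

w2v = Word2Vec(sentences=texts, vector_size=100, min_count=1)

def to_vector(tokens):
    # Represent a text as the mean of its word vectors.
    return np.mean([w2v.wv[t] for t in tokens], axis=0)

X = np.stack([to_vector(t) for t in texts])  # input of the second model
y = np.array(category_ids)                   # target output (real-category IDs)
```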
In some embodiments, as shown in fig. 2B, the first text classification model is trained by: inputting the first training text into the second text classification model to obtain the first prediction category; then taking the first training text as the input of the first text classification model and the real category and the first prediction category of the first training text together as its output, so as to train the first text classification model.
In this embodiment, the first text classification model is trained in a similar manner to the second, except that its output carries two class labels. During training, the two categories may likewise be converted into ID numbers and the first training text into a vector, as described above.
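One possible training step is sketched below in PyTorch under assumed names (student for the first model, teacher for the pre-trained second model); combining two cross-entropy terms is one way of supervising the output with both class labels, not the only reading of the embodiment.

```python
import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

def train_step(student, teacher, optimizer, x, true_ids):
    # `student`/`teacher` are assumed names for the first/second models.
    with torch.no_grad():
        # First prediction category, output by the second model.
        pred_ids = teacher(x).argmax(dim=1)
    logits = student(x)
    # The first model is supervised by both labels: the real category
    # (hard target) and the second model's prediction (soft target).
    loss = criterion(logits, true_ids) + criterion(logits, pred_ids)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```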
In some embodiments, the method further comprises: if a preset training termination condition is not met, taking the trained first text classification model as the new second text classification model and training a new first text classification model.
In other words, the process of training a model with stronger feature extraction capability under the guidance of a model with weaker capability is repeated multiple times: each newly trained model becomes the weaker model guiding an even stronger one, until the training termination condition is satisfied. The termination condition may be that the number of training rounds reaches a preset threshold, or it may be some other constraint.
The second prediction category of the first training text output by the first text classification model satisfies a preset first loss function, and the first prediction category satisfies a preset second loss function. The second loss function is the loss of the second text classification model when predicting the category of the first training text. Because the category output by the second model serves as a reference when the first model predicts the category of the first training text, the second loss function may also be called the soft loss function (soft target loss) of the first text classification model; correspondingly, the first loss function is the first model's hard loss function (hard target loss). Symmetrically, the second loss function is the hard loss function of the second text classification model, and the first loss function its soft loss function.
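Symbolically (the notation below is assumed for exposition, not taken from the embodiment), with p the first model's predicted category distribution:

```latex
% y_real: real category; y_pred: first prediction category from the second model
\mathcal{L}_{\mathrm{first}}
  = \underbrace{\mathcal{L}(y_{\mathrm{real}},\, p)}_{\text{hard loss (first loss function)}}
  + \underbrace{\mathcal{L}(y_{\mathrm{pred}},\, p)}_{\text{soft loss (second loss function)}}
```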
As shown in fig. 3, the first text classification model is obtained through multiple rounds of training. First, a small model of smaller scale (the second text classification model) is trained on the second training text with a preset loss function as the constraint. The small model is then used as a teacher model, and a large model of larger scale (the first text classification model) is trained on the first training text under the joint constraint of the small model's loss function and the large model's own loss function; the trained large model is called the student model. The student model then becomes the new teacher, and a still larger student model is trained in the same way. After repeating this process N times, the final first text classification model is obtained for deployment online.
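The repeated teacher-student procedure of fig. 3 can be sketched as a loop; train_small and train_with_teacher are hypothetical helpers standing in for the training steps described above.

```python
def born_again_training(second_texts, first_texts, n_rounds):
    # train_small / train_with_teacher are hypothetical helpers (see text).
    teacher = train_small(second_texts)  # initial small model (second model)
    for round_idx in range(n_rounds):
        # Train a larger student under the joint loss constraint,
        # then promote it to teacher for the next round.
        student = train_with_teacher(first_texts, teacher, scale=round_idx + 1)
        teacher = student
    return teacher  # final first text classification model to deploy online
```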
It should be noted that the training data used to train the smaller-scale machine learning model is referred to as a second training text, and the training data used to train the larger-scale machine learning model is referred to as a first training text. The first training text and the second training text may be the same or different.
In some embodiments, the step of training the first text classification model comprises: weighting the real category and the first prediction category; taking the first training text as the input of the first text classification model and the weighted real category and weighted first prediction category together as its output, and training the first text classification model over a number of iterations. During the iterations, the weight of the first prediction category is gradually reduced until it reaches a first weight value, and the weight of the real category is gradually increased until it reaches a second weight value, where the first weight value is less than the second weight value and the initial weight of the first prediction category is greater than the initial weight of the real category.
In this embodiment, multiple iterations may be performed while training the first text classification model, with different weights assigned to the prediction category and the real category of the first training text at each iteration: as the number of iterations grows, the weight of the prediction category gradually decreases (e.g., from 1 toward 0) and the weight of the real category gradually increases (e.g., from 0 toward 1). The weights may be decreased or increased by preset step sizes, and the step size for the prediction category need not equal that for the real category. By shifting the weight from the prediction category to the real category, the final training result is determined mainly by the real category. Because the prediction category output by the second text classification model may differ from the real category of the first training text, this schedule prevents errors in the second model's predictions from making the training result inaccurate.
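The annealing schedule might be implemented as below; the step sizes, initial weights, and final weight values are assumptions, chosen so the soft-label weight decays from 1 toward the first weight value while the hard-label weight grows from 0 toward the (larger) second weight value.

```python
def weight_schedule(num_iters, soft_final=0.0, hard_final=1.0):
    soft_step = (1.0 - soft_final) / num_iters  # decrement per iteration
    hard_step = (hard_final - 0.0) / num_iters  # increment per iteration
    w_soft, w_hard = 1.0, 0.0  # initial soft weight exceeds hard weight
    for _ in range(num_iters):
        yield w_soft, w_hard
        w_soft = max(w_soft - soft_step, soft_final)
        w_hard = min(w_hard + hard_step, hard_final)

# Combined with the earlier train_step, the per-iteration loss becomes
#   loss = w_hard * criterion(logits, true_ids) + w_soft * criterion(logits, pred_ids)
```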
Fig. 4 is a block diagram of a text classification apparatus according to an embodiment of the present specification; the apparatus may include:
a receiving module 402, configured to receive text information;
a classification module 404, configured to classify the text information through a pre-trained first text classification model to determine a category to which the text information belongs;
the first text classification model is obtained by training based on a first training text, a real class of the first training text and a first prediction class of the first training text output by a pre-trained second text classification model, and the feature extraction capability of the second text classification model is lower than that of the first text classification model.
For details of how the functions and actions of each module in the text classification apparatus are implemented, refer to the implementation of the corresponding steps in the text classification method; they are not repeated here.
For the device embodiments, since they substantially correspond to the method embodiments, reference may be made to the partial description of the method embodiments for relevant points. The above-described embodiments of the apparatus are merely illustrative, wherein the modules described as separate parts may or may not be physically separate, and the parts displayed as modules may or may not be physical modules, may be located in one place, or may be distributed on a plurality of network modules. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution in the specification. One of ordinary skill in the art can understand and implement it without inventive effort.
The apparatus embodiments of this specification can be applied to a computer device, such as a server or a terminal device. They may be implemented by software, by hardware, or by a combination of the two. Taking software implementation as an example, the apparatus, as a logical device, is formed by the processor of the computer device in which it resides reading the corresponding computer program instructions from non-volatile storage into memory and running them. In hardware terms, fig. 5 shows the hardware structure of a computer device hosting the apparatus; besides the processor 502, memory 504, network interface 506, and non-volatile storage 508 shown in fig. 5, the server or electronic device hosting the apparatus may include other hardware according to its actual functions, which is not described further.
Accordingly, the embodiments of the present specification also provide a computer storage medium, in which a program is stored, and the program, when executed by a processor, implements the method in any of the above embodiments.
Accordingly, embodiments of the present specification further provide a server, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the program, the method in any of the above embodiments is implemented.
Embodiments of the present description may take the form of a computer program product embodied on one or more storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having program code embodied therein. Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of the storage medium of the computer include, but are not limited to: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium, may be used to store information that may be accessed by a computing device.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
The above description is only exemplary of the present disclosure and should not be taken as limiting the disclosure, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (10)

1. A method of text classification, the method comprising:
receiving text information;
classifying the text information through a pre-trained first text classification model to determine the category of the text information;
the first text classification model is obtained by training based on a first training text, a real class of the first training text and a first prediction class of the first training text output by a pre-trained second text classification model, and the feature extraction capability of the second text classification model is lower than that of the first text classification model.
2. The method of claim 1, the first text classification model trained by:
inputting the first training text into the second text classification model to obtain the first prediction category;
taking the first training text as the input of the first text classification model, and the real category and the first prediction category of the first training text together as the output of the first text classification model, so as to train the first text classification model.
3. The method of claim 2, further comprising:
if a preset training termination condition is not met, taking the first text classification model as the second text classification model and retraining the first text classification model.
4. The method of claim 1, the second text classification model trained by:
taking a second training text as the input of the second text classification model, and the real category of the second training text as the output of the second text classification model, so as to train the second text classification model.
5. The method of claim 1, wherein a second prediction category of the first training text output by the first text classification model satisfies a preset first loss function, and the first prediction category satisfies a preset second loss function.
6. The method of claim 1, wherein the step of training the first text classification model, with the first training text as its input and the real category and the first prediction category of the first training text together as its output, comprises:
weighting the real category and the first prediction category;
taking the first training text as the input of the first text classification model, and the weighted real category and the weighted first prediction category together as the output of the first text classification model, and training the first text classification model over a plurality of iterations;
in an iterative process, gradually reducing the weight of the first prediction category until the weight of the first prediction category is reduced to a first weight value, and gradually increasing the weight of the real category until the weight of the real category is increased to a second weight value;
the first weight value is less than the second weight value, and an initial weight of the first prediction category is greater than an initial weight of the real category.
7. The method of any of claims 1-6, wherein the second text classification model is an under-fitting machine learning model.
8. An apparatus for text classification, the apparatus comprising:
the receiving module is used for receiving text information;
the classification module is used for classifying the text information through a pre-trained first text classification model so as to determine the category of the text information;
the first text classification model is obtained by training based on a first training text, a real class of the first training text and a first prediction class of the first training text output by a pre-trained second text classification model, and the feature extraction capability of the second text classification model is lower than that of the first text classification model.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method of any one of claims 1 to 7.
10. A server comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method of any one of claims 1 to 7 when executing the program.
CN201911227485.7A (priority date 2019-12-04, filing date 2019-12-04): Text classification method and device and server. Status: Active. Granted as CN111159397B (en).

Priority Applications (1)

Application Number: CN201911227485.7A (granted as CN111159397B) · Priority Date: 2019-12-04 · Filing Date: 2019-12-04 · Title: Text classification method and device and server

Publications (2)

Publication Number Publication Date
CN111159397A (published 2020-05-15)
CN111159397B (granted 2023-04-18)

Family

ID=70556436

Family Applications (1)

Application Number: CN201911227485.7A · Title: Text classification method and device and server · Status: Active · Granted as: CN111159397B (en)

Country Status (1)

Country Link
CN (1) CN111159397B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113556404A (en) * 2021-08-03 2021-10-26 广东九博科技股份有限公司 Communication method and system between single disks in equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106548190A (en) * 2015-09-18 2017-03-29 三星电子株式会社 Model training method and equipment and data identification method
CN108090508A (en) * 2017-12-12 2018-05-29 腾讯科技(深圳)有限公司 A kind of classification based training method, apparatus and storage medium
CN110223281A (en) * 2019-06-06 2019-09-10 东北大学 A kind of Lung neoplasm image classification method when in data set containing uncertain data
CN110427466A (en) * 2019-06-12 2019-11-08 阿里巴巴集团控股有限公司 Training method and device for the matched neural network model of question and answer

Also Published As

Publication number Publication date
CN111159397B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
US20120136812A1 (en) Method and system for machine-learning based optimization and customization of document similarities calculation
US11636341B2 (en) Processing sequential interaction data
CN111967387A (en) Form recognition method, device, equipment and computer readable storage medium
KR20190075962A (en) Data processing method and data processing apparatus
CN112487149A (en) Text auditing method, model, equipment and storage medium
CN110674188A (en) Feature extraction method, device and equipment
CN111325248A (en) Method and system for reducing pre-loan business risk
CN113656699B (en) User feature vector determining method, related equipment and medium
CN109726288A (en) File classification method and device based on artificial intelligence process
CN110689359A (en) Method and device for dynamically updating model
CN111159397B (en) Text classification method and device and server
CN113051486A (en) Friend-making scene-based recommendation model training method and device, electronic equipment and computer-readable storage medium
CN111523604A (en) User classification method and related device
CN116308738B (en) Model training method, business wind control method and device
CN116977064A (en) Wind control model construction method, system and device based on loss function
CN111078877B (en) Data processing method, training method of text classification model, and text classification method and device
CN113297482B (en) User portrayal describing method and system of search engine data based on multiple models
CN115936003A (en) Software function point duplicate checking method, device, equipment and medium based on neural network
CN113177603B (en) Training method of classification model, video classification method and related equipment
CN114996453A (en) Method and device for recommending commodity codes of import and export commodities and electronic equipment
CN112132690B (en) Method and device for pushing foreign exchange product information, computer equipment and storage medium
CN111046934B (en) SWIFT message soft clause recognition method and device
CN111143552B (en) Text information category prediction method and device and server
CN113837635A (en) Risk detection processing method, device and equipment
JP5824429B2 (en) Spam account score calculation apparatus, spam account score calculation method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant