CN113239176A - Semantic matching model training method, device, equipment and storage medium

Semantic matching model training method, device, equipment and storage medium

Info

Publication number
CN113239176A
CN113239176A
Authority
CN
China
Prior art keywords
similarity
standard
model
student
teacher
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110688539.0A
Other languages
Chinese (zh)
Other versions
CN113239176B (en)
Inventor
陆林炳
刘志慧
金培根
林加新
李炫�
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Life Insurance Company of China Ltd
Original Assignee
Ping An Life Insurance Company of China Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Life Insurance Company of China Ltd filed Critical Ping An Life Insurance Company of China Ltd
Priority to CN202110688539.0A priority Critical patent/CN113239176B/en
Publication of CN113239176A publication Critical patent/CN113239176A/en
Application granted granted Critical
Publication of CN113239176B publication Critical patent/CN113239176B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33Querying
    • G06F16/332Query formulation
    • G06F16/3329Natural language query formulation or dialogue systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33Querying
    • G06F16/3331Query processing
    • G06F16/334Query execution
    • G06F16/3344Query execution using natural language analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/10Text processing
    • G06F40/194Calculation of difference between files
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/279Recognition of textual entities
    • G06F40/289Phrasal analysis, e.g. finite state techniques or chunking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/30Semantic analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Databases & Information Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • Software Systems (AREA)
  • Biomedical Technology (AREA)
  • Human Computer Interaction (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention relates to the field of prediction models, and discloses a semantic matching model training method, apparatus, device and storage medium. The method comprises the following steps: obtaining a question pair sample set; calculating the similarity of the standard question and the similar question in each sample through a teacher model to obtain a first similarity; inputting the shared parameters into the student output layer of a student model to initialize the student model; calculating the similarity of the standard question and the similar question through the student model to obtain a second similarity; determining a total loss value according to the first similarity, the second similarity, the teacher hidden vector and the student hidden vector; and, when the updated total loss value meets a preset convergence condition, determining the student model corresponding to that total loss value as the semantic matching model. The invention improves the learning ability of the model as well as the accuracy and quality of question matching.

Description

Semantic matching model training method, device, equipment and storage medium
Technical Field
The invention relates to the field of prediction models, and in particular to a semantic matching model training method, apparatus, device and storage medium.
Background
With the continuous development of intelligent technology, more and more business scenarios use AI question-answering systems in place of manual question answering.
At present, semantic models are widely applied in AI question-answering systems. However, with the rapid growth of online information, more and more corpus data need to be processed, and the requirements on the accuracy and speed of corpus data processing keep increasing. Large-scale semantic models perform excellently on natural language understanding and generation tasks, yet deploying a semantic model with massive parameters on a device with limited memory remains a huge challenge. In the existing model compression field, the student model learns only the final result of the teacher model, so a great deal of the information the teacher model produces when predicting on question pairs is lost, which leads to the technical problems of low accuracy and weak learning ability of the student model.
Disclosure of Invention
Therefore, it is necessary to provide a semantic matching model training method, apparatus, computer device and storage medium to solve the technical problems that the student model learns only the final result of the teacher model and loses a large amount of the information the teacher model produces when predicting on question pairs, resulting in low accuracy and weak learning ability of the student model.
A semantic matching model training method comprises the following steps:
obtaining a question pair sample set; the question pair sample set comprises a plurality of question pair samples obtained from an expert domain knowledge base; each question pair sample comprises a standard question and a similar question;
calculating the similarity of the standard question and the similar question in the question pair sample through a teacher model to obtain a first similarity; the teacher model comprises a teacher hidden layer and a teacher output layer; the teacher hidden layer processes the question pair sample into a teacher hidden vector; the teacher output layer contains shared parameters;
inputting the shared parameters into a student output layer of a student model to initialize the student model;
inputting the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training;
calculating the similarity of the standard question and the similar question through the student model to obtain a second similarity; the student model comprises a student hidden layer; the student hidden layer produces student hidden vectors;
determining a total loss value according to the first similarity, the second similarity, the teacher hidden vector and the student hidden vector;
when the total loss value does not meet a preset convergence condition, iteratively updating the student model by a back propagation method, and calculating the updated total loss value of the iteratively updated student model;
and when the updated total loss value meets the preset convergence condition, determining the student model corresponding to that updated total loss value as the semantic matching model.
A semantic matching model training apparatus, comprising:
a question pair obtaining module, configured to obtain a question pair sample set; the question pair sample set comprises a plurality of question pair samples obtained from an expert domain knowledge base; each question pair sample comprises a standard question and a similar question;
a first similarity module, configured to calculate the similarity of the standard question and the similar question in the question pair sample through a teacher model to obtain a first similarity; the teacher model comprises a teacher hidden layer and a teacher output layer; the teacher hidden layer processes the question pair sample into a teacher hidden vector; the teacher output layer contains shared parameters;
an initialization module, configured to input the shared parameters into a student output layer of a student model so as to initialize the student model;
a student model training module, configured to input the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training;
a second similarity module, configured to calculate the similarity of the standard question and the similar question through the student model to obtain a second similarity; the student model comprises a student hidden layer; the student hidden layer produces student hidden vectors;
a total loss value module, configured to determine a total loss value according to the first similarity, the second similarity, the teacher hidden vector and the student hidden vector;
a semantic matching model module, configured to iteratively update the student model by a back propagation method when the total loss value does not meet a preset convergence condition, calculate the updated total loss value of the iteratively updated student model, and, when the updated total loss value meets the preset convergence condition, determine the student model corresponding to that updated total loss value as the semantic matching model.
A computer device comprising a memory, a processor, and computer readable instructions stored in the memory and executable on the processor, the processor implementing the semantic matching model training method when executing the computer readable instructions.
One or more readable storage media storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform a semantic matching model training method as described above.
According to the semantic matching model training method, apparatus, computer device and storage medium, a question pair sample set is obtained; the question pair sample set comprises a plurality of question pair samples obtained from an expert domain knowledge base, and each question pair sample comprises a standard question and a similar question. The similarity of the standard question and the similar question in the question pair sample is calculated through a teacher model to obtain a first similarity; the teacher model comprises a teacher hidden layer and a teacher output layer, the teacher hidden layer processes the question pair sample into a teacher hidden vector, and the teacher output layer contains shared parameters. The shared parameters are input into a student output layer of a student model to initialize the student model; the standard question, the similar question, the first similarity and the teacher hidden vector are input into the student model containing the shared parameters for training; the similarity of the standard question and the similar question is calculated through the student model to obtain a second similarity, the student model comprising a student hidden layer that produces student hidden vectors; a total loss value is determined according to the first similarity, the second similarity, the teacher hidden vector and the student hidden vector; when the total loss value does not meet a preset convergence condition, the student model is iteratively updated by a back propagation method and the updated total loss value of the iteratively updated student model is calculated; and when the updated total loss value meets the preset convergence condition, the student model corresponding to that updated total loss value is determined as the semantic matching model. In this way, a question pair sample set based on the expert domain knowledge base is obtained; the teacher model is trained on the question pair samples to obtain the first similarity and the teacher hidden vector; the student model is initialized using the shared parameters of the teacher model's output layer; the question pair samples are then fed to the student model to obtain the second similarity and the student hidden vector; the total loss value is determined from the first similarity, the second similarity, the teacher hidden vector and the student hidden vector; and domain knowledge, hidden-layer information and similarity information are continuously integrated into model training by back propagation until training is complete, yielding the semantic matching model. When learning, the model can recognize not only the semantics of the original question pair but also the hidden information and the final result of the teacher model. The learning ability of the model is improved, and the accuracy and quality of question matching are also improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
FIG. 1 is a schematic diagram of an application environment of a semantic matching model training method according to an embodiment of the present invention;
FIG. 2 is a flow chart of a semantic matching model training method according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a semantic matching model training apparatus according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a computer device according to an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The semantic matching model training method provided by this embodiment can be applied to the application environment shown in fig. 1, in which a client communicates with a server. The client includes, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers, and portable wearable devices. The server can be implemented by an independent server or a server cluster composed of a plurality of servers.
In an embodiment, as shown in fig. 2, a semantic matching model training method is provided, which is described by taking the example that the method is applied to the server side in fig. 1, and includes the following steps:
S10, obtaining a question pair sample set; the question pair sample set comprises a plurality of question pair samples obtained from an expert domain knowledge base; each question pair sample includes a standard question and a similar question.
Understandably, the expert domain knowledge base stores historically collected, expert-labeled question pair samples associated with business scenarios. A question pair sample includes a standard question and a similar question. Standard questions for different business scenarios are obtained from the expert domain knowledge base, and each standard question is processed by a recall model to obtain a plurality of similar questions corresponding to it. The standard question and one of its corresponding similar questions are determined as a question pair sample; by combining them, multiple question pair samples can be obtained. For example, the standard question is "XXX handset purchase" and the similar questions are "how should an XXX handset be bought", "how does one buy an XXX handset", or "how to buy an XXX handset". "XXX handset purchase" and "how to buy an XXX handset" may be determined as one question pair sample.
S20, calculating the similarity of the standard question and the similar question in the question pair sample through a teacher model to obtain a first similarity; the teacher model comprises a teacher hidden layer and a teacher output layer; the teacher hidden layer processes the question pair sample into a teacher hidden vector; the teacher output layer contains shared parameters.
Understandably, the teacher model is a neural network model based on Sentence-BERT (a sentence-level BERT language representation model), obtained by training on the question pair sample set. The teacher model outputs a fixed-length feature vector that preserves semantic information; this feature vector is the output of an intermediate layer of the teacher model, and the similarity between the standard question and the similar question is computed with a softmax function. During similarity calculation, the teacher model performs semantic recognition on the standard question and the similar question to obtain a standard feature vector corresponding to the standard question and a similar feature vector corresponding to the similar question respectively; the softmax function is then used to calculate the similarity between the standard feature vector and the similar feature vector, and this similarity is determined as the first similarity.
Specifically, in one embodiment, the standard question and the similar question are input into the teacher model, which performs semantic recognition on them through a semantic decoder and converts them, in the Sentence-BERT manner, into fixed-dimension vectors: a standard feature vector (which may be denoted a_embedding) and a similar feature vector (which may be denoted b_embedding). The fixed dimension can be selected as required; for example, the fixed dimension of the standard feature vector is d = 768, and the fixed dimension of the similar feature vector is also d = 768. The difference between the standard feature vector and the similar feature vector is calculated to obtain a difference vector of dimension d = 768; the standard feature vector, the similar feature vector and the difference vector are concatenated to obtain a connection vector of size 768 × 3; and the connection vector is converted by a first matrix multiplication into a teacher hidden vector of dimension d = 768, giving a hidden layer of dimension d = 768. At the output layer, the teacher hidden vector with d = 768 is converted into an output vector with d = 2 by a second matrix multiplication (F = H × W + B, where H is the teacher hidden vector with d = 768, and W and B are the shared parameters of the output layer). The softmax function normalizes the output vector of dimension d = 2 and finally outputs the similarity of the standard question and the similar question.
The shared parameters of the output layer are values obtained by random initialization of the teacher model, and they are kept unchanged throughout training.
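For illustration only, the teacher-side computation just described might be sketched as follows in PyTorch; the function and variable names (teacher_similarity, W1, b1, W, B) are assumptions, and the Sentence-BERT encoding step itself is omitted.

```python
import torch

def teacher_similarity(a_embedding, b_embedding, W1, b1, W, B):
    """Minimal sketch of the teacher forward pass described above.
    a_embedding / b_embedding: the d = 768 standard and similar feature
    vectors produced by the Sentence-BERT encoding step (not shown)."""
    diff = a_embedding - b_embedding                      # difference vector, d = 768
    concat = torch.cat([a_embedding, b_embedding, diff])  # connection vector of size 768 * 3
    hidden = concat @ W1 + b1                             # first matrix multiplication -> teacher hidden vector, d = 768
    output = hidden @ W + B                               # second matrix multiplication F = H x W + B, d = 2
    probs = torch.softmax(output, dim=-1)                 # softmax normalization
    return probs, hidden                                  # probs can be read as the first similarity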
S30, inputting the shared parameters into a student output layer of a student model to initialize the student model.
Understandably, the student model is a neural network model based on a deep neural network (DNN), obtained by training on the question pair sample set. The student model maps each word in each input sentence to a fixed-dimension vector by way of a bag of words (BOW) representation. For example, if the fixed dimension is d = 768, the vectors of the words in the sentence are added and averaged to obtain a sentence vector of dimension d = 768.
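A minimal sketch of this bag-of-words sentence vector, assuming a hypothetical lookup table word_vectors that maps each word to a d = 768 vector; it is not the actual student encoder.

```python
import numpy as np

D = 768  # fixed dimension chosen in the example above

def bow_sentence_vector(words, word_vectors):
    """Map each word to its fixed-dimension vector and average the vectors
    to obtain one sentence vector of dimension d = 768."""
    vecs = [word_vectors.get(w, np.zeros(D)) for w in words]  # unknown words fall back to zeros (assumption)
    if not vecs:
        return np.zeros(D)
    return np.mean(vecs, axis=0)
```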
Specifically, in one embodiment, the shared parameters of the teacher model are input into the student output layer of the student model as the initialization parameters of the student model to initialize the student model.
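As an illustration of this initialization step, copying the teacher's shared output-layer parameters into the student output layer and freezing them could look like the sketch below (PyTorch assumed; attribute and function names are hypothetical).

```python
import torch

def init_student_output_layer(student_output_layer, teacher_W, teacher_B):
    """Copy the teacher's shared output-layer parameters (W, B) into the
    student output layer and freeze them so they stay unchanged in training."""
    with torch.no_grad():
        student_output_layer.weight.copy_(teacher_W)
        student_output_layer.bias.copy_(teacher_B)
    for p in student_output_layer.parameters():
        p.requires_grad = False  # the shared parameters remain fixed
```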
S40, inputting the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training.
Understandably, the output layer of the student model contains the shared parameters of the output layer of the teacher model, which remain unchanged during training. The standard question, the similar question, the first similarity and the teacher hidden vector are input into the student model containing the teacher model's shared parameters for training, so that the student model learns not only the information contained in the teacher model's output result (the first similarity) but also the information contained in the teacher model's hidden vector.
S50, calculating the similarity of the standard question and the similar question through the student model to obtain a second similarity; the student model comprises a student hidden layer; the student hidden layer produces student hidden vectors.
Understandably, during similarity calculation, the student model performs semantic recognition on the standard question and the similar question to obtain a standard feature vector corresponding to the standard question and a similar feature vector corresponding to the similar question respectively; the softmax function is then used to calculate the similarity between the standard feature vector and the similar feature vector, and this similarity is determined as the second similarity.
Specifically, in one embodiment, the standard question and the similar question are input into the student model, which performs semantic recognition on them through a semantic decoder: each word in the sentences of the standard question and the similar question is mapped, in the BOW manner, to a fixed-dimension vector, giving a standard feature vector (which may be denoted a_embedding) and a similar feature vector (which may be denoted b_embedding). The fixed dimension can be selected as required; for example, the fixed dimension of the standard feature vector is d = 768, and the fixed dimension of the similar feature vector is also d = 768. The difference between the standard feature vector and the similar feature vector is calculated to obtain a difference vector of dimension d = 768; the standard feature vector, the similar feature vector and the difference vector are concatenated to obtain a connection vector of size 768 × 3; and the connection vector is converted by a first matrix multiplication into a student hidden vector of dimension d = 768, giving a hidden layer of dimension d = 768. At the output layer, the student hidden vector with d = 768 is converted into an output vector with d = 2 by a second matrix multiplication (F = H × W + B, where H is the student hidden vector with d = 768, and W and B are the shared parameters of the output layer). The softmax function normalizes the output vector of dimension d = 2 and finally outputs the similarity of the standard question and the similar question.
S60, determining a total loss value according to the first similarity, the second similarity, the teacher hidden vector and the student hidden vector.
It is understood that the first similarity is the output result of the teacher model and the second similarity is the output result of the student model. The loss between the output result of the teacher model and the output result of the student model can be calculated with the MSE (mean squared error) loss function and determined as the first loss. The loss between the teacher hidden vector of the teacher model and the student hidden vector of the student model is calculated and determined as the second loss. The total loss value is then determined according to the loss function Loss, which is expressed as:
Loss = p1 × Loss1 + p2 × Loss2
where Loss1 is the first loss, Loss2 is the second loss, and p1 and p2 are hyperparameters.
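A sketch of this total loss, using mean-squared-error terms for both the similarity outputs and the hidden vectors; p1 and p2 are the hyperparameters above, and their default values here are purely illustrative.

```python
import torch.nn.functional as F

def total_loss(first_similarity, second_similarity,
               teacher_hidden, student_hidden, p1=0.5, p2=0.5):
    """Loss = p1 * Loss1 + p2 * Loss2: Loss1 is the MSE between the teacher
    output (first similarity) and the student output (second similarity);
    Loss2 is the MSE between the teacher and student hidden vectors."""
    loss1 = F.mse_loss(second_similarity, first_similarity)  # first loss (output distillation)
    loss2 = F.mse_loss(student_hidden, teacher_hidden)       # second loss (hidden-vector distillation)
    return p1 * loss1 + p2 * loss2
```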
S70, when the total loss value does not meet the preset convergence condition, iteratively updating the student model by using a back propagation method, and calculating the updated total loss value of the student model after iterative updating; and when the updated total loss value meets the preset convergence condition, determining the student model corresponding to the updated total loss value meeting the preset convergence condition as a semantic matching model.
Understandably, the back propagation method mainly comprises two phases (excitation propagation and weight updating) that are repeated iteratively until the response of the network to the input reaches a preset convergence condition. The convergence condition can be set according to actual requirements. The semantic matching model, by recognizing the standard question input by the user, matches the similar questions corresponding to that standard question, selects the question with the highest similarity as the matching result, and provides the answer to the corresponding question.
In one embodiment, when the total loss value does not meet the preset convergence condition, the student model is iteratively updated by the back propagation method to obtain an updated student model, and the updated total loss value can be calculated for this updated student model. It is then judged whether the updated total loss value meets the preset convergence condition.
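The iterative update in this step corresponds to an ordinary gradient-descent training loop. The sketch below assumes the total_loss function above, a hypothetical student model and data loader, and approximates the preset convergence condition by a simple loss threshold; it is not the patented procedure itself.

```python
import torch

def train_student(student, data_loader, total_loss_fn, threshold=1e-3, max_epochs=100):
    # Only trainable parameters are updated; the shared output-layer
    # parameters copied from the teacher remain frozen.
    optimizer = torch.optim.Adam([p for p in student.parameters() if p.requires_grad])
    for _ in range(max_epochs):
        epoch_loss = 0.0
        for batch in data_loader:
            second_sim, student_hidden = student(batch["standard"], batch["similar"])
            loss = total_loss_fn(batch["first_similarity"], second_sim,
                                 batch["teacher_hidden"], student_hidden)
            optimizer.zero_grad()
            loss.backward()   # back propagation
            optimizer.step()  # weight update
            epoch_loss += loss.item()
        if epoch_loss / len(data_loader) < threshold:  # preset convergence condition (illustrative)
            break
    return student  # the converged student model serves as the semantic matching model
```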
In steps S10-S70 of this embodiment, a question pair sample set based on the expert domain knowledge base is obtained; the teacher model is trained on the question pair samples to obtain the first similarity and the teacher hidden vector; the student model is initialized using the shared parameters of the teacher model's output layer; the question pair samples are then fed to the student model to obtain the second similarity and the student hidden vector; the total loss value is determined according to the first similarity, the second similarity, the teacher hidden vector and the student hidden vector; and domain knowledge, hidden-layer information and similarity information are continuously integrated into model training by back propagation until training is complete, yielding the semantic matching model. When learning, the model can recognize not only the semantics of the original question pair but also the hidden information and the final result of the teacher model. The learning ability of the model is improved, and the accuracy and quality of question matching are also improved.
Optionally, before step S10, that is, before obtaining the question pair sample set, the method includes:
and S101, acquiring the standard problem from the expert domain knowledge base.
Understandably, the expert domain knowledge base stores historically collected expert labeled problem pair samples related to business scenarios. The standard questions are questions marked by experts and related to business scenes. The standard question may be obtained by a distribution of labels of a plurality of experts.
S102, processing the standard question through a recall model to obtain a plurality of similar questions corresponding to the standard question.
Understandably, the recall model is a shallow semantic model based on a twin neural network. Similarity scores between the standard question and every question in the expert domain knowledge base can be obtained through the recall model, and these similarity scores are arranged in descending order. The first N (N is a positive integer) similarity scores in the descending order are taken, and the questions corresponding to these N similarity scores are determined as the similar questions corresponding to the standard question. For example, the standard question is "XXX handset purchase" and the similar questions are "how should an XXX handset be bought", "how does one buy an XXX handset", or "how to buy an XXX handset".
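A minimal sketch of this recall step: score every question in the knowledge base against the standard question, sort in descending order, and keep the first N. The similarity_fn argument stands in for the twin-network recall model and is an assumption.

```python
def recall_similar_questions(standard_question, knowledge_base, similarity_fn, n=5):
    """Return the N questions from the expert domain knowledge base whose
    similarity scores with the standard question are highest."""
    scored = [(question, similarity_fn(standard_question, question)) for question in knowledge_base]
    scored.sort(key=lambda item: item[1], reverse=True)  # descending similarity scores
    return [question for question, _ in scored[:n]]
```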
S103, determining the standard question and a similar question corresponding to the standard question as a question pair sample.
Understandably, one standard question corresponds to a plurality of similar questions; the standard question together with one of its corresponding similar questions can be determined as one question pair sample, so a plurality of question pair samples can be obtained.
S104, judging whether the semantics of the standard question in the question pair sample are consistent with the semantics of the similar question to obtain a judgment result, and generating a label for the question pair sample according to the judgment result.
Understandably, when the semantics of the standard question in the question pair sample are consistent with those of the similar question, a label corresponding to that judgment result is generated; for example, if the judgment result is that the semantics match, the label is "1". When the semantics of the standard question in the sample are inconsistent with those of the similar question, a label corresponding to that judgment result is generated; for example, if the judgment result is that the semantics do not match, the label is "0". The labels can be input as training data into the teacher model and the student model.
Optionally, before step S20, that is, before the similarity calculation is performed on the standard question and the similar question in the question pair sample through the teacher model to obtain the first similarity, the method includes:
s201, dividing the standard problem and the similar problem into words to obtain a standard unit word combination corresponding to the standard problem and a similar unit word combination corresponding to the similar problem.
It is understood that dividing into unit words splits the words of a sentence into single units, and each divided unit is recorded as a unit word. A unit word combination is the form obtained when a sentence is divided into a plurality of unit words. For example, the standard question "XXX handset purchase" can be divided into the standard unit word combination "X X X handset purchase".
S202, combining the standard unit word combination and the similar unit word combination into input data and inputting the input data into the teacher model.
In one embodiment, for example, the standard question is "XXX handset purchase" and the similar question is "how to buy an XXX handset"; the teacher model inputs are then:
(X X X handset purchase);
(how to buy an X X X handset).
Optionally, step S40, namely the inputting of the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training, further includes:
S401, splitting the standard question and the similar question into characters and words to obtain a standard unit character combination and a standard unit word combination corresponding to the standard question, and a similar unit character combination and a similar unit word combination corresponding to the similar question.
Understandably, splitting into characters and words divides a sentence both into single characters and into words; the divided characters are recorded as unit characters and the divided words as unit words. A unit word combination is the form obtained when a sentence is divided into a plurality of unit words. For example, the unit character combination of the standard question "XXX handset purchase" is "X X X handset purchase".
S402, inputting the standard unit character combination, the standard unit word combination, the similar unit character combination and the similar unit word combination as input data into the student model.
In one embodiment, for example, the standard question is "XXX handset purchase" and the similar question is "how to buy an XXX handset"; the student model inputs are then:
(X X X handset purchase, XXX handset purchase);
(how to buy an X X X handset, how to buy an XXX handset).
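For illustration, obtaining the unit character combination and the unit word combination of a question might look like the sketch below. The word-level split uses the jieba segmenter purely as an example; the patent does not name a specific tokenizer.

```python
import jieba  # example Chinese word segmenter; any segmenter could be substituted

def split_question(question):
    """Return (unit character combination, unit word combination) for a question,
    i.e. a character-level split and a word-level split of the sentence."""
    unit_characters = list(question.replace(" ", ""))  # character-level split
    unit_words = list(jieba.cut(question))             # word-level split
    return unit_characters, unit_words
```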
Optionally, step S40, inputting the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training, includes:
and S403, inputting the standard problem and the similar problem into a twin neural network model, and training the standard problem and the similar problem through the twin neural network model.
Understandably, the twin neural network model is used for calculating cosine similarity between sentences. And inputting the standard problem and the similar problem into a twin neural network model for training, and obtaining similarity information between sentences of the standard problem and the similar problem.
S404, obtaining shallow layer similarity between the output sentences of the twin neural network model.
Understandably, the shallow similarity is the similarity information between the sentences of the standard question and the similar question output by the twin neural network model. The similarity information is the cosine similarity between sentences, obtained by encoding each sentence into a fixed-length vector; the cosine similarity lies in the range [-1, 1].
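The shallow similarity is simply the cosine similarity between the two fixed-length sentence vectors; a minimal sketch with illustrative names is given below.

```python
import numpy as np

def shallow_similarity(sentence_vec_a, sentence_vec_b):
    """Cosine similarity between two fixed-length sentence vectors; the result
    lies in the range [-1, 1]."""
    denom = np.linalg.norm(sentence_vec_a) * np.linalg.norm(sentence_vec_b)
    return float(np.dot(sentence_vec_a, sentence_vec_b) / denom) if denom else 0.0
```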
S405, inputting the shallow similarity, the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training.
Understandably, the shallow similarity, the standard question, the similar question, the first similarity and the teacher hidden vector are used as input data of the student model, so that the student model learns the semantic information between sentences while learning the hidden-layer information.
In one embodiment, for example, the standard question is "XXX handset purchase" and the similar question is "how to buy an XXX handset"; the student model inputs are then:
(X X X handset purchase, XXX handset purchase);
(how to buy an X X X handset, how to buy an XXX handset);
(shallow similarity).
Optionally, step S40, inputting the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training, further includes:
s406, counting the occurrence frequency of the words in the problem pair sample set, and generating an IDF table.
Understandably, the IDF table contains frequency information of word level and word level in the text, and the frequency information is the IDF value. The IDF is an Inverse text Frequency index (Inverse Document Frequency). TF-IDF (term frequency-inverse document frequency) is a commonly used weighting technique for information retrieval and data mining. TF is the Term Frequency (Term Frequency).
S407, calculating a first IDF value of the standard question and a second IDF value of the similar question according to the IDF table.
Understandably, the character-level and word-level IDF values corresponding to a sentence can be looked up in the IDF table. For example, the standard question is "XXX handset purchase" and the similar question is "how to buy an XXX handset"; querying the IDF table gives, for instance: the word "handset" has an IDF value of 20, the word "how" has an IDF value of 5, the character "hand" has an IDF value of 23, and the character "how" has an IDF value of 6. The larger the IDF value of a character or word, the less frequently it occurs across the corpus.
S408, processing the first IDF value and the second IDF value through a Jaccard algorithm to generate the IDF similarity between the standard question and the similar question; the IDF similarity includes a word-level IDF similarity and a character-level IDF similarity.
Understandably, the Jaccard algorithm (Jaccard similarity algorithm) is used to compare the similarity between IDF values. The sentences are converted into their corresponding word-level IDF values, and the word-level IDF values of the two sentences are processed with the Jaccard algorithm to obtain the word-level IDF similarity between the sentences. Likewise, the sentences are converted into their corresponding character-level IDF values, and the character-level IDF values of the two sentences are processed with the Jaccard algorithm to obtain the character-level IDF similarity between the sentences.
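A sketch of steps S406-S408 under simplifying assumptions: the IDF table is approximated by a plain document-frequency-based dictionary, and the Jaccard-style comparison is computed as an IDF-weighted overlap of the two questions' token sets. All names are illustrative, and the exact IDF and Jaccard formulas used in practice may differ.

```python
import math
from collections import Counter

def build_idf_table(tokenized_questions):
    """tokenized_questions: list of token lists (character level or word level).
    Returns a mapping from token to IDF value."""
    n_docs = len(tokenized_questions)
    doc_freq = Counter(tok for tokens in tokenized_questions for tok in set(tokens))
    return {tok: math.log(n_docs / (1 + df)) for tok, df in doc_freq.items()}

def idf_jaccard_similarity(tokens_a, tokens_b, idf_table):
    """IDF-weighted Jaccard similarity between two questions: shared tokens
    contribute their IDF weight to the intersection, all tokens to the union."""
    set_a, set_b = set(tokens_a), set(tokens_b)
    inter = sum(idf_table.get(t, 0.0) for t in set_a & set_b)
    union = sum(idf_table.get(t, 0.0) for t in set_a | set_b)
    return inter / union if union else 0.0
```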
S409, inputting the IDF similarity, the shallow similarity, the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training.
Understandably, the IDF similarity, the shallow similarity, the standard question, the similar question, the first similarity and the teacher hidden vector are used as input data of the student model, so that the student model learns not only the hidden-layer information and the semantic information between sentences but also the character-level and word-level IDF information between sentences, allowing the student model to pay attention to surface-level (literal) information as well.
In one embodiment, for example, the standard question is "XXX handset purchase" and the similar question is "how to buy an XXX handset"; the student model inputs are then:
(X X X handset purchase, XXX handset purchase);
(how to buy an X X X handset, how to buy an XXX handset);
(shallow similarity, character-level IDF similarity, word-level IDF similarity).
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
In an embodiment, a semantic matching model training device is provided, and the semantic matching model training device corresponds to the semantic matching model training method in the above embodiments one to one. As shown in fig. 3, the semantic matching model training apparatus includes a question pair obtaining module 10, a first similarity module 20, an initialization module 30, a student model training module 40, a second similarity module 50, a total loss value module 60, and a semantic matching model module 70. The functional modules are explained in detail as follows:
a question pair obtaining module 10, configured to obtain a question pair sample set; the question pair sample set comprises a plurality of question pair samples obtained from an expert domain knowledge base; each question pair sample comprises a standard question and a similar question;
a first similarity module 20, configured to calculate the similarity of the standard question and the similar question in the question pair sample through a teacher model to obtain a first similarity; the teacher model comprises a teacher hidden layer and a teacher output layer; the teacher hidden layer processes the question pair sample into a teacher hidden vector; the teacher output layer contains shared parameters;
an initialization module 30, configured to input the shared parameters into a student output layer of a student model so as to initialize the student model;
a student model training module 40, configured to input the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training;
a second similarity module 50, configured to calculate the similarity of the standard question and the similar question through the student model to obtain a second similarity; the student model comprises a student hidden layer; the student hidden layer produces student hidden vectors;
a total loss value module 60, configured to determine a total loss value according to the first similarity, the second similarity, the teacher hidden vector and the student hidden vector;
a semantic matching model module 70, configured to iteratively update the student model by a back propagation method when the total loss value does not meet a preset convergence condition, calculate the updated total loss value of the iteratively updated student model, and, when the updated total loss value meets the preset convergence condition, determine the student model corresponding to that updated total loss value as the semantic matching model.
Optionally, before the question pair obtaining module 10, the apparatus further includes:
a standard question module, configured to obtain the standard question from the expert domain knowledge base;
a similar question module, configured to process the standard question through a recall model to obtain a plurality of similar questions corresponding to the standard question;
a question pair determining module, configured to determine the standard question and a similar question corresponding to the standard question as a question pair sample;
a judging module, configured to judge whether the semantics of the standard question in the question pair sample are consistent with the semantics of the similar question to obtain a judgment result, and to generate a label for the question pair sample according to the judgment result.
Optionally, before the first similarity module 20, the apparatus further includes:
a word segmentation unit, configured to divide the standard question and the similar question into unit words to obtain a standard unit word combination corresponding to the standard question and a similar unit word combination corresponding to the similar question;
a first input data unit, configured to combine the standard unit word combination and the similar unit word combination as input data and input them into the teacher model.
Optionally, the student model training module 40 includes:
a word splitting unit, configured to split the standard question and the similar question into characters and words to obtain a standard unit character combination and a standard unit word combination corresponding to the standard question, and a similar unit character combination and a similar unit word combination corresponding to the similar question;
a second input data unit, configured to input the standard unit character combination, the standard unit word combination, the similar unit character combination and the similar unit word combination as input data into the student model.
Optionally, the student model training module 40 further includes:
a twin neural network model unit, configured to input the standard question and the similar question into a twin neural network model and train on the standard question and the similar question through the twin neural network model;
a shallow similarity unit, configured to obtain the shallow similarity between the sentences output by the twin neural network model;
a first training unit, configured to input the shallow similarity, the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training.
Optionally, the student model training module 40 further includes:
an IDF table unit, configured to count the occurrence frequency of the characters and words in the question pair sample set and generate an IDF table;
an IDF value calculating unit, configured to calculate a first IDF value of the standard question and a second IDF value of the similar question according to the IDF table;
an IDF similarity unit, configured to process the first IDF value and the second IDF value through a Jaccard algorithm to generate the IDF similarity between the standard question and the similar question; the IDF similarity includes a word-level IDF similarity and a character-level IDF similarity;
a second training unit, configured to input the IDF similarity, the shallow similarity, the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training.
For the specific definition of the semantic matching model training device, reference may be made to the above definition of the semantic matching model training method, which is not described herein again. The modules in the semantic matching model training device can be wholly or partially realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
In one embodiment, a computer device is provided, which may be a server, the internal structure of which may be as shown in fig. 4. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a readable storage medium and an internal memory. The readable storage medium stores an operating system, computer readable instructions, and a database. The internal memory provides an environment for the operating system and execution of computer-readable instructions in the readable storage medium. The database of the computer device is used for storing data related to the semantic matching model training method. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer readable instructions, when executed by a processor, implement a semantic matching model training method. The readable storage media provided by the present embodiment include nonvolatile readable storage media and volatile readable storage media.
In one embodiment, a computer device is provided, comprising a memory, a processor, and computer readable instructions stored on the memory and executable on the processor, the processor when executing the computer readable instructions implementing the steps of:
obtaining a question pair sample set; the question pair sample set comprises a plurality of question pair samples obtained from an expert domain knowledge base; each question pair sample comprises a standard question and a similar question;
calculating the similarity of the standard question and the similar question in the question pair sample through a teacher model to obtain a first similarity; the teacher model comprises a teacher hidden layer and a teacher output layer; the teacher hidden layer processes the question pair sample into a teacher hidden vector; the teacher output layer contains shared parameters;
inputting the shared parameters into a student output layer of a student model to initialize the student model;
inputting the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training;
calculating the similarity of the standard question and the similar question through the student model to obtain a second similarity; the student model comprises a student hidden layer; the student hidden layer produces student hidden vectors;
determining a total loss value according to the first similarity, the second similarity, the teacher hidden vector and the student hidden vector;
when the total loss value does not meet a preset convergence condition, iteratively updating the student model by a back propagation method, and calculating the updated total loss value of the iteratively updated student model; and when the updated total loss value meets the preset convergence condition, determining the student model corresponding to that updated total loss value as the semantic matching model.
In one embodiment, one or more computer-readable storage media storing computer-readable instructions are provided, the readable storage media provided by the embodiments including non-volatile readable storage media and volatile readable storage media. The readable storage medium has stored thereon computer readable instructions which, when executed by one or more processors, perform the steps of:
obtaining a question pair sample set; the question pair sample set comprises a plurality of question pair samples obtained from an expert domain knowledge base; each question pair sample comprises a standard question and a similar question;
calculating the similarity of the standard question and the similar question in the question pair sample through a teacher model to obtain a first similarity; the teacher model comprises a teacher hidden layer and a teacher output layer; the teacher hidden layer processes the question pair sample into a teacher hidden vector; the teacher output layer contains shared parameters;
inputting the shared parameters into a student output layer of a student model to initialize the student model;
inputting the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training;
calculating the similarity of the standard question and the similar question through the student model to obtain a second similarity; the student model comprises a student hidden layer; the student hidden layer produces student hidden vectors;
determining a total loss value according to the first similarity, the second similarity, the teacher hidden vector and the student hidden vector;
when the total loss value does not meet a preset convergence condition, iteratively updating the student model by a back propagation method, and calculating the updated total loss value of the iteratively updated student model; and when the updated total loss value meets the preset convergence condition, determining the student model corresponding to that updated total loss value as the semantic matching model.
It will be understood by those of ordinary skill in the art that all or part of the processes of the methods of the above embodiments may be implemented by hardware related to computer readable instructions, which may be stored in a non-volatile readable storage medium or a volatile readable storage medium, and when executed, the computer readable instructions may include processes of the above embodiments of the methods. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (10)

1. A semantic matching model training method is characterized by comprising the following steps:
obtaining a question pair sample set; the question pair sample set comprises a plurality of question pair samples obtained from an expert domain knowledge base; each question pair sample comprises a standard question and a similar question;
calculating the similarity of the standard question and the similar question in the question pair sample through a teacher model to obtain a first similarity; the teacher model comprises a teacher hidden layer and a teacher output layer; the teacher hidden layer processes the question pair sample into a teacher hidden vector; the teacher output layer contains shared parameters;
inputting the shared parameters into a student output layer of a student model to initialize the student model;
inputting the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training;
calculating the similarity of the standard question and the similar question through the student model to obtain a second similarity; the student model comprises a student hidden layer; the student hidden layer produces student hidden vectors;
determining a total loss value according to the first similarity, the second similarity, the teacher hidden vector and the student hidden vector;
when the total loss value does not meet a preset convergence condition, iteratively updating the student model by a back propagation method, and calculating the updated total loss value of the iteratively updated student model; and when the updated total loss value meets a preset convergence condition, determining the student model corresponding to the updated total loss value as the semantic matching model.
2. The semantic matching model training method according to claim 1, wherein before the obtaining of the question pair sample set, the method comprises:
obtaining the standard question from the expert domain knowledge base;
processing the standard question through a recall model to obtain a plurality of similar questions corresponding to the standard question;
determining the standard question and one similar question corresponding to the standard question as one question pair sample;
and judging whether the semantics of the standard question and the semantics of the similar question in the question pair sample are consistent to obtain a judgment result, and generating a label of the question pair sample according to the judgment result.
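
As an illustration of the sample-construction step in claim 2, the sketch below recalls candidate similar questions for each standard question and packs them into labeled question pair samples. The token-overlap ranking stands in for the recall model, and the positive label is assigned heuristically; in practice the label comes from judging whether the two questions are semantically consistent, for example by human annotation.

```python
from dataclasses import dataclass

@dataclass
class QuestionPair:
    standard: str
    similar: str
    label: int  # 1 if the two questions carry the same semantics, else 0

def recall_candidates(standard: str, candidates: list[str], k: int = 3) -> list[str]:
    """Toy recall model: rank candidate questions by word overlap with the standard question."""
    std_tokens = set(standard.split())
    scored = sorted(candidates,
                    key=lambda c: len(std_tokens & set(c.split())),
                    reverse=True)
    return scored[:k]

knowledge_base = ["how do I reset my account password"]
candidate_pool = ["forgot my password, how to reset it",
                  "how can I change my delivery address",
                  "steps to reset the account password"]

pairs = []
for std_q in knowledge_base:
    for sim_q in recall_candidates(std_q, candidate_pool):
        # Here every recalled pair is marked positive purely for illustration; a real pipeline
        # would generate the label from a semantic-consistency judgment of the pair.
        pairs.append(QuestionPair(std_q, sim_q, label=1))
```
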
3. The semantic matching model training method according to claim 1, wherein before the calculating of the similarity between the standard question and the similar question in the question pair sample through the teacher model to obtain the first similarity, the method comprises:
dividing the standard question and the similar question into words to obtain a standard unit word combination corresponding to the standard question and a similar unit word combination corresponding to the similar question;
and inputting the standard unit word combination and the similar unit word combination together as input data into the teacher model.
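
A minimal way to carry out the word division of claim 3 is sketched below. The jieba segmenter and the BERT-style "[CLS] ... [SEP]" packing of the two unit word combinations are assumptions made for illustration; the claim only requires that the two combinations be produced and fed to the teacher model together as one input.

```python
import jieba  # assumed segmenter; any tokenizer that yields unit words would do

def to_unit_words(question: str) -> list[str]:
    """Divide a question into its unit word combination."""
    return [w for w in jieba.lcut(question) if w.strip()]

standard_q = "如何重置账户密码"
similar_q  = "账户密码忘记了怎么重置"

standard_units = to_unit_words(standard_q)   # standard unit word combination
similar_units  = to_unit_words(similar_q)    # similar unit word combination

# Pack both combinations into one sample for the teacher model,
# e.g. a BERT-style "[CLS] standard units [SEP] similar units [SEP]" sequence (assumed format).
teacher_input = ["[CLS]", *standard_units, "[SEP]", *similar_units, "[SEP]"]
```
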
4. The semantic matching model training method according to claim 1, wherein the inputting of the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training comprises:
splitting the standard question and the similar question into unit characters and unit words to obtain a standard unit character combination and a standard unit word combination corresponding to the standard question, and a similar unit character combination and a similar unit word combination corresponding to the similar question;
and inputting the standard unit character combination, the standard unit word combination, the similar unit character combination and the similar unit word combination as input data into the student model.
5. The semantic matching model training method according to claim 1, wherein the inputting of the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training further comprises:
inputting the standard question and the similar question into a twin neural network model, and training the standard question and the similar question through the twin neural network model;
obtaining a shallow similarity between the sentences output by the twin neural network model;
and inputting the shallow similarity, the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training.
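
The twin neural network of claim 5 can be pictured as two towers that share the same weights and encode the two questions separately; the shallow similarity is then a simple comparison of the resulting sentence vectors, which is passed to the student model as an extra feature. The GRU encoder and cosine similarity below are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwinEncoder(nn.Module):
    """One tower of the twin (Siamese) network; both questions go through the same weights."""
    def __init__(self, vocab_size=5000, emb_dim=128, hid_dim=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, token_ids):
        _, h = self.rnn(self.emb(token_ids))   # h: (1, batch, hid_dim)
        return h.squeeze(0)                    # sentence vector per question

encoder = TwinEncoder()

def shallow_similarity(std_ids, sim_ids):
    """Encode both questions with the shared tower and compare the sentence vectors."""
    v_std, v_sim = encoder(std_ids), encoder(sim_ids)
    return F.cosine_similarity(v_std, v_sim, dim=-1)   # shallow similarity fed to the student model

std_ids = torch.randint(1, 5000, (4, 12))   # toy token-id batches for standard questions
sim_ids = torch.randint(1, 5000, (4, 12))   # and for similar questions
print(shallow_similarity(std_ids, sim_ids))
```
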
6. The semantic matching model training method according to claim 5, wherein the inputting of the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training further comprises:
counting the occurrence frequency of words in the question pair sample set to generate an IDF table;
calculating a first IDF value of the standard question and a second IDF value of the similar question according to the IDF table;
processing the first IDF value and the second IDF value through a Jaccard algorithm to generate an IDF similarity between the standard question and the similar question; the IDF similarity comprises a character-level IDF similarity and a word-level IDF similarity;
and inputting the IDF similarity, the shallow similarity, the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training.
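
For claim 6, the sketch below builds an IDF table from a question pair sample set and computes an IDF-weighted, Jaccard-style similarity between a standard question and a similar question at the word level; running the same routine over individual characters (for Chinese text, `list(question)`) would give the character-level IDF similarity. The exact smoothing and weighting scheme is an assumption for illustration.

```python
import math
from collections import Counter

def build_idf_table(question_pairs: list[tuple[str, str]]) -> dict[str, float]:
    """Count how many questions each token appears in and turn that into a smoothed IDF table."""
    n = len(question_pairs) * 2                      # each pair contributes two questions
    doc_freq = Counter()
    for std_q, sim_q in question_pairs:
        for q in (std_q, sim_q):
            doc_freq.update(set(q.split()))
    return {tok: math.log((n + 1) / (df + 1)) + 1 for tok, df in doc_freq.items()}

def idf_weighted_jaccard(std_q: str, sim_q: str, idf: dict[str, float]) -> float:
    """Jaccard similarity where each token contributes its IDF weight instead of counting 1."""
    a, b = set(std_q.split()), set(sim_q.split())
    inter = sum(idf.get(t, 1.0) for t in a & b)
    union = sum(idf.get(t, 1.0) for t in a | b)
    return inter / union if union else 0.0

pairs = [("reset account password", "how to reset my password"),
         ("change delivery address", "update the shipping address")]
idf_table = build_idf_table(pairs)
print(idf_weighted_jaccard(*pairs[0], idf_table))    # word-level IDF similarity for the first pair
```
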
7. A semantic matching model training device, comprising:
the question pair acquisition module is used for acquiring a question pair sample set; the question pair sample set comprises a plurality of question pair samples obtained from an expert domain knowledge base; one question pair sample comprises a standard question and a similar question;
the first similarity module is used for calculating the similarity between the standard question and the similar question in the question pair sample through a teacher model to obtain a first similarity; the teacher model comprises a teacher hidden layer and a teacher output layer; the teacher hidden layer is used for processing the question pair sample into a teacher hidden vector; the teacher output layer contains shared parameters;
the initialization module is used for inputting the shared parameters into a student output layer of a student model so as to initialize the student model;
the student model training module is used for inputting the standard question, the similar question, the first similarity and the teacher hidden vector into the student model containing the shared parameters for training;
the second similarity module is used for calculating the similarity between the standard question and the similar question through the student model to obtain a second similarity; the student model comprises a student hidden layer; the student hidden layer comprises a student hidden vector;
the total loss value module is used for determining a total loss value according to the first similarity, the second similarity, the teacher hidden vector and the student hidden vector;
and the semantic matching model module is used for iteratively updating the student model by using a back propagation method when the total loss value does not meet a preset convergence condition, calculating an updated total loss value of the iteratively updated student model, and determining the student model corresponding to the updated total loss value as the semantic matching model when the updated total loss value meets the preset convergence condition.
8. The semantic matching model training device according to claim 7, further comprising:
the standard question module, used for acquiring the standard question from the expert domain knowledge base;
the similar question module, used for processing the standard question through a recall model to obtain a plurality of similar questions corresponding to the standard question;
the question pair determining module, used for determining the standard question and one similar question corresponding to the standard question as one question pair sample;
and the judging module, used for judging whether the semantics of the standard question and the semantics of the similar question in the question pair sample are consistent to obtain a judgment result, and generating a label of the question pair sample according to the judgment result.
9. A computer device comprising a memory, a processor, and computer readable instructions stored in the memory and executable on the processor, wherein the processor, when executing the computer readable instructions, implements the semantic matching model training method according to any one of claims 1 to 6.
10. One or more readable storage media storing computer-readable instructions that, when executed by one or more processors, cause the one or more processors to perform the semantic matching model training method of any one of claims 1-6.
CN202110688539.0A 2021-06-21 2021-06-21 Semantic matching model training method, device, equipment and storage medium Active CN113239176B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110688539.0A CN113239176B (en) 2021-06-21 2021-06-21 Semantic matching model training method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113239176A true CN113239176A (en) 2021-08-10
CN113239176B CN113239176B (en) 2022-08-23

Family

ID=77140740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110688539.0A Active CN113239176B (en) 2021-06-21 2021-06-21 Semantic matching model training method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113239176B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113850807A (en) * 2021-11-30 2021-12-28 大族激光科技产业集团股份有限公司 Image sub-pixel matching positioning method, system, device and medium
CN114548261A (en) * 2022-02-18 2022-05-27 北京百度网讯科技有限公司 Data processing method, data processing device, electronic equipment and storage medium
CN116361658A (en) * 2023-04-07 2023-06-30 北京百度网讯科技有限公司 Model training method, task processing method, device, electronic equipment and medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200184155A1 (en) * 2017-05-10 2020-06-11 Oracle International Corporation Generating desired discourse structure from an arbitrary text
CN112465138A (en) * 2020-11-20 2021-03-09 平安科技(深圳)有限公司 Model distillation method, device, storage medium and equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘葛泓 et al., "Research on an intelligent contract-law question answering system based on Text-CNN joint classification and matching", 《软件工程》 (Software Engineering) *

Also Published As

Publication number Publication date
CN113239176B (en) 2022-08-23

Similar Documents

Publication Publication Date Title
CN113239176B (en) Semantic matching model training method, device, equipment and storage medium
CN109871532B (en) Text theme extraction method and device and storage medium
CN111859960B (en) Semantic matching method, device, computer equipment and medium based on knowledge distillation
CN109783655A (en) A kind of cross-module state search method, device, computer equipment and storage medium
WO2019154411A1 (en) Word vector retrofitting method and device
CN113157863A (en) Question and answer data processing method and device, computer equipment and storage medium
CN111259647A (en) Question and answer text matching method, device, medium and electronic equipment based on artificial intelligence
CN108304376B (en) Text vector determination method and device, storage medium and electronic device
CN113886550A (en) Question-answer matching method, device, equipment and storage medium based on attention mechanism
CN113836303A (en) Text type identification method and device, computer equipment and medium
CN114064852A (en) Method and device for extracting relation of natural language, electronic equipment and storage medium
CN111402864A (en) Voice processing method and electronic equipment
CN112749557A (en) Text processing model construction method and text processing method
CN113204679B (en) Code query model generation method and computer equipment
CN112650869B (en) Image retrieval reordering method and device, electronic equipment and storage medium
CN114398883A (en) Presentation generation method and device, computer readable storage medium and server
CN114707510A (en) Resource recommendation information pushing method and device, computer equipment and storage medium
CN114328894A (en) Document processing method, document processing device, electronic equipment and medium
CN114238798A (en) Search ranking method, system, device and storage medium based on neural network
CN113434652A (en) Intelligent question-answering method, intelligent question-answering device, intelligent question-answering equipment and storage medium
CN111639260A (en) Content recommendation method, device and storage medium thereof
CN113360744A (en) Recommendation method and device of media content, computer equipment and storage medium
CN115017291B (en) Hotspot problem analysis method and device, computer equipment and storage medium
CN112417086B (en) Data processing method, device, server and storage medium
CN117235236B (en) Dialogue method, dialogue device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant