CN113780454A - Model training and calling method and device, computer equipment and storage medium - Google Patents
Model training and calling method and device, computer equipment and storage medium
- Publication number
- CN113780454A (application number CN202111094266.3A)
- Authority
- CN
- China
- Prior art keywords
- label
- network model
- type
- pointer network
- symptom
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06F18/2415—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
- G06N3/08—Learning methods
Abstract
The application relates to the field of medical big data, and provides a pointer network model training and calling method, a device, computer equipment and a storage medium, wherein the method comprises the following steps: labeling at least one type of label on a sample statement to generate sample data; inputting the sample data into a pointer network model, and predicting the probability that each character of the sample data corresponds to a plurality of positions of each type of label; calculating a loss function value of the pointer network model; and performing iterative optimization on the pointer network model, returning to the step of inputting the sample data in the sample data set into the pointer network model and predicting the probability that each character of the sample data corresponds to a plurality of positions of each type of label, until the loss function is determined to be converged based on the loss function value, so as to obtain the trained pointer network model, thereby realizing generation of comprehensive and accurate symptom standard expression sentences. The application also relates to blockchain technology, and the sample data set can be stored in a blockchain node.
Description
Technical Field
The application relates to the technical field of medical big data, in particular to a model training and calling method, a model training and calling device, computer equipment and a storage medium.
Background
With the rapid development of deep learning technology, symptom information extraction methods based on deep learning models have emerged: symptom information is extracted from descriptions of symptoms by a model so as to generate symptom standard expression sentences, which can play a key role in downstream tasks such as disease prediction, auxiliary diagnosis and treatment, and medication recommendation. However, because a patient's description of symptoms is usually colloquial and varied, and generally contains several kinds of information, such as a description of the symptom type, a description of the symptom degree, and a description of the symptom frequency, existing models can only extract the symptom information corresponding to a single label at a time, that is, only one of the symptom type / symptom degree / symptom frequency information can be obtained each time, so a comprehensive and accurate standard expression sentence of the symptoms cannot be obtained.
Therefore, how to generate comprehensive and accurate symptom standard expression sentences is a problem to be solved urgently.
Disclosure of Invention
The application provides a model training and calling method, a model training and calling device, computer equipment and a storage medium, and realizes generation of comprehensive and accurate symptom standard expression sentences.
In a first aspect, the present application provides a pointer network model training method, where the method includes:
labeling at least one type of label for each sample statement in the sample statement set to generate a sample data set;
inputting the sample data in the sample data set into a pointer network model, and predicting the probability of each character of the sample data corresponding to a plurality of positions of each type of label;
determining a highest probability of each position of each type of label from the plurality of predicted probabilities, and calculating a loss function value of the pointer network model according to the highest probability of each position of each type of label;
and performing iterative optimization on the pointer network model, returning to the step of inputting the sample data in the sample data set into the pointer network model and predicting the probability of each character of the sample data corresponding to a plurality of positions of each type of label until the pointer network model is determined to be converged based on the calculated loss function value, and obtaining the trained pointer network model.
In a second aspect, the present application further provides a pointer network model calling method, where the pointer network model is a pointer network model obtained by training with the pointer network model training method, and the pointer network model calling method includes:
acquiring a symptom description sentence to be processed;
inputting the symptom description sentence into the pointer network model, and predicting the probability that each character of the symptom description sentence corresponds to a plurality of positions of each type of label;
extracting, from the symptom description sentence, a character segment which starts at the first character and ends at the second character, according to the first character corresponding to the highest probability of the starting position of each type of label and the second character corresponding to the highest probability of the ending position of each type of label;
and determining symptom information of each type of label corresponding to the symptom description statement according to the character segment corresponding to each type of label, so as to generate a symptom standard expression statement based on the symptom information.
In a third aspect, the present application further provides a pointer network model training apparatus, where the pointer network model training apparatus includes:
the label labeling module is used for labeling at least one type of label for each sample statement in the sample statement set to generate a sample data set;
the training module is used for inputting the sample data in the sample data set into a pointer network model and predicting the probability that each character of the sample data corresponds to a plurality of positions of each type of label;
a calculation module for determining a highest probability for each position of each type of label from the predicted plurality of probabilities, and calculating a loss function value of the pointer network model according to the highest probability for each position of each type of label;
and the first processing module is used for carrying out iterative optimization on the pointer network model until the pointer network model is determined to be converged based on the calculated loss function value, so that the trained pointer network model is obtained.
In a fourth aspect, the present application further provides a pointer network model invoking device, where the pointer network model invoking device includes:
the acquisition module is used for acquiring a symptom description statement to be processed;
the prediction module is used for inputting the symptom description statement into the pointer network model and predicting the probability that each character of the symptom description statement corresponds to a plurality of positions of each type of label;
the extraction module is used for extracting, from the symptom description sentence, a character segment which starts at the first character and ends at the second character, according to the first character corresponding to the highest probability of the starting position of each type of label and the second character corresponding to the highest probability of the ending position of each type of label;
and the second processing module is used for determining the symptom information of each type of label corresponding to the symptom description statement according to the character segment corresponding to each type of label so as to generate a symptom standard expression statement based on the symptom information.
In a fifth aspect, the present application further provides a computer device comprising a memory and a processor;
the memory for storing a computer program;
the processor is configured to execute the computer program and implement the pointer network model training method as described above when executing the computer program, or implement the pointer network model calling method as described above.
In a sixth aspect, the present application further provides a computer-readable storage medium storing a computer program, which when executed by a processor causes the processor to implement the pointer network model training method as described above, or to implement the pointer network model calling method as described above.
The application discloses a model training and calling method, a device, computer equipment and a storage medium. A sample data set is generated by labeling at least one type of label for each sample statement in a sample statement set; the sample data in the sample data set is input into a pointer network model for training, and the probability that each character of the sample data corresponds to a plurality of positions of each type of label is predicted; the highest probability of each position of each type of label is determined from the predicted probabilities, and a loss function value of the pointer network model is calculated according to the highest probability of each position of each type of label; iterative optimization is performed on the pointer network model, and the step of inputting the sample data into the pointer network model and predicting the probability that each character of the sample data corresponds to a plurality of positions of each type of label is returned to, until the pointer network model is determined to be converged based on the calculated loss function value, thereby obtaining the trained pointer network model. Symptom information corresponding to a plurality of labels can be obtained through the trained pointer network model, so that comprehensive and accurate symptom standard expression sentences are generated.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art based on these drawings without creative effort.
FIG. 1 is a schematic flow chart diagram illustrating steps of a pointer network model training method according to an embodiment of the present application;
FIG. 2 is a flow chart illustrating exemplary sub-steps of labeling each sample sentence in a sample sentence set with at least one type of label;
fig. 3 is a schematic diagram of sample data for tag annotation generation according to an embodiment of the present application;
FIG. 4 is a schematic diagram of another sample data for tag annotation generation according to an embodiment of the present application;
FIG. 5 is a flow chart illustrating exemplary substeps for calculating a loss function value for the pointer network model based on the highest probability for each location for each type of tag provided by an embodiment of the present application;
FIG. 6 is a schematic flow chart diagram illustrating steps of a pointer network model calling method according to an embodiment of the present application;
FIG. 7 is a schematic block diagram of a pointer network model training apparatus provided in an embodiment of the present application;
FIG. 8 is a schematic block diagram of a pointer network model invoking device provided in an embodiment of the present application;
fig. 9 is a schematic block diagram of a structure of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The flow diagrams depicted in the figures are merely illustrative and do not necessarily include all of the elements and operations/steps, nor do they necessarily have to be performed in the order depicted. For example, some operations/steps may be decomposed, combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
It is to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
The embodiments of the application provide a pointer network model training and calling method, a pointer network model training and calling device, computer equipment, and a storage medium, which are used for improving the accuracy of machine reading comprehension.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating a pointer network model training method according to an embodiment of the present application. The method can be applied to computer equipment, and the application scene of the method is not limited in the application. In the following, the pointer network model training method is described in detail by taking an example in which the pointer network model training method is applied to a computer device.
As shown in fig. 1, the pointer network model training method specifically includes steps S101 to S106.
S101, labeling at least one type of label for each sample statement in the sample statement set to generate a sample data set.
The sample sentence can be a symptom description sentence describing a symptom of a patient. For example, symptom description sentences such as "the stomach is particularly painful today" and "often in poor spirits" are used as initial sample sentences to form a corresponding sample sentence set.
Each sample statement in the sample statement set usually corresponds to at least one type of label, where the label types include, but are not limited to, symptom type, symptom degree, symptom frequency, and the like. For example, for the sample sentence "the stomach is particularly painful today", the corresponding label types include symptom type and symptom degree. As another example, for the sample sentence "often in poor spirits", the corresponding label types include symptom type and symptom frequency.
Illustratively, when a sample statement corresponds to multiple classes of tags, there is an association between the multiple classes of tags. For example, there is a correlation between three types of labels, i.e., symptom type, symptom degree, and symptom frequency.
And labeling each sample statement based on at least one type of label corresponding to each sample statement in the sample statement set to generate corresponding sample data. And, a sample data set is composed of the generated respective sample data.
In some embodiments, as shown in fig. 2, step S101 may include sub-step S1011 and sub-step S1012.
S1011, determining at least one type of label corresponding to each sample statement.
For example, still taking the sample sentence "the stomach is particularly painful today" as an example, the label types corresponding to the sample sentence are determined to include symptom type and symptom degree.
For another example, taking the sample sentence "often in poor spirits" as an example, the label types corresponding to the sample sentence are determined to include symptom type and symptom frequency.
And S1012, marking the starting position and the ending position of each type of label of each sample statement.
For each sample statement, each type of label has a start position, an end position, and other positions, i.e., all positions other than the start position and the end position. For example, for the sample sentence "the stomach is particularly painful today", the first character of "particularly" corresponds to the start position of the symptom degree label, the last character of "particularly" corresponds to the end position of the symptom degree label, and every remaining character corresponds to an other position of the symptom degree label.
And marking the starting position and the ending position of each type of label of each sample statement. Illustratively, a preset first label character is adopted to label the starting position and the ending position of each type of label of each sample statement, and a second label character is adopted to label other positions of the label. For example, a first tag character "1" and a second tag character "0" are preset, the start position and the end position of each type of tag are labeled with the first tag character "1", and the other positions of each type of tag are labeled with the second tag character "0". It should be noted that the first tag character and the second tag character may be other types of tag characters besides "1" and "0", and are not limited herein.
Illustratively, taking the sample sentence "the stomach is particularly painful today" as an example, the start position and the end position of each type of label are labeled. For example, as shown in FIG. 3, where start represents the start position of the label and end represents the end position of the label: for the symptom type label, the character "stomach" corresponds to the start position and is labeled with the first label character "1", the character "painful" corresponds to the end position and is labeled with the first label character "1", and the other positions are labeled with the second label character "0". For the symptom degree label, the first character of "particularly" corresponds to the start position and is labeled with the first label character "1", the last character of "particularly" corresponds to the end position and is labeled with the first label character "1", and the other positions are labeled with the second label character "0". For the symptom frequency label, every position is labeled with the second label character "0".
For another example, taking the sample sentence "often in poor spirits" as an example, as shown in FIG. 4, where start represents the start position of the label and end represents the end position of the label: for the symptom type label, the first character of "poor spirits" corresponds to the start position and is labeled with the first label character "1", the last character of "poor spirits" corresponds to the end position and is labeled with the first label character "1", and the other positions are labeled with the second label character "0". For the symptom frequency label, the first character of "often" corresponds to the start position and is labeled with the first label character "1", the last character of "often" corresponds to the end position and is labeled with the first label character "1", and the other positions are labeled with the second label character "0". For the symptom degree label, every position is labeled with the second label character "0".
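As an illustration of the labeling scheme above (cf. FIG. 3 and FIG. 4), the sketch below builds per-label start/end vectors of "1"/"0" marks for one sample sentence. The label names, tokenization, and helper function are illustrative only; the original examples are character-level Chinese sentences, and English word tokens are used here purely for readability.

```python
# Illustrative sketch of the "1"/"0" position labeling described above (cf. FIG. 3 and FIG. 4).
# The sentence, label names and word-level tokens are stand-ins, not fixed by the patent.

LABEL_TYPES = ["symptom_type", "symptom_degree", "symptom_frequency"]

def label_sample(tokens, spans):
    """spans maps a label type to a (start_index, end_index) pair, or None if absent."""
    sample = {"tokens": tokens, "labels": {}}
    for label in LABEL_TYPES:
        start = [0] * len(tokens)   # second label character "0" everywhere by default
        end = [0] * len(tokens)
        if spans.get(label) is not None:
            s, e = spans[label]
            start[s] = 1            # first label character "1" at the start position
            end[e] = 1              # first label character "1" at the end position
        sample["labels"][label] = {"start": start, "end": end}
    return sample

# "The stomach is particularly painful today" (character-level in the original Chinese;
# English word tokens are used here purely for readability).
tokens = ["today", "stomach", "particularly", "painful"]
sample = label_sample(tokens, {
    "symptom_type": (1, 3),        # span "stomach ... painful"
    "symptom_degree": (2, 2),      # span "particularly"
    "symptom_frequency": None,     # absent: every position stays "0"
})
print(sample["labels"]["symptom_degree"])  # {'start': [0, 0, 1, 0], 'end': [0, 0, 1, 0]}
```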
S102, inputting the sample data in the sample data set into a pointer network model, and predicting the probability that each character of the sample data corresponds to a plurality of positions of each type of label.
In this embodiment, a pointer network model is pre-constructed and used to extract symptom information from sample data containing one or more types of labels. For example, the pointer network model may be a BERT model with an added pointer structure; it should be noted that the pointer network model may also be a deep learning model other than a BERT model with an added pointer structure.
Illustratively, in the pointer network model, the number of pointer layers in the pointer network model is consistent with the number of types of tags, and each type of tag corresponds to one layer of pointer of the pointer network model. For example, since the general label types include three types, namely, symptom type, symptom degree, and symptom frequency, the pointer network model may be configured as a BERT model with a three-layer pointer structure, so as to extract the symptom information corresponding to the three types, namely, the symptom type, the symptom degree, and the symptom frequency, respectively.
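As an illustration of such a structure, the sketch below stacks one start/end/other scoring head per label type on a BERT encoder. It assumes PyTorch and the Hugging Face Transformers library; the class name, the three-way head design, and the softmax over the sequence are illustrative choices, not details fixed by the patent.

```python
import torch.nn as nn
from transformers import BertModel

class MultiLabelPointerNet(nn.Module):
    """BERT encoder with one pointer layer per label type (e.g. type / degree / frequency)."""

    def __init__(self, pretrained="bert-base-chinese", num_label_types=3):
        super().__init__()
        self.encoder = BertModel.from_pretrained(pretrained)
        hidden = self.encoder.config.hidden_size
        # One linear head per label type, scoring every token for start / end / other.
        self.pointer_heads = nn.ModuleList(
            [nn.Linear(hidden, 3) for _ in range(num_label_types)]
        )

    def forward(self, input_ids, attention_mask):
        h = self.encoder(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        # For each label type: (batch, seq_len, 3) probabilities, normalised over the sequence
        # for each position role, so the highest-probability character can be selected later.
        # (Padding positions are ignored here for brevity.)
        return [head(h).softmax(dim=1) for head in self.pointer_heads]
```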
And inputting the generated sample data into the constructed pointer network model, training the pointer network model, and predicting the probability of each character of each sample data corresponding to a plurality of positions of each type of label through the pointer network model. Illustratively, the probability that each character of the sample data corresponds to the start position, the end position and other positions of the symptom type label, the probability that each character corresponds to the start position, the end position and other positions of the symptom degree label, and the probability that each character corresponds to the start position, the end position and other positions of the symptom frequency label are predicted through the pointer network model.
For example, taking the sample sentence "the stomach is particularly painful today" as an example, after it is input into the pointer network model, for each character in the sample sentence (say, the first character of "today"), the pointer network model predicts the probability that the character corresponds to the start position, the end position, and the other positions of the symptom type label, the probability that it corresponds to the start position, the end position, and the other positions of the symptom degree label, and the probability that it corresponds to the start position, the end position, and the other positions of the symptom frequency label.
S103, determining the highest probability of each position of each type of label from the plurality of predicted probabilities, and calculating the loss function value of the pointer network model according to the highest probability of each position of each type of label.
The highest probability of the starting position, the ending position, and the other positions of each type of label is determined from the predicted probabilities of each character corresponding to these positions. Illustratively, the highest probability of the starting position of the symptom type label, the highest probability of the ending position of the symptom type label, the highest probabilities of the other positions of the symptom type label, the highest probability of the starting position of the symptom degree label, the highest probability of the ending position of the symptom degree label, the highest probabilities of the other positions of the symptom degree label, the highest probability of the starting position of the symptom frequency label, the highest probability of the ending position of the symptom frequency label, and the highest probabilities of the other positions of the symptom frequency label are determined.
And calculating the loss function value of the pointer network model according to the highest probability of the starting position of the symptom type label, the highest probability of the ending position of the symptom type label, the highest probability of other positions of the symptom type label, the highest probability of the starting position of the symptom degree label, the highest probability of the ending position of the symptom degree label, the highest probability of other positions of the symptom degree label, the highest probability of the starting position of the symptom frequency label, the highest probability of the ending position of the symptom frequency label and the highest probability of other positions of the symptom frequency label.
Illustratively, the weight of each position of each type of label is preset, and the highest probability of each position of each type of label is subjected to weighted summation calculation based on the weight of each position of each type of label to obtain the loss function value of the pointer network model.
In some embodiments, as shown in fig. 5, step S103 may include sub-step S1031 and sub-step S1032.
And S1031, obtaining a penalty coefficient of each position of each type of label corresponding to the sample data.
Illustratively, a penalty coefficient table corresponding to each position of each type of label of the sample data is preset, and the penalty coefficient of each position of each type of label corresponding to the sample data is obtained by inquiring the penalty coefficient table.
For example, a preset penalty coefficient table is shown in Table 1, where ZZ denotes the symptom type tag, FREQ denotes the symptom frequency tag, Level denotes the symptom degree tag, start denotes the start position of the tag, end denotes the end position of the tag, other denotes the other positions of the tag, and α denotes the penalty coefficient. Illustratively, the sum of the penalty coefficients corresponding to the respective positions of each type of tag is 1.
TABLE 1

| Tag | start (α) | end (α) | other (α) |
|---|---|---|---|
| ZZ (symptom type) | 0.4 | 0.4 | 0.2 |
| Level (symptom degree) | 0.4 | 0.4 | 0.2 |
| FREQ (symptom frequency) | 0.4 | 0.4 | 0.2 |
By referring to table 1, it is obtained that the penalty coefficient of the start position start of the symptom type tag ZZ is 0.4, the penalty coefficient of the end position end of the symptom type tag ZZ is 0.4, the penalty coefficient of the other position other of the symptom type tag ZZ is 0.2, the penalty coefficient of the start position start of the symptom Level tag Level is 0.4, the penalty coefficient of the end position end of the symptom Level tag Level is 0.4, the penalty coefficient of the other position other of the symptom Level tag Level is 0.2, the penalty coefficient of the start position start of the symptom frequency tag FREQ is 0.4, the penalty coefficient of the end position end of the symptom frequency tag FREQ is 0.4, and the penalty coefficient of the other position other of the symptom frequency tag FREQ is 0.2.
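A penalty table like Table 1 can be held in a simple lookup structure; the sketch below mirrors the values quoted above, with the dictionary layout being an implementation choice rather than something specified by the patent.

```python
# Penalty coefficients mirroring Table 1; for each tag the values sum to 1.
PENALTY = {
    "ZZ":    {"start": 0.4, "end": 0.4, "other": 0.2},   # symptom type
    "Level": {"start": 0.4, "end": 0.4, "other": 0.2},   # symptom degree
    "FREQ":  {"start": 0.4, "end": 0.4, "other": 0.2},   # symptom frequency
}

def penalty(label_type, position):
    """Look up the penalty coefficient for 'start', 'end' or 'other' of a label type."""
    return PENALTY[label_type][position]

assert all(abs(sum(v.values()) - 1.0) < 1e-9 for v in PENALTY.values())
```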
S1032, calculating the loss function value of the pointer network model according to the penalty coefficient of each position of each type of label and the highest probability of each position of each type of label.
Illustratively, the loss function value of the pointer network model is calculated according to the following calculation formula, wherein Loss represents the loss function value, n represents the number of label types, m represents the base of the Log function, α_{i,start} represents the penalty coefficient of the starting position of the i-th class label, α_{i,end} represents the penalty coefficient of the ending position of the i-th class label, α_{i,other} represents the penalty coefficient of the other positions of the i-th class label, P_{i,start} represents the highest probability of the starting position of the i-th class label, P_{i,end} represents the highest probability of the ending position of the i-th class label, and P_{i,other} represents the highest probability of the other positions of the i-th class label. It should be noted that the base m of the Log function can be set to a common base, such as 2 or 10.
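One loss formula consistent with the definitions above — a reconstruction for reference rather than the patent's verbatim formula, which is not reproduced in this text — is:

```latex
\mathrm{Loss} \;=\; -\sum_{i=1}^{n}\Big(
      \alpha_{i,\mathrm{start}} \,\log_{m} P_{i,\mathrm{start}}
    + \alpha_{i,\mathrm{end}}   \,\log_{m} P_{i,\mathrm{end}}
    + \alpha_{i,\mathrm{other}} \,\log_{m} P_{i,\mathrm{other}} \Big)
```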
For example, if the label types include 3 types, i.e., n = 3, and the base of the Log function is 2, i.e., m = 2, the loss function value Loss of the pointer network model is calculated accordingly.
S104, determining whether the pointer network model converges or not based on the loss function value; if not, executing step S105; if yes, go to step S106;
S105, performing iterative optimization on the pointer network model, and returning to execute step S102;
and S106, finishing the training to obtain the trained pointer network model.
Illustratively, the pointer network model is iteratively optimized by a gradient descent method: the parameters of the pointer network model are updated, the above steps are repeatedly executed based on the optimized pointer network model, and a loss function value is calculated each time. Through the iterative optimization, the calculated loss function value gradually decreases; when the loss function is minimized, the pointer network model converges, the training of the pointer network model ends, and the trained pointer network model is obtained.
Based on the trained pointer network model, symptom information corresponding to a plurality of labels (correlation exists among the labels) can be extracted each time, for example, symptom information corresponding to a plurality of correlated labels such as symptom type, symptom degree and symptom frequency can be obtained each time, so that comprehensive and accurate symptom standard expression sentences can be obtained through the symptom information.
In the above embodiment, a sample data set is generated by labeling at least one type of label for each sample statement in the sample statement set; the sample data in the sample data set is input into a pointer network model for training, and the probability that each character of the sample data corresponds to a plurality of positions of each type of label is predicted, wherein the plurality of positions of each type of label include the start position, the end position, and the other positions except the start position and the end position; the highest probability of each position of each type of label is determined from the predicted probabilities, and a loss function value of the pointer network model is calculated according to the highest probability of each position of each type of label; iterative optimization is performed on the pointer network model, and the step of inputting the sample data into the pointer network model and predicting the probability that each character of the sample data corresponds to a plurality of positions of each type of label is returned to, until the pointer network model is determined to be converged based on the calculated loss function value, thereby obtaining the trained pointer network model. Symptom information corresponding to a plurality of labels can be obtained through the trained pointer network model, so that comprehensive and accurate symptom standard expression sentences are generated.
Referring to fig. 6, fig. 6 is a flowchart illustrating a pointer network model invoking method according to an embodiment of the present application. The method can be applied to computer equipment, and the application scene of the method is not limited in the application. The pointer network model calling method is described in detail below by taking an example in which the pointer network model calling method is applied to a computer device.
The pointer network model called is the pointer network model obtained by training through the pointer network model training method in the embodiment.
As shown in fig. 6, the pointer network model invoking method specifically includes steps S201 to S204.
S201, obtaining a symptom description statement to be processed.
The symptom description sentence can be a sentence in which the patient describes a symptom in spoken language, for example, sentences such as "the stomach is particularly painful today" and "often in poor spirits". Illustratively, the corresponding symptom description statement may be obtained by collecting voice data of the patient and performing voice recognition on the voice data. It is to be understood that the manner of obtaining the symptom description sentence is not limited to the speech recognition listed above, and is not particularly limited herein.
S202, inputting the symptom description sentence into the pointer network model, and predicting the probability that each character of the symptom description sentence corresponds to a plurality of positions of each type of label.
And inputting the symptom description sentence to be processed into the trained pointer network model, and predicting the probability of each character of the symptom description sentence corresponding to a plurality of positions of each type of label through the pointer network model.
For example, taking the example that the label corresponding to the symptom description sentence includes a symptom type, a symptom degree, and a symptom frequency, the pointer network model predicts the probability that each character of the symptom description sentence corresponds to the start position, the end position, and other positions of the symptom type, the probability that each character corresponds to the start position, the end position, and other positions of the symptom degree, and the probability that each character corresponds to the start position, the end position, and other positions of the symptom frequency.
S203, extracting, from the symptom description sentence, a character segment which starts at the first character and ends at the second character, according to the first character corresponding to the highest probability of the starting position of each type of label and the second character corresponding to the highest probability of the ending position of each type of label.
For example, taking the symptom description sentence "the stomach is particularly painful today" as an example, if the character corresponding to the highest probability of the starting position of the symptom type label predicted by the pointer network model is "stomach" and the character corresponding to the highest probability of the ending position of the symptom type label is "painful", the character segment starting at "stomach" and ending at "painful", namely "stomach particularly painful", is extracted from "the stomach is particularly painful today". If the character corresponding to the highest probability of the starting position of the symptom degree label is the first character of "particularly" and the character corresponding to the highest probability of the ending position of the symptom degree label is the last character of "particularly", the character segment "particularly" is extracted.
For another example, taking the symptom description sentence "often in poor spirits" as an example, if the character corresponding to the highest probability of the starting position of the symptom type label predicted by the pointer network model is the first character of "poor spirits" and the character corresponding to the highest probability of the ending position of the symptom type label is the last character of "poor spirits", the character segment "poor spirits" is extracted. If the character corresponding to the highest probability of the starting position of the symptom frequency label is the first character of "often" and the character corresponding to the highest probability of the ending position of the symptom frequency label is the last character of "often", the character segment "often" is extracted.
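A minimal decoding sketch for steps S202 to S203 is given below, reusing the MultiLabelPointerNet and label names from the earlier sketches; the tokenizer handling, the [CLS] offset, and the min_prob threshold for deciding that a label is absent are assumptions for illustration only.

```python
import torch

LABEL_TYPES = ["symptom_type", "symptom_degree", "symptom_frequency"]

def extract_spans(model, tokenizer, sentence, min_prob=0.5):
    """For each label type, return the character segment between the highest-probability
    start position and the highest-probability end position, or None if the label is
    judged absent (the min_prob threshold is an illustrative assumption)."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        probs = model(enc["input_ids"], enc["attention_mask"])
    chars = list(sentence)
    spans = {}
    for label, p in zip(LABEL_TYPES, probs):
        p = p[0]                                  # (seq_len, 3): start / end / other scores
        start_p, start_idx = p[:, 0].max(dim=0)   # highest probability of the start position
        end_p, end_idx = p[:, 1].max(dim=0)       # highest probability of the end position
        s, e = start_idx.item() - 1, end_idx.item() - 1   # -1 assumes a leading [CLS] token
        if start_p < min_prob or end_p < min_prob or s < 0 or e < s or e >= len(chars):
            spans[label] = None
        else:
            spans[label] = "".join(chars[s:e + 1])
    return spans
```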
S204, according to the character segment corresponding to each type of label, determining symptom information of each type of label corresponding to the symptom description statement, and generating a symptom standard expression statement based on the symptom information.
For example, taking the symptom description sentence "the stomach is particularly painful today" as an example, the character segment "stomach particularly painful" corresponding to the symptom type label and the character segment "particularly" corresponding to the symptom degree label are extracted, and based on these character segments it is determined that the symptom information corresponding to the symptom type label is "stomach ache", the symptom information corresponding to the symptom degree label is "particularly", and the symptom information corresponding to the symptom frequency label is "none", that is, {"symptom type": "stomach ache", "symptom frequency": "none", "symptom degree": "particularly"} is obtained.
For another example, taking the symptom description sentence "often in poor spirits" as an example, the character segment "poor spirits" corresponding to the symptom type label and the character segment "often" corresponding to the symptom frequency label are extracted, and it is determined that the symptom information corresponding to the symptom type label is "poor spirits", the symptom information corresponding to the symptom frequency label is "often", and the symptom information corresponding to the symptom degree label is "none", that is, {"symptom type": "poor spirits", "symptom frequency": "often", "symptom degree": "none"} is obtained.
And generating a symptom standard expression statement corresponding to the symptom description statement, namely a professional expression, based on the obtained symptom information corresponding to each type of label.
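The assembly of the symptom information and the standard expression statement can then be a simple post-processing step. The sketch below continues the example; the normalization mapping (e.g. "stomach particularly painful" → "stomach ache") and the template ordering are illustrative assumptions, not the patent's own rules.

```python
def build_standard_expression(spans, normalize=None):
    """Assemble per-label symptom information and a simple standard expression sentence
    from the character segments extracted for each type of label."""
    normalize = normalize or {}
    type_segment = spans.get("symptom_type")
    info = {
        "symptom type": normalize.get(type_segment, type_segment) or "none",
        "symptom frequency": spans.get("symptom_frequency") or "none",
        "symptom degree": spans.get("symptom_degree") or "none",
    }
    parts = [info[key] for key in ("symptom frequency", "symptom degree", "symptom type")
             if info[key] != "none"]
    return info, " ".join(parts)

info, statement = build_standard_expression(
    {"symptom_type": "stomach particularly painful", "symptom_degree": "particularly"},
    normalize={"stomach particularly painful": "stomach ache"},   # illustrative normalization
)
print(info)       # {'symptom type': 'stomach ache', 'symptom frequency': 'none', 'symptom degree': 'particularly'}
print(statement)  # particularly stomach ache
```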
The generated symptom standard expression sentences are combined with related attribute information, and the method can be used for downstream tasks such as disease prediction, constitution prediction, medication recommendation and the like.
In this embodiment, a symptom description sentence to be processed is obtained and input into the pointer network model; the probability that each character of the symptom description sentence corresponds to a plurality of positions of each type of label is predicted; a character segment starting at the first character and ending at the second character is extracted from the symptom description sentence according to the first character corresponding to the highest probability of the starting position of each type of label and the second character corresponding to the highest probability of the ending position of each type of label; the symptom information of each type of label corresponding to the symptom description sentence is determined according to the character segment corresponding to each type of label; and a comprehensive and accurate symptom standard expression sentence is generated based on the obtained symptom information of each type of label.
Referring to fig. 7, fig. 7 is a schematic block diagram of a pointer network model training apparatus according to an embodiment of the present application, which can be configured in a computer device for executing the pointer network model training method.
As shown in fig. 7, the pointer network model training apparatus 1000 includes: a label labeling module 1001, a training module 1002, a calculating module 1003 and a first processing module 1004.
A label labeling module 1001, configured to perform at least one type of label labeling on each sample statement in the sample statement set, and generate a sample data set;
a training module 1002, configured to input sample data in the sample data set into a pointer network model, and predict probabilities that each character of the sample data corresponds to multiple positions of each type of label;
a calculating module 1003, configured to determine a highest probability of each position of each type of label from the predicted multiple probabilities, and calculate a loss function value of the pointer network model according to the highest probability of each position of each type of label;
a first processing module 1004, configured to perform iterative optimization on the pointer network model until the pointer network model is determined to be converged based on the calculated loss function value, so as to obtain the trained pointer network model.
In one embodiment, the calculation module 1003 is further configured to:
obtaining a penalty coefficient of each position of each type of label corresponding to the sample data; calculating the loss function value of the pointer network model according to the penalty coefficient of each position of each type of label and the highest probability of each position of each type of label.
In one embodiment, the calculation module 1003 is further configured to:
according to the calculation formula:
calculating the loss function value; wherein Loss represents the loss function value, n represents the number of label types, m represents the base of the Log function, α_{i,start} represents the penalty coefficient of the starting position of the i-th class label, α_{i,end} represents the penalty coefficient of the ending position of the i-th class label, α_{i,other} represents the penalty coefficient of the other positions of the i-th class label, P_{i,start} represents the highest probability of the starting position of the i-th class label, P_{i,end} represents the highest probability of the ending position of the i-th class label, and P_{i,other} represents the highest probability of the other positions of the i-th class label.
In one embodiment, the number of pointer layers in the pointer network model is consistent with the number of types of tags, and each type of tag corresponds to a layer of pointer of the pointer network model.
In one embodiment, the label labeling module 1001 is further configured to:
determining at least one type of label corresponding to each sample statement; and marking the starting position and the ending position of each type of label of each sample statement.
It should be noted that, as will be clearly understood by those skilled in the art, for convenience and brevity of description, the specific working processes of the apparatus and the modules described above may refer to the corresponding processes in the foregoing embodiments of the pointer network model training method, and are not described herein again.
Referring to fig. 8, fig. 8 is a schematic block diagram of a pointer network model invoking device according to an embodiment of the present application, which can be configured in a computer device to execute the pointer network model invoking method described above.
As shown in fig. 8, the pointer network model invoking device 2000 includes: an acquisition module 2001, a prediction module 2002, an extraction module 2003, and a second processing module 2004.
An obtaining module 2001, configured to obtain a symptom description statement to be processed;
a prediction module 2002, configured to input the symptom description statement into the pointer network model, and predict a probability that each character of the symptom description statement corresponds to multiple positions of each type of tag;
an extraction module 2003, configured to extract, from the symptom description sentence, a character segment beginning with the first character and ending with the second character according to a first character corresponding to a highest probability of a start position of each type of tag and a second character corresponding to a highest probability of an end position of each type of tag;
the second processing module 2004 is configured to determine, according to the character segment corresponding to each type of tag, symptom information of each type of tag corresponding to the symptom description statement, and generate a symptom standard expression statement based on the symptom information.
It should be noted that, as will be clearly understood by those skilled in the art, for convenience and brevity of description, the specific working processes of the apparatus and the modules described above may refer to the corresponding processes in the foregoing pointer network model invoking method embodiment, and are not described herein again.
The methods, apparatus, and devices of the present application may be deployed in numerous general-purpose or special-purpose computing system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
For example, the method and apparatus described above may be implemented in the form of a computer program that can be run on a computer device as shown in fig. 9.
Referring to fig. 9, fig. 9 is a schematic block diagram of a computer device according to an embodiment of the present disclosure.
Referring to fig. 9, the computer device includes a processor and a memory connected by a system bus, wherein the memory may include a nonvolatile storage medium and an internal memory.
The processor is used for providing calculation and control capability and supporting the operation of the whole computer equipment.
The internal memory provides an environment for the execution of a computer program on a non-volatile storage medium, which when executed by the processor causes the processor to perform any one of a pointer network model training method or a pointer network model calling method.
It should be understood that the Processor may be a Central Processing Unit (CPU), and the Processor may be other general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, etc. Wherein a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
Wherein, in one embodiment, the processor is configured to execute a computer program stored in the memory to implement the steps of:
labeling at least one type of label for each sample statement in the sample statement set to generate a sample data set; inputting the sample data in the sample data set into a pointer network model, and predicting the probability of each character of the sample data corresponding to a plurality of positions of each type of label; determining a highest probability of each position of each type of label from the plurality of predicted probabilities, and calculating a loss function value of the pointer network model according to the highest probability of each position of each type of label; and performing iterative optimization on the pointer network model, returning to the step of inputting the sample data in the sample data set into the pointer network model and predicting the probability of each character of the sample data corresponding to a plurality of positions of each type of label until the pointer network model is determined to be converged based on the calculated loss function value, and obtaining the trained pointer network model.
In one embodiment, the processor, in implementing the calculating the loss function value for the pointer network model according to the highest probability for each location of each type of tag, is configured to implement:
obtaining a penalty coefficient of each position of each type of label corresponding to the sample data; calculating the loss function value of the pointer network model according to the penalty coefficient of each position of each type of label and the highest probability of each position of each type of label.
In one embodiment, the processor, in implementing the calculating the loss function value for the pointer network model according to the penalty factor for each location for each type of tag and the highest probability for each location for each type of tag, is configured to implement:
according to the calculation formula:
calculating the loss function value; wherein Loss represents the loss function value, n represents the number of label types, m represents the base of the Log function, α_{i,start} represents the penalty coefficient of the starting position of the i-th class label, α_{i,end} represents the penalty coefficient of the ending position of the i-th class label, α_{i,other} represents the penalty coefficient of the other positions of the i-th class label, P_{i,start} represents the highest probability of the starting position of the i-th class label, P_{i,end} represents the highest probability of the ending position of the i-th class label, and P_{i,other} represents the highest probability of the other positions of the i-th class label.
In one embodiment, the number of pointer layers in the pointer network model is consistent with the number of types of tags, and each type of tag corresponds to a layer of pointer of the pointer network model.
In one embodiment, the processor, when implementing the at least one type of tagging performed on each sample statement in the set of sample statements, is configured to implement:
determining at least one type of label corresponding to each sample statement;
and marking the starting position and the ending position of each type of label of each sample statement.
In one embodiment, the processor is configured to execute a computer program stored in the memory to perform the steps of:
acquiring a symptom description sentence to be processed; inputting the symptom description sentence into the pointer network model, and predicting the probability that each character of the symptom description sentence corresponds to a plurality of positions of each type of label; extracting, from the symptom description sentence, a character segment which starts at the first character and ends at the second character, according to the first character corresponding to the highest probability of the starting position of each type of label and the second character corresponding to the highest probability of the ending position of each type of label; and determining symptom information of each type of label corresponding to the symptom description statement according to the character segment corresponding to each type of label, so as to generate a symptom standard expression statement based on the symptom information.
The computer-readable storage medium may be an internal storage unit of the computer device described in the foregoing embodiment, for example, a hard disk or a memory of the computer device. The computer readable storage medium may also be an external storage device of the computer device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital Card (SD Card), a Flash memory Card (Flash Card), and the like provided on the computer device.
Further, the computer-readable storage medium may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function, and the like; the storage data area may store data created according to the use of the blockchain node, and the like.
The blockchain referred to in this application is a novel application mode of computer technologies such as distributed data storage, peer-to-peer transmission, consensus mechanisms, and encryption algorithms. A blockchain is essentially a decentralized database: a series of data blocks linked by cryptographic methods, each of which contains information of a batch of network transactions and is used to verify the validity (anti-counterfeiting) of that information and to generate the next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like.
While the invention has been described with reference to specific embodiments, the scope of the invention is not limited thereto, and those skilled in the art can easily conceive various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (10)
1. A pointer network model training method is characterized by comprising the following steps:
labeling at least one type of label for each sample statement in the sample statement set to generate a sample data set;
inputting the sample data in the sample data set into a pointer network model, and predicting the probability of each character of the sample data corresponding to a plurality of positions of each type of label;
determining a highest probability of each position of each type of label from the plurality of predicted probabilities, and calculating a loss function value of the pointer network model according to the highest probability of each position of each type of label;
and performing iterative optimization on the pointer network model, returning to the step of inputting the sample data in the sample data set into the pointer network model and predicting the probability of each character of the sample data corresponding to a plurality of positions of each type of label until the pointer network model is determined to be converged based on the calculated loss function value, and obtaining the trained pointer network model.
2. The pointer network model training method of claim 1, wherein the calculating the loss function value of the pointer network model according to the highest probability of each position of each type of label comprises:
obtaining a penalty coefficient of each position of each type of label corresponding to the sample data;
calculating the loss function value of the pointer network model according to the penalty coefficient of each position of each type of label and the highest probability of each position of each type of label.
3. The pointer network model training method according to claim 2, wherein the calculating the loss function value of the pointer network model according to the penalty coefficient for each position of each type of label and the highest probability for each position of each type of label comprises:
according to the calculation formula:
wherein Loss represents the loss function value, n represents the number of label types, m represents the base of the log function, α_{i,start} represents the penalty coefficient of the starting position of the i-th type of label, α_{i,end} represents the penalty coefficient of the ending position of the i-th type of label, α_{i,other} represents the penalty coefficient of the other positions of the i-th type of label, P_{i,start} represents the highest probability of the starting position of the i-th type of label, P_{i,end} represents the highest probability of the ending position of the i-th type of label, and P_{i,other} represents the highest probability of the other positions of the i-th type of label.
4. The method of claim 1, wherein the number of pointer layers in the pointer network model is consistent with the number of label types, and each type of label corresponds to one pointer layer in the pointer network model.
5. The pointer network model training method of any one of claims 1 to 4, wherein the labeling of at least one type of label for each sample statement in the sample statement set comprises:
determining at least one type of label corresponding to each sample statement;
and marking the starting position and the ending position of each type of label of each sample statement.
6. A pointer network model calling method, wherein the pointer network model is a pointer network model trained by the pointer network model training method according to any one of claims 1 to 5, and the pointer network model calling method includes:
acquiring a symptom description sentence to be processed;
inputting the symptom description sentence into the pointer network model, and predicting the probability that each character of the symptom description sentence corresponds to a plurality of positions of each type of label;
according to the first character corresponding to the highest probability of the starting position of each type of label and the second character corresponding to the highest probability of the ending position of each type of label, extracting, from the symptom description sentence, a character segment that starts at the first character and ends at the second character;
and determining symptom information of each type of label corresponding to the symptom description sentence according to the character segment corresponding to each type of label, so as to generate a symptom standard expression sentence based on the symptom information.
7. A pointer network model training apparatus, comprising:
the label labeling module is used for labeling at least one type of label for each sample statement in the sample statement set to generate a sample data set;
the training module is used for inputting the sample data in the sample data set into a pointer network model and predicting the probability that each character of the sample data corresponds to a plurality of positions of each type of label;
a calculation module for determining a highest probability for each position of each type of label from the predicted plurality of probabilities, and calculating a loss function value of the pointer network model according to the highest probability for each position of each type of label;
and the first processing module is used for carrying out iterative optimization on the pointer network model until the pointer network model is determined to be converged based on the calculated loss function value, so that the trained pointer network model is obtained.
8. A pointer network model invoking apparatus, comprising:
the acquisition module is used for acquiring a symptom description statement to be processed;
the prediction module is used for inputting the symptom description statement into the pointer network model and predicting the probability that each character of the symptom description statement corresponds to a plurality of positions of each type of label;
the extraction module is used for extracting, from the symptom description sentence, a character segment that starts at the first character and ends at the second character, according to the first character corresponding to the highest probability of the starting position of each type of label and the second character corresponding to the highest probability of the ending position of each type of label;
and the second processing module is used for determining the symptom information of each type of label corresponding to the symptom description statement according to the character segment corresponding to each type of label so as to generate a symptom standard expression statement based on the symptom information.
9. A computer device, wherein the computer device comprises a memory and a processor;
the memory for storing a computer program;
the processor, configured to execute the computer program and to implement the pointer network model training method according to any one of claims 1 to 5, or to implement the pointer network model calling method according to claim 6 when the computer program is executed.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement the pointer network model training method of any one of claims 1 to 5 or the pointer network model invoking method of claim 6.
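For illustration only, the following self-contained toy sketch combines the penalty-weighted loss of claims 2 and 3 (in the assumed form given after the formula's variable definitions above) with the iterate-until-convergence rule of claim 1; the log base, penalty values, and stand-in parameters are arbitrary assumptions, not the patented implementation:

```python
import torch

def penalty_weighted_loss(highest_probs: torch.Tensor, penalties: torch.Tensor, base: float = 10.0) -> torch.Tensor:
    """highest_probs / penalties: (num_label_types, 3) tensors holding, per label type,
    the highest probability and the penalty coefficient of the starting, ending, and
    other positions. Implements the assumed penalty-weighted log loss; base m is arbitrary here."""
    return -(penalties * torch.log(highest_probs) / torch.log(torch.tensor(base))).sum()

# Toy convergence loop: fabricated parameters stand in for the pointer network's
# predicted highest probabilities; real training would recompute them from sample data.
penalties = torch.tensor([[1.0, 1.0, 0.5],
                          [2.0, 2.0, 0.5]])        # 2 label types, hypothetical coefficients
params = torch.zeros(2, 3, requires_grad=True)      # stand-in trainable parameters
optimizer = torch.optim.SGD([params], lr=0.5)
previous = float("inf")
for step in range(100):                             # iterative optimization
    loss = penalty_weighted_loss(torch.sigmoid(params), penalties)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    if abs(previous - float(loss)) < 1e-4:          # converged: loss value has stopped changing
        break
    previous = float(loss)
```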
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111094266.3A CN113780454B (en) | 2021-09-17 | 2021-09-17 | Model training and calling method and device, computer equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113780454A true CN113780454A (en) | 2021-12-10 |
CN113780454B CN113780454B (en) | 2023-10-24 |
Family
ID=78851883
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111094266.3A Active CN113780454B (en) | 2021-09-17 | 2021-09-17 | Model training and calling method and device, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113780454B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115132303A (en) * | 2022-04-28 | 2022-09-30 | 腾讯科技(深圳)有限公司 | Physiological label prediction method, model training method, device, equipment and medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108416384A (en) * | 2018-03-05 | 2018-08-17 | 苏州大学 | A kind of image tag mask method, system, equipment and readable storage medium storing program for executing |
CN110309305A (en) * | 2019-06-14 | 2019-10-08 | 中国电子科技集团公司第二十八研究所 | Machine based on multitask joint training reads understanding method and computer storage medium |
CN111062413A (en) * | 2019-11-08 | 2020-04-24 | 深兰科技(上海)有限公司 | Road target detection method and device, electronic equipment and storage medium |
CN112069639A (en) * | 2020-09-09 | 2020-12-11 | 国网山东省电力公司威海供电公司 | Method and system for planning grid of power system |
CN112509600A (en) * | 2020-12-11 | 2021-03-16 | 平安科技(深圳)有限公司 | Model training method and device, voice conversion method and device and storage medium |
CN112818946A (en) * | 2021-03-08 | 2021-05-18 | 苏州科达科技股份有限公司 | Training of age identification model, age identification method and device and electronic equipment |
CN113127631A (en) * | 2021-04-23 | 2021-07-16 | 重庆邮电大学 | Text summarization method based on multi-head self-attention mechanism and pointer network |
CN113326187A (en) * | 2021-05-25 | 2021-08-31 | 扬州大学 | Data-driven intelligent detection method and system for memory leakage |
Also Published As
Publication number | Publication date |
---|---|
CN113780454B (en) | 2023-10-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021135469A1 (en) | Machine learning-based information extraction method, apparatus, computer device, and medium | |
CN110298019A (en) | Name entity recognition method, device, equipment and computer readable storage medium | |
CN113051371B (en) | Chinese machine reading understanding method and device, electronic equipment and storage medium | |
CN112256822A (en) | Text search method and device, computer equipment and storage medium | |
US9754083B2 (en) | Automatic creation of clinical study reports | |
CN112215008A (en) | Entity recognition method and device based on semantic understanding, computer equipment and medium | |
CN111710383A (en) | Medical record quality control method and device, computer equipment and storage medium | |
CN113449187A (en) | Product recommendation method, device and equipment based on double portraits and storage medium | |
CN113157959B (en) | Cross-modal retrieval method, device and system based on multi-modal topic supplementation | |
CN115714002B (en) | Training method for depression risk detection model, depression symptom early warning method and related equipment | |
CN112885478A (en) | Medical document retrieval method, medical document retrieval device, electronic device, and storage medium | |
CN113159013A (en) | Paragraph identification method and device based on machine learning, computer equipment and medium | |
CN113889074A (en) | Voice generation method, device, equipment and medium | |
CN118277573B (en) | Pre-hospital emergency text classification labeling method based on ChatGLM model, electronic equipment, storage medium and computer program product | |
CN109284497B (en) | Method and apparatus for identifying medical entities in medical text in natural language | |
CN111967253A (en) | Entity disambiguation method and device, computer equipment and storage medium | |
CN113780454B (en) | Model training and calling method and device, computer equipment and storage medium | |
CN113297852B (en) | Medical entity word recognition method and device | |
CN115422368A (en) | Event coreference resolution method and device, computer equipment and storage medium | |
CN113935328A (en) | Text abstract generation method and device, electronic equipment and storage medium | |
CN114691716A (en) | SQL statement conversion method, device, equipment and computer readable storage medium | |
CN113450764A (en) | Text voice recognition method, device, equipment and storage medium | |
CN113868424A (en) | Text theme determining method and device, computer equipment and storage medium | |
CN113011153A (en) | Text correlation detection method, device, equipment and storage medium | |
CN113934842A (en) | Text clustering method and device and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||