CN116933087A - Training method and device for intention detection model - Google Patents

Training method and device for intention detection model

Info

Publication number
CN116933087A
Authority
CN
China
Prior art keywords
medical
interaction
intention
data
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310949294.1A
Other languages
Chinese (zh)
Inventor
Jiang Fei (江飞)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd
Priority to CN202310949294.1A
Publication of CN116933087A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30: Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33: Querying
    • G06F16/332: Query formulation
    • G06F16/3329: Natural language query formulation or dialogue systems
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30: Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/35: Clustering; Classification
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/24: Classification techniques
    • G06F18/243: Classification techniques relating to the number of classes
    • G06F18/2431: Multiple classes
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67: ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Mathematical Physics (AREA)
  • Biomedical Technology (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • Computational Linguistics (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

The embodiments of the present specification provide an intention detection model training method and device. The method includes: during model training, performing intention detection on user medical data and medical payment data and constructing an intention set from the detection results; training a model to be trained with the user medical data and the medical payment data as training samples and the intention set as sample labels to obtain an intermediate model; classifying the intention set into a plurality of intention categories and generating medical interaction questions for each intention category based on a dialogue generation model; and training the intermediate model into the intention detection model using medical interaction data obtained through interaction based on the medical interaction questions.

Description

Training method and device for intention detection model
Technical Field
The present document relates to the field of data processing technologies, and in particular, to a training method and device for an intent detection model.
Background
With the continuous development and popularization of the Internet, online services have come to cover most users, giving rise to a variety of online service scenarios such as Internet-based online medical consultation. During an online consultation, however, the user's physical state cannot be perceived directly; the user's condition can only be conveyed through dialogue and video display. This makes communication inefficient and often fails to get to the point, which affects the user experience.
Disclosure of Invention
One or more embodiments of the present specification provide an intention detection model training method, including: performing intention detection on user medical data and medical payment data, and constructing an intention set from the detection results; training a model to be trained with the user medical data and the medical payment data as training samples and the intention set as sample labels to obtain an intermediate model; classifying the intention set into a plurality of intention categories, and generating medical interaction questions for each intention category based on a dialogue generation model; and acquiring medical interaction data obtained through interaction based on the medical interaction questions, and training the intermediate model based on the medical interaction data to obtain an intention detection model.
One or more embodiments of the present specification provide a medical access intention detection method, including: acquiring user medical data and medical payment data of an accessing user; inputting the user medical data and the medical payment data into an intention detection model for intention detection to obtain an initial access intention; generating a medical interaction question for the initial access intention based on a dialogue generation model, and performing medical access interaction with the accessing user based on the medical interaction question; and inputting the medical interaction data obtained from the medical access interaction into the intention detection model for secondary intention detection to obtain the medical access intention.
One or more embodiments of the present specification provide an intention detection model training apparatus, including: an intention detection module configured to perform intention detection on user medical data and medical payment data and construct an intention set from the detection results; a model training module configured to train a model to be trained with the user medical data and the medical payment data as training samples and the intention set as sample labels to obtain an intermediate model; a question generation module configured to classify the intention set into a plurality of intention categories and generate medical interaction questions for each intention category based on a dialogue generation model; and a data acquisition and training module configured to acquire medical interaction data obtained through interaction based on the medical interaction questions and train the intermediate model based on the medical interaction data to obtain an intention detection model.
One or more embodiments of the present specification provide a medical access intention detection apparatus, including: a data acquisition module configured to acquire user medical data and medical payment data of an accessing user; an intention detection module configured to input the user medical data and the medical payment data into an intention detection model for intention detection to obtain an initial access intention; a medical access interaction module configured to generate a medical interaction question for the initial access intention based on a dialogue generation model and perform medical access interaction with the accessing user based on the medical interaction question; and a secondary intention detection module configured to input the medical interaction data obtained from the medical access interaction into the intention detection model for secondary intention detection to obtain the medical access intention.
One or more embodiments of the present specification provide an intention detection model training apparatus, including: a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to: perform intention detection on user medical data and medical payment data, and construct an intention set from the detection results; train a model to be trained with the user medical data and the medical payment data as training samples and the intention set as sample labels to obtain an intermediate model; classify the intention set into a plurality of intention categories, and generate medical interaction questions for each intention category based on a dialogue generation model; and acquire medical interaction data obtained through interaction based on the medical interaction questions, and train the intermediate model based on the medical interaction data to obtain an intention detection model.
One or more embodiments of the present specification provide a medical access intention detection apparatus, including: a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to: acquire user medical data and medical payment data of an accessing user; input the user medical data and the medical payment data into an intention detection model for intention detection to obtain an initial access intention; generate a medical interaction question for the initial access intention based on a dialogue generation model, and perform medical access interaction with the accessing user based on the medical interaction question; and input the medical interaction data obtained from the medical access interaction into the intention detection model for secondary intention detection to obtain the medical access intention.
One or more embodiments of the present specification provide a storage medium storing computer-executable instructions that, when executed by a processor, implement the following: performing intention detection on user medical data and medical payment data, and constructing an intention set from the detection results; training a model to be trained with the user medical data and the medical payment data as training samples and the intention set as sample labels to obtain an intermediate model; classifying the intention set into a plurality of intention categories, and generating medical interaction questions for each intention category based on a dialogue generation model; and acquiring medical interaction data obtained through interaction based on the medical interaction questions, and training the intermediate model based on the medical interaction data to obtain an intention detection model.
One or more embodiments of the present specification provide another storage medium storing computer-executable instructions that, when executed by a processor, implement the following: acquiring user medical data and medical payment data of an accessing user; inputting the user medical data and the medical payment data into an intention detection model for intention detection to obtain an initial access intention; generating a medical interaction question for the initial access intention based on a dialogue generation model, and performing medical access interaction with the accessing user based on the medical interaction question; and inputting the medical interaction data obtained from the medical access interaction into the intention detection model for secondary intention detection to obtain the medical access intention.
Drawings
For a clearer description of the technical solutions in one or more embodiments of the present specification or in the prior art, the drawings needed for describing the embodiments or the prior art are briefly introduced below. It is apparent that the drawings described below show only some of the embodiments of the present specification, and that a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of an environment in which an intent detection model training method is implemented, provided in one or more embodiments of the present disclosure;
FIG. 2 is a process flow diagram of an intent detection model training method provided in one or more embodiments of the present disclosure;
FIG. 3 is a process flow diagram of an intent detection model training method for use in a healthcare scenario provided in one or more embodiments of the present disclosure;
FIG. 4 is a process flow diagram of a medical access intent detection method provided in one or more embodiments of the present disclosure;
FIG. 5 is a schematic diagram of one embodiment of an intent detection model training apparatus provided in one or more embodiments of the present disclosure;
FIG. 6 is a schematic diagram of one embodiment of a medical access intent detection device provided by one or more embodiments of the present disclosure;
FIG. 7 is a schematic diagram of an intent detection model training apparatus provided in one or more embodiments of the present disclosure;
FIG. 8 is a schematic structural diagram of a medical access intention detection device provided in one or more embodiments of the present disclosure.
Detailed Description
In order to enable a person skilled in the art to better understand the technical solutions in one or more embodiments of the present specification, these technical solutions will be described clearly and completely below with reference to the drawings in one or more embodiments of the present specification. It is apparent that the described embodiments are only some, not all, of the embodiments of the present specification. All other embodiments obtained by a person skilled in the art based on one or more embodiments of the present specification without inventive effort shall fall within the scope of protection of the present disclosure.
An implementation environment to which the intention detection model training method provided in one or more embodiments of the present specification is applicable includes at least the model to be trained 101 and the dialogue generation model 102. The implementation environment may further include a database 103 or a storage unit for storing the user medical data and the medical payment data, and may also include an interaction system 104 that interacts with trainers or data annotators.
The model to be trained 101 and the dialogue generation model 102 run on servers; the two models may run on the same server or on different servers, and each may be deployed on a single server, a server cluster composed of multiple servers, or a cloud server of a cloud computing platform.
In this implementation environment, during training of the intention detection model, the stored user medical data and medical payment data are read from the database 103 as training samples, and an intention set constructed from the results of intention detection on the user medical data and medical payment data serves as the sample labels. The model to be trained 101 is trained in a supervised manner to obtain an intermediate model. After the intermediate model is obtained, the intention set is classified into a plurality of intention categories, and a medical interaction question is generated for each intention category by the dialogue generation model 102. Based on the generated medical interaction questions, medical interaction data are obtained through interaction carried out by the interaction system 104, and the intermediate model is then further trained on the medical interaction data to obtain the intention detection model.
One or more embodiments of an intent detection model training method provided in the present specification are as follows:
Referring to fig. 2, the intention detection model training method provided in this embodiment specifically includes steps S202 to S208.
Step S202, intention detection is performed on the user medical data and the medical payment data, and an intention set is constructed from the detection results.
The user medical data in this embodiment refers to data related to the user's medical behavior, and may specifically be one or more of the user's medical records, medical consultation data, case data, examination data, physical examination data, medicine purchasing data, and other medical behavior data. The medical payment data refers to payment data for medical-related payments made through a payment application, and may specifically be one or more of registration payment data, medical insurance payment data, examination payment data, physical examination payment data, medicine payment data, and other payment data.
In specific implementation, intention detection is performed on each user's medical data and medical payment data to detect the access intention of the medical access behavior reflected by that data; that is, the user's medical access intention is obtained by performing intention detection on the user medical data and the medical payment data.
In an optional implementation provided in this embodiment, performing intention detection on the user medical data and the medical payment data and constructing an intention set from the detection results includes:
extracting the medical sub-data of each user contained in the user medical data and the payment sub-data of each user contained in the medical payment data;
inputting the medical sub-data and payment sub-data of each user into the dialogue generation model for intention detection, and constructing the intention set from the detected medical access intention of each user.
It should be noted that the user medical data is composed of the user medical data of multiple users, and the medical payment data is likewise composed of the medical payment data of multiple users. The user medical data of an individual user is referred to here as medical sub-data, and the medical payment data of an individual user as payment sub-data.
The intention set is a set of the medical access intentions of multiple users. A user's medical access intention is the access intention of that user's medical behavior as determined from the user's medical sub-data and payment sub-data. For example, if the medical sub-data of user A contains medical consultation data and a visit record and the payment sub-data contains registration payment data, the user's medical access intention is detected as a visit intention; if the medical sub-data of user B contains medicine purchasing data and the payment sub-data contains medicine payment data, the user's medical access intention is detected as a medicine purchase intention.
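The specification gives no code for this step; the following is a minimal Python sketch of constructing the intention set per user, assuming a generic `detect_intention` callable that stands in for the dialogue generation model used as a detector. All names are hypothetical.

```python
# Sketch only: `detect_intention` stands in for the dialogue generation model used here
# as an intention detector; all class and field names are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class UserRecord:
    user_id: str
    medical_sub_data: List[str]   # e.g. ["medical consultation data", "visit record"]
    payment_sub_data: List[str]   # e.g. ["registration payment data"]


def build_intention_set(
    records: List[UserRecord],
    detect_intention: Callable[[List[str], List[str]], str],
) -> Dict[str, str]:
    """Detect each user's medical access intention and collect the results into an intention set."""
    intention_set: Dict[str, str] = {}
    for record in records:
        # One detection per user, driven by that user's medical and payment sub-data.
        intention_set[record.user_id] = detect_intention(
            record.medical_sub_data, record.payment_sub_data
        )
    return intention_set


# Toy detector mirroring the examples above (visit intention vs. medicine purchase intention).
def toy_detector(medical: List[str], payment: List[str]) -> str:
    if "medicine purchasing data" in medical or "medicine payment data" in payment:
        return "medicine purchase intention"
    return "visit intention"


if __name__ == "__main__":
    users = [
        UserRecord("A", ["medical consultation data", "visit record"], ["registration payment data"]),
        UserRecord("B", ["medicine purchasing data"], ["medicine payment data"]),
    ]
    print(build_intention_set(users, toy_detector))
```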
Step S204, the model to be trained is trained with the user medical data and the medical payment data as training samples and the intention set as sample labels, to obtain an intermediate model.
Having constructed the intention set from intention detection on the user medical data and the medical payment data, the model to be trained is trained on the user medical data, the medical payment data and the intention set. Specifically, the user medical data and the medical payment data are used as training samples and the intention set is used as the sample labels for supervised training of the model to be trained; the trained result is referred to as the intermediate model.
Specifically, during model training, for any one user in the user medical data and the medical payment data, that user's medical sub-data and payment sub-data form one training sample. The sample is input into the model to be trained for intention detection, and the predicted intention for that user is output. The user's medical access intention is looked up in the intention set and used as the sample label for the current training sample, the training loss is computed from the predicted intention and the medical access intention, and the parameters of the model to be trained are adjusted based on the training loss.
In this way, all training samples contained in the user medical data and the medical payment data are input into the model to be trained for intention detection, the predicted intention of each training sample is output, the training loss is computed from each predicted intention and the corresponding medical access intention in the intention set, the parameters of the model to be trained are adjusted in turn, and the intermediate model is obtained once training is complete.
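The specification does not name a model architecture or training framework; the following is a minimal PyTorch sketch of the supervised training loop described above, assuming the samples have already been encoded as fixed-length feature vectors and the intention-set labels mapped to class indices. All dimensions and names are illustrative.

```python
# Illustrative only: the specification does not fix an architecture or framework.
# Assumes each (medical sub-data, payment sub-data) sample has been encoded into a
# feature vector and each medical access intention mapped to a class index beforehand.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

FEATURE_DIM, NUM_INTENTIONS = 64, 4

# Placeholder tensors standing in for the encoded training samples and their labels.
features = torch.randn(128, FEATURE_DIM)
labels = torch.randint(0, NUM_INTENTIONS, (128,))
loader = DataLoader(TensorDataset(features, labels), batch_size=16, shuffle=True)

# "Model to be trained": a simple classifier over the encoded samples.
model_to_be_trained = nn.Sequential(
    nn.Linear(FEATURE_DIM, 128), nn.ReLU(), nn.Linear(128, NUM_INTENTIONS)
)
optimizer = torch.optim.Adam(model_to_be_trained.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):
    for batch_features, batch_labels in loader:
        predicted_intention = model_to_be_trained(batch_features)   # predicted intention logits
        training_loss = loss_fn(predicted_intention, batch_labels)  # loss vs. intention-set label
        optimizer.zero_grad()
        training_loss.backward()
        optimizer.step()                                            # parameter adjustment

intermediate_model = model_to_be_trained  # the trained result is the intermediate model
```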
Step S206, the intention set is classified into a plurality of intention categories, and a medical interaction question is generated for each intention category based on a dialogue generation model.
As described above, the intention set contains the medical access intentions of multiple users, and different users' medical access intentions may differ: some users' medical access intention may be a visit intention, while others' may be a medicine purchase intention. Classifying the intention set determines which intention categories the medical access intentions it contains belong to.
In a specific implementation, the medical access intentions contained in the intention set can be clustered with a clustering algorithm to obtain one or more subsets of medical access intentions, each subset corresponding to one intention category. For example, clustering the medical access intentions contained in the intention set may yield two subsets: one whose medical access intentions are visit intentions, so that its corresponding intention category is the visit intention category, and one whose medical access intentions are medicine purchase intentions, so that its corresponding intention category is the medicine purchase intention category.
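The text only says "a clustering algorithm"; the sketch below uses k-means over intention embeddings as one possible choice, with the embeddings assumed to be computed elsewhere. It is not the patented method, just an illustration.

```python
# One possible realisation: k-means over intention embeddings is an assumption,
# since the text does not name the clustering algorithm.
import numpy as np
from sklearn.cluster import KMeans

# Placeholder embeddings, one row per medical access intention in the intention set.
intention_embeddings = np.random.rand(100, 32)

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
category_ids = kmeans.fit_predict(intention_embeddings)

# Each resulting subset of intentions corresponds to one intention category,
# e.g. a visit intention category and a medicine purchase intention category.
subsets = {cid: np.where(category_ids == cid)[0] for cid in set(category_ids)}
```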
In specific implementation, after the intention set has been classified into a plurality of intention categories, a medical interaction question is generated for each intention category based on the dialogue generation model, so that interaction can be carried out on the basis of the generated medical interaction questions and the user's medical access intention can be analysed in depth through interaction.
In an optional implementation provided in this embodiment, generating the medical interaction question of each intention category based on the dialogue generation model includes:
for any one of the training samples, acquiring the medical sub-data and payment sub-data contained in that training sample;
inputting the intention category corresponding to that training sample, together with the medical sub-data and payment sub-data, into the dialogue generation model for dialogue generation, to obtain the medical interaction question of the intention category corresponding to that training sample.
For example, if a training sample contains the medical sub-data of user A (medical consultation data and a visit record) and the payment sub-data (registration payment data), and the intention category corresponding to this training sample is the visit intention category, the medical sub-data, payment sub-data and visit intention category of user A are input into the pre-constructed dialogue generation model to generate a medical interaction dialogue, and the medical interaction question is output: "Which condition would you like to consult about?"
Alternatively, if a training sample contains the medical sub-data of user B (medicine purchasing data) and the payment sub-data (medicine payment data), and the intention category corresponding to this training sample is the medicine purchase intention category, the medical sub-data, payment sub-data and medicine purchase intention category of user B are input into the dialogue generation model to generate a medical interaction dialogue, and the medical interaction question is output: "Which medicine would you like to purchase?"
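The interface of the dialogue generation model is not specified; the sketch below assumes a plain text-in/text-out `generate` callable, and the prompt wording is illustrative only.

```python
# Sketch under the assumption that the dialogue generation model exposes a plain
# text-in/text-out `generate` function; the prompt wording is illustrative only.
from typing import Callable, List


def generate_interaction_question(
    generate: Callable[[str], str],
    intention_category: str,
    medical_sub_data: List[str],
    payment_sub_data: List[str],
) -> str:
    prompt = (
        "Given the following user data, produce one medical interaction question "
        "that clarifies the user's intention.\n"
        f"Intention category: {intention_category}\n"
        f"Medical sub-data: {', '.join(medical_sub_data)}\n"
        f"Payment sub-data: {', '.join(payment_sub_data)}\n"
    )
    return generate(prompt)


# Example with a stub model (a real deployment would call the trained dialogue model).
question = generate_interaction_question(
    generate=lambda p: "Which condition would you like to consult about?",
    intention_category="visit intention category",
    medical_sub_data=["medical consultation data", "visit record"],
    payment_sub_data=["registration payment data"],
)
print(question)
```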
In another optional implementation provided in this embodiment, generating the medical interaction question of each intention category based on the dialogue generation model includes:
inputting the user medical data and medical payment data contained in any one of the training samples into the dialogue generation model for key data extraction, to obtain key data;
querying a question bank for the corresponding medical interaction question based on the key data.
For example, if a training sample contains the medical sub-data of user A (medical consultation data and a visit record) and the payment sub-data (registration payment data), the medical sub-data and payment sub-data of user A are input into a dialogue generation model configured with a key data extraction function, the extracted key data "xx disease" is output, and the pre-constructed question bank is then queried for the medical interaction question corresponding to the disease category "xx disease", namely: "Do you have xxxx symptoms?"
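A minimal sketch of this variant follows; the question bank contents and the key-data extractor are hypothetical placeholders.

```python
# Sketch only: the question bank contents and the key-data extractor are hypothetical.
from typing import Callable, Dict

# Pre-constructed question bank keyed by extracted key data (e.g. a disease category).
QUESTION_BANK: Dict[str, str] = {
    "xx disease": "Do you have xxxx symptoms?",
    "medicine purchase": "Which medicine would you like to purchase?",
}


def question_from_key_data(
    extract_key_data: Callable[[str], str],  # dialogue model configured for key data extraction
    sample_text: str,
    default_question: str = "Could you describe what you need help with?",
) -> str:
    key_data = extract_key_data(sample_text)
    # Query the question bank for the medical interaction question matching the key data.
    return QUESTION_BANK.get(key_data, default_question)


print(question_from_key_data(lambda text: "xx disease",
                             "medical consultation data; visit record; registration payment data"))
```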
Step S208, medical interaction data obtained through interaction based on the medical interaction questions are acquired, and the intermediate model is trained based on the medical interaction data to obtain the intention detection model.
After the medical interaction questions of each intention category have been generated based on the dialogue generation model, medical interaction data are acquired through interaction based on these questions, so that the user's medical access intention can be analysed more accurately through interaction. Training the intermediate model on medical interaction data that supports this more accurate analysis gives the resulting intention detection model the ability to detect medical access intentions more precisely.
The medical interaction data in this embodiment include the medical interaction question and the reply data to that question obtained through interaction.
It should be noted that, because the user medical data and the medical payment data come from accessing users who have performed medical access, and because during model training the intention detection model has not yet been obtained and therefore cannot be deployed online, interaction with accessing users cannot be carried out directly at this stage. Instead, the interaction with accessing users during medical access is simulated. Interaction based on the medical interaction question therefore includes simulated interaction or test interaction of medical access based on that question.
In a specific execution process, the simulated interaction or test interaction based on the medical interaction question can be carried out by means of a medical interaction system, that is, simulated or test interaction with an interaction user is realised through the medical interaction system; the interaction user may be a trainer performing model training or a data annotator providing data annotation for model training.
Specifically, in an optional implementation provided in this embodiment, interacting based on the medical interaction question includes:
invoking an interaction interface of the medical interaction system with the medical interaction question as the interface input, so that the medical interaction question is sent to the corresponding interaction user through the medical interaction system;
acquiring the interaction reply data returned by the interface call, and using the medical interaction question and the interaction reply data as the medical interaction data.
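A minimal sketch of this interface call follows, assuming the medical interaction system exposes an HTTP interaction interface; the endpoint URL, payload fields and response shape are hypothetical.

```python
# Sketch assuming the medical interaction system exposes an HTTP interaction interface;
# the endpoint, payload fields, and response shape are all hypothetical.
from typing import Dict

import requests

INTERACTION_ENDPOINT = "https://medical-interaction.example.com/api/interact"  # placeholder URL


def interact(question: str, interaction_user_id: str) -> Dict[str, str]:
    # The medical interaction question is the interface input; the system forwards it
    # to the interaction user (a trainer or data annotator) and returns the reply.
    response = requests.post(
        INTERACTION_ENDPOINT,
        json={"user_id": interaction_user_id, "question": question},
        timeout=30,
    )
    response.raise_for_status()
    reply = response.json()["reply"]
    # The question together with the interaction reply forms one item of medical interaction data.
    return {"question": question, "reply": reply}
```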
In addition, in a specific execution process, the simulated interaction or test interaction based on the medical interaction question can also be carried out through an interaction model. Specifically, the medical interaction question is input into the interaction model, which processes the question and outputs a corresponding reply.
Specifically, in an optional implementation provided in this embodiment, interacting based on the medical interaction question includes:
inputting the medical interaction question into the interaction model for dialogue interaction, to obtain the output dialogue reply;
using the key data and the dialogue reply as the medical interaction data, where the interaction model includes the dialogue generation model.
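A minimal sketch of this model-based simulation: `interaction_model` is a hypothetical text-in/text-out callable, and the recorded fields are illustrative.

```python
# Minimal sketch: the interaction model (which includes the dialogue generation model)
# answers the generated question, and the exchange is recorded as medical interaction data.
from typing import Callable, Dict, Optional


def simulate_with_interaction_model(
    interaction_model: Callable[[str], str],
    medical_interaction_question: str,
    key_data: Optional[str] = None,
) -> Dict[str, str]:
    dialogue_reply = interaction_model(medical_interaction_question)
    record = {"question": medical_interaction_question, "reply": dialogue_reply}
    if key_data is not None:
        record["key_data"] = key_data  # in the key-data variant, the key data is recorded too
    return record
```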
In practical applications, multiple rounds of interaction may be required to analyse a user's medical access intention more accurately. For this reason, in the process of interacting based on the medical interaction question, the medical interaction data may be determined through multiple rounds of simulated or test interaction, so that the user's medical access intention is analysed more accurately across the rounds.
Taking two rounds of interaction as an example, the process of multi-round simulated or test interaction is described below; processes with more than two rounds can refer to the implementation of the two-round interaction. Specifically, in an optional implementation provided in this embodiment, interacting based on the medical interaction question includes:
invoking the interaction interface of the medical interaction system to carry out a medical dialogue interaction based on the medical interaction question, to obtain a first dialogue reply;
inputting the medical interaction question and the first dialogue reply into the dialogue generation model for dialogue generation, to obtain a second interaction question;
invoking the interaction interface to carry out a medical dialogue interaction based on the second interaction question, to obtain a second dialogue reply, and using the medical interaction question, the first dialogue reply, the second interaction question and the second dialogue reply as the medical interaction data.
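A sketch of the two-round flow just listed; `call_interaction_interface` and `generate_follow_up` are hypothetical stand-ins for the medical interaction system interface and the dialogue generation model.

```python
# Sketch of the two-round variant; both callables are hypothetical stand-ins.
from typing import Callable, Dict


def two_round_interaction(
    call_interaction_interface: Callable[[str], str],
    generate_follow_up: Callable[[str, str], str],
    medical_interaction_question: str,
) -> Dict[str, str]:
    # Round 1: send the generated question through the interaction interface.
    first_dialogue_reply = call_interaction_interface(medical_interaction_question)
    # Feed the question and the first reply back into the dialogue generation model
    # to produce a second, more specific interaction question.
    second_interaction_question = generate_follow_up(
        medical_interaction_question, first_dialogue_reply
    )
    # Round 2: send the follow-up question and collect the second reply.
    second_dialogue_reply = call_interaction_interface(second_interaction_question)
    # All four pieces together form the medical interaction data for training.
    return {
        "question_1": medical_interaction_question,
        "reply_1": first_dialogue_reply,
        "question_2": second_interaction_question,
        "reply_2": second_dialogue_reply,
    }
```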
In specific implementation, to improve the efficiency and pertinence of training the intermediate model, in an optional implementation provided in this embodiment, training the intermediate model based on the medical interaction data to obtain the intention detection model includes:
classifying the medical interaction data according to the intention category to which each item belongs, to obtain the classified interaction data of each intention category;
adjusting the parameters of the intermediate model based on the classified interaction data of each intention category separately, to obtain a candidate model for each intention category;
fusing the model parameters of the candidate models of the intention categories based on the training weights of the intention categories, to obtain the intention detection model.
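The specification says the candidate models' parameters are fused "based on the training weights of the intention categories" without giving a fusion rule; the sketch below assumes a weighted average of parameters, which is only one possible reading, and omits the per-category training loop.

```python
# Sketch only: a weighted average of the candidate models' parameters is an assumed
# interpretation of the parameter fusion described above, not a prescribed rule.
import copy
from typing import Dict, List

from torch import nn


def fine_tune_per_category(intermediate_model: nn.Module,
                           classified_data: Dict[str, List]) -> Dict[str, nn.Module]:
    """Fine-tune one candidate model per intention category on that category's interaction data."""
    candidates = {}
    for category, samples in classified_data.items():
        candidate = copy.deepcopy(intermediate_model)
        # ... run a training loop on `samples` here (omitted for brevity) ...
        candidates[category] = candidate
    return candidates


def fuse_candidates(candidates: Dict[str, nn.Module],
                    training_weights: Dict[str, float]) -> nn.Module:
    """Weighted-average the candidates' parameters into the final intention detection model."""
    fused = copy.deepcopy(next(iter(candidates.values())))
    fused_state = fused.state_dict()
    total = sum(training_weights[c] for c in candidates)
    for name, tensor in fused_state.items():
        if tensor.is_floating_point():  # skip integer buffers, average float parameters only
            fused_state[name] = sum(
                training_weights[c] * candidates[c].state_dict()[name] for c in candidates
            ) / total
    fused.load_state_dict(fused_state)
    return fused
```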
In addition, in specific implementation, after the intermediate model has been trained to obtain the intention detection model, the model parameters of the intention detection model can be further adjusted, for medical interaction scenarios involving multiple rounds of interaction with the accessing user, using the interaction data of the sub-intention categories under an intention category, so that the intention detection model becomes more accurate for those sub-intention categories.
Specifically, in an optional implementation provided in this embodiment, after training the intermediate model based on the medical interaction data to obtain the intention detection model, the following operations are performed:
determining the sub-intention categories under the intention categories according to the medical interaction data;
generating secondary interaction questions for the sub-intention categories based on the dialogue generation model, and acquiring secondary interaction data obtained through secondary interaction based on the secondary interaction questions;
adjusting the parameters of the intention detection model based on the secondary interaction data, to obtain an intention detection model for the sub-intention categories.
The intention detection model training method provided in this embodiment is further described below, taking its application to a medical service scenario as an example. Referring to fig. 3, the intention detection model training method applied to the medical service scenario specifically includes the following steps.
Step S302, the medical sub-data and payment sub-data of each historical accessing user of the medical service are acquired.
Step S304, intention detection is performed on the medical sub-data and payment sub-data of each historical accessing user, to obtain the medical access intention of each historical accessing user.
Step S306, the model to be trained is trained with the medical sub-data and payment sub-data of each historical accessing user as training samples and the medical access intention of each historical accessing user as the sample labels, to obtain an intermediate model.
Step S308, the medical access intentions of the historical accessing users are clustered, and the intention category of each medical access intention is determined from the clustering result.
Step S310, the medical sub-data, payment sub-data and intention category of each historical accessing user are input into the dialogue generation model for dialogue generation, to obtain a medical interaction question.
Step S312, the interaction interface of the medical interaction system is invoked with the medical interaction question as the interface input, so that the medical interaction question is sent to the corresponding interaction user through the medical interaction system.
Step S314, the interaction reply data to the medical interaction question returned by the interface call are acquired.
Step S316, the intermediate model is trained based on the medical interaction question and the interaction reply data, to obtain the intention detection model of the medical service.
Step S310 may be replaced by the following implementation: inputting the user medical data and medical payment data of each historical accessing user into the dialogue generation model for key data extraction to obtain key data, and querying the question bank for the corresponding medical interaction question based on the key data.
Steps S312 to S314 may be replaced by the following implementation: inputting the medical interaction question into the dialogue generation model for dialogue interaction, to obtain the output interaction reply data.
In addition, the following implementation may follow step S316: determining the sub-intention categories under the intention categories according to the medical interaction question and the interaction reply data; generating secondary interaction questions for the sub-intention categories based on the dialogue generation model, and acquiring secondary interaction data obtained through secondary interaction based on the secondary interaction questions; and adjusting the parameters of the intention detection model based on the secondary interaction data, to obtain an intention detection model for the sub-intention categories.
In summary, in the one or more intention detection model training methods provided in this embodiment, during model training the user medical data and medical payment data are used as training samples, intention detection is performed on them, and the intention set constructed from the detection results is used as the sample labels for supervised training of the model to be trained, yielding an intermediate model. The intention set is then classified into a plurality of intention categories, medical interaction questions are generated for each intention category based on the dialogue generation model, and interaction based on these questions yields medical interaction data through which the user's medical access intention is analysed more accurately. The intermediate model is further trained on the medical interaction data thus obtained, so that the resulting intention detection model can analyse the user's medical access intention more accurately.
One or more embodiments of a medical access intention detection method provided in the present specification are as follows:
Referring to fig. 4, the medical access intention detection method provided in this embodiment specifically includes steps S402 to S408.
Step S402, the user medical data and medical payment data of the accessing user are acquired.
The user medical data in this embodiment refers to data related to the user's medical behavior, and may specifically be one or more of the user's medical records, medical consultation data, case data, examination data, physical examination data, medicine purchasing data, and other medical behavior data.
The medical payment data refers to payment data for medical-related payments made through a payment application, and may specifically be one or more of registration payment data, medical insurance payment data, examination payment data, physical examination payment data, medicine payment data, and other payment data.
Step S404, the user medical data and the medical payment data are input into the intention detection model for intention detection, to obtain an initial access intention.
Optionally, the intention detection model is obtained by training the intermediate model based on medical interaction data obtained through interaction with the medical interaction questions that the dialogue generation model generates for the intention categories of the training samples;
the intermediate model is obtained by training the model to be trained with the user medical data and medical payment data contained in a training data set as training samples and the intention set as sample labels;
and the intention set is constructed from the results of intention detection on the user medical data and medical payment data contained in the training data set.
For the specific training process of the intention detection model, refer to the embodiments of the intention detection model training method described above; details are not repeated here.
Step S406, a medical interaction question for the initial access intention is generated based on a dialogue generation model, and medical access interaction is performed with the accessing user based on the medical interaction question.
In this embodiment, once the accessing user's user medical data and medical payment data have been input into the intention detection model and an initial access intention has been obtained, a medical interaction question is generated for that initial access intention based on the dialogue generation model, so that interaction with the accessing user can proceed on the basis of the generated question and the accessing user's medical access intention can be analysed in depth through interaction.
In an optional implementation provided in this embodiment, generating the medical interaction question for the initial access intention based on the dialogue generation model includes: inputting the accessing user's user medical data, medical payment data and initial access intention into the dialogue generation model for dialogue generation, to obtain the medical interaction question for the initial access intention.
In another optional implementation provided in this embodiment, generating the medical interaction question for the initial access intention based on the dialogue generation model includes:
inputting the accessing user's user medical data and medical payment data into the dialogue generation model for key data extraction, to obtain key data;
querying a question bank for the corresponding medical interaction question based on the key data.
In specific implementation, when the medical interaction question for the initial access intention is generated by inputting the accessing user's user medical data, medical payment data and initial access intention into the dialogue generation model for dialogue generation, the medical access interaction with the accessing user can be carried out by passing the question to the interaction interface of the medical interaction system for sending, and the medical interaction question together with the interaction reply data returned by the interface are used as the medical interaction data.
Specifically, in an optional implementation provided in this embodiment, generating the medical interaction question for the initial access intention based on the dialogue generation model and performing medical access interaction with the accessing user based on the medical interaction question includes:
inputting the accessing user's user medical data, medical payment data and initial access intention into the dialogue generation model for dialogue generation, to obtain the medical interaction question for the initial access intention;
invoking the interaction interface of the medical interaction system with the medical interaction question as the interface input, so that the medical interaction question is sent to the accessing user through the medical interaction system;
acquiring the interaction reply data returned by the interface call, and using the medical interaction question and the interaction reply data as the medical interaction data.
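As an illustration of steps S402 to S408 under the single-round variant just described, the following sketch wires the components together; every callable is a hypothetical stand-in for the corresponding trained model or system interface.

```python
# End-to-end sketch of steps S402 to S408 with the single-round interaction variant;
# every callable below is a hypothetical stand-in, not a prescribed API.
from typing import Callable, Dict, List


def detect_medical_access_intention(
    intention_detector: Callable[[Dict[str, object]], str],          # trained intention detection model
    generate_question: Callable[[List[str], List[str], str], str],   # dialogue generation model
    interact_with_user: Callable[[str], str],                        # medical interaction system interface
    user_medical_data: List[str],
    medical_payment_data: List[str],
) -> str:
    # S404: initial intention detection from the accessing user's medical and payment data.
    initial_intention = intention_detector(
        {"medical": user_medical_data, "payment": medical_payment_data}
    )
    # S406: generate a medical interaction question for the initial intention and interact with the user.
    question = generate_question(user_medical_data, medical_payment_data, initial_intention)
    reply = interact_with_user(question)
    # S408: secondary intention detection on the medical interaction data gives the final intention.
    return intention_detector({"question": question, "reply": reply})
```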
In practical applications, multiple rounds of interaction may be required to analyse the accessing user's medical access intention more accurately. For this reason, in the process of performing medical access interaction with the accessing user based on the medical interaction question, the medical interaction data may be determined through multiple rounds of medical access interaction, so that the user's medical access intention is analysed more accurately across the rounds.
Taking two rounds of medical access interaction as an example, the process of multi-round medical access interaction is described below; processes with more than two rounds can refer to the implementation of the two-round interaction. Specifically, in an optional implementation provided in this embodiment, performing medical access interaction with the accessing user based on the medical interaction question includes:
invoking the interaction interface of the medical interaction system based on the medical interaction question to carry out a medical dialogue interaction with the accessing user, to obtain a first dialogue reply;
inputting the medical interaction question and the first dialogue reply into the dialogue generation model for dialogue generation, to obtain a second interaction question;
invoking the interaction interface to carry out a medical dialogue interaction based on the second interaction question, to obtain a second dialogue reply, and using the medical interaction question, the first dialogue reply, the second interaction question and the second dialogue reply as the medical interaction data.
Step S408, the medical interaction data obtained from the medical access interaction are input into the intention detection model for secondary intention detection, to obtain the medical access intention.
On the basis of the medical interaction data obtained through medical access interaction with the accessing user, the medical interaction data are input into the intention detection model for secondary intention detection, so that the accessing user's intention for the medical access is detected more accurately.
In addition, in the process of performing secondary intention detection for the accessing user with the intention detection model, at least one of the accessing user's user medical data, medical payment data and initial access intention can be input into the intention detection model together with the medical interaction data obtained from the medical access interaction for secondary intention detection to obtain the medical access intention. That is, inputting the medical interaction data obtained from the medical access interaction into the intention detection model for secondary intention detection can be replaced by: inputting the medical interaction data, together with the accessing user's user medical data, medical payment data and/or initial access intention, into the intention detection model for secondary intention detection.
An embodiment of an intention detection model training device provided in the present specification is as follows:
In the above-described embodiments, an intention detection model training method is provided, and an intention detection model training apparatus is provided correspondingly; it is described below with reference to the accompanying drawings.
Referring to fig. 5, a schematic diagram of an embodiment of an intent detection model training device according to the present embodiment is shown.
Since the apparatus embodiments correspond to the method embodiments, the description is relatively simple, and the relevant portions should be referred to the corresponding descriptions of the method embodiments provided above. The device embodiments described below are merely illustrative.
The present embodiment provides an intention detection model training device, including:
an intention detection module 502 configured to perform intention detection on user medical data and medical payment data, and construct an intention set according to the detection result;
the model training module 504 is configured to train a model to be trained to obtain an intermediate model by taking the user medical data and the medical payment data as training samples and the intention set as sample labels;
a question generation module 506 configured to classify the intention set into a plurality of intention categories and generate medical interaction questions for each intention category based on a dialogue generation model;
a data acquisition and training module 508 configured to acquire medical interaction data obtained through interaction based on the medical interaction questions and train the intermediate model based on the medical interaction data to obtain an intention detection model.
An embodiment of a medical access intention detection device provided in the present specification is as follows:
In the above-described embodiments, a medical access intention detection method is provided, and a medical access intention detection apparatus is provided correspondingly; it is described below with reference to the accompanying drawings.
Referring to fig. 6, a schematic diagram of an embodiment of a medical access intention detecting apparatus provided in the present embodiment is shown.
Since the apparatus embodiments correspond to the method embodiments, the description is relatively simple, and the relevant portions should be referred to the corresponding descriptions of the method embodiments provided above. The device embodiments described below are merely illustrative.
The present embodiment provides a medical access intention detection device including:
a data acquisition module 602 configured to acquire user medical data and medical payment data of an accessing user;
an intention detection module 604 configured to input the user medical data and the medical payment data into an intention detection model for intention detection, obtaining an initial access intention;
a medical access interaction module 606 configured to generate a medical interaction question for the initial access intention based on a dialogue generation model and perform medical access interaction with the accessing user based on the medical interaction question;
a secondary intention detection module 608 configured to input the medical interaction data obtained from the medical access interaction into the intention detection model for secondary intention detection, to obtain the medical access intention.
An embodiment of an intent detection model training device provided in the present specification is as follows:
Corresponding to the intention detection model training method described above, and based on the same technical concept, one or more embodiments of the present specification further provide an intention detection model training device for performing the intention detection model training method provided above. FIG. 7 is a schematic structural diagram of an intention detection model training device provided in one or more embodiments of the present disclosure.
The embodiment provides an intention detection model training device, including:
as shown in FIG. 7, the intent detection model training apparatus may vary considerably in configuration or performance, and may include one or more processors 701 and memory 702, where memory 702 may store one or more stored applications or data. Wherein the memory 702 may be transient storage or persistent storage. The application program stored in memory 702 may include one or more modules (not shown in the figures), each of which may include a series of computer-executable instructions in an intent detection model training device. Still further, the processor 701 may be arranged to communicate with the memory 702, executing a series of computer executable instructions in the memory 702 on the model training device is intended to be detected. The intent detection model training device may also include one or more power sources 703, one or more wired or wireless network interfaces 704, one or more input/output interfaces 705, one or more keyboards 706, and the like.
In a particular embodiment, the intention detection model training device includes a memory and one or more programs, where the one or more programs are stored in the memory, may include one or more modules, each of which may include a series of computer-executable instructions for the intention detection model training device, and are configured to be executed by the one or more processors; the one or more programs include computer-executable instructions for:
performing intention detection on user medical data and medical payment data, and constructing an intention set from the detection results;
training a model to be trained with the user medical data and the medical payment data as training samples and the intention set as sample labels to obtain an intermediate model;
classifying the intention set into a plurality of intention categories, and generating medical interaction questions for each intention category based on a dialogue generation model;
acquiring medical interaction data obtained through interaction based on the medical interaction questions, and training the intermediate model based on the medical interaction data to obtain an intention detection model.
An embodiment of a medical access intention detecting apparatus provided in the present specification is as follows:
Corresponding to the medical access intention detection method described above, and based on the same technical concept, one or more embodiments of the present specification further provide a medical access intention detection device for performing the medical access intention detection method provided above. FIG. 8 is a schematic structural diagram of a medical access intention detection device provided by one or more embodiments of the present disclosure.
The medical access intention detection device provided in this embodiment includes:
As shown in FIG. 8, the medical access intention detection device may vary considerably in configuration or performance, and may include one or more processors 801 and a memory 802, where the memory 802 may store one or more application programs or data. The memory 802 may be transient storage or persistent storage. The application program stored in the memory 802 may include one or more modules (not shown in the figure), and each module may include a series of computer-executable instructions for the medical access intention detection device. Still further, the processor 801 may be configured to communicate with the memory 802 to execute the series of computer-executable instructions in the memory 802 on the medical access intention detection device. The medical access intention detection device may also include one or more power sources 803, one or more wired or wireless network interfaces 804, one or more input/output interfaces 805, one or more keyboards 806, and the like.
In a particular embodiment, the medical access intention detection device includes a memory and one or more programs, where the one or more programs are stored in the memory and may include one or more modules, each module may include a series of computer-executable instructions for the medical access intention detection device, and the one or more programs are configured to be executed by one or more processors and include computer-executable instructions for:
acquiring user medical data and medical payment data of an access user;
inputting the user medical data and the medical payment data into an intention detection model to perform intention detection, and obtaining initial access intention;
generating a medical interaction question of the initial access intention based on a dialogue generation model, and performing medical access interaction with the access user based on the medical interaction question;
and inputting the medical interaction data obtained by the medical access interaction into the intention detection model to perform secondary intention detection, so as to obtain medical access intention.
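The two-stage detection flow described above can be sketched as a single function. The `intent_model`, `dialogue_model` and `ask_user` objects are assumed interfaces introduced only for illustration; the specification does not prescribe their concrete form.

def detect_access_intent(user_medical_data: str, medical_payment_data: str,
                         intent_model, dialogue_model, ask_user) -> str:
    features = user_medical_data + " " + medical_payment_data
    initial_intent = intent_model.predict([features])[0]           # first-pass intention detection
    question = dialogue_model.generate(initial_intent, features)   # medical interaction question
    reply = ask_user(question)                                      # medical access interaction
    interaction_data = question + " " + reply
    return intent_model.predict([interaction_data])[0]              # secondary intention detection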
An embodiment of a storage medium provided in the present specification is as follows:
Corresponding to the above description of an intent detection model training method, one or more embodiments of the present disclosure further provide a storage medium based on the same technical concept.
The storage medium provided in this embodiment is configured to store computer executable instructions that, when executed by a processor, implement the following flow:
performing intention detection on the user medical data and the medical payment data, and constructing an intention set according to detection results;
training a model to be trained by taking the user medical data and the medical payment data as training samples and the intention set as sample labels to obtain an intermediate model;
performing intention classification on the intention set to obtain a plurality of intention categories, and generating medical interaction questions for the intention categories based on a dialogue generation model;
and acquiring medical interaction data obtained by interaction based on the medical interaction questions, and training the intermediate model based on the medical interaction data to obtain an intention detection model.
It should be noted that the storage medium embodiment and the intent detection model training method embodiment in the present specification are based on the same inventive concept, so a specific implementation of this embodiment may refer to the implementation of the foregoing corresponding method, and repeated content is not described again.
Another storage medium embodiment provided in this specification is as follows:
Corresponding to the medical access intention detection method described above, one or more embodiments of the present specification further provide a storage medium based on the same technical concept.
The storage medium provided in this embodiment is configured to store computer executable instructions that, when executed by a processor, implement the following flow:
acquiring user medical data and medical payment data of an access user;
inputting the user medical data and the medical payment data into an intention detection model to perform intention detection, and obtaining initial access intention;
generating a medical interaction question of the initial access intention based on a dialogue generation model, and performing medical access interaction with the access user based on the medical interaction question;
and inputting the medical interaction data obtained by the medical access interaction into the intention detection model to perform secondary intention detection, so as to obtain medical access intention.
It should be noted that the storage medium embodiment and the medical access intention detection method embodiment in the present specification are based on the same inventive concept, so a specific implementation of this embodiment may refer to the implementation of the foregoing corresponding method, and repeated content is not described again.
The embodiments in this specification are described in a progressive manner; the same or similar parts of the embodiments may refer to one another, and each embodiment focuses on its differences from the other embodiments. In particular, the apparatus, device, and storage medium embodiments are similar to the method embodiments and are therefore described relatively briefly; for relevant details, refer to the corresponding description of the method embodiments.
The foregoing describes specific embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In the 1990s, an improvement to a technology could be clearly distinguished as an improvement in hardware (for example, an improvement to a circuit structure such as a diode, a transistor, or a switch) or an improvement in software (an improvement to a method flow). However, with the development of technology, many improvements to method flows today can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain the corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement to a method flow cannot be implemented by a hardware entity module. For example, a programmable logic device (Programmable Logic Device, PLD) (for example, a field programmable gate array (Field Programmable Gate Array, FPGA)) is an integrated circuit whose logic function is determined by the user's programming of the device. A designer programs to "integrate" a digital system onto a PLD, without requiring a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually manufacturing integrated circuit chips, this programming is nowadays mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development; the original code to be compiled is also written in a specific programming language, called a hardware description language (Hardware Description Language, HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used. It will also be apparent to those skilled in the art that a hardware circuit implementing a logic method flow can be readily obtained merely by slightly logically programming the method flow into an integrated circuit using one of the above hardware description languages.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor and a computer-readable medium storing computer-readable program code (for example, software or firmware) executable by the (micro)processor, logic gates, switches, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a programmable logic controller, or an embedded microcontroller. Examples of the controller include, but are not limited to, the following microcontrollers: ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art will also appreciate that, in addition to implementing the controller purely with computer-readable program code, it is entirely possible to implement the same functionality by logically programming the method steps so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for implementing various functions may also be regarded as structures within the hardware component. Even the means for implementing the various functions may be regarded both as software modules for implementing the method and as structures within the hardware component.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each unit may be implemented in the same piece or pieces of software and/or hardware when implementing the embodiments of the present specification.
One skilled in the relevant art will recognize that one or more embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, one or more embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description can take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present description is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the specification. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory, random access memory (RAM), and/or nonvolatile memory in a computer-readable medium, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising at least one …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
One or more embodiments of the present specification may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. One or more embodiments of the specification may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The foregoing description is by way of example only and is not intended to limit the present disclosure. Various modifications and changes may occur to those skilled in the art. Any modifications, equivalent substitutions, improvements, etc. that fall within the spirit and principles of the present document are intended to be included within the scope of the claims of the present document.

Claims (19)

1. An intent detection model training method comprising:
performing intention detection on user medical data and medical payment data, and constructing an intention set according to detection results;
training a model to be trained by taking the user medical data and the medical payment data as training samples and the intention set as sample labels to obtain an intermediate model;
performing intention classification on the intention set to obtain a plurality of intention categories, and generating medical interaction questions for the intention categories based on a dialogue generation model;
and acquiring medical interaction data obtained by interaction based on the medical interaction questions, and training the intermediate model based on the medical interaction data to obtain an intention detection model.
2. The intent detection model training method as recited in claim 1, wherein the generating medical interaction questions for each intent category based on the dialog generation model comprises:
for any training sample among the training samples, acquiring medical sub-data and payment sub-data contained in the training sample;
and inputting the intention category, the medical sub-data and the payment sub-data corresponding to the training sample into the dialogue generation model for dialogue generation, to obtain the medical interaction question of the intention category corresponding to the training sample.
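One plausible way to realize claim 2 is to assemble the intention category and the two kinds of sub-data into a prompt for the dialogue generation model. The prompt wording and the commented `dialogue_model.generate` call below are assumptions, not the disclosed interface.

def build_question_prompt(intention_category: str, medical_sub_data: str, payment_sub_data: str) -> str:
    # Assemble claim 2's three inputs into a single prompt (hypothetical wording).
    return (
        "You are a medical follow-up assistant.\n"
        f"Intention category: {intention_category}\n"
        f"Medical sub-data: {medical_sub_data}\n"
        f"Payment sub-data: {payment_sub_data}\n"
        "Ask one concise question to confirm the user's medical access intention."
    )

# question = dialogue_model.generate(build_question_prompt(category, medical, payment))  # assumed call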
3. The intent detection model training method according to claim 2, said interacting based on said medical interaction question, comprising:
taking the medical interaction question as interface input to call an interaction interface of a medical interaction system, so that the medical interaction question is sent to a corresponding interaction user through the medical interaction system;
and acquiring interaction reply data returned by the interaction interface call, and taking the medical interaction question and the interaction reply data as the medical interaction data.
4. The intent detection model training method as recited in claim 1, wherein the generating medical interaction questions for each intent category based on the dialog generation model comprises:
inputting user medical data and medical payment data contained in any training sample among the training samples into the dialogue generation model for key data extraction, to obtain key data;
and querying a question bank for the corresponding medical interaction questions based on the key data.
5. The intent detection model training method of claim 4, said interacting based on said medical interaction question, comprising:
inputting the medical interaction question into an interaction model for dialogue interaction to obtain an output dialogue reply;
and taking the key data and the dialogue reply as the medical interaction data; the interaction model includes the dialogue generation model.
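Claims 4 and 5 replace free-form generation with key-data extraction followed by a question-bank lookup. The sketch below approximates the extraction step with a simple keyword rule; the bank contents and the rule itself are hypothetical.

QUESTION_BANK = {  # hypothetical bank keyed by extracted key data
    "chronic_prescription": "Do you need a refill of your long-term prescription?",
    "inpatient_settlement": "Would you like help settling your inpatient bill?",
}


def extract_key_data(user_medical_data: str, medical_payment_data: str) -> str:
    # Keyword rule standing in for the dialogue generation model's key-data extraction.
    text = (user_medical_data + " " + medical_payment_data).lower()
    return "chronic_prescription" if "prescription" in text else "inpatient_settlement"


def lookup_question(user_medical_data: str, medical_payment_data: str) -> str:
    return QUESTION_BANK[extract_key_data(user_medical_data, medical_payment_data)]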
6. The intent detection model training method according to claim 1, said interacting based on said medical interaction question, comprising:
calling an interaction interface of a medical interaction system to perform medical dialogue interaction based on the medical interaction question, to obtain a first dialogue reply;
inputting the medical interaction question and the first dialogue reply into the dialogue generation model for dialogue generation, to obtain a second interaction question;
and calling the interaction interface to perform medical dialogue interaction based on the second interaction question, to obtain a second dialogue reply, and taking the medical interaction question, the first dialogue reply, the second interaction question and the second dialogue reply as the medical interaction data.
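Claim 6 collects a two-turn transcript as the medical interaction data. A minimal sketch, assuming an `interaction_api.send` call for the medical interaction system and a `dialogue_model.follow_up` method for generating the second question:

def collect_two_turn_data(question: str, interaction_api, dialogue_model) -> list[str]:
    first_reply = interaction_api.send(question)                   # first dialogue reply
    second_question = dialogue_model.follow_up(question, first_reply)
    second_reply = interaction_api.send(second_question)           # second dialogue reply
    # The whole transcript becomes the medical interaction data.
    return [question, first_reply, second_question, second_reply]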
7. The intention detection model training method according to claim 1, the training the intermediate model based on the medical interaction data to obtain an intention detection model, comprising:
classifying the medical interaction data according to the intention categories to which the medical interaction data belong, to obtain classified interaction data for each intention category;
respectively performing parameter adjustment on the intermediate model based on the classified interaction data of each intention category, to obtain a candidate model for each intention category;
and fusing model parameters of the candidate models of the intention categories based on the training weights of the intention categories to obtain the intention detection model.
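The weighted fusion of candidate models in claim 7 amounts to a weighted average of their parameters. The sketch below assumes each candidate exposes a PyTorch-style state dict; the normalized averaging shown is only one illustrative reading of "training weights".

import torch


def fuse_candidates(candidates: dict[str, dict[str, torch.Tensor]],
                    weights: dict[str, float]) -> dict[str, torch.Tensor]:
    # `candidates` maps intention category -> candidate model state dict;
    # `weights` maps intention category -> training weight.
    total = sum(weights[category] for category in candidates)
    fused = {}
    for name in next(iter(candidates.values())):
        fused[name] = sum(weights[c] / total * candidates[c][name] for c in candidates)
    return fused  # load with intention_model.load_state_dict(fused)

One reading of the training weights is that intention categories with more, or more reliable, interaction data contribute more to the fused intention detection model.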
8. The intention detection model training method according to claim 1, wherein after the step of acquiring medical interaction data obtained by interaction based on the medical interaction questions and training the intermediate model based on the medical interaction data to obtain an intention detection model is performed, the method further comprises:
determining a sub-intention category under the intention category according to the medical interaction data;
generating secondary interaction questions for the sub-intention category based on the dialogue generation model, and acquiring secondary interaction data obtained by performing secondary interaction based on the secondary interaction questions;
And carrying out parameter adjustment on the intention detection model based on the secondary interaction data to obtain an intention detection model of the sub-intention category.
9. The intention detection model training method according to claim 1, wherein the intention detection is performed on the user medical data and the medical payment data, and the intention set is constructed according to the detection result, comprising:
extracting the medical sub-data of each user contained in the user medical data and the payment sub-data of each user contained in the medical payment data;
and respectively inputting the medical sub-data and payment sub-data of each user into the dialogue generation model for intention detection, and constructing the intention set according to the medical access intention of each user obtained by detection.
10. A medical access intention detection method, comprising:
acquiring user medical data and medical payment data of an access user;
inputting the user medical data and the medical payment data into an intention detection model to perform intention detection, and obtaining initial access intention;
generating a medical interaction question of the initial access intention based on a dialogue generation model, and performing medical access interaction with the access user based on the medical interaction question;
and inputting the medical interaction data obtained by the medical access interaction into the intention detection model to perform secondary intention detection, so as to obtain the medical access intention.
11. The medical access intention detection method according to claim 10, wherein the intention detection model is obtained by training an intermediate model based on medical interaction data obtained through interaction based on the medical interaction questions, generated by the dialogue generation model, of the intention categories to which the training samples belong;
the intermediate model is obtained by training a model to be trained by taking user medical data and medical payment data contained in a training data set as training samples and taking an intention set as a sample label;
the intention set is constructed and obtained according to the detection result of intention detection on the user medical data and the medical payment data contained in the training data set.
12. The medical access intention detection method of claim 10, the generating the medical interaction question of the initial access intention based on a dialog generation model, comprising:
inputting the user medical data, the medical payment data and the initial access intention into the dialogue generation model for dialogue generation, to obtain the medical interaction question of the initial access intention;
or,
inputting the user medical data and the medical payment data into the dialogue generation model for key data extraction, to obtain key data;
and querying a question bank for the corresponding medical interaction questions based on the key data.
13. The medical access intention detection method according to claim 10, the medical access interaction with the access user based on the medical interaction question, comprising:
taking the medical interaction question as interface input to call an interaction interface of a medical interaction system, so that the medical interaction question is sent to the access user through the medical interaction system;
and acquiring interaction reply data returned by the interaction interface call, and taking the medical interaction question and the interaction reply data as the medical interaction data.
14. An intent detection model training apparatus comprising:
the intention detection module is configured to detect intention of the user medical data and the medical payment data and construct an intention set according to detection results;
the model training module is configured to train the model to be trained by taking the user medical data and the medical payment data as training samples and the intention set as sample labels to obtain an intermediate model;
A question generation module configured to classify the intentions of the intent sets to obtain a plurality of intent categories, and generate medical interaction questions for each intent category based on a dialog generation model;
the data acquisition training module is configured to acquire medical interaction data obtained by interaction based on the medical interaction questions and train the intermediate model based on the medical interaction data to obtain an intention detection model.
15. A medical access intention detection device, comprising:
a data acquisition module configured to acquire user medical data and medical payment data of an access user;
the intention detection module is configured to input the user medical data and the medical payment data into an intention detection model for intention detection, and initial access intention is obtained;
a medical access interaction module configured to generate a medical interaction question of the initial access intention based on a dialogue generation model and to perform medical access interaction with the access user based on the medical interaction question;
the secondary intention detection module is configured to input medical interaction data obtained through medical access interaction into the intention detection model to perform secondary intention detection, and obtain medical access intention.
16. An intent detection model training apparatus comprising:
a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to:
performing intention detection on user medical data and medical payment data, and constructing an intention set according to detection results;
training a model to be trained by taking the user medical data and the medical payment data as training samples and the intention set as sample labels to obtain an intermediate model;
performing intention classification on the intention set to obtain a plurality of intention categories, and generating medical interaction questions for the intention categories based on a dialogue generation model;
and acquiring medical interaction data obtained by interaction based on the medical interaction questions, and training the intermediate model based on the medical interaction data to obtain an intention detection model.
17. A medical access intention detection device comprising:
a processor; and a memory configured to store computer-executable instructions that, when executed, cause the processor to:
acquiring user medical data and medical payment data of an access user;
inputting the user medical data and the medical payment data into an intention detection model to perform intention detection, and obtaining initial access intention;
generating a medical interaction question of the initial access intention based on a dialogue generation model, and performing medical access interaction with the access user based on the medical interaction question;
and inputting the medical interaction data obtained by the medical access interaction into the intention detection model to perform secondary intention detection, so as to obtain medical access intention.
18. A storage medium storing computer-executable instructions that when executed by a processor implement the following:
performing intention detection on user medical data and medical payment data, and constructing an intention set according to detection results;
training a model to be trained by taking the user medical data and the medical payment data as training samples and the intention set as sample labels to obtain an intermediate model;
performing intention classification on the intention set to obtain a plurality of intention categories, and generating medical interaction questions for the intention categories based on a dialogue generation model;
and acquiring medical interaction data obtained by interaction based on the medical interaction questions, and training the intermediate model based on the medical interaction data to obtain an intention detection model.
19. A storage medium storing computer-executable instructions that when executed by a processor implement the following:
acquiring user medical data and medical payment data of an access user;
inputting the user medical data and the medical payment data into an intention detection model to perform intention detection, and obtaining initial access intention;
generating a medical interaction question of the initial access intention based on a dialogue generation model, and performing medical access interaction with the access user based on the medical interaction question;
and inputting the medical interaction data obtained by the medical access interaction into the intention detection model to perform secondary intention detection, so as to obtain medical access intention.
CN202310949294.1A 2023-07-28 2023-07-28 Training method and device for intention detection model Pending CN116933087A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310949294.1A CN116933087A (en) 2023-07-28 2023-07-28 Training method and device for intention detection model

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310949294.1A CN116933087A (en) 2023-07-28 2023-07-28 Training method and device for intention detection model

Publications (1)

Publication Number Publication Date
CN116933087A (en) 2023-10-24

Family

ID=88385999

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310949294.1A Pending CN116933087A (en) 2023-07-28 2023-07-28 Training method and device for intention detection model

Country Status (1)

Country Link
CN (1) CN116933087A (en)


Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination