CN109815317A - A kind of sequence learning method, system, computer readable storage medium and equipment - Google Patents
Info
- Publication number
- CN109815317A (application number CN201811522537.9A)
- Authority
- CN
- China
- Prior art keywords
- answer
- question
- answer data
- learning method
- classifier
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Landscapes
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Electrically Operated Instructional Devices (AREA)
Abstract
The present invention provides a ranking learning method, which includes: obtaining question-and-answer data, and labelling the question-and-answer data according to a crowdsourcing mechanism; performing text feature extraction on the labelled question-and-answer data to obtain training samples; training a classifier with the training samples as the input of the classifier; and classifying question-and-answer data to be sorted using the trained classifier. By introducing subjective assessment into model training through the crowdsourcing mechanism, this framework pays more attention to the subjective information of answers than previously proposed methods. Constructing answer pairs further reduces the difficulty of manual labelling: instead of assigning a relevance score to each answer, annotators only compare the quality of two answers.
Description
Technical field
The present invention relates to learning methods, and in particular to a ranking learning method, system and device.
Background art
Existing answer-quality evaluation methods focus mainly on the objective characteristics of answers; the subjective assessment of an answer is not well quantified. Scoring the relevance of each answer individually does not account for the relative order between answers. For traditional ranking models, manually labelled training data is costly, and if a model has too many parameters, empirical parameter tuning becomes extremely difficult.
Summary of the invention
In view of the above deficiencies of the prior art, the purpose of the present invention is to provide a ranking learning method and system that solve the problem of costly manual labelling of training data in the prior art.
To achieve the above and other related objects, the present invention provides a ranking learning method, the learning method including:
obtaining question-and-answer data, and labelling the question-and-answer data according to a crowdsourcing mechanism;
performing text feature extraction on the labelled question-and-answer data to obtain training samples;
training a classifier with the training samples as the input of the classifier;
classifying question-and-answer data to be sorted using the trained classifier;
converting the label of the class to which the sorted question-and-answer data belongs into a corresponding score using a scoring function.
Optionally, labelling the question-and-answer data according to the crowdsourcing mechanism specifically includes:
collecting n answers A = {A1, …, An}, n > 2;
constructing an answer-pair set P = {(Ai, Aj) | i, j = 1, 2, …, n}, wherein if (Ai, Aj) ∈ P then i ≠ j;
labelling the answer pairs: if the quality of answer Ai in a pair is higher than the quality of answer Aj, the pair is labelled 1, otherwise 0.
Optionally, performing text feature extraction on the labelled question-and-answer data to obtain training samples specifically includes:
determining the features corresponding to the answer data;
denoting the features corresponding to the answer data as a feature vector X, and denoting the result of labelling the question-and-answer data as y;
expressing each answer datum as <X, y>, the answer data then composing the training samples.
Optionally, the extracted text features are normalized.
Optionally, the features corresponding to the answer data include at least one of word-frequency information, IDF information of keywords in the answer, and answer length.
Optionally, the classifier includes one of KNN, RF, NN and GBDT.
Optionally, the scoring function is defined in terms of n, the number of answers; the sum of the answer labels; and the label obtained after comparing answer t with answer i.
To achieve the above and other related objects, the present invention also provides a ranking learning system, the learning system including:
a labelling module for obtaining question-and-answer data and labelling the question-and-answer data according to a crowdsourcing mechanism;
a feature extraction module for performing text feature extraction on the labelled question-and-answer data to obtain training samples;
a training module for training a classifier with the training samples as the input of the classifier;
a classification module for classifying question-and-answer data to be sorted using the trained classifier;
an evaluation module for converting the label of the class to which the sorted question-and-answer data belongs into a corresponding score using a scoring function.
To achieve the above and other related objects, the present invention also provides a computer-readable storage medium storing a computer program; when the computer program is run by a processor, the learning method is executed.
To achieve the above and other related objects, the present invention also provides a device including:
a memory for storing a computer program;
a processor for executing the computer program stored in the memory, so that the device executes the aforementioned learning method.
As described above, the ranking learning method and system of the present invention have the following beneficial effects.
The present invention first converts the quality evaluation of the answer set under one question into a classification problem that a machine can learn, then converts these short texts into a machine-understandable form using sentence vectors, completes model training with a common classifier, and finally uses the designed scoring function to convert relative-quality labels into absolute quality scores.
By introducing subjective assessment into model training through the crowdsourcing mechanism, this framework pays more attention to the subjective information of answers than previously proposed methods. Constructing answer pairs further reduces the difficulty of manual labelling: instead of assigning a relevance score to each answer, annotators only compare the quality of two answers. In addition, the invention meets the needs of answer-quality evaluation and ranking in question-and-answer communities, and is easily extended to similar tasks such as image quality evaluation and comment quality evaluation.
Brief description of the drawings
Fig. 1 is a flowchart of the ranking learning method of the present invention;
Fig. 2 is a block diagram of the ranking learning method of the present invention;
Fig. 3 is a block diagram of the ranking learning system of the present invention;
Fig. 4 is a block diagram of the labelling module in the ranking learning system of the present invention.
Specific embodiment
The embodiments of the present invention are described below by way of specific examples; those skilled in the art can easily understand other advantages and effects of the present invention from the contents disclosed in this specification. The present invention may also be implemented or applied through other different embodiments, and the details in this specification may be modified or changed in various ways based on different viewpoints and applications without departing from the spirit of the present invention. It should be noted that, in the absence of conflict, the following embodiments and the features in the embodiments may be combined with each other.
It should also be noted that the drawings provided in the following embodiments only illustrate the basic concept of the present invention in a schematic way; they show only the components related to the present invention rather than the actual component count, shapes and sizes. In actual implementation, the form, quantity and proportion of each component may change arbitrarily, and the component layout may be more complex.
A retrieval-based question answering system parses the question entered by a user to obtain keywords, searches the indexed stored data, and returns the corresponding search-result list as the answer list La. Ordering the answers within the answer list La is a key problem: a good ranking algorithm sorts the answers by answer quality (answer quality here refers to how accurate an answer is, or how interesting it is to the user).
Beyond the objective characteristics of an answer, the subjective judgement of users is essential to answer-quality identification. The present invention uses a crowdsourcing mechanism to introduce users' subjective judgements into the answer ranking algorithm, and converts the ranking algorithm into a binary classification problem.
Assume the returned answers are A1, A2, …, An, n answers in total. Combining every two different answers Ai and Aj forms an ordered answer pair <Ai, Aj>. Let <Ai, Aj> = 1 indicate that the quality of answer Ai is higher than that of answer Aj, and <Ai, Aj> = 0 indicate that the quality of answer Ai is lower than that of answer Aj.
The core of the crowdsourcing mechanism is to have a large number of people, working online, make yes-or-no judgements on a question. The present invention poses the value of each answer pair <Ai, Aj> as a crowdsourcing question and lets many human users make the 0/1 judgement.
Based on this, the present invention provides a ranking learning method; as shown in Figs. 1 and 2, the method includes the following steps:
Step S1. obtaining question-and-answer data, and labelling the question-and-answer data according to a crowdsourcing mechanism;
Step S2. performing text feature extraction on the labelled question-and-answer data to obtain training samples;
Step S3. training a classifier with the training samples as the input of the classifier;
Step S4. classifying question-and-answer data to be sorted using the trained classifier;
Step S5. converting the label of the class to which the sorted question-and-answer data belongs into a corresponding score using a scoring function.
The present invention first converts the quality evaluation of the answer set under one question into a classification problem that a machine can learn, then converts these short texts into a machine-understandable form using sentence vectors, completes model training with a classifier, and finally uses the designed scoring function to convert relative-quality labels into absolute quality scores.
In step S1, question-and-answer data are obtained and labelled according to the crowdsourcing mechanism, which specifically includes:
Step S11. collecting n answers A = {A1, …, An}, n > 2.
In this embodiment, on the one hand, public question-and-answer datasets such as Cornell Movie Dialogs, Ubuntu Dialogue Corpus, WikiQA, OpenSubtitles, Twitter and InsuranceQA can be used to build training data; on the other hand, question-and-answer data can be obtained by crawling question-and-answer communities such as Sina Weibo, Baidu Zhidao and Zhihu.
After a certain amount of question-and-answer data is obtained, every two answers in the answer set under the same question need to be combined to build multiple answer pairs.
Step S12. constructing an answer-pair set P = {(Ai, Aj) | i, j = 1, 2, …, n}, wherein if (Ai, Aj) ∈ P then i ≠ j.
Step S13. labelling the answer pairs: if the quality of answer Ai in a pair is higher than the quality of answer Aj, the pair is labelled 1, otherwise 0.
Through labelling, many labelled answer pairs are finally obtained as training data.
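The pair construction of step S12 and the labelling of step S13 can be sketched as follows. The majority-vote aggregation rule and all function names are illustrative assumptions; the source only states that crowd workers make 0/1 judgements on each pair.

```python
from itertools import permutations

def build_answer_pairs(answers):
    """Step S12 (sketch): enumerate every ordered pair <Ai, Aj>, i != j,
    from the answer set, forming the pair set P."""
    return list(permutations(answers, 2))

def label_pair(votes_i_better, votes_j_better):
    """Step S13 (sketch, assumed aggregation): label the pair 1 if most
    crowd workers judged Ai higher quality than Aj, otherwise 0."""
    return 1 if votes_i_better > votes_j_better else 0

answers = ["A1", "A2", "A3"]
pairs = build_answer_pairs(answers)  # 3 answers yield 6 ordered pairs
```

Each labelled pair then becomes one training example, so n answers contribute n(n-1) examples per question.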
In step S2, text feature extraction is performed on the labelled question-and-answer data to obtain training samples. Specifically, in a question answering system or question-and-answer community, answers tend not to be especially long, so the concern is how to express such short texts in a form a computer can understand. A simple and efficient way is to convert these texts into corresponding vectors; only after the textual answers are converted into feature vectors can the classification model be trained. In this embodiment, sentence vectors are used to extract the features of the text, and these features are normalized to [0, 1].
The features of the text mainly include the word-frequency information of keywords in the answer, the IDF information of keywords, and the answer length. Once the number of features is determined, each answer can be converted into a feature vector X; the manually labelled result is y. Each answer is thus expressed in the form <X, y>, ultimately forming a concrete training sample.
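The feature construction described above can be sketched as follows. Computing keyword term frequency and IDF in the standard way and min-max scaling each feature to [0, 1] are assumptions about details the source leaves open, and all function names are illustrative.

```python
import math

def extract_features(answer, keywords, corpus):
    """Build a raw feature vector for one answer: keyword term frequency,
    mean keyword IDF over the corpus, and answer length in words."""
    words = answer.split()
    tf = sum(words.count(k) for k in keywords) / max(len(words), 1)
    idf = sum(
        math.log(len(corpus) / (1 + sum(k in doc for doc in corpus)))
        for k in keywords
    ) / max(len(keywords), 1)
    return [tf, idf, len(words)]

def normalize(matrix):
    """Min-max scale each feature column of the sample matrix to [0, 1]."""
    cols = list(zip(*matrix))
    scaled = []
    for col in cols:
        lo, hi = min(col), max(col)
        rng = hi - lo or 1.0  # avoid division by zero for constant columns
        scaled.append([(v - lo) / rng for v in col])
    return [list(row) for row in zip(*scaled)]
```

Pairing each normalized vector X with its crowd label y then yields the <X, y> training samples.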
In step S3, the classifier is trained with the training samples as the input of the classifier. This framework converts the answer-quality evaluation problem into a classification problem a machine can learn, and choosing a suitable classifier affects subsequent prediction and scoring. Classifiers with good practical results in academia or industry can therefore be used to verify the algorithm. Common classifiers include the k-nearest-neighbour algorithm (KNN), the random forest algorithm (RF), neural networks (NN) and gradient boosted decision trees (GBDT). These classifiers, which perform well in practical applications, can complete the model training of the present invention.
In step S4, the question-and-answer data to be sorted are classified using the trained classifier. In this embodiment, the result of classification prediction is a label of 0 or 1, which is not intuitive enough for directly evaluating quality across different answers. Therefore, the ranking learning method of the present invention further includes step S5. Specifically, step S5 is:
converting the label of the class to which the sorted question-and-answer data belongs into a corresponding score using the scoring function. Specifically, the scoring function converts the 0/1 labels into scores in [0, 100].
More specifically, the scoring function is defined in terms of n, the number of answers; the sum of the answer labels; and the label obtained after comparing answer t with answer i.
After the labels of the classes to which the sorted question-and-answer data belong are converted into corresponding scores, the quality evaluation of different answers can be completed directly according to the quality score of each answer.
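The source text does not reproduce the patent's scoring formula, so the following sketch substitutes a plausible one, the fraction of pairwise comparisons an answer wins scaled to [0, 100], purely for illustration.

```python
def answer_score(t, pair_labels, n):
    """Convert pairwise 0/1 labels into a [0, 100] score for answer t.
    pair_labels[(t, i)] == 1 means answer t was judged better than answer i.
    NOTE: this win-fraction formula is an assumption, not the patent's."""
    wins = sum(pair_labels.get((t, i), 0) for i in range(n) if i != t)
    return 100.0 * wins / (n - 1)

# Labels for 3 answers: answer 0 beats 1 and 2; answer 1 beats only 2.
labels = {(0, 1): 1, (0, 2): 1, (1, 0): 0, (1, 2): 1, (2, 0): 0, (2, 1): 0}
```

Any monotone mapping from pairwise wins to [0, 100] would serve the same purpose: making the relative labels directly comparable as absolute scores.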
As shown in Fig. 3, the present invention also provides a ranking learning system, which includes a labelling module 1, a feature extraction module 2, a training module 3, a classification module 4 and an evaluation module 5.
The labelling module 1 obtains question-and-answer data and labels the question-and-answer data according to the crowdsourcing mechanism.
In this embodiment, on the one hand, public question-and-answer datasets such as Cornell Movie Dialogs, Ubuntu Dialogue Corpus, WikiQA, OpenSubtitles, Twitter and InsuranceQA can be used to build training data; on the other hand, question-and-answer data can be obtained by crawling question-and-answer communities such as Sina Weibo, Baidu Zhidao and Zhihu.
Specifically, as shown in Fig. 4, the labelling module includes a collection unit 11, a construction unit 12 and a labelling unit 13.
The collection unit 11 collects n answers A = {A1, …, An}, n > 2.
The construction unit 12 constructs an answer-pair set P = {(Ai, Aj) | i, j = 1, 2, …, n}, wherein if (Ai, Aj) ∈ P then i ≠ j.
The labelling unit 13 labels the answer pairs: if the quality of answer Ai in a pair is higher than the quality of answer Aj, the pair is labelled 1, otherwise 0.
The feature extraction module 2 performs text feature extraction on the labelled question-and-answer data to obtain training samples. The features of the text mainly include the word-frequency information of keywords in the answer, the IDF information of keywords, and the answer length.
The training module 3 trains the classifier with the training samples as the input of the classifier. Common classifiers include the k-nearest-neighbour algorithm (KNN), the random forest algorithm (RF), neural networks (NN) and gradient boosted decision trees (GBDT).
The classification module 4 classifies the question-and-answer data to be sorted using the trained classifier.
The evaluation module 5 converts the label of the class to which the sorted question-and-answer data belongs into a corresponding score using the scoring function, which is defined in terms of n, the number of answers; the sum of the answer labels; and the label obtained after comparing answer t with answer i.
The present invention also provides a device, including:
a memory for storing a computer program;
a processor for executing the computer program stored in the memory, so that the device executes the aforementioned learning method.
The processor may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory may be an internal storage unit or an external storage device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card or a flash card. Further, the memory may include both an internal storage unit and an external storage device. The memory is used to store the computer program and other programs and data, and may also be used to temporarily store data that has been output or is to be output.
Those skilled in the art can clearly understand that, for convenience and brevity of description, the division into the above functional units and modules is only used as an example. In practical applications, the functions may be assigned to different functional units and modules as needed; that is, the internal structure of the device may be divided into different functional units or modules to complete all or part of the functions described above. The functional units in the embodiments may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other and are not intended to limit the protection scope of this application. For the specific working process of the units and modules in the above system, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
In the above embodiments, each embodiment is described with its own emphasis; for parts not detailed in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are implemented in hardware or software depends on the specific application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each specific application, but such implementations should not be considered beyond the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed device/terminal device and method may be implemented in other ways. For example, the device/terminal device embodiments described above are only schematic: the division into modules or units is only a logical functional division, and there may be other division manners in actual implementation; multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist physically alone, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated module/unit is implemented as a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on this understanding, the present invention may implement all or part of the processes in the methods of the above embodiments by instructing the relevant hardware through a computer program; the computer program may be stored in a computer-readable storage medium and, when executed by a processor, can implement the steps of each of the above method embodiments. The computer program includes computer program code, which may be in source-code form, object-code form, an executable file, certain intermediate forms, and so on. The computer-readable medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disc, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electric carrier signal, a telecommunication signal, a software distribution medium, and the like.
The above embodiments only illustrate the principles and effects of the present invention and are not intended to limit it. Anyone familiar with this technology may modify or change the above embodiments without departing from the spirit and scope of the present invention. Therefore, all equivalent modifications or changes completed by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the present invention should still be covered by the claims of the present invention.
Claims (10)
1. A ranking learning method, characterized in that the learning method comprises:
obtaining question-and-answer data, and labelling the question-and-answer data according to a crowdsourcing mechanism;
performing text feature extraction on the labelled question-and-answer data to obtain training samples;
training a classifier with the training samples as the input of the classifier;
classifying question-and-answer data to be sorted using the trained classifier;
converting the label of the class to which the sorted question-and-answer data belongs into a corresponding score using a scoring function.
2. The ranking learning method according to claim 1, characterized in that labelling the question-and-answer data according to the crowdsourcing mechanism specifically comprises:
collecting n answers A = {A1, …, An}, n > 2;
constructing an answer-pair set P = {(Ai, Aj) | i, j = 1, 2, …, n}, wherein if (Ai, Aj) ∈ P then i ≠ j;
labelling the answer pairs: if the quality of answer Ai in a pair is higher than the quality of answer Aj, the pair is labelled 1, otherwise 0.
3. The ranking learning method according to claim 1, characterized in that performing text feature extraction on the labelled question-and-answer data to obtain training samples specifically comprises:
determining the features corresponding to the answer data;
denoting the features corresponding to the answer data as a feature vector X, and denoting the labelling result of the question-and-answer data as y;
expressing each answer datum as <X, y>, the answer data then composing the training samples.
4. The ranking learning method according to claim 3, characterized in that the extracted text features are normalized.
5. The ranking learning method according to claim 3, characterized in that the features corresponding to the answer data include at least one of word-frequency information, IDF information of keywords in the answer, and answer length.
6. The ranking learning method according to claim 1, characterized in that the classifier comprises one of KNN, RF, NN and GBDT.
7. The ranking learning method according to claim 2, characterized in that the scoring function is defined in terms of n, the number of answers; the sum of the answer labels; and the label obtained after comparing answer t with answer i.
8. A ranking learning system, characterized in that the learning system comprises:
a labelling module for obtaining question-and-answer data and labelling the question-and-answer data according to a crowdsourcing mechanism;
a feature extraction module for performing text feature extraction on the labelled question-and-answer data to obtain training samples;
a training module for training a classifier with the training samples as the input of the classifier;
a classification module for classifying question-and-answer data to be sorted using the trained classifier;
an evaluation module for converting the label of the class to which the sorted question-and-answer data belongs into a corresponding score using a scoring function.
9. A computer-readable storage medium storing a computer program, characterized in that, when the computer program is run by a processor, the learning method according to any one of claims 1 to 7 is executed.
10. A device, characterized by comprising:
a memory for storing a computer program;
a processor for executing the computer program stored in the memory, so that the device executes the learning method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811522537.9A CN109815317B (en) | 2018-12-13 | 2018-12-13 | Ordering learning method, ordering learning system, computer readable storage medium and computer readable storage device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109815317A true CN109815317A (en) | 2019-05-28 |
CN109815317B CN109815317B (en) | 2023-08-22 |
Family
ID=66601606
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811522537.9A Active CN109815317B (en) | 2018-12-13 | 2018-12-13 | Ordering learning method, ordering learning system, computer readable storage medium and computer readable storage device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109815317B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140279996A1 (en) * | 2013-03-15 | 2014-09-18 | Microsoft Corporation | Providing crowdsourced answers to information needs presented by search engine and social networking application users |
CN105893523A (en) * | 2016-03-31 | 2016-08-24 | 华东师范大学 | Method for calculating problem similarity with answer relevance ranking evaluation measurement |
CN106446287A (en) * | 2016-11-08 | 2017-02-22 | 北京邮电大学 | Answer aggregation method and system facing crowdsourcing scene question-answering system |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111144546A (en) * | 2019-10-31 | 2020-05-12 | 平安科技(深圳)有限公司 | Scoring method and device, electronic equipment and storage medium |
CN111144546B (en) * | 2019-10-31 | 2024-01-02 | 平安创科科技(北京)有限公司 | Scoring method, scoring device, electronic equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN109815317B (en) | 2023-08-22 |
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||