CN116150319A - Auxiliary answering method, device, electronic equipment and storage medium - Google Patents

Auxiliary answering method, device, electronic equipment and storage medium

Info

Publication number
CN116150319A
Authority
CN
China
Prior art keywords
answer
answering
prompting
user
questions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211742535.7A
Other languages
Chinese (zh)
Inventor
Ling Chao (凌超)
Sha Jing (沙晶)
Wang Shijin (王士进)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
iFlytek Co Ltd
Original Assignee
iFlytek Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by iFlytek Co Ltd filed Critical iFlytek Co Ltd
Priority to CN202211742535.7A priority Critical patent/CN116150319A/en
Publication of CN116150319A publication Critical patent/CN116150319A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33 Querying
    • G06F16/3331 Query processing
    • G06F16/334 Query execution
    • G06F16/3344 Query execution using natural language analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30 Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/33 Querying
    • G06F16/335 Filtering based on additional data, e.g. user or group profiles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/20 Education
    • G06Q50/205 Education administration or guidance

Abstract

The invention provides an auxiliary answering method, an auxiliary answering device, electronic equipment and a storage medium. The method comprises: determining a test question to be answered; acquiring the user answer step sequence of the test question; and, when a prompt request is received, determining a prompting step from the standard answer step sequence of the test question based on the step similarity between the user answer step sequence and the standard answer step sequence, and giving an answer prompt based on the prompting step. Because the prompting step is selected from the standard answer step sequence according to the user's actual progress, the complete standard answer is never shown to the user directly, so the user's answering and thinking abilities are exercised while the user is still assisted in correctly completing the test question.

Description

Auxiliary answering method, device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method and apparatus for assisting in answering questions, an electronic device, and a storage medium.
Background
With the development of internet technology, online answering has become a trend in intelligent teaching. When a user needs help solving a question online, current solutions mainly either present all the solution steps of the question at once or directly give the final answer.
However, when all the solution steps are given directly, the user can easily just copy the steps or the answer. Methods that directly present every solution step are therefore not effective at assisting the user in answering while also improving the user's answering ability.
Disclosure of Invention
The invention provides an auxiliary answering method, an auxiliary answering device, electronic equipment and a storage medium, which address the defect of the prior art that directly giving all the solution steps of a question neither effectively assists the user in answering nor improves the user's answering ability.
The invention provides an auxiliary answering method, which comprises the following steps:
determining a test question to be solved;
acquiring a user answer step sequence of the test questions;
and under the condition that a prompt request is received, determining a prompt step from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question, and carrying out answer prompt based on the prompt step.
According to the auxiliary answering method provided by the invention, determining a prompting step from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question comprises:
determining a step which is answered from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question;
and determining the step arranged after the answered step in the standard answering step sequence as a prompting step.
According to the auxiliary answering method provided by the invention, after acquiring the user answer step sequence of the test question, the method further comprises:
extracting a current answer conclusion from the user answer step sequence;
and determining an answer state based on the conclusion similarity between the current answer conclusion and the standard answer conclusion of the test question, and, when the answer state is not ended, returning to the step of acquiring the user answer step sequence of the test question.
According to the auxiliary answering method provided by the invention, after determining the answer state, the method further comprises:
and under the condition that the answer state is finished, determining an answer result based on the conclusion similarity.
The invention also provides a test question recommending method, which comprises the following steps:
acquiring a prompting step given during the answering of a test question;
recommending test questions based on the number of prompts and/or the step types of the prompting steps, wherein a step type is either a modeling step or a knowledge point step;
wherein the prompting step is determined, when a prompt request is received during answering, from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question.
According to the test question recommending method provided by the invention, recommending test questions based on the number of prompts and/or the step types of the prompting steps comprises:
recommending test questions based on the number of prompts and/or the step types of the prompting steps, together with the answer result of the test question.
According to the test question recommending method provided by the invention, determining the step type comprises:
determining the step type of the prompting step based on the semantic relevance between the stem of the test question and the prompting step.
According to the test question recommending method provided by the invention, determining the step type of the prompting step based on the semantic relevance between the stem of the test question and the prompting step comprises:
determining the step type of the prompting step by applying the semantic relevance between the stem of the test question and the prompting step to a step classification model;
wherein the step classification model is obtained by training a full-scale model on sample scenario test questions carrying step-type labels, and the full-scale model is obtained by training a language model on the full set of sample test questions.
The invention also provides an auxiliary answering device, which comprises:
the determining unit is used for determining the test questions to be solved;
the acquisition unit is used for acquiring the user answer step sequence of the test question;
and the answer prompting unit is used for determining prompting steps from the standard answer step sequence based on step similarity between the user answer step sequence and the standard answer step sequence of the test question under the condition of receiving a prompting request, and prompting the answer based on the prompting steps.
The invention also provides a test question recommending device, which comprises:
the prompting step unit is used for acquiring prompting steps in the answering process;
the test question recommending unit is used for recommending test questions based on the number of prompts and/or the step types of the prompting steps, wherein a step type is either a modeling step or a knowledge point step;
wherein the prompting step is determined, when a prompt request is received during answering, from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question.
The invention also provides an electronic device, comprising a memory, a processor and a computer program stored on the memory and runnable on the processor, wherein the processor, when executing the program, implements any of the auxiliary answering methods or test question recommending methods described above.
The invention also provides a non-transitory computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements any of the auxiliary answering methods or test question recommending methods described above.
The invention also provides a computer program product comprising a computer program which, when executed by a processor, implements the auxiliary answering method or the test question recommending method described above.
According to the auxiliary answering method, device, electronic equipment and storage medium, when a prompt request is received, the prompting step is determined from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question, and the answer prompt is given based on the prompting step. This avoids directly presenting the complete standard answer to the user, so the user's answering and thinking abilities are exercised while the user is assisted in correctly completing the test question.
Drawings
In order to more clearly illustrate the technical solutions of the invention or of the prior art, the drawings used in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic flow chart of the auxiliary answering method provided by the invention;
FIG. 2 is a flow chart of a determining prompting step provided by the present invention;
FIG. 3 is a schematic flow chart of the test question recommending method provided by the invention;
FIG. 4 is a schematic flow chart of the auxiliary answer and test question recommending method provided by the invention;
fig. 5 is a schematic structural diagram of the auxiliary answering device provided by the present invention;
FIG. 6 is a schematic diagram of a test question recommending apparatus according to the present invention;
fig. 7 is a schematic structural diagram of an electronic device provided by the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the invention clearer, the technical solutions of the invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are some, not all, of the embodiments of the invention. All other embodiments obtained by a person skilled in the art based on these embodiments without inventive effort fall within the scope of the invention.
When every solution step is given directly, the user can easily just copy the steps or the answer. Methods that directly present all the solution steps are therefore not effective at assisting the user in answering while also improving the user's answering ability.
To address this problem, the invention provides an auxiliary answering method that assists the user according to the user's actual answering situation: by giving a prompt for only the next step, it helps the user complete the answer while improving the user's answering ability. Fig. 1 is a schematic flow chart of the auxiliary answering method provided by the invention; as shown in Fig. 1, the method includes:
step 110, determining a test question to be solved;
here, the questions to be solved refer to questions that the user needs to solve online, and in the online solving process, the user may need to assist the answering device to obtain a correct solving step. The questions to be solved can be the questions of various subjects, can also be various types of questions, and can be especially situation questions. The test questions to be solved can be obtained by directly obtaining the test questions through answering equipment such as a smart phone and a computer or obtaining the test questions through image acquisition equipment such as a camera on a learning machine. It can be understood that the obtained questions to be solved contain the question information of the questions, and the corresponding accurate standard question solving step can be obtained according to the question information to be solved.
Step 120, obtaining a user answer step sequence of the test questions;
when assisting the user in answering, the user answering step sequence of the test question needs to be acquired in real time, so that the user can know the real-time answering condition. Compared with the auxiliary answering method in the related art, the method and the device of the invention can not directly show the problem solving steps to the user, but assist the user to complete the problem solving according to the actual problem solving steps of the user on the test problem. Here, the answer step sequence refers to all answer steps written by the user when the user solves the test question, where all answer steps may be incomplete or complete. It can be understood that the answering steps of the general test questions have a certain logic sequence, so that the answering steps for the test questions can be combined into a answering step sequence. Here, the sequence of the answering steps in the sequence of answering steps, i.e. the order of the user writing before and after the answering.
The user answer step sequence of the test question can be obtained directly through the answering device, or through an image acquisition device. For example, the paper can be photographed by the camera on a learning machine to obtain a user answer image, and text recognition can be performed on that image to obtain the user answer step sequence. Here, each recognized line of text may be treated directly as a single answer step, or several lines may be merged into one answer step based on the spacing between the lines. It can be understood that, when answering, text lines with smaller vertical spacing between them are more likely to belong to the same step.
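As an illustrative sketch (not part of the patent text), the line-grouping heuristic above could look like the following; the `(text, top, bottom)` tuple layout and the relative gap threshold are assumptions:

```python
def group_lines_into_steps(lines, gap_factor=1.5):
    """Group OCR'd text lines into answer steps by vertical spacing.

    lines: list of (text, top_y, bottom_y) tuples, sorted top to bottom.
    A gap much larger than the typical inter-line gap starts a new step.
    """
    if not lines:
        return []
    # Vertical gap between each consecutive pair of lines.
    gaps = [lines[i + 1][1] - lines[i][2] for i in range(len(lines) - 1)]
    typical = sorted(gaps)[len(gaps) // 2] if gaps else 0  # rough median
    steps, current = [], [lines[0][0]]
    for i in range(1, len(lines)):
        gap = lines[i][1] - lines[i - 1][2]
        if gap > gap_factor * max(typical, 1):
            steps.append(" ".join(current))  # large gap: close the step
            current = [lines[i][0]]
        else:
            current.append(lines[i][0])
    steps.append(" ".join(current))
    return steps
```

A production system would more likely use layout features from the recognizer itself, but the threshold-on-spacing idea is the same.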
And step 130, under the condition that a prompt request is received, determining a prompt step from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question, and prompting answer based on the prompt step.
Specifically, the prompt request is a request sent by the user during answering, expressing the wish to receive an answer prompt from the answering device. After the answering device receives the prompt request, it can retrieve the standard answer step sequence of the test question determined to be answered. Then, according to the step similarity between the user's current answer step sequence and the standard answer step sequence, the correspondence between the user's answer steps and the standard answer steps is determined, and a prompting step is taken from the standard answer step sequence as the answer prompt, after which the user answer step sequence may be updated.
Here, the prompt request may be received in several ways: the answering device may provide a prompt button, and a click on that button is treated as receiving a prompt request; or a voice acquisition device may capture the user's spoken request for a prompt, which is likewise treated as a received prompt request.
After the prompt request is received, the standard answer step sequence can be obtained from the determined test question to be answered; for example, it can be looked up by the test question information in a question bank preset on the answering device.
To further improve the user's answering ability while assisting the user in answering accurately, after the standard answer step sequence of the test question is obtained, the user's current answer steps can be located within the standard answer step sequence according to the step similarity between the user's current answer step sequence and the standard answer step sequence, and the next standard answer step after the located position is provided as the prompting step. It can be understood that the step similarity between the two sequences is embodied as the similarity between each answer step in the user answer step sequence and each answer step in the standard answer step sequence. From these pairwise similarities, it can be determined which steps the user has already solved, and therefore at which step the user may be stuck when the prompt request is sent, so that a targeted prompting step can be given.
In addition, the answer prompt here refers to prompt information that assists the user in answering according to the prompting step in the standard answer step sequence. The answer prompt may directly use the next step in the standard answer step sequence corresponding to the user's current answer step sequence as the prompting step; or, on the basis of the prompting step, preset prompt information corresponding to that step may be obtained and used as the prompt. For example, key information in the stem associated with the prompting step can be highlighted, and the key step the user currently lacks can be indicated. It can be understood that, whether the next standard step itself or its prompt information is used, the complete standard answer is never shown directly to the user, so the user can keep updating the answer steps of the test question according to the prompts, and the user's answering ability is improved while the test question is completed.
While the answer prompt is given, the user answer step sequence continues to be acquired in real time. If the sequence is detected to have been updated, that is, the user has continued writing answer steps after receiving the prompt, the current prompt ends and the process returns to step 120 to acquire the updated answer steps.

Alternatively, after the current answer prompt is given, a new answer prompt related to it may be given; the new prompt describes the key step or knowledge point in finer detail than the current one and is easier to understand. Again, once the user answer step sequence is detected to have been updated, the prompt ends and the process returns to step 120.

Alternatively, the device may wait for a preset period; if no prompt request is received during that period, the user is assumed to be able to proceed according to the prompt, the answer step sequence is updated, and the process returns to step 120 to continue acquiring the updated answer steps until the answer is finished.

It can be understood that the prompt can be displayed continuously on the answering device during answering, so the user will not forget it; and by giving, during prompting, a new prompt related to the current one, a more careful and progressive sequence of prompts can be provided.
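Purely as an illustration of the control flow just described (not from the patent), the outer assist loop might be sketched as below; all four callbacks and the wait period are hypothetical placeholders:

```python
import time

def assisted_answering_loop(get_user_steps, prompt_requested, give_prompt,
                            answer_finished, wait_seconds=30):
    """Sketch of the assist loop: serve a prompt on request, wait for the
    user to act on it, then re-acquire the updated answer steps (step 120)."""
    steps = get_user_steps()               # step 120: initial acquisition
    while not answer_finished(steps):
        if prompt_requested():
            give_prompt(steps)             # answer prompt from prompting step
            time.sleep(wait_seconds)       # give the user time to continue
        steps = get_user_steps()           # return to step 120: refresh steps
    return steps
```

The real device would of course drive this loop from camera frames or pen input rather than from callables.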
According to the method provided by this embodiment of the invention, when a prompt request is received, the prompting step is determined from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question, and the answer prompt is given based on the prompting step. This avoids directly presenting the complete standard answer to the user, so the user's answering and thinking abilities are exercised while the user is assisted in correctly completing the test question.
Based on the above embodiment, Fig. 2 is a schematic flow chart of determining the prompting step provided by the invention; as shown in Fig. 2, step 130 includes:
step 210, determining a answered step from the standard answer step sequence based on step similarity between the user answer step sequence and the standard answer step sequence of the test question;
specifically, the step similarity is used to reflect the similarity of each of the user answering steps in the sequence of user answering steps to each of the standard answering steps in the sequence of standard answering steps. Here, the step similarity between the user answer step sequence and the standard answer step sequence of the test question may be implemented by a character string similarity algorithm, or a text similarity algorithm based on deep learning, or by a supervised text similarity model. If the calculated similarity exceeds a certain threshold value set in advance, the user answering step corresponds to the standard answering step in the standard answering step sequence, and the standard answering step is the answered step. And (3) carrying out similarity calculation on each answering step in the user answering step sequence and each answering step in the standard answering step sequence of the test question, so as to determine the answered step from the standard answering step sequence.
Step 220, determining the step arranged after the answered step in the standard answer step sequence as a prompt step.
It can be understood that the answered steps in the standard answer step sequence have already been written by the user and do not need to be prompted, while the steps arranged after the answered steps have no corresponding step in the user answer step sequence and therefore need to be prompted. Thus, a step arranged after the answered steps is selected from the standard answer step sequence as the prompting step. The prompting step may be the step immediately following the answered steps, a key step selected from all steps after the answered steps, or a step that best reflects the solution idea; the embodiments of the invention are not limited in this respect.
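As a minimal sketch of the matching described in steps 210 and 220 (an illustration, not the patent's implementation), a simple character-level ratio stands in for the string, deep-learning, or supervised similarity models mentioned, and the threshold value is an assumption:

```python
from difflib import SequenceMatcher

def step_similarity(a, b):
    # Character-level stand-in for the string / deep-learning / supervised
    # text-similarity models named in the text.
    return SequenceMatcher(None, a, b).ratio()

def next_prompt_step(user_steps, standard_steps, threshold=0.6):
    """Mark standard steps matched by any user step as answered (step 210),
    then return the first standard step after the last answered one (step 220)."""
    answered = [
        i for i, std in enumerate(standard_steps)
        if any(step_similarity(u, std) >= threshold for u in user_steps)
    ]
    last = max(answered) if answered else -1
    if last + 1 < len(standard_steps):
        return standard_steps[last + 1]
    return None  # every standard step already has a counterpart
```

Returning the step immediately after the last answered one is only one of the options the text allows; selecting a key step further ahead would use the same `answered` bookkeeping.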
Based on any of the above embodiments, after step 120, the method further comprises:
extracting a current answer conclusion from the user answer step sequence;
and determining a answering state based on the conclusion similarity between the current answering conclusion and the standard answering conclusion of the test question, and returning to a user answering step sequence for acquiring the test question under the condition that the answering state is not ended.
Specifically, the user answer step sequence acquired in real time during answering may be an unfinished answer step sequence, a blank sequence in which answering has not yet started, or a completed sequence in which answering has ended. It can be understood that the actions required after acquiring the user answer step sequence differ between answer states: when the answer state is judged to be unfinished or blank, the prompting step is given to the user after the sequence is acquired; when the answer state is judged to be finished, ability analysis is performed or the correctness of the user's answer result is judged after the sequence is acquired.
Thus, after the user answer step sequence is obtained, the current answer conclusion can be extracted from it. Extracting the current answer conclusion may be implemented with a text extraction algorithm such as TextRank; for example, conclusion extraction can be performed on the last user answer step in the sequence. The answer state is then determined according to the conclusion similarity between the current answer conclusion and the standard answer conclusion of the test question.
The conclusion similarity may likewise be computed with a string similarity algorithm, a deep-learning-based text similarity algorithm, or a supervised text similarity model. It can be understood that the final conclusion of the user answer step sequence is not necessarily a correct answer, and when the answer is not yet finished the user's conclusion will not correspond to the standard answer conclusion, so the answer state can be judged against a preset threshold. When the similarity is below the preset value, the user's conclusion is unlikely to match the standard conclusion, so the answer state is likely unfinished and can be determined as not ended; when the similarity is above the preset value, the conclusion is likely to match, so the answer state can be determined as ended. Alternatively, the finished state can be obtained directly when the user clicks or presses a button for ending the answer, or from the user's voice.
After the user's answer state is obtained, if the state is not ended, the process returns to acquiring the user answer step sequence of the test question, so as to continue giving prompting steps and assist the user in completing the answer.
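The state decision just described reduces to a threshold test. A sketch follows (the threshold value is an assumption, and the similarity measure is again a simple character-level stand-in for the models named above):

```python
from difflib import SequenceMatcher

def conclusion_similarity(a, b):
    # Character-level stand-in for the similarity models named in the text.
    return SequenceMatcher(None, a, b).ratio()

def answer_state(current_conclusion, standard_conclusion, threshold=0.7):
    """Return 'ended' when the user's conclusion is close enough to the
    standard conclusion, otherwise 'not ended' (keep acquiring steps)."""
    sim = conclusion_similarity(current_conclusion, standard_conclusion)
    return "ended" if sim >= threshold else "not ended"
```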
According to the method provided by this embodiment of the invention, after the user answer step sequence is obtained, the current answer conclusion is extracted from it, the answer state is determined based on the conclusion similarity between the current answer conclusion and the standard answer conclusion of the test question, and, when the answer state is not ended, the process returns to acquiring the user answer step sequence of the test question. These steps provide a more humanized auxiliary answering method: the answer state can be judged without requiring manual selection, which avoids the extra computational cost of a user accidentally triggering a prompt request after answering is finished, and allows subsequent services to be provided to the user promptly after answering, thereby optimizing the user experience.
Based on any of the foregoing embodiments, after determining the answer state, the method further comprises:
and under the condition that the answer state is finished, determining an answer result based on the conclusion similarity.
Specifically, when the answer state is completed, answer result judgment can be performed. 5 here, when judging the answer result, the conclusion of the answer state judgment can be prolonged
The similarity, it can be understood that, under the condition that the conclusion similarity is greater than a certain preset value, the answer result is determined to be correct, and under the condition that the conclusion similarity is less than or equal to a certain preset value, the answer result is determined to be wrong. It should be noted that, the preset value for judging the answer result is greater than the preset value for judging the answer state.
In addition, considering that the user answer step sequence does not necessarily conform completely to the standard answer step sequence of the test question, it may include answer steps that do not match the standard answer step sequence; such answer steps can be understood as incorrect steps in the process of answering the test question. Therefore, when determining the answer result, the step similarity between each step in the user answer step sequence and the standard answer steps can also be referred to. For example, when the similarity between an answer step in the user answer step sequence and a step in the standard answer step sequence is higher than a certain preset value, that answer step is determined to be correct; when the similarity is lower than the preset value, the answer step is determined to be wrong. In this way, by determining whether each answer step in the user answer step sequence is correct and combining this with whether the answer conclusion is correct, the answer result of the test question can be comprehensively evaluated.
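The per-step evaluation can be sketched as below — a minimal illustration assuming, again, `difflib` string similarity in place of the unspecified step similarity measure, and a hypothetical per-step threshold of `0.7`:

```python
from difflib import SequenceMatcher

STEP_THRESHOLD = 0.7  # hypothetical per-step similarity cut-off

def step_similarity(a: str, b: str) -> float:
    """String-based stand-in for the step similarity measure."""
    return SequenceMatcher(None, a, b).ratio()

def grade_steps(user_steps, standard_steps):
    """Mark each user step correct if it closely matches some standard step."""
    graded = []
    for step in user_steps:
        best = max((step_similarity(step, s) for s in standard_steps),
                   default=0.0)
        graded.append((step, best > STEP_THRESHOLD))
    return graded

def overall_result(user_steps, standard_steps, conclusion_correct: bool) -> bool:
    """Comprehensive evaluation: every step correct AND conclusion correct."""
    graded = grade_steps(user_steps, standard_steps)
    return conclusion_correct and all(ok for _, ok in graded)
```

Note that requiring every step to be correct is only one possible way of "combining" the two signals; a weighted score over steps and conclusion would fit the text equally well.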
The method provided by the embodiment of the invention ensures the reliability of the obtained answer result. On this basis, test questions can subsequently be recommended according to the answering ability of the user, so that the reliability of subsequent operations is ensured.
At present, test question recommendation, and in particular the recommendation of scenario test questions, is realized by the following three methods. The first is to find other similar questions according to the description text of the current question and display a list of similar test questions to the user. The second is to find other questions covering the same or similar knowledge points according to the knowledge point information of the current question. The third is to mine the user's search and click behavior for two or more questions frequently viewed in sequence, establish association relationships between questions from these frequent behavior patterns, and display the resulting test question list to the user.
However, with the first or third method, since existing scenario test question resources are scarce, the effect of recommending scenario test questions of a similar type is not ideal. The second method recommends only questions of the same knowledge point, thereby losing the key point that scenario test questions are meant to examine, and contributes little to improving modeling ability.
In view of the above problems, the present invention further provides a method for recommending test questions, and fig. 3 is a schematic flow chart of the method for recommending test questions, as shown in fig. 3, where the method includes:
step 310, obtaining a prompting step in the answering process;
step 320, recommending test questions based on the prompting times and/or the step types of the prompting steps, wherein the step types are modeling steps or knowledge point steps;
the prompting step is determined from the standard answering step sequence based on the step similarity between the user answering step sequence and the standard answering step sequence of the test questions under the condition that the prompting request is received in the answering process.
Specifically, the prompting step in step 310 is determined from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question, in the case that a prompt request is received during the answering process. It can be understood that the prompting step reflects where the user got stuck during the answering process, i.e., the prompting step reflects the user's weak links in answering.
The prompting step itself has a step type attribute. For scenario test questions, the step type may be a modeling step or a knowledge point step. Here, a modeling step refers to a step associated with the question text of a scenario test question; such a step converts the scenario described in the question into mathematical symbolic language, for example "each layer of bricks forms an arithmetic sequence", thereby modeling the scenario test question. It can be understood that, besides modeling the scenario, solving a test question also requires applying knowledge points, i.e., the steps in the standard answer step sequence other than the modeling steps; these can accordingly be recorded as knowledge point steps, for example a specific arithmetic sequence formula.
In step 320, test questions may be recommended according to the number of prompts and/or the step types of the prompting steps. It can be understood that the number of prompts reflects how difficult the user found the answering: the fewer the prompts, the more easily the user answered and the better the user has mastered the abilities the test question examines; conversely, the more prompts, the more difficult the answering was and the weaker the mastery. In addition, the user's modeling ability and knowledge point mastery can be analysed from the step types of the prompting steps: if most prompting steps are modeling steps, the user's modeling ability is relatively weak; if most are knowledge point steps, the user's knowledge point mastery is relatively weak. On this basis, user ability analysis can be performed according to at least one of the number of prompts and the step types of the prompting steps, so as to recommend test questions better matched to the user's ability.
According to the method provided by the invention, through the prompting steps in the answering process, the recommendation of the test questions is performed based on the prompting times and/or the step types of the prompting steps, the modeling capability and the knowledge point answering capability of the user can be analyzed based on the behavior of the user in the answering process, the recommendation of the test questions is performed in a targeted manner, and the answering capability of the user is further effectively improved.
Based on any of the above embodiments, step 320 includes:
and recommending the test questions based on the prompting times and/or the step types of the prompting steps and the answering results of the test questions.
Specifically, the number of prompts and/or the step types of the prompting steps generally reflect the user's modeling ability and knowledge point solving ability. Whether the user obtains a correct answer result after following the prompting steps further reflects these abilities: a user who reaches a correct answer result with the help of the prompts has a more solid modeling and/or knowledge point solving ability than one who reaches an incorrect result. Therefore, test question recommendation can be performed by jointly judging the number of prompts, the step types of the prompting steps, and the answer result, for example by presetting priorities for these three judging conditions, or by assigning them different weights and recommending according to the final weighted score.
In this case, the recommendation may proceed as follows. When both modeling steps and knowledge point steps are prompted many times, the user's modeling ability and knowledge point mastery are judged to be relatively weak, so simple test questions related to the knowledge points in the scenario test question are recommended for basic training. When modeling steps are prompted many times but knowledge point steps few times, the user's knowledge point mastery is judged solid while the modeling ability is relatively weak, so simple scenario-related test questions are recommended to exercise the modeling ability. When modeling steps are prompted few times but knowledge point steps many times, the user's modeling ability is judged strong while the knowledge point mastery is relatively weak, so harder test questions related to the knowledge points are recommended to exercise the problem-solving ability. When both modeling steps and knowledge point steps are prompted few times or not at all, both the knowledge point mastery and the modeling ability are judged solid, so difficult scenario test questions are recommended to broaden the user's knowledge.
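The four cases above form a simple decision table; a minimal sketch follows, in which the cut-off between "few" and "many" prompts (here `threshold = 2`) and the recommendation labels are hypothetical choices not fixed by the text:

```python
def recommend(modeling_prompts: int, knowledge_prompts: int,
              threshold: int = 2) -> str:
    """Map prompt counts per step type to one of the four recommendation
    cases described above; `threshold` is a hypothetical cut-off."""
    weak_modeling = modeling_prompts >= threshold
    weak_knowledge = knowledge_prompts >= threshold
    if weak_modeling and weak_knowledge:
        return "simple knowledge-point questions (basic training)"
    if weak_modeling:
        return "simple scenario questions (exercise modeling)"
    if weak_knowledge:
        return "harder knowledge-point questions (exercise solving)"
    return "difficult scenario questions (broaden knowledge)"
```

In practice the counts would come from the prompting steps logged during answering, each labelled with its step type.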
Based on any of the above embodiments, the step of determining the step type includes:
And determining the step type of the prompting step based on the semantic relatedness between the question of the test question and the prompting step.
It can be understood that a modeling step is extracted from the question of the test question and converts it into mathematical symbolic language. Since the modeling step builds the bridge between the question and the mathematical symbolic language, it necessarily covers related descriptions in the question, and is therefore necessarily semantically related to the question. In other words, the greater the semantic relatedness between the question of the test question and the prompting step, the more likely the step type of the prompting step is a modeling step; the smaller the semantic relatedness, the more likely it is a knowledge point step. The semantic relatedness can be computed with a character string similarity algorithm, a deep-learning-based text similarity algorithm, or a supervised text similarity model.
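Of the three options listed, the string-similarity variant is the simplest to sketch. The example below is a minimal illustration assuming `difflib` as the string-similarity algorithm and a hypothetical relatedness threshold of `0.4`; a deployed system would more likely use one of the learned similarity models mentioned above:

```python
from difflib import SequenceMatcher

RELATEDNESS_THRESHOLD = 0.4  # hypothetical cut-off

def classify_step(question_text: str, prompt_step: str) -> str:
    """High textual overlap with the question suggests a modeling step;
    low overlap suggests a knowledge point step."""
    relatedness = SequenceMatcher(None, question_text, prompt_step).ratio()
    return "modeling" if relatedness > RELATEDNESS_THRESHOLD else "knowledge_point"
```

A modeling step such as "each layer of bricks forms an arithmetic sequence" shares most of its wording with the question stem and classifies as `modeling`, while a bare formula shares almost none and classifies as `knowledge_point`.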
Based on any of the above embodiments, the determining a step type of the prompting step based on the semantic relatedness between the question of the test question and the prompting step includes:
determining the step type of the prompting step by applying the semantic relatedness between the question of the test question and the prompting step, based on a step classification model;
the step classification model is obtained by training a full-quantity model based on sample situation test questions carrying step type labels, and the full-quantity model is obtained by training a language model based on the sample full-quantity test questions.
Specifically, a step classification model may be applied to compute the semantic relatedness between the question of the test question and the prompting step and to determine the step type on that basis. Considering that sample scenario test questions are difficult to obtain, and in order to train the step classification model with only a small number of them, the step classification model can be obtained by training the full-scale model on sample scenario test questions carrying step type labels.
The full-scale model is obtained by training a language model on sample full-scale test questions. Specifically, an open-source pre-trained BERT language model is further trained on the full question bank data to obtain the full-scale model. It can be appreciated that, compared with a general-domain language model, the full-scale model is better adapted to the context of test questions and extracts and analyses the semantic features of answer steps with more pertinence.
After the full-scale model is obtained, the step classification model can be trained on the basis of the full-scale model. Considering that the full-quantity model is suitable for the context of the test questions, the magnitude of the sample situation test questions carrying step type labels required by training the step classification model on the basis of the full-quantity model is greatly reduced, and the training of the step classification model can be realized only by a small quantity of sample situation test questions carrying step type labels.
In the process of training the step classification model on the basis of the full-scale model, a special character can be inserted before each sentence of the sample scenario question stem and before each answering step; the resulting sequence is fed into the full-scale model for semantic feature extraction, and the attention weight between the semantic features of the special characters of the question and of the answering step is taken as the semantic similarity between the question and the answering step, serving as the basis for judging whether they are associated. It can be appreciated that if there is an association between the question and the answering step, the answering step is considered a modeling step; otherwise it is considered a knowledge point step.
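The input construction described above can be sketched as follows. Only the preprocessing is shown — the attention-weight extraction itself requires a transformer model and is out of scope for a short example. The marker string `[STEP]` is a hypothetical choice; the text does not name the special character it uses:

```python
# Hypothetical special marker; the text does not specify the token used.
SPECIAL = "[STEP]"

def build_model_input(stem_sentences, answer_steps):
    """Insert a special character before each stem sentence and before each
    answering step, producing the sequence fed to the full-scale model.
    The model's attention weights between the SPECIAL positions are then
    read off as question/step semantic similarity."""
    pieces = [SPECIAL + " " + s for s in stem_sentences]
    pieces += [SPECIAL + " " + s for s in answer_steps]
    return " ".join(pieces)
```

Downstream, the embedding at each `SPECIAL` position summarizes the sentence that follows it, so the attention weight between two such positions is a natural sentence-pair similarity signal.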
Based on any of the above embodiments, fig. 4 is a schematic flow chart of an auxiliary answer and test question recommending method provided by the present invention, as shown in fig. 4, where the method includes:
Step 410, determining a test question to be solved;
step 420, obtaining a user answer step sequence of the test questions;
after step 420 is performed, the answer state of the current user needs to be judged, which may specifically be: firstly, a current answer conclusion is extracted from a user answer step sequence, and then, the answer state is determined based on the conclusion similarity between the current answer conclusion and the standard answer conclusion of the test question.
If the answer status is incomplete, determining whether a prompt request is received, if the prompt request is received, executing step 430, and if the prompt request is not received, executing step 440; if the answer status is complete, step 440 is performed.
Step 430, determining a prompting step from the standard answering step sequence based on the step similarity between the user answering step sequence and the standard answering step sequence of the test question, and prompting the answer based on the prompting step.
In step 430, in the case of receiving the prompt request, an answer prompt is performed based on the prompting step. The answer prompt can be a prompt for modeling steps, such as highlighting key information of the stem, and providing modeling steps lacking by the current user; the knowledge point step may be prompted, for example, to prompt a specific arithmetic series formula. After the answer prompt is made, step 420 may continue to be performed.
Step 440, analyzing the answering process based on the prompting times and/or the step types of the prompting steps and the answering result of the test questions.
Step 450, based on the answer process, the test question recommendation is performed.
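The core of step 430 in the flow above — matching the user's answered steps against the standard answer step sequence and prompting the next unanswered step — can be sketched as below. This is a minimal illustration assuming `difflib` string similarity as a stand-in for the step similarity measure and a hypothetical matching threshold of `0.7`:

```python
from difflib import SequenceMatcher

MATCH_THRESHOLD = 0.7  # hypothetical step-matching threshold

def next_prompt(user_steps, standard_steps):
    """Find the furthest standard step the user has already produced,
    then return the following standard step as the prompting step."""
    last_answered = -1
    for i, std in enumerate(standard_steps):
        if any(SequenceMatcher(None, u, std).ratio() > MATCH_THRESHOLD
               for u in user_steps):
            last_answered = i
    if last_answered + 1 < len(standard_steps):
        return standard_steps[last_answered + 1]
    return None  # every standard step is already covered
```

If the user has produced no steps yet, the first standard step is prompted; if all standard steps are covered, no prompt is needed and the flow proceeds to the result judgment.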
Based on any of the above embodiments, fig. 5 is a schematic structural diagram of an auxiliary answering device provided by the present invention, as shown in fig. 5, the device includes:
a determining unit 510 for determining the questions to be solved;
a step sequence obtaining unit 520, configured to obtain a user answer step sequence of the test question;
and an answer prompting unit 530, configured to determine, when a prompting request is received, a prompting step from the standard answer step sequence based on step similarity between the user answer step sequence and the standard answer step sequence of the test question, and perform answer prompting based on the prompting step.
According to the device provided by the embodiment of the invention, under the condition that the prompt request is received, the prompt step is determined from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question, and the answer prompt is carried out based on the prompt step, so that the situation that the user is directly given a complete standard answer is avoided, and the answer capability and thinking capability of the user can be exercised on the basis that the user is assisted to accurately complete the answer of the test question.
Based on any of the above embodiments, the answer prompting unit is specifically configured to:
determining a step which is answered from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question;
and determining the step arranged after the answered step in the standard answering step sequence as a prompting step.
Based on any of the above embodiments, the device further includes an answer state determining unit, where the answer state determining unit is specifically configured to:
extracting a current answer conclusion from the user answer step sequence;
and determining a answering state based on the conclusion similarity between the current answering conclusion and the standard answering conclusion of the test question, and returning to a user answering step sequence for acquiring the test question under the condition that the answering state is not ended.
Based on any of the above embodiments, the device further includes an answer result determining unit, where the answer result determining unit is specifically configured to:
and under the condition that the answer state is finished, determining an answer result based on the conclusion similarity.
Based on any of the above embodiments, fig. 6 is a schematic structural diagram of a test question recommending apparatus provided by the present invention, and as shown in fig. 6, the apparatus includes:
a prompting step obtaining unit 610, configured to obtain the prompting step in the answering process;
a test question recommending unit 620, configured to recommend test questions based on the number of prompts and/or the step types of the prompting steps, wherein the step types are modeling steps or knowledge point steps;
the prompting step is determined from the standard answering step sequence based on the step similarity between the user answering step sequence and the standard answering step sequence of the test questions under the condition that the prompting request is received in the answering process.
According to the device provided by the invention, through the prompting steps in the answering process, the recommendation of the test questions is performed based on the prompting times and/or the step types of the prompting steps, the modeling capability and the knowledge point answering capability of the user can be analyzed based on the behavior of the user in the answering process, the recommendation of the test questions is performed in a targeted manner, and the answering capability of the user is further effectively improved.
Based on any of the above embodiments, the test question recommending unit is specifically configured to:
and recommending the test questions based on the prompting times and/or the step types of the prompting steps and the answering results of the test questions.
Based on any of the above embodiments, the test question recommending unit is specifically configured to:
and determining the step type of the prompting step based on the semantic relatedness between the question of the test question and the prompting step.
Based on any of the above embodiments, the test question recommending unit is specifically configured to:
based on a step classification model, determining the step type of the prompting step by applying the semantic relatedness between the question of the test question and the prompting step;
the step classification model is obtained by training a full-quantity model based on sample situation test questions carrying step type labels, and the full-quantity model is obtained by training a language model based on the sample full-quantity test questions.
Fig. 7 illustrates a physical schematic diagram of an electronic device, as shown in fig. 7, which may include: processor 710, communication interface (Communications Interface) 720, memory 730, and communication bus 740, wherein processor 710, communication interface 720, memory 730 communicate with each other via communication bus 740. Processor 710 may invoke logic instructions in memory 730 to perform a method of assisting in answering questions, the method comprising: determining a test question to be solved; acquiring a user answer step sequence of the test questions; and under the condition that a prompt request is received, determining a prompt step from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question, and carrying out answer prompt based on the prompt step.
The processor 710 may also call logic instructions in the memory 730 to perform a question recommending method comprising: a prompting step in the process of obtaining answering questions; based on the prompting times and/or the step types of the prompting steps, recommending test questions, wherein the step types are modeling steps or knowledge point steps; the prompting step is determined from the standard answering step sequence based on the step similarity between the user answering step sequence and the standard answering step sequence of the test questions under the condition that the prompting request is received in the answering process.
Further, the logic instructions in the memory 730 described above may be implemented in the form of software functional units and may be stored in a computer readable storage medium when sold or used as a stand alone product. Based on this understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In another aspect, the present invention also provides a computer program product, where the computer program product includes a computer program, where the computer program can be stored on a non-transitory computer readable storage medium, and when the computer program is executed by a processor, the computer can execute the auxiliary answering method provided by the above methods, and the method includes: determining a test question to be solved; acquiring a user answer step sequence of the test questions; and under the condition that a prompt request is received, determining a prompt step from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question, and carrying out answer prompt based on the prompt step.
When the computer program is executed by the processor, the computer can also execute the test question recommending method provided by the methods, and the method comprises the following steps: a prompting step in the process of obtaining answering questions; based on the prompting times and/or the step types of the prompting steps, recommending test questions, wherein the step types are modeling steps or knowledge point steps; the prompting step is determined from the standard answering step sequence based on the step similarity between the user answering step sequence and the standard answering step sequence of the test questions under the condition that the prompting request is received in the answering process.
In yet another aspect, the present invention further provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, is implemented to perform the method of assisting in answering questions provided by the above methods, the method comprising: determining a test question to be solved; acquiring a user answer step sequence of the test questions; and under the condition that a prompt request is received, determining a prompt step from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question, and carrying out answer prompt based on the prompt step.
The computer program when executed by the processor is further implemented to execute the test question recommending method provided by the methods, and the method comprises the following steps: a prompting step in the process of obtaining answering questions; based on the prompting times and/or the step types of the prompting steps, recommending test questions, wherein the step types are modeling steps or knowledge point steps; the prompting step is determined from the standard answering step sequence based on the step similarity between the user answering step sequence and the standard answering step sequence of the test questions under the condition that the prompting request is received in the answering process.
The apparatus embodiments described above are merely illustrative, wherein the elements illustrated as separate elements may or may not be physically separate, and the elements shown as elements may or may not be physical elements, may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (12)

1. An auxiliary answering method is characterized by comprising the following steps:
determining a test question to be solved;
acquiring a user answer step sequence of the test questions;
and under the condition that a prompt request is received, determining a prompt step from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question, and carrying out answer prompt based on the prompt step.
2. The method of claim 1, wherein determining the prompting step from the standard answer step sequence based on step similarity between the user answer step sequence and the standard answer step sequence of the test question comprises:
Determining a step which is answered from the standard answer step sequence based on the step similarity between the user answer step sequence and the standard answer step sequence of the test question;
and determining the step arranged after the answered step in the standard answering step sequence as a prompting step.
3. The auxiliary answering method according to claim 1 or 2, wherein after the obtaining a user answer step sequence of the test question, the method further comprises:
extracting a current answer conclusion from the user answer step sequence;
and determining a answering state based on the conclusion similarity between the current answering conclusion and the standard answering conclusion of the test question, and returning to a user answering step sequence for acquiring the test question under the condition that the answering state is not ended.
4. The method for assisting in answering questions as claimed in claim 3, wherein the step of determining the answering state further comprises:
and under the condition that the answer state is finished, determining an answer result based on the conclusion similarity.
5. A test question recommending method, comprising:
a prompting step in the process of obtaining answering questions;
based on the prompting times and/or the step types of the prompting steps, recommending test questions, wherein the step types are modeling steps or knowledge point steps;
The prompting step is determined from the standard answering step sequence based on the step similarity between the user answering step sequence and the standard answering step sequence of the test questions under the condition that the prompting request is received in the answering process.
6. The method of claim 5, wherein the performing the recommendation of the test question based on the number of the prompting steps and/or the type of the prompting steps includes:
and recommending the test questions based on the prompting times and/or the step types of the prompting steps and the answering results of the test questions.
7. The method of claim 5 or 6, wherein the determining of the step type of the prompting step comprises:
and determining the step type of the prompting step based on the semantic relatedness between the question of the test question and the prompting step.
8. The method of claim 7, wherein the determining the step type of the prompting step based on the semantic relatedness between the question of the test question and the prompting step comprises:
determining the step type of the prompting step by applying the semantic relatedness between the question of the test question and the prompting step, based on a step classification model;
The step classification model is obtained by training a full-quantity model based on sample situation test questions carrying step type labels, and the full-quantity model is obtained by training a language model based on the sample full-quantity test questions.
9. An auxiliary answering apparatus, characterized by comprising:
a determining unit, configured to determine a test question to be answered;
an answering step sequence acquiring unit, configured to acquire a user answering step sequence of the test question;
an answer prompting unit, configured to, in a case where a prompting request is received, determine a prompting step from a standard answering step sequence of the test question based on step similarity between the user answering step sequence and the standard answering step sequence, and perform answer prompting based on the prompting step.
10. A test question recommending apparatus, characterized by comprising:
a prompting step unit, configured to acquire a prompting step in an answering process;
a test question recommending unit, configured to perform test question recommendation based on a number of promptings and/or a step type of the prompting step, wherein the step type is a modeling step or a knowledge point step;
wherein the prompting step is determined from a standard answering step sequence of a test question based on step similarity between a user answering step sequence and the standard answering step sequence, in a case where a prompting request is received during the answering process.
11. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the program, implements the auxiliary answering method according to any one of claims 1 to 4 or the test question recommending method according to any one of claims 5 to 8.
12. A non-transitory computer-readable storage medium having a computer program stored thereon, characterized in that the computer program, when executed by a processor, implements the auxiliary answering method according to any one of claims 1 to 4 or the test question recommending method according to any one of claims 5 to 8.
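The claims above describe selecting a prompting step by comparing the user's answering step sequence with a standard answering step sequence. The patent does not disclose an implementation; as a minimal illustrative sketch only, the token-overlap similarity, the threshold value, and the first-unmatched-step heuristic below are all placeholder assumptions standing in for the semantic step-similarity model the claims refer to:

```python
def step_similarity(a: str, b: str) -> float:
    """Jaccard overlap between the word sets of two solution steps.
    A placeholder for the learned semantic similarity in the claims."""
    ta, tb = set(a.split()), set(b.split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 1.0

def pick_prompt_step(user_steps, standard_steps, threshold=0.6):
    """Return the first standard step that no user step matches closely,
    i.e. the next step the user appears to be missing; None if the user
    has already covered every standard step."""
    for std in standard_steps:
        if all(step_similarity(std, u) < threshold for u in user_steps):
            return std
    return None
```

For example, if the standard sequence ends with a "solve the equation" step that the user's partial work never matches, that step is returned as the prompt; when every standard step is matched, no prompt is needed.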
CN202211742535.7A 2022-12-30 2022-12-30 Auxiliary answering method, device, electronic equipment and storage medium Pending CN116150319A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211742535.7A CN116150319A (en) 2022-12-30 2022-12-30 Auxiliary answering method, device, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116150319A true CN116150319A (en) 2023-05-23

Family

ID=86357618

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211742535.7A Pending CN116150319A (en) 2022-12-30 2022-12-30 Auxiliary answering method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116150319A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116340657A (en) * 2023-05-26 2023-06-27 深圳市菁优智慧教育股份有限公司 Test question searching method and device based on user behaviors
CN116340657B (en) * 2023-05-26 2023-08-29 深圳市菁优智慧教育股份有限公司 Test question searching method and device based on user behaviors

Similar Documents

Publication Publication Date Title
CN108829682B (en) Computer readable storage medium, intelligent question answering method and intelligent question answering device
Udagawa et al. A natural language corpus of common grounding under continuous and partially-observable context
CN108710704B (en) Method and device for determining conversation state, electronic equipment and storage medium
CN111177359A (en) Multi-turn dialogue method and device
CN110991645A (en) Self-adaptive learning method, system and storage medium based on knowledge model
CN111191450B (en) Corpus cleaning method, corpus input device and computer readable storage medium
CN111078856A (en) Group chat conversation processing method and device and electronic equipment
CN112084317A (en) Method and apparatus for pre-training a language model
CN116150319A (en) Auxiliary answering method, device, electronic equipment and storage medium
CN110942774A (en) Man-machine interaction system, and dialogue method, medium and equipment thereof
CN116821290A (en) Multitasking dialogue-oriented large language model training method and interaction method
CN110852071A (en) Knowledge point detection method, device, equipment and readable storage medium
CN112434953A (en) Customer service personnel assessment method and device based on computer data processing
CN116884282A (en) Question answering method, device, electronic equipment and storage medium
CN111325387B (en) Interpretable law automatic decision prediction method and device
CN115510213A (en) Question answering method and system for working machine and working machine
CN111680148B (en) Method and device for intelligently responding to question of user
CN115114404A (en) Question and answer method and device for intelligent customer service, electronic equipment and computer storage medium
CN114328864A (en) Ophthalmic question-answering system based on artificial intelligence and knowledge graph
CN113569112A (en) Tutoring strategy providing method, system, device and medium based on question
El Azhari et al. An Evolutive Knowledge Base for “AskBot” Toward Inclusive and Smart Learning-based NLP Techniques
CN111858863A (en) Reply recommendation method, reply recommendation device and electronic equipment
CN111488431B (en) Hit determination method, device and system
CN111858862A (en) Reply recommendation method, reply recommendation device and electronic equipment
CN114579606B (en) Pre-training model data processing method, electronic device and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination