WO2003050782A1 - Exercise setting system - Google Patents

Exercise setting system

Info

Publication number
WO2003050782A1
WO2003050782A1 PCT/JP2002/012811
Authority
WO
WIPO (PCT)
Prior art keywords
user
question
level
answer
exercise
Prior art date
Application number
PCT/JP2002/012811
Other languages
English (en)
Japanese (ja)
Inventor
Makoto Ito
Keisuke Takaishi
Original Assignee
Hogakukan Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hogakukan Co., Ltd. filed Critical Hogakukan Co., Ltd.
Priority to AU2002361082A priority Critical patent/AU2002361082A1/en
Priority to JP2003551760A priority patent/JPWO2003050782A1/ja
Publication of WO2003050782A1

Classifications

    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00 Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/02 Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student

Definitions

  • The present invention relates to a system for setting exercise questions that helps learners, such as candidates for various qualification examinations and university entrance examinations, to understand the content of their studies and to confirm their degree of understanding of what they have learned. Background art
  • To measure their understanding of content they have studied and to consolidate it, learners repeatedly solve exercise questions on completed material and take practice tests. In particular, a learner preparing for a qualification examination may solve collections of questions that imitate that examination, or questions actually asked in past sittings, so-called past questions, and then grade the answers using the answer key supplied with the collection.
  • The task of the present invention is to provide an exercise question setting system that sets questions suited to the learner's purpose, such as repeatedly practicing a range that has been completed or judging ability at a given point in time. Disclosure of the invention
  • A first invention comprises a display, a storage unit for storing user information and question information, an input unit for inputting information, and a processing unit for specifying questions based on the information input through the input unit and the question information in the storage unit.
  • the user information stored in the storage unit includes a capability level of a user.
  • the problem information includes a problem itself and attribute information of the problem.
  • The attribute information of a question includes the difficulty level of the question and a standard answer time set for each question, and the processing unit is characterized by a function of selecting, from among the questions corresponding to the user's ability level, questions such that the sum of their standard answer times falls within the specified desired exercise time.
  • The user's ability level may be determined in any manner: for example, automatically from the user's correct answer rate or from the grade on a specific group of questions, or by the user's own declaration. Any number of levels can be set.
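As a concrete illustration of the point above, the sketch below derives an ability level from a correct answer rate, falls back to a self-declared level when one is given, and defaults to level B when nothing is known. All function names and thresholds are assumptions for illustration, not values from the patent.

```python
# Hypothetical sketch: A = elementary, B = intermediate, C = advanced.
# Thresholds (0.8, 0.5) are illustrative assumptions only.

def ability_level(correct_rate=None, self_declared=None):
    if self_declared is not None:   # user declares the level directly
        return self_declared
    if correct_rate is None:        # no learning history stored yet
        return "B"                  # default initial level
    if correct_rate >= 0.8:
        return "C"
    if correct_rate >= 0.5:
        return "B"
    return "A"
```

The same function covers both determination routes the text mentions: automatic setting from past results and self-declaration.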
  • The second invention includes a user terminal, and the processing unit has a function of displaying the selected questions on the user terminal and calculating an answer rate and a correct answer rate for the answers input from the user terminal, and a function of setting the user's ability level based on the calculated answer rate and correct answer rate and storing it in the storage unit as user information.
  • The third invention is characterized in that the processing unit has a function of matching the number of selected questions to the desired number of questions when that number is input from the user's terminal.
  • In the fourth invention, the question ratio for each difficulty level according to the user's ability level is stored in the storage unit, and the processing unit is characterized by a function of selecting questions based on the question ratio corresponding to the user's ability level.
  • the fifth invention is characterized in that the question attributes include any one or more of a question type classification, a question range classification, and an importance level.
  • The above question range classification corresponds to the curriculum and resembles the table of contents of a textbook used by a user preparing for a specific examination.
  • A sixth aspect of the invention is characterized in that the processing unit determines the difficulty level of each question based on the average correct answer rate of all users for that question and stores the difficulty level as question information.
  • The seventh invention is characterized in that the range in which the correct answer rate of all users for each question is equal to or higher than a predetermined reference value is divided into a plurality of difficulty levels.
  • The eighth invention has confidence input means for inputting the user's confidence in an answer together with the answer itself, and is characterized in that the processing unit has a function of storing the answer in association with the confidence level in the storage unit.
  • the ninth invention is characterized in that the processing unit has a function of displaying, as a list, the correctness of each answer and the user's confidence as a user's exercise result.
  • the tenth invention is characterized in that the processing unit has a function of extracting a problem based on the degree of confidence.
  • In the eleventh invention, the storage unit stores trainer comments, each associated with a condition on the correctness of the answer and the confidence level, and the processing unit is characterized by a function of comparing the correctness and confidence levels in the user's exercise results against these conditions and extracting the corresponding trainer comment.
  • In the twelfth invention, the processing unit is characterized by a function of calculating and displaying, for each difficulty level, the accuracy rate, which is the ratio of the number of answers the user answered correctly with confidence to the total number of correct answers, as part of the user's exercise results.
  • The thirteenth invention is based on the fifth invention and is characterized in that questions with a low difficulty level are given a high importance level.
  • FIG. 1 is a diagram showing the overall configuration of the system of the first embodiment.
  • FIG. 2 is a flowchart showing a procedure for using the system of the first embodiment.
  • FIG. 3 is an answer time ratio table by difficulty level according to the first embodiment.
  • FIG. 4 is a diagram illustrating the overall configuration of the system according to the second embodiment.
  • FIG. 5 is a diagram showing a comment table of the second embodiment.
  • FIG. 6 is a flowchart showing a procedure for using the system of the second embodiment.
  • FIG. 7 is a diagram showing the configuration of the result display screen of the second embodiment.
  • FIG. 8 is a diagram showing a result table of the second embodiment.
  • FIG. 9 is a diagram showing an answer table of the second embodiment.
  • FIG. 10 is a diagram showing a question selection screen according to the second embodiment.
  • FIG. 11 is a diagram showing a question screen of the second embodiment.
  • FIG. 12 is a table showing the user's exercise results of the second example.
  • FIG. 13 is a table showing the answers and the degrees of confidence of the users of the second embodiment.
  • The system of the first embodiment shown in FIGS. 1 to 3 includes a management center 5 and a plurality of user-side terminals 1 connected via a communication network N such as the Internet.
  • the management center 5 is a provider of exercises, such as a school or a prep school.
  • the above-mentioned user is a member of the system and a learner.
  • the user-side terminal 1 is a personal computer or the like, and the user-side terminal 1 includes the display of the present invention and an input unit.
  • The management center 5 includes a question database 2 for storing question information, a user database 3 for storing user information, and a processing unit 4 for processing data.
  • The question database 2 and the user database 3 correspond to the storage unit of the present invention.
  • A server having the functions of the databases 2 and 3 and the processing unit 4 may be installed in the management center 5, but the processing unit 4 and the databases 2 and 3 need not be installed together in the management center 5. A server providing the processing unit 4 and servers providing the databases may be installed separately; in short, it suffices that the processing unit 4 and the databases 2 and 3 can exchange data with each other.
  • Alternatively, the user can use this system through an input unit and display connected directly to the processing unit 4 of the management center 5.
  • The question database 2 stores exercise questions, for example questions for various tests such as bar examinations, patent attorney examinations, university entrance examinations, and corporate training.
  • A question database 2 may be provided for each type of test, or all questions may be stored in a single database, each with an attribute identifying its test.
  • Each question is associated with the standard answer time required to solve the question, the difficulty level of the question, the importance level, the correct answer and the content description, and so on.
  • The standard answer time mentioned above may be determined from the times in which users of this system actually answered, or it may be the time within which the question must be solved in order to pass the actual test.
  • the difficulty level is a measure of how difficult it is for a user to get a correct answer.
  • The decision is based on the percentage of correct answers to the question: for example, a question with a high correct answer rate among all answerers can be given a low difficulty level, and one with a low correct answer rate a high difficulty level.
  • The importance level expresses how important a question is for improving the user's ability, and what counts as important can be set freely in the system: for example, it can be determined from the likelihood of the question appearing on the test as predicted from the frequency of past questions, from the difficulty of the question, from whether the question is basic or applied, or from various other criteria.
  • The difficulty or importance level helps the user learn efficiently. For example, a question of low difficulty can be answered correctly by almost anyone, so the user should at least be able to answer it, while a question of high difficulty that most people cannot answer may be deferred.
  • When learning time is limited, these levels can also serve as criteria for ordering the study, such as tackling the most important questions first.
  • the question attributes include a formal classification of the questions and a content classification.
  • The above-mentioned format classification concerns how questions are asked or answered: for example, a selection format in which the correct answer is chosen from several candidate answers, an ordering format in which sentences are sorted, a fill-in format in which blanks in the question are filled, and a counting format in which the number of correct (or incorrect) statements is answered.
  • The content classifications of questions include the type of examination, the subject, and ranges following the curriculum of each subject. Specifically, when the examination is the bar examination, the subjects are "Constitutional law", "Civil law", "Criminal law", and so on; the ranges within "Constitutional law" include "rights and duties of the people" and "the Diet"; and, more specifically, within "rights and duties of the people" there are "persons who enjoy human rights", "the duty to pay taxes", and "property rights".
  • The ranges and the above-mentioned format classifications can be set freely by the management center 5, but it is preferable to use classifications and expressions that are easy to understand for users studying for a specific test, for example the bar examination.
  • the user database 3 stores, as user information, learning history when the user has learned using the system of the present invention, in addition to basic attributes such as name, address, age, and gender.
  • This learning history can include when and which questions were set, the user's answers to those questions, and their correctness, correct answer rate, and score. The user's ability level is also stored.
  • A user ID and a password are set for each user, and the user is identified by this ID and password.
  • In step S1, the user accesses the home page of the exercise system from his or her terminal 1.
  • the user ID and the password are input (step S2).
  • In step S3, the processing unit 4 identifies the user in the user database 3 based on the input user ID and password, and determines whether an ability level is stored for that user. If it is, the process proceeds to step S7; if not, to step S4.
  • In step S4, the user terminal 1 displays a message indicating that no ability level is stored and prompting the user to self-declare one.
  • If the user indicates through the user-side terminal 1 that he or she will not self-declare, then in step S5 the processing unit 4 sets level B as the user's ability level and proceeds to step S7.
  • If the user self-declares his or her ability level in step S4, the process proceeds to step S6, where the declared level is entered, and then to step S7.
  • Instead, the processing unit 4 may set the initial level automatically; this initial level is of course not limited to B.
  • A, B, and C are set as the user ability levels, where A corresponds to the elementary level, B corresponds to the intermediate level, and C corresponds to the advanced level.
  • step S7 the user inputs a desired exercise time T.
  • the desired exercise time T is the time available for the user to solve this exercise.
  • In step S8, the processing section 4 refers to the answer time ratio table shown in FIG. 3.
  • the answer time ratio table in Fig. 3 is a table that determines the ratio of the difficulty levels in the questions to be asked according to the user's ability level.
  • the difficulty levels are set as a, b, and c, and a, b, and c are set in ascending order of difficulty.
  • The ratio assigned to each difficulty level is a ratio of answer time: from the user's desired exercise time and ability level, the answer time allotted to each difficulty level is calculated.
  • The question ratio per difficulty level is matched to the answer time here, but it may instead be matched to the number of questions. In that case, a question number ratio table is used in place of the answer time ratio table of Fig. 3; for example, for a user of ability level C, three out of ten questions may be of difficulty a, five of difficulty b, and two of difficulty c.
  • In step S9, questions corresponding to the per-difficulty answer times Ta, Tb, and Tc described above are extracted. That is, from the questions of difficulty level a, questions are extracted whose standard answer times sum to approximately the answer time Ta for that difficulty level; if no combination makes the total exactly equal to the Ta calculated in step S8, questions are extracted so that the total of the standard answer times is at most Ta.
  • Any method may be used to extract the questions, such as a random number table, but it is preferable not to set the same question for the same user twice; for that purpose, the numbers of the questions already set for each user must be stored. Questions of difficulty levels b and c are extracted in the same way, and together they are determined as the set of questions.
  • the total standard answer time of the determined question is within the user's desired exercise time T.
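The extraction of step S9 can be sketched as a simple randomized greedy pick: shuffle the unseen questions of one difficulty level, then keep adding questions while the running total of standard answer times stays within that level's budget Ta. The data shape (question id, standard answer time) and the helper name are assumptions for illustration.

```python
import random

# Hedged sketch of step S9 for one difficulty level. 'pool' is an assumed
# list of (question_id, standard_answer_time) pairs; 'already_asked' holds
# ids previously set for this user, so the same question is not repeated.

def extract_questions(pool, budget, already_asked):
    candidates = [q for q in pool if q[0] not in already_asked]
    random.shuffle(candidates)       # any selection method would do
    chosen, total = [], 0
    for qid, t in candidates:
        if total + t <= budget:      # keep total standard time <= budget
            chosen.append(qid)
            total += t
    return chosen, total
```

Running this once per difficulty level with budgets Ta, Tb, Tc keeps the grand total of standard answer times within the user's desired exercise time T, as the text states.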
  • In step S10, the number of questions set this time is added to the number set for the user in the past to obtain a cumulative total.
  • In step S11, the processing unit 4 causes the user-side terminal 1 to display the questions. From this point the user starts solving them, and the processing unit 4 starts measuring time in step S12.
  • the time limit is the sum of the standard answer times for the questions, but this time is within the user's desired exercise time T.
  • step S13 the user solves the problem, and writes the answer in the answer field on the display of the user-side terminal 1.
  • In step S14, the processing unit 4 determines whether the time limit has been reached. If it has, the process proceeds to step S16 and the questions are erased from the display of the user terminal 1. If not, the process proceeds to step S15, where it is determined whether the user has submitted the answers. Users can submit within the time limit: if the submit button shown on the display is clicked, the process proceeds to step S16 and the questions disappear from the display; if it is not clicked, the questions remain displayed and answers can be entered until the time expires in step S14.
  • the display of the question is erased from the display of the user terminal 1 so that the question cannot be solved after a time-out or when the user selects the answer submission.
  • When the questions disappear from the user terminal 1 in step S16, the user's answers are input to the processing unit 4 in step S17.
  • In step S18, the processing unit 4 scores the answers and calculates the answer rate and the correct answer rate.
  • The answer rate is the ratio of the number of answered questions to the number of questions set, and the correct answer rate is the ratio of the number of correct answers to the number of questions set.
  • In step S19, it is determined whether the answer rate is 90% or more. If it is less than 90%, the process proceeds to step S20 and the user's ability level is demoted, for example from level C to B or from B to A.
  • Reviewing the user's ability level based on the answer rate has the following rationale.
  • A low answer rate means there were unanswered questions: the user may have judged a question unsolvable at a glance and skipped it, may have run out of time after spending too long on other questions, or may have lost motivation along the way and submitted before the time limit. In any case, it can be judged that the user's ability as an examinee is insufficient.
  • the above 90% criterion should be set to an appropriate value according to the learning content and the passing level of the examination.
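Steps S18 to S20 can be condensed into a small sketch: compute the two rates, then demote one level if the answer rate misses the 90% criterion. The function and table names are illustrative assumptions.

```python
# Sketch of steps S18-S20. The answer rate is answers given / questions set,
# the correct answer rate is correct answers / questions set, and a user
# whose answer rate falls below the criterion is demoted one level
# (C -> B, B -> A; A is already the lowest level).

DEMOTE = {"C": "B", "B": "A", "A": "A"}

def score(num_questions, num_answered, num_correct, level, criterion=0.9):
    answer_rate = num_answered / num_questions
    correct_rate = num_correct / num_questions
    if answer_rate < criterion:      # step S19/S20: demote the ability level
        level = DEMOTE[level]
    return answer_rate, correct_rate, level
```

As the text notes, the 90% criterion is just a parameter and should be tuned to the learning content and the passing level of the examination.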
  • The process then proceeds from step S20 to step S23, where the results, such as the demoted ability level, the scoring result, the correct answer rate, and the answer rate, are displayed.
  • If the answer rate is 90% or more, in step S21 it is determined whether the cumulative number of questions has reached a predetermined value. If it has not, the process proceeds to step S23 to display the results, including the current ability level.
  • If the cumulative number has reached the predetermined value, the process proceeds to step S22 to review the user's ability level.
  • In step S22, the user's ability level is reviewed based on the previously calculated correct answer rate. Correct answer rates qualifying for levels B and C are set in advance, and the ability level is set from the current level together with the accuracy rate of this exercise: when a user of ability level A reaches the predetermined rate, level B or C may be recognized, and conversely a user of level B or C whose rate falls below the predetermined value may be set back to level A. In this step the ability level may thus be promoted or demoted, and the level set here may change again depending on the results of subsequent exercises.
  • The ability level is reconsidered in light of the correct answer rate only when the cumulative number of questions has reached the predetermined value in step S21. This prevents the level from fluctuating: if the level were raised immediately after a single very high correct answer rate, the next, more difficult set of questions would immediately lower the rate again. The ability level is therefore reset only for users who have practiced to some extent.
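The gated review of steps S21 and S22 can be sketched as follows. The reference value and the promotion/demotion thresholds are illustrative assumptions, since the patent leaves them as freely settable parameters.

```python
# Hedged sketch of steps S21-S22: the ability level is reconsidered only
# after the cumulative number of questions reaches a reference value, so a
# single unusually good session does not bounce the level up and down.

def review_level(level, correct_rate, cumulative, reference=100):
    if cumulative < reference:
        return level                 # step S21: not enough practice yet
    if correct_rate >= 0.8:          # assumed qualifying rate for promotion
        return {"A": "B", "B": "C", "C": "C"}[level]
    if correct_rate < 0.5:           # assumed rate triggering demotion
        return {"C": "B", "B": "A", "A": "A"}[level]
    return level
```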
  • the reference value of the cumulative question value in step S21 described above can be freely set by the system, or may not be provided.
  • In step S23, the processing unit 4 displays the result of this exercise on the user-side terminal 1, and in step S24 it stores all of the input data and calculation results from the above steps as user information.
  • When and which exercise questions were set and how they were answered, together with the answer rate and correct answer rate, are stored in each user's user information.
  • As described above, the user can perform exercises suited to his or her level, in a test format, within the time the user desires.
  • The learning history data stored in the user database 3, such as each user's results, can also serve as reference material when the user receives examination counseling.
  • Each question can also be stored with the correct answer rate for that question. This accuracy rate is based on the answers of all users who have given the question. Since the difficulty level of the question can be measured based on the correct answer rate, for example, the correct answer rate may be calculated periodically and the difficulty level of the question may be reset. In this way, you can set the difficulty level that matches the actual situation.
  • In the embodiment described above, the user inputs only the desired exercise time, but the user may also input other requests, such as the classification of questions and the number of questions.
  • These can also be used as extraction conditions for questions to be set.
  • Any attribute stored in association with a question as a question attribute, such as the question range, the question format, or the importance level, can be used as a question extraction condition.
  • Such desired conditions of the user may be input at any point before the questions are displayed in step S11.
  • In the second embodiment, the management center 5 is further provided with a comment database 6.
  • the comment database 6 stores comments extracted by the processing unit 4 using the user's grade as an extraction condition.
  • the same components as those in the first embodiment of FIG. 1 are denoted by the same reference numerals.
  • The above-mentioned confidence level is the degree of confidence with which the user selected the answer.
  • Let ○ denote the confidence level when answering with certainty that the answer is correct, △ the level when selecting an answer thought to be probably correct, and × the level when answering by guesswork. The user inputs one of these confidence levels in association with each answer.
  • the degree of confidence is not limited to three levels as described above, but can be set arbitrarily.
  • Examples of the comments recorded in the comment database 6 are shown in comment table 7 of Fig. 5. As described in detail later, the display comment in the right-hand comment column 7a is stored in the comment database 6 in association with the selection condition described in the left-hand condition column 7b. The processing unit 4 then extracts the comment that meets the conditions, based on the data in comment table 7, and causes the user-side terminal 1 to display it.
  • In step S101, the user accesses the home page of the exercise system from his or her own terminal 1.
  • Steps S101 to S106 are the same as steps S1 to S6 of the first embodiment (see FIG. 2), and a description thereof will be omitted.
  • In step S107, the processing unit 4 of the management center 5 displays the user's past results on the user-side terminal 1 as the results display screen 8 of FIG. 7.
  • the grade display screen 8 is provided with a training grade section 9 for displaying the previous exercise grade, a comment section 10 from the trainer, and an answer section 11 for the questions set in the previous exercise.
  • In the exercise results section 9, the results table 12 shown in FIG. 8 is displayed.
  • This grade table 12 shows the results of compiling the grades according to the importance of the questions.
  • the importance of the problem corresponds to the difficulty level of the present invention.
  • Five importance levels a to e are set, and an importance column 12a is provided at the top of the results table 12.
  • The lower the difficulty level of a question, the higher its importance: a question of importance a is the most important and has the lowest difficulty.
  • Such settings can be made freely in the system, and the importance may also be set based on criteria other than difficulty.
  • Questions of low difficulty were given high importance for the following reason.
  • In selective examinations, failing to answer an easy question that many candidates answer correctly is fatal for a candidate. Therefore, before studying difficult questions that hardly anyone can solve, it was considered important to master the less difficult ones.
  • In other words, the importance setting of the second embodiment is based on the idea that answering low-difficulty questions correctly and reliably matters more than answering high-difficulty questions unreliably.
  • This system therefore aims to develop the ability to give stable, correct answers, starting preferentially from the less difficult questions.
  • The importance of each question is determined from the correct answer rate of all users of this system. For example, here a question of importance a has a correct answer rate of 80% or more; importance b, 60% or more and less than 80%; importance c, 50% or more and less than 60%; importance d, 30% or more and less than 50%; and importance e, less than 30%.
  • Questions classified according to the correct answer rate of all users in this way are treated as questions of the corresponding importance.
  • The correct answer rate of all users used to set each importance level is displayed in column 12b, provided below the importance column 12a. This rate is the ratio of the number of correct answers to the number of all users who attempted the question.
  • The correct answer rate is recalculated each time results are evaluated, and the importance of each question is determined from the recalculated rate. The importance of a question may therefore change over time, but in practice the variance decreases as the number of answers increases.
  • Columns 12d, 12e, and 12f display the breakdown of correct answers by confidence level.
  • The value displayed in the ○ rate column 12d is the ratio of the number of answers answered correctly with confidence ○ to the total number of correct answers; the △ rate is the corresponding ratio for confidence △, and the × rate the ratio for confidence ×.
  • Each is calculated per importance level of the questions.
  • The ○ rate is the rate of correct answers given with certainty and corresponds to the accuracy rate of the present invention. A user can judge that real ability has been gained not only from the overall correct answer rate but also from a high ○ rate.
  • The △ rate and the × rate are the proportions of answers that were correct despite a lack of confidence, so high values there indicate that the user should examine the cause of the lack of confidence and work to gain it.
  • In the comment section 10, the processing unit 4 displays the comment extracted from comment table 7 in FIG. 5.
  • A rank such as "a rank" in the comment table refers to the importance of the question.
  • The answer time is the standard answer time set in this system; exercising at 80% or 90% of the time means practicing with a time limit shorter than the standard answer time.
  • the answer table 13 shown in FIG. 9 is displayed in the answer column 11 of the grade display screen 8 in FIG.
  • This answer table 13 has a question number column 13a, a correct/incorrect column 13b showing whether the user's answer to each question was right, an answer column 13c showing the user's answer, and a correct answer column 13d showing the correct answer. Furthermore, clicking the display position of a question number opens a window (not shown) with an explanation of that question.
  • The results display screen 8 described above need not, of course, always be shown; in particular, it can be suppressed when the number of questions actually solved is small.
  • This system lets the user practice even when only a short time is available; however, if the number of answers is too small, totals and evaluations computed from them are not meaningful.
  • For this reason, a minimum number of answers for evaluation is set in advance. Once the total number of answers exceeds this minimum, past exercise results are evaluated over a window that looks back from the latest answer until at least the minimum number of answers is covered.
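The minimum-answer rule above can be sketched as a simple lookback window over a chronological answer history; the function name and data shape are assumptions for illustration.

```python
# Hedged sketch: grades are evaluated only once the total number of answers
# reaches a preset minimum, and then over the most recent answers, looking
# back from the latest until the minimum is covered.

def grading_window(history, minimum):
    if len(history) < minimum:
        return None                  # too few answers to evaluate yet
    return history[-minimum:]        # latest answers covering the minimum
```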
  • When the past grades described above are displayed in step S107 of FIG. 6, the user refers to the results in step S108 and selects the next exercise to perform.
  • In step S109, the question selection screen 14 shown in FIG. 10 is displayed, and the user inputs the necessary items on it.
  • The question selection screen 14 is provided with a plurality of input fields for entering the desired conditions for selecting questions: a time field 14a for entering the time available for practice, a desired question number field 14b for entering the number of questions desired, an ability level field 14c that displays the user's ability level, a question type field 14d for entering the question format described in the first embodiment, a time ratio field 14e for entering the answer time ratio, a confidence level field 14f, and an answer result field 14g.
  • The user's ability level is automatically shown in the ability level field 14c; since it is merely displayed on the screen, it is not a desired-condition input field. However, the user may be allowed to modify the ability level from this screen.
  • The time ratio field 14e is used when setting the time limit for the exercise as a percentage of the standard time.
  • The confidence level field 14f is for entering a confidence level when the confidence data the user attached to each answer in previous exercises is used as a question extraction condition.
  • The field 14h is for entering the importance of the questions the user wants to practice.
  • In step S110, the processing unit 4 takes the items entered on the question selection screen 14 and the user's exercise history stored in the user database 3 as extraction conditions, and identifies matching questions in the question database 2. Then, in step S111, the cumulative number of questions is calculated.
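Combining these conditions with the standard answer times (per the abstract, questions are chosen so that the total of the standard answer times fits within the user's available time), step S110 could be sketched roughly as follows. The field names and the simple greedy selection are assumptions for illustration.

```python
# Hypothetical sketch of step S110: filter the question pool by the entered
# conditions, then accept questions at the user's ability level while the
# running total of standard answer times stays within the available time.
def select_questions(pool, level, available_sec, importance=None):
    chosen, used = [], 0
    for q in pool:
        if q["level"] != level:
            continue  # only questions at the user's ability level
        if importance is not None and q["importance"] != importance:
            continue  # optional importance condition
        if used + q["std_time"] > available_sec:
            continue  # would exceed the time the user entered
        chosen.append(q)
        used += q["std_time"]
    return chosen
```

A real implementation would also draw on the exercise history (e.g. confidence levels attached to previous answers) as extraction conditions.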
  • For each question number there are a question display column 15a, an answer display column 15b, an answer selection column 15c, and a confidence input column 15d.
  • Reference numerals 16 and 17 denote check boxes with which the user marks the selected answer and the degree of confidence.
  • In step S113, the processing unit 4 starts time measurement and displays the question screen 15, then waits until the time runs out in step S115 or the answers are submitted in step S116. Meanwhile, in step S114, the user solves the questions and enters his or her confidence. In step S117, the display of the questions is stopped and the exercise ends. In step S118, the answers and confidence levels selected by the user are input to the processing unit 4.
  • In step S119, scoring is performed and the correct answer rate is calculated.
  • The scoring judges the correctness of the answers in the current exercise, but if the number of questions this time is small, the correct answer rate may also be calculated including the answer results of past exercises.
  • In step S120, it is determined whether the correct answer rate is 90% or more; if it is less than 90%, the process proceeds to step S121, where the ability level is demoted, and then to step S125.
  • If the rate is 90% or more, step S122 determines whether the cumulative number of questions has reached the predetermined value; if not, the process goes to step S125, and if it has, to step S123.
  • In step S123, the ability level is reviewed according to the correct answer rate and updated if necessary.
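Steps S120 to S123 could be sketched as follows. The patent specifies only the 90% demotion threshold; the promotion rule shown here (promote when the rate is at least 95% at the review point) is purely an assumption.

```python
# Hypothetical sketch of the level update in steps S120-S123.
# Only the 90% demotion threshold comes from the text; the promotion
# criterion at the review point is an assumed placeholder.
def update_level(level, correct_rate, cumulative, quota, promote_rate=0.95):
    if correct_rate < 0.90:
        return max(1, level - 1)   # S121: demote (floor at level 1 assumed)
    if cumulative >= quota and correct_rate >= promote_rate:
        return level + 1           # S123: review and update if warranted
    return level                   # no change
```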
  • In step S124, a grade table is created.
  • The table created here is the grade table 12 shown in FIG. 8, including the results of this exercise.
  • In step S125, the answer table 13 shown in FIG. 9 and the comments are extracted.
  • The condition for extracting the No. 1 comment in FIG. 5 is that N1 or more (correct → incorrect) transitions occur among a-rank questions.
  • That is, for N1 or more questions of importance a, the current answer was incorrect even though the previous answer was correct.
  • For this purpose, data such as that shown in FIG. 12, stored in the user database 3 as the exercise history, is used.
  • Table 18 in FIG. 12 tabulates the results of a particular user's exercises, associating each question number with the importance of the question, the correctness of the user's answers, and the confidence levels.
  • The previous column 18b and the before-previous column 18c show the results of the previous exercise and of the one before that.
  • Here, "the two previous times" means the most recent exercise before this one and the exercise before that.
  • The answer results in the previous column 18b correspond to questions set in the same exercise, whereas the results in column 18c do not necessarily correspond to questions set in the same exercise.
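The rows of Table 18 could be represented with a record like this sketch. The field names are assumptions; only the associations themselves (question number, importance, correctness this time, the previous time, the time before that, and confidence) come from the text.

```python
# Hypothetical sketch of one row of the exercise-history table (Table 18).
from dataclasses import dataclass
from typing import Optional

@dataclass
class HistoryRow:
    question_no: int
    importance: str                      # e.g. "a", "b", "c"
    this: Optional[bool] = None          # correctness this time
    prev: Optional[bool] = None          # column 18b: previous exercise
    before_prev: Optional[bool] = None   # column 18c: may be a different exercise
    confident: Optional[bool] = None     # confidence attached to the answer

row = HistoryRow(question_no=7, importance="a", this=False, prev=True)
```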
  • When the processing unit 4 has created the grade table 12 and the answer table 13 and extracted the comments, it displays them on the user terminal 1 in step S126.
  • The result display screen that shows these is almost the same as the grade display screen 8 in FIG. 7, except that the screen displayed here shows the results of "this time", whereas FIG. 7 displays past results.
  • When the level is demoted in step S121, and when the cumulative number of questions has not reached the predetermined value in step S122, a new grade table 12 of FIG. 8 is not created. Even in such cases, however, the answer table 13 (see FIG. 9) and the comments are displayed. If the ability level has been updated, that fact may also be displayed as a comment.
  • Table 19 shows at a glance, for 50 questions and regardless of the importance of each question, the number of questions answered with confidence, the number of correct answers, the number of incorrect answers, and the number of questions answered without confidence. Not only the user but also an advisor providing study consultation can give guidance using such data.
  • In step S127, all the results are stored in the user database 3, and the process ends.
  • In this embodiment, the difficulty level of a question corresponding to its importance is determined based on the correct answer rate of all users, so the difficulty level and the importance reflect the users' actual abilities.
  • Questions with a total correct answer rate of 50% or more, that is, questions that more than half of the users can answer correctly, are set as medium difficulty, and questions with higher correct answer rates are assigned to a plurality of difficulty levels a to c.
  • The total correct answer rate of 50% corresponds to the preset reference value of the present invention. By dividing the medium and easier levels into multiple steps in this way, the user can first master the questions that are less difficult than the medium level and then work up through each level step by step.
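The 50% reference value above might translate into level assignment like this sketch. The cut-offs between levels a, b, and c are assumptions; only the 50% medium boundary comes from the text.

```python
# Hypothetical sketch: assign a difficulty level from the correct-answer
# rate of all users. Only the 50% medium boundary (the preset reference
# value) is from the text; the cut-offs among a-c are assumptions.
def difficulty_level(overall_correct_rate):
    if overall_correct_rate < 0.50:
        return "hard"   # fewer than half of all users answer correctly
    if overall_correct_rate < 0.65:
        return "c"      # medium difficulty
    if overall_correct_rate < 0.80:
        return "b"
    return "a"          # highest correct-answer rate, i.e. least difficult
```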
  • The user terminal 1 remains connected to the processing unit 4 of the management center 5 from when the user accesses the homepage until the exercise results are displayed. However, once the processing unit 4 has determined the questions to be set, they may be downloaded to the user terminal 1 and the connection disconnected.
  • In that case, steps S12 to S16 in the flowchart of FIG. 2 and steps S113 to S117 in the flowchart of FIG. 6 are processed only by the user terminal 1.
  • Since the connection with the processing unit 4 is cut off while the exercise questions are being solved on the user terminal 1, there is the advantage that no communication cost is incurred during that time.
  • In this case, the management center 5 becomes unnecessary.
  • The user database 3 need only store the user data of the user corresponding to the terminal 1.
  • If the management center 5 manages the question database 2 as in the first and second embodiments described above, however, questions can be added and deleted and their attributes changed at the convenience of the management center 5 side, which simplifies maintenance.
  • The processing unit 4 can also aggregate data such as the results and levels of a plurality of users aiming for the same kind of examination.
  • The daily exercises may be performed only on the user's terminal 1, with the data periodically transmitted to the management center 5.
  • As described above, the user can at any time perform exercises suited to his or her level, in test format, within the time he or she has available. The user therefore does not need to prepare many question books or search for questions that suit his or her level. Because questions suited to one's level can be solved easily, learning, for example toward passing an examination, becomes more efficient.
  • The user can answer the exercise questions using the user terminal.
  • Since the ability level is set according to the user's answer rate and correct answer rate, an ability level appropriate to each user can be set.
  • The user can solve as many questions as desired within the time he or she has available.
  • Since the user's answer rate and correct answer rate are stored in the storage unit, they can be used as each user's learning history when the user receives consultation for an examination.
  • Questions can be selected according to the question format, the question range, and the importance level.
  • By specifying the question format, the user can exercise in the desired format.
  • By specifying the question range, the user can exercise according to his or her own learning pace.
  • Specifying the importance level enables more efficient exercises.
  • The difficulty level of the questions can be set according to the actual situation.
  • According to the seventh invention, questions can be set at a level that matches the user's ability level.
  • When mastering questions, users can step up gradually from attainable goals.
  • The user's confidence in each individual answer can be grasped. From the confidence and the answer, the user's weak points, learning attitude, or personality can be inferred, so this information can also serve as a reference when the user or an advisor considers how to proceed with learning.
  • The user can grasp the answers and the degrees of confidence at a glance.
  • Questions with a specific degree of confidence can be set repeatedly. For example, by practicing questions answered with low confidence and thereby increasing the user's confidence, stable, good results can be obtained.
  • According to the eleventh invention, a more useful comment can be given to the user in consideration of the user's confidence.
  • The correct answer rate with confidence can be understood; from this rate, the user can estimate how stable his or her current exercise results are.
  • The thirteenth invention is based on the idea that stably and correctly answering questions of low difficulty and high accuracy rate is more important than unstably answering questions of high difficulty, and importance can be set accordingly. As a result, users can be trained to give stable, correct answers, starting preferentially from the less difficult questions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

The invention relates to an exercise setting system for setting a number of revision or test exercises to be completed within the time available for learning. The system comprises a display device, storage units (2, 3) for storing information on users and questions, an input unit for entering information, and a processing unit (4) for specifying questions according to the information entered through the input unit and the question information in the storage unit. The user information stored in the storage unit includes the user's ability level, and the question information includes the question itself and the question's attribute information, the latter defining the difficulty level of each question and a standard answer time assigned to each question. The processing unit has a function of selecting questions, from among the questions at the user's ability level, such that the total of the standard answer times falls within the time available to the user.
PCT/JP2002/012811 2001-12-12 2002-12-05 Systeme de definition d'exercices WO2003050782A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU2002361082A AU2002361082A1 (en) 2001-12-12 2002-12-05 Exercise setting system
JP2003551760A JPWO2003050782A1 (ja) 2001-12-12 2002-12-05 演習問題出題システム

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2001-378705 2001-12-12
JP2001378705 2001-12-12
JP2002269520 2002-09-17
JP2002-269520 2002-09-17

Publications (1)

Publication Number Publication Date
WO2003050782A1 true WO2003050782A1 (fr) 2003-06-19

Family

ID=26625018

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2002/012811 WO2003050782A1 (fr) 2001-12-12 2002-12-05 Systeme de definition d'exercices

Country Status (3)

Country Link
JP (1) JPWO2003050782A1 (fr)
AU (1) AU2002361082A1 (fr)
WO (1) WO2003050782A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7345149B2 (ja) * 2020-03-31 2023-09-15 株式会社大阪教育研究所 課題レコメンドシステム

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04145481A (ja) * 1990-10-08 1992-05-19 Brother Ind Ltd 電子学習機
JPH0830187A (ja) * 1994-07-20 1996-02-02 Shiyuusui:Kk 自動採点装置と採点表示方法及びこれらに用いられる解答用紙
JPH10319826A (ja) * 1997-05-19 1998-12-04 Katsuichi Tashiro 教材管理システム
JPH11282826A (ja) * 1998-03-31 1999-10-15 Nippon Telegr & Teleph Corp <Ntt> インターネットを利用した電子教育システム
JP2000267554A (ja) * 1999-03-16 2000-09-29 Gakushu Kankyo Kenkyusho:Kk 学習支援装置および学習支援方法、並びに記録媒体
WO2001050439A1 (fr) * 1999-12-30 2001-07-12 Cerego Cayman, Inc. Systeme, appareil et procede de maximisation de la capacite et de l'efficacite d'apprentissage, de memorisation et de restitution des connaissances et des competences
JP2001318583A (ja) * 2000-05-11 2001-11-16 Kengo Ito 学習システム

Cited By (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005122079A (ja) * 2003-10-15 2005-05-12 Digital Boutique Inc 自己管理型e−ラーニングツール
SG119225A1 (en) * 2004-06-04 2006-02-28 Education Learning House Co Lt Method of multi-level analyzing personal learning capacity
WO2006000632A1 (fr) * 2004-06-24 2006-01-05 Nokia Corporation Evaluation des connaissances
JP2006023566A (ja) * 2004-07-08 2006-01-26 Matsushita Electric Ind Co Ltd 理解度判定装置および方法
WO2006093065A1 (fr) * 2005-03-02 2006-09-08 The Japan Institute For Educational Measurement, Inc. Dispositif de jugement de niveau d’apprentissage et programme de jugement de niveau d’apprentissage
JPWO2007066451A1 (ja) * 2005-12-09 2009-05-14 パナソニック株式会社 情報処理システム、情報処理装置および方法
WO2007066451A1 (fr) * 2005-12-09 2007-06-14 Matsushita Electric Industrial Co., Ltd. Systeme, dispositif et procede de traitement informatique
US7945865B2 (en) 2005-12-09 2011-05-17 Panasonic Corporation Information processing system, information processing apparatus, and method
US8382483B2 (en) 2006-06-21 2013-02-26 Panasonic Corporation Service providing system
JP2008026583A (ja) * 2006-07-21 2008-02-07 Yamaguchi Univ 適応型テストシステムとその方法
US8521271B2 (en) 2006-11-06 2013-08-27 Panasonic Corporation Brain wave identification method adjusting device and method
JP2008129032A (ja) * 2006-11-16 2008-06-05 Casio Comput Co Ltd 練習手順生成装置および練習手順生成処理プログラム
JP4742288B2 (ja) * 2006-11-16 2011-08-10 カシオ計算機株式会社 練習手順生成装置および練習手順生成処理プログラム
JP2010122254A (ja) * 2008-11-17 2010-06-03 Kankyo Keiei Senryaku Soken:Kk ユーザー教育システム
JP2011076407A (ja) * 2009-09-30 2011-04-14 Oki Electric Industry Co Ltd 帳票処理システム
WO2011110570A1 (fr) 2010-03-09 2011-09-15 Glaxosmithkline Biologicals S.A. Traitement des infections streptococciques
WO2011129048A1 (fr) * 2010-04-14 2011-10-20 株式会社ソニー・コンピュータエンタテインメント Serveur de support de jeu, dispositif de jeu, système de support de jeu et procédé de support de jeu
US9613540B2 (en) 2011-03-16 2017-04-04 Fujitsu Limited Examination support apparatus, and examination support method
KR20150007194A (ko) 2011-03-16 2015-01-20 후지쯔 가부시끼가이샤 시험 실시 지원 장치, 시험 실시 지원 방법 및 기억 매체
JP5686180B2 (ja) * 2011-03-18 2015-03-18 富士通株式会社 出題装置及び出題方法
US9536440B2 (en) 2011-03-18 2017-01-03 Fujitsu Limited Question setting apparatus and method
WO2012127580A1 (fr) * 2011-03-18 2012-09-27 富士通株式会社 Dispositif d'élaboration de questions et procédés d'élaboration de questions
JP5686183B2 (ja) * 2011-03-30 2015-03-18 富士通株式会社 出題装置及び出題方法
WO2012131948A1 (fr) * 2011-03-30 2012-10-04 富士通株式会社 Dispositif de fourniture de problèmes et procédé de fourniture de problèmes
US9711057B2 (en) 2011-03-30 2017-07-18 Fujitsu Limited Question setting apparatus and method
WO2013102966A1 (fr) * 2012-01-06 2013-07-11 Flens株式会社 Serveur d'aide à l'apprentissage, système d'aide à l'apprentissage et programme d'aide à l'apprentissage
JP2013142718A (ja) * 2012-01-06 2013-07-22 Flens Co Ltd 学習支援サーバ、学習支援システム、及び学習支援プログラム
US9672752B2 (en) 2012-01-06 2017-06-06 Flens Inc. Learning assistance server, learning assistance system, and learning assistance program
JP2015018096A (ja) * 2013-07-10 2015-01-29 正樹 後藤 集合教育方法及び、集合教育コンピュータシステム
JP2015121682A (ja) * 2013-12-24 2015-07-02 富士通株式会社 学習支援プログラム、学習支援装置および学習支援方法
JP2015197460A (ja) * 2014-03-31 2015-11-09 株式会社サイトビジット 情報処理装置、情報処理方法及びプログラム
JP2016177306A (ja) * 2014-11-28 2016-10-06 株式会社サイトビジット Eラーニングシステム
JP2016206619A (ja) * 2015-04-16 2016-12-08 RISU Japan株式会社 学習用教材が内蔵された電子出版物および該電子出版物を用いた学習支援システム
CN105139709A (zh) * 2015-09-24 2015-12-09 广西德高仕安全技术有限公司 一种安全培训考试试卷的生成系统及方法
JP6308482B1 (ja) * 2017-03-06 2018-04-11 弘道 外山 学習支援システム、学習支援サーバ、学習支援方法、学習支援プログラムおよび学習支援システムの運用方法
JP2018146799A (ja) * 2017-03-06 2018-09-20 弘道 外山 学習支援システム、学習支援サーバ、学習支援方法、学習支援プログラムおよび学習支援システムの運用方法
JP2019174580A (ja) * 2018-03-28 2019-10-10 株式会社High−Standard&Co. 教材データ作成プログラム及び教材作成方法
JP7195570B2 (ja) 2018-03-28 2022-12-26 株式会社High-Standard&Co. 教材データ作成プログラム及び教材作成方法
CN109086431A (zh) * 2018-08-13 2018-12-25 广东小天才科技有限公司 一种知识点巩固学习方法及电子设备
CN109086431B (zh) * 2018-08-13 2020-11-03 广东小天才科技有限公司 一种知识点巩固学习方法及电子设备
WO2020115171A1 (fr) 2018-12-06 2020-06-11 Glaxosmithkline Biologicals Sa Compositions immunogènes
WO2021064050A1 (fr) 2019-10-01 2021-04-08 Glaxosmithkline Biologicals Sa Compositions immunogènes
EP3799884A1 (fr) 2019-10-01 2021-04-07 GlaxoSmithKline Biologicals S.A. Compositions immunogènes
JP2021093064A (ja) * 2019-12-12 2021-06-17 株式会社ナレロー 情報処理方法、情報処理装置、記憶媒体、及びプログラム
JP2021092725A (ja) * 2019-12-12 2021-06-17 カシオ計算機株式会社 学習支援装置、学習支援方法、及びプログラム
JP7415520B2 (ja) 2019-12-12 2024-01-17 カシオ計算機株式会社 学習支援装置、学習支援方法、及びプログラム
JP7421207B2 (ja) 2019-12-12 2024-01-24 株式会社ナレロー 情報処理方法、情報処理装置、記憶媒体、及びプログラム
WO2022175423A1 (fr) 2021-02-22 2022-08-25 Glaxosmithkline Biologicals Sa Composition immunogène, utilisation et procédés

Also Published As

Publication number Publication date
AU2002361082A1 (en) 2003-06-23
JPWO2003050782A1 (ja) 2005-04-21

Similar Documents

Publication Publication Date Title
WO2003050782A1 (fr) Systeme de definition d'exercices
US9672752B2 (en) Learning assistance server, learning assistance system, and learning assistance program
US20060246411A1 (en) Learning apparatus and method
Wood Multiple choice: A state of the art report
US20090061408A1 (en) Device and method for evaluating learning
US20090239201A1 (en) Phonetic pronunciation training device, phonetic pronunciation training method and phonetic pronunciation training program
KR101031304B1 (ko) 전자매체를 기반으로 한 범용 학습 시스템 및 방법
JP7077533B2 (ja) 学習支援装置、学習支援システム、学習支援方法及びコンピュータプログラム
Shin et al. Evaluating different standard-setting methods in an ESL placement testing context
WO2018051844A1 (fr) Système de gestion, procédé de gestion et programme
JP2002221893A (ja) 学習支援システム
JP2004005322A (ja) 最適質問提示方法及び最適質問提示装置
Flannelly Using feedback to reduce students' judgment bias on test questions
JP3634856B2 (ja) 試験結果分析装置、方法およびプログラム
Brown et al. Functional assessment of immediate task planning and execution by adults with acquired brain injury
JP2002287608A (ja) 学習支援システム
JP6551818B1 (ja) 情報処理装置、及び、プログラム
JP2019053270A (ja) 学習力評価システム、学習力評価方法およびコンピュータプログラム
JP2020003690A (ja) 学習支援装置、方法、及びコンピュータプログラム
Kwon et al. Reading speed as a constraint of accuracy of self‐perception of reading skill
JP4940213B2 (ja) 電子黒板を利用した教育システム
KR20230136265A (ko) 엘로 점수 기반 개인화 맞춤 학습 시스템
JP2005331650A (ja) 学習システム、情報処理装置、情報処理方法およびプログラム
JP6313515B1 (ja) 学習力評価システム、学習者端末およびコンピュータプログラム
JP2005250423A (ja) 語学学習システム

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR IE IT LU MC NL PT SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2003551760

Country of ref document: JP

122 Ep: pct application non-entry in european phase