WO2021177413A1 - Report evaluation device, report evaluation method, and program - Google Patents

Report evaluation device, report evaluation method, and program

Info

Publication number
WO2021177413A1
Authority
WO
WIPO (PCT)
Prior art keywords
report
evaluation
description
result
unit
Prior art date
Application number
PCT/JP2021/008493
Other languages
English (en)
Japanese (ja)
Inventor
健太 佐々木
朋子 君島
麻未 梶井
鈴木 健一
Original Assignee
株式会社グロービス
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社グロービス filed Critical 株式会社グロービス
Publication of WO2021177413A1 publication Critical patent/WO2021177413A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education

Definitions

  • the present invention relates to a report evaluation device, a report evaluation method, and a program.
  • the present application claims priority based on Japanese Patent Application No. 2020-037109 filed in Japan on March 4, 2020, the contents of which are incorporated herein by reference.
  • The following scoring support device is known as a technology that supports user learning in an e-learning environment (see, for example, Patent Document 1). The scoring support device takes as input a text whose score is to be estimated, such as an answer to a descriptive question, and calculates the similarity between that input text and a preset reference text. The scoring support device further determines the relationship between the similarities calculated for a plurality of already-scored texts and the scores given to each of those texts. It then determines the score of the text to be scored based on that text's similarity and the determined relationship.
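  • The similarity-based approach of the scoring support device can be sketched as follows. This is a minimal illustration, not the patented method itself: the bag-of-words tokenizer and the nearest-neighbour mapping from similarity to score are assumptions made for the example.

```python
# Sketch of similarity-based score estimation (Patent Document 1 style).
# The tokenizer and the similarity-to-score mapping are illustrative
# assumptions, not the actual patented method.
import math
from collections import Counter

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Bag-of-words cosine similarity between two texts."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def estimate_score(answer: str, reference: str,
                   scored_examples: list[tuple[str, float]]) -> float:
    """Estimate a score from the relationship between similarity and score.

    scored_examples: (text, given_score) pairs; here the "relationship" is
    approximated by the nearest neighbour in similarity space.
    """
    target_sim = cosine_similarity(answer, reference)
    best = min(scored_examples,
               key=lambda ex: abs(cosine_similarity(ex[0], reference) - target_sim))
    return best[1]
```

A real system would replace the nearest-neighbour lookup with a fitted regression from similarity to score, but the data flow is the same.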
  • an object of the present invention is to enable accurate evaluation of a text having a sentence format.
  • One aspect of the present invention for solving the above problems is a report evaluation device including an evaluation unit that evaluates a report by using the result of extracting description portions satisfying evaluation criteria from a report written in sentence format in response to a predetermined problem, and a screen information generation unit that generates report screen information corresponding to a report screen on which at least the extraction result of the description portions by the evaluation unit is reflected in the description content of the report.
  • Another aspect of the present invention is a report evaluation method including an evaluation step of evaluating a report by using the result of extracting description portions satisfying evaluation criteria from a report written in sentence format in response to a predetermined problem, and a screen information generation step of generating report screen information corresponding to a report screen on which at least the extraction result of the description portions by the evaluation step is reflected in the description content of the report.
  • Still another aspect of the present invention is a program for causing a computer to function as an evaluation unit that evaluates a report by using the result of extracting description portions satisfying evaluation criteria from a report written in sentence format in response to a predetermined problem, and as a screen information generation unit that generates report screen information corresponding to a report screen on which at least the extraction result of the description portions by the evaluation unit is reflected in the description content of the report.
  • FIG. 1 shows an overall configuration example of the learning support system of the present embodiment.
  • the learning support system of the present embodiment evaluates the report submitted by the user U according to the theme given. That is, the user U in the present embodiment is a person (learner, report submitter) who creates a report using the user terminal device 100 and receives an evaluation of the created report.
  • the learning support system of the present embodiment includes a plurality of user terminal devices 100, a learning support device 200 (an example of a report evaluation device), and an instructor terminal device 300.
  • the user terminal device 100 is a terminal device used by each user U for learning.
  • the user terminal device 100 may be, for example, a personal computer, a tablet terminal, a smartphone, or the like.
  • The learning support device 200 is a device that provides learning support to the user U by communicating with each user terminal device 100 via the network NT. That is, the learning support device 200 provides learning courses to the user U by communicating with the user terminal devices 100.
  • the curriculum in the course provided by the learning support device 200 of the present embodiment includes report submission. That is, a certain theme (problem) for creating a report is presented to the user U.
  • The theme of the report may be presented, for example, on a web page displayed according to the progress of learning when the user U logs in to the learning-course website of the learning support device 200 from his or her user terminal device 100.
  • User U creates a report according to the presented theme and submits the created report.
  • The report may be created, for example, by the user writing text using the word processor function of the user terminal device 100.
  • the user submits the report by having the user terminal device 100 transmit the created report file to the learning support device 200 as the transmission destination.
  • the learning support device 200 can evaluate the transmitted report and transmit the evaluation result information indicating the evaluation result to the user terminal device 100.
  • the instructor terminal device 300 is a terminal device used by the instructor I.
  • the instructor terminal device 300 may be, for example, a personal computer, a tablet terminal, a smartphone, or the like.
  • the evaluation result of the report made by the learning support device 200 may be provided to the user U.
  • the instructor I confirms the evaluation result of the report made by the learning support device 200 by the instructor terminal device 300, and corrects the evaluation result by the learning support device 200 as necessary.
  • The instructor I then makes the final evaluation. In this case, the instructor I does not need to extract from scratch the description portions that satisfy the evaluation criteria; the evaluation can be performed by confirming the evaluation result produced by the learning support device 200 as described above, which makes the evaluation work more efficient.
  • FIG. 2 shows a configuration example of the learning support device 200.
  • the learning support device 200 shown in the figure includes a communication unit 201, a control unit 202, and a storage unit 203.
  • the communication unit 201 executes communication via the network NT.
  • the control unit 202 executes various controls in the learning support device 200.
  • the function as the control unit 202 is realized by executing a program by a CPU (Central Processing Unit) included in the learning support device 200.
  • the control unit 202 includes an evaluation unit 221, a screen information generation unit 222, and an evaluation result output unit 223.
  • the evaluation unit 221 evaluates the report by using the result of extracting the description portion satisfying the evaluation criteria from the report described in the sentence format corresponding to the predetermined task.
  • the screen information generation unit 222 generates report screen information corresponding to a report screen that can be displayed by reflecting at least the extraction result of the description portion by the evaluation unit 221 in the description content of the report.
  • the evaluation result output unit 223 outputs the evaluation result information for the report submitter in which the evaluation result by the evaluation unit 221 is reflected.
  • the storage unit 203 stores various information related to the learning support device 200.
  • the storage unit 203 includes a user information storage unit 231, a submission report storage unit 232, an evaluation utilization information storage unit 233, an evaluation result information storage unit 234, and a feedback comment storage unit 235.
  • the user information storage unit 231 stores information for each user as a learner who receives a learning course provided by the learning support device 200.
  • the submission report storage unit 232 stores the report submitted by the user U. That is, the submission report storage unit 232 stores the report transmitted from the user terminal device 100.
  • the report stored in the submitted report storage unit 232 is evaluated.
  • the evaluation utilization information storage unit 233 stores the information used by the evaluation unit 221 for the evaluation of the report.
  • the evaluation result information storage unit 234 stores evaluation result information indicating the evaluation result of the evaluation unit 221 for each submitted report.
  • the feedback comment storage unit 235 stores the feedback comment. Feedback comments will be described later.
  • FIG. 3 shows an example of evaluation utilization information stored in the evaluation utilization information storage unit 233.
  • The evaluation utilization information in the figure includes areas for the theme ID, the theme content, the questions, and the evaluation criteria corresponding to each theme.
  • the theme ID area includes the theme ID, which is an identifier assigned to the corresponding theme.
  • the theme content area stores the content of the corresponding theme (theme content).
  • the question area stores the questions. In this embodiment, one or more questions are provided corresponding to one theme.
  • the question area specifically stores information indicating the content of the question provided according to the theme.
  • the questions are presented with the theme, for example, to user U, who is the submitter of the report. The user creates a report under the presented theme, including the answers to the similarly presented questions.
  • the evaluation criteria area stores the evaluation criteria.
  • The evaluation criteria area specifically stores information indicating the contents of the evaluation criteria defined for the corresponding question. Taking the theme assigned the theme ID "T0001" in the figure as an example, the theme content is indicated by the data "0001.thm" stored in the theme content area. Three questions, "0011.set", "0012.set", and "0013.set", are provided for the theme content indicated by "0001.thm". The question "0011.set" has two evaluation criteria, "0001.kiz" and "0002.kiz"; the question "0012.set" has one evaluation criterion, "0003.kiz"; and the question "0013.set" has one evaluation criterion, "0004.kiz".
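  • One possible in-memory shape for this theme/question/criteria mapping is sketched below, using the example IDs from the description. The dictionary layout and the helper function are illustrative assumptions, not the actual stored format of the evaluation utilization information storage unit 233.

```python
# Illustrative in-memory shape for the evaluation utilization information of
# FIG. 3: each theme maps to its content and to questions, each question to
# its evaluation criteria. IDs follow the example in the description.
evaluation_utilization_info = {
    "T0001": {
        "theme_content": "0001.thm",
        "questions": {
            "0011.set": ["0001.kiz", "0002.kiz"],
            "0012.set": ["0003.kiz"],
            "0013.set": ["0004.kiz"],
        },
    },
}

def criteria_for_theme(theme_id: str) -> list[str]:
    """Collect every evaluation criterion defined under a theme."""
    theme = evaluation_utilization_info[theme_id]
    return [c for crits in theme["questions"].values() for c in crits]
```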
  • An example of a processing procedure executed by the learning support device 200 of the present embodiment in connection with the evaluation of a report will be described with reference to the flowchart of FIG. 4. Step S101:
  • the evaluation unit 221 inputs a report to be evaluated.
  • the evaluation unit 221 reads the report to be evaluated from the submission report storage unit 232 and inputs the read report.
  • Step S102 In the present embodiment, the report stored in the submitted report storage unit 232 is in the same format as when it is transmitted from, for example, the user terminal device 100.
  • the format as it is transmitted from the user terminal device 100 may not be suitable for evaluation.
  • the evaluation unit 221 converts the evaluation target report input in step S101 into a predetermined format suitable for evaluation.
  • Step S103 The evaluation unit 221 inputs the evaluation criteria corresponding to the theme of the report to be evaluated from the evaluation utilization information storage unit 233.
  • Step S104 The report is text written in natural language. The evaluation unit 221 therefore extracts description portions that satisfy the evaluation criteria input in step S103 (criterion-corresponding description portions) from the report to be evaluated. To do so, the evaluation unit analyzes the description contents of the report using natural language processing. As an example, the evaluation unit 221 may divide the character string of the report into words by morphological analysis and obtain a distributed representation for each word. The evaluation unit 221 may also perform contextual analysis, such as syntactic analysis and semantic analysis, on the character string of the report.
  • The evaluation unit 221 may derive items such as the content of the report, the number of characters, and the appropriateness of the vocabulary used (for example, ease of understanding) from the distributed representations of the words and the results of the various analyses obtained as described above.
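  • The analysis pipeline just described (tokenization, per-word distributed representations, derived items such as character count) might be sketched as follows. A real system would use a morphological analyzer (e.g. MeCab for Japanese) and trained embeddings; the whitespace tokenizer and hash-based pseudo-embeddings here are placeholders for illustration only.

```python
# Sketch of the report analysis step: tokenize, attach a distributed
# representation per word, and derive simple items such as character count.
# Tokenizer and embeddings are placeholders, not a real morphological
# analyzer or trained word vectors.
import hashlib

def tokenize(text: str) -> list[str]:
    # Placeholder for morphological analysis: whitespace split.
    return text.split()

def toy_embedding(word: str, dim: int = 4) -> list[float]:
    # Deterministic pseudo-embedding derived from a hash, standing in
    # for a trained distributed representation.
    digest = hashlib.sha256(word.encode()).digest()
    return [b / 255.0 for b in digest[:dim]]

def analyze_report(text: str) -> dict:
    """Return derived items and per-word vectors for a report."""
    words = tokenize(text)
    return {
        "char_count": len(text),
        "word_count": len(words),
        "vectors": [toy_embedding(w) for w in words],
    }
```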
  • The evaluation unit 221 analyzes the description contents of the report as described above, identifies the description portions that satisfy the conditions required by each evaluation criterion from the entire description of the report, and extracts the identified portions as the criterion-corresponding description portions.
  • the evaluation unit 221 may use a learning model in extracting the description portion related to the evaluation criteria in step S104.
  • An evaluation criterion may, for example, determine whether or not a specific item is mentioned in the report, such as "refers to XX".
  • A learning model that extracts the description portions corresponding to such evaluation criteria can be constructed by having a learning device take, as training data, a data set that pairs sentences (which may be simple or compound sentences) with the phrases, words, and the like indicating the items shown in a large number of evaluation criteria.
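  • As a stand-in for such a trained extraction model, the following sketch flags sentences by keyword overlap with a criterion's phrases. The sentence splitter and overlap test are simplifying assumptions; the embodiment would instead use a model (e.g. SVM, LSTM with attention, or BERT, as mentioned later) trained on sentence/phrase pairs.

```python
# Stand-in for the extraction model of step S104: split the report into
# sentences and flag those that "satisfy" a criterion. Here satisfaction is
# approximated by keyword overlap; a trained classifier would replace this.
import re

def split_sentences(report: str) -> list[str]:
    # Split on Western and Japanese full stops.
    return [s.strip() for s in re.split(r"[.。]", report) if s.strip()]

def extract_matching_parts(report: str, criterion_keywords: set[str]) -> list[str]:
    """Return the description portions (sentences) matching a criterion."""
    lowered = {k.lower() for k in criterion_keywords}
    hits = []
    for sentence in split_sentences(report):
        if set(sentence.lower().split()) & lowered:
            hits.append(sentence)
    return hits
```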
  • Step S105 The evaluation unit 221 scores the report to be evaluated as an evaluation.
  • The scoring may be performed as follows: the evaluation unit 221 gives a predetermined score for each evaluation criterion for which a description portion was extracted in step S104, and gives no score for each criterion for which no description portion was extracted. This is because an extracted description portion means that the content required by the criterion was described in the report, while the absence of one means that it was not.
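  • The all-or-nothing scoring rule of step S105 can be expressed directly; the criterion IDs and point values below are illustrative.

```python
# Scoring as in step S105: a criterion earns its predetermined score when a
# matching description portion was extracted, and zero otherwise.
# Criterion IDs and point values are illustrative assumptions.
def score_report(extraction_results: dict[str, list[str]],
                 points_per_criterion: dict[str, int]) -> dict[str, int]:
    """Map each criterion ID to its awarded score (full points or 0)."""
    return {
        crit: (points_per_criterion[crit] if parts else 0)
        for crit, parts in extraction_results.items()
    }
```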
  • the evaluation unit 221 may store the scoring result in step S105 in the evaluation result information storage unit 234 so as to be included in the evaluation result information corresponding to the report to be evaluated.
  • Step S106 The screen information generation unit 222 generates report screen information.
  • the report screen information is information used for displaying the report screen.
  • the report screen is a screen in which the extraction result of the description portion performed by the evaluation unit 221 is reflected in the description content of the report.
  • the report screen information may be a file described in a markup language such as HTML format.
  • the report screen information in this format can be displayed by opening it with a web browser, for example.
  • The evaluation unit 221 stores the generated report screen information in the storage unit 203. At this time, the evaluation unit 221 may store the report screen information in the evaluation result information storage unit 234 so that it is included in the evaluation result information corresponding to the report to be evaluated.
  • Step S107 The report screen information generated in step S106 and stored in the storage unit 203 is used by the instructor I to refer to the evaluation result by the learning support device 200 when evaluating the report.
  • the instructor I makes the instructor terminal device 300 access the learning support device 200, and causes the instructor terminal device 300 to display the report screen of the report to be evaluated.
  • In response, the evaluation result output unit 223 outputs (transmits) the report screen information of the report designated from the instructor terminal device 300 to the instructor terminal device 300.
  • the instructor terminal device 300 displays a report screen by using the transmitted report screen information by its own web browser function.
  • the instructor I may display the report screen of the report to be evaluated on the instructor terminal device 300 at any time after the report screen information is stored in the storage unit 203.
  • the report screen of the report to be evaluated may be displayed under the control of the learning support device 200 according to the arrival at a predetermined date and time according to the report evaluation schedule of the instructor I determined in advance.
  • FIG. 5 shows an example of the mode of the report screen displayed on the instructor terminal device 300.
  • The report screen includes the report area AR11 on the right side and the evaluation criteria area AR12 on the left side.
  • the report area AR11 is an area in which the description contents of the report created by the user U according to the presented theme are displayed.
  • the evaluation reference area AR12 is an area in which the evaluation criteria corresponding to the report displayed in the report area AR11 are shown.
  • In the evaluation criteria area AR12, a "default" column is arranged, and below it the "question" columns and the "evaluation criteria" columns provided under each question show the content of the corresponding question or criterion. In the evaluation criteria area AR12 in the figure, the content of "question 1" is shown in the "question 1" column, and the contents of the three evaluation criteria provided under "question 1" are shown in the "evaluation criteria 1-1", "evaluation criteria 1-2", and "evaluation criteria 1-3" columns.
  • radio buttons RB are arranged in each of the “default” column, the “question” column, and the “evaluation standard” column.
  • the radio button RB is used for highlighting the description portion related to the corresponding column in the description of the report displayed in the report area AR11.
  • the radio button RB in the "default” column is selected, the report description displayed in the report area AR11 is not highlighted.
  • the "default” column may be omitted in the evaluation reference area AR12.
  • the radio button RB in the "Question"("Question1",”Question2",”Question3" column is selected, the report description in the report area AR11 is based on the corresponding "Question”.
  • the corresponding description of the evaluation criteria is highlighted for each of the "evaluation criteria” provided in. Specifically, when the radio button RB in the "Question 1" column is selected, the three “evaluation criteria 1-1" provided under “Question 1" in the report description in the report area AR11. , “Evaluation Criteria 1-2", and “Evaluation Criteria 1-3” are highlighted. When the radio button RB in the "evaluation criteria” column is selected, the evaluation criteria corresponding description portion corresponding to the corresponding "evaluation criteria” is highlighted in the report description in the report area AR11.
  • the radio button RB in the "evaluation criterion 1-1" column is selected, the evaluation criterion corresponding description portion corresponding to "evaluation criterion 1-1" is highlighted in the report description in the report area AR11. Will be done.
  • the highlighting in the report area AR11 is performed by applying a line marker to the text portion as the evaluation standard corresponding description portion.
  • Line markers are color coded so that different colors are assigned to each corresponding "evaluation criterion".
  • In the figure, a state is shown in which three types of line markers MK (MK-1, MK-2, MK-3) having different colors are applied to the text.
  • These three line markers MK-1, MK-2, and MK-3 correspond, respectively, to the three criteria "evaluation criteria 1-1", "evaluation criteria 1-2", and "evaluation criteria 1-3" under "question 1" selected in the evaluation criteria area AR12.
  • highlighting may be performed by, for example, surrounding the description portion corresponding to the evaluation criteria with a frame line or underlining.
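  • Report screen information in a markup language such as HTML (as suggested for step S106) could mark the extracted portions roughly as follows. The colours, the `<mark>` styling, and the exact-substring replacement are simplifying assumptions made for this sketch.

```python
# Sketch of generating HTML report screen information: each extracted
# description portion is wrapped in a <mark> whose colour is keyed to its
# evaluation criterion, like the line markers MK-1..MK-3. Colours and the
# HTML structure are illustrative assumptions.
import html

CRITERION_COLORS = {"1-1": "#ffe08a", "1-2": "#a8e6a1", "1-3": "#a9d1ff"}

def render_report_html(report: str, highlights: dict[str, str]) -> str:
    """highlights maps an extracted text span to its criterion ID."""
    rendered = html.escape(report)
    for span, criterion in highlights.items():
        color = CRITERION_COLORS.get(criterion, "#eeeeee")
        rendered = rendered.replace(
            html.escape(span),
            f'<mark style="background:{color}">{html.escape(span)}</mark>')
    return f"<div class='report-area'>{rendered}</div>"
```

A viewer (web browser) can then display the highlighted report directly, matching the described behavior of opening the report screen information with a web browser.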
  • The instructor I can evaluate the report while checking the highlighted description portions by appropriately selecting the radio button RB of each "question" or "evaluation criteria" column.
  • When, in evaluating the report, the instructor I determines that a criterion-corresponding description portion extracted in step S104 is incorrect, the instructor I can make corrections such as canceling that description portion or associating another description portion with the evaluation criterion. That is, the instructor I can perform operations that correct the evaluation result. The instructor I can then give a final score while referring to, for example, the score given by the learning support device 200 in step S105.
  • Whereas the scoring for each evaluation criterion in step S105 gives, for example, one of two grades, "1" or "0", depending on correctness, the instructor I may give points in multiple grades.
  • The report screen may also show the score given by the learning support device 200 in step S105 so that the instructor I can refer to it immediately.
  • In this case, the scoring result for each evaluation criterion, the scoring result for each question, the integrated scoring result over all evaluation criteria, and the like may be presented.
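  • Presenting scores at these three granularities might look like the following sketch, reusing the illustrative question-to-criteria mapping from the FIG. 3 example.

```python
# Aggregating scores at several granularities: per criterion, per question,
# and integrated over all criteria. The question-to-criteria mapping is the
# illustrative one from the FIG. 3 example.
def aggregate_scores(scores: dict[str, int],
                     question_map: dict[str, list[str]]) -> dict:
    """Return per-criterion, per-question, and total scoring results."""
    per_question = {q: sum(scores[c] for c in crits)
                    for q, crits in question_map.items()}
    return {
        "per_criterion": scores,
        "per_question": per_question,
        "total": sum(scores.values()),
    }
```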
  • the instructor I can make corrections to the report screen displayed on the instructor terminal device 300.
  • the instructor terminal device 300 transmits information (change information) indicating the content changed by the modification to the learning support device 200 in response to the operation of modifying the report screen.
  • the evaluation unit 221 updates the corresponding report screen information stored in the evaluation result information storage unit 234 so that the content indicated by the transmitted change information is reflected.
  • Such updating of the report screen by the evaluation unit 221 is one aspect of modifying the evaluation result according to the operation.
  • Step S109 The instructor I can evaluate the report by displaying the report screen on the instructor terminal device 300.
  • the instructor I causes the instructor terminal device 300 to transmit information (instructor evaluation information) indicating the result of evaluation of the report to the learning support device 200.
  • the instructor evaluation information includes the result of instructor I scoring the report.
  • the instructor evaluation information may include a comment for the user U who created the report by the instructor I.
  • the learning support device 200 reflects the content of the instructor evaluation information transmitted from the instructor terminal device 300 in the evaluation result information stored in the evaluation result information storage unit 234.
  • the evaluation result information includes, for example, a scoring result by the instructor I, a comment created by the instructor I for the user U who is the report submitter, and the like.
  • Step S110 The evaluation unit 221 generates a feedback comment.
  • the feedback comment is a comment on the evaluation of the report to the user U who submitted the report to be evaluated.
  • the feedback comment may include, for example, a general comment on the report, an indication of the user U's thinking tendency and strengths, weak points, etc., and content that proposes future issues, goals, and the like.
  • the evaluation unit 221 may generate a feedback comment as follows.
  • a plurality of feedback comment modules are stored in advance in the feedback comment storage unit 235 of the storage unit 203.
  • the feedback comment module is associated with the categories set for the range of points for each evaluation criterion.
  • For example, a feedback comment module associated with a score-range category in which the score for a certain evaluation criterion is above a certain level has content that first acknowledges that the user U is aware of the matters related to that criterion and then guides the user to gain further awareness of those matters.
  • the evaluation unit 221 acquires a module of the feedback comment corresponding to each scoring result obtained in step S105 corresponding to each evaluation standard from the storage unit 203.
  • the evaluation unit 221 generates a feedback comment by integrating the acquired feedback comment modules into one sentence.
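  • The module-selection-and-integration scheme can be sketched as below; the score-range categories and comment texts are invented for illustration and do not come from the feedback comment storage unit 235.

```python
# Sketch of feedback comment generation (step S110): pick the stored comment
# module whose score-range category matches each criterion's score, then join
# the modules into one message. Ranges and texts are illustrative inventions.
FEEDBACK_MODULES = {
    "0001.kiz": [((0, 0), "Revisit the framework for this point."),
                 ((1, 1), "You applied the framework well; try deepening it further.")],
}

def module_for(criterion: str, score: int) -> str:
    """Select the comment module whose score range covers the score."""
    for (lo, hi), text in FEEDBACK_MODULES[criterion]:
        if lo <= score <= hi:
            return text
    raise ValueError(f"no module covers score {score}")

def generate_feedback(scores: dict[str, int]) -> str:
    """Integrate the selected modules into one feedback comment."""
    return " ".join(module_for(c, s) for c, s in scores.items())
```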
  • The feedback comment may include a report (scoring report) in which the description portions related to the evaluation criteria extracted in step S104 are emphasized. In the scoring report, a marker may be attached to each such description portion, and the color of the marker may be changed according to the score given to that description portion. From the color of the marker, the user can immediately grasp how accurate the description content of that portion is as an answer to the question.
  • The evaluation unit 221 may use a learning model in generating the feedback comment in step S110. In this case, the evaluation unit 221 inputs the score for each evaluation criterion into the learning model, and the learning model identifies the corresponding feedback comment modules and executes the process of integrating the identified modules to generate the feedback comment.
  • Such a learning model can be constructed by having a learning device take, as training data, a data set that combines evaluation criteria and the scores under those criteria with the particular feedback comment modules to be identified.
  • the learning model may be constructed so as to generate a sentence as a feedback comment according to the input of the score scored corresponding to each evaluation standard. In this case, it is not necessary to store the feedback comment module in the storage unit 203.
  • Step S111 The evaluation result output unit 223 transmits (outputs) the evaluation result information to the user terminal device 100 of the user U who submitted the report. At this time, the evaluation result output unit 223 reads out the evaluation result information corresponding to the report evaluated this time from the evaluation result information storage unit 234 and transmits it.
  • the evaluation result information transmitted in this way includes the scoring result for the report and the feedback comment generated in step S110.
  • The evaluation result information may also be output to the instructor terminal device 300 in response to a request from the instructor terminal device 300. Further, when the instructor I can directly use the learning support device 200, the evaluation result output unit 223 may output the evaluation result via a display unit (not shown) or a printer provided in the learning support device 200.
  • In the embodiment above, the learning support device 200 performs simple scoring in step S105, and the instructor I finally performs more detailed scoring and evaluation.
  • Alternatively, the evaluation unit 221 of the learning support device 200 may perform scoring and evaluation in step S105 as detailed as that performed by the instructor I, so that the evaluation result information produced by the learning support device 200 is output without the instructor I's evaluation intervening.
  • The learning model used for the processing of steps S104, S105, S110, and the like may adopt, for example, a Support Vector Machine, an LSTM (Long Short-Term Memory) with attention introduced, or a Transformer-based model such as BERT (Bidirectional Encoder Representations from Transformers).
  • The learning model used for the processing of steps S104, S105, S110, and the like is not limited to the above, and other algorithms may be adopted.
  • The functions of the learning support device 200 of the present embodiment may be distributed across a plurality of devices on a network, with the devices cooperating to realize report scoring, feedback comment generation, and the like.
  • A program for realizing the functions of the user terminal device 100, the learning support device 200, the instructor terminal device 300, and the like described above may be recorded on a computer-readable recording medium, and the processing of these devices may be performed by having a computer system read and execute the program recorded on the recording medium.
  • "loading and executing a program recorded on a recording medium into a computer system” includes installing the program in the computer system.
  • The "computer system" as used herein includes an OS and hardware such as peripheral devices.
  • the "computer system” may include a plurality of computer devices connected via a network including a communication line such as the Internet, WAN, LAN, and a dedicated line.
  • the "computer-readable recording medium” refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built in a computer system.
  • The recording medium in which the program is stored may be a non-transitory recording medium such as a CD-ROM.
  • the recording medium also includes an internal or external recording medium that can be accessed from the distribution server to distribute the program.
  • the code of the program stored on the recording medium of the distribution server may differ from the code in a format executable by the terminal device. That is, the format in which the program is stored on the distribution server does not matter as long as the program can be downloaded from the distribution server and installed in a form executable by the terminal device.
  • the program may be divided into a plurality of parts, downloaded at different timings, and then combined by the terminal device, or the distribution server for distributing each of the divided programs may be different.
  • the "computer-readable recording medium" shall also include a medium that holds the program for a certain period of time, such as the volatile memory (RAM) inside a computer system serving as a server or a client when the program is transmitted via a network.
  • the above program may be one for realizing a part of the above-mentioned functions, or may be a so-called difference file (difference program) that realizes the above-mentioned functions in combination with a program already recorded in the computer system.
  • 100 User terminal device, 200 Learning support device, 201 Communication unit, 202 Control unit, 203 Storage unit, 221 Evaluation unit, 222 Screen information generation unit, 223 Evaluation result output unit, 231 User information storage unit, 232 Submission report storage unit, 233 Evaluation use information storage unit, 234 Evaluation result information storage unit, 235 Feedback comment storage unit, 300 Instructor terminal device
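The publication names attention-augmented LSTMs and Transformer-based models only as interchangeable alternatives and does not disclose the model internals. As a rough, hypothetical sketch of how attention weights over sentence vectors could flag description parts that satisfy an evaluation criterion (the vectors, function names, and threshold below are illustrative assumptions, not taken from the publication):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of raw scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention_scores(sentence_vecs, criterion_vec):
    """Dot-product attention of each sentence vector against one evaluation criterion."""
    raw = [sum(a * b for a, b in zip(v, criterion_vec)) for v in sentence_vecs]
    return softmax(raw)

def extract_description_parts(sentences, sentence_vecs, criterion_vec, threshold=0.3):
    """Return (sentence, weight) pairs whose attention weight clears the threshold."""
    weights = attention_scores(sentence_vecs, criterion_vec)
    return [(s, w) for s, w in zip(sentences, weights) if w >= threshold]
```

In practice the sentence vectors would come from the trained encoder (LSTM or Transformer); here they are toy two-dimensional vectors chosen only to make the mechanism visible.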
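The split-download-and-combine distribution described in the bullets above can be illustrated with a minimal sketch; the chunk size, function names, and SHA-256 integrity check are illustrative assumptions, not part of the publication:

```python
import hashlib

def split_program(data: bytes, chunk_size: int):
    """Divide a program image into fixed-size parts for separate distribution."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def combine_parts(parts, expected_sha256: str) -> bytes:
    """Recombine parts (possibly downloaded at different timings) and verify integrity."""
    blob = b"".join(parts)
    if hashlib.sha256(blob).hexdigest() != expected_sha256:
        raise ValueError("combined program failed integrity check")
    return blob
```

A hash check after reassembly is one way a terminal device could confirm that parts served from different distribution servers combine into the intended executable.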
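As a minimal illustration of a "difference file" in the sense used above, Python's standard difflib can express an update as a unified diff against a program already installed on the computer system; the file names and source snippets below are hypothetical:

```python
import difflib

def make_difference_file(old_src: str, new_src: str) -> str:
    """Produce a unified-diff 'difference file' between two program versions."""
    return "".join(difflib.unified_diff(
        old_src.splitlines(keepends=True),
        new_src.splitlines(keepends=True),
        fromfile="installed/module.py",
        tofile="update/module.py",
    ))
```

Only the changed lines travel in such a file; the receiving side realizes the updated function by applying it to the program already recorded locally.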

Landscapes

  • Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Primary Health Care (AREA)
  • Health & Medical Sciences (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Educational Administration (AREA)
  • Marketing (AREA)
  • Educational Technology (AREA)
  • Strategic Management (AREA)
  • Physics & Mathematics (AREA)
  • General Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

An object of the present invention is to accurately evaluate text having a sentence format. The solution according to the invention is a report evaluation device comprising: an evaluation unit that evaluates a report, written in sentence form in response to a predetermined assignment, by using a result of extracting from the report the description parts that satisfy evaluation criteria; and a screen information generation unit that generates report screen information for a report screen capable of providing a display in which at least the evaluation unit's extraction result of the description parts is reflected in the content description of the report.
PCT/JP2021/008493 2020-03-04 2021-03-04 Report evaluation device, report evaluation method, and program WO2021177413A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020037109A JP6943999B2 (ja) 2020-03-04 2020-03-04 Report evaluation device, report evaluation method, and program
JP2020-037109 2020-03-04

Publications (1)

Publication Number Publication Date
WO2021177413A1 true WO2021177413A1 (fr) 2021-09-10

Family

ID=77613421

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/008493 WO2021177413A1 (fr) 2020-03-04 2021-03-04 Report evaluation device, report evaluation method, and program

Country Status (2)

Country Link
JP (1) JP6943999B2 (fr)
WO (1) WO2021177413A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0876681A (ja) * 1994-09-05 1996-03-22 Ricoh Co Ltd Image processing device and question sheet creation and scoring system
JP2006277086A (ja) * 2005-03-28 2006-10-12 Nippon Tokei Jimu Center:Kk Scoring support method, scoring support system, scoring support device, scoring management device, and computer program
JP2011008355A (ja) * 2009-06-23 2011-01-13 Omron Corp FMEA sheet creation support system and creation support program
JP2017068011A (ja) * 2015-09-30 2017-04-06 Kyocera Document Solutions Inc. Image forming apparatus and image forming system
JP2017167413A (ja) * 2016-03-17 2017-09-21 National Center for University Entrance Examinations Scoring assistance system
CN107832768A (zh) * 2017-11-23 2018-03-23 盐城线尚天使科技企业孵化器有限公司 Deep learning-based efficient examination-paper marking method and marking system
JP2019152793A (ja) * 2018-03-05 2019-09-12 Fuji Xerox Co., Ltd. Information processing device, information processing method, and information processing program

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0968919A (ja) * 1995-09-01 1997-03-11 Sharp Corp Answer scoring processing device
JP6556090B2 (ja) * 2016-04-08 2019-08-07 KDDI Corp Program, device, and method for estimating a text score by a plurality of similarity calculations

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
ASAHI, NOBUSHIGE ET AL.: "A Grading Support System for Free Form Problems Considering the Grader's Intention", LECTURE PROCEEDINGS (4) OF THE 70TH (2008) NATIONAL CONFERENCE, 13 March 2008 (2008-03-13), pages 737 - 738 *
ISHIOKA, TSUNENORI: "Automated short-answer test scoring: The logic and the tasks", IEICE TECHNICAL REPORT, vol. 109, no. 84, 11 June 2009 (2009-06-11), pages 7 - 11, ISSN: 0913-5685 *
MORITA, NAOKI ET AL.: "A method of classifying description answers on realtime", FIT2002 FORUM ON INFORMATION TECHNOLOGY: INFORMATION TECHNOLOGY LETTERS, vol. 1, September 2002 (2002-09-01), pages 233 - 234 *
NAKAJIMA, KOJI: "Development and evaluation of scoring support tools for short answer response", PROCEEDINGS OF THE 17TH ANNUAL MEETING OF THE ASSOCIATION FOR NATURAL LANGUAGE PROCESSING, 7 March 2011 (2011-03-07), pages 611 - 614 *
OSHIMA, KAZUMA ET AL.: "Development of a scoring support system for short-answer written tests- Classification of answers based on usage of keywords", PROCEEDINGS OF 2011 PC CONFERENCE, 7 August 2011 (2011-08-07), pages 18 - 21 *

Also Published As

Publication number Publication date
JP6943999B2 (ja) 2021-10-06
JP2021140426A (ja) 2021-09-16

Similar Documents

Publication Publication Date Title
Lee The impact of using machine translation on EFL students’ writing
Nitzke Problem solving activities in post-editing and translation from scratch: A multi-method study
Cargill et al. Preparing Chinese graduate students of science facing an international publication requirement for graduation: Adapting an intensive workshop approach for early-candidature use
Blake Brave new digital classroom: Technology and foreign language learning
US20150106705A1 (en) Adaptive Grammar Instruction - Verb Tense
US20080145832A1 (en) Test Question Constructing Method and Apparatus, Test Sheet Fabricated Using the Method, and Computer-Readable Recording Medium Storing Test Question Constructing Program for Executing the Method
US20070026375A1 (en) Electronic study aid and practice aid
EP3535745B1 (fr) Association de ressources de données à des objectifs demandés
Soler-Monreal Announcing one's work in PhD theses in computer science: A comparison of Move 3 in literature reviews written in English L1, English L2 and Spanish L1
US20150093727A1 (en) Vocabulary learning system and method
US20110041052A1 (en) Markup language-based authoring and runtime environment for interactive content platform
Hana et al. Building a learner corpus
KR102563530B1 (ko) Method and apparatus for guiding improvement of reading comprehension and writing ability
WO2021177413A1 (fr) Report evaluation device, report evaluation method, and program
KR101380692B1 (ko) 온라인 학습장치 및 온라인 학습방법
KR20080100857A (ko) 라운드방식을 이용한 단어반복학습 서비스 시스템
Sahar et al. Assessing the feasibility of a web-based interactive writing assessment (WISSE): An evaluation of media and linguistic aspects
KR102282307B1 (ko) 영어 학습 시스템 및 그 방법
Moreno et al. A bridge to web accessibility from the usability heuristics
US7296260B2 (en) System and method for composing a multi-lingual instructional software
JPH07302036 (ja) Method for creating CAI exercise questions to be run on a personal computer
US10635862B2 (en) Method of facilitating natural language interactions, a method of simplifying an expression and a system thereof
US11514807B2 (en) Method and apparatus for assisting persons with disabilities
Saleh et al. Ask4Summary: a summary generation moodle plugin using natural language processing techniques
Farrell Raw output evaluator, a freeware tool for manually assessing raw outputs from different machine translation engines

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21764677

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21764677

Country of ref document: EP

Kind code of ref document: A1