AU2021106429A4 - Teacher assistance system and method - Google Patents


Info

Publication number
AU2021106429A4
Authority
AU
Australia
Prior art keywords
answers
answer
student
marking
similarity
Prior art date
Legal status
Ceased
Application number
AU2021106429A
Inventor
Cody Mann
Current Assignee
Australian Artificial Intelligence Technologies Pty Ltd
Original Assignee
Australian Artificial Intelligence Tech Pty Ltd
Priority date
Filing date
Publication date
Application filed by Australian Artificial Intelligence Tech Pty Ltd
Priority to AU2021106429A
Application granted
Publication of AU2021106429A4
Ceased
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K17/00: Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
    • G06K17/0032: Apparatus for automatic testing and analysing marked record carriers, used for examinations of the multiple choice answer type
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10: Services
    • G06Q50/20: Education
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00: Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10: Character recognition
    • G06V30/19: Recognition using electronic means
    • G06V30/19007: Matching; Proximity measures
    • G06V30/1908: Region based matching

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Educational Technology (AREA)
  • Educational Administration (AREA)
  • Economics (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

Semi-automated systems and methods for assisting teachers are provided, including examination marking methods that simplify the process for marking examination papers. The method comprises: scanning a plurality of student examination papers, each student examination paper comprising a plurality of questions and associated answers; associating the scanned student examination papers with one or more marking databases, the marking databases including sample answers associated with the questions; extracting answers from the plurality of scanned student examination papers; determining a similarity between each answer and an associated sample answer; automatically grading an answer when the similarity is above a threshold; and providing each answer to a grader (e.g. a teacher) for review and grading when the similarity is below the threshold. (Figure 4 shows the method as a flowchart.)

Description

Figure 4 (sheet 4/4) shows method 400 as a flowchart: 405 scan student examination papers; 410 associate scanned examination papers with one or more marking guides; 415 extract answers from examination papers; 420 determine a similarity between answer and an associated sample answer; 425 similarity above threshold? If yes, 430 automatically grade answer; if no, 435 provide answer to a grader (e.g. a teacher) for review.
TEACHER ASSISTANCE SYSTEM AND METHOD TECHNICAL FIELD
[0001] The present invention relates to semi-automated systems and methods for assisting teachers. In particular, although not exclusively, the present invention relates to semi-automated marking of student examination papers.
BACKGROUND ART
[0002] Most schools use written examinations in one form or another to assess students. As such, the marking of student examination papers is an important, but often very time consuming, aspect of teaching. This is particularly the case for younger students, where paper is almost exclusively used for examinations.
[0003] Typically, teachers will manually go through all examination papers, hand marking and manually calculating test results for each of the students. This is very time consuming, particularly for written examination papers, and can be overwhelming for teachers, particularly those involved with several classes of students.
[0004] One option is to split examination papers across multiple teachers (or graders), who mark or grade the papers with respect to one or more marking guides. A problem with such an approach is that different persons may mark the same question in different ways. As a result, student results are unfairly biased based upon which teacher (or grader) marks their examination paper. Furthermore, managing the marking of examination papers across multiple persons is time consuming, and requires detailed marking guides.
[0005] Another problem arises when marking or grading is not performed by the teacher: the teacher may not see exactly which areas of an examination are commonly answered incorrectly. As such, common problems and misunderstandings in the class may be missed.
[0006] Several attempts have been made to simplify the process of marking examination papers. One common way of simplifying examination marking is to use multiple choice examination papers. While multiple choice questions may have their place in some examination papers, they are not suitable for all types of examination, and may promote guessing. As such, student results in such cases may not be a good representation of actual knowledge or ability.
[0007] Similar problems exist in other areas of teaching. As such, there is clearly a need for improved systems and methods for assisting teachers, including in relation to student examination marking.
[0008] It will be clearly understood that, if a prior art publication is referred to herein, this reference does not constitute an admission that the publication forms part of the common general knowledge in the art in Australia or in any other country.
SUMMARY OF INVENTION
[0009] The present invention relates to systems and methods for assisting teachers, including student examination marking systems and methods, which may at least partially overcome at least one of the abovementioned disadvantages or provide the consumer with a useful or commercial choice.
[0010] With the foregoing in view, the present invention in one form, resides broadly in a student examination marking method comprising: scanning a plurality of student examination papers, each student examination paper comprising a plurality of questions, and associated answers; associating the scanned student examination papers with one or more marking databases, the marking databases including sample answers associated with the questions;
extracting answers from the plurality of scanned student examination papers; determining a similarity between each answer and an associated sample answer; automatically grading answers when the similarity is above a threshold; and providing each answer to a grader (e.g. a teacher) for review and grading when the similarity is below the threshold.
[0011] Advantageously, the method simplifies the process for marking examination papers, as answers are automatically graded when they are similar to an associated sample answer. This reduces workload associated with marking examination papers, as only answers not being sufficiently similar to the sample answers are needed to be reviewed and graded.
[0012] The method may also reduce inconsistencies in marking of examination papers as questions may be automatically graded, avoiding inconsistencies for such questions, and enabling fewer graders (e.g. a single teacher) to review and grade the remaining questions.
[0013] Preferably, the answers are extracted as text using optical character recognition (OCR).
[0014] Preferably, a region of interest is associated with each question, the region of interest defining an area in which an answer is to be read. Suitably, the answer is extracted by analysis of the region of interest.
[0015] Preferably, the regions of interest are defined by geometric shapes, such as rectangles, with reference to the examination papers. Each region of interest may then be associated with a question and a score.
[0016] Preferably, an identifier is associated with each paper, the identifier linking the paper to a marking database of the one or more marking databases. Alternatively, the marking database may be manually selected, e.g. by a person scanning the papers.
[0017] Preferably, the identifier comprises a computer readable identifier, such as a bar code or QR code.
[0018] Preferably, a student name is associated with each scanned paper. The student name may be handwritten on the paper. The student name may be identified using OCR.
[0019] Preferably, each student examination paper includes a plurality of printed questions, and a plurality of handwritten responses, wherein each response is associated with a question.
[0020] Preferably, the marking database includes more than one sample answer associated with one or more of the questions. In such case, similarity may be determined with reference to each of the sample answers separately.
[0021] Preferably, the one or more sample answers comprise a plurality of keywords.
[0022] Preferably, the student answers are prefiltered prior to or as part of determining a similarity between each answer and an associated sample answer.
[0023] The prefiltering may include removing filler words and insignificant words that hold no meaning. Similar prefiltering may be applied to the sample answers.
[0024] The prefiltering may include spell checking and correcting. The prefiltering may include replacement or addition of words according to a thesaurus.
[0025] The similarity may be determined according to a similarity score. The similarity score may be compared with a similarity threshold.
[0026] The similarity threshold may be fixed for all questions. The similarity threshold may be dynamic.
[0027] One or more similarity factors may be turned on or off for questions. As an illustrative example, broader similarity measures may be applied to one question than another.
[0028] The method may include overlaying the grading onto the scanned examination papers. The grading may include a grading per question, which is overlaid in association with each question and/or answer.
[0029] The overlaid scanned examination paper including the grading may be provided back to the student, e.g. electronically.
[0030] The method may include automatic scoring and grading of the examination papers according to the gradings of the individual answers.
[0031] The method may also include receiving feedback about an answer, and wherein such feedback is similarly overlaid on the examination paper and provided to the student together with the results.
[0032] The method may include analysis on the answers from a plurality of students to identify common mistakes and trends.
[0033] The method may include comparison of the results of students with those of other groups having performed the same examination, and identification of the areas in which the students differ from other groups (e.g. areas in which the students have, or have not, excelled).
[0034] In another form, the invention resides broadly in a student examination marking system comprising a server, including: a data interface configured to receive scanned student examination papers, each student examination paper comprising a plurality of questions, and associated answers; a processor, coupled to the data interface, and a memory, coupled to the processor, the memory including instruction code executable by the processor for: associating the scanned student examination papers with one or more marking databases, the marking databases including sample answers associated with the questions; extracting answers from the plurality of scanned student examination papers; determining a similarity between each answer and an associated sample answer; automatically grading an answer when the similarity is above a threshold; and providing each answer to a grader (e.g. a teacher) for review and grading when the similarity is below the threshold.
[0035] Any of the features described herein can be combined in any combination with any one or more of the other features described herein within the scope of the invention.
[0036] The reference to any prior art in this specification is not, and should not be taken as an acknowledgement or any form of suggestion that the prior art forms part of the common general knowledge.
BRIEF DESCRIPTION OF DRAWINGS
[0037] Various embodiments of the invention will be described with reference to the following drawings, in which:
[0038] Figure 1 illustrates a schematic of a student examination marking system, according to an embodiment of the present invention.
[0039] Figure 2 illustrates an exemplary test paper of the system of Figure 1, according to an embodiment of the present invention.
[0040] Figure 3 illustrates the exemplary test paper of Figure 2, with regions of interest overlaid thereon, and answers written thereon.
[0041] Figure 4 illustrates a student examination marking method, according to an embodiment of the present invention.
[0042] Preferred features, embodiments and variations of the invention may be discerned from the following Detailed Description which provides sufficient information for those skilled in the art to perform the invention. The Detailed Description is not to be regarded as limiting the scope of the preceding Summary of the Invention in any way.
DESCRIPTION OF EMBODIMENTS
[0043] Figure 1 illustrates a schematic of a teacher assistance system 100, according to an embodiment of the present invention. The teacher assistance system 100 simplifies the process for marking examination papers, reduces workload associated with marking examination papers, and reduces inconsistencies in marking examination papers. Furthermore, the system simplifies the process for generating new tests/exams, and generating report cards and other feedback to students.
[0044] The system 100 includes a scanner 105, for scanning a plurality of test papers 110 from a plurality of students. The test papers 110 comprise multiple sheets per student, and include printed questions, and handwritten responses, the handwritten responses written by the students.
[0045] The scanner 105 is coupled to a computer 115, which in turn is coupled to a remote server 120 by a communications system, such as the Internet. In practice, multiple such scanners 105 and computers 115 may exist in different areas (e.g. at different schools, or different parts of schools) all coupled to the same server 120. Similarly, the server 120 need not be a single physical machine, which enables the system to scale across multiple schools. The skilled addressee will, however, readily appreciate that in other embodiments, the computer 115 may perform the functions of the server 120.
[0046] When a test is initially created, a marking database is also created and uploaded to a test database. In particular, teachers 125 may communicate with the server 120 by respective computing devices 130, to which a graphical user interface (GUI) is provided. The GUI enables the teachers 125 to define and/or upload marking databases, which are stored in a test database 135. In short, each marking database is specific to a particular test, and is used by the server 120 to assist in marking the test papers 110.
[0047] Figure 2 illustrates an exemplary test paper 110, according to an embodiment of the present invention. The test paper 110 is similar to a traditional paper test sheet, and includes questions and associated spaces for the student to enter his or her answers, as outlined below.
[0048] A machine-readable identifier 205 (optional), in the form of a barcode or other suitable code, is provided on each paper to identify the test and the page of the test. The identifier 205 may be automatically provided on the test papers when the test is generated, and provides an efficient means for enabling the server 120 to identify the sheet among potentially many sheets, as well as a reference point (datum) on the sheet.
[0049] A name field 210 is also provided on the top of each sheet, in which the student may enter his or her name, to enable the test paper 110 to be associated with the student.
[0050] One or more questions 215 are then provided on the test paper 110, which may be broken into sub-questions. Blank space 220 is provided in association with the questions 215 (and sub-questions), to enable the student to write their answers in relation thereto.
[0051] When creating the marking databases, the teachers 125 are also prompted to define regions of interest in the test papers 110, to identify where the answers are likely to be written. This simplifies the process of identifying the answers, as it alleviates the need for the system to analyse the entire test papers 110 each time a sheet is analysed.
[0052] The regions of interest are defined by geometric shapes, typically rectangles, with reference to the test papers 110. Each region of interest may then be associated with a question and a score or number of marks.
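The region-of-interest idea above can be sketched in code. The following is an illustrative sketch only, not part of the patent; the class and field names are assumptions, and pixel coordinates are invented for the example.

```python
from dataclasses import dataclass

# Hypothetical sketch: a region of interest is a rectangle on a scanned page,
# tied to a question identifier and the marks available for that question.
@dataclass(frozen=True)
class RegionOfInterest:
    question_id: str
    page: int
    x: int       # left edge, in pixels on the scanned page
    y: int       # top edge
    width: int
    height: int
    marks: int   # marks (score) available for this question

    def contains(self, px: int, py: int) -> bool:
        """True if the point (px, py) falls inside this region."""
        return (self.x <= px < self.x + self.width
                and self.y <= py < self.y + self.height)

# A marking database entry might tie each question to one such region.
roi = RegionOfInterest("Q1a", page=1, x=80, y=240, width=600, height=120, marks=2)
print(roi.contains(100, 300))  # a point inside the answer box -> True
```

In a full system, OCR would then be run only on the pixels inside each rectangle, rather than on the whole page.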
[0053] Figure 3 illustrates the exemplary test paper 110 of Figure 2, with regions of interest 305 overlaid thereon, and answers 310 written thereon.
[0054] When initially analysing the test papers 110, all sheets are scanned, sorted based upon student, and then answers to each of the questions are extracted. This is performed using optical character recognition (OCR) in each of the regions of interest.
[0055] Where the system 100 is used with young children, the server 120 may incorporate OCR trained on, or specialising in, children's handwriting, to improve recognition of the answers on the sheets. Preferably, the system 100 is adaptable to a wide range of handwriting, including that of children.
[0056] When creating the marking databases, one or more sample answers are provided for each question. The teacher 125 may manually enter each sample answer using the GUI, and in case several alternative options are available, the teacher would generally enter each option. Alternatively, the answers may be provided in tabulated form, e.g. as a spreadsheet of well-defined format.
[0057] When analysing the test papers 110, the student answers and sample answers are pre-filtered to remove filler words, and insignificant words that hold no meaning. The filtered student answers are then compared with the filtered sample answers, and a similarity score is generated. The similarity score may be generated based upon a number of common keywords or themes, or based upon any suitable metric.
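The pre-filtering step described above can be illustrated with a short sketch. This is not the patented implementation; the stop-word list is a deliberately tiny, assumed example, and a real deployment would use a fuller list and likely spell-correction as well.

```python
import re

# Illustrative filler/stop-word list (an assumption, not from the patent).
FILLER_WORDS = {"a", "an", "the", "is", "are", "was", "were", "of", "to",
                "and", "or", "it", "that", "this", "in", "on", "at", "so"}

def prefilter(answer: str) -> set:
    """Lowercase, strip punctuation, and drop filler words, returning
    the set of significant keywords in the answer."""
    words = re.findall(r"[a-z0-9']+", answer.lower())
    return {w for w in words if w not in FILLER_WORDS}

# Both the student answer and the sample answer get the same treatment.
student = prefilter("The mitochondria is the powerhouse of the cell")
sample = prefilter("Mitochondria are the powerhouse of a cell")
print(student)  # {'mitochondria', 'powerhouse', 'cell'}
```

After filtering, the two answers above reduce to the same keyword set even though their surface wording differs.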
[0058] If a similarity score between a filtered student answer and a filtered sample answer is high (above a threshold), the question is automatically marked as correct, and the marks associated with that question are allocated to the student.
[0059] If the similarity score between the filtered student answer and the filtered samples is low (below a threshold), the answer is flagged for review by the teacher. The teacher may then review the question and mark it as correct (upon which the marks associated with that question are allocated to the student) or incorrect. In some embodiments, grades between correct and incorrect may be used, including part marks.
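The threshold-based routing of paragraphs [0058]-[0059] can be sketched as follows. This is an assumed illustration: the patent leaves the similarity metric open, and Jaccard overlap and the 0.7 threshold are choices made here for the example, not the patent's method.

```python
def similarity(student_kw: set, sample_kws: list) -> float:
    """Best Jaccard overlap between the student's keyword set and any of
    the (possibly several) filtered sample answers."""
    best = 0.0
    for sample_kw in sample_kws:
        union = student_kw | sample_kw
        if union:
            best = max(best, len(student_kw & sample_kw) / len(union))
    return best

def route(student_kw: set, sample_kws: list, threshold: float = 0.7) -> str:
    """Auto-grade clearly correct answers; flag everything else for review.
    Note the asymmetry: a low score never marks the answer wrong, it only
    sends the answer to a human grader."""
    score = similarity(student_kw, sample_kws)
    return "auto_correct" if score >= threshold else "flag_for_review"

samples = [{"mitochondria", "powerhouse", "cell"}]
print(route({"mitochondria", "powerhouse", "cell"}, samples))  # auto_correct
print(route({"nucleus", "dna"}, samples))                      # flag_for_review
```

The one-sided decision rule is what makes the conservative configuration of paragraph [0060] possible.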
[0060] As the server 120 is able to automatically identify correct answers, but does not mark answers as incorrect (and instead flags such answers for review), the system is able to be configured in a relatively conservative manner where it identifies clearly correct answers, and any uncertain answers are flagged for review.
[0061] Such configuration enables the system 100 to be deployed with relatively simple analysis, which may be updated over time. Even when the system is only able to identify a small portion of answers as correct (e.g. 20%), it reduces the teacher's marking workload by about that amount. As the system improves, even larger gains may be made.
[0062] In some embodiments, the questions may be reused across tests, and the marking database associated with that question may similarly be reused. Such configuration promotes investment in detailed marking databases, as they may be reused.
[0063] In yet further embodiments, the system may be configured to learn from the marking of teachers 125 when questions are reviewed. In such case, a machine learning model may take the filtered answer and the outcome of the review (i.e. whether the answer is correct or incorrect) and use that to improve analysis of future questions.
[0064] Similarly, Natural Language Processing (NLP) methods incorporating artificial intelligence may be used to pre-filter or analyse student answers and sample answers to determine the similarity score. In such case, the NLP methods may similarly be updated over time based upon actual student answers and reviewed teacher marking.
[0065] Once each test paper 110 has been fully marked, the results are overlaid onto the scanned test paper 110, and provided back to the student, e.g. electronically. In such case, words such as "correct" or "incorrect" may be provided in association with, and overlaid over, each answer. An overall result, such as a grading or score, may also be determined and overlaid onto the first or last page of the test papers.
[0066] The system may also prompt or enable the teacher to enter feedback about an answer. In such case the feedback may be similarly overlaid on the test paper and provided to the student together with the results. This corresponds to the feedback that the teacher would have traditionally handwritten on the test paper 110.
[0067] In addition to providing improved efficiencies for teachers, the system may also result in improved results awareness. As teachers are exposed to the answers which are most likely to be incorrect, they are able to intuitively see which aspects of their teaching are not being properly understood, enabling further focus to be placed on such areas.
[0068] Furthermore, the system may perform analysis on the answers as a whole to identify common mistakes and trends. As a result, the system may provide such insights to teachers in an easy to understand manner, enabling action to be taken.
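One simple way such class-wide analysis could work is to count, per question, how many students failed it. This is an assumed sketch with invented data, not the patent's analysis method.

```python
from collections import Counter

# Hypothetical data: for each student, the set of question ids that were
# not graded correct (flagged or marked incorrect).
class_results = {
    "student_1": {"Q2", "Q5"},
    "student_2": {"Q2"},
    "student_3": {"Q2", "Q3"},
}

def common_problem_questions(results: dict, min_fraction: float = 0.5) -> list:
    """Return questions answered incorrectly by at least min_fraction
    of the class, as a sorted list."""
    counts = Counter(q for wrong in results.values() for q in wrong)
    n = len(results)
    return sorted(q for q, c in counts.items() if c / n >= min_fraction)

print(common_problem_questions(class_results))  # ['Q2']
```

Here every student missed Q2, so it surfaces as a common problem that the teacher may wish to revisit in class.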
[0069] Similarly, the system may compare the results of students with those of other groups having performed the same test, and identify the areas in which the students excelled and those in which they did not. This enables the teacher to identify areas of improvement among the students, and to focus thereon in future teaching.
[0070] In some embodiments, the system 100 may generate profiles for each of the students, at least in part based upon the results of their examinations. These profiles may include a plurality of attributes, each with an associated score. These attributes may relate to different skills or aspects of education, and may provide at least a broad indication of the student's strengths and weaknesses.
[0071] In some embodiments, these attributes are then used as an input to a feedback or report card generation module. The feedback or report card generation module may assist teachers in generating report cards by suggesting selectable comments, which may be selected by the teacher for incorporation into the report card.
[0072] As an illustrative example, a report card may include a plurality of report items relating to a plurality of different areas, such as mathematics, language, and the like. The report card generation module may step through each of these areas, retrieve attributes relating to the area, and suggest selectable comments according to the retrieved attributes. Each of these comments may be selected, replaced by another comment, or manually overridden.
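The attribute-driven comment suggestion described above could look something like the following. The comment bank, band cutoff, and attribute scores are all invented for illustration; the patent does not specify this data model.

```python
# Hypothetical bank of school-approved comments, keyed by (area, band).
COMMENT_BANK = {
    ("mathematics", "high"): "Shows strong problem-solving skills in mathematics.",
    ("mathematics", "low"): "Would benefit from extra practice with core number work.",
    ("language", "high"): "Writes clearly and uses a wide vocabulary.",
    ("language", "low"): "Is developing confidence in written expression.",
}

def suggest_comments(attributes: dict, cutoff: float = 0.6) -> list:
    """For each profiled area, suggest the approved comment matching the
    student's attribute score; the teacher may accept, swap, or override."""
    suggestions = []
    for area, score in attributes.items():
        band = "high" if score >= cutoff else "low"
        comment = COMMENT_BANK.get((area, band))
        if comment:
            suggestions.append(comment)
    return suggestions

# Attribute scores derived (hypothetically) from examination results.
profile = {"mathematics": 0.85, "language": 0.4}
for c in suggest_comments(profile):
    print(c)
```

Because every suggestion is drawn from a pre-approved bank, this structure naturally supports the compliance requirement described in the following paragraph.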
[0073] In some schools, the wording in the report cards must comply with particular format and language requirements. In such case, a bank of comments may be generated, and approved by the school, from which a teacher may select comments. The report card generation module may simplify the process of selecting such comments by making intelligent selections.
[0074] Where a teacher manually enters a comment, that comment may be reviewed by another teacher or administrative staff member, and potentially added to the bank of comments for future use.
[0075] The report card generation module may remove or reduce unintentional bias from teachers, particularly where poor behaviour has traditionally been associated with poor feedback on report cards, as teachers are prompted with comments that match the test outcomes of the student. As the process of providing comments is simplified, students may be provided with more feedback in an efficient manner.
[0076] The system 100 may also be configured to determine the difficulty of test questions, based upon the results of a large number of students (e.g. across different classes or even schools). In such case, the system 100 may be used to adjust test results according to a particular baseline level, to compensate for differing levels of difficulty.
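The difficulty estimation and baseline adjustment described above could be sketched as below. The linear adjustment is an assumption made for illustration; the patent does not prescribe a particular adjustment model.

```python
def question_difficulty(results_per_question: dict) -> dict:
    """Difficulty of each question as the fraction of students who got it
    wrong, pooled across classes or even schools (1 = correct, 0 = wrong)."""
    return {q: 1 - sum(marks) / len(marks)
            for q, marks in results_per_question.items()}

def adjust_score(raw: float, test_difficulty: float, baseline: float = 0.5) -> float:
    """Shift a raw score toward a baseline difficulty level, so results from
    an unusually hard or easy paper remain comparable (simple linear model,
    clamped to [0, 1])."""
    return min(1.0, max(0.0, raw + (test_difficulty - baseline)))

# Hypothetical pooled results across many students.
pooled = {"Q1": [1, 1, 1, 0], "Q2": [1, 0, 0, 0]}
diff = question_difficulty(pooled)
print(diff["Q2"])  # 0.75 -> a hard question
```

A student scoring 60% on a paper whose measured difficulty is 0.75 would, under this toy model, be adjusted upward relative to the 0.5 baseline.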
[0077] The system 100 may be configured to generate test papers from a bank of test questions. This may be performed randomly, or according to difficulty. In some embodiments, a teacher may select a desired level of difficulty, wherein the system automatically generates a test from the bank of test questions at that level of difficulty. As outlined above, the level of difficulty may be determined according to previous students' results.
[0078] Figure 4 illustrates a student examination marking method 400, according to an embodiment of the present invention. The method 400 may be similar or identical to the method used by the system 100.
[0079] Initially, at step 405, a plurality of student examination papers are scanned using a scanner. The student examination papers comprise printed questions, with spaces for the students to complete their answers.
[0080] At step 410, the student examination papers are associated with one or more marking databases. This may be performed automatically, e.g. if the papers have a code thereon associated with a marking database, or manually, e.g. a teacher scanning the student examination papers may manually select the associated marking database.
[0081] As outlined above, the marking database may include sample answers, which may comprise keywords. Similarly, the marking database may include regions of interest for the answers, as well as a number of marks associated with an answer.
[0082] At step 415, answers are extracted from each of the examination papers. As outlined above, this may include identifying a region of interest associated with a question, and performing optical character recognition on the answer text therein. Such configuration avoids scanning the whole page and attempting to identify the different answers by analysis thereof.
[0083] At step 420, the similarity between a student answer and an associated sample answer is determined. The similarity may be determined as a score or percentage, indicating how similar the student answer and sample answer are.
[0084] Part of this step can include pre-filtering the answers to remove filler words, and insignificant words that hold no meaning. Similarly, autocorrect/spellcheck features may be used to correct basic errors in the student response.
[0085] The similarity score may be based upon a number of common keywords, or variations or synonyms of keywords, or any other suitable measure.
[0086] At step 425, it is determined whether the similarity is above a threshold, and in such case the answer is automatically graded in step 430. This enables simple and common answers to be automatically graded, thereby alleviating the need for the teacher to spend time on such answers.
[0087] If the similarity is not above the threshold, the answer is provided to a grader (e.g. a teacher) for review in step 435. This enables the grader (or teacher) to properly consider the answer before failing the student. This is important as correct answers may take a wide range of forms, and need not be particularly similar to the sample answer.
[0088] The process is then repeated from step 420 for each remaining answer of all remaining examination papers.
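Steps 405-435 can be tied together in a minimal end-to-end sketch. Scanning and OCR are replaced here by already-extracted text; the data, helper names, filler list, and 0.7 threshold are all assumptions for illustration, not the patented implementation.

```python
# Tiny illustrative filler list and threshold (assumptions, not the patent's).
FILLERS = {"the", "a", "an", "is", "are", "of"}
THRESHOLD = 0.7

def keywords(text: str) -> set:
    """Step 420 prefiltering: significant lowercase words only."""
    return {w for w in text.lower().split() if w not in FILLERS}

def mark_paper(extracted_answers: dict, marking_db: dict):
    """extracted_answers: question_id -> student answer text (post-OCR, step 415).
    marking_db: question_id -> (sample answer text, marks available)."""
    score, review_queue = 0, []
    for qid, answer in extracted_answers.items():
        sample, marks = marking_db[qid]
        a, s = keywords(answer), keywords(sample)
        sim = len(a & s) / len(a | s) if a | s else 0.0  # step 420 similarity
        if sim >= THRESHOLD:
            score += marks            # step 430: automatically grade
        else:
            review_queue.append(qid)  # step 435: send to a grader for review
    return score, review_queue

db = {"Q1": ("Water boils at 100 degrees", 2), "Q2": ("Photosynthesis", 1)}
paper = {"Q1": "water boils at 100 degrees", "Q2": "plants eating light"}
print(mark_paper(paper, db))  # (2, ['Q2'])
```

Q1 matches its sample and is auto-graded for 2 marks, while Q2 falls below the threshold and joins the teacher's review queue, mirroring the yes/no branch of Figure 4.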
[0089] The results of the students may be printed (overlaid) onto the examination papers, and provided back to the students.
[0090] The method may also include automatically or semi-automatically identifying the student for whom the examination paper relates, e.g. according to a name field.
[0091] The method may include determining averages of students, and/or identifying common problems (e.g. commonly incorrect answers). Such data may be presented to teachers to illustrate areas where skills are potentially lagging, enabling adjustments to be made thereto.
[0092] Various changes or modifications may be made to the methods and systems, such as incorporating artificial intelligence, machine learning, natural language processing (NLP) or the like.
[0093] While the above examples relate to text-based answers, the skilled addressee will readily appreciate that any suitable type of questions and answers may be used. As an illustrative example, multiple choice questions and answers may be used for part of a test. In such case, the systems may adaptively determine marked answers based upon darkness, the presence of markings or the like.
[0094] The skilled addressee will readily appreciate that different similarity algorithms may be used on different types of questions. For example, maths questions may have very strict similarity (requiring an exact result), whereas language-based questions may be relatively broad similarity (e.g. identifying key themes).
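The idea of different similarity algorithms for different question types can be sketched as a simple dispatch table. The two strategies shown (exact match for maths, keyword overlap for language) are assumed illustrations of the paragraph above, not the patent's algorithms.

```python
def exact_match(student: str, sample: str) -> float:
    """Strict similarity for maths-style questions: exact result required."""
    return 1.0 if student.strip() == sample.strip() else 0.0

def keyword_overlap(student: str, sample: str) -> float:
    """Broad similarity for language-style questions: shared key words."""
    a, b = set(student.lower().split()), set(sample.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

# Per-question-type choice of similarity algorithm.
SIMILARITY_BY_TYPE = {"maths": exact_match, "language": keyword_overlap}

def score_answer(qtype: str, student: str, sample: str) -> float:
    return SIMILARITY_BY_TYPE[qtype](student, sample)

print(score_answer("maths", "42", "42"))    # 1.0
print(score_answer("maths", "42.0", "42"))  # 0.0 (strict: no partial credit)
```

A marking database could store the question type alongside each sample answer, so the appropriate strategy is selected automatically at step 420.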
[0095] The systems and methods may include natural language detectors, or other anti-cheating mechanisms, e.g. to avoid students simply listing words in the hope of a match with a sample answer. Similarly, in the case of multiple choice answers, the systems and methods may ensure that only a single answer is selected (where appropriate).
[0096] The words "test" and "examination" are used interchangeably in the present specification. The skilled addressee will readily appreciate that while school tests or examinations are primarily described, embodiments of the present invention may be used with any suitable form of test or examination.
[0097] Advantageously, the systems and methods disclosed herein simplify the process of marking examination papers, reduce the associated workload, and reduce inconsistencies in marking.
[0098] As any answers not similar to the sample answers are provided for review, the sample answers need not perfectly encapsulate all possible answers, and the system may conservatively grade answers automatically.
[0099] This configuration also enables the systems and methods to be deployed in relatively rudimentary forms, as it does not require a large number of answers to be automatically graded to be beneficial. As more examinations are performed, and questions and answers are reused, the sample answers may be refined based upon past experience, resulting in a continually improving system.
[00100] In the present specification and claims (if any), the word 'comprising' and its derivatives including 'comprises' and 'comprise' include each of the stated integers but do not exclude the inclusion of one or more further integers.
[00101] Reference throughout this specification to 'one embodiment' or 'an embodiment' means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases 'in one embodiment' or 'in an embodiment' in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more combinations.
[00102] In compliance with the statute, the invention has been described in language more or less specific to structural or methodical features. It is to be understood that the invention is not limited to specific features shown or described since the means herein described comprises preferred forms of putting the invention into effect. The invention is, therefore, claimed in any of its forms or modifications within the proper scope of the appended claims (if any) appropriately interpreted by those skilled in the art.

Claims (5)

1. A student examination marking method comprising:
scanning a plurality of student examination papers, each student examination paper comprising a plurality of questions, and associated answers;
associating the scanned student examination papers with one or more marking databases, the marking databases including sample answers associated with the questions;
extracting answers from the plurality of scanned student examination papers;
determining a similarity between each answer and an associated sample answer;
automatically grading answers when the similarity is above a threshold; and
providing each answer to a grader (e.g. a teacher) for review and grading when the similarity is below the threshold.
2. The student examination marking method of claim 1, wherein a region of interest is associated with each question, the region of interest defining an area in which an answer is to be read, and wherein the answers are extracted as text using optical character recognition (OCR) from the regions of interest.
3. The student examination marking method of claim 1, wherein the student answers are prefiltered prior to determining a similarity between each answer and an associated sample answer, wherein the prefiltering includes removing filler words and/or insignificant words.
4. The student examination marking method of claim 1, further including overlaying the gradings onto the scanned examination papers.
5. The student examination marking method of claim 1, further including receiving feedback about an answer, and overlaying the feedback onto the scanned examination paper.
Figure 1 (system schematic; reference numerals 105, 110, 115, 120, 125, 130, 135)
Publications (1)

Publication Number Publication Date
AU2021106429A4 true AU2021106429A4 (en) 2021-11-04


Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry