AU783979B2 - A data collection method - Google Patents

A data collection method

Info

Publication number
AU783979B2
AU783979B2 AU11182/01A AU1118201A
Authority
AU
Australia
Prior art keywords
report
assessor
performance
assessment
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU11182/01A
Other versions
AU1118201A (en)
Inventor
Daniel Francis Bone
Geoffrey Richard Vanstone Miles
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Eclipse Computing (Australia) Pty Ltd
Original Assignee
Arc Res & Dev Pty Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from AUPQ3687A external-priority patent/AUPQ368799A0/en
Application filed by Arc Res & Dev Pty Ltd filed Critical Arc Res & Dev Pty Ltd
Priority to AU11182/01A priority Critical patent/AU783979B2/en
Publication of AU1118201A publication Critical patent/AU1118201A/en
Application granted granted Critical
Publication of AU783979B2 publication Critical patent/AU783979B2/en
Assigned to MXL Consolidated Pty Ltd reassignment MXL Consolidated Pty Ltd Alteration of Name(s) in Register under S187 Assignors: ARC RESEARCH & DEVELOPMENT PTY LIMITED
Assigned to Eclipse Computing (Australia) Pty Ltd reassignment Eclipse Computing (Australia) Pty Ltd Alteration of Name(s) in Register under S187 Assignors: MXL Consolidated Pty Ltd
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current


Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B7/00: Electrically-operated teaching apparatus or devices working with questions and answers
    • G09B7/02: Electrically-operated teaching apparatus or devices working with questions and answers of the type wherein the student is expected to construct an answer to the question which is presented or wherein the machine gives an answer to the question presented by a student

Description

A data collection method

Field of the invention

The present invention relates to a data collection method for use in a reporting system and to a method of implementing same, and to a reporting method and system.
Background of the invention

A number of industries require reports to evaluate systems within that industry. This can be a cumbersome exercise, particularly where a large quantity of data has to be collected, evaluated and compiled into a report in a relatively short period of time.
In the education service sector, for example, teachers who must assess the performance of students need to record the results of each student under their charge and to make a report on his/her progress. This can require a significant amount of time and effort on the part of the teacher. Additionally, with a large number of students under their charge, teachers may be more likely to make mistakes in performing the evaluation, particularly when they are under pressure due to time constraints. The accuracy and/or quality of the reports may therefore suffer because there is no easy way to generate the reports.
Another example of an industry which requires accurate reports is the manufacturing industry. In this regard, it is desirable that the performance of each unit process be evaluated to thereby determine if it is performing satisfactorily, so that the performance of the whole process does not suffer. For example, in a smelting operation there are a number of unit processes other than the smelter itself which require consideration, such as fuel services, labour services, transport services and maintenance. It is important that the manager of such operations be able to constantly monitor and review each of these services so that the overall performance of the smelter is maintained. To achieve this end, the manager may rely on a number of reports to indicate those services which require attention. Again, the generation of reports in this example can be quite cumbersome.
The applicant does not concede that the prior art discussed in this specification forms part of the common general knowledge in the art at the priority date of this application.
"so..
Summary of the invention

It is a preferred feature of the invention to provide a convenient method of collecting data to allow the performance of a subject upon completion of one or more assigned tasks to be included in a report.
A broad aspect of the invention provides a method for collecting data for use in reporting the performance of a subject upon completion of one or more assigned tasks using an electronic processor, said method comprising the steps of: selecting a set of assessment criteria for the assessment of one or more assigned tasks to be completed by one or more subjects and collating the assessed criteria in a database accessible by the processor; grouping different assessment criteria to thereby form a template, the template being accessible and viewable by an assessor; allowing the assessor to assess the performance of at least one task completed by said one or more subjects according to said defined assessment criteria, during or upon completion of said assigned tasks; recording the results of said assessed performance in a database accessible by the processor; and causing the processor to generate a report which reflects the results of said assessed performance.
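The claimed flow (select criteria, group them into a template, assess against them, record, and generate a report) can be sketched as a minimal data model. This is purely illustrative; the patent does not prescribe an implementation, and every class, field and function name below is an invention of this sketch.

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    name: str
    scale: tuple  # e.g. ('Always', 'Usually', 'Sometimes', 'Rarely')

@dataclass
class Template:
    criteria: list  # grouped assessment criteria, viewable by the assessor

@dataclass
class Assessment:
    results: dict = field(default_factory=dict)  # criterion name -> result

    def assess(self, criterion: Criterion, result: str) -> None:
        # record a result only if it lies on the criterion's grading scale
        if result not in criterion.scale:
            raise ValueError(f"{result!r} not on scale for {criterion.name}")
        self.results[criterion.name] = result

def generate_report(subject: str, assessment: Assessment) -> str:
    # the processor 'generates a report which reflects the results'
    lines = [f"Report for {subject}"]
    lines += [f"  {name}: {result}" for name, result in assessment.results.items()]
    return "\n".join(lines)

template = Template([Criterion("Works independently",
                               ("Always", "Usually", "Sometimes", "Rarely"))])
a = Assessment()
a.assess(template.criteria[0], "Usually")
print(generate_report("Student 1", a))
```

The scale check mirrors the idea that each criterion is assessed on a defined scale rather than free text.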
The method may further include the step of: permitting said assessor to select the format of said report from a multiplicity of formats or to design the format of said report.
Optionally, the method further includes the step of permitting said assessor to group each subject according to a particular category.
The method may further include the step of: permitting an assessor to amend said set of performance criteria.
The method may further include the step of: permitting an assessor to assign one or more quality values to said results of said set of performance criteria to thereby determine the quality of each subject or the quality of a group of subjects.
The method may further include the step of: providing a facility to allow the assessor to make general comments in relation to the performance of said assigned task.
The method may further include the step of: providing for a plurality of pre-recorded general comments to be available for selection by said assessor in evaluating the performance of said assigned task.
The Assessor may be permitted to make a selection of said general comments in relation to the performance of a subject, and thereafter rank the general comments so that they appear in that order on said report.
Optionally, a pre-defined comment in relation to the subject is recommended to said Assessor, based upon the assessment of said subject's performance.
The method may further include the step of: statistically analysing the results of the assessed performance against an assessor-defined set. The defined set may be a plurality of subjects.
Optionally, at least part of said statistical analysis is included as statistical information in said report.
For a plurality of subjects, the report may include a comparison of the results of said assessment.
In the recording step, the results can be recorded in a database. The results entered into the database may be input by using a data collection form. The data collection form can be read by a digital scanner adapted for Optical Character Recognition (OCR) and Optical Mark Recognition (OMR).
The results entered into the database may be input by using a device having a digital processor.
The subject may be a natural person, optionally a school student, while the assessor may be a teacher.
Alternatively, the subject may be production units in a process, such as an industrial process.
The report can be printed on paper or alternatively may be available in electronic format, such as an HTML file available from the Internet.
External data can be included in said report from external data sources; said external data may include any one or more of the following data types: Assessee data; Assessor data; Assessment criteria data; Groupings of assessment criteria data; Hierarchies of assessment criteria data.
The pre-defined performance criteria may be recorded in said database for use in a future assessment session.
The results entered into the database can be input by using a device having a digital processor.
The digital processor may exchange data with the database over a communications network, such as the Internet.
The format of said report may be automatically generated by said digital processor upon instructions of an application program. The format of the report may be automatically determined by the digital processor, dependent upon the assessment criteria selected by the Assessor.
In another broad aspect of the invention, there is provided a computer network memory storing thereon an application program for controlling the execution of a processor for collecting data for use in reporting the performance of a subject upon completion of one or more assigned tasks, the computer program controlling the processor to: permit selection of a set of assessment criteria for the assessment of one or more assigned tasks to be completed by one or more subjects; allow for grouping of different assessment criteria to thereby form a template; allow an assessor to assess the performance of at least one task completed by said one or more subjects according to said defined performance criteria, during or upon completion of said assigned tasks; and record the results of said assessed performance.
The computer program may further control the processor to: report the results of said assessed performance in a report.
Optionally, the computer program may further control the processor to: permit said assessor to select the format of said report from a multiplicity of formats or, to design the format of said report.
In yet another broad aspect of the invention, there is provided an electronic data collection system for use in reporting the performance of a subject upon completion of one or more assigned tasks, said system comprising: an application program having a set of selected assessment criteria for the assessment of one or more assigned tasks to be completed by one or more subjects; data storage means for recording assessment data recorded by said application program, said assessment data representing the results of said assessed performance; and input means capable of interfacing with said application program for allowing for grouping of different assessment criteria to thereby form a template, and for allowing an assessor to assess the performance of at least one task completed by said one or more subjects according to said defined performance criteria, during or upon completion of said assigned tasks.
The reporting system may further include a report generating means for reporting the results of said assessed performance in a report and for permitting said assessor to select the format of said report from a multiplicity of formats or to design the format of said report.
"i15 In a further broad aspect of the invention, there is provided a computer network memory storing thereon an application program for controlling the execution of a processor for °ooo.i S° reporting the performance of a subject upon completion of one or more assigned tasks, the S" computer program controlling the processor to: ooooo permit the selection of a set of assessment criteria for the assessment of one or more assigned tasks to be completed by one or more subjects; allow for grouping of different assessment criteria to thereby form a template; S°allow an assessor to assess the performance of at least one task completed by said one or more subjects according to said defined assessment criteria, during or upon completion of, said assigned tasks; record the results of said assessed performance; and report the results of said assessed performance in a report.
In yet another broad aspect of the invention, there is provided an electronic reporting system for reporting the performance of a subject upon completion of one or more assigned tasks, said reporting system comprising: an application program having a set of assessment criteria for the assessment of one or more assigned tasks to be completed by one or more subjects; data storage means for recording assessment data recorded by said application program, said assessment data representing the results of said assessed performance; input means capable of interfacing with said application program for allowing for grouping of different assessment criteria to thereby form a template, and for allowing an assessor to assess the performance of at least one task completed by said one or more subjects according to said defined performance criteria, during or upon completion of said assigned tasks; and a report generating means for reporting the results of said assessed performance in a report.
In the description and claims of this specification the word "report" is intended to be interpreted broadly to include not just a physical paper report, but also electronic files such as HTML documents, electronic mail, audio messages such as voice mail, visual images, video images, etc.
In the description and claims of this specification the word "comprise" and variations of that word, such as "comprises" and "comprising", are not intended to exclude other features, additives, components, integers or steps but rather, unless otherwise stated explicitly, the scope of these words should be construed broadly such that they have an inclusive meaning rather than an exclusive one.
Brief description of the drawings

Notwithstanding any other forms which may fall within the scope of the present invention, preferred forms of the invention will now be described, by way of example only, with reference to the accompanying drawings in which:
Fig. 1 is a block diagram of a processing system in a first preferred embodiment;
Fig. 1A is the processing system of Fig. 1 in data communication with the Internet;
Fig. 1B is a block diagram of the assessment and reporting system of the first preferred embodiment;
Fig. 1C is a block diagram of a group of functional 'units' constituting software used in the first preferred embodiment;
Fig. 1D is a block diagram for set-up of a report using an assessment template;
Fig. 1E shows a display of a window dialogue box in which the Assessor may select additional academic outcomes for a student;
Fig. 2 shows a display of a window dialogue box as presented to an Assessor when they are using the Setup Wizard;
Fig. 3 is a block diagram showing the categories of KPI's, categorised according to several properties;
WO 01/31610 PCT/AU00/01328
Fig. 4 is a block diagram showing an example of how the framework for KPI's may be used to categorise a basic computer skill set;
Fig. 5 is an illustration of a sample Student Report;
Fig. 6 shows a display of a window dialogue box generated by the Template Wizard for the set-up of KPI types in an assessment of a student's 'personal skills';
Fig. 7 shows a display of a window dialogue box generated by the Template Wizard for identifying Curriculum Outcomes;
Fig. 8 shows a display of a sample scale sheet for grading KPI's on an appropriate scale defined by the Assessor;
Fig. 9 shows a display of a sample final report layout;
Fig. 10 is a block diagram illustrating the functionality of a database synchronisation tool;
Fig. 11 shows a block diagram illustrating the process of printing a print job;
Fig. 12 is a block diagram of the interactions between the database and the components of the print process;
Fig. 13 is a flow chart illustrating a file assembly;
Fig. 14 shows scanner interactions;
Fig. 15 is a display of an example of a Mark Book;
Fig. 16 is a display of a number of pre-registered comments within a database;
Fig. 17 illustrates a Student Assessment form with a selection of comments;
Fig. 18 illustrates a display of the comment codes screen of the Mark Book;
Fig. 19 is a display of the general comments panel of the Mark Book;
Fig. 20 is a display of a window presented to an Assessor in order to make comments for a particular student's report;
Fig. 21 is a display of a dialogue box which is presented to an Assessor in order to choose the format of the report that is to be generated;
Fig. 22 is a block diagram for the creation of a new report layout;
Fig. 23 is a block diagram for the insertion of a table in the report;
Fig. 24 is a block diagram of saving a report layout;
Fig. 25 is a window display of the Assign Layout Screen showing the Layout Design, Course Outline and Work Requirements for a report;
Fig. 26 is a window display of the screen for entering Course Outline and Work Requirements information;
Fig. 27 displays a window of the Report Selector;
Fig. 28 is a block diagram of the generation of the Report Printing process;
Fig. 29 is a display of the print selector where ordering is by subject;
Fig. 30 is a block diagram showing how Assessors go through a process of continually broadening their analysis of assessment results to ensure that assessments are consistent across all data;
Fig. 31 is a block diagram for the steps taken by an assessor in validating the data in the database;
Fig. 32 is a block diagram of the steps taken in the rollover procedure for the database;
Fig. 33 is a block diagram of a central server involved with an Internet client user;
Fig. 34 is a block diagram of a local server involved with a LAN client user;
Fig. 35 illustrates a block diagram for a second embodiment;
Fig. 36 illustrates a block diagram for a third embodiment; and
Fig. 37 illustrates a block diagram for a fourth embodiment.
Detailed description of the embodiments

A preferred embodiment provides a method of reporting the performance of school students upon completion of assigned tasks. The assessment is made by a teacher, and the method permits the teacher to define a set of performance criteria for the assessment of classes that are to be completed by the school students. The teacher can assess the performance of the students in the class according to the defined performance criteria, either during or upon completion of said assigned tasks. The teacher records the results of the assessed performance of the students and reports these results in a report. Optionally, the teacher may select the format of the report from a number of formats, or alternatively, the teacher may design the format of the report.
The present embodiment is capable of running on any general purpose computer system or computer controlled Graphical User Interface (GUI), including GUIs that have the ability to present multimedia and virtual reality information. One preferred embodiment is schematically represented in a block diagram in Fig. 1. A computer system 10 comprises a central processing unit (CPU) 11, a memory storage device 12, one or more monitors or graphical interfaces 13, and an entry device 14 such as a keyboard/mouse or speech recognition system.
In one embodiment, an IBM RISC System/6000 comprises a central processing unit (CPU) 11, a memory storage device 12, one or more monitors 13, and a mouse 14. The mouse 14 may be used to select icons. On an IBM RISC System/6000, multiple monitors 13 can be controlled by multiple monitor adaptor cards such as the IBM RISC System/6000 Color Graphics Display Adaptor. The computer system 10 may also have audio input/output capability 17.
In addition, speech synthesis or speech recognition may be provided for the entry device 14. Voice recognition may take place using an IBM VoiceType Dictation Adapter.
In an alternative embodiment shown in Fig. 1A, the CPU 11 can be connected via a network adaptor 15 to a distributed network. Network adaptors 15 are well known; three examples of network adaptors 15 are token ring adaptors, ethernet adaptors, and modems. The system 10 can be connected to other target monitors 13 through a client/server network, a LAN or, in this case, the Internet 16.
Systems that can be used to display graphical images, like icons and windows, are well known. GUIs can be used to control any apparatus having a monitor.
An application program is stored in the memory 12 and will hereafter be referred to as 'ReportCard'.
ReportCard is primarily concerned with aiding in the process of preparing, carrying out, recording, analysing and reporting assessments. Assessments may take the form of observations, measurements, tasks, activities, tests or any other action where an assessment result needs to be collected and recorded. ReportCard concerns itself primarily with assessments in education, which may be such things as: Test Marks; Educational Outcomes; General Observations; and other educational assessment activities.

Overview of Assessment System

A description of the functionality of ReportCard will now follow with reference to Fig. 1B, which shows a block diagram of the overall assessment development and reporting method.
S1. Import Data
At step S1, data related to Students, Teachers and Classes is imported from a school's administrative database to a database stored in the memory 12. In this embodiment, an Assessor is a teacher of school pupils, who are the subjects who must complete a series of assigned tasks set by the teacher.
S2. Set Up Curriculum Bank
At step S2, a 'Curriculum Bank' is defined by the teacher. The Curriculum Bank is a relational database stored in the memory 12. Data relating to student outcomes and comments is able to be input to the Curriculum Bank through any number of, or a combination of, the following input sources: importing data from the Assessor Web Site; importing school-designed data which has been entered into a spreadsheet template, such as a template developed in the Excel spreadsheet provided by Microsoft Corporation Inc; or data input directly into the Curriculum Bank database.
It is also possible to modify imported data in the Curriculum Bank database from any of the above input sources.
S3. Select Appropriate Outcomes and Comments
At step S3, the teacher selects the outcomes and comments which they wish to use to assess the students in their class/subject groups.
S4. Set up Report Templates
At step S4, the teacher sets up the Report Templates. Preferably, this involves one template being set up per subject per year group.
S5. Print Teaching Guides, Class Lists and a sample Student Assessment Form
At step S5, a checking step is performed in which Teaching Guides, Class Lists and an example of a Student Assessment Form are provided to teachers so that they can review the assessment criteria that they have prepared and check that they are correct.
S6. Make necessary corrections or modifications
At step S6, the teacher is provided with an opportunity to make any modifications which are required after the teacher has reviewed the above documents. The changes are made in the Curriculum Bank database.
S7. Print Class Sets of Student Assessment Forms
At step S7, the class sets of Student Assessment Forms are printed and distributed to teachers.
S8. Assess and Record Achievements
In step S8, the teachers record the achievements of students via the selected input means, such as: (i) Student Assessment Forms as they are assessed (requiring a scanner); (ii) directly into the Mark Book; or (iii) directly to the database in memory 12 by entering data via a PC.
S9. Scan and Process Student Assessment Forms
Step S9 is completed if the achievements of the students are recorded either into the Student Assessment Forms or the Mark Book. If this is the case, the completed forms are scanned and processed to bring the relevant information into the database located in the memory 12.
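The scan-and-process stage of step S9 can be sketched as follows: each scanned form yields a row of filled-mark positions (OMR), which is mapped back to outcomes and written into the database. The patent does not describe scanner drivers or form geometry, so the scale, function name and form layout below are entirely hypothetical.

```python
# Hypothetical grading scale; the real scale is defined by the Assessor.
SCALE = ("Always", "Usually", "Sometimes", "Rarely")

def process_form(student_id: str, marks: list, outcomes: list, db: dict) -> None:
    # 'marks' holds the index of the filled bubble for each outcome row
    # on the scanned Student Assessment Form.
    for outcome, mark_index in zip(outcomes, marks):
        db.setdefault(student_id, {})[outcome] = SCALE[mark_index]

db = {}
process_form("s1", [0, 2], ["Listens attentively", "Completes homework"], db)
print(db["s1"]["Completes homework"])  # Sometimes
```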
S10. Design Report Layouts (including Cover Page if Required)
The layout or format of the Reports for the students is designed in a 'Layout Designer'. It should be realised that this may occur at any stage in the process before the reports are generated.
S11. Link Report Layouts with appropriate Report Templates; Add Course Outlines and Work Requirements (if Required)
In step S11, Report Layouts are linked to relevant Report Templates; it is at this stage that Course Outlines and Work Requirements are entered into the software if this feature is to be utilized. This may occur as soon as Layouts have been designed and Report Templates created.
S12. Print and Check Report Summaries
At step S12, summaries of the outcome achievements, grades, test marks and comments are printed for teachers to check before issue of the finalised reports.
S13. Make necessary modifications
Any final modifications to the reports are made at step S13.
S14. Print Reports
Reports are printed at step S14 and collated ready to go home, with a cover page if required.
S15. Roll Over Curriculum and Template Data
At step S15, the Curriculum Bank database and Templates are rolled over into a new database so that these can be modified and used for the following semester or year. In this case, a further step of re-importing administrative data into the database in memory 12 is undertaken to populate it for the current reporting period. In the following period, steps S1 to S15 are repeated.
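The rollover of step S15 can be sketched as copying curriculum and template data into a fresh database for the next period, leaving per-student results behind and re-importing administrative data. This is a sketch under assumptions: the dictionary layout and key names are invented, and the patent does not specify how the rollover is implemented.

```python
import copy

def rollover(old_db: dict, admin_data: dict) -> dict:
    # Carry the Curriculum Bank and Templates into the new period,
    # but start the assessment results empty.
    new_db = {
        "curriculum_bank": copy.deepcopy(old_db["curriculum_bank"]),
        "templates": copy.deepcopy(old_db["templates"]),
        "results": {},
    }
    # Re-import Students/Teachers/Classes for the new period (cf. step S1).
    new_db.update(admin_data)
    return new_db

old = {"curriculum_bank": {"outcomes": ["reads fluently"]},
       "templates": {"English-Yr7": ["reads fluently"]},
       "results": {"student-1": {"reads fluently": "Usually"}}}
new = rollover(old, {"students": ["student-2"]})
print(new["results"])  # {}
```

Deep copies are used so that editing the rolled-over curriculum for the new semester cannot disturb the archived database.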
The Set Up Wizard

The Set Up Wizard is the section of REPORT CARD which provides the overall customization framework for the client's database. A description of the Set Up Wizard will now follow with reference to Fig. 1D.
Step S20 Select Major Outcome Descriptors
At Step S20, the Assessor sets the 'Outcome Achievement Descriptors'. This set of descriptors acts as the default achievement descriptors in the Curriculum Bank database.
Step S21 Set Additional Outcome Achievement Descriptors
At Step S21, the Assessor may choose to set additional Outcome Descriptors, such as descriptors for Class Work Skills, in contrast to previously selected academic descriptors. These descriptors must be attached to the relevant outcomes in the Curriculum Bank database.
An example of a window dialogue box for Step S21 is shown in Fig. 1E, in which the Assessor may select additional outcomes so that non-gradable skills such as behavior and personal skills may be assessed.
Step S22 Enable the capacity to record test results
At Step S22, the Assessor may choose the option to record Test Results. If the Assessor selects 'Yes', they will be required to set default test names; however, if the Assessor selects 'No', there is no option to record these results.
S23 Enable the capacity to record Grade Results
At S23, the Assessor chooses whether or not they wish to use Grades in the assessment of the subjects. If the Assessor selects the option, they will be required to set up to five Areas which they wish to grade; otherwise the grade assessment functionality is not enabled.
S24 Set Grade Descriptors
At step S24, the Assessor sets descriptors for Grades (if the option to use grades has been selected). Up to ten Grade Descriptors may be set; in addition there is the option to use symbols along with the set Grade Descriptors.
S25 Set T/F Student Attributes
At Step S25, the Assessor may set up to six attributes, requiring a true/false response, relating to individual students which are to be assessed.
S26 Set Numerical Attributes
At step S26, the Assessor may set up to three attributes, requiring a numerical response, relating to individual students.
S27 Set T/F Class Attributes
At step S27, the Assessor may set up to four attributes, requiring a true/false response, relating to particular class groups.
S28 Provide required configuration and option information
At step S28, the Assessor sets a series of general options and settings relating to School Organisation and the use of REPORT CARD. A window dialogue box 29, as presented to an Assessor when they are using a monitor in association with the Set Up Wizard, is shown in Fig. 2, for formatting the output of the school report. The dialogue box 29 allows the Assessor to assign: the name of the group that the student is assessed in (ie 'House'); the name of the Assessor ('Tutor'); whether there will be a Cover Page on the report; whether personal comments will appear on the report ('Comment Blank' selected); the method of entering data ('MFD', which is via Keyboard, or OMR, which is Optical Mark Recognition); the side for the highest mark to occur in the report ('Left' in this case, but 'Right' could also be chosen); and the timing (in days) for backups to be automatically created during the assessment period.
S29 Set years and faculties
At step S29, the Assessor sets the 'Years' and 'Subject Groupings' which they wish to utilise.
The REPORT CARD Units

There are a number of modules, known as 'units', which make up REPORT CARD. These units and their relationships, where present, are shown in Fig. 1C. A description of the particular units will now follow.

Key Point Indicator Groupings (23)

Key Point Indicators (KPIs) are a set of Assessor-defined criteria for assessing the performance of a subject upon completing an assigned task or activity. KPIs may be grouped in a variety of ways for a range of different purposes. KPI's can be categorized according to several properties, as in Fig. 3. These groupings allow the KPI's to be easily manipulated.
In Fig. 4, there is shown an example of how the framework for KPI's may be used to categorise a basic computer skill set. KPI types are used for three main purposes:
1. To allow KPI's to be grouped for the purpose of displaying different 'types' of KPIs on different parts of a report;
2. To allow different 'types' of KPI's to be graded on different scales (e.g. 'Achieved/Not achieved' instead of 'Always, Usually, Sometimes, Rarely'); and
3. To allow performance of KPI's to be analysed according to particular types (subsets).
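The second purpose, grading different KPI types on different scales, can be sketched as each type carrying its own scale, so that a tick-box outcome and a frequency-scaled outcome can coexist in one report. The type names and the `grade` helper below are invented for illustration; the scales are the examples given in the text.

```python
# Each KPI 'type' carries its own grading scale (hypothetical type names).
SCALES = {
    "work_skills": ("Always", "Usually", "Sometimes", "Rarely"),
    "tick_box": ("Achieved", "Not achieved"),
}

def grade(kpi_type: str, value: str) -> str:
    # Validate a grade against the scale belonging to its KPI type.
    scale = SCALES[kpi_type]
    if value not in scale:
        raise ValueError(f"{value!r} is not on the {kpi_type} scale {scale}")
    return value

print(grade("tick_box", "Achieved"))  # Achieved
```

Keeping the scale with the type also serves the third purpose: results can be analysed per type without mixing incompatible scales.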
Referring to Fig. 5, there is shown a sample Student Report showing two tables, one for each of the major types of KPI used: Work Skills; and Performance Skills.
In addition, the 'Parent Interview' tick box is based on a third type of outcome with a True/False grading scale.
There are five major operations involving KPI Types. They are: set up the KPI Types (Names/Grading Scales etc.); identify which outcomes are of each type; grade KPI's on the appropriate scale; design the Report Layout to show different KPI Types; and analyse performance for each KPI Type.
Referring to Fig. 6, there is shown a window dialogue box 30 generated by the Template Wizard for the set-up of KPI types in an assessment of a student's 'personal skills'. The dialogue box 30 allows the Assessor to define a grading scale for the student's personal skills, such as behavior. The dialogue box 30 also allows for marks to be assigned from two input sources: the teacher's 'Mark Book'; or a 'Scan Sheet' filled in by the teacher and then passed over a scanner having OMR capabilities.
Referring now to Fig. 7, there is shown a window dialogue box 31 generated by the Template Wizard for identifying which Curriculum Outcomes 32 (defined in the Curriculum Bank database) are assigned to particular Curriculum Objectives 33 for a particular learning area 34. The KPI's can be graded on an appropriate scale defined by the teacher, as shown in the sample scale sheet in Fig. 8.
A sample of a final report layout is shown in Fig. 9; this shows the 'Personal Outcomes' defined by the Assessor, indicated by arrow 34.
Database Synchronisation Tool The purpose of the database synchronisation tool (DST) is to provide an uplink and synchronisation function to allow a primary database to link to a range of administration databases in the memory 12.
Prospective client computers of the DST have legacy administration systems used in a variety of areas of their business. The DST 'feeds off' the data in the existing administration systems.
The amount of useful data which can be extracted varies from one client computer site to another. The extraction process is designed to extract the maximum amount of useful data. Data in the DST is preferably never deleted; however, records which are no longer stored in the main administration system are marked as inactive.
Some of the fields in the DST need to be updated to reflect changes in the main administration system; others are only updated on the first data import. It is preferable that the DST imposes a minimum of restrictions on incoming data in order to allow it to operate with a wide range of differing database structures and formats. At the import phase some data grooming needs to occur in order to ensure the successful operation of the DST. Fig. 10 is a block diagram illustrating the functionality of the DST.
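The mark-as-inactive behaviour described above can be sketched as follows. This is an illustrative sketch only; the function and field names are assumptions, not the DST's actual implementation:

```python
def synchronise(dst_records, admin_ids):
    """One synchronisation pass: records absent from the main
    administration system are marked inactive, never deleted.
    (Sketch; names and structure are illustrative assumptions.)"""
    for record_id, record in dst_records.items():
        record["active"] = record_id in admin_ids
    return dst_records

dst = {"S001": {"name": "Alice", "active": True},
       "S002": {"name": "Bob", "active": True}}
# S002 no longer appears in the administration system
synchronise(dst, admin_ids={"S001"})
# dst["S002"]["active"] is now False; the record itself remains
```

Note the record body survives the pass; only its active flag changes, preserving history across imports.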
Handling Student/Class/Teacher records The student/class/teacher data is normally downloaded from the school's administration system into REPORT CARD. This data is stored in Personal Files and contains information about Staff, Students, Classes and Marks.
The Student Files section is broken into six sections and contains: 1 General information about the student such as name, student number and pastoral group; 2 Details of classes attended; 3 Record of outcome achievements, including general comments and the ability to plot a graph from the outcomes achieved; 4 Record of absences; 5 Comments from Senior Staff and Home Room teachers; and 6 Area for additional notes.
The Staff Files section is broken into four sections and contains: 1 Non-confidential general information such as name, position and staff number; 2 Details of classes taught; 3 Security information; and 4 Area for additional notes.
Class Records is broken into four sections: 1 General information about the class including year, associated teachers, when it runs and additional information; 2 Class list, containing the names of the students in the class; 3 Class outcomes achievements summary; and 4 Area for additional notes.
The Assessor can add a new student record by creating a New Record, or by editing an existing record. Records can also be updated to indicate that a student has left the school. The Assessor can also change the class the student is in if the same outcomes are used in both classes (if the outcomes are different, the change will need to be made through Class Records). Adding or editing a teacher's record can also be made through Staff Files. All changes to class details are made through Class Records. REPORT CARD allows you to have more than one teacher for a particular class, e.g. specialist teachers for Library Studies, Sport etc.
The Curriculum Bank database The Curriculum Bank database allows you to store all the General Comments and Outcomes which teachers need to draw from to create a student's report. Put simply, it is a holding bay which can be added to or amended at any time.
The Outcomes are grouped and separated by stages. The statements in the General Comments area are linked to a general topic and can also be made subject specific.
Importing Curriculum Data Outcomes and General Comments can be imported into the Curriculum Bank database. There are two ways to do this: 1 The Assessor can either download them from the 'Assessor's Area' on the Web or use the ExcelTM spreadsheet which is provided with the REPORT CARD software and type in the Outcomes/General Comments; 2 The Assessor can import the data by selecting a 'Batch Processes' option provided on the 'Main Menu' of the REPORT CARD software and selecting the file from which to import the data. REPORT CARD will then go through a process of checking that the data is in the right format. Once this checking is complete and no critical errors are found, the data is stored in the Curriculum Bank database.
Template Wizard The Template Wizard is used to create Report Templates. A Report Template is the collection of outcomes, tests and comments that a teacher will choose from when reporting on the achievements of students in a particular class. Once a Template has been created, it can be used by different classes and used year after year. It can also be modified each year as courses change. The Template Wizard takes you step by step through the process of creating and maintaining a Template.
The first step is to run the Template Wizard. This is done by selecting the Reporting Template Attributes.
These are set up to match the needs of a particular faculty within the school and to address the areas of the curriculum taught at a particular stage in the learning continuum. These attributes may include: Test results; Grades; Outcomes; General Comments; and Grading Categories (such as Homework, Conduct etc).
The Assessor selects which Outcomes are to be on the Template. The Outcomes are chosen from the Curriculum Bank database. The order of the outcomes on the Template can be changed easily by selecting the data field and using the up/down arrow icons. If the Assessor does not provide a grade for an Outcome, it will not appear on the final report. For example, if there were a couple of accelerated students in the class, it would be better to include all the outcomes for all students in the class and only grade the extra outcomes for those accelerated students.
The next step is to create or edit a set of General Comments that the Assessor wishes to use with this Template. The Assessor has three options: 1 teachers can type the comments directly into REPORT CARD; 2 type the comments into a word processor such as the WordTM word processing software provided by Microsoft Corporation and then cut and paste them into REPORT CARD; or 3 create a set of standard comments from the Curriculum Bank database.
The next and final step to complete the Report Template is to link the class/classes which will use this template. A list of all classes appears on the screen and the Assessor selects the appropriate classes and clicks on Add. The Assessor can then print a Teaching Guide for the teacher(s) to check the Outcomes, General Comments etc. Any amendments are preferably made before the scan sheets are printed. The Assessor can make changes at any time before the Template starts to be used. Once a Report Template is in use, it cannot be altered.
Scan Sheet Printing Scan sheets are printed with the intention of: presenting options to an Assessor about a student's performance in a particular subject area; gathering information from that Assessor about that student's performance by marking up the form; and providing an automated system by which the information thus gathered is entered into the database. A scan sheet print run creates scan sheets from the database according to the database's current template information and the student-class-template relationships provided by the Assessor for that print run.
Scan sheets are printed according to advice gained from the database at the moment of printing. Since template information may change (an Assessor changes the items listed on a template, or the template is replaced by another, etc.), the printing process must comprehensively save a snapshot of the printed template. Each printed form contains enough information to find the original information.
Each time a print job is run, the system creates 'generation' information allowing the form to be related to its underlying data regardless of changes an Assessor might make to the original template's structure.
This allows the system to impose fewer rules on the Assessor of REPORT CARD.
For Example...
A form can be created, printed and distributed.
The REPORT CARD Assessor may then change their mind about what should be on the form.
The Assessor changes the appropriate templates in the system.
Previously distributed forms (now out of date) may still be marked up by appropriate persons.
Out-of-date forms are returned and processed and the system does not object; all data is saved in a meaningful manner in the context of the changes to the template(s).
REPORT CARD generates a print job table, containing (among other things) the following information: Student Identification Number; Class Identification Number; Template Identification Number; and TCS (Template Class Student) ID. The three Identification Numbers uniquely identify the form. The database is invited to generate a unique number that simplifies access to the Student/Class/Template information, the associated marks, and the paper form that is generated as the physical output from this process.
A generation number is created to enable tracking of the template generation for later recognition. Each form is tagged with the Student/Class/Template Identification Number (TCS ID) and with the generation number.
This is plain printed text.
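The print job row described above might be modelled as follows. The field names and the tag format are illustrative assumptions; the patent specifies only the content of the table:

```python
from dataclasses import dataclass

@dataclass
class PrintJobRow:
    """One row of the print job table (field names are assumptions)."""
    student_id: int
    class_id: int
    template_id: int
    tcs_id: int        # unique key generated by the database
    generation: int    # template generation at the moment of printing

def form_tag(row: PrintJobRow) -> str:
    """Plain-text tag printed on the form: TCS ID plus generation.
    The concrete layout of the tag is an assumption."""
    return f"{row.tcs_id}-{row.generation}"

row = PrintJobRow(student_id=1042, class_id=7, template_id=3,
                  tcs_id=500123, generation=2)
```

Tagging each form with both the TCS ID and the generation number is what lets out-of-date forms be processed against the correct template snapshot.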
Error Checking and Correcting The TCS number is important to the system's functioning and so is duplicated, and a check digit is added to the number for later error detection and correction. This digit is called the Longitudinal Redundancy Check (LRC).
The TCS number and generation number are concatenated and the LRC is calculated from the resulting number. The resulting number is then printed in the bottom left and bottom right of the form.
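The patent does not specify the exact LRC algorithm. The sketch below uses a simple digit-sum-mod-10 check digit over the concatenated TCS and generation numbers, purely to illustrate the duplicate-plus-check-digit scheme:

```python
def lrc_digit(number: str) -> str:
    """Check digit over a string of digits.
    Digit-sum mod 10 is an illustrative assumption; the patent
    does not define the LRC algorithm itself."""
    return str(sum(int(d) for d in number) % 10)

def with_lrc(tcs: int, generation: int) -> str:
    """Concatenate TCS and generation numbers and append the LRC,
    as printed at the bottom left and bottom right of the form."""
    payload = f"{tcs}{generation}"
    return payload + lrc_digit(payload)

def verify(printed: str) -> bool:
    """Recompute the LRC over the payload and compare."""
    payload, check = printed[:-1], printed[-1]
    return lrc_digit(payload) == check

code = with_lrc(500123, 2)   # -> "50012323"
assert verify(code)
```

Because the full number is printed twice, a single OCR failure can be recovered from the other copy, as described in the processing section.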
Alignment and Trimming In order to facilitate later form trimming, alignment and processing of marks (all described in Scan Sheet Processing), alignment marks are added in three corners and alongside areas of interest. Alignment marks are black rectangles of an agreed size.
Human Interface Assessment forms are printed on a standard printer such that all relevant details are readable by a human in natural language. The forms are given headings and other information in plain text so a human can identify relevant TCS details and read the information on the form.
The Assessor can mark up the form, and a human can interpret the marks.
Machine Interface The forms are printed such that a machine can reliably be used to read the marks.
Generation Information The printing process creates a temporary table within the database which contains all details required to print the forms for the current print job, including generation information.
This table is used to create a snapshot of the template information necessary to recognise any form that was ever printed, even if the original template is subsequently altered within the database.
Creating Print Job and Printing Referring to Fig. 11, there is shown a block diagram illustrating the process of printing a print job as described below, including the Calibration process. Fig. 12 is a block diagram of the interactions between the database and the components of the print process.
Creation of the print job information is done by a program called process.exe. It is also responsible for creating the ODBC DSN to enable the printing engine to print the forms.
The print engine (formprt.exe) is executed when process.exe is about to exit.
Calibration Because output from different printers could vary, the print engine should preferably be capable of printing a form which will enable the scan sheet processing step to effectively operate. For this reason, the print engine can print a calibration form. This form contains an example of all of the characters the scan form is expected to contain.
Scan Sheet Processing Scan sheet processing is the system by which information contained on paper forms, printed at the Scan Sheet Printing step and hand marked up by Assessors, is entered into the database. Generally, Scan Sheet Processing is a two-step process as follows: 1 Scan the paper and save it within the computer as a digital image.
2 Process the digital image, saving marks found into the database.
Both these steps are expanded further below.
Scanning Scanning, for our purpose, is the process by which a marked form is converted to a digital image file and thereby made available to the processing step. The scanning step is designed to be done using any standard scanner for which TWAIN drivers are available. The process happens in the following manner: Scanner PCs are allocated an Identification Number which is unique. The default is 1, since most organisations will only have a single scanner subsystem. This Identification Number is used to generate unique file names, allowing multiple scanning subsystems to work independently and without knowledge of one another.
The scanner device is loaded with one or more scan sheets (according to its capability and the number of scan sheets to be processed).
The scan program is told to commence scanning.
Scan subsystem commands the scanner to commence scanning.
Motorised scanners keep scanning until they run out of paper.
Images are saved as separate files on the file system available to the PC according to information found in an initialisation file.
This process is repeated until no more scan sheets are available.
All images are saved in monochrome (1-bit) as TIFF files numbered sequentially in order of scanning.
Several scanning systems can operate at the same time saving images to the same place. File names are generated in such a manner as to prevent conflicts.
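The conflict-free naming scheme can be sketched as follows. The exact file name pattern is an assumption; the patent only requires that the unique scanner subsystem ID make names unique across scanners saving to the same place:

```python
import itertools

def filename_factory(scanner_id: int):
    """Return a generator of TIFF file names for one scanner
    subsystem. Baking the scanner ID into every name lets several
    scanners save to the same directory without conflicts.
    (Naming pattern is an illustrative assumption.)"""
    counter = itertools.count(1)   # sequential, in order of scanning
    return lambda: f"scan_{scanner_id:02d}_{next(counter):06d}.tif"

next_name = filename_factory(scanner_id=1)
first, second = next_name(), next_name()
# first  == "scan_01_000001.tif"
# second == "scan_01_000002.tif"
```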
The resolution at which forms are saved is recommended to be 300 dots per inch; however, for slower machines or machines with limited disk capacity this can be reduced to a minimum of 200. Processing is unaffected by changes in resolution.
Calibration is also performed. Calibration is the process by which the scanner and printer are matched by the processing engine. As each scanner and each printer has different characteristics, the system gathers information about the density of expected letters and numbers as printed and scanned. This increases the reliability of the entire process to such an extent that it has been made obligatory for Assessors to calibrate before any processing can be done.
The processing software is launched and checks that valid and complete calibration data exists. If not, it asks the print engine to print a calibration sheet. The calibration sheet is printed as described in Scan Sheet Printing.
The calibration sheet is then put on the scanner and is scanned.
The processing software opens the scanned calibration sheet and inspects the density of the printed letters and numbers on the page, storing these pixelation densities for each letter and number combination. When calibration has been successful, the processing engine switches to normal operation.
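The density measurement at the heart of calibration can be sketched as follows, assuming a 1-bit image represented as rows of 0 (white) and 1 (black) pixels. The representation and names are illustrative assumptions:

```python
def pixel_density(cell):
    """Fraction of black pixels in a character cell of a 1-bit image
    (1 = black, 0 = white in this sketch)."""
    pixels = [p for row in cell for p in row]
    return sum(pixels) / len(pixels)

# Calibration pass: store the density of each printed character
# as it appears after printing and scanning.
calibration = {}
sample_cells = {"A": [[0, 1], [1, 0]],   # toy 2x2 cells for illustration
                "%": [[1, 1], [1, 0]]}
for char, cell in sample_cells.items():
    calibration[char] = pixel_density(cell)
```

These stored per-character densities are the baselines against which mark detection later compares.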
Processing Processing is the means by which the digital image files are examined by the PC and the information found thereon is entered into the REPORT CARD database. The input is a form printed by the Scan Sheet Printing component of REPORT CARD, or an image thereof. Generally, the system takes an optimistic view of the forms it has processed.
Erroneous marks only appear as such within its own Assessor interface. From other sections of REPORT CARD, the system does not show alarming error messages to the Assessor. This helps prevent Assessors without access to the original scanned image from precipitating changes.
The scanned image is straightened ('deskewed'). Using the three alignment marks located at the top left, top right and bottom right of the form, the system determines if the image is upside down and rotates the image through 180 degrees if necessary. The same alignment marks are then used to trim the margins from the images.
Having trimmed the form to approximately known dimensions, the system finds and reads (using Textbridge OCR) the form identification numbers at the bottom left and right of the page. The page number is also identified.
Error Checking and Correcting using LRC Once the numbers have been recognised, the accuracy of the OCR is verified using the LRC as explained above. The left hand number is verified.
If an error is detected, the number on the right is verified.
If an error is detected with this number, the form is saved for Assessor intervention later.
If all went well, processing continues.
The generation number is stored, and the database is interrogated for the Student, Class and Template Identification Numbers.
Appropriate form generation information (stored during printing) is retrieved and cached.
Form Pages Completeness check.
If another page has earlier been found with the same TCS number, and that page did not contain the complete set of responses, the current page is joined to that page in page number order. The two joined forms become one and that form is processed as follows.
The alignment marks showing mark locations on the page are counted and the page is assessed for completeness.
If more alignment marks are expected, the page is stored and will await other pages with the same Identification Number.
The flow charts in Fig. 13 and Fig. 14 provide an illustration of page assembly.
Read Markable Sections In order to find the sections, the number and location of marked sections for this particular form are stored during the completeness check described above. Because no mark location is absolutely guaranteed (forms are not expected to be exactly straight or to have marks only in regulation areas), each form is individually examined to find the marked sections.
Detecting Marks Marks are detected by examining areas next to the alignment mark and comparing the density of black to the calibrated value for each particular letter. Each letter is examined separately, however if a letter is over a threshold value greater than the calibrated value it is considered to be a mark. If a letter is near this value, and no better marks are found, it is considered to be a doubtful mark and the form and mark are flagged for later Assessor intervention. Possibly the person marking the form makes light marks or has poorly erased a mark. The person later correcting errors is therefore advised to check the rest of the form too.
If two marks are found greater than the threshold, and one is significantly darker than the other, only the dark mark is stored, otherwise an error is flagged if multiple marks contravene the rules.
Various rules apply to different types of grading that allow or disallow blanks, multiple marks and so on.
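The mark/doubtful/none decision described above can be sketched as follows. The margin values here are illustrative assumptions; in the actual system they derive from calibration:

```python
def classify_mark(measured, calibrated, margin=0.15, doubt=0.05):
    """Compare the measured pixel density of a letter against its
    calibrated value. Over the threshold: a mark. Near it: doubtful,
    flagged for later Assessor intervention. Otherwise: no mark.
    (Threshold values are assumptions for illustration.)"""
    excess = measured - calibrated
    if excess > margin:
        return "mark"
    if excess > margin - doubt:
        return "doubtful"
    return "none"

assert classify_mark(0.80, 0.50) == "mark"      # well over threshold
assert classify_mark(0.62, 0.50) == "doubtful"  # near threshold
assert classify_mark(0.52, 0.50) == "none"      # light or erased
```

The "two marks, darker wins" rule layers on top of this: if two cells classify as marks and one is significantly darker, only the darker is kept.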
More about Pixelation Thresholds Pixelation thresholds vary between characters according to the amount of ink on the printed page. For example, one character may contain very little ink whereas another symbol (considered by the system to be a single character) contains a much greater amount of ink. Using a similar threshold for both these 'characters' is not appropriate, since it would take a great deal less of a mark to push the low-ink character over the threshold and so have a mark recorded. The result is that an unmarked low-ink character (among other characters) will have a great number of 'doubtful' marks recorded, whereas the high-ink character will likely have none.
The calibration subsystem makes up for this kind of threshold anomaly in various ways. Examples of the simplest of these follow: The threshold is not considered crossed until a certain overall percent pixelation is reached.
Characters with less 'ink' have a higher threshold.
Characters are grouped in various ways.
Defaults are set for unknown characters (e.g. double-byte or Unicode characters).
The anomaly differs between different scanner-printer combinations and so although a safe value is set for simplicity, advanced Assessors may make threshold alterations at a very deep level.
Rules about Overwriting Existing Marks Each form is broken into sections, each having a particular style of options. If any mark is found in a section, all existing marks in the database for that section (however entered) are removed and replaced by the found marks. If no marks are found, any marks in the database relating to that section of the form are left untouched.
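The section-level overwriting rule reduces to a simple merge, sketched here (names are illustrative assumptions):

```python
def merge_section(db_marks, found_marks):
    """Overwriting rule for one section of a form: if any mark is
    found on the scanned form, the found marks replace all existing
    database marks for that section, however they were entered;
    if no marks are found, the database is left untouched."""
    return list(found_marks) if found_marks else list(db_marks)

# Found marks replace existing ones for the section:
assert merge_section(["B"], ["A"]) == ["A"]
# An empty section on the form leaves the database untouched:
assert merge_section(["B"], []) == ["B"]
```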
Image Archiving.
As images are processed (either successfully or unsuccessfully) they are moved to another place in the file system for later examination or archive purposes.
Error Correcting Human Interface During Scan Sheet Processing, a number of errors can arise. The processing engine is designed to catch these errors, deal with them whenever possible, or ask an Assessor when it cannot itself resolve the problem.
That having been said, all forms are processed without human intervention. That is, the engine may be unattended during processing and will only prompt for Assessor intervention on any errors when asked. Errors are dealt with in batch form wherever possible.
During error correction, the Assessor is shown both the original scanned image (in a floating window) and a special data entry interface (in the background). Segments of the image that are of interest can be zoomed into and out from using the Assessor's pointer device (usually a mouse).
These errors and the process by which they are dealt with are enumerated here.
When an Assessor terminates correction of a form, they have the option of marking the form as error free. If the form is marked as error free, it is removed from the error checking cycle altogether.
Error in OCR shown by LRC mismatch or other Identification Number Checks This can be caused by various reasons, e.g. obliteration of both form identification marks, something that looked like an alignment mark being found and mistakenly used by the engine as an alignment mark, poor scanning procedure, etc.
The Assessor is shown the area on the form where the processing engine expects to find the form Identification Number and is asked to type the number. The typed number is examined (including LRC) and the form is processed as above. If the LRC is incorrect or the form or generation doesn't exist, or the Student, Class or Template doesn't exist, the Assessor is asked to enter the form by hand.
Pixelation Thresholds Forms that contained marks that approached pixelation thresholds but did not exceed them by a significant margin are shown to the Assessor, with highlights on the offending marks and comments from the engine to provide guidance in correcting potential errors.
It is important to note that this level of error is a relatively benign one, mainly designed to flag forms and marks which are 'doubtful'. Mostly these forms will be found to be error free.
Number of Marks found is Outside the Rules Where the scan engine has determined that any marks on a form contravene the business rules for the engine, these marks are flagged, together with advice, for an Assessor to decide. Examples of these marks include: Percentage mark exceeds 100; Section has too many marks (more marks detected than allowed by the rules).
Section has insufficient marks (a mandatory section was not filled in).
The Mark Book The Mark Book allows Assessors to view, edit and add assessment information stored in the REPORT CARD database. This enables REPORT CARD Assessors to correct assessment data that has been 'scanned' in, or entered into the database by other means, at their PC. The assessment information in the Mark Book is usefully grouped by both: Class; and Report Template. An example of a Mark Book is shown in Fig. 15, which is a display 35 of a virtual mark book presented to a teacher on their PC. The display 35 details the marks assigned by the teacher for a class of students.
Assessment information for a particular assessment subject in the class can be viewed, changed, added or deleted as required. To assist in finding the desired class of assessment subjects, the list of classes and Report Templates to choose from can be reduced to a subset of all available by using the "NARROW" button 36. This allows the lists to be restricted by: 'Year' (arrow 37) selection of this field only displays classes for the year chosen; 'Faculty' (arrow 38) selection of this field only shows classes in the faculty chosen; and 'Reporting Period' only shows templates associated with that reporting period.
Information in the Mark Book is principally shown on 6 'tabs' of the Mark Book window: 'Outcome Achievements'; 'Outcome Key'; 'Test Results'; 'Grades'; 'General Comments'; and 'Analysis'.
Assessment information in REPORT CARD is divided into 4 types: Outcomes; Test Results; Grades; and General Comments. There are two more 'tabs' showing: Outcomes Key; and Analysis. On the Outcomes tab, the outcomes associated with the current Report Template are numbered from 1 to the total number of outcomes used. A tab is given as a legend/key to match the outcome numbers to their descriptions.
The assessment information can also be interpreted graphically by using the 6th tab, labeled "Analysis".
The graphical tool allows configuration of the display in normal ways present in typical charting/graphical tools commonly available. The information that will be displayed in the graph can be chosen by selecting a number of different options on the tab. The data is displayed in a grid which can be viewed before the graph window is created.
The graphical element of the Mark Book is also incorporated into the Test Results tab.
All data displayed in the Mark Book using the grids can be copied to the Windows clipboard by clicking the clipboard button at the top of the window.
There are two ways to change 'General Comments': General Comments may be added/edited in free-form text; or chosen from the list of general comments associated with that report template.
Web Data Entry Web Data Entry is the ability to enter marks into the Markbook via the web. Markbook screens have been duplicated as closely as possible using web pages, permitting approved Assessors to enter marks remotely. Web Data Entry provides capabilities similar to the Desktop version of the Markbook. All Web Data Entry functions can be accessed by (and only by) Teachers.
Login and Authentication uses standard security techniques (username and password). Selection of 'Class' and 'Report Template' is made from a web form popup, to select the particular class the teacher wants to update. Selection of the Markbook function is made from 'tabs', giving one of the following data entry pages:
Outcome achievements: displays student names down the side of the page, and outcomes across the page. Allows the Assessor to select an outcome for a particular student and enter a new value for that outcome. The Assessor repeats entry of changes as required, then clicks on the single 'Update' button to commit the changes to the database. The system verifies that all modified outcomes are valid, reporting any errors (valid outcomes are shown by an on-screen legend).
Outcome keys: displays the keys and descriptions for the outcomes achievements page. (Read only; no update possible.)
Test Results: displays student names down the side of the page, and test results across the page. Allows the Assessor to select a test result for a particular student and enter a new value for that test. The Assessor repeats entry of changes as required, then clicks on the single 'Update' button to commit the changes to the database. The system verifies that all modified tests are valid, reporting any errors (values must be in the range 0 to 100). Optionally allows the Assessor to see a graph of student test results by clicking on the 'Graph' button. The display changes to show horizontal bars (per student per test) across the display, representing the marks scored in each test (student results for a particular test are indicated in the same colour; the colour used for the display changes between tests).
Grades: displays student names down the side of the page, and grades (achievement, effort etc.) across the page. Allows the Assessor to select a grade for a particular student and enter a new value for that grade. The Assessor repeats entry of changes as required, then clicks on the single 'Update' button to commit the changes to the database. The system verifies that all modified grades are valid, reporting any errors (valid grades are shown by an on-screen legend).
General Comments: displays student names down the side of the page, and comments across the page. Allows the Assessor to select a comment for a particular student and enter new text. The Assessor clicks on the 'Update' button to commit the changes to the database (one button per comment field per student). The system verifies the spelling of the comment, reporting any errors in a spell checking dialog (Change, Change All, Add, Ignore, Ignore All, Stop and Cancel; the meaning of these buttons is described in a legend on the spell checking form).
The Comment Bank All comments generated by codes are stored in the Curriculum Bank database. When comments are entered into the Curriculum Bank database they are given a value ranking from 1 to 11, with 1 being the most positive comment, 10 being the most critical comment and 11 being a value for suggestions for improvement.
As a selection of comment statements is compiled, REPORT CARD will automatically place the most positive comments first; it will then place more critical comments in the body of the comment, and place suggestions for improvement last. Fig. 16 is a display 16 of a number of pre-registered comments within the Curriculum Bank database which can be selected by a teacher in making General Comments on a student's performance.
The display 16 allows teachers to have complete flexibility over the available comment set. While a standard set of comments is provided with the software, Assessors can modify, add or delete the comments stored in the Comment Bank database. Teachers wishing to access the Comment Bank have the option to select which sets of comments they wish to select from, by narrowing their available selection through the Template Wizard.
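The automatic ordering of a compiled selection (most positive first, critical in the body, suggestions last) follows directly from the 1 to 11 value ranking, as this sketch illustrates (function and data names are assumptions):

```python
def order_comments(selected):
    """Arrange a compiled comment selection as REPORT CARD does:
    most positive first (value 1), critical comments towards the
    body (up to 10), suggestions for improvement (value 11) last.
    A plain sort on the 1-11 value yields exactly this ordering."""
    return [text for value, text in sorted(selected)]

chosen = [(11, "Could revise homework routines."),
          (2, "Works well in groups."),
          (9, "Is often distracted in class.")]
order_comments(chosen)
# -> positive comment, then the critical one, then the suggestion
```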
REPORT CARD has three interfaces for the creation of subject-specific comments for students: select codes related to a Comment Bank on Student Assessment Forms; select codes from the Mark Book; or type a comment or edit a comment directly in the Mark Book. REPORT CARD also enables the Assessor to have specific Pastoral-style comments entered into the database (e.g. by the Head of Year) for use on the Cover Page of a student's report; this is done by typing the comment directly into the Student's Record.
Select codes related to a Comment Bank on Student Assessment Forms After Report Templates have been created and Student Assessment Forms generated, teachers can create their comment for a student, by selecting the relevant boxes against the appropriate comment category. Teachers use a Teaching Guide as a reference to the comment statements in full, to assist them to make their choices. Fig. 17 illustrates a Student Assessment form with a selection of comments.
Select codes from the Mark Book Comment Bank database comments can be compiled or modified in the Codes section. This is simply achieved by selecting the appropriate Comment Number and Letter and clicking Add. REPORT CARD automatically orders the comment statements according to the values attributed to them in the Curriculum Bank.
Fig. 18 illustrates a display of the comment codes screen of the mark book.
Type a comment directly into the Mark Book Comments can be typed directly into the Comment panel of the mark book. They can also be edited in this section of the program where changes are necessary. Fig. 19 is a display of the general comments panel of the mark book.
Entering Pastoral Comments Principals, Tutors or Year Heads are able to enter a pastoral comment which can then be produced on the cover page of reports through the Student Files. Referring to Fig. 20, this is done by simply accessing the appropriate student's file, selecting the comments tab, and selecting the appropriate Reporting Period and Position, then typing.
Process Reports: The presentation, in the case of a paper embodiment, is known as a 'Report'. This allows the Assessor to: choose a defined information area to be presented; choose to restrict the information returned for that area; choose to apply formatting options to the presentation of the information; choose a source for the information if more than one is available; save their choices under a name to be used again at a later date; and make their choices the default for them when using the information presentation again.
The Assessor may also view the presentation on the screen, print the presentation to paper, or save the raw information used for the presentation. The term 'Process Reports' covers the presentation of information held in REPORT CARD for printing, viewing or sending to others in various formats, but not the core assessment report for the individual assessment subject (in a school's educational assessment environment this would be the student's 'school report').
Process reports allow the REPORT CARD Assessor to consider assessment information by a variety of different groupings, with a variety of different calculations applied, such as averages, totals, etc. They also allow the Assessor to select only the information they desire, by giving choices for the set of information that will be presented in the report.
Process reports are specific and unique presentations of assessment information from the REPORT CARD database. For ad hoc information retrieval and presentation any of the widely available query and reporting tools would be suitable. For example Microsoft Access has such tools built in.
Process reports are delivered in discrete units allowing the Assessor to add new reports as they become available without having to install a new version of REPORT CARD.
The main components involved with running a process report are: the Common Components, the REPORT CARD Database, and the Process Reports System components. Fig. 21 is a display 40 of a dialogue box which is presented to an Assessor in order to choose the format of the report that is to be generated for a particular student. The display 40 also has a 'Preview' button that allows an Assessor to preview the format of the report prior to printing.
Report Designer The Report Designer allows the Assessor to create and place objects on a virtual page of a report, known as a Layout. The objects may contain literal data or data that is populated from the database in memory 12. The layout of these objects and their constituent properties and actual contents can then be saved to a file on either diskette or hard disk as a template, in a unique, custom-designed file type for later use with data from the database.
Report Templates consist of a combination of the Group, Subject and Key Reporting Areas. A printed report is created by combining a Report Template with a Report Layout. When reports are printed, the appropriate layout can be populated with data from the REPORT CARD Database about each "person" being reported. As each "person" may belong to many groups, it is possible for one "person" to have several report templates associated with them.
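A minimal sketch of the template-plus-layout combination described above, assuming a simple {field} merge syntax and dictionary-based structures; REPORT CARD's actual file formats and field names are not specified here.

```python
# Illustrative merge of a Report Layout (objects with merge fields)
# with a Report Template (the fields to report) and one person's data.
def render_report(layout, template, person_data):
    """Populate a layout's merge fields with a person's data for one template."""
    lines = []
    for obj in layout:                    # each object on the virtual page
        text = obj["text"]
        for field in template["fields"]:  # Key Reporting Areas in the template
            text = text.replace("{%s}" % field, str(person_data.get(field, "")))
        lines.append(text)
    return "\n".join(lines)

layout = [{"text": "Report for {name}"}, {"text": "Mathematics: {maths_grade}"}]
template = {"fields": ["name", "maths_grade"]}
report = render_report(layout, template, {"name": "Jo", "maths_grade": "A"})
```

The same layout can be reused with many templates and many persons, which is the reuse benefit the specification describes.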
There are two distinct layout types that can be created on the designer: Cover Page; and Report.
The Cover Page contains more merge fields than the Report definition. These merge fields refer to information that is not available in each ARC Report Template. During a print run, a Cover Page is used once for each "person" being reported, while a "person" may have multiple Report definitions used with different report templates. This allows the supervising body to build a comprehensive reporting structure, based on different facets of the individual subjects, for each "person".
The objects that can be placed on the virtual page are: Text boxes containing literal text and marker fields that can be populated with data merged from the ARC database.
Non-embedded images which are referenced from local storage on the computer.
Graphs: descriptive markers which are populated with merged data. Tables: descriptive markers which are populated with merged data. These objects can be formatted in different ways to produce distinctive and different report layouts. The formatting includes the expected text formatting of font style, size, bold, italic, underlining, left and right alignment, text justification and tab settings. Each object may have borders applied, and the images, graphs and tables may have titles applied, separately from the properties of the object.
Images of varying types (bitmaps, jpeg, portable network graphics, etc.) may be displayed on the pages.
There are several graphing styles from which the Assessor may choose, for example Bar Chart, Line Chart, Rocket Graph and others. Each of these graphs may be varied in the details being graphed by selecting database fields that contain numerical data and are marked as being available for inclusion on a design layout.
Tables are built from the database store of various Outcome Types for subjects. The formatting of the tables is done by choosing various properties on a dialog box and applying these properties to a particular outcome type.
After placement the table's text contents can be formatted using the normal text formatting procedures.
The layout and placement of these objects is performed by "drag and drop" operations of the mouse on the computer screen. The objects may be sized by selecting them with the mouse and then dragging on the markers to fit. Objects may be locked in place. The saved file is a unique file type to allow quick loading of the files and information for rapid display of the designs. The layouts can be printed directly from the designer to allow the Assessor to see how the layout design will look.
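The save-to-file step can be illustrated with a stand-in serialisation. JSON is used here purely for the sketch; the actual custom file type used by the designer is proprietary, and the object properties shown (position, size, locked flag) are assumptions.

```python
# Illustrative round-trip of a layout (object positions, sizes and
# properties) through a serialised file, standing in for the
# designer's custom file type.
import io
import json

def save_layout(layout, fp):
    json.dump({"version": 1, "objects": layout}, fp)

def load_layout(fp):
    return json.load(fp)["objects"]

layout = [{"type": "text", "x": 10, "y": 20, "w": 200, "h": 40,
           "text": "{student_name}", "locked": True}]
buf = io.StringIO()
save_layout(layout, buf)
buf.seek(0)
restored = load_layout(buf)
```

A structured, versioned format of this kind is what makes the "quick loading ... for rapid display" property achievable.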
The nature of the activity of the designer is unstructured, allowing the Assessor to nominate which activity they wish to perform at any stage. There are five areas that follow a structured approach. These are: 1. Opening the Designer/ Creating a New Layout The Assessor is presented with the option of creating either a Cover Sheet or a Report Layout. The choice determines the number of merge fields available to be inserted in text boxes. The list of merge fields is drawn from the database. All the default values are loaded, i.e. Page size, margins, etc. The first page of the new document is then drawn on the screen, along with the page ruler. The default insert type of Text is chosen and marked on the menu bar. Refer to Fig. 22 for an image of this operation.
2. Inserting an Image The Assessor is presented with a dialog box to allow the selection of an image file. A list of suitable files will be created from the available picture files on the local disk. The Assessor can select these through a standard dialog box.
Upon selection of the file the image is inserted to the position described by the drag drop operation of the Assessor, on the layout.
A reference to the location of the file is stored in direct memory, prior to saving the file, along with the rest of the layout descriptions.
3. Inserting a Graph Refer to Fig. 23 for the block diagram of this operation. The Assessor is presented with a dialog box containing a variety of properties from which to choose. The choice of properties determines the format and contents of the graph to be displayed.
The Assessor chooses the properties of Graph Type, Title, Border, Horizontal and Vertical axes scale and description, the colours for each type of merge data being graphed and the merge fields indicating the names to be displayed in the legend for the graph. When all the selections have been made, the Assessor closes the dialog box and the graph is inserted to the screen.
4. Inserting a Table Refer to Fig. 24 for the block diagram of this operation. The Assessor is presented with a dialog box containing a variety of properties from which to choose. The choice of properties determines the format and contents of the table to be displayed. The Assessor chooses the properties of the Outcome Type options provided from the database, width of the description column, the type of table to be used, the number of columns (where appropriate), the marking type for the results of each outcome, any shading and colours to be used on the table. The Assessor may choose to include a legend when the table is written to the screen.
When the dialog box is closed, the table is drawn on the screen. If the legend is to be displayed, it will be drawn on the screen as well.
5. Saving the Layout The Assessor is presented with a dialog box to determine whether this layout will be available for use with the Report Templates in REPORT CARD. The Assessor is then asked to provide a unique name for the layout, so that it may be referenced in the database. The Assessor is then asked to provide a name for the disk file. These two names need not necessarily be the same.
Assigning Layouts The Assign Layout feature enables the Assessor to attach particular Report Layouts to the Report Templates which were created in the Template Wizard. This functionality enables the Assessor to have any number of Layouts and to quickly and simply attach them to the relevant Report Template. Course Outlines and Work Requirements can also be entered into the Assign Layout feature and then attached to the relevant Report Templates.
1. Selecting and Attaching the Relevant Report Layouts to Report Templates The Assessor is able to select from a drop-down list any of the available Report Layouts. The Assessor may then attach the chosen Report Layout to any number of given Report Templates.
2. Attaching Course Outlines and Work Requirements to Report Templates The Assessor is able to create a series of Course Outlines and/or Work Requirements, which can then be attached to the relevant Report Templates. This function enables the same Report Layouts to be used for a series of subjects, even though they have vastly different Course Outlines and Work Requirements. Fig. 25 is a display of the Assign Layout screen showing the Layout Design, Course Outline and Work Requirements which have been attached to the relevant classes. Fig. 26 is a display of the screen for entering Course Outline and Work Requirements information.
Report Printing The REPORT CARD Report Printing Engine combines the details captured in the REPORT CARD database with the layouts created using the Report Designer, to produce a final report consisting of an optional cover sheet and a series of printed sheets for each person/class combination in an Assessor-defined selection list.
There are preparatory steps to be done within the REPORT CARD application, prior to the printing of reports.
These are: define a Class/Subject Template in which to capture information about each person; capture all the information; design a Layout to be used with the Class/Subject Template; and assign the Layout to the Class/Subject Template.
When the Print Selector section of the application is selected, a list of available people in classes using a Class/Subject Template (described as a Report Name) is displayed in the Print Selector. The list is filtered according to the selection of a combination of 'Year' and 'Faculty and Period' categories. The list of potential candidates may be further filtered by selecting all students or only those students that are current at the location.
The Report Selector allows the Assessor to select items from the left side of the screen, adding them to a list on the right (refer Fig. 27), building this list as a batch of reports to be printed. A cover page Layout Design may be selected. This design will apply to each student in the list. The order of items to be printed may be adjusted through preset options or manually. The Assessor can choose to preview a report prior to printing, or they may print a sample of the reports prior to a complete run.
When the selection is completed to the Assessor's satisfaction, the print engine will calculate the average scores and the ranking of the scores and students, prior to assigning the values to each student in turn. Attendance figures are examined for each student. This data is stored in a temporary database ready for use during the print run, and ready for re-use should the Assessor need to reprint the reports.
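The pre-print calculation step can be sketched as follows: averages and rankings are computed once and cached before any page is produced. The data shapes are illustrative; the real temporary database is internal to REPORT CARD, and the tie-handling shown (tied marks share a rank) is an assumption.

```python
# Illustrative pre-computation of averages and rankings before a print run.
def precompute(scores):
    """scores: {student: mark}. Returns {student: (mark, average, rank)},
    with rank 1 for the highest mark (ties share a rank)."""
    average = sum(scores.values()) / len(scores)
    ordered = sorted(set(scores.values()), reverse=True)
    rank_of = {mark: i + 1 for i, mark in enumerate(ordered)}
    return {s: (m, average, rank_of[m]) for s, m in scores.items()}

cache = precompute({"Ann": 90, "Ben": 75, "Cal": 90})
```

Computing once and caching mirrors the specification's point that the same figures can be reused should the reports need reprinting.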
The computer iteratively examines each line in the selection list, evaluating if a "new" student is being reported. If a Cover sheet is required, then it will be printed prior to any other report for the student, retrieving merge information about the student from the database.
Each report template for the student is then evaluated to determine the layout design to be used. The layout is found and loaded to the computer. The various merge fields on the layout are populated. The graphs are populated according to the values of the fields being graphed. The tables are populated according to the values of the outcomes for the subject being reported, along with any comments from the bank of comments stored about the student.
Upon completion of the compilation of all reports for the student, a batch is sent to the printer for physical printing, and the next student's reports are processed, and so on to the end of the list. As each student's reports are printed, a dialog box indicating the progress is displayed on the screen. A block diagram of the Report Printing process is shown in Fig. 28.
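The iteration over the selection list described above can be sketched as follows: a cover sheet is emitted once per "new" student, before that student's subject reports. The row and job representations are simplified stand-ins for illustration.

```python
# Illustrative print-run loop: cover sheet first for each new student,
# then one report job per (student, subject) row in the selection list.
def print_run(selection):
    """selection: ordered (student, subject) pairs. Returns print jobs."""
    jobs, seen = [], set()
    for student, subject in selection:
        if student not in seen:          # "new" student: cover sheet first
            jobs.append(("cover", student))
            seen.add(student)
        jobs.append(("report", student, subject))
    return jobs

jobs = print_run([("Ann", "Maths"), ("Ann", "History"), ("Ben", "Maths")])
```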
Fig. 29 is a display of the print selector (unfiltered selection) where ordering is by student. Note the first student has been selected for printing. Each row may use the same or a different layout.
Reports on the Web (Internet) Reports published by REPORT CARD can be output physically (i.e. printed) or electronically. If the latter option is chosen, a .PDF and an associated .XML file is produced for each report. The .PDF file is an electronic version of the printed page; the .XML file contains the information required for REPORT CARD to index the .PDF file. These files must be uploaded to REPORT CARD. REPORT CARD then indexes the reports so that they can be viewed in a browser by Students, Parents and Teachers once they have successfully logged into REPORT CARD.
Reports on the World Wide Web (Web) are accessed by a number of different Assessor types. Assessor types are indicated in square brackets in this section. This functionality is divided into two parts, namely Report Upload and Report Viewing.
Report Upload [Secretarial Administrator]: Reports are produced as .PDF and .XML files by the Desktop version of REPORT CARD. The files are zipped together into one file, 'reports.zip'. The Secretarial Administrator executes 'Report Upload.bat', which causes the zip file to be uploaded to REPORT CARD. During this process they will be asked for an Assessor name and password to verify that the upload is from a legitimate source.
REPORT CARD extracts all of the files from the zip file, using the .XML files to index the .PDF files, and then moves them into the appropriate directory. The reports are now available for viewing.
Report Viewing. As [Student]: select 'View Reports'; select the Reporting Period from the popup, then click on 'Get Subject'; select the Subject from the popup, then click 'Show'; view the report (PDF file) in the browser; use the 'Previous' and 'Next' buttons to move between subjects in the current reporting period.
As [Teacher]: select 'View Reports'; select the Student from the popup (which shows all students), then click on 'Get Student'; select the Reporting Period from the popup, then click on 'Get Subject'; select the Subject from the popup, then click 'Show'; view the report (PDF file) in the browser; use the 'Previous' and 'Next' buttons to move between subjects in the current reporting period.
As [Parent/Guardian]: the same as [Teacher], except that the Student popup only shows the relevant students (sons/daughters).
Student Portfolios. Through the student portfolio feature, Students are able to customise and upload material and maintain their own work portfolios. Portfolios require "moderation", or approval, by Teachers prior to being made accessible to a Public Viewer or Parents.
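The extract-and-index step at the start of this passage can be sketched as follows. The XML element names used (student, period, subject) are assumptions for the sketch; the actual index schema used by REPORT CARD is not disclosed here.

```python
# Illustrative unpacking of a reports.zip: each .xml file carries the
# metadata used to index its matching .pdf file.
import io
import zipfile
import xml.etree.ElementTree as ET

def index_reports(zip_bytes):
    """Return {(student, period, subject): pdf_name} from a reports.zip."""
    index = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            if not name.endswith(".xml"):
                continue
            meta = ET.fromstring(zf.read(name))
            key = (meta.findtext("student"), meta.findtext("period"),
                   meta.findtext("subject"))
            index[key] = name[:-4] + ".pdf"   # matching PDF by convention
    return index

# Build a tiny reports.zip in memory to exercise the indexer.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("r1.xml", "<report><student>Ann</student>"
                          "<period>S1</period><subject>Maths</subject></report>")
    zf.writestr("r1.pdf", b"%PDF-1.4 ...")
index = index_reports(buf.getvalue())
```

The pairing-by-filename convention (r1.xml indexes r1.pdf) is an assumption; any stable link between metadata and document would serve.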
Student Portfolio functionality is accessed by a number of different Assessor types. Assessor types are indicated in square brackets in this section. This functionality is divided into a number of parts.
Portfolio Customisation [Student]: the Student selects 'Edit Portfolio' from the main menu, then 'Customise' from the submenu; the Student selects a template page to edit, then proceeds to customise the title and content areas, and can select from a range of font styles (size, colour, text face, text style), background colours, etc.; the Student clicks on 'Preview', followed by 'Save'; the Student must then request moderation using 'Moderation Request'.
File Upload [Student]: the Student selects 'Edit Portfolio' from the main menu, then 'Files' from the submenu; the Student clicks on the 'Browse' button, which produces a file selection box from which the Student selects a file on their local computer to upload; once the file is selected, the Student names the "link" which will represent this new file in their portfolio; the Student clicks on 'Upload' to upload the file; the Student must then request moderation using 'Moderation Request'.
Report Publishing [Student]: the Student selects 'Edit Portfolio' from the main menu, then 'Files' from the submenu; the Student selects 'Year' and 'Semester' from popup menus on the web form; panels show 'Reports Available' and 'Reports Published', and items can be moved from one panel to the other using buttons; the Student can then request moderation using 'Moderation Request'.
Moderation Request [Student]: the Student selects 'Edit Portfolio' from the main menu, then 'Submit' from the submenu; the Student fills out a message form (the reason for moderation), then clicks the 'Send' button; the message is sent by email to the school's portfolio moderation email account, and a moderation request is saved in the database.
Moderation Approval/Rejection [Teacher]: the Teacher selects 'Moderation' from the main menu, then 'Show Requests' from the submenu; the system presents a list of moderation requests (specifying who made each request, and its time and date) extracted from its database; the Teacher clicks on a student's name to go to that student's unmoderated portfolio; the Teacher views the entire portfolio, then goes back to the first screen; the Teacher selects the students they wish to approve, then clicks 'Approve' ('Reject' and 'Approve All' are the other buttons available here); if approved, REPORT CARD copies the portfolio contents to the public viewing area, updating database entries as required.
Portfolio Viewing. As [Student]: the Student selects 'View Portfolio' from the main menu, then selects either 'Portfolio Home' or 'Portfolio Files' from the submenu. Portfolio Home is the home page for the portfolio; this is normally where the student places some introductory material about themselves. Portfolio Files is the file list page for the portfolio. Portfolio content is represented as links on this page; clicking a link makes that page appear in the browser. Once a page is selected, 'Previous' and 'Next' links move between pages in the portfolio.
As [Parent]: similar to the process followed by the Student, except that the first step is to select which student (from a list of sons/daughters) they wish to view. As [Public Viewer]: the same as for Students.
Analysis Functions The 'Analysis Functions' provide direct, easy-to-use analysis tools for Assessors whilst providing them with the ability to conduct sophisticated data analysis functions.
Analysis is required in the present embodiment to meet two conflicting objectives: on the one hand, it should be easy to use by people with minimal training and low-level IT skills; on the other, it should be capable of sophisticated Assessor-defined analysis functions. In order to meet these differing objectives, two levels of Analysis have been developed.
Level 1: Predefined Analysis Functions: these cover the standard range of analysis functions required by an Assessor and are available at the touch of a button.
Level 2: Assessor-defined Analysis Functions: these allow the Assessor to manipulate data and queries to construct a wide variety of innovative analysis tools.
Level 1 Analysis Tools include:
1. Assessment Consistency Functions: compare the assessments of different assessors; compare the proportions of assessments awarded at each level of performance; compare current assessments to previous performance; compare assessments to 'standard' assessments.
2. Individual Performance Analysis: compared to the performance of others; compared to own performance in other areas; compared to 'standard' performance; compared to own previous performance.
3. Group Performance Analysis: the target group's performance compared to other groups; the target group's performance compared to 'standard' performance; the target group's performance compared to previous group performance.
4. Whole Sample Performance: performance of the entire sample by major grouping; performance of the entire sample compared to previous performance; performance of the entire sample compared to 'standard' performance; analysis of performance trends; analysis of the effects of special initiatives.
5. Statistical Functions: mark aggregation across topics; mark ranking and ordering; generation of a. means, b. standard deviation, c. distribution, d. modal values, e. normal distribution, f. weighted totals and mark aggregation; mark analysis by topic.
All Level 1 analysis tools feature: step analysis functions (select the target grouping, then select the analysis function); results displayed in a table; and, at the push of a button, results displayed as a graph.
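Several of the Level 1 statistical functions (means, standard deviation, modal values, weighted totals) can be sketched with the standard library alone; the marks and weights below are invented for illustration.

```python
# Illustrative Level 1 statistical functions over a set of marks.
import statistics

marks = [62, 74, 74, 81, 90]

mean = statistics.mean(marks)       # arithmetic mean
stdev = statistics.pstdev(marks)    # population standard deviation
modal = statistics.mode(marks)      # most common mark

def weighted_total(components, weights):
    """Weighted mark aggregation across topics, e.g. exam 70% / assignment 30%."""
    return sum(c * w for c, w in zip(components, weights))

final = weighted_total([80, 60], [0.7, 0.3])   # 74.0
```

In the system these would be presented "at the touch of a button", with the table-to-graph switch applied on top of the same numbers.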
Level 2 Analysis Functions use the OLAP Data Cube Model. This allows predefined queries to be manipulated to provide 'drill down' data analysis functions. For example, in a school context the Assessor may use the data cube to examine 'how many boys in Year 7 have achieved Spelling outcomes'.
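The 'drill down' idea can be sketched on flat records, mirroring the example query above. The record fields (gender, year, outcome, achieved) are assumptions for the sketch, not the cube's actual dimensions.

```python
# Illustrative drill-down count over flat assessment records,
# standing in for an OLAP data-cube slice.
records = [
    {"gender": "M", "year": 7, "outcome": "Spelling", "achieved": True},
    {"gender": "M", "year": 7, "outcome": "Spelling", "achieved": False},
    {"gender": "F", "year": 7, "outcome": "Spelling", "achieved": True},
    {"gender": "M", "year": 8, "outcome": "Spelling", "achieved": True},
]

def drill_down(records, **dims):
    """Count achieved outcomes within the sub-cube selected by dims."""
    return sum(1 for r in records
               if r["achieved"] and all(r[k] == v for k, v in dims.items()))

boys_y7_spelling = drill_down(records, gender="M", year=7, outcome="Spelling")
```

Each keyword argument narrows one dimension, so "how many boys in Year 7 achieved Spelling outcomes" is a three-dimension drill-down.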
Use of the Analysis tool Analysis occurs at the end of the assessment process, usually both before and after the results of the assessment are reported. It usually occurs before reports are generated to allow results to be standardised and to ensure consistency of assessment. Analysis after reports are generated is used to assess the performance of the individual, groups or the entire sample.
As can be seen in Fig. 30, Assessors go through a process of continually broadening their analysis of assessment results to ensure that assessments are consistent across all data.
Repairing (Maintenance of the Database) The Database in memory 12 is used by many separate applications and, throughout the course of adding or deleting personnel, class, subject, template and many other records, these records become linked in a variety of ways.
During these linkages, it is possible for items to become cross-linked or unlinked; either situation could lead to loss or corruption of data. To prevent this occurring, there are many checks in the activities of storing and retrieving data; however, these may not always prevent problems occurring in the database. Such problems can occur as a result of loss of power, sudden fluctuations in the power supply, the interaction of other applications on the computer, or hard disk anomalies.
There are two ways in which REPORT CARD can deal with repairing the database. Both can be initiated by the Assessor of the application. The first applet, called "Check Data and Report", checks the database for invalid use of single and double quotation marks within certain fields in the Statements, Outcome, ReportNames, Classes, Students and Staff sections of the relational database.
The quotation marks could cause problems when performing queries on the database and are therefore removed and replaced with an innocuous character that has no impact on the querying of the database. The applet then checks for classes with null teachers or faculties, and ensures that they are for the correct Years and Stages. The database is then checked for database tables without required data. This applet provides a report, both on the screen and on a printer, concerning the various elements of the database on which the Assessor must perform some maintenance, such as entering required data in certain tables.
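The quote clean-up can be sketched as a simple substitution pass. The specification does not say which "innocuous character" is used; the backtick chosen here is an assumption, as is the list of field groups.

```python
# Illustrative "Check Data and Report" quote clean-up: stray quotation
# marks in text fields are replaced so they cannot break later
# quoted-literal SQL queries.
FIELD_GROUPS = ("Statements", "Outcome", "ReportNames",
                "Classes", "Students", "Staff")

def sanitise(value, replacement="`"):
    """Replace single and double quotation marks with an innocuous character."""
    return value.replace("'", replacement).replace('"', replacement)

fixed = sanitise("O'Brien said \"well done\"")
```

In a modern design, parameterised queries would make this substitution unnecessary; the sketch simply mirrors the behaviour the specification describes.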
The second applet concerning the repair of the database may be initiated by selecting a 'Check and Repair DB' in the menu of REPORT CARD. The second applet performs the following functions: Evaluates specific database tables where the connections to other database tables don't follow rules created by the software administrator of REPORT CARD. Obsolete entries are removed, while current entries are repaired by rebuilding the connections.
Evaluates specific database tables for illegal multiple entries, retaining the entry that conforms to the most rules and deleting all others.
Evaluates specific database tables for total unconnected entries, deleting all entries that do not contain valid data.
Fig. 31 is a block diagram of the steps taken by an assessor in repairing the database using either of the two options described above.
Rollover The Database allows the data to be "rolled over" at the end of a time period, providing consistency of data from one time period to the next without the administrative effort of re-keying existing data. Rollover is achieved by creating a virtual copy of the existing database, in situ, and deleting those data elements that will not be used in the next time period. Where data items are required to be "promoted" or changed in some way automatically, these activities are done without any Assessor intervention. Some data elements are not carried forward from one time period to the next and are therefore automatically deleted.
The rollover of some data elements is ambiguous and can therefore be left to the discretion of the Assessor. Therefore prior to the execution of the procedure, a screen display is presented to the Assessor to indicate which elements will be retained from one time period to the next. Where the Assessor is required to make changes to the new data, a form specific to the task is displayed to allow the Assessor to complete the operation.
Fig. 32 is a block diagram of the steps taken in the rollover procedure for the database.
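The rollover procedure can be sketched as follows: take a working copy, drop the elements that do not carry forward, and advance promotable items automatically. The keys shown (marks, comments, attendance, year) are assumptions about which elements fall into each category.

```python
# Illustrative rollover: copy, delete non-carried elements, promote.
import copy

DROPPED_ON_ROLLOVER = {"marks", "comments", "attendance"}

def rollover(db):
    new_db = copy.deepcopy(db)            # virtual copy of the database
    for key in DROPPED_ON_ROLLOVER:
        new_db.pop(key, None)             # not carried forward
    for student in new_db.get("students", []):
        student["year"] += 1              # automatic "promotion"
    return new_db

old = {"students": [{"name": "Ann", "year": 7}], "marks": {"Ann": 90}}
new = rollover(old)
```

Working on a copy leaves the original period's data intact, which is what allows the Assessor to review the screen summary of retained elements before committing.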
System Architecture The REPORT CARD embodiment utilises two types of web servers referred to as the "Central" and "Local" Servers. These web servers are accessed by standard web browsers to use REPORT CARD functions.
1. Central Server An illustration of a central server is shown in Fig. 33. The Central Server provides all of REPORT CARD's functionality except for 'Web Data Entry'. All information about schools, Student Portfolios and Student Reports is accessible on a per-school basis from this server.
Central Server Operation: File upload to a Student Portfolio For the purposes of illustration, the following example scenario will be used: "a Student updating their online Portfolio". Numbers enclosed in parentheses indicate interaction between components of the central server; refer to Fig. 33. To perform this task the Student would perform the following steps: 1) The Student logs into the Central Server.
The Student uses their client browser (41) to access the Central Server 42. The server engine (43) will respond by sending them a Web page (44) containing the list of all possible schools that they could log into. This list is obtained from the Schools DB. Once the Student selects the school they wish to log into, they are presented with a login screen. The Student enters their user name and password. These are validated against values stored in this particular school's user DB (46).
The server engine (43) responds to a successful login by sending the client browser a cookie, which is stored on the client machine. The server also updates the user's statistics, recording the last login time in the Stats DB. Each time the user attempts some further action, the client browser 41 sends the cookie 47 to the server 43; this is used by the server 43 to validate the user and ensure they can only access functions available to them. Other system functions remain inaccessible.
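The cookie-based session check can be sketched as follows. The specification only states that a cookie validates the user on each request; the HMAC-signed token shown here is one common way to implement that and is an assumption, as is the secret's storage.

```python
# Illustrative issue-and-validate cycle for a login cookie.
import hashlib
import hmac

SECRET = b"server-side secret"   # assumed to be held only by the server

def issue_cookie(username):
    sig = hmac.new(SECRET, username.encode(), hashlib.sha256).hexdigest()
    return "%s:%s" % (username, sig)

def validate_cookie(cookie):
    """Return the username if the cookie's signature checks out, else None."""
    username, sig = cookie.rsplit(":", 1)
    expected = hmac.new(SECRET, username.encode(), hashlib.sha256).hexdigest()
    return username if hmac.compare_digest(sig, expected) else None

cookie = issue_cookie("ann")
```

Because only the server knows the secret, a client cannot forge a cookie granting access to another user's functions.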
2) Upload new content, Request Moderation The Student (client machine) selects "Edit Portfolio" (not shown) from the main menu bar (not shown).
The Student selects "Files" (not shown) from the submenu. The Student clicks the browse button (not shown), then selects a file (49) (from their local machine) to upload to the server from the open dialog which appears.
The Student names the 'link' which represents this file in their portfolio, then clicks the 'Upload' button. A successful upload places the new file in this particular school's file system (50), where it can be approved by a Teacher, a process known as moderation. It is not at this point part of any published (i.e. publicly accessible) portfolio.
The student submits this portfolio update for moderation by a Teacher by clicking on the "Submit" submenu (not shown), then filling out the message form, before finally clicking on the 'Send' (not shown) button.
3) Teacher performs the Moderation function and approves the update The Teacher logs into the Central Server and approves this Student's update. Once the update is approved, the new portfolio content is made accessible to anyone with the correct portfolio password.
Local Server: An illustration of a local server is shown in Fig. 34. A local server may be situated at each school. Teachers use the local server associated with their school to access the 'Web Data Entry' feature to update Student marks in the Markbook. A school does not require a local server if it does not use the 'Web Data Entry' feature.
For the purposes of illustration, the following example scenario will be used with reference to Fig. 34, in which a teacher is able to update test results in the Markbook. Numbers enclosed in parenthesis indicate interaction between components of the local server.
To perform this task the Teacher would perform the following. The Teacher logs into the Local Server: the Teacher uses their client browser (52) to access the Local Server. The server engine (51) responds by sending them a Web page (53) inviting them to log in. The Teacher enters their username and password. These are validated against values stored in the Assessor DB. The server engine (51) responds to a successful login by sending the client browser a cookie 55, which is stored on the client machine.
Each time the teacher attempts some further action, the client browser 52 sends the cookie to the server engine 51; this is used by the server engine 51 to validate the teacher and ensure they can only access functions available to them.
Selection of 'Class' and 'Report Template' The teacher can select the appropriate class and report template from popup forms on the web page 53.
The currently selected Mark book function is 'Outcome Achievements' (not shown). Once the teacher selects a class and report template, the current list of students and their marks are fetched from the Markbook DB (54).
Select the appropriate Mark book function: Since the Teacher actually wants to enter test results, the Teacher clicks on the 'Test Results' tab (not shown). The appropriate web page (53) is displayed, and the current values of any test scores are fetched from the Markbook DB (54). Entering of test scores: The teacher enters scores in the appropriate editable cells. (Results for more than one student can be entered without requiring each to be saved individually.) Saving test results: The teacher clicks on the 'Update' button.
The server engine (51) first validates the entries (entered marks must lie between 0 and 100) before updating the required Mark book database entries (54).
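The server-side validation pass can be sketched as follows: every submitted value must parse as a number between 0 and 100 before the Markbook database is touched. The input and return shapes are illustrative assumptions.

```python
# Illustrative validation of submitted marks before a Markbook update.
def validate_marks(entries):
    """entries: {student: raw_value}. Returns (valid, errors)."""
    valid, errors = {}, []
    for student, raw in entries.items():
        try:
            mark = float(raw)
        except (TypeError, ValueError):
            errors.append(student)        # not a number at all
            continue
        if 0 <= mark <= 100:
            valid[student] = mark
        else:
            errors.append(student)        # out of the 0-100 range
    return valid, errors

valid, errors = validate_marks({"Ann": "88", "Ben": "110", "Cal": "n/a"})
```

Only the entries in `valid` would be written to the database; the rest would be reported back to the Teacher for correction.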
Advantages It will be appreciated that the present embodiment provides a number of unique advantages. For example, the ability to use pre-defined or user-defined groupings and hierarchies with the assessment information, which allows groups of assessment criteria to be recorded in the outcome bank and groups of different types of assessment criteria to be brought together to form a template, provides flexibility in the collection and reporting system. Additionally, the ability to add new pre-defined or user-defined assessment criteria into a bank of criteria, and the ability to add pre-defined or user-defined groupings and hierarchies of criteria as attributes of the criteria bank, further enhance the flexibility of the system in generating a number of unique reports for a number of students.
The collection and reporting system saves time for the Assessor, because Assessors may use one report layout for many sets of assessment results, or use one report template (a collection of assessment criteria) with many classes (assessee groups).
Using a computer to store assessment criteria, assessors and assessees in a structured manner with defined hierarchies allows the user to follow a simple assessment and reporting process. Furthermore, as the system allows the Assessor to view pre-defined presentations of assessment information and assessment results information in multiple ways, the assessor is assisted in the preparation, organization and checking of the collection of assessment results.
The system also saves time for the Assessor by allowing the templates to be saved, copied and altered for re-use. Furthermore, the scan sheets and mark book are constructed to make the collection and recording of assessment results more efficient. Assessment results can be presented in a variety of pre-defined or user-defined formats, and Assessors can create their own presentation formats and store them for re-use.
The use of the 'Wizard' aids in the setting up of the system and assessment information.
The Assessor may assign one layout to multiple class/template combinations, thereby allowing users to use report layouts for one or more groups of assessment results.
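The one-layout-to-many-combinations assignment just described can be sketched as a simple mapping. The data shapes and names here are assumptions for illustration only.

```python
# Hypothetical sketch: several (class, template) combinations point at
# the same stored layout, so a layout is defined once and reused.
report_layouts = {"standard_a4": {"paper": "A4", "columns": 2}}

assignments = {
    ("7A", "Mathematics Sem 1"): "standard_a4",
    ("7B", "Mathematics Sem 1"): "standard_a4",
    ("7A", "Science Sem 1"):     "standard_a4",
}

def layout_for(class_name, template_name):
    """Look up the layout assigned to a class/template combination."""
    return report_layouts[assignments[(class_name, template_name)]]
```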
The production of reports is simplified as REPORT CARD allows you to group and sort report layouts of assessment results by a number of different attributes to aid in production and distribution.
REPORT CARD's rollover feature allows you to carry forward assessment information and configurations for use in a new assessment period.
REPORT CARD also allows you to assign different symbols to grades depending on where that grade will be shown in the system.
REPORT CARD'S maintenance feature checks for potential problems in data imports, thereby providing database integrity.
REPORT CARD also allows you to move a student, with their already recorded assessment results, from one class to another.
REPORT CARD allows Assessors to create forms for the collection of assessment results in the first instance (the first generation of the form), then decide they would like to change the assessment information that can be collected and create a new version of the form (the second generation of the form), where both versions (generations) of the form can still be used and the assessment results will still be recorded in a meaningful way.
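The form "generations" described above can be sketched as versioned field lists, so results captured on either generation remain meaningful. The storage scheme and field names below are assumptions for illustration.

```python
# Hypothetical sketch: each generation of a collection form keeps its
# own field list; new generations do not retire earlier ones.
forms = {}  # (form_id, generation) -> list of assessment fields

def create_generation(form_id, fields):
    """Add a new generation of the form alongside the existing ones."""
    generation = 1 + max((g for f, g in forms if f == form_id), default=0)
    forms[(form_id, generation)] = list(fields)
    return generation

def record_results(form_id, generation, values):
    """Keep only the values that this generation of the form collects."""
    fields = forms[(form_id, generation)]
    return {f: values[f] for f in fields if f in values}
```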
REPORT CARD permits the Assessor to view the scanned image of the form alongside a table of the interpreted results from that scanned image, and then make corrections to errors where required. Additionally, REPORT CARD allows general comments to be selected from a 'bank' of pre-defined general comments, entered free-form, or produced using a combination of both methods.
Other Applications
It will be appreciated that although the assessment system in the above embodiment relates to the assessment of school pupils performing the tasks of school work, other assessment systems may be covered by the present invention. An overview of a number of alternative assessment applications will now be described with reference to Fig. 35 to Fig. 37.
Turning first to Fig. 35, there is illustrated a block diagram of the general assessment of assessment criteria.
Assessment criteria 2' and 3' include key performance indicators and supporting materials. In this embodiment, a comment generator 2' is included.
Assessment data also includes assessment subject data 4' and assessment subject groupings 5'. Assessor data 6' is provided to assist assessors in collecting assessment data.
The data 5' and 6' is input into a software interface, the "template wizard", which takes the user through a step-by-step process of selecting assessment criteria, etc. and producing an assessment data collection form for use by an assessor. Thus template wizard 7' interfaces with form designer 8', which then generates the assessment data collection form, in this embodiment taking either the form of an optical mark reader data collection form 9' or a digital scanner data collection form 10'. At this stage, an assessor provides human intervention, which involves assessing activities and applying professional judgment as shown at 11'.
Forms 9' and 10' are scanned and interpreted as shown by 12' and 13'. Entry may also be via network or Internet, as shown at 14' and 15'. The assessment data in relation to the target is analysed, on the basis of the key performance indicators, etc., by report manager 16'. Report manager 16' may also take into account additional data 17'.
Report manager 16' can generate electronic reports 18' or paper reports 19'. Report manager 16' can also generate action documents such as follow-up letters, reports and work orders 20'. If desired, report manager 16' can provide performance analysis tools 21' and can attend to data archiving as shown at 22'.
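The report-manager dispatch just described — one set of analysed results fanned out to several output channels, with action documents produced only when needed — can be sketched as follows. The channel names and result shape are assumptions, not taken from the specification.

```python
# Hypothetical sketch of report-manager output dispatch.
def generate_outputs(results, channels):
    """Produce the requested outputs from one set of analysed results."""
    outputs = {}
    if "electronic" in channels:
        outputs["electronic"] = {"format": "html", "data": results}
    if "paper" in channels:
        outputs["paper"] = {"format": "pdf", "data": results}
    # Action documents (e.g. work orders) only when the analysis calls for one.
    if "work_order" in channels and results.get("action_needed"):
        outputs["work_order"] = {"task": results["action_needed"]}
    return outputs
```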
Turning now to Fig. 36, it will be readily seen that this is an adaptation of the general system of Fig. 35 to the specific assessment by a council of the state of its roads. Thus assessment criteria 31', 32', 33', 34', 35' and 36' are input into template wizard 37', which interfaces with form designer 38'. As in the previous example, form designer 38' generates forms 39' or 40', and an assessor enters assessment activities using professional judgment as shown at 41'. The assessment data are scanned and interpreted at 42' and 43' and analysed by report manager 44'.
Report manager 44' may incorporate data from previous road repairs database 45'. Report manager 44' generates electronic reports 46' or paper reports 47' and may also generate work orders 48'. Performance analysis tools 49' and data archiving 50' are also options.
Turning now to Fig. 37, this is a version of the system in Fig. 35 which has been customised for personnel use. The assessment criteria include desired attributes 51', a comment bank 52' and interview questions 53', as well as applicant data 54', job categories 55' and personnel officer's data (assessor's data) 56'. This data is input into template wizard 57' which, as before, interfaces with form designer 58' to generate form 59' or 60'. On form 59' or 60' a personnel officer enters the assessment activities using professional judgment as shown at 61', and the resultant data is scanned and interpreted as shown at 62' and 63' and input into report manager 64'. Report manager 64' also analyses the data captured from the target's resume. Report manager 64' generates electronic reports 66' or paper reports 67' and can also generate performance analysis tools 68' and attend to data archiving 69'. Importantly, report manager 64' can automatically generate a rejection letter or an invitation for a second interview letter.

Hence, in light of the above examples, it will be appreciated that the reporting method can be applied to assessment applications other than students, such as a unit process in an industrial process. The reporting method thus has many applications in production processes, such as mining processes, beverage processes and manufacturing processes. Additionally, the reporting method also has applications in the services industry, such as legal and accounting services, cleaning services and maintenance services, etc.
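The automatic follow-up letters mentioned in the personnel example can be sketched as a simple decision on the overall assessment result. The threshold value and letter wording below are assumptions for illustration only.

```python
# Hypothetical sketch: the report manager picks either a rejection letter
# or a second-interview invitation from the overall assessment score.
def follow_up_letter(applicant, overall_score, invite_threshold=70):
    """Return a second-interview invitation or a rejection letter."""
    if overall_score >= invite_threshold:
        return (f"Dear {applicant}, we would like to invite you to "
                f"a second interview.")
    return (f"Dear {applicant}, we regret to advise that your "
            f"application was unsuccessful.")
```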
The above described embodiments provide a convenient method of reporting upon the performance of a subject upon completion of assigned tasks. Generation of the reports is made convenient for the Assessor because it is the Assessor who is permitted to define a set of performance criteria for the assessment of the assigned tasks.
It would be appreciated by a person skilled in the art that numerous variations and/or modifications may be made to the present invention as shown in the specific embodiments without departing from the spirit or scope of the invention as broadly described. The present embodiments are, therefore, to be considered in all respects to be illustrative and not restrictive.

Claims (31)

1. A method for collecting data for use in reporting the performance of a subject upon completion of one or more assigned tasks using an electronic processor, said method comprising the steps of: (a) selecting a set of assessment criteria for the assessment of one or more assigned tasks to be completed by one or more subjects and collating the assessment criteria in a database accessible by the processor; (b) grouping different assessment criteria to thereby form a template, the template being accessible and viewable by an assessor; (c) allowing the assessor to assess the performance of at least one task completed by said one or more subjects according to said defined assessment criteria, during or upon completion of, said assigned tasks; (d) recording the results of said assessed performance in step (c) in a database accessible by the processor; and (e) causing the processor to generate a report which reflects the results of said assessed performance.
2. A method as claimed in claim 1, wherein said method further includes the step of: (f) permitting said assessor to select the format of said report from a multiplicity of formats or, to design the format of said report.
3. A method as claimed in claim 1 or claim 2, wherein for a plurality of subjects, said method further includes the step of: permitting said assessor to group each subject according to a particular category.
4. A method as claimed in any one of the preceding claims, wherein said method further includes the step of: permitting an assessor to amend said set of performance criteria.

5. A method as claimed in any one of the preceding claims, wherein said method further includes the step of: permitting an assessor to define one or more quality values to said results of said set of performance criteria to thereby determine the quality of each subject or the quality of a group of subjects.
6. A method as claimed in any one of claims 1 to 5, wherein said method further includes the step of: providing a facility to allow the assessor to make general comments in relation to the performance of said assigned task.
7. A method as claimed in any one of claims 1 to 5, wherein a plurality of pre-recorded general comments are available for selection by said assessor in evaluating the performance of said assigned task.
8. A method as claimed in claim 6 or claim 7, wherein said assessor is permitted to make a selection of said general comments in relation to the performance of a subject, and thereafter rank the general comments so that they are read in said rank on said report.
9. A method as claimed in any one of claims 1 to 8, wherein said method further includes the step of: statistically analysing the results of the assessed performance against an assessor defined set.

10. A method as claimed in claim 9, wherein said defined set is a plurality of subjects.

11. A method as claimed in claim 9, wherein at least part of said statistical analysis is included as statistical information in said report.
12. A method as claimed in any one of claims 1 to 11, wherein for a plurality of subjects, said report includes a comparison of the results of said assessment.
13. A method as claimed in any one of claims 1 to 12, wherein in step (d) the results are recorded in a database.
14. A method as claimed in claim 13, wherein the results entered into the database are input by using a data collection form.

15. A method as claimed in claim 14, wherein the data collection form is read by a digital scanner adapted for Optical Character Recognition (OCR) and Optical Mark Recognition (OMR).
16. A method as claimed in claim 13, claim 14 or claim 15, wherein the results entered into the database are input by using a device having a digital processor.
17. A method as claimed in any one of the preceding claims, wherein said subjects are people.
18. A method as claimed in claim 17, wherein said people are school students and said assessor is a teacher.
19. A method as claimed in any one of claims 1 to 16, wherein said subjects are production units in a process.

20. A method as claimed in any one of claims 1 to 19, wherein said report is printed on paper.

21. A method as claimed in any one of claims 1 to 19, wherein said report is available as an electronic file, audio message file or video file.
22. A method as claimed in any one of claims 1 to 21, wherein external data is included in said report from external data sources, said external data may include any one or more of the following data types: Assessee data; Assessor data; Assessment criteria data; Groupings of assessment criteria data; Hierarchies of assessment criteria data.

23. A method as claimed in any one of the preceding claims, wherein said pre-defined performance criteria are recorded in said database for use in a future assessment session.
24. A method as claimed in claim 13, wherein the results entered into the database are input by using a device having a digital processor.

25. A method as claimed in claim 24, wherein the digital processor is able to exchange data with the database over a communications network.
26. A method as claimed in claim 25, wherein the communications network is the Internet.
27. A method as claimed in claim 24, wherein the format of said report is automatically generated by said digital processor upon instructions of an application program, wherein the application program causes the digital processor to select particular formats for said report, dependent upon the assessment criteria selected by the Assessor.
28. A computer network memory storing thereon an application program for controlling the execution of a processor for collecting data for use in reporting the performance of a subject upon completion of one or more assigned tasks, the computer program controlling the processor to: permit selection of a set of assessment criteria for the assessment of one or more assigned tasks to be completed by one or more subjects; allow for grouping of different assessment criteria to thereby form a template; allow an assessor to assess the performance of at least one task completed by said one or more subjects according to said defined performance criteria, during or upon completion of, said assigned tasks; and record the results of said assessed performance.
29. A computer network memory as claimed in claim 28, wherein said computer program further controls the processor to: report the results of said assessed performance in a report.

30. A computer network memory as claimed in claim 29, wherein said computer program further controls the processor to: permit said assessor to select the format of said report from a multiplicity of formats or, to design the format of said report.

31. An electronic data collection system for use in reporting the performance of a subject upon completion of one or more assigned tasks, said system comprising: an application program having a set of selected assessment criteria for the assessment of one or more assigned tasks to be completed by one or more subjects; data storage means for recording assessment data recorded by said application program, said assessment data representing the results of said assessed performance; and input means capable of interfacing with said application program for allowing for grouping of different assessment criteria to thereby form a template, and for allowing an assessor to assess the performance of at least one task completed by said one or more subjects according to said defined performance criteria, during or upon completion of, said assigned tasks.
32. An electronic data collection system as claimed in claim 31, wherein said system further includes a report generating means for reporting the results of said assessed performance in a report.
33. An electronic data collection system as claimed in claim 32, wherein said report generating means permits said assessor to select the format of said report from a multiplicity of formats or, to design the format of said report.
34. A computer network memory storing thereon an application program for controlling the execution of a processor for reporting the performance of a subject upon completion of one or more assigned tasks, the computer program controlling the processor to: permit the selection of a set of assessment criteria for the assessment of one or more assigned tasks to be completed by one or more subjects; allow for grouping of different assessment criteria to thereby form a template; allow an assessor to assess the performance of at least one task completed by said one or more subjects according to said defined assessment criteria, during or upon completion of, said assigned tasks; record the results of said assessed performance; and report the results of said assessed performance in a report.
35. An electronic reporting system for reporting the performance of a subject upon completion of one or more assigned tasks, said reporting system comprising: an application program having a set of assessment criteria for the assessment of one or more assigned tasks to be completed by one or more subjects; data storage means for recording assessment data recorded by said application program, said assessment data representing the results of said assessed performance; input means capable of interfacing with said application program for allowing for grouping of different assessment criteria to thereby form a template, and for allowing an assessor to assess the performance of at least one task completed by said one or more subjects according to said defined performance criteria, during or upon completion of, said assigned tasks; and a report generating means for reporting the results of said assessed performance in a report.
36. A method as claimed in claim 1, wherein a pre-defined comment in relation to the subject is recommended to said assessor, based upon the assessment of said subject's performance.
37. A method for collecting data, substantially according to any one of the examples described herein with reference to the accompanying drawings.
38. An electronic data collection system, substantially according to any one of the examples described herein with reference to the accompanying drawings.
39. A method of reporting the performance of a subject upon completion of one or more assigned tasks, substantially according to any one of the examples described herein with reference to the accompanying drawings.
40. A computer network memory storing thereon an application program for controlling the execution of a processor for reporting the performance of a subject upon completion of one or more assigned tasks, substantially as herein described with reference to the accompanying drawings.
41. An electronic reporting system for reporting the performance of a subject upon completion of one or more assigned tasks, substantially as herein described with reference to the accompanying drawings. Dated: 14 October 2005. Freehills Patent & Trade Mark Attorneys, Patent & Trade Mark Attorneys for the Applicants: ARC Research and Development Pty Ltd
AU11182/01A 1999-10-27 2000-10-27 A data collection method Ceased AU783979B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU11182/01A AU783979B2 (en) 1999-10-27 2000-10-27 A data collection method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
AUPQ3687A AUPQ368799A0 (en) 1999-10-27 1999-10-27 Assessment and reporting system
AUPQ3687 1999-10-27
PCT/AU2000/001328 WO2001031610A1 (en) 1999-10-27 2000-10-27 A data collection method
AU11182/01A AU783979B2 (en) 1999-10-27 2000-10-27 A data collection method

Publications (2)

Publication Number Publication Date
AU1118201A AU1118201A (en) 2001-05-08
AU783979B2 true AU783979B2 (en) 2006-01-12

Family

ID=36032173

Family Applications (1)

Application Number Title Priority Date Filing Date
AU11182/01A Ceased AU783979B2 (en) 1999-10-27 2000-10-27 A data collection method

Country Status (1)

Country Link
AU (1) AU783979B2 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113643603B (en) * 2021-08-05 2022-11-29 山东科技大学 Separated roadway physical experiment model structure, manufacturing auxiliary tool and manufacturing method

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5779486A (en) * 1996-03-19 1998-07-14 Ho; Chi Fai Methods and apparatus to assess and enhance a student's understanding in a subject
WO1998043222A1 (en) * 1997-03-21 1998-10-01 Educational Testing Service System and method for evaluating raters of scoring constructed responses to test questions
US5926794A (en) * 1996-03-06 1999-07-20 Alza Corporation Visual rating system and method


Also Published As

Publication number Publication date
AU1118201A (en) 2001-05-08


Legal Events

Date Code Title Description
NB Applications allowed - extensions of time section 223(2)

Free format text: THE TIME IN WHICH TO ENTER THE NATIONAL PHASE HAS BEEN EXTENDED TO 20020627