WO2013175443A2 - A computerised testing and diagnostic method and system - Google Patents
- Publication number
- WO2013175443A2 (PCT/IB2013/054304)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- questions
- user
- level
- answers
- areas
- Prior art date
- 2012-05-25
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B7/00—Electrically-operated teaching apparatus or devices working with questions and answers
- G09B7/06—Electrically-operated teaching apparatus or devices working with questions and answers of the multiple-choice answer-type, i.e. where a given question is provided with a series of answers and a choice has to be made from the answers
- G09B7/08—Electrically-operated teaching apparatus or devices working with questions and answers of the multiple-choice answer-type, i.e. where a given question is provided with a series of answers and a choice has to be made from the answers characterised by modifying the teaching programme in response to a wrong answer, e.g. repeating the question, supplying further information
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- General Physics & Mathematics (AREA)
- Electrically Operated Instructional Devices (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
THIS INVENTION provides a computerised testing and diagnostic method (200) which includes: i. presenting (202), via a display (122) of a user interface (120), to a user or test-taker (130) a plurality of questions relating to different areas within a discipline; ii. receiving (204) from the user (130), via an input arrangement (124) associated with the user interface (120), answers to the questions; iii. assessing (206), by a processing module (102), the answers to determine whether or not they are correct; iv. grading (208), by the processing module (102), the proficiency of the user (130) in each of the areas to determine whether or not the proficiency in one or more areas is below an acceptable threshold (210) and, if so, decreasing or refining (212) the granularity of the questions in the areas in which proficiency is below the acceptable threshold; and v. repeating at least steps 1.i. (202) to 1.iii. (206) at least once in respect of the refined questions.
Description
A computerised testing and diagnostic method and system
FIELD OF DISCLOSURE
The disclosure relates broadly to computerised academic test-giving and specifically to a computerised testing and diagnostic method and system.
BACKGROUND OF DISCLOSURE
It can be difficult for teachers to identify within a discipline specific areas or skills with which a student has a problem or lacks knowledge. Even one such weak area within a discipline may give the appearance that the student lacks competency in that discipline as a whole. Similarly, it may be difficult for the student himself/herself to identify such problem areas. The Inventor has noted that often a problem in grasping a particular task or concept relates to an underlying deficiency or lacking fundamental skill. For example, in mathematics, a student may struggle with the task of drawing a parabola, because his/her underlying ability to factorise a trinomial is lacking.
The Inventor desires a method and system to diagnose with specificity and appropriate granularity deficiencies within a discipline. Once the problem has been identified, it can be corrected and remediated using conventional teaching methodologies.
SUMMARY OF DISCLOSURE
Accordingly, the disclosure provides a computerised testing and diagnostic method which includes:
i. presenting, via a display of a user interface, to a user or test-taker a plurality of questions relating to different areas within a discipline;
ii. receiving from the user, via an input arrangement associated with the user interface, answers to the questions;
iii. assessing, by a processing module, the answers to determine whether or not they are correct;
iv. grading, by the processing module, the proficiency of the user in each of the areas to determine whether or not the proficiency in one or more areas is below an acceptable threshold and, if so, decreasing or refining the granularity of the questions in the areas in which proficiency is below the acceptable threshold; and
v. repeating at least steps i. to iii. at least once in respect of the refined questions.
The method may include repeating steps i. to iv. iteratively (e.g. a plurality of times) while progressively decreasing or refining the granularity each time. If there are plural areas below the acceptable proficiency threshold, the method may be repeated for each area, e.g. along separate branches or paths.
By iteratively identifying areas below the acceptable proficiency threshold, e.g. weak areas, expanding the weak areas, and repeating, the method may allow for fundamental areas of weakness to be identified.
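By way of non-limiting illustration, the iterative drill-down described above can be sketched as follows in Python. The function name, the tree-shaped input and the helper signatures are assumptions of this sketch, not features prescribed by the disclosure.

```python
from typing import Callable, Dict, List


def run_diagnostic(areas: Dict[str, dict],
                   ask: Callable[[str], str],
                   threshold: float = 40.0) -> List[str]:
    """Drill down from coarse to fine granularity, expanding only those
    areas whose grade (a percentage) falls below the acceptable threshold.

    Each area node is assumed to look like:
    {"questions": [(question_text, correct_answer), ...],
     "sub_areas": {child_area_name: child_node, ...}}
    """
    weak_leaves: List[str] = []
    pending = list(areas.items())
    while pending:
        name, node = pending.pop()
        questions = node["questions"]                       # step i: present
        answers = [ask(text) for text, _ in questions]      # step ii: receive
        n_correct = sum(given == correct                    # step iii: assess
                        for given, (_, correct) in zip(answers, questions))
        score = 100.0 * n_correct / max(len(questions), 1)  # step iv: grade
        if score >= threshold:
            continue                                        # proficiency acceptable
        sub_areas = node.get("sub_areas") or {}
        if sub_areas:
            pending.extend(sub_areas.items())               # step v: refine and repeat
        else:
            weak_leaves.append(name)                        # finest granularity reached
    return weak_leaves
```

Only the branch (or branches) falling below the threshold is expanded, so separate weak areas are probed along separate paths, as contemplated above.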
The acceptable threshold may be relative, e.g. an area which is weaker than, or scored below, other areas for the same user. The acceptable threshold may be absolute, e.g. a pre-defined or pre-definable score/rating, e.g. 50%.
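A minimal sketch of the two threshold interpretations, assuming grades are expressed as percentages; the helper name and signature are illustrative only.

```python
from typing import Dict, Optional


def below_threshold(area: str, grades: Dict[str, float],
                    absolute: Optional[float] = 50.0) -> bool:
    """Absolute mode compares against a pre-defined cut-off (e.g. 50%);
    relative mode (absolute=None) flags an area scoring below the same
    user's average across all graded areas."""
    if absolute is not None:
        return grades[area] < absolute
    average = sum(grades.values()) / len(grades)
    return grades[area] < average
```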
The term "user" may include a student, pupil, or the like, and indeed any person to whom the method is being applied.
The method may include the prior step of storing a plurality of questions, and their associated correct answers, on a database. The method may include the step of storing the user-inputted answers for later review. The hierarchy or interconnectedness of the questions may provide a level of intelligence in the methodology. The hierarchy may be carefully designed such that weaknesses identified from higher level questions lead to appropriate refined, or lower level, questions being posed thereby to unpack and identify a user's weaknesses.
The method lends itself particularly well to the discipline of mathematics, although it is not necessarily limited thereto.
The method may include suggesting or providing lessons to remediate any identified weaknesses. The method may therefore include the prior step of associating lessons with each possible area within the discipline.
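One possible, purely illustrative way of associating remedial lessons with areas in advance, so that identified weak areas can later be mapped to lessons; the lesson identifiers below are placeholders.

```python
from typing import Dict, List

# Hypothetical lesson catalogue keyed by area; identifiers are placeholders.
lessons_by_area: Dict[str, List[str]] = {
    "Circles": ["lesson-circles-overview"],
    "Constructing a circle": ["lesson-constructing-a-circle"],
}


def suggest_lessons(weak_areas: List[str]) -> Dict[str, List[str]]:
    """Map each identified weak area to its associated remedial lessons."""
    return {area: lessons_by_area.get(area, []) for area in weak_areas}
```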
The user interface and the processing module may be remote from each other, and the method may include generating and sending question messages and answer messages between the user interface and the processing module.
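Where the user interface and the processing module are networked, the question and answer messages could, for instance, be serialised as JSON; the field names below are assumptions of this sketch rather than a protocol defined by the disclosure.

```python
import json

# Hypothetical question message sent from the processing module to the
# remote user interface.
question_message = json.dumps({
    "type": "question",
    "question_id": "geo-001",
    "area": "Geometry and measurement",
    "level": 1,
    "text": "<question text or a reference to its graphical content>",
    "choices": ["A", "B", "C", "D"],
})

# Hypothetical answer message returned by the user interface.
answer_message = json.dumps({
    "type": "answer",
    "question_id": "geo-001",
    "selected": "C",
})
```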
The disclosure extends to a computerised testing and diagnostic system which includes:
a user interface including a display and an input arrangement;
a question database having stored thereon a plurality of top-level questions each relating to a different area within a discipline and at least a plurality of sub-level questions associated with each one of the top-level questions, the sub-level questions having decreased or refined granularity compared with their associated top-level question; and
a processing module operable to:
i. present via the display the questions to a user;
ii. receive via the input arrangement answers to the questions from the user;
iii. assess the answers to determine whether or not they are correct;
iv. grade the proficiency of the user in each of the areas to determine whether or not the proficiency in one or more areas is below an acceptable threshold; and
v. repeat at least operations i. to iii. at least once using the sub- level questions associated with the or each top-level question or area in which the proficiency is below the acceptable threshold.
The question database may include more than two levels of questions. Thus, the question database may include first-level (e.g. top-level) questions, second-level (e.g. sub-level) questions, third-level questions, and so forth, in a cascading tree-type relationship.
The question database may include correct answers associated with the questions. Thus, the processing module may be operable to compare the received answers from the user with the correct answers on the question database, thereby to assess whether or not the received answers are correct.
It is to be understood that the processing module may be one or more microprocessors, controllers, digital signal processors (DSPs), or any other suitable computing device, resource, hardware, software, or embedded logic.
It will be appreciated that the computerised testing and diagnostic system may be embodied in a single device or apparatus, such as a personal computer, laptop, tablet, mobile phone, or the like. Alternatively, the computerised testing and diagnostic system may be distributed among a plurality of devices, e.g. in a server/client relationship. In one example, the user interface may be presented by a client terminal and the processing module may be hosted by a remote server, the client terminal and remote server being networked together. In another embodiment, the user interface and the processing module may be consolidated into the same device.
The disclosure extends further to a non-transitory computer-readable medium having stored thereon a computer program which, when executed by a computer system, causes the computer system to perform a method as defined above.
BRIEF DESCRIPTION OF DRAWINGS
The disclosure will now be further described, by way of example, with reference to the accompanying diagrammatic drawings.
In the drawings:
FIGURE 1 shows a schematic view of a computerised testing and diagnostic system in accordance with the disclosure;
FIGURE 2 shows a flow diagram of a computerised testing and diagnostic method in accordance with the disclosure;
FIGURE 3 shows a series of screenshots illustrating a user interface used in accordance with the method of Figure 2; and
FIGURE 4 shows a schematic view of a computer system within which a set of instructions, for causing the computer system to perform any one or more of the methodologies discussed herein, may be executed.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENT
Referring initially to Figure 1, reference numeral 100 generally indicates a computerised testing and diagnostic system in accordance with the disclosure. The system 100 includes a processing module 102, a computer program 104 and details of an acceptable threshold 106 for areas of assessment. The computer program 104 is stored on a computer-readable medium and is operable to direct the operation of the processing module 102.
Details of the acceptable threshold 106 are also stored on the computer-readable medium.
The system 100 includes a question database 110 having stored thereon a plurality of top-level questions each relating to a different area within a discipline. The database 110 also includes a plurality of sub-level questions associated with each one of the top-level questions, the sub-level questions having decreased or refined granularity relative to their associated top-level question. In fact, the question database 110 in this example has five levels of questions, i.e. a top or first level and four sub-levels. The question database 110 also has stored thereon correct answers for the questions.
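For illustration only, such a multi-level database can be pictured as a cascading tree of the shape assumed by the loop sketched in the summary above. The area names are taken from the example of Figure 3; the question texts, answers and storage schema are assumptions, and most questions, sibling areas and the fourth and fifth levels are elided.

```python
question_db = {
    "Geometry and measurement": {
        "questions": [("<top-level geometry question>", "C")],
        "sub_areas": {
            "Circles": {
                "questions": [("<second-level question on circles>", "A")],
                "sub_areas": {
                    "Constructing a circle": {
                        "questions": [("<third-level question>", "D")],
                        "sub_areas": {},  # fourth and fifth levels elided
                    },
                },
            },
            # further second-level areas elided
        },
    },
    "Numbers": {"questions": [("<top-level numbers question>", "B")], "sub_areas": {}},
    "Data handling": {"questions": [("<top-level data question>", "A")], "sub_areas": {}},
    "Algebra": {"questions": [("<top-level algebra question>", "D")], "sub_areas": {}},
}
```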
The system 100 also has a user interface 120 including a display 122 and an input arrangement 124 for presenting the questions to, and receiving a response from, a user 130 of the system 100. The technical details of the user interface 120 are not germane to the disclosure and one skilled in the art will appreciate that any of a number of existing or future user interface devices may be employed. The user 130 will typically be a student or pupil.
Referring to Figure 2, reference numeral 200 generally indicates a computerised testing and diagnostic method in accordance with the disclosure. From a fairly high-level perspective, the method includes presenting (at block 202) via the display 122 a plurality of questions relating to different areas of a specific discipline to the user 130. In the first iteration of the method 200, the top-level questions are broad or coarsely granular. Answers from the user 130 are received (at block 204) via the input arrangement 124.
The processing module 102 assesses (at block 206) the received answers to determine whether or not they are correct and grades (at block 208) the various areas to which the questions relate with a percentage out of 100. The processing module 102 then compares the grade (or proficiency) of the areas against the acceptable threshold to determine (at block 210) which areas are below the acceptable threshold and may require further analysis and later remediation.
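A minimal sketch of the per-area grading performed at block 208, assuming each answered question is tagged with the area to which it relates (the helper and its types are hypothetical).

```python
from typing import Dict, List, Tuple

# (area, answer_given, correct_answer) triples collected during one pass.
Result = Tuple[str, str, str]


def grade_areas(results: List[Result]) -> Dict[str, float]:
    """Grade each area as a percentage out of 100."""
    asked: Dict[str, int] = {}
    right: Dict[str, int] = {}
    for area, given, correct in results:
        asked[area] = asked.get(area, 0) + 1
        right[area] = right.get(area, 0) + (given == correct)
    return {area: 100.0 * right[area] / asked[area] for area in asked}
```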
The granularity of the questions is refined (at block 212) by retrieving the next level of questions relating to the determined weaker areas. In the second iteration of the method 200, the questions will be second-level questions. The method 200 repeats from block 202 until all the levels are depleted or the user 130 has no remaining weak areas (i.e. areas below the acceptable threshold).
The method 200 will now be further described with reference to a practical example and the screenshots in Figure 3. The method 200 lends itself very well to mathematical disciplines and the example screenshots relate to high school level mathematics. In this example, there are four areas in the top or first level:
Geometry and measurement;
Numbers;
Data handling; and
Algebra.
Screenshot 300 illustrates a top-level question from the area of Geometry and measurement within the discipline of mathematics. The question is presented (at block 202) in text and graphical format on the display 122 and the user 130 answers the question by selecting one of the multiple-choice answers.
Once all the questions within the top level have been received (at block 204), the processing module 102 compares (at block 206) the received answers against the correct answers and grades (at block 208) each area, as illustrated in screenshot 302. In this example, the acceptable threshold has been defined as 40%. It will be appreciated that this can simply be adjusted based on teaching systems, school districts, etc. The user 130 answered many questions in the area of Geometry and measurement incorrectly and only scored 16.67%.
The processing module 102 determines (at block 210) that this is below the acceptable threshold of 40% and therefore retrieves (at block 212) from the question database 110 second-level questions relating to the top-level area of Geometry and measurement; Geometry and measurement thus effectively becomes the new broad discipline, with the second-level questions relating to areas within it. These new areas are illustrated in screenshot 304, with a specific question in the second level illustrated in screenshot 306.
Thus, steps 202 to 208 repeat and the results/grades for these second-level questions are shown in screenshot 308. "Circles" is determined to be the only area in the second level which fell below the acceptable threshold, and thus third-level questions relating to Circles are retrieved and the method 200 repeats. In the same fashion as before, third-level questions relating to circles (see screenshot 310) are presented to the user 130.
Finally, once the method 200 has been iteratively repeated to its conclusion (e.g. when no lower levels remain), the weakest area(s) of the user 130 have been identified. In this example, the user's weakest area is Constructing a circle as illustrated in screenshot 312.
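Reusing the hypothetical run_diagnostic and question_db sketched earlier, this worked example could be traced roughly as follows; the console-based ask function is likewise only illustrative, and because the abridged database above holds a single placeholder question per area, literal execution would not reproduce the exact scores shown in the figures.

```python
def console_ask(question_text: str) -> str:
    # The user types the letter of the chosen multiple-choice answer.
    return input(question_text + " ").strip()


weak = run_diagnostic(question_db, ask=console_ask, threshold=40.0)
# Pass 1: Geometry and measurement scores 16.67% (< 40%) -> second-level questions
# Pass 2: "Circles" is the only second-level area below 40% -> third-level questions
# Pass 3: "Constructing a circle" remains below the threshold and is reported
print(weak)  # e.g. ['Constructing a circle']
```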
By way of development, the method 200 and system 100 can be adapted to recommend or even present remedial lessons to the user 130 in the areas which were identified as weak. In any event, the weak areas (perhaps even as basic as a fundamental skill) have been identified with high specificity and fine granularity. This allows targeted remediation and instruction (whether by conventional classroom-based lessons or using the method 200 and system 100).
Figure 4 shows a diagrammatic representation of a computer in the example form of a computer system 400 within which a set of instructions, for causing the computer to perform any one or more of the methodologies discussed herein, may be executed. In alternative embodiments, the computer operates as a standalone device or may be connected (e.g. networked) to other computers. In a networked deployment, the computer may operate in the capacity of a server or a client computer in a server-client network environment, or as a peer computer in a peer-to-peer (or distributed) network environment. The computer may be a personal computer (PC), a tablet PC, a set-top box (STB), a Personal Digital Assistant (PDA), a cellular telephone, a web appliance, a network router, switch or bridge, or any computer capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that computer. Further, while only a single computer is illustrated, the term "computer" shall also be taken to include any collection of computers that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.
The example computer system 400 includes a processor 402 (e.g., a central processing unit (CPU), a graphics processing unit (GPU) or both), a main memory 404 and a static memory 406, which communicate with each other via a bus 408. The computer system 400 may further include a video display unit 410 (e.g., a liquid crystal display (LCD)). The computer system 400 also includes an alphanumeric input device 412 (e.g., a keyboard), a user interface (UI) navigation device 414 (e.g., a mouse), a disk drive unit 416, a signal generation device 418 (e.g., a speaker) and a network interface device 420.
The disk drive unit 416 includes a computer-readable medium 422 on which is stored one or more sets of instructions and data structures (e.g., software 424) embodying or utilised by any one or more of the methodologies or functions described herein. The software 424 may also reside, completely or at least partially, within the main memory 404 and/or within the processor 402 during execution thereof by the computer system 400, the main memory 404 and the processor 402 also constituting computer-readable media.
The software 424 may further be transmitted or received over a network 426 via the network interface device 420 utilising any one of a number of well-known transfer protocols (e.g., HTTP, UDP, TCP, USSD, FTP).
While the computer-readable medium 422 is shown in an example embodiment to be a single medium, the term "computer-readable medium" should be taken to include a single medium or multiple media (e.g., a centralised or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "computer-readable medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the computer and that cause the computer to perform any one or more of the methodologies of the present embodiments, or that is capable of storing, encoding or carrying data structures utilised by or associated with such a set of instructions. The term "computer-readable medium" shall accordingly be taken to include, but not be limited to, solid-state memories, optical and magnetic media, and carrier wave signals.
The system 100 may be in the form of the computer system 400, or may include some components thereof.
The Inventor believes that the disclosure provides a particularly effective method 200 for identifying a person's weaknesses within a particular discipline or field. The method 200 has the capability of being indefinitely iterative, limited only by the questions which can be conceived by an instructor or operator. Multiple branches of one area can be probed, or only one area specifically.
Claims
1. A computerised testing and diagnostic method which includes:
i. presenting, via a display of a user interface, to a user or test- taker a plurality of questions relating to different areas within a discipline;
ii. receiving from the user, via an input arrangement associated with the user interface, answers to the questions;
iii. assessing, by a processing module, the answers to determine whether or not they are correct;
iv. grading, by the processing module, the proficiency of the user in each of the areas to determine whether or not the proficiency in one or more areas is below an acceptable threshold and, if so, decreasing or refining the granularity of the questions in the areas in which proficiency is below the acceptable threshold; and
v. repeating at least steps i. to iii. at least once in respect of the refined questions.
2. The method as claimed in claim 1, which includes repeating steps i. to iv. iteratively while progressively decreasing or refining the granularity each time.
3. The method as claimed in claim 1, which includes repeating the steps for each area if there are plural areas below the acceptable proficiency threshold.
4. The method as claimed in claim 1, in which the acceptable threshold is relative to the user or absolute, independent of the user.
5. The method as claimed in claim 1, which includes the prior step of storing a plurality of questions, and their associated correct answers, on a database and in which the hierarchy or interconnectedness of the questions provides a level of intelligence.
6. The method as claimed in claim 5, in which the hierarchy is designed such that weaknesses identified from higher level questions lead to appropriate refined, or lower level, questions being posed thereby to unpack and identify a user's weaknesses.
7. The method as claimed in claim 1, which includes storing the answers received from the user for later review.
8. The method as claimed in claim 1, which includes suggesting or providing lessons to remediate any identified weaknesses.
9. The method as claimed in claim 8, which includes the prior step of associating lessons with each possible area within the discipline.
10. The method as claimed in claim 1, in which the user interface and the processing module are remote from each other, the method including generating and sending question messages and answer messages between the user interface and the processing module.
11. A computerised testing and diagnostic system which includes:
a user interface including a display and an input arrangement; a question database having stored thereon a plurality of top-level questions each relating to a different area within a discipline and at least a plurality of sub-level questions associated with each one of the top-level questions, the sub-level questions having decreased or refined granularity compared with their associated top-level question; and
a processing module operable to:
i. present via the display the questions to a user;
ii. receive via the input arrangement answers to the questions from the user;
iii. assess the answers to determine whether or not they are correct;
iv. grade the proficiency of the user in each of the areas to determine whether or not the proficiency in one or more areas is below an acceptable threshold; and
v. repeat at least operations i. to iii. at least once using the sub-level questions associated with the or each top-level question or area in which the proficiency is below the acceptable threshold.
12. The system as claimed in claim 11, in which the question database includes more than two levels of questions.
13. The system as claimed in claim 11, in which:
the question database includes correct answers associated with the questions; and
the processing module is operable to compare the received answers from the user with the correct answers on the question database, thereby to assess whether or not the received answers are correct.
14. The system as claimed in claim 11, in which the user interface is provided on a client terminal and the processing module is provided on a server remote from the client terminal.
15. The system as claimed in claim 14, in which the user interface is at least partially web-based.
16. A non-transitory computer-readable medium having stored thereon a computer program which, when executed by a computer system, causes the computer system to perform the method as claimed in claim 1.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ZA2012/03859 | 2012-05-25 | ||
ZA201203859 | 2012-05-25 |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2013175443A2 (en) | 2013-11-28 |
WO2013175443A3 (en) | 2014-01-23 |
Family
ID=49624451
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2013/054304 WO2013175443A2 (en) | 2012-05-25 | 2013-05-24 | A computerised testing and diagnostic method and system |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2013175443A2 (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5827070A (en) * | 1992-10-09 | 1998-10-27 | Educational Testing Service | System and methods for computer based testing |
US20060035207A1 (en) * | 2004-08-12 | 2006-02-16 | Henson Robert A | Test discrimination and test construction for cognitive diagnosis |
US20080286737A1 (en) * | 2003-04-02 | 2008-11-20 | Planetii Usa Inc. | Adaptive Engine Logic Used in Training Academic Proficiency |
US20100129783A1 (en) * | 2008-11-25 | 2010-05-27 | Changnian Liang | Self-Adaptive Study Evaluation |
US20100190142A1 (en) * | 2009-01-28 | 2010-07-29 | Time To Know Ltd. | Device, system, and method of automatic assessment of pedagogic parameters |
WO2011122930A1 (en) * | 2010-03-29 | 2011-10-06 | Mimos Berhad | Assessment system and method for innovative learning |
Also Published As
Publication number | Publication date |
---|---|
WO2013175443A3 (en) | 2014-01-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Dauer et al. | Analyzing change in students' gene‐to‐evolution models in college‐level introductory biology | |
US20170154542A1 (en) | Automated grading for interactive learning applications | |
Carberry et al. | Exploring student conceptions of modeling and modeling uses in engineering design | |
Carrick | Student achievement and NCLEX-RN success: Problems that persist | |
North et al. | To Adapt MOOCs, or Not? That Is No Longer the Question. | |
Aleven et al. | Automated, unobtrusive, action-by-action assessment of self-regulation during learning with an intelligent tutoring system | |
US9858828B1 (en) | Expert systems and methods for dynamic assessment and assessment authoring | |
US11756445B2 (en) | Assessment-based assignment of remediation and enhancement activities | |
US20190066525A1 (en) | Assessment-based measurable progress learning system | |
US10088984B2 (en) | Decision based learning | |
WO2014127131A1 (en) | Knowledge evaluation system | |
US10720072B2 (en) | Adaptive learning system using automatically-rated problems and pupils | |
US10410534B2 (en) | Modular system for the real time assessment of critical thinking skills | |
WO2014145125A1 (en) | Collaborative learning environment | |
US20190372863A1 (en) | Simulating a user score from input objectives | |
CN108230203A (en) | Learning interaction group construction method and system | |
Stylianidis | Mobile learning: open topics, concept and design of a learning framework | |
Salehudin et al. | The use of smartphones for online learning interactions by elementary school students | |
Horwitz et al. | Teaching teamwork: Electronics instruction in a collaborative environment | |
KR102583002B1 (en) | method for diagnosing a user by analyzing the user's problem solving and an electronic device thereof | |
Istiyowati et al. | Readiness for the Implementation of Ubiquitous Learning in Programming Course in Higher Education. | |
Wood et al. | Developing tools for assessing and using commercially available reading software programs to promote the development of early reading skills in children | |
Park et al. | A peer-assessment system connecting on-line and a face-to-face smart classroom | |
WO2013175443A2 (en) | A computerised testing and diagnostic method and system | |
Bremgartner et al. | Using agents and open learner model ontology for providing constructive adaptive techniques in virtual learning environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 13794599 Country of ref document: EP Kind code of ref document: A2 |
122 | Ep: pct application non-entry in european phase |
Ref document number: 13794599 Country of ref document: EP Kind code of ref document: A2 |