US20190156692A1 - Methods for improving test efficiency and accuracy in a computer adaptive test (cat) - Google Patents

Methods for improving test efficiency and accuracy in a computer adaptive test (CAT)

Info

Publication number
US20190156692A1
Authority
US
United States
Prior art keywords
pretest
items
computer adaptive
test
examinee
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/254,315
Inventor
Lisa Gawlick
Changhui Zhang
Nancy PETERSEN
Lingyun Gao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Act Education Corp
Original Assignee
ACT Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ACT, Inc.
Priority to US16/254,315
Publication of US20190156692A1
Assigned to ACT, INC. reassignment ACT, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: GAO, LINGYUN, GAWLICK, LISA, PETERSEN, NANCY, ZHANG, CHANGHUI
Assigned to IMPACT ASSET CORP. reassignment IMPACT ASSET CORP. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ACT, INC.
Assigned to ACT EDUCATION CORP. reassignment ACT EDUCATION CORP. CHANGE OF NAME (SEE DOCUMENT FOR DETAILS). Assignors: IMPACT ASSET CORP.
Current legal status: Abandoned

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 7/00: Electrically-operated teaching apparatus or devices working with questions and answers


Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)

Abstract

A method for use of pretest items in a test to calculate interim scores is provided. The method includes, for example, a computer implemented test having a plurality of test items that include, for example, a plurality of operational items and one or more pretest items having one or more pretest item parameters. An interim latent construct estimate is calculated using both operational and pretest items. The error for the latent construct estimation is controlled by weighting the contribution of the one or more pretest items.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 14/562,183, filed Dec. 5, 2014, which further claims priority under 35 U.S.C. § 119 to provisional application Ser. No. 61/912,774, filed Dec. 6, 2013, the disclosures of which are hereby incorporated in their entirety.
  • BACKGROUND OF THE DISCLOSURE I. Field of the Disclosure
  • The present disclosure relates to computer adaptive testing. More specifically, but not exclusively, the present disclosure relates to methods for improving test efficiency and accuracy by providing a procedure for extracting information from an examinee's responses to pretest items for use in construct estimation in a Computer Adaptive Test (CAT).
  • II. Description of the Prior Art
  • In a Computer Adaptive Test (CAT), pretest items may be embedded in the test but are not necessarily intended to contribute to the estimation of the examinee's latent construct. Typically, pretest items are embedded in a CAT but examinee responses to them are not used in item selection or scoring; generally, only examinee responses to operational items are used. Thus, the information contained in the examinee's responses to the pretest items is underutilized or even wasted.
  • Therefore, it is a primary object, feature, or advantage of the present disclosure to use valuable information in the examinee's responses to the pretest items, which may be used together with the information in the examinee's responses to the operational items for construct estimation.
  • In a CAT, the next item administered to an examinee can be selected based on the examinee's interim ability score, which can be estimated using responses to the operational items administered thus far; the interim ability estimate is updated after the administration of each operational item.
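The selection step just described can be sketched in code. This is a minimal illustration, not the disclosure's own implementation: it assumes a two-parameter logistic (2PL) IRT model and maximum-Fisher-information selection, both common CAT conventions that the disclosure does not mandate, and the item pool below is hypothetical.

```python
import math

def p_2pl(theta, a, b):
    """Probability of a correct response under a 2PL IRT model
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)

def select_next_item(theta, pool, administered):
    """Return the id of the un-administered item that is most
    informative at the current interim ability estimate."""
    candidates = [(fisher_information(theta, a, b), item_id)
                  for item_id, (a, b) in pool.items()
                  if item_id not in administered]
    return max(candidates)[1]
```

A more accurate interim `theta` therefore directly changes which item `select_next_item` picks, which is why the disclosure's improved interim estimate yields a more informative next item.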
  • Therefore, it is a primary object, feature, or advantage of the present disclosure to improve efficiency and final score estimation for a CAT by providing a more accurate interim ability score, which means a more informative next item can be selected for administration to the examinee.
  • It is another object, feature, or advantage of the present disclosure to use pretest information to provide improved interim ability score estimation, to fine-tune a test, for example, by making a test shorter or more accurate and thereby more effective.
  • Still another object, feature, or advantage of the present disclosure provides for using examinee responses to pretest items in interim ability scoring.
  • The obstacle to relying on pretest items for construct estimation is that their item parameters are not in place when the items are administered. Technically, the pretest item parameters could be estimated on the fly (i.e., in real time during test administration) and updated each time an item is exposed to a new examinee, but the response sample size is smaller than in the standard practice for calibration. Small sample sizes can lead to large error in the estimated item parameters, and this uncertainty discourages the use of pretest items in construct calculations.
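The small-sample concern can be made concrete with the standard error of an item's observed proportion-correct, the simplest index of calibration precision. The sample sizes below are hypothetical illustrations, not figures from the disclosure.

```python
import math

def prop_correct_se(p, n):
    """Standard error of an item's observed proportion-correct
    when the item has been answered by n examinees."""
    return math.sqrt(p * (1.0 - p) / n)

# Hypothetical exposure counts: a pretest item seen by 50 examinees
# versus an operational item calibrated on 800 responses.
se_pretest = prop_correct_se(0.6, 50)
se_operational = prop_correct_se(0.6, 800)
# With 1/16 the sample, the standard error is exactly 4 times larger.
```

The same square-root scaling applies, qualitatively, to IRT item-parameter estimates, which is why on-the-fly pretest calibration carries more error than standard large-sample calibration.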
  • Therefore, another object, feature, or advantage of the present disclosure uses weighted interim score calculations to control the error impact when including pretest items in construct estimation.
  • One or more of these and/or other objects, features or advantages of the present disclosure will become apparent from the specification and claims that follow.
  • SUMMARY OF THE DISCLOSURE
  • The present disclosure improves test efficiency and accuracy in a CAT.
  • One exemplary method is for use of pretest items in addition to operational items in a test to calculate interim scores. This may be accomplished, for example, by providing a computer implemented test that includes a plurality of operational items and one or more pretest items having one or more item parameters. Interim latent construct estimates are calculated using both operational and pretest items. Error for the interim latent construct estimates is controlled by weighting the contribution of the pretest items.
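One way to realize the weighting described above can be sketched as follows. This assumes a 2PL IRT model, grid-search maximum likelihood, and a single scalar weight w applied to every pretest item; the disclosure does not fix a particular model or weighting scheme, so treat these as illustrative choices.

```python
import math

def p_2pl(theta, a, b):
    """Probability of a correct response under a 2PL IRT model."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def weighted_interim_theta(op_items, op_resp, pre_items, pre_resp, w=0.5):
    """Grid-search maximum-likelihood interim ability estimate.

    Operational items contribute at full weight; each pretest item's
    log-likelihood term is multiplied by w (0 = ignore pretest items,
    1 = full weight), bounding the error a poorly calibrated pretest
    item can introduce into the interim score.
    """
    grid = [g / 100.0 for g in range(-400, 401)]  # theta in [-4, 4]
    best_theta, best_ll = 0.0, float("-inf")
    for theta in grid:
        ll = 0.0
        for (a, b), u in zip(op_items, op_resp):
            p = p_2pl(theta, a, b)
            ll += math.log(p) if u else math.log(1.0 - p)
        for (a, b), u in zip(pre_items, pre_resp):
            p = p_2pl(theta, a, b)
            ll += w * (math.log(p) if u else math.log(1.0 - p))
        if ll > best_ll:
            best_theta, best_ll = theta, ll
    return best_theta
```

Setting w between 0 and 1 interpolates between the conventional operational-only interim score and a score that fully trusts the uncertain pretest parameters.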
  • According to one aspect, a method for using pretest items to calculate interim scores is provided. The method uses a computer implemented test having a plurality of test items including a plurality of operational items and one or more pretest items having one or more pretest item parameters. In one exemplary operation, latent construct estimates are calculated using both operational and pretest items by estimating one or more pretest item parameters for use with a set of calibrated parameters for the plurality of operational items.
  • According to one aspect, a method for using pretest items in the calculation of interim scores is provided. The method includes providing a computer implemented test having a plurality of test items. The test items include a plurality of operational items and one or more pretest items having one or more pretest item parameters. Steps of the method include, additionally, for example, calculating latent construct estimates using both operational and pretest items, controlling error for the latent construct estimates by weighting the contributions of the one or more pretest items, estimating the one or more pretest item parameters for use with a set of calibrated parameters for the plurality of operational items, and updating an interim score for the computer implemented test based on examinee responses to the one or more pretest items.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Illustrated embodiments of the present disclosure are described in detail below with reference to the attached drawing figures, which are incorporated by reference herein, and where:
  • FIG. 1 is a flowchart of a process for using pretest items for latent construct estimation in computer adaptive testing in accordance with an illustrative embodiment;
  • FIG. 2 is a block diagram providing an overview of a process for using pretest items in latent construct estimations in accordance with an illustrative embodiment.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present disclosure provides for various computer adaptive testing methods. One exemplary method includes using pretest items in interim latent construct estimation. The accuracy and efficiency of computer adaptive testing are improved by making the test more succinct and/or accurate, and thus more effective. What results is a computer adaptive testing platform that can estimate an examinee's ability using the examinee's response information for a specific set of pretest items.
  • III. Using Pretest Items to Fine-Tune an Examinee's Interim Score
  • According to one aspect, the disclosure may be implemented using a computer adaptive testing platform to create a more succinct and/or accurate, and thereby shorter and more effective, test using examinees' responses to pretest items included in the administration of a test.
  • a. Illustrative Embodiments for Using Pretest Items to Fine-Tune an Examinee's Interim Score
  • In a computer adaptive test (CAT), pretest items are often embedded but not used to estimate an examinee's latent construct. However, examinees' responses to these items can reveal additional information, which can be used to improve score accuracy and test efficiency. Additionally, including pretest items in interim scoring can have benefits such as improving candidate motivation when the items administered are closer to candidate ability.
  • One of the obstacles to the inclusion of pretest items in scoring is that their parameters are not in place when these items are administered. While research studies have demonstrated that pretest item parameters can be estimated on the fly, the larger estimation error that results from the smaller pretest sample sizes has remained a significant concern for which no solution had been established. Consequently, the uncertainty of the pretest item parameters deters their use in scoring.
  • Understanding the impact of less accurate or less discriminating item parameters on ability estimation, such as latent construct calculations, and finding a way to control and calibrate such error, are examined in the following paragraphs and the accompanying figures incorporated herein. Increasing the efficiency and effectiveness of computer adaptive testing will result in significant cost savings and, likely, a shorter, more succinct, and more accurate test, thereby increasing the effectiveness of computer-administered testing. Even absent cost savings, the efficiency, effectiveness, and accuracy of computer adaptive testing can be significantly improved without an increase in cost. Therefore, a method for using pretest items to fine-tune an examinee's interim score is provided herein. For purposes of illustration, a flowchart diagram is provided in FIG. 1 as one pictorial representation of a method for using pretest items to fine-tune an examinee's interim score or latent construct estimation.
  • For acquiring an examinee's response information as shown in FIG. 2, a computer adaptive testing platform is provided as illustrated in FIG. 1. The examinee interface shown in FIG. 2 may be a computer or computer network. The computer adaptive testing platform shown in FIG. 1 may be, for example, a computer network that includes a server, workstation, scanner, printer, datastore, and other connected networks. The computer network may be configured to provide a communication path for each device of the network to communicate with the other devices. Additionally, the computer network may be the internet, a public switched telephone network, a local area network, a private wide area network, a wireless network, or the like. In various embodiments of the disclosure, a computer adaptive testing administration script may be executed on the server and/or workstation. For example, in one embodiment of the disclosure, the server may be configured to execute a computer adaptive testing script, provide outputs for display on the workstation, and receive inputs from the workstation, such as an examinee's response information. In various other embodiments, the workstation may be configured to execute a computer adaptive testing application individually or cooperatively with one or more other workstations. The scanner may be configured to scan textual content and output the content in a computer-readable format. Additionally, the printer may be configured to output content from the computer adaptive test application to a print medium, such as paper. Furthermore, data associated with examinee response information of the computer adaptive test application, or any of the associated processes illustrated in FIGS. 1-2, may be stored on the datastore and displayed on the workstation. The datastore may additionally be configured to receive and/or forward some or all of the stored data.
Moreover, in yet another embodiment, some or all of the computer network may be subsumed within a single device. Although FIG. 2 depicts a computer, it is understood that the disclosure is not limited to operation within a computer or computer network; rather, the disclosure may be practiced on any suitable electronic device or platform. Accordingly, the computer illustrated in FIG. 2 and the computer network (not shown) are discussed for purposes of explaining the disclosure and are not meant to limit the disclosure in any respect.
  • According to one aspect of the disclosure, an operating protocol is provided on the workstation or computer network for operating a computer adaptive testing module or application. A test made up of a group of pretest item content and operational item content is selected for delivery through a computer adaptive testing module controlled by an operating protocol. In addition to operational item selection and pretest item selection, a test script administration application or process could be implemented to select or establish pretest item parameters for the selected pretest items to be administered by the test script application. Using a computer network, workstation or electronic device, a test is administered using the selected operational and pretest items having one or more selectable pretest item parameters. Using a workstation, computer network, or other electronic device, examinee response information is acquired for the selected operational and pretest items administered as part of the test script administration process or application. Upon acquiring examinee response information for subject pretest items or other operational items as part of a test script being administered, an interim score for an examinee such as a latent construct estimate may be calculated. These calculations could be used to inform subsequent selection of one or more operational items, one or more pretest items, and/or one or more pretest item parameters. Operating on the workstation, network or other electronic device is a latent construct estimator using one or more estimation methods for providing an examinee's interim score, such as a latent construct estimate, or selecting one or more pretest items having selected item parameters. Examples of controlling error during latent construct estimation include weighting the contribution of the pretest items on an examinee's interim latent construct estimates. 
Other methods include adjusting, calibrating or re-defining pretest item parameters for one or more of the selected pretest items during the administration of the test script administration process or application. A calibration script may also be included and made operable on a computer network, workstation or like electronic device for calibrating, adjusting or re-defining pretest item parameters based on latent construct estimates. Additionally, the resulting interim scores are more diverse when pretest items are included, and thus the subsequently selected items are more diverse. For example, using such a method, the number of operational items selected in a test may be reduced to make the test shorter, more accurate, more succinct, and more effective based on the use of pretest items in the calculation of interim construct estimates. Thus, including pretest items in interim latent score calculations provides a method to refine a test script administration process or application that uses one or more pretest items in combination with one or more operational items for a testing sequence or event using computer adaptive testing.
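The on-the-fly recalibration described above might be sketched as follows. It assumes a Rasch (one-parameter) model, a grid-search maximum-likelihood re-estimate of difficulty, and a simple response-count threshold before re-estimation; all of these are hypothetical choices, since the disclosure does not prescribe a particular calibration model or trigger.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under a Rasch model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

class PretestCalibrator:
    """Re-estimates a pretest item's Rasch difficulty from accumulated
    (interim ability, response) pairs once a response-count threshold
    is reached, mirroring a calibration script that updates pretest
    item parameters during test administration."""

    def __init__(self, b_init=0.0, threshold=30):
        self.b = b_init          # current difficulty estimate
        self.threshold = threshold
        self.data = []           # (theta, correct) pairs seen so far

    def record(self, theta, correct):
        """Record one examinee's response; recalibrate if enough data."""
        self.data.append((theta, correct))
        if len(self.data) >= self.threshold:
            self.b = self._mle_b()
        return self.b

    def _mle_b(self):
        """Grid-search maximum-likelihood difficulty over [-4, 4]."""
        grid = [g / 100.0 for g in range(-400, 401)]
        best_b, best_ll = self.b, float("-inf")
        for b in grid:
            ll = 0.0
            for theta, u in self.data:
                p = rasch_p(theta, b)
                ll += math.log(p) if u else math.log(1.0 - p)
            if ll > best_ll:
                best_b, best_ll = b, ll
        return best_b
```

Each recalibrated difficulty can then feed back into the weighted interim scoring described earlier, with the weight limiting the damage while the estimate is still based on few responses.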
  • IV. Other Embodiments and Variations
  • The present disclosure is not to be limited to the particular embodiments described herein. In particular, the present disclosure contemplates numerous variations in the ways in which embodiments of the disclosure may be applied to computer adaptive testing. The foregoing description has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. It is contemplated that other alternatives or exemplary aspects that are considered are included in the disclosure. The description is merely an example of embodiments, processes or methods of the disclosure. For example, the methods for controlling weighting for the use of pretest items in latent construct estimation may be varied according to use and test setting, test type, and other like parameters. It is understood that any other modifications, substitutions, and/or additions may be made within the intended spirit and scope of the disclosure. From the foregoing, it can be seen that the disclosure accomplishes at least all of the intended objectives.

Claims (14)

What is claimed is:
1. A method for generating interim scores with a computer adaptive testing platform by using examinee responses to pretest items to improve scoring accuracy, the method comprising:
obtaining a computer adaptive test from a test script administration module, the computer adaptive test comprising a plurality of operational items and one or more pretest items corresponding to one or more pretest item parameters, wherein the pretest item parameters include statistical pretest item response data acquired from other examinees;
calculating, with a latent construct estimator, an examinee interim score based on the one or more pretest item parameters and examinee responses to both operational items and pretest items;
storing the examinee interim score in a datastore; and
re-defining pretest item parameters for one or more of the pretest items based on the examinee responses during the administration of the computer adaptive test to control for error.
2. The method of claim 1 further comprising displaying the computer adaptive test on a graphical user interface and obtaining the examinee responses from the graphical user interface.
3. The method of claim 1 further comprising printing the computer adaptive test and obtaining the examinee responses from a scanner.
4. The method of claim 1 further comprising updating the operational item parameters during administration of the computer adaptive test.
5. The method of claim 1 further comprising using the computer adaptive testing module to set the one or more pretest item parameters to an average of a set of calibrated parameters for the plurality of operational items.
6. The method of claim 1 further comprising updating the one or more pretest item parameters based on corresponding pretest item responses from a threshold number of additional examinees.
7. The method of claim 1 wherein the pretest item parameters comprise maximum likelihood estimators based on the examinee responses.
8. A system for generating interim scores during the administration of a computer adaptive test, the system comprising:
a computer adaptive testing server, a data store, and a graphical user interface, the computer adaptive testing server comprising a computer adaptive testing component and a test script administrator, the computer adaptive testing component configured to:
generate a computer adaptive test with the test script administrator, the computer adaptive test comprising a plurality of operational items and one or more pretest items having one or more pretest item parameters, wherein the pretest item parameters include statistical pretest item response data acquired from other examinees;
calculate an examinee interim score based on the one or more pretest item parameters and examinee responses to both operational items and pretest items; and
store the examinee interim score in a datastore; and
re-define pretest item parameters for one or more of the pretest items based on the examinee responses during the administration of the computer adaptive test to control for error.
9. The system of claim 8, wherein the computer adaptive testing component is further configured to display the computer adaptive test on the graphical user interface and obtain the examinee responses from the graphical user interface.
10. The system of claim 8, wherein the computer adaptive testing component is further configured to print the computer adaptive test on a printer and obtain the examinee responses from a scanner.
11. The system of claim 8, wherein the computer adaptive testing component is further configured to update the operational item parameters during administration of the computer adaptive test.
12. The system of claim 8, wherein the computer adaptive testing component is further configured to set the one or more pretest item parameters to an average of a set of calibrated parameters for the plurality of operational items.
13. The system of claim 12, wherein the computer adaptive testing component is further configured to update the one or more pretest item parameters based on corresponding pretest item responses from a threshold number of additional examinees.
14. The system of claim 8 wherein the pretest item parameters comprise maximum likelihood estimators based on the examinee responses.
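Claims 4 through 6 describe updating pretest item parameters during administration once responses from a threshold number of examinees are available. The sketch below illustrates one plausible shape for such a re-calibration step; the moment-style difficulty update, the threshold of 30 responses, and all class and method names are assumptions for illustration, not the claimed procedure.

```python
import math
from dataclasses import dataclass, field

@dataclass
class PretestItem:
    """Pretest item whose parameters start at the operational average
    and are re-estimated once enough examinee responses accrue."""
    a: float                   # discrimination (held fixed in this sketch)
    b: float                   # difficulty, re-defined as responses arrive
    responses: list = field(default_factory=list)  # (theta, correct) pairs

    def record(self, theta, correct, threshold=30):
        """Store one examinee response; re-calibrate at the threshold."""
        self.responses.append((theta, correct))
        if len(self.responses) >= threshold:
            self.recalibrate()

    def recalibrate(self):
        # Crude moment-style update: place the difficulty where the
        # observed proportion correct matches the 2PL curve evaluated
        # at the mean ability of the responding examinees.
        mean_theta = sum(t for t, _ in self.responses) / len(self.responses)
        p = sum(c for _, c in self.responses) / len(self.responses)
        p = min(max(p, 0.01), 0.99)  # guard the logit against 0 and 1
        self.b = mean_theta - math.log(p / (1 - p)) / self.a
```

In use, each examinee's interim ability estimate and scored response are recorded against the pretest item; once the threshold is crossed, the item's difficulty is re-defined from the accumulated data, which is the error-control step recited in claims 1 and 8.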
US16/254,315 2013-12-06 2019-01-22 Methods for improving test efficiency and accuracy in a computer adaptive test (cat) Abandoned US20190156692A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/254,315 US20190156692A1 (en) 2013-12-06 2019-01-22 Methods for improving test efficiency and accuracy in a computer adaptive test (cat)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201361912774P 2013-12-06 2013-12-06
US14/562,183 US20150161900A1 (en) 2013-12-06 2014-12-05 Methods for improving test efficiency and accuracy in a computer adaptive test (cat)
US16/254,315 US20190156692A1 (en) 2013-12-06 2019-01-22 Methods for improving test efficiency and accuracy in a computer adaptive test (cat)

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/562,183 Continuation US20150161900A1 (en) 2013-12-06 2014-12-05 Methods for improving test efficiency and accuracy in a computer adaptive test (cat)

Publications (1)

Publication Number Publication Date
US20190156692A1 true US20190156692A1 (en) 2019-05-23

Family

ID=53271749

Family Applications (5)

Application Number Title Priority Date Filing Date
US14/562,167 Active 2038-05-24 US10706734B2 (en) 2013-12-06 2014-12-05 Methods for improving test efficiency and accuracy in a computer adaptive test (CAT)
US14/562,187 Active 2038-03-13 US10529245B2 (en) 2013-12-06 2014-12-05 Methods for improving test efficiency and accuracy in a computer adaptive test (CAT)
US14/562,183 Abandoned US20150161900A1 (en) 2013-12-06 2014-12-05 Methods for improving test efficiency and accuracy in a computer adaptive test (cat)
US14/562,218 Abandoned US20150161902A1 (en) 2013-12-06 2014-12-05 Methods for improving test efficiency and accuracy in a computer adaptive test (cat)
US16/254,315 Abandoned US20190156692A1 (en) 2013-12-06 2019-01-22 Methods for improving test efficiency and accuracy in a computer adaptive test (cat)


Country Status (1)

Country Link
US (5) US10706734B2 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3278319A4 (en) * 2015-04-03 2018-08-29 Kaplan Inc. System and method for adaptive assessment and training
KR102096301B1 (en) * 2019-04-03 2020-04-02 (주)뤼이드 Method, apparatus and computer program for operating a machine learning framework with active learning techniqe
JP7290272B2 (en) * 2019-06-17 2023-06-13 国立大学法人 筑波大学 Ability measuring device, program and method
CN113902296B (en) * 2021-10-09 2022-06-07 鹤山市民强五金机电有限公司 Intelligent test method and system for single-phase asynchronous motor

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5059127A (en) 1989-10-26 1991-10-22 Educational Testing Service Computerized mastery testing system, a computer administered variable length sequential testing system for making pass/fail decisions
CA2084443A1 (en) 1992-01-31 1993-08-01 Leonard C. Swanson Method of item selection for computerized adaptive tests
US5565316A (en) 1992-10-09 1996-10-15 Educational Testing Service System and method for computer based testing
US5779486A (en) 1996-03-19 1998-07-14 Ho; Chi Fai Methods and apparatus to assess and enhance a student's understanding in a subject
US5841655A (en) 1996-04-08 1998-11-24 Educational Testing Service Method and system for controlling item exposure in computer based testing
US6427063B1 (en) 1997-05-22 2002-07-30 Finali Corporation Agent based instruction system and method
US20020182579A1 (en) 1997-03-27 2002-12-05 Driscoll Gary F. System and method for computer based creation of tests formatted to facilitate computer based testing
US6120299A (en) 1997-06-06 2000-09-19 Educational Testing Service System and method for interactive scoring of standardized test responses
US6112049A (en) 1997-10-21 2000-08-29 The Riverside Publishing Company Computer network based testing system
US6000945A (en) 1998-02-09 1999-12-14 Educational Testing Service System and method for computer based test assembly
US6431875B1 (en) 1999-08-12 2002-08-13 Test And Evaluation Software Technologies Method for developing and administering tests over a network
US6652283B1 (en) 1999-12-30 2003-11-25 Cerego, Llc System apparatus and method for maximizing effectiveness and efficiency of learning retaining and retrieving knowledge and skills
US6704741B1 (en) 2000-11-02 2004-03-09 The Psychological Corporation Test item creation and manipulation system and method
US6688889B2 (en) 2001-03-08 2004-02-10 Boostmyscore.Com Computerized test preparation system employing individually tailored diagnostics and remediation
US6953344B2 (en) 2001-05-30 2005-10-11 Uri Shafrir Meaning equivalence instructional methodology (MEIM)
US7080303B2 (en) 2001-11-13 2006-07-18 Prometric, A Division Of Thomson Learning, Inc. Method and system for computer based testing using plugins to expand functionality of a test driver
US7103508B2 (en) 2002-09-25 2006-09-05 Benesse Corporation Test system and control method
US20040091847A1 (en) * 2002-11-06 2004-05-13 Ctb/Mcgraw-Hill Paper-based adaptive testing
US20040202988A1 (en) 2003-04-14 2004-10-14 Evans Michael A. Human capital management assessment tool system and method
US20050125196A1 (en) 2003-12-09 2005-06-09 Len Swanson Method and system for computer-assisted test construction performing specification matching during test item selection
US8591237B2 (en) 2004-02-23 2013-11-26 Law School Admission Council, Inc. Method for assembling sub-pools of test questions
US7912722B2 (en) 2005-01-10 2011-03-22 Educational Testing Service Method and system for text retrieval for computer-assisted item creation
US8834173B2 (en) * 2005-04-08 2014-09-16 Act, Inc. Method and system for scripted testing

Also Published As

Publication number Publication date
US20150161902A1 (en) 2015-06-11
US20150161901A1 (en) 2015-06-11
US10706734B2 (en) 2020-07-07
US20150161900A1 (en) 2015-06-11
US10529245B2 (en) 2020-01-07
US20150161899A1 (en) 2015-06-11

Similar Documents

Publication Publication Date Title
US20190156692A1 (en) Methods for improving test efficiency and accuracy in a computer adaptive test (cat)
US9519684B2 (en) User recommendation method and a user recommendation system using the same
CN108229488A (en) For the method, apparatus and electronic equipment of detection object key point
US20140351179A1 (en) Information push method and apparatus
KR20110044294A (en) Object identification in images
US20140370480A1 (en) Storage medium, apparatus, and method for information processing
US11451927B2 (en) Positioning method, positioning apparatus, server, and computer-readable storage medium
CN106303515A (en) A kind of online live video quality detecting method and device
CN111160411B (en) Classification model training method, image processing method, device, medium and equipment
CN104156305B (en) A kind of applied program testing method and device
CN104680336B (en) Employee's location determining method and system
CN104066174B (en) A kind of localization method and device
Awad et al. Feasibility of a synthetic temporal bone for training in mastoidectomy: face, content, and concurrent validity
Taylor et al. Evaluation of two methods to estimate and monitor bird populations
Kral‐O'Brien et al. Morphological traits determine detectability bias in North American grassland butterflies
CN112000403B (en) Information sending method and device, computer equipment and storage medium
EP2879018A1 (en) Estimating gaze from un-calibrated eye measurement points
JP2018147525A (en) Information processing system, information processing apparatus, information processing method, and program
US20160035236A1 (en) Method and system for controlling ability estimates in computer adaptive testing providing review and/or change of test item responses
JP2014110001A5 (en)
JP5809663B2 (en) Classification accuracy estimation apparatus, classification accuracy estimation method, and program
CN106097815A (en) Grading approach and device
CN113516328A (en) Data processing method, service providing method, device, equipment and storage medium
WO2020028068A1 (en) Learning template representation libraries
JP2020115288A (en) Transfer learning method, transfer learning program and learning device

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: ACT, INC., IOWA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GAWLICK, LISA;ZHANG, CHANGHUI;PETERSEN, NANCY;AND OTHERS;REEL/FRAME:049998/0827

Effective date: 20141217

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: IMPACT ASSET CORP., IOWA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:ACT, INC.;REEL/FRAME:067352/0636

Effective date: 20240501

AS Assignment

Owner name: ACT EDUCATION CORP., IOWA

Free format text: CHANGE OF NAME;ASSIGNOR:IMPACT ASSET CORP.;REEL/FRAME:067683/0808

Effective date: 20240530